This repository has been archived by the owner on Jun 3, 2019. It is now read-only.

Commit
Merge pull request #64 from wangshirui33/master
Character-Level Language Modeling with Deeper Self-Attention
huan authored Nov 3, 2018
2 parents e95f8d7 + 931096c commit 6b88e17
Showing 2 changed files with 15 additions and 1 deletion.
4 changes: 3 additions & 1 deletion docs/_sidebar.md
@@ -67,6 +67,8 @@
- [ELMo: Deep contextualized word representations,2018](papers/elmo-deep-contextualized-word-representations-2017.md)
- [Skip-Thought Vectors,2015](papers/skip-thought-vectors-2015.md)
- [GLoMo: Unsupervisedly Learned Relational Graphs as Transferable Representations](papers/glomo-unsupervisedly-learned-relational-graphs-as-transferable-representations.md)

- [Character-Level Language Modeling with Deeper Self-Attention,2018](papers/character-level-language-modeling-with-deeper-self-attention-2018.md)

- 9. Knowledge Graph
- [Graph2Seq: Graph to Sequence Learning with Attention-Based Neural Networks,2018](papers/graph2seq-graph-to-sequence-learning-with-attention-based-neural-networks-2018.md)

12 changes: 12 additions & 0 deletions papers/character-level-language-modeling-with-deeper-self-attention-2018.md
@@ -0,0 +1,12 @@
# Character-Level Language Modeling with Deeper Self-Attention

| | |
| :--- | :--- |
| title | Character-Level Language Modeling with Deeper Self-Attention |
| author | Rami Al-Rfou*, Dokook Choe*, Noah Constant*, Mandy Guo*, Llion Jones* |
| year | 2018 |
| paper | https://arxiv.org/abs/1808.04444 |
| github | |
| paperweekly | |
| page | 11 |
| member | wangshirui33 |
