arXiv preprint arXiv:1710.09767
GPTKB entity
Statements (26)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | gptkb:academic_journal |
| gptkbp:arXivID | 1710.09767 |
| gptkbp:author | gptkb:Illia_Polosukhin, gptkb:Łukasz_Kaiser, gptkb:Aidan_N._Gomez, gptkb:Ashish_Vaswani, gptkb:Jakob_Uszkoreit, gptkb:Llion_Jones, gptkb:Niki_Parmar, gptkb:Noam_Shazeer |
| gptkbp:citation | over 100,000 |
| gptkbp:field | gptkb:machine_learning, natural language processing |
| https://www.w3.org/2000/01/rdf-schema#label | arXiv preprint arXiv:1710.09767 |
| gptkbp:influenced | gptkb:T5, gptkb:BERT, gptkb:GPT, gptkb:Vision_Transformer |
| gptkbp:language | English |
| gptkbp:proposedBy | Transformer architecture |
| gptkbp:publicationDate | 2017-06-12 |
| gptkbp:publisher | gptkb:arXiv |
| gptkbp:title | gptkb:Attention_Is_All_You_Need |
| gptkbp:url | https://arxiv.org/abs/1710.09767 |
| gptkbp:bfsParent | gptkb:MLSH |
| gptkbp:bfsLayer | 7 |