GPTKB
Transformer: Attention Is All You Need
URI: https://gptkb.org/entity/Transformer:_Attention_Is_All_You_Need
GPTKB entity
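The entity can also be fetched programmatically. The sketch below is a minimal example, assuming only that the entity URI above is dereferenceable over plain HTTPS; the Accept header and any machine-readable response format are assumptions, not something this page documents.

import requests

# Entity URI taken verbatim from the listing above.
URI = "https://gptkb.org/entity/Transformer:_Attention_Is_All_You_Need"

# Whether GPTKB honors content negotiation is an assumption; we ask for JSON
# but accept whatever representation the server actually returns.
resp = requests.get(URI, headers={"Accept": "application/json"}, timeout=10)
resp.raise_for_status()
print(resp.headers.get("Content-Type"))
print(resp.text[:500])  # inspect the start of the returned representation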
Statements (38)
gptkbp:instanceOf
  gptkb:conference_paper
gptkbp:application
  natural language processing
gptkbp:arXivID
  1706.03762
gptkbp:author
  gptkb:Ashish_Vaswani
  gptkb:Noam_Shazeer
  gptkb:Niki_Parmar
  gptkb:Jakob_Uszkoreit
  gptkb:Llion_Jones
  gptkb:Aidan_N._Gomez
  gptkb:Łukasz_Kaiser
  gptkb:Illia_Polosukhin
gptkbp:citation
  over 100,000
gptkbp:contribution
  attention mechanism as primary component
  improved training efficiency
  parallelizable architecture
  state-of-the-art results in machine translation
gptkbp:doi
  10.48550/arXiv.1706.03762
rdfs:label
  Transformer: Attention Is All You Need
gptkbp:influenced
  gptkb:T5
  gptkb:BERT
  gptkb:GPT
  gptkb:Vision_Transformer
gptkbp:introduced
  self-attention
  multi-head attention
  positional encoding
gptkbp:language
  English
gptkbp:openAccess
  true
gptkbp:pages
  15
gptkbp:proposes
  Transformer architecture
gptkbp:publicationYear
  2017
gptkbp:publishedIn
  gptkb:NeurIPS_2017
gptkbp:removes
  convolutional neural networks
  recurrent neural networks
gptkbp:trainedOn
  gptkb:WMT_2014_English-to-French
  gptkb:WMT_2014_English-to-German
gptkbp:url
  https://arxiv.org/abs/1706.03762
gptkbp:bfsParent
  gptkb:Google_Brain_(former)
gptkbp:bfsLayer
  7
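The gptkbp:contribution and gptkbp:introduced statements above center on the paper's core mechanism, scaled dot-product attention, defined as Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The NumPy sketch below illustrates that formula; the array shapes and names are illustrative choices, not the paper's reference code.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, per the paper.
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)      # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)        # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                                  # weighted sum of values

# Toy example: 4 query positions attending over 6 key/value positions, d_k = 8.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(4, 8)), rng.normal(size=(6, 8)), rng.normal(size=(6, 8))
print(scaled_dot_product_attention(Q, K, V).shape)      # (4, 8)

The positional encoding listed under gptkbp:introduced is, in the paper's sinusoidal variant, PE(pos, 2i) = sin(pos / 10000^(2i/d_model)) and PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model)). A minimal sketch, assuming an even d_model:

def sinusoidal_positional_encoding(max_len, d_model):
    # Even dimensions use sine, odd dimensions use cosine, as in the paper.
    pos = np.arange(max_len)[:, None]             # (max_len, 1) positions
    i = np.arange(0, d_model, 2)[None, :]         # even dimension indices
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

print(sinusoidal_positional_encoding(50, 16).shape)   # (50, 16)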