Attention is All You Need
URI:
https://gptkb.org/entity/Attention_is_All_You_Need
GPTKB entity
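The entity URI above also works as a plain web address. Below is a minimal sketch of dereferencing it over HTTP; whether gptkb.org offers a machine-readable serialization (Turtle is requested here) via content negotiation is an assumption, not something stated on this page.

    import requests

    # Hypothetical dereference of the entity URI shown above. The Accept
    # header asks for Turtle, but content negotiation on gptkb.org is an
    # assumption; the server may simply return the HTML page.
    ENTITY_URI = "https://gptkb.org/entity/Attention_is_All_You_Need"

    resp = requests.get(ENTITY_URI, headers={"Accept": "text/turtle"}, timeout=30)
    resp.raise_for_status()

    print(resp.headers.get("Content-Type"))
    print(resp.text[:500])  # inspect whatever representation came back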
Statements (33)
Predicate / Object

gptkbp:instanceOf
    gptkb:academic_journal
gptkbp:arXivID
    1706.03762
gptkbp:author
    gptkb:Illia_Polosukhin
    gptkb:Łukasz_Kaiser
    gptkb:Aidan_N._Gomez
    gptkb:Ashish_Vaswani
    gptkb:Jakob_Uszkoreit
    gptkb:Llion_Jones
    gptkb:Niki_Parmar
    gptkb:Noam_Shazeer
gptkbp:citation
    over 80,000
    researchers worldwide
gptkbp:contribution
    Transformer model for sequence modeling
gptkbp:field
    gptkb:machine_learning
    natural language processing
gptkbp:focusesOn
    self-attention mechanism
https://www.w3.org/2000/01/rdf-schema#label
    Attention is All You Need
gptkbp:impact
    revolutionized NLP
gptkbp:influenced
    gptkb:T5
    gptkb:BERT
    gptkb:GPT
    gptkb:XLNet
    many modern NLP models
gptkbp:introduced
    Transformer architecture
gptkbp:language
    English
gptkbp:openAccess
    true
gptkbp:pages
    11
gptkbp:proposedBy
    removal of recurrence in sequence transduction models
gptkbp:publicationYear
    2017
gptkbp:publishedIn
    gptkb:NeurIPS_2017
gptkbp:url
    https://arxiv.org/abs/1706.03762
gptkbp:bfsParent
    gptkb:Attention_is_All_You_Need_(contributor)
gptkbp:bfsLayer
    8
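The statements above follow RDF conventions: an entity namespace (gptkb:), a property namespace (gptkbp:), and the standard rdfs:label. A minimal sketch of representing a few of them as an RDF graph with rdflib follows; the gptkb: entity namespace is taken from the URI on this page, while the base URI used for the gptkbp: property namespace is an assumption made only for illustration.

    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDFS

    # Entity namespace is taken from the URI on this page; the property
    # namespace URI below is an assumption for illustration only.
    GPTKB = Namespace("https://gptkb.org/entity/")
    GPTKBP = Namespace("https://gptkb.org/property/")  # assumed

    g = Graph()
    g.bind("gptkb", GPTKB)
    g.bind("gptkbp", GPTKBP)

    paper = GPTKB["Attention_is_All_You_Need"]

    # A few of the statements listed above, expressed as triples.
    g.add((paper, RDFS.label, Literal("Attention is All You Need")))
    g.add((paper, GPTKBP.arXivID, Literal("1706.03762")))
    g.add((paper, GPTKBP.author, GPTKB["Ashish_Vaswani"]))
    g.add((paper, GPTKBP.publicationYear, Literal(2017)))
    g.add((paper, GPTKBP.publishedIn, GPTKB["NeurIPS_2017"]))
    g.add((paper, GPTKBP.url, Literal("https://arxiv.org/abs/1706.03762")))

    print(g.serialize(format="turtle"))

Serializing the graph as Turtle reproduces the predicate/object grouping shown in the table, one block per predicate.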