Statements (46)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | Researcher |
| gptkbp:affiliation | gptkb:Google_Brain |
| gptkbp:author | gptkb:Noam_Shazeer, gptkb:Aidan_N._Gomez, gptkb:Taku_Kudo, gptkb:Jakob_Uszkoreit, Llion Jones, Lukasz Kaiser, Katherine Lee, Niki Parmar |
| gptkbp:awards | Best Paper Award at NeurIPS 2017 |
| gptkbp:birthPlace | gptkb:India |
| gptkbp:birthYear | 1985 |
| gptkbp:contribution | Development of Attention Mechanism, Improvement of Translation Systems, Research on Self-Attention, Advancements in NLP, Influence on AI Research Community |
| gptkbp:education | gptkb:University_of_Southern_California, gptkb:Indian_Institute_of_Technology,_Bombay |
| gptkbp:field | Artificial Intelligence, Machine Learning, Natural Language Processing |
| https://www.w3.org/2000/01/rdf-schema#label | Ashish Vaswani |
| gptkbp:influencedBy | gptkb:Geoffrey_Hinton, gptkb:Yoshua_Bengio, gptkb:Yann_LeCun, gptkb:Ilya_Sutskever |
| gptkbp:knownFor | Transformer model |
| gptkbp:nationality | Indian |
| gptkbp:notableWork | Attention is All You Need |
| gptkbp:occupation | Computer Scientist |
| gptkbp:patentCitation | Method for Neural Machine Translation |
| gptkbp:project | gptkb:Longformer, gptkb:GPT, gptkb:T5, Big Bird, Reformer, BERT, DALL-E, Vision Transformer, XLNet, Transformer-XL |
| gptkbp:researchInterest | Deep Learning, Neural Networks, Sequence-to-Sequence Learning |