Neural Machine Translation by Jointly Learning to Align and Translate
GPTKB entity
Statements (22)
Predicate | Object |
---|---|
gptkbp:instanceOf | gptkb:academic_journal |
gptkbp:arXivID | 1409.0473 |
gptkbp:author | gptkb:Yoshua_Bengio, gptkb:Kyunghyun_Cho, gptkb:Dzmitry_Bahdanau |
gptkbp:citation | gptkb:Transformer_model, over 10000, Google Neural Machine Translation System |
gptkbp:contribution | introduction of attention mechanism in neural machine translation |
gptkbp:field | machine translation, natural language processing |
https://www.w3.org/2000/01/rdf-schema#label | Neural Machine Translation by Jointly Learning to Align and Translate |
gptkbp:impact | pioneered attention-based neural machine translation |
gptkbp:language | English |
gptkbp:proposedModel | encoder-decoder with attention (see the sketch below the table) |
gptkbp:publicationYear | 2015 |
gptkbp:publishedIn | gptkb:International_Conference_on_Learning_Representations |
gptkbp:trainer | WMT English-French dataset |
gptkbp:url | https://arxiv.org/abs/1409.0473 |
gptkbp:bfsParent | gptkb:Nal_Kalchbrenner, gptkb:Google_Brain_(former) |
gptkbp:bfsLayer | 7 |
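As a brief reference for the gptkbp:proposedModel and gptkbp:contribution statements above, the following is a minimal sketch of the additive attention mechanism in the paper's own notation, where h_j are the encoder annotations, s_{i-1} is the previous decoder hidden state, and W_a, U_a, v_a are learned parameters:

```latex
% Additive (Bahdanau) attention, following the paper's notation.
% Alignment score between decoder state s_{i-1} and encoder annotation h_j:
e_{ij} = v_a^{\top} \tanh\!\left( W_a s_{i-1} + U_a h_j \right)

% Normalised attention weights over the source positions 1..T_x:
\alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k=1}^{T_x} \exp(e_{ik})}

% Context vector: expected annotation under the attention weights,
% fed to the decoder when computing the next state s_i and output y_i:
c_i = \sum_{j=1}^{T_x} \alpha_{ij}\, h_j
```

Each context vector c_i is a soft alignment over the source annotations, which is the joint alignment-and-translation behaviour referred to in the paper's title.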