Neural Machine Translation by Jointly Learning to Align and Translate

GPTKB entity

Statements (46)
Predicate Object
gptkbp:instanceOf research paper
gptkbp:application Language translation
gptkbp:author gptkb:Dzmitry_Bahdanau
gptkb:Kyunghyun_Cho
gptkb:Yoshua_Bengio
gptkbp:availableIn arXiv
gptkbp:citedBy over 5000 times
gptkbp:concludes Attention improves translation
Joint learning is effective
gptkbp:contributedTo Natural Language Processing
gptkbp:dataUsage WMT '14 English-French dataset
gptkbp:description Learns a soft alignment between source and target words while translating
Joint learning framework for alignment and translation
gptkbp:discusses Future directions in research
Challenges in machine translation
gptkbp:evaluates gptkb:BLEU_score
gptkbp:doi 10.48550/arXiv.1409.0473
gptkbp:focusesOn Neural networks
Machine translation
gptkbp:futurePlans Improve computational efficiency
Enhance alignment techniques
Expand to low-resource languages
Explore unsupervised learning
Investigate multilingual translation
https://www.w3.org/2000/01/rdf-schema#label Neural Machine Translation by Jointly Learning to Align and Translate
gptkbp:impact Improved translation quality
gptkbp:influencedBy Neural networks
Statistical machine translation
gptkbp:introduced Attention mechanism (see the equations and code sketch after this list)
gptkbp:keywords Deep learning
Alignment
Attention mechanism
Neural machine translation
Sequence-to-sequence learning
gptkbp:language English
gptkbp:model RNN encoder-decoder architecture with attention (RNNsearch)
gptkbp:provides Theoretical insights
Empirical results
gptkbp:publishedIn gptkb:Proceedings_of_the_International_Conference_on_Learning_Representations_(ICLR)
gptkbp:relatedTo Deep learning
Word embeddings
Recurrent neural networks
Sequence-to-sequence models
gptkbp:technique Attention-based model
End-to-end learning
gptkbp:year 2015
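
For reference, the attention mechanism this paper introduces computes each context vector c_i as an expectation over the encoder annotations h_j, scored by a small feedforward alignment model. In the paper's own notation (s_{i-1} is the previous decoder state, W_a, U_a, v_a are learned alignment parameters, and T_x is the source length):

e_{ij} = v_a^\top \tanh(W_a s_{i-1} + U_a h_j)
\alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k=1}^{T_x} \exp(e_{ik})}
c_i = \sum_{j=1}^{T_x} \alpha_{ij} h_j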
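A minimal NumPy sketch of this additive-attention step is below. It is an illustration of the technique, not the authors' implementation; the function name, array shapes, and toy dimensions are assumptions made for the example.

import numpy as np

def additive_attention(s_prev, H, W_a, U_a, v_a):
    """Additive (Bahdanau-style) attention for one decoder step.

    s_prev : (n,)    previous decoder hidden state s_{i-1}
    H      : (T, 2n) encoder annotations h_1..h_T (2n: bidirectional encoder)
    W_a    : (p, n)  projection of the decoder state
    U_a    : (p, 2n) projection of the annotations
    v_a    : (p,)    scoring vector
    Returns the context vector c_i and the alignment weights alpha_i.
    """
    # e_ij = v_a^T tanh(W_a s_{i-1} + U_a h_j), computed for all j at once
    e = np.tanh(W_a @ s_prev + H @ U_a.T) @ v_a   # shape (T,)
    alpha = np.exp(e - e.max())                    # numerically stable softmax
    alpha /= alpha.sum()                           # normalize over source positions
    c = alpha @ H                                  # (2n,) expected annotation
    return c, alpha

# Toy usage with random weights (dimensions are arbitrary for the demo).
rng = np.random.default_rng(0)
n, p, T = 4, 5, 6
c, alpha = additive_attention(
    rng.normal(size=n), rng.normal(size=(T, 2 * n)),
    rng.normal(size=(p, n)), rng.normal(size=(p, 2 * n)), rng.normal(size=p),
)
print(alpha.round(3), c.shape)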