BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
GPTKB entity
Statements (26)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | gptkb:research_paper |
| gptkbp:application | machine translation, natural language generation, text comprehension |
| gptkbp:arXivID | 1910.13461 |
| gptkbp:author | gptkb:Mike_Lewis, gptkb:Yinhan_Liu, gptkb:Naman_Goyal, gptkb:Marjan_Ghazvininejad, gptkb:Abdelrahman_Mohamed, gptkb:Omer_Levy, gptkb:Veselin_Stoyanov, gptkb:Luke_Zettlemoyer |
| gptkbp:citation | gptkb:Transformer, gptkb:BERT, gptkb:GPT |
| gptkbp:focusesOn | denoising autoencoder, sequence-to-sequence pre-training |
| gptkbp:proposes | BART model |
| gptkbp:publicationYear | 2019 |
| gptkbp:publishedIn | gptkb:arXiv |
| gptkbp:url | https://arxiv.org/abs/1910.13461 |
| gptkbp:bfsParent | gptkb:Yinhan_Liu, gptkb:Naman_Goyal |
| gptkbp:bfsLayer | 8 |
| https://www.w3.org/2000/01/rdf-schema#label | BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension |
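The `gptkbp:focusesOn` statements summarize BART's training objective: a denoising autoencoder that corrupts text and trains a sequence-to-sequence model to reconstruct the original. A minimal sketch of that behavior, assuming the Hugging Face `transformers` library and the public `facebook/bart-large` checkpoint (neither appears in the statements above):

```python
# Sketch of BART as a denoising autoencoder, assuming the Hugging Face
# `transformers` library and the `facebook/bart-large` checkpoint
# (assumptions; not part of the KB statements above).
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")

# Corrupt the input with BART's mask token; the seq2seq model was
# pre-trained to reconstruct the original, uncorrupted text.
text = "BART is a denoising <mask> for pretraining sequence-to-sequence models."
inputs = tokenizer(text, return_tensors="pt")

# Decode a reconstruction of the corrupted input.
output_ids = model.generate(inputs["input_ids"], max_length=40, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The paper itself describes several noising schemes beyond single-token masking (token deletion, text infilling, sentence permutation, and document rotation); the snippet above only illustrates the mask-and-reconstruct pattern.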