Statements (25)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | gptkb:large_language_model |
| gptkbp:architecture | encoder-decoder |
| gptkbp:author | gptkb:Siddharth_Batra, gptkb:Angela_Fan, gptkb:Patrick_Lewis, other Facebook AI Research authors |
| gptkbp:basedOn | Transformer architecture |
| gptkbp:designedFor | multilingual document summarization |
| gptkbp:developedBy | gptkb:Facebook_AI_Research |
| gptkbp:length | 16384 tokens |
| gptkbp:notableFeature | handles long input sequences, multilingual capability |
| gptkbp:notablePublication | mLED: A Massively Multilingual Pre-trained Encoder-Decoder for Long-Form Document Summarization (https://arxiv.org/abs/2007.12626) |
| gptkbp:openSource | true |
| gptkbp:releaseYear | 2020 |
| gptkbp:supportsLanguage | 101 languages |
| gptkbp:trainer | large-scale multilingual datasets |
| gptkbp:usedFor | multilingual natural language processing, long document summarization |
| gptkbp:bfsParent | gptkb:MicroLEDs, gptkb:Micro-LED, gptkb:MicroLED |
| gptkbp:bfsLayer | 8 |
| https://www.w3.org/2000/01/rdf-schema#label | mLED |