mLED

GPTKB entity

Statements (23)
Predicate Object
gptkbp:instanceOf large language model
gptkbp:architecture encoder-decoder
gptkbp:author gptkb:Siddharth_Batra
gptkbp:author gptkb:Angela_Fan
gptkbp:author gptkb:Patrick_Lewis
gptkbp:author Other Facebook AI Research authors
gptkbp:basedOn Transformer architecture
gptkbp:designedFor multilingual document summarization
gptkbp:developedBy gptkb:Facebook_AI_Research
https://www.w3.org/2000/01/rdf-schema#label mLED
gptkbp:length 16384 tokens
gptkbp:notableFeature handles long input sequences
gptkbp:notableFeature multilingual capability
gptkbp:notablePublication mLED: A Massively Multilingual Pre-trained Encoder-Decoder for Long-Form Document Summarization
gptkbp:notablePublication https://arxiv.org/abs/2007.12626
gptkbp:openSource true
gptkbp:releaseYear 2020
gptkbp:supportsLanguage 101 languages
gptkbp:trainer large-scale multilingual datasets
gptkbp:usedFor multilingual natural language processing
gptkbp:usedFor long document summarization
gptkbp:bfsParent gptkb:MicroLED
gptkbp:bfsLayer 6
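
The statements above are predicate-object pairs about the entity mLED, i.e. RDF-style triples. Below is a minimal sketch of how a few of them could be loaded and queried with rdflib; the namespace IRIs used for the gptkb: and gptkbp: prefixes are placeholders assumed for illustration and are not taken from this page.

# Minimal sketch: model some of the statements above as RDF triples with rdflib.
# The gptkb:/gptkbp: namespace IRIs are assumed placeholders, not GPTKB's real IRIs.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDFS, XSD

GPTKB = Namespace("https://example.org/gptkb/")    # assumed entity namespace
GPTKBP = Namespace("https://example.org/gptkbp/")  # assumed predicate namespace

g = Graph()
g.bind("gptkb", GPTKB)
g.bind("gptkbp", GPTKBP)
g.bind("rdfs", RDFS)

mled = GPTKB["mLED"]

# A few of the listed statements, expressed as (subject, predicate, object).
g.add((mled, RDFS.label, Literal("mLED")))
g.add((mled, GPTKBP.instanceOf, Literal("large language model")))
g.add((mled, GPTKBP.architecture, Literal("encoder-decoder")))
g.add((mled, GPTKBP.developedBy, GPTKB["Facebook_AI_Research"]))
g.add((mled, GPTKBP.author, GPTKB["Angela_Fan"]))
g.add((mled, GPTKBP.supportsLanguage, Literal("101 languages")))
g.add((mled, GPTKBP.releaseYear, Literal(2020, datatype=XSD.gYear)))
g.add((mled, GPTKBP.openSource, Literal(True)))

# List every statement about mLED, mirroring the Predicate/Object table above.
for predicate, obj in g.predicate_objects(subject=mled):
    print(predicate, obj)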