mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer
URI:
https://gptkb.org/entity/mT5:_A_Massively_Multilingual_Pre-trained_Text-to-Text_Transformer
GPTKB entity
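The entity URI above can be dereferenced directly. Below is a minimal Python sketch using the requests library; whether gptkb.org serves a machine-readable RDF serialisation (Turtle or JSON-LD) via content negotiation is an assumption, so the snippet simply reports whatever content type the server returns.

    import requests

    # Entity URI from this page; the Accept header asks for RDF serialisations
    # first and falls back to whatever the server provides (likely HTML).
    URI = ("https://gptkb.org/entity/"
           "mT5:_A_Massively_Multilingual_Pre-trained_Text-to-Text_Transformer")

    resp = requests.get(
        URI,
        headers={"Accept": "text/turtle, application/ld+json;q=0.9, */*;q=0.1"},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.headers.get("Content-Type"))
    print(resp.text[:500])  # inspect the first part of the returned document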
Statements (31)
gptkbp:instanceOf:
    large language model
    pre-trained model
gptkbp:architecture: encoder-decoder
gptkbp:arXivID: 2010.11934
gptkbp:author:
    gptkb:Adam_Roberts
    gptkb:Colin_Raffel
    gptkb:Katherine_Lee
    gptkb:Michael_Matena
    gptkb:Noam_Shazeer
    gptkb:Peter_J._Liu
    gptkb:Sharan_Narang
    gptkb:Wei_Li
    gptkb:Yanqi_Zhou
gptkbp:availableOn: gptkb:Hugging_Face
gptkbp:basedOn: gptkb:T5_architecture
gptkbp:developedBy: gptkb:Google_Research
gptkbp:format: gptkb:text
https://www.w3.org/2000/01/rdf-schema#label: mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer
gptkbp:license: Apache 2.0
gptkbp:memiliki_tugas (Indonesian: "has task"): text-to-text
gptkbp:notableFor: multilingual capabilities
gptkbp:notablePublication: gptkb:mT5:_A_Massively_Multilingual_Pre-trained_Text-to-Text_Transformer
gptkbp:openSource: yes
gptkbp:parameter: 300M to 13B
gptkbp:preTrainedOn: mC4 dataset
gptkbp:relatedTo:
    gptkb:T5
    gptkb:mC4
gptkbp:releaseYear: 2020
gptkbp:supportsLanguage: over 100 languages
gptkbp:bfsParent: gptkb:mT5
gptkbp:bfsLayer: 8
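Since the statements above describe mT5 as an encoder-decoder, text-to-text model available on Hugging Face in sizes from roughly 300M to 13B parameters, here is a minimal sketch of loading it with the transformers library. The checkpoint name google/mt5-small and the example input are illustrative assumptions; the public mT5 checkpoints are pre-trained on mC4 only, so they generally need fine-tuning before producing useful task output.

    from transformers import AutoTokenizer, MT5ForConditionalGeneration

    # Checkpoint name is an assumption; other sizes (base, large, xl, xxl)
    # cover the ~300M to 13B parameter range listed above.
    checkpoint = "google/mt5-small"
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = MT5ForConditionalGeneration.from_pretrained(checkpoint)

    # Text-to-text usage: every task is framed as mapping an input string
    # to an output string through the encoder-decoder.
    inputs = tokenizer(
        "summarize: mT5 is a multilingual variant of T5 pre-trained on mC4.",
        return_tensors="pt",
    )
    output_ids = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))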