T5 architecture

GPTKB entity

Statements (46)
Predicate Object
gptkbp:instanceOf transformer model
gptkbp:appliesTo natural language processing
gptkbp:basedOn text-to-text framework
gptkbp:developedBy gptkb:Google_Research
gptkbp:hasApplication healthcare
education
finance
customer service
entertainment
gptkbp:describedIn T5:_Exploring_the_Limits_of_Transfer_Learning_with_a_Unified_Text-to-Text_Transformer
gptkbp:hasVariants gptkb:T5-large
gptkb:T5-small
gptkb:T5-3B
gptkb:T5-11B
gptkb:T5-base
https://www.w3.org/2000/01/rdf-schema#label T5 architecture
gptkbp:introduced 2019
gptkbp:isBasedOn encoder-decoder architecture
gptkbp:isCompatibleWith GPUs
TPUs
gptkbp:isEvaluatedBy gptkb:GLUE_benchmark
gptkb:WMT_benchmark
SQuAD benchmark
MNLI benchmark
TREC benchmark
gptkbp:isInfluencedBy GPT-2
BERT
Transformer architecture
gptkbp:isKnownFor flexibility in task formulation
gptkbp:isNotableFor state-of-the-art results on many NLP benchmarks
gptkbp:isAvailableOn Hugging Face (see the loading sketch below)
gptkbp:isPartOf NLP toolkits
gptkbp:isSuitableFor scaling to larger datasets
gptkbp:isTrainedOn supervised tasks
unsupervised tasks
gptkbp:isUsedIn question answering
chatbots
text summarization
text classification
translation tasks (see the task-prefix example below)
gptkbp:outputs text sequences
gptkbp:isPretrainedOn Colossal Clean Crawled Corpus (C4) (see the span-corruption sketch below)
gptkbp:addresses various NLP tasks
gptkbp:supports multi-task learning
gptkbp:uses transfer learning
gptkbp:utilizes self-attention mechanism
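
Illustrative code sketches

The "isBasedOn encoder-decoder architecture" and "utilizes self-attention mechanism" statements reduce to the dot-product attention below. A minimal PyTorch sketch, assuming torch is installed; T5's actual attention also adds a learned relative-position bias to the scores and folds the usual scaling into its weight initialization, details elided here.

    import math

    import torch

    def attention(q, k, v):
        # Standard dot-product attention, the core op repeated in every
        # T5 encoder and decoder block. T5 additionally adds a learned
        # relative-position bias to `scores` (elided in this sketch).
        scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
        return scores.softmax(dim=-1) @ v

    q = k = v = torch.randn(1, 5, 64)  # (batch, sequence length, head dim)
    print(attention(q, k, v).shape)    # torch.Size([1, 5, 64])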
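
The checkpoints listed under "hasVariants" are the ones published on Hugging Face. A minimal loading sketch, assuming the transformers, sentencepiece, and torch packages are installed; the hub names corresponding to the variants above are "t5-small", "t5-base", "t5-large", "t5-3b", and "t5-11b".

    from transformers import T5ForConditionalGeneration, T5Tokenizer

    # Load the smallest listed variant; swap in "t5-base", "t5-large",
    # "t5-3b", or "t5-11b" for the larger checkpoints.
    tokenizer = T5Tokenizer.from_pretrained("t5-small")
    model = T5ForConditionalGeneration.from_pretrained("t5-small")
    print(model.config.num_layers, model.config.d_model)  # 6 512 for t5-small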
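
The text-to-text framing behind "basedOn text-to-text framework" and the task list under "isUsedIn" means every task is posed as string-to-string, distinguished only by a task prefix. The prefixes below ("translate English to German:", "summarize:") are the ones used in the original paper; the snippet reuses the tokenizer and model from the loading sketch above.

    # One model, many tasks: only the task prefix changes.
    prompts = [
        "translate English to German: The house is wonderful.",
        "summarize: T5 casts translation, summarization, classification, "
        "and question answering as the same text-to-text problem.",
    ]
    for prompt in prompts:
        inputs = tokenizer(prompt, return_tensors="pt")
        outputs = model.generate(**inputs, max_new_tokens=40)
        print(tokenizer.decode(outputs[0], skip_special_tokens=True))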
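
The pretraining statements ("isTrainedOn unsupervised tasks", "isPretrainedOn ... C4") refer to T5's span-corruption objective: spans of the input are replaced by sentinel tokens and the target reconstructs the dropped spans. A toy sketch of that idea; the function name and the explicit spans are illustrative, since the paper samples spans randomly (about 15% of tokens dropped, mean span length 3).

    def span_corrupt(tokens, spans):
        # Replace each (start, length) span with a <extra_id_N> sentinel,
        # as T5 does, and collect the dropped spans as the target.
        spans = dict(spans)
        inputs, targets = [], []
        i = sentinel = 0
        while i < len(tokens):
            if i in spans:
                mark = f"<extra_id_{sentinel}>"
                inputs.append(mark)
                targets.append(mark)
                targets.extend(tokens[i:i + spans[i]])  # the dropped span
                i += spans[i]
                sentinel += 1
            else:
                inputs.append(tokens[i])
                i += 1
        targets.append(f"<extra_id_{sentinel}>")  # final sentinel ends the target
        return " ".join(inputs), " ".join(targets)

    src = "Thank you for inviting me to your party last week".split()
    inp, tgt = span_corrupt(src, [(2, 2), (8, 1)])  # drop "for inviting", "last"
    print(inp)  # Thank you <extra_id_0> me to your party <extra_id_1> week
    print(tgt)  # <extra_id_0> for inviting <extra_id_1> last <extra_id_2>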