Statements (52)
| Predicate | Object |
|---|---|
| gptkbp:instance_of | gptkb:Transformers |
| gptkbp:application | language translation, dialog systems, sentiment analysis, chatbots, information retrieval, summarization, content generation, text completion, knowledge extraction |
| gptkbp:architecture | encoder-decoder |
| gptkbp:based_on | gptkb:Transformers |
| gptkbp:community_support | gptkb:Hugging_Face_Transformers, gptkb:Py_Torch_Hub, gptkb:Tensor_Flow_Hub |
| gptkbp:developed_by | gptkb:Google_Research |
| gptkbp:has_achieved | state-of-the-art results |
| https://www.w3.org/2000/01/rdf-schema#label | T5 family |
| gptkbp:introduced_in | gptkb:2019 |
| gptkbp:is_trained_in | C4 dataset |
| gptkbp:key_feature | flexibility, scalability, transfer learning, multi-task learning, large-scale training, text-to-text framework, open-source availability, pretraining and fine-tuning |
| gptkbp:language | supervised learning, unsupervised learning |
| gptkbp:mission | gptkb:translator, question answering, text generation, text classification |
| gptkbp:model | gptkb:T5-large, gptkb:T5-small, gptkb:T5-base, gptkb:T5-11_B, gptkb:T5-3_B |
| gptkbp:notable_feature | high performance on benchmarks, adaptability to new tasks, extensive pretraining, flexible input-output format, robustness to various tasks |
| gptkbp:performance | gptkb:GLUE, gptkb:Super_GLUE |
| gptkbp:supports | multiple languages |
| gptkbp:tuning | task-specific datasets |
| gptkbp:used_for | natural language processing |
| gptkbp:bfsParent | gptkb:T5-large, gptkb:T5-small |
| gptkbp:bfsLayer | 5 |
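Since the statements above reference the Hugging Face Transformers integration, the text-to-text framework, and tasks such as translation and summarization, here is a minimal sketch of how one of the listed T5 checkpoints is typically loaded and queried. The checkpoint name (`t5-small`) and the translation prefix are illustrative assumptions, not part of the statements above.

```python
# Minimal sketch: text-to-text inference with a T5 checkpoint via the
# Hugging Face Transformers library. Assumes "t5-small" as the checkpoint;
# any of the listed sizes (t5-base, t5-large, t5-3b, t5-11b) could be used.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# T5 frames every task as text-to-text: the task is selected by a text prefix.
inputs = tokenizer(
    "translate English to German: The house is wonderful.",
    return_tensors="pt",
)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same pattern applies to the other listed applications (e.g. a `summarize:` prefix for summarization); only the input prefix and, if desired, the fine-tuned checkpoint change.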