gptkbp:instance_of
|
gptkb:Transformers
|
gptkbp:architecture
|
gptkb:Transformers
|
gptkbp:can_perform
|
translation
question answering
text generation
text summarization
text classification
|
gptkbp:developed_by
|
gptkb:Google_Research
|
gptkbp:focus_area
|
gptkb:Natural_Language_Processing
|
gptkbp:has_achieved
|
state-of-the-art results on various benchmarks
|
gptkbp:has_dropout_rate
|
0.1
|
gptkbp:has_feed_forward_size
|
4096
|
gptkbp:has_hidden_size
|
1024
|
gptkbp:has_layer_count
|
24
|
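The hyperparameters recorded above can be gathered into a small configuration sketch. The field names below (`d_model`, `d_ff`, `num_layers`, `dropout`) are conventional Transformer terminology assumed for illustration; only the numeric values come from this record.

```python
# Sketch: T5-Large hyperparameters as listed in this record.
# Key names are conventional Transformer terms, not KB properties.
T5_LARGE_CONFIG = {
    "d_model": 1024,   # has_hidden_size
    "d_ff": 4096,      # has_feed_forward_size
    "num_layers": 24,  # has_layer_count
    "dropout": 0.1,    # has_dropout_rate
}

# The feed-forward width is 4x the hidden size, the usual Transformer ratio.
assert T5_LARGE_CONFIG["d_ff"] == 4 * T5_LARGE_CONFIG["d_model"]
```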
https://www.w3.org/2000/01/rdf-schema#label
|
T5-Large
|
gptkbp:input_output
|
gptkb:text
|
gptkbp:is_available_on
|
gptkb:Hugging_Face_Model_Hub
|
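Since the checkpoint is hosted on the Hugging Face Model Hub, it can be loaded with the `transformers` library. A minimal sketch, assuming the hub identifier `t5-large`; the download needs network access and the `transformers` package, so it is wrapped in a function rather than executed directly.

```python
# Sketch: loading T5-Large from the Hugging Face Model Hub.
MODEL_ID = "t5-large"  # hub identifier for this checkpoint (assumption)

def load_t5(model_id: str = MODEL_ID):
    """Download the tokenizer and model weights from the Hub."""
    from transformers import T5ForConditionalGeneration, T5Tokenizer
    tokenizer = T5Tokenizer.from_pretrained(model_id)
    model = T5ForConditionalGeneration.from_pretrained(model_id)
    return tokenizer, model
```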
gptkbp:is_compared_to
|
gptkb:GPT-3
gptkb:BERT
gptkb:XLNet
|
gptkbp:is_evaluated_by
|
gptkb:SQu_AD
gptkb:MNLI
gptkb:WMT
RTE
QQP
MRPC
|
gptkbp:is_optimized_for
|
text-to-text tasks
|
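"Text-to-text" here means every task is expressed as plain text in and plain text out, with a task prefix prepended to the input. A minimal sketch of that format; the prefixes shown are the ones used in the original T5 setup, and the helper function is purely illustrative.

```python
# Sketch: T5 casts every task as text-to-text by prepending a
# task prefix to the input string. Prefixes follow the original
# T5 setup; the function name is an illustrative assumption.
def to_text_to_text(task: str, text: str) -> str:
    prefixes = {
        "summarize": "summarize: ",
        "translate_en_de": "translate English to German: ",
        "cola": "cola sentence: ",  # grammaticality classification
    }
    return prefixes[task] + text

# Example: the same model handles both of these as generation tasks.
print(to_text_to_text("summarize", "T5 treats every NLP problem as text generation."))
print(to_text_to_text("translate_en_de", "The house is small."))
```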
gptkbp:is_part_of
|
gptkb:T5_family
NLP models
|
gptkbp:is_supported_by
|
gptkb:Tensor_Flow
gptkb:Py_Torch
|
gptkbp:is_tasked_with
|
gptkb:Natural_Language_Processing
|
gptkbp:is_trained_in
|
unsupervised learning
C4 dataset
|
gptkbp:is_used_for
|
content creation
sentiment analysis
chatbots
data augmentation
virtual assistants
|
gptkbp:is_used_in
|
research and industry
|
gptkbp:model
|
770 million parameters
|
gptkbp:performance
|
BLEU score
ROUGE score
NLP tasks
GLUE score
|
gptkbp:release_year
|
gptkb:2019
|
gptkbp:requires
|
GPU for training
|
gptkbp:supports
|
multiple languages
|
gptkbp:tuning
|
possible for specific tasks
|
gptkbp:uses
|
transfer learning
|
gptkbp:bfsParent
|
gptkb:Noam_Shazeer
|
gptkbp:bfsLayer
|
6
|