Generative Pre-trained Transformer 2

GPTKB entity

Statements (46)
Predicate Object
gptkbp:instanceOf large language model
gptkbp:abbreviation gptkb:GPT-2
gptkbp:architecture gptkb:transformer
gptkbp:author gptkb:Ilya_Sutskever
gptkb:Alec_Radford
gptkb:Dario_Amodei
gptkb:David_Luan
gptkb:Rewon_Child
gptkb:Jeffrey_Wu
gptkbp:basedOn transformer architecture
gptkbp:contextWindowSize 1024 tokens
gptkbp:controversy initially withheld due to misuse concerns
gptkbp:developedBy gptkb:OpenAI
gptkbp:eventuallyReleased November 2019
gptkbp:hasVariant 1.5B
117M
345M
762M
rdfs:label Generative Pre-trained Transformer 2
gptkbp:input gptkb:text
gptkbp:language English
gptkbp:license OpenAI license
gptkbp:hasTask text generation
text completion
language modeling
gptkbp:notableFor controversy over release
large-scale language modeling
demonstrating zero-shot task transfer
influencing subsequent language models
scaling up transformer models
gptkbp:notablePublication gptkb:Language_Models_are_Unsupervised_Multitask_Learners
gptkbp:openSource partially
gptkbp:output gptkb:text
gptkbp:parameter 1.5 billion
gptkbp:predecessor gptkb:Generative_Pre-trained_Transformer
gptkbp:releaseDate February 2019
gptkbp:successor gptkb:Generative_Pre-trained_Transformer_3
gptkbp:trainedOn gptkb:WebText
gptkbp:trainingMethod unsupervised learning
gptkbp:usedFor natural language processing
translation
question answering
text classification
text summarization
gptkbp:bfsParent gptkb:Generative_Pre-trained_Transformer_1
gptkbp:bfsLayer 7
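
The predicates above follow an RDF-style prefix convention (gptkbp: for properties, gptkb: for entities, plus rdfs:label). As an illustrative sketch only, not part of the KB entry, a few of these statements could be expressed as triples with Python's rdflib; the namespace URIs below are placeholders, not GPTKB's actual URIs.

```python
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDFS

# Placeholder namespace URIs (assumptions, not GPTKB's real ones)
GPTKB = Namespace("https://example.org/gptkb/entity/")
GPTKBP = Namespace("https://example.org/gptkb/property/")

g = Graph()
gpt2 = GPTKB["Generative_Pre-trained_Transformer_2"]

# A few of the statements above, expressed as triples
g.add((gpt2, RDFS.label, Literal("Generative Pre-trained Transformer 2")))
g.add((gpt2, GPTKBP.developedBy, GPTKB["OpenAI"]))
g.add((gpt2, GPTKBP.contextWindowSize, Literal("1024 tokens")))
g.add((gpt2, GPTKBP.successor, GPTKB["Generative_Pre-trained_Transformer_3"]))

print(g.serialize(format="turtle"))
```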
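
Separately, the gptkbp:hasVariant, gptkbp:contextWindowSize, and gptkbp:hasTask statements can be compared against a public implementation. A minimal sketch, assuming the Hugging Face transformers library and its "gpt2" checkpoint family; the mapping of checkpoint names to the 117M/345M/762M/1.5B variants is an assumption of this example.

```python
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# "gpt2" is assumed to be the 117M-parameter variant; "gpt2-medium",
# "gpt2-large" and "gpt2-xl" are assumed to correspond to 345M, 762M and 1.5B.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

print(model.config.n_ctx)  # 1024, matching gptkbp:contextWindowSize

# Text generation / completion, as listed under gptkbp:hasTask
inputs = tokenizer("Language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=True, top_k=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Swapping in a larger checkpoint name changes only the parameter count; the 1024-token context window is shared by all four variants.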