Statements (31)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | gptkb:large_language_model, gptkb:model |
| gptkbp:activatedBy | gelu |
| gptkbp:architecture | gptkb:transformation |
| gptkbp:availableOn | gptkb:Hugging_Face_Model_Hub |
| gptkbp:basedOn | gptkb:GPT-2 |
| gptkbp:compressionRatio | 0.66 |
| gptkbp:contextWindowSize | 1024 tokens |
| gptkbp:developedBy | gptkb:Hugging_Face |
| gptkbp:hasDistillery | gptkb:GPT-2 |
| gptkbp:hiddenSize | 768 |
| gptkbp:input | gptkb:text |
| gptkbp:isAutoregressive | true |
| gptkbp:isSmallerVersionOf | gptkb:GPT-2 |
| gptkbp:isUnidirectional | true |
| gptkbp:language | English |
| gptkbp:level | 6 |
| gptkbp:license | gptkb:MIT_License |
| gptkbp:memiliki_tugas | text generation, language modeling |
| gptkbp:notableFauna | 12 |
| gptkbp:openSource | true |
| gptkbp:output | gptkb:text |
| gptkbp:parameter | 82 million |
| gptkbp:pretrained | true |
| gptkbp:releaseYear | 2019 |
| gptkbp:supportsFineTuning | true |
| gptkbp:tokenizerType | byte pair encoding |
| gptkbp:trainer | gptkb:OpenWebText |
| gptkbp:usesAttentionMechanism | true |
| https://www.w3.org/2000/01/rdf-schema#label | DistilGPT2 |
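The statements above say DistilGPT2 is available on the Hugging Face Model Hub, is pretrained for autoregressive text generation, and uses a hidden size of 768 with a 1024-token context window. A minimal sketch of checking those values and running generation with the `transformers` library, assuming the standard `distilgpt2` model ID on the Hub:

```python
# Minimal sketch: load DistilGPT2 from the Hugging Face Model Hub and generate text.
# Assumes the `transformers` package is installed and the Hub model ID is "distilgpt2".
from transformers import AutoConfig, pipeline

# Inspect configuration values corresponding to the statements above.
config = AutoConfig.from_pretrained("distilgpt2")
print(config.n_embd)       # hidden size: 768
print(config.n_positions)  # context window: 1024 tokens

# Autoregressive text generation using the byte-pair-encoding tokenizer bundled with the model.
generator = pipeline("text-generation", model="distilgpt2")
result = generator("DistilGPT2 is a distilled version of GPT-2 that", max_new_tokens=40)
print(result[0]["generated_text"])
```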