Statements (22)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | gptkb:large_language_model |
| gptkbp:architecture | gptkb:transformer |
| gptkbp:context | 2048 tokens |
| gptkbp:developedBy | gptkb:BigScience_Workshop |
| gptkbp:hasModel | decoder-only transformer |
| gptkbp:intendedUse | gptkb:research |
| gptkbp:license | gptkb:Responsible_AI_License |
| gptkbp:notableFor | first open multilingual LLM over 100B parameters |
| gptkbp:openSource | true |
| gptkbp:parameter | 176 billion |
| gptkbp:pretrained | true |
| gptkbp:releaseDate | 2022 |
| gptkbp:supportsLanguage | true |
| gptkbp:supportsTextCompletion | true |
| gptkbp:supportsTextGeneration | true |
| gptkbp:trainingDataLanguages | 46 |
| gptkbp:trainingDataSize | 1.6 terabytes |
| gptkbp:trainingInfrastructure | Jean Zay supercomputer |
| gptkbp:website | https://bigscience.huggingface.co/blog/bloom |
| gptkbp:bfsParent | gptkb:BigScience |
| gptkbp:bfsLayer | 7 |
| https://www.w3.org/2000/01/rdf-schema#label | BLOOM-176B |
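The statements above are predicate–object pairs about a single subject (BLOOM-176B), in the style of RDF triples. A minimal sketch in plain Python of how such a statement set could be held and queried; the dictionary layout and the `object_of` helper are illustrative assumptions, not part of any GPTKB API, and only a few of the 22 statements are included:

```python
# Hypothetical in-memory representation of the statement table:
# predicate -> object, all for the implicit subject BLOOM-176B.
statements = {
    "gptkbp:instanceOf": "gptkb:large_language_model",
    "gptkbp:developedBy": "gptkb:BigScience_Workshop",
    "gptkbp:parameter": "176 billion",
    "gptkbp:trainingDataSize": "1.6 terabytes",
    "https://www.w3.org/2000/01/rdf-schema#label": "BLOOM-176B",
}

def object_of(predicate: str):
    """Return the object for a predicate, or None if the statement is absent."""
    return statements.get(predicate)

print(object_of("https://www.w3.org/2000/01/rdf-schema#label"))  # BLOOM-176B
```

In a full RDF setting each entry would be a `(subject, predicate, object)` triple, with `gptkb:`/`gptkbp:` as namespace prefixes and the `rdf-schema#label` entry supplying the human-readable name.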