MPT (Mosaic Pretrained Transformer)

GPTKB entity

Statements (36)
Predicate Object
gptkbp:instanceOf large language model
gptkbp:architecture gptkb:transformer
gptkbp:availableOn gptkb:Hugging_Face
gptkbp:context 8k tokens
65k tokens (StoryWriter variant)
gptkbp:developedBy gptkb:MosaicML
gptkbp:fineTunedWith conversational AI
instruction following
story writing
gptkbp:hasVariant gptkb:MPT-30B
gptkb:MPT-7B
gptkb:MPT-7B-Chat
gptkb:MPT-7B-Instruct
gptkb:MPT-7B-StoryWriter
https://www.w3.org/2000/01/rdf-schema#label MPT (Mosaic Pretrained Transformer)
gptkbp:language English
gptkbp:license Apache 2.0
gptkbp:notableFeature long context window
efficient memory usage
open weights
scalable training
gptkbp:openSource true
gptkbp:optimizedFor gptkb:GPU
gptkb:TPU
gptkbp:parameter 30 billion
7 billion
gptkbp:releaseYear 2023
gptkbp:supports chat
code generation
text generation
gptkbp:trainedOn public datasets
gptkbp:usedFor research
commercial applications
gptkbp:bfsParent gptkb:Databricks_MosaicML
gptkb:MosaicML
gptkbp:bfsLayer 6
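The listing above is a flattened subject–predicate–object table: a line beginning with a `gptkbp:` predicate starts a new statement, and bare lines reuse the most recent predicate (e.g. the extra context length and the variant entries). A minimal sketch of expanding such a flattened list into full triples — the subject identifier `gptkb:MPT` and the continuation rule are assumptions for illustration, not part of the GPTKB format specification:

```python
# Expand a flattened predicate/object listing into (subject, predicate, object)
# triples. Assumed rule: lines starting with "gptkbp:" open a new statement;
# any other non-empty line is a continuation object for the last predicate.

SUBJECT = "gptkb:MPT"  # hypothetical subject IRI for this entity page


def expand_statements(lines):
    triples = []
    predicate = None
    for line in lines:
        line = line.strip()
        if not line:
            continue
        if line.startswith("gptkbp:"):
            # New statement: split off the predicate, rest is the object.
            predicate, _, obj = line.partition(" ")
            triples.append((SUBJECT, predicate, obj))
        elif predicate is not None:
            # Continuation line: reuse the previous predicate.
            triples.append((SUBJECT, predicate, line))
    return triples


# A few statements taken verbatim from the listing above:
statements = [
    "gptkbp:context 8k tokens",
    "65k tokens (StoryWriter variant)",
    "gptkbp:hasVariant gptkb:MPT-7B",
    "gptkb:MPT-7B-StoryWriter",
]

for s, p, o in expand_statements(statements):
    print(s, p, o)
```

Note that values such as the `rdf-schema#label` line use a full IRI rather than a `gptkbp:` prefix, so a complete parser would also need an IRI-aware predicate test; the sketch keeps only the prefix rule for clarity.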