gptkbp:notableFor
|
text completion
few-shot learning
multi-task learning
zero-shot learning
controversy over its release
large-scale unsupervised learning
coherent long-form text generation
delayed full model release due to misuse concerns
demonstrated risks of large language models
influenced subsequent language models
performing many tasks without fine-tuning
|