Statements (22)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | gptkb:natural_language_processing_technique |
| gptkbp:category | gptkb:artificial_intelligence, gptkb:machine_learning, deep learning |
| gptkbp:input | text sequence |
| gptkbp:maskToken | [MASK] |
| gptkbp:objective | predict masked words |
| gptkbp:output | predicted masked tokens |
| gptkbp:proposedBy | gptkb:BERT:_Pre-training_of_Deep_Bidirectional_Transformers_for_Language_Understanding, gptkb:Jacob_Devlin |
| gptkbp:relatedTo | self-supervised learning, language model pretraining, transformer models |
| gptkbp:usedFor | pretraining language models |
| gptkbp:usedIn | gptkb:BERT, gptkb:ALBERT, gptkb:DistilBERT, gptkb:RoBERTa |
| gptkbp:yearProposed | 2018 |
| gptkbp:bfsParent | gptkb:Language_modeling |
| gptkbp:bfsLayer | 7 |
| https://www.w3.org/2000/01/rdf-schema#label | Masked language modeling |
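The statements above describe the masking objective used to pretrain models such as BERT: corrupt some input tokens with `[MASK]` and train the model to predict the originals. A minimal sketch of the corruption step, assuming the 15% selection rate and 80/10/10 replacement split reported for BERT (Devlin et al., 2018); the function and variable names here are illustrative, not from any particular library:

```python
import random

MASK_TOKEN = "[MASK]"

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=0):
    """BERT-style corruption: select ~mask_prob of positions as
    prediction targets; of those, 80% become [MASK], 10% become a
    random vocabulary token, and 10% are left unchanged.
    Returns (corrupted tokens, {position: original token})."""
    rng = random.Random(seed)
    corrupted = list(tokens)
    targets = {}  # position -> token the model must predict
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok
            r = rng.random()
            if r < 0.8:
                corrupted[i] = MASK_TOKEN       # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted[i] = rng.choice(vocab)  # 10%: random token
            # else: 10%: keep the original token
    return corrupted, targets

tokens = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(tokens))
corrupted, targets = mask_tokens(tokens, vocab, mask_prob=0.3, seed=42)
```

During pretraining, the loss is computed only at the positions recorded in `targets`; the other positions contribute context but no gradient signal, which is what makes the objective self-supervised.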