Statements (17)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | gptkb:academic_journal |
| gptkbp:affiliation | gptkb:Google_AI_Language |
| gptkbp:arXivID | 1810.04805 |
| gptkbp:author | gptkb:Jacob_Devlin, gptkb:Kenton_Lee, gptkb:Ming-Wei_Chang, gptkb:Kristina_Toutanova |
| gptkbp:citation | over 50,000 |
| gptkbp:contribution | Introduced BERT, a pre-trained deep bidirectional transformer model for NLP |
| gptkbp:field | gptkb:Natural_Language_Processing |
| gptkbp:influenced | transformer-based NLP models |
| gptkbp:publicationYear | 2018 |
| gptkbp:publishedIn | gptkb:arXiv |
| gptkbp:title | gptkb:BERT:_Pre-training_of_Deep_Bidirectional_Transformers_for_Language_Understanding |
| gptkbp:bfsParent | gptkb:Multilingual_BERT |
| gptkbp:bfsLayer | 8 |
| https://www.w3.org/2000/01/rdf-schema#label | Devlin et al., 2018 |
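The statements above are predicate/object pairs, with one multi-valued predicate (`gptkbp:author`). A minimal sketch of how the entry could be held in plain Python, keeping the `gptkb:`/`gptkbp:` prefixes as opaque strings (no RDF library is assumed; the dict layout is illustrative, not the site's actual data model):

```python
# Statements for the "Devlin et al., 2018" entry, keyed by predicate.
# Every predicate maps to a list of objects so that multi-valued
# predicates (here gptkbp:author) need no special case.
statements = {
    "gptkbp:instanceOf": ["gptkb:academic_journal"],
    "gptkbp:affiliation": ["gptkb:Google_AI_Language"],
    "gptkbp:arXivID": ["1810.04805"],
    "gptkbp:author": [
        "gptkb:Jacob_Devlin",
        "gptkb:Kenton_Lee",
        "gptkb:Ming-Wei_Chang",
        "gptkb:Kristina_Toutanova",
    ],
    "gptkbp:citation": ["over 50,000"],
    "gptkbp:contribution": [
        "Introduced BERT, a pre-trained deep bidirectional "
        "transformer model for NLP"
    ],
    "gptkbp:field": ["gptkb:Natural_Language_Processing"],
    "gptkbp:influenced": ["transformer-based NLP models"],
    "gptkbp:publicationYear": ["2018"],
    "gptkbp:publishedIn": ["gptkb:arXiv"],
    "gptkbp:title": [
        "gptkb:BERT:_Pre-training_of_Deep_Bidirectional_"
        "Transformers_for_Language_Understanding"
    ],
    "gptkbp:bfsParent": ["gptkb:Multilingual_BERT"],
    "gptkbp:bfsLayer": ["8"],
    "https://www.w3.org/2000/01/rdf-schema#label": ["Devlin et al., 2018"],
}

# One statement per (predicate, object) pair, matching the
# "Statements (17)" count in the header.
total = sum(len(objs) for objs in statements.values())
print(total)  # 17
```

Counting one statement per (predicate, object) pair is why the header reads 17 although only 14 predicates appear: `gptkbp:author` contributes four statements.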