GPTKB
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
URI:
https://gptkb.org/entity/BERT:_Pre-training_of_Deep_Bidirectional_Transformers_for_Language_Understanding
GPTKB entity
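Note: the sketch below shows one way this entity could be fetched programmatically from the URI above. Whether gptkb.org returns a machine-readable representation such as Turtle for this URL, rather than only HTML, is an assumption and is not stated on this page.

# Hypothetical retrieval of this GPTKB entity page. The Accept header for
# Turtle is an assumption; the server may only return HTML for this URL.
import requests

ENTITY_URI = ("https://gptkb.org/entity/"
              "BERT:_Pre-training_of_Deep_Bidirectional_Transformers_"
              "for_Language_Understanding")

response = requests.get(ENTITY_URI, headers={"Accept": "text/turtle"}, timeout=30)
response.raise_for_status()
print(response.headers.get("Content-Type"))
print(response.text[:500])  # first part of whatever representation the server returns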
Statements (35)
gptkbp:instanceOf
    gptkb:academic_journal
gptkbp:arXivID
    1810.04805
gptkbp:author
    gptkb:Jacob_Devlin
    gptkb:Kenton_Lee
    gptkb:Ming-Wei_Chang
    gptkb:Kristina_Toutanova
gptkbp:citation
    over 50,000
gptkbp:field
    gptkb:Natural_Language_Processing
https://www.w3.org/2000/01/rdf-schema#label
    BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
gptkbp:impact
    state-of-the-art results on NLP tasks
gptkbp:influenced
    gptkb:ALBERT
    gptkb:DistilBERT
    gptkb:RoBERTa
    gptkb:XLNet
gptkbp:language
    English
gptkbp:memiliki_tugas (Indonesian: "has task")
    gptkb:Named_Entity_Recognition
    gptkb:Text_Classification
    gptkb:Natural_Language_Inference
    gptkb:Question_Answering
    Sentiment Analysis
gptkbp:method
    gptkb:Masked_Language_Modeling
    gptkb:Next_Sentence_Prediction
    Transformer architecture
gptkbp:openSource
    yes
gptkbp:organization
    gptkb:Google_AI_Language
gptkbp:proposedBy
    large language model
gptkbp:publicationDate
    gptkb:arXiv
gptkbp:publicationYear
    2018
gptkbp:trainer
    gptkb:GLUE
    gptkb:MNLI
    gptkb:SQuAD
gptkbp:url
    https://arxiv.org/abs/1810.04805
gptkbp:bfsParent
    gptkb:large_language_model
    gptkb:Large_Language_Models
gptkbp:bfsLayer
    5
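The statements above are predicate/object pairs about a single subject, so they map naturally onto RDF triples. The sketch below rebuilds a handful of them with rdflib; the full namespace URIs behind the gptkb: and gptkbp: prefixes are assumptions, since only the prefixed names appear on this page.

# Minimal sketch: rebuilding a few of the statements above as RDF triples.
# The namespace URIs used for the gptkb:/gptkbp: prefixes are assumptions.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDFS

GPTKB = Namespace("https://gptkb.org/entity/")      # assumed entity namespace
GPTKBP = Namespace("https://gptkb.org/property/")   # assumed property namespace

paper = GPTKB["BERT:_Pre-training_of_Deep_Bidirectional_Transformers_for_Language_Understanding"]

g = Graph()
g.bind("gptkb", GPTKB)
g.bind("gptkbp", GPTKBP)

g.add((paper, RDFS.label, Literal("BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding")))
g.add((paper, GPTKBP.arXivID, Literal("1810.04805")))
g.add((paper, GPTKBP.publicationYear, Literal("2018")))
for author in ("Jacob_Devlin", "Kenton_Lee", "Ming-Wei_Chang", "Kristina_Toutanova"):
    g.add((paper, GPTKBP.author, GPTKB[author]))

print(g.serialize(format="turtle"))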
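Among the gptkbp:method values above is gptkb:Masked_Language_Modeling. As a rough illustration of that objective, and not something documented on this page, the snippet below runs a fill-mask pipeline from Hugging Face transformers; the choice of the bert-base-uncased checkpoint is an assumption.

# Rough illustration of masked language modeling, one of the methods listed
# above. The bert-base-uncased checkpoint is an assumption; this page does
# not name a specific released model.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")
for prediction in fill_mask("BERT is pre-trained with a [MASK] language modeling objective."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")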