ERNIE 2.0: A Continual Pre-training Framework for Language Understanding
URI: https://gptkb.org/entity/ERNIE_2.0:_A_Continual_Pre-training_Framework_for_Language_Understanding
GPTKB entity
Statements (22)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | gptkb:academic_journal |
| gptkbp:author | gptkb:Haifeng_Wang, gptkb:Han_Zhang, gptkb:Hao_Tian, gptkb:Hua_Wu, gptkb:Shikun_Feng, gptkb:Xin_Tian, gptkb:Yu_Sun |
| gptkbp:demonstrates | state-of-the-art results on NLP tasks |
| gptkbp:field | gptkb:machine_learning, natural language processing |
| gptkbp:focusesOn | language understanding, continual pre-training |
| https://www.w3.org/2000/01/rdf-schema#label | ERNIE 2.0: A Continual Pre-training Framework for Language Understanding |
| gptkbp:improves | gptkb:ERNIE |
| gptkbp:proposedBy | gptkb:ERNIE_2.0_model |
| gptkbp:publicationYear | 2020 |
| gptkbp:publishedIn | gptkb:AAAI_2020 |
| gptkbp:uses | multi-task learning, incremental knowledge integration |
| gptkbp:bfsParent | gptkb:ERNIE_2.0 |
| gptkbp:bfsLayer | 6 |
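The `gptkbp:uses` and `gptkbp:focusesOn` statements describe ERNIE 2.0's core idea: pre-training tasks are introduced incrementally, and each training step samples from all tasks seen so far so that earlier tasks are not forgotten. A minimal sketch of that scheduling pattern, with hypothetical task names and function signature (not the paper's actual algorithm or code):

```python
import random

def continual_pretraining(stages, steps_per_stage=3, seed=0):
    """Sketch of continual multi-task pre-training: each stage adds one
    new task, and every step samples from ALL tasks introduced so far."""
    rng = random.Random(seed)
    active_tasks = []  # tasks accumulated across stages
    schedule = []      # (stage, step, task) log
    for stage, new_task in enumerate(stages):
        active_tasks.append(new_task)        # incremental knowledge integration
        for step in range(steps_per_stage):
            task = rng.choice(active_tasks)  # multi-task sampling
            schedule.append((stage, step, task))
    return schedule

# Hypothetical task names for illustration only.
log = continual_pretraining(["masked_lm", "sentence_reorder", "discourse_relation"])
```

In a real setup, the sampled task would select that task's loss for the shared encoder; the point here is only that the task pool grows over stages while old tasks keep being revisited.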