DeBERTa v2

GPTKB entity

Statements (57)
Predicate Object
gptkbp:instance_of gptkb:language_model
gptkbp:applies_to self-supervised learning
gptkbp:architecture gptkb:Transformers
gptkbp:developed_by gptkb:Microsoft_Research
gptkbp:has multiple variants
gptkbp:has_achieved state-of-the-art performance
https://www.w3.org/2000/01/rdf-schema#label DeBERTa v2
gptkbp:improves gptkb:De_BERTa
gptkbp:is_available_on gptkb:Hugging_Face_Model_Hub
gptkbp:is_based_on BERT architecture
gptkbp:is_cited_in NLP research
gptkbp:is_designed_for improved understanding of context
gptkbp:is_documented_in research papers
gptkbp:is_evaluated_by gptkb:AX-bench
gptkb:GLUE_benchmark
gptkb:historical_memory
gptkb:SQu_AD
gptkb:MNLI
gptkb:Super_GLUE_benchmark
F1 score
accuracy
precision
WNLI
HANS
RTE
QQP
MRPC
CoLA
STS-B
gptkbp:is_open_source gptkb:true
gptkbp:is_optimized_for performance on downstream tasks
gptkbp:is_part_of DeBERTa family
gptkbp:is_supported_by gptkb:Tensor_Flow
gptkb:Py_Torch
gptkbp:is_trained_in large text corpus
masked language modeling
next sentence prediction
gptkbp:is_used_for question answering
sentiment analysis
text generation
text classification
named entity recognition
gptkbp:is_used_in natural language processing tasks
gptkbp:outperforms gptkb:BERT
gptkb:Ro_BERTa
gptkbp:performance gptkb:GPT-3
gptkb:T5
gptkb:XLNet
ALBERT
ERNIE
NLP models
gptkbp:release_year gptkb:2021
gptkbp:released_in gptkb:2021
gptkbp:supports multiple languages
gptkbp:uses disentangled attention
gptkbp:bfsParent gptkb:De_BERTa
gptkbp:bfsLayer 6
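
According to the statements above, DeBERTa v2 is available on the Hugging Face Model Hub, is supported by PyTorch, and is used for downstream tasks such as text classification and question answering. The sketch below shows one minimal way to load the model and extract contextual embeddings with the transformers library; the checkpoint id microsoft/deberta-v2-xlarge is an assumption (the statements name only the Hub, not a specific checkpoint), and AutoModel returns raw encoder outputs rather than a task-specific head.

# Minimal sketch: load DeBERTa v2 from the Hugging Face Model Hub with PyTorch.
# The checkpoint id below is an assumption; any DeBERTa-v2 checkpoint loads the same way.
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "microsoft/deberta-v2-xlarge"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

# Encode a sentence and pull contextual token embeddings, the common starting
# point for the downstream uses listed above (classification, QA, NER, sentiment).
inputs = tokenizer(
    "DeBERTa v2 improves on BERT with disentangled attention.",
    return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**inputs)

# Shape: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)

For an actual downstream task such as the GLUE-style classification benchmarks listed above, a task class like AutoModelForSequenceClassification would replace AutoModel and add a classification head on top of the encoder.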