TinyBERT

GPTKB entity

Statements (35)
Predicate Object
gptkbp:instanceOf gptkb:model; large language model
gptkbp:accuracyComparedToBERT comparable
gptkbp:architecture transformer encoder
gptkbp:author Jiao, Xiaoqi and Yin, Yichun and Shang, Lifeng and Jiang, Xin and Chen, Xiao and Li, Linlin and Wang, Fang and Liu, Qun
gptkbp:availableOn gptkb:GitHub
gptkbp:basedOn gptkb:BERT
gptkbp:designedFor knowledge distillation; model compression
gptkbp:developedBy gptkb:Huawei_Noah's_Ark_Lab
gptkbp:fineTunedWith true
gptkbp:hasDistillery gptkb:BERT-base; gptkb:BERT-large
https://www.w3.org/2000/01/rdf-schema#label TinyBERT
gptkbp:input gptkb:text
gptkbp:language English
gptkbp:level 4; 6; 8
gptkbp:license Apache 2.0
gptkbp:notablePublication gptkb:TinyBERT:_Distilling_BERT_for_Natural_Language_Understanding
gptkbp:openSource true
gptkbp:parameter 14.5 million
gptkbp:pretrained true
gptkbp:publishedIn gptkb:EMNLP_2020
gptkbp:releaseYear 2019
gptkbp:sizeComparedToBERT smaller
gptkbp:speedComparedToBERT faster
gptkbp:url https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/TinyBERT
gptkbp:usedFor question answering; natural language understanding; sentence similarity; text classification
gptkbp:bfsParent gptkb:transformation
gptkbp:bfsLayer 5
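
Notes

The designedFor and hasDistillery statements refer to TinyBERT's transformer distillation scheme, in which a small student model mimics a BERT teacher layer by layer. A sketch of the loss terms as presented in the notablePublication above (notation paraphrased here, not taken from the KB):

\begin{align}
  \mathcal{L}_{\mathrm{embd}} &= \mathrm{MSE}\!\left(E^{S} W_e,\ E^{T}\right) \\
  \mathcal{L}_{\mathrm{hidn}} &= \mathrm{MSE}\!\left(H^{S} W_h,\ H^{T}\right) \\
  \mathcal{L}_{\mathrm{attn}} &= \frac{1}{h} \sum_{i=1}^{h} \mathrm{MSE}\!\left(A_i^{S},\ A_i^{T}\right) \\
  \mathcal{L}_{\mathrm{pred}} &= -\operatorname{softmax}\!\left(z^{T}/t\right) \cdot \log \operatorname{softmax}\!\left(z^{S}/t\right)
\end{align}

Here S marks the student and T the teacher; W_e and W_h are learned projections that map the student's narrower hidden states onto the teacher's dimension, A_i are per-head attention matrices, z are output logits, and t is a softmax temperature.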
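A minimal PyTorch sketch of the prediction-layer term above, for illustration only; this is not code from the TinyBERT repository:

import torch
import torch.nn.functional as F

def soft_cross_entropy(student_logits, teacher_logits, temperature=1.0):
    """Prediction-layer distillation: cross-entropy between the teacher's
    softened distribution and the student's log-probabilities."""
    t = temperature
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    return -(teacher_probs * student_log_probs).sum(dim=-1).mean()

# Toy check with random logits (teacher = BERT, student = TinyBERT).
teacher = torch.randn(8, 2)
student = torch.randn(8, 2)
print(soft_cross_entropy(student, teacher).item())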
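The availableOn and url statements point to Huawei's GitHub release. Below is a sketch of loading a general-distillation checkpoint through Hugging Face transformers; the hub id huawei-noah/TinyBERT_General_4L_312D is an assumed mirror of that release, and the classification head is randomly initialized until fine-tuned on one of the usedFor tasks:

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed hub id; the canonical release lives at the GitHub URL above.
model_id = "huawei-noah/TinyBERT_General_4L_312D"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

inputs = tokenizer("TinyBERT is a compact BERT variant.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # head is untrained: fine-tune before use
print(logits.shape)  # torch.Size([1, 2])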
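A back-of-the-envelope check of the "parameter 14.5 million" statement, assuming the 4-layer, 312-hidden, 1200-FFN configuration reported in the paper (these dimensions come from the publication, not from this KB entry):

# Approximate parameter count for a BERT-style encoder.
def bert_params(layers, d, d_ff, vocab=30522, max_pos=512, type_vocab=2):
    emb = (vocab + max_pos + type_vocab) * d + 2 * d  # embeddings + LayerNorm
    attn = 4 * (d * d + d)                            # Q, K, V, output projections
    ffn = d * d_ff + d_ff + d_ff * d + d              # two FFN matrices + biases
    norms = 2 * 2 * d                                 # two LayerNorms per layer
    pooler = d * d + d
    return emb + layers * (attn + ffn + norms) + pooler

print(f"{bert_params(4, 312, 1200) / 1e6:.1f}M")   # ~14.4M, i.e. the KB's 14.5 million
print(f"{bert_params(12, 768, 3072) / 1e6:.1f}M")  # BERT-base for comparison, ~109M

The BERT-base comparison is the basis for the sizeComparedToBERT and speedComparedToBERT statements: the 4-layer student has roughly 7.5x fewer parameters, with a reported inference speedup of roughly 9x in the paper.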