gptkbp:instance_of
|
gptkb:Model
|
gptkbp:application
|
gptkb:Natural_Language_Processing
|
gptkbp:architecture
|
gptkb:Transformers
|
gptkbp:available_on
|
gptkb:Hugging_Face_Model_Hub
|
gptkbp:can_be_fine_tuned_for
|
specific tasks
|
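A minimal fine-tuning sketch for one such task, assuming the transformers and datasets libraries; the Hub checkpoint id, the GLUE/SST-2 dataset choice, and the hyperparameters are illustrative assumptions, not the published TinyBERT recipe.

# Fine-tuning sketch (assumed checkpoint id and example task; adjust to your setup).
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

model_name = "huawei-noah/TinyBERT_General_4L_312D"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

dataset = load_dataset("glue", "sst2")  # example task: binary sentiment classification

def tokenize(batch):
    # Truncate to the model's 512-token input limit.
    return tokenizer(batch["sentence"], truncation=True, max_length=512)

encoded = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="tinybert-sst2", num_train_epochs=3),
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
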
gptkbp:community_support
|
gptkb:Tutorials
Open-source
Research papers
GitHub repositories
|
gptkbp:developed_by
|
gptkb:Huawei
|
gptkbp:evaluates
|
Accuracy
F1 score
|
gptkbp:feature
|
Transfer learning
Contextual embeddings
Masked language modeling
Next sentence prediction
Multi-task learning
Layer normalization
Positional encoding
Self-attention mechanism
|
gptkbp:goal
|
Maintain performance
Reduce model size
|
gptkbp:has_achieved
|
state-of-the-art results
|
gptkbp:has_variants
|
gptkb:Tiny_BERT-4
gptkb:Tiny_BERT-6
gptkb:Tiny_BERT-8
|
https://www.w3.org/2000/01/rdf-schema#label
|
TinyBERT
|
gptkbp:impact
|
Industry applications
AI development
Research advancements
Chatbot development
Text processing tasks
|
gptkbp:improves
|
BERT's performance
|
gptkbp:input_output
|
512 tokens
|
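A short sketch of the 512-token input limit: longer inputs are truncated at tokenization time. The tokenizer checkpoint id below is an assumption.

# Inputs beyond 512 tokens are truncated before they reach the model.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("huawei-noah/TinyBERT_General_4L_312D")  # assumed id
encoded = tokenizer("some very long document ...", truncation=True, max_length=512)
print(len(encoded["input_ids"]))  # at most 512
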
gptkbp:is_available_on
|
gptkb:Hugging_Face_Model_Hub
gptkb:Py_Torch_Hub
gptkb:Tensor_Flow_Hub
|
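A minimal loading sketch via the Hugging Face Model Hub, assuming the transformers library; the repository id is an assumption and may differ from the checkpoint you intend to use.

# Load a TinyBERT checkpoint and run a forward pass (assumed Hub id).
from transformers import AutoTokenizer, AutoModel

model_name = "huawei-noah/TinyBERT_General_4L_312D"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer("TinyBERT is a compact BERT variant.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
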
gptkbp:is_compared_to
|
gptkb:Distil_BERT
ALBERT
|
gptkbp:is_evaluated_by
|
gptkb:MNLI_benchmark
gptkb:SQu_AD_benchmark
gptkb:RTE_benchmark
gptkb:GLUE_benchmark
gptkb:Co_LA_benchmark
gptkb:QQP_benchmark
gptkb:WNLI_benchmark
STS-B benchmark
|
gptkbp:is_lighter_than
|
gptkb:BERT
|
gptkbp:is_optimized_for
|
gptkb:mobile_devices
edge devices
|
gptkbp:is_part_of
|
transformer models
TinyBERT family
|
gptkbp:is_popular_in
|
gptkb:scientific_community
industry applications
|
gptkbp:is_trained_in
|
large text corpora
English text
|
gptkbp:is_used_for
|
gptkb:market_research
gptkb:academic_research
language translation
dialog systems
content moderation
chatbots
information retrieval
transfer learning
spam detection
feature extraction
text generation
text mining
text summarization
data annotation
social media analysis
named entity recognition
semantic similarity
customer support automation
text entailment
user intent prediction
|
gptkbp:is_used_in
|
question answering
sentiment analysis
text classification
language understanding
|
gptkbp:language
|
English
|
gptkbp:performance
|
Faster than BERT
|
gptkbp:provides_information_on
|
gptkb:Wikipedia
gptkb:Common_Crawl
gptkb:Book_Corpus
|
gptkbp:purpose
|
gptkb:Natural_Language_Processing
|
gptkbp:related_to
|
gptkb:Artificial_Intelligence
gptkb:BERT
Deep learning
Machine learning
NLP models
|
gptkbp:release_date
|
gptkb:2019
gptkb:2020
|
gptkbp:resolution
|
768 dimensions
|
gptkbp:size
|
14 million parameters
|
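A quick way to verify a parameter count on a loaded checkpoint; the Hub id is an assumption, and the exact total depends on which TinyBERT variant is loaded.

# Count parameters of a loaded checkpoint (assumed Hub id).
from transformers import AutoModel

model = AutoModel.from_pretrained("huawei-noah/TinyBERT_General_4L_312D")  # assumed id
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")
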
gptkbp:speed
|
gptkb:BERT
|
gptkbp:successor
|
gptkb:Tiny_BERT_2.0
|
gptkbp:supports
|
multiple languages
|
gptkbp:training
|
Knowledge Distillation
|
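A minimal knowledge-distillation sketch in PyTorch, matching a small student to a large teacher via softened logits. This is only the soft-label part; TinyBERT's published procedure additionally distills embeddings, hidden states, and attention maps, which this sketch does not show. Names and dummy tensors are illustrative.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Soften both distributions, then push the student toward the teacher (KL divergence).
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    return F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * temperature ** 2

# Dummy logits stand in for teacher/student forward passes (batch of 8, 2 classes).
student_logits = torch.randn(8, 2)
teacher_logits = torch.randn(8, 2)
print(distillation_loss(student_logits, teacher_logits))
# In practice this term is combined with the ordinary supervised task loss.
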
gptkbp:tuning
|
Task-specific datasets
|
gptkbp:type
|
Lightweight model
|
gptkbp:use_case
|
Question answering
Sentiment analysis
Text classification
Named entity recognition
|
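For these use cases, a fine-tuned checkpoint can be served through the transformers pipeline API, sketched below; the model path is hypothetical and assumes a task-specific TinyBERT model has already been trained.

# Inference sketch with a hypothetical fine-tuned checkpoint.
from transformers import pipeline

classifier = pipeline("text-classification", model="path/to/tinybert-finetuned-sst2")  # hypothetical path
print(classifier("The new release is remarkably fast on mobile devices."))
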
gptkbp:uses
|
knowledge distillation
|
gptkbp:bfsParent
|
gptkb:BERT
|
gptkbp:bfsLayer
|
5
|