DistilBERT

GPTKB entity

Statements (71)
Predicate Object
gptkbp:instance_of gptkb:Transformers_character
gptkbp:bfsLayer 3
gptkbp:bfsParent gptkb:Model
gptkbp:based_on gptkb:BERT
gptkbp:developed_by gptkb:Hugging_Face
gptkbp:has 12 attention heads
    6 layers
    768 hidden units
    smaller architecture than BERT
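The architecture figures above can be checked with back-of-the-envelope arithmetic. A minimal sketch, assuming the published distilbert-base-uncased configuration (6 encoder layers, hidden size 768, feed-forward size 4×768, a 30,522-token vocabulary, and 512 positions — the vocabulary and position counts are assumptions not stated in this page):

```python
# Approximate parameter count for a distilbert-base-style encoder.
hidden = 768
layers = 6
ffn = 4 * hidden        # feed-forward inner dimension
vocab = 30522           # assumed WordPiece vocabulary size
positions = 512         # assumed maximum sequence length

# Self-attention: Q, K, V, and output projections (weights + biases).
attn = 4 * (hidden * hidden + hidden)
# Feed-forward block: two linear maps with biases.
ffn_params = (hidden * ffn + ffn) + (ffn * hidden + hidden)
# Two LayerNorms per layer (scale + shift each).
norms = 2 * 2 * hidden

per_layer = attn + ffn_params + norms
# Word + position embeddings plus their LayerNorm.
embeddings = vocab * hidden + positions * hidden + 2 * hidden

total = layers * per_layer + embeddings
print(f"per layer: {per_layer:,}  total: {total:,}")
```

This lands at roughly 66M parameters, consistent with DistilBERT's commonly cited size and with the "smaller architecture than BERT" claim (BERT-base has about 110M).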
gptkbp:has_achievements faster inference times
https://www.w3.org/2000/01/rdf-schema#label DistilBERT
gptkbp:improves efficiency of NLP tasks
gptkbp:is lightweight
    open-source
    widely used in industry
    a model for feature extraction
    popular in research
    a popular choice for developers
    used in recommendation systems
    used in chatbots
    a benchmark for other models
    a distilled version of BERT
    a model for brand monitoring
    a model for compliance monitoring
    a model for content moderation
    a model for contextual embeddings
    a model for cross-lingual tasks
    a model for customer feedback analysis
    a model for data analysis
    a model for data mining
    a model for dialogue systems
    a model for domain adaptation
    a model for few-shot learning
    a model for fraud detection
    a model for information retrieval
    a model for knowledge extraction
    a model for language modeling
    a model for market research
    a model for multi-task learning
    a model for risk assessment
    a model for semantic understanding
    a model for sentiment tracking
    a model for social media analysis
    a model for spam detection
    a model for syntactic understanding
    a model for text generation
    a model for text mining
    a model for transfer learning
    a model for zero-shot learning
    a state-of-the-art model
    fine-tunable
    part of the Hugging Face Transformers library
    pre-trained
    suitable for edge devices
    used in search engines
    used in summarization tasks
    used in translation tasks
    used in virtual assistants
gptkbp:is_designed_for natural language processing
gptkbp:is_integrated_with gptkb:Graphics_Processing_Unit
    gptkb:Py_Torch
gptkbp:is_used_for question answering
    sentiment analysis
    text classification
    named entity recognition
gptkbp:maintains 95% of BERT's performance
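The "95% of BERT's performance" claim comes from knowledge distillation: the student is trained to match the teacher's temperature-softened output distribution. A minimal pure-Python sketch of that soft-target loss (the function names and the temperature value are illustrative, not from this page; the real training objective also combines masked-language-modeling and cosine-embedding terms):

```python
import math

def softened(logits, T):
    """Softmax with temperature T (higher T = softer distribution)."""
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp((l - m) / T) for l in logits]
    z = sum(exps)
    return [e / z for e in exps]

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) on softened distributions,
    scaled by T**2 as in the standard distillation objective."""
    p = softened(teacher_logits, T)
    q = softened(student_logits, T)
    return T * T * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

The loss is zero when the student's logits match the teacher's exactly and grows as their distributions diverge, which is what drives the smaller model toward the larger one's behavior.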
gptkbp:reduces model size
gptkbp:released_in gptkb:2019
gptkbp:supports multiple languages
gptkbp:training English language data
gptkbp:uses self-attention mechanism
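The self-attention mechanism named above can be sketched in a few lines. A minimal pure-Python version of scaled dot-product attention, simplified so that each token vector serves as its own query, key, and value (the real model applies learned Q/K/V projections across 12 heads):

```python
import math

def self_attention(X):
    """Single-head scaled dot-product self-attention with identity
    projections: X is a list of token vectors (lists of floats)."""
    d = len(X[0])
    out = []
    for q in X:
        # Dot-product similarity of this query with every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in X]
        m = max(scores)  # subtract max for numerical stability
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # Output row = attention-weighted average of the value vectors.
        out.append([sum(w * X[j][i] for j, w in enumerate(weights))
                    for i in range(d)])
    return out
```

With well-separated inputs each token attends almost entirely to itself, so the output rows stay close to the input rows; similar tokens mix their representations instead.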