gptkbp:instanceOf
|
deep contextualized word representation model
|
gptkbp:application
|
natural language processing
question answering
sentiment analysis
named entity recognition
|
gptkbp:architecture
|
deep bidirectional LSTM
|
gptkbp:author
|
gptkb:Matthew_Peters
|
gptkbp:citation
|
high
|
gptkbp:developedBy
|
gptkb:Allen_Institute_for_AI
|
gptkbp:fullName
|
gptkb:Embeddings_from_Language_Models
|
https://www.w3.org/2000/01/rdf-schema#label
|
ELMo
|
gptkbp:improves
|
traditional word embeddings
|
gptkbp:influenced
|
contextual word embedding research
|
gptkbp:input
|
raw text (character-level token representations)
|
gptkbp:introducedIn
|
2018
|
gptkbp:language
|
English
|
gptkbp:notablePublication
|
Deep contextualized word representations
|
gptkbp:openSource
|
yes
|
gptkbp:output
|
contextualized word vectors
|
gptkbp:successor
|
gptkb:BERT
|
gptkbp:publishedIn
|
gptkb:NAACL_2018
|
gptkbp:trainedOn
|
gptkb:1_Billion_Word_Benchmark
|
gptkbp:url
|
https://allennlp.org/elmo
|
gptkbp:uses
|
bidirectional LSTM
|
gptkbp:bfsParent
|
gptkb:Bidirectional_Encoder_Representations_from_Transformers
gptkb:word2vec
|
gptkbp:bfsLayer
|
6
|
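The output listed above (contextualized word vectors) is produced, per the paper cited under `gptkbp:notablePublication`, as a task-specific scalar mixture of the biLM's layer representations: ELMo_k = γ · Σ_j s_j · h_{k,j}, where s = softmax of learned scalars. A minimal pure-Python sketch of that mixing step (function names and toy values are illustrative, not from any ELMo library):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scalars."""
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def elmo_mix(layer_reps, scalars, gamma):
    """Collapse the biLM's L+1 layer vectors for one token into a
    single ELMo vector: ELMo_k = gamma * sum_j s_j * h_{k,j},
    where s = softmax(scalars) are the learned mixing weights."""
    s = softmax(scalars)
    dim = len(layer_reps[0])
    out = [0.0] * dim
    for w, h in zip(s, layer_reps):
        for i in range(dim):
            out[i] += gamma * w * h[i]
    return out

# Toy example: two 2-d layer vectors, equal mixing weights, gamma = 2.
vec = elmo_mix([[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0], 2.0)
```

With equal scalars the softmax weights are 0.5 each, so the result here is simply γ times the layer average; in practice the scalars and γ are learned per downstream task.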