| Property | Value(s) |
| --- | --- |
| gptkbp:instanceOf | large language model; pretrained model |
| gptkbp:architecture | gptkb:transformation |
| gptkbp:author | gptkb:Chan_Ho_So; gptkb:Donghyeon_Kim; gptkb:Eunjeong_Kang; gptkb:Jaewoo_Kang; gptkb:Jinhyuk_Lee; gptkb:Sungdong_Kim; gptkb:Sunkyu_Kim; gptkb:Wonjin_Yoon |
| gptkbp:basedOn | gptkb:BERT |
| gptkbp:citation | over 3000 |
| gptkbp:developedBy | gptkb:DMIS_Lab; gptkb:Korea_University |
| gptkbp:domain | biomedical text mining |
| gptkbp:firstPublished | 2019 |
| gptkbp:format | tokenized text; contextual embeddings |
| gptkbp:github | https://github.com/dmis-lab/biobert |
| https://www.w3.org/2000/01/rdf-schema#label | BioBERT |
| gptkbp:language | English |
| gptkbp:license | Apache 2.0 |
| gptkbp:notablePublication | BioBERT: a pre-trained biomedical language representation model for biomedical text mining |
| gptkbp:publicationYear | 2020 |
| gptkbp:publishedIn | gptkb:Bioinformatics |
| gptkbp:trainer | gptkb:PMC_full-text_articles; gptkb:PubMed_abstracts |
| gptkbp:usedFor | question answering; named entity recognition; relation extraction |
| gptkbp:bfsParent | gptkb:large_language_model |
| gptkbp:bfsLayer | 5 |
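
The gptkbp:format and gptkbp:github entries describe a model that maps tokenized text to contextual embeddings and is distributed via https://github.com/dmis-lab/biobert. Below is a minimal sketch of that usage, assuming the `dmis-lab/biobert-base-cased-v1.1` checkpoint mirrored on the Hugging Face Hub; the checkpoint name and the choice of the `transformers` library are assumptions, not stated in the entry.

```python
# Minimal sketch: obtain contextual embeddings from a BioBERT checkpoint.
# The checkpoint name below is an assumed Hub mirror of the weights released
# at https://github.com/dmis-lab/biobert.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "dmis-lab/biobert-base-cased-v1.1"  # assumption, not from the entry

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

# A biomedical sentence is split into WordPiece tokens (gptkbp:format: tokenized text).
sentence = "The BRCA1 gene is associated with breast cancer."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One contextual embedding per token (gptkbp:format: contextual embeddings);
# shape is [1, sequence_length, 768] for the base-size encoder.
embeddings = outputs.last_hidden_state
print(embeddings.shape)
```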
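For the tasks listed under gptkbp:usedFor, BioBERT is typically fine-tuned with a task-specific head on top of the pretrained encoder. The sketch below wraps the encoder with a token-classification head for named entity recognition; the label set, the checkpoint name, and the NCBI-disease corpus mentioned in the comments are illustrative assumptions, not part of this entry.

```python
# Sketch: prepare BioBERT for biomedical named entity recognition by attaching
# a token-classification head. The BIO tag set here is hypothetical.
from transformers import AutoModelForTokenClassification, AutoTokenizer

MODEL_NAME = "dmis-lab/biobert-base-cased-v1.1"  # assumed Hub checkpoint
labels = ["O", "B-Disease", "I-Disease"]         # hypothetical label set

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(
    MODEL_NAME,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)

# `model` starts from the biomedical pretraining and can now be fine-tuned
# (e.g. with transformers.Trainer) on a tagged corpus such as NCBI-disease.
```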