Distilling the Knowledge in a Neural Network

GPTKB entity

Statements (19)
Predicate Object
gptkbp:instanceOf gptkb:research_paper
gptkbp:arXivID 1503.02531
gptkbp:author gptkb:Geoffrey_Hinton
gptkbp:author gptkb:Jeff_Dean
gptkbp:author gptkb:Oriol_Vinyals
gptkbp:citation over 10000
gptkbp:hasMethod knowledge distillation
https://www.w3.org/2000/01/rdf-schema#label Distilling the Knowledge in a Neural Network
gptkbp:influenced model compression research
gptkbp:influenced student-teacher model
gptkbp:language English
gptkbp:publicationYear 2015
gptkbp:publishedIn gptkb:arXiv
gptkbp:topic neural networks
gptkbp:topic knowledge distillation
gptkbp:topic model compression
gptkbp:bfsParent gptkb:Jonathon_Shlens
gptkbp:bfsParent gptkb:Google_Brain_(former)
gptkbp:bfsLayer 7
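
The statement `gptkbp:hasMethod knowledge distillation` names the paper's core technique: a compact student network is trained to match the temperature-softened output distribution of a larger teacher, in addition to the usual hard-label loss. The sketch below illustrates that combined loss in PyTorch; the function name, temperature, and weighting factor are illustrative assumptions, not taken from any reference code accompanying the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets: the teacher's class probabilities at temperature T.
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    log_student = F.log_softmax(student_logits / T, dim=-1)
    # KL divergence between teacher and student distributions,
    # scaled by T^2 so its gradient magnitude stays comparable to the hard loss.
    soft_loss = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    # Standard cross-entropy on the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    # alpha balances imitating the teacher against fitting the true labels
    # (a hypothetical default; the paper tunes this per task).
    return alpha * soft_loss + (1 - alpha) * hard_loss

if __name__ == "__main__":
    # Toy usage with random logits: batch of 8 examples, 10 classes.
    student_logits = torch.randn(8, 10)
    teacher_logits = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    print(distillation_loss(student_logits, teacher_logits, labels))
```

Scaling the soft-target term by T² follows the paper's observation that gradients produced by soft targets shrink roughly as 1/T², so the scaling keeps the two loss terms on a comparable footing as the temperature is varied.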