GPTKB
Kullback-Leibler divergence
URI: https://gptkb.org/entity/Kullback-Leibler_divergence
GPTKB entity
Statements (30)
gptkbp:instanceOf
    gptkb:information_theory
    statistical analysis
gptkbp:alsoKnownAs
    gptkb:KL_divergence
    relative entropy
gptkbp:application
    information gain
    machine learning loss function
    model selection
    variational inference
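
As an aside on the "machine learning loss function" and "variational inference" applications listed above: a common use is the closed-form KL term between a diagonal Gaussian posterior and a standard normal prior, as in variational autoencoders. The NumPy sketch below is illustrative only; the function name and example values are assumptions, not part of the GPTKB data.

```python
import numpy as np

def gaussian_kl_to_standard_normal(mu, log_var):
    """D_KL( N(mu, diag(exp(log_var))) || N(0, I) ).

    Closed form per dimension: 0.5 * (sigma^2 + mu^2 - 1 - log sigma^2),
    summed over dimensions. This is the KL regularizer commonly used in
    the variational-inference (ELBO) loss of a variational autoencoder.
    """
    mu = np.asarray(mu, dtype=float)
    log_var = np.asarray(log_var, dtype=float)
    return 0.5 * float(np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var))

# Hypothetical statistics for a 3-dimensional latent code.
print(gaussian_kl_to_standard_normal([0.1, -0.2, 0.0], [-0.1, 0.05, 0.0]))
```
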
gptkbp:defines
    A measure of how one probability distribution diverges from a second, expected probability distribution.
gptkbp:field
    gptkb:information_theory
    gptkb:machine_learning
    statistics
gptkbp:form
    D_{KL}(P || Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}
    D_{KL}(P || Q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx
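
The discrete form above can be computed directly. The following is a minimal NumPy sketch; the function name and example distributions are illustrative, not part of the entity data. Terms with P(i) = 0 contribute zero by convention, and the result is in nats because the natural logarithm is used.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q) = sum_i P(i) * log(P(i) / Q(i)).

    Terms with P(i) == 0 contribute 0; the divergence is infinite if
    Q(i) == 0 anywhere that P(i) > 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # ~0.025 nats
```
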
https://www.w3.org/2000/01/rdf-schema#label
    Kullback-Leibler divergence
gptkbp:introducedIn
    1951
gptkbp:namedAfter
    gptkb:Richard_Leibler
    gptkb:Solomon_Kullback
gptkbp:property
    asymmetric
    non-negative
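
The two properties above can be checked numerically; scipy.stats.entropy returns the relative entropy D_KL(pk || qk) when given two arguments. The example distributions are illustrative only.

```python
from scipy.stats import entropy

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(entropy(p, q))  # ~0.0253: non-negative
print(entropy(q, p))  # ~0.0258: differs from the above, so D_KL is asymmetric
print(entropy(p, p))  # 0.0: zero exactly when the two distributions coincide
```
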
gptkbp:publishedIn
    gptkb:Annals_of_Mathematical_Statistics
gptkbp:relatedTo
    gptkb:organization
    gptkb:Jensen-Shannon_divergence
    cross-entropy
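
For reference, the standard identities linking the related quantities listed above (cross-entropy H(P, Q) and the Jensen-Shannon divergence) are:

D_{KL}(P || Q) = H(P, Q) - H(P)
JSD(P || Q) = \frac{1}{2} D_{KL}(P || M) + \frac{1}{2} D_{KL}(Q || M), \quad M = \frac{1}{2}(P + Q)
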
gptkbp:usedIn
    gptkb:reinforcement_learning
    Bayesian statistics
    deep learning
    natural language processing
gptkbp:bfsParent
    gptkb:information_theory
gptkbp:bfsLayer
    4