Kullback-Leibler divergence

GPTKB entity

Statements (33)
Predicate Object
gptkbp:instanceOf gptkb:information_theory
gptkb:statistical_analysis
gptkbp:alsoKnownAs gptkb:KL_divergence
relative entropy
gptkbp:application information gain
machine learning loss function
model selection
variational inference
gptkbp:defines A measure of how one probability distribution diverges from a second, expected probability distribution.
gptkbp:field gptkb:information_theory
gptkb:machine_learning
statistics
gptkbp:form D_{KL}(P || Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}
D_{KL}(P || Q) = \int p(x) \log \frac{p(x)}{q(x)} dx
gptkbp:introducedIn 1951
gptkbp:namedAfter gptkb:Richard_Leibler
gptkb:Solomon_Kullback
gptkbp:property asymmetric
non-negative
gptkbp:publishedIn gptkb:Annals_of_Mathematical_Statistics
gptkbp:relatedTo gptkb:organization
gptkb:Jensen-Shannon_divergence
cross-entropy
gptkbp:usedIn gptkb:reinforcement_learning
Bayesian statistics
deep learning
natural language processing
gptkbp:bfsParent gptkb:information_geometry
gptkb:f-divergence
gptkb:t-SNE
gptkb:Information_Theory
gptkbp:bfsLayer 6
https://www.w3.org/2000/01/rdf-schema#label Kullback-Leibler divergence
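The discrete formula listed under gptkbp:form, D_{KL}(P || Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}, can be sketched directly in code. The following is a minimal illustration (not part of the GPTKB data; the function name and the zero-handling conventions are the author's assumptions), and it also demonstrates the two listed properties, non-negativity and asymmetry:

```python
import math

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q) = sum_i P(i) * log(P(i) / Q(i)).

    Illustrative sketch: assumes p and q are same-length sequences of
    probabilities. Terms with P(i) == 0 contribute 0 (convention
    0 log 0 = 0); any i with P(i) > 0 and Q(i) == 0 makes the
    divergence infinite.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue  # 0 * log(0/q) is taken as 0
        if qi == 0:
            return math.inf  # P puts mass where Q puts none
        total += pi * math.log(pi / qi)
    return total

p = [0.5, 0.5]
q = [0.25, 0.75]
print(kl_divergence(p, q))  # non-negative
print(kl_divergence(q, p))  # differs from the above: KL is asymmetric
```

Note that D_{KL}(P || Q) and D_{KL}(Q || P) generally differ, which is why KL divergence is a divergence rather than a metric.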