GPTKB
Kullback–Leibler divergence
URI:
https://gptkb.org/entity/Kullback–Leibler_divergence
GPTKB entity
Statements (32)
Predicate
Object
gptkbp:instanceOf
gptkb:information_theory
statistical distance
gptkbp:alsoKnownAs
gptkb:KL_divergence
relative entropy
gptkbp:category
gptkb:information_theory
gptkb:probability_theory
statistics
gptkbp:compatibleWith
not symmetric: D_{KL}(P \| Q) \neq D_{KL}(Q \| P) in general
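A minimal sketch of this asymmetry, assuming SciPy is available; the distributions P and Q below are made-up illustrative values:

```python
import numpy as np
from scipy.stats import entropy  # entropy(p, q) computes D_KL(p || q)

# Illustrative discrete distributions over three outcomes.
P = np.array([0.5, 0.3, 0.2])
Q = np.array([0.2, 0.3, 0.5])

# The two directions generally give different values, so KL divergence
# is not a symmetric distance between distributions.
print(entropy(P, Q))  # D_KL(P || Q)
print(entropy(Q, P))  # D_KL(Q || P), a different number
```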
gptkbp:describes
how one probability distribution differs from a second, reference probability distribution
gptkbp:field
gptkb:information_theory
gptkb:machine_learning
statistics
gptkbp:form
D_{KL}(P \| Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}
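A short NumPy sketch of the sum above (function name and example values are illustrative; terms with P(i) = 0 are treated as contributing 0, the usual convention):

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete D_KL(p || q) = sum_i p_i * log(p_i / q_i), in nats."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # 0 * log(0 / q) is taken to be 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Example with illustrative distributions.
print(kl_divergence([0.5, 0.3, 0.2], [0.25, 0.25, 0.5]))
```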
gptkbp:generalizes
cross-entropy
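The link to cross-entropy can be stated exactly: for discrete P and Q, D_{KL}(P \| Q) = H(P, Q) - H(P), i.e. the divergence is the extra expected code length incurred by coding samples from P with a code optimized for Q. A quick numerical check of this identity (illustrative values, natural logarithm):

```python
import numpy as np

P = np.array([0.5, 0.3, 0.2])
Q = np.array([0.25, 0.25, 0.5])

entropy_P = -np.sum(P * np.log(P))      # H(P)
cross_entropy = -np.sum(P * np.log(Q))  # H(P, Q)
kl = np.sum(P * np.log(P / Q))          # D_KL(P || Q)

# The divergence equals cross-entropy minus entropy, up to floating-point error.
assert np.isclose(kl, cross_entropy - entropy_P)
```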
https://www.w3.org/2000/01/rdf-schema#label
Kullback–Leibler divergence
gptkbp:introducedIn
1951
gptkbp:minimumAt
when P = Q
gptkbp:minimumValue
0
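By Gibbs' inequality, D_{KL}(P \| Q) >= 0 with equality exactly when P = Q; a small numerical illustration (the second distribution is drawn at random for the example):

```python
import numpy as np

def kl(p, q):
    return np.sum(p * np.log(p / q))

P = np.array([0.5, 0.3, 0.2])
rng = np.random.default_rng(0)
Q = rng.dirichlet(np.ones(3))  # a random distribution, almost surely different from P

print(kl(P, P))  # 0.0: the minimum, attained when the two arguments coincide
print(kl(P, Q))  # strictly positive whenever Q differs from P
```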
gptkbp:namedAfter
gptkb:Richard_Leibler
gptkb:Solomon_Kullback
gptkbp:relatedTo
gptkb:Jensen–Shannon_divergence
gptkb:f-divergence
mutual information
total variation distance
gptkbp:usedIn
gptkb:machine_learning
gptkb:information_geometry
data compression
hypothesis testing
statistical inference
variational inference
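As one example of the machine-learning usage, variational inference with Gaussian approximations (e.g. VAE-style objectives) uses the closed-form KL divergence between a diagonal Gaussian N(mu, sigma^2) and the standard normal N(0, 1). A minimal sketch assuming NumPy and the usual log-variance parameterization; it is not tied to any particular library:

```python
import numpy as np

def kl_gaussian_to_standard_normal(mu, log_var):
    # Closed form: 0.5 * sum(sigma^2 + mu^2 - 1 - log(sigma^2)), where log_var = log(sigma^2).
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

# Illustrative encoder outputs for a 3-dimensional latent variable.
mu = np.array([0.1, -0.2, 0.0])
log_var = np.array([0.0, -0.5, 0.3])
print(kl_gaussian_to_standard_normal(mu, log_var))
```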
gptkbp:bfsParent
gptkb:Shannon_entropy
gptkbp:bfsLayer
4