Statements (28)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | statistical analysis |
| gptkbp:alsoKnownAs | relative entropy |
| gptkbp:describes | difference between two probability distributions |
| gptkbp:field | gptkb:information_theory, gptkb:machine_learning, statistics |
| gptkbp:form | D_{KL}(P \| Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)} |
| gptkbp:fullName | gptkb:Kullback–Leibler_divergence |
| gptkbp:hasUnit | bits, nats |
| https://www.w3.org/2000/01/rdf-schema#label | KLD |
| gptkbp:includesMetric | false |
| gptkbp:introduced | gptkb:Richard_Leibler, gptkb:Solomon_Kullback |
| gptkbp:introducedIn | 1951 |
| gptkbp:isAsymmetric | true |
| gptkbp:minValue | 0 |
| gptkbp:minValueCondition | P = Q |
| gptkbp:relatedTo | gptkb:organization, gptkb:Jensen-Shannon_divergence, cross-entropy |
| gptkbp:usedFor | feature selection, model selection, variational inference, measuring information loss, distribution comparison |
| gptkbp:bfsParent | gptkb:Kaladesh |
| gptkbp:bfsLayer | 6 |