Statements (28)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | gptkb:statistical_analysis |
| gptkbp:alsoKnownAs | gptkb:Kullback–Leibler_divergence, relative entropy |
| gptkbp:category | divergence measures, information theory concepts |
| gptkbp:definedIn | continuous probability distributions, discrete probability distributions |
| gptkbp:describes | difference between two probability distributions |
| gptkbp:field | gptkb:information_theory, gptkb:machine_learning, statistics |
| gptkbp:form | KL(P‖Q) = Σ_x P(x) log(P(x)/Q(x)) |
| gptkbp:generalizes | cross-entropy |
| gptkbp:heldBy | non-negative, non-symmetric |
| gptkbp:minimumAt | P = Q |
| gptkbp:minimumPressure | 0 |
| gptkbp:namedAfter | gptkb:Richard_Leibler, gptkb:Solomon_Kullback |
| gptkbp:relatedTo | gptkb:Jensen–Shannon_divergence, mutual information, cross-entropy |
| gptkbp:usedFor | model selection, variational inference, measuring information loss |
| gptkbp:bfsParent | gptkb:Kullback–Leibler_divergence |
| gptkbp:bfsLayer | 5 |
| https://www.w3.org/2000/01/rdf-schema#label | KL divergence |
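
As a worked illustration of the statements above (the form KL(P‖Q) = Σ_x P(x) log(P(x)/Q(x)), non-negativity, non-symmetry, and the minimum of 0 at P = Q), here is a minimal sketch of the discrete KL divergence in Python. The distributions `p` and `q` are hypothetical examples chosen for the demonstration, not part of the KB entry.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence: KL(P||Q) = sum_x P(x) * log(P(x)/Q(x)).

    Terms with P(x) == 0 contribute 0 by convention; Q(x) must be
    positive wherever P(x) is, otherwise the divergence is infinite.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical example distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # positive: non-negativity
print(kl_divergence(q, p))  # differs from the line above: non-symmetry
print(kl_divergence(p, p))  # 0.0: minimum attained at P = Q
```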