Statements (30)
| Predicate | Object | 
|---|---|
| gptkbp:instanceOf | gptkb:statistical_analysis | 
| gptkbp:alsoKnownAs | relative entropy | 
| gptkbp:describes | difference between two probability distributions | 
| gptkbp:field | gptkb:information_theory, gptkb:machine_learning, statistics | 
| gptkbp:form | D_{KL}(P||Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)} | 
| gptkbp:fullName | gptkb:Kullback–Leibler_divergence | 
| gptkbp:hasUnit | gptkb:nats, bits | 
| gptkbp:includesMetric | false | 
| gptkbp:introduced | gptkb:Solomon_Kullback, gptkb:Richard_Leibler | 
| gptkbp:introducedIn | 1951 | 
| gptkbp:isAsymmetric | true | 
| gptkbp:minValue | 0 | 
| gptkbp:minValueCondition | P = Q | 
| gptkbp:relatedTo | gptkb:Jensen-Shannon_divergence, cross-entropy | 
| gptkbp:usedFor | feature selection, model selection, variational inference, measuring information loss, distribution comparison | 
| gptkbp:bfsParent | gptkb:Khalilabad_railway_station, gptkb:Kalideres_Station, gptkb:Mendeleyevo_Airport | 
| gptkbp:bfsLayer | 7 | 
| https://www.w3.org/2000/01/rdf-schema#label | KLD |
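The properties stated above (the formula D_{KL}(P||Q) = Σ_i P(i) log(P(i)/Q(i)), minimum value 0 attained when P = Q, asymmetry, and units of nats or bits depending on the logarithm base) can be checked with a short sketch. This is an illustrative implementation for discrete distributions, not part of the KB entry; the function name and example distributions are chosen here for demonstration.

```python
import math

def kl_divergence(p, q, base=math.e):
    """D_KL(P||Q) = sum_i P(i) * log(P(i)/Q(i)).

    base=math.e gives the result in nats; base=2 gives bits.
    """
    if not math.isclose(sum(p), 1.0) or not math.isclose(sum(q), 1.0):
        raise ValueError("p and q must each sum to 1")
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue  # convention: 0 * log 0 = 0
        if qi == 0.0:
            return math.inf  # P puts mass where Q has none
        total += pi * math.log(pi / qi, base)
    return total

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, p))           # 0.0 — minimum, attained at P = Q
print(kl_divergence(p, q))           # positive
print(kl_divergence(q, p))           # different value — KL is asymmetric
print(kl_divergence(p, q, base=2))   # same divergence expressed in bits
```

Note that the asymmetry (isAsymmetric = true above) is exactly why KL divergence is not a metric (includesMetric = false): D_KL(P||Q) ≠ D_KL(Q||P) in general, and it does not satisfy the triangle inequality.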