Statements (29)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | gptkb:security |
| gptkbp:abbreviation | gptkb:LDP |
| gptkbp:advantage | no trusted aggregator required |
| gptkbp:challenge | utility-privacy tradeoff |
| gptkbp:contrastsWith | gptkb:Central_Differential_Privacy |
| gptkbp:defines | A privacy model where data is randomized before leaving the user's device. |
| gptkbp:enables | privacy-preserving data analysis |
| gptkbp:field | computer science; data privacy |
| gptkbp:goal | limit information leakage |
| gptkbp:guarantees | data collector cannot learn much about any individual |
| gptkbp:method | noise addition; randomized response |
| gptkbp:notableFor | gptkb:Google_Chrome's_RAPPOR; Apple iOS data collection; Microsoft telemetry |
| gptkbp:parameter | privacy budget (epsilon) |
| gptkbp:proposedBy | 2008 |
| gptkbp:protectedBy | individual data |
| gptkbp:relatedConcept | privacy amplification; privacy guarantee; privacy loss |
| gptkbp:relatedTo | gptkb:Differential_Privacy |
| gptkbp:usedIn | gptkb:machine_learning; data collection; statistics |
| gptkbp:bfsParent | gptkb:Differential_Privacy |
| gptkbp:bfsLayer | 5 |
| https://www.w3.org/2000/01/rdf-schema#label | Local Differential Privacy |
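
The `gptkbp:defines`, `gptkbp:guarantees`, and `gptkbp:parameter` rows correspond to the standard formal condition for epsilon-local differential privacy. As a sketch in conventional notation (the mechanism $M$, inputs $x, x'$, and output $y$ are standard symbols, not names from this page):

```latex
% A randomizer M satisfies \varepsilon-LDP when, for every pair of
% possible user inputs x, x' and every possible output y:
\Pr[M(x) = y] \;\le\; e^{\varepsilon} \cdot \Pr[M(x') = y]
```

Because the bound holds for every pair of inputs, the data collector cannot confidently distinguish any one user's true value from any other, which is the guarantee stated in the table.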
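The `gptkbp:method` row lists randomized response and the `gptkbp:parameter` row names the privacy budget (epsilon); a minimal Python sketch of how the two fit together, assuming the classic binary randomized-response mechanism (function names and the 30% example rate are illustrative, not taken from this page):

```python
import math
import random

def randomized_response(true_bit: int, epsilon: float) -> int:
    """Report the true bit with probability e^eps / (1 + e^eps),
    otherwise flip it. Data is randomized on the user's device,
    so no trusted aggregator is required."""
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return true_bit if random.random() < p_truth else 1 - true_bit

def estimate_frequency(reports: list[int], epsilon: float) -> float:
    """Debias the aggregate of noisy reports to estimate the true
    fraction of 1s; the collector only ever sees randomized bits."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    observed = sum(reports) / len(reports)
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)

# Illustration (hypothetical numbers): 10,000 users, 30% hold the
# sensitive bit, privacy budget epsilon = 1.0.
random.seed(0)
true_bits = [1 if random.random() < 0.3 else 0 for _ in range(10_000)]
reports = [randomized_response(b, 1.0) for b in true_bits]
print(round(estimate_frequency(reports, 1.0), 3))  # close to 0.3
```

A larger epsilon keeps the true bit more often, giving a more accurate estimate but a weaker privacy guarantee; that is the utility-privacy tradeoff named in the `gptkbp:challenge` row.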