What is it about?

Differential Privacy (DP) has been widely accepted as a de facto standard in both academia and industry for using personal data in a privacy-preserving manner. However, following the philosophy that "one size does not fit all", the generic DP framework is not always effective at accurately defining privacy risks and maximizing utility in every real-world scenario. We summarize scenario-based adaptations of DP by categorizing them into three types of adaptation across two parallel contexts: statistical database privacy and local data privacy.

Why is it important?

It demonstrates how the Differential Privacy definition and its mechanisms can be modified by capturing and modelling the characteristic features of the data or database, and incorporating those features into an updated privacy definition.
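As background for readers unfamiliar with the baseline that these adaptations modify, here is a minimal sketch of the standard Laplace mechanism for ε-DP (this illustrative code is not taken from the survey; function and parameter names are my own):

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value under epsilon-DP via the classic Laplace mechanism.

    Noise is drawn from Laplace(0, sensitivity / epsilon), where
    sensitivity is the query's global L1 sensitivity.
    """
    rng = np.random.default_rng() if rng is None else rng
    scale = sensitivity / epsilon  # smaller epsilon -> more noise
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Example: a counting query (sensitivity 1) released with epsilon = 0.5
noisy_count = laplace_mechanism(true_value=1000.0, sensitivity=1.0, epsilon=0.5)
```

Scenario-based adaptations of the kind the survey categorizes typically rework pieces of this picture, for example redefining which datasets count as "neighboring" or recalibrating the noise to the data's structure.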

Read the Original

This page is a summary of: Scenario-based Adaptations of Differential Privacy: A Technical Survey, ACM Computing Surveys, April 2024, ACM (Association for Computing Machinery),
DOI: 10.1145/3651153.
