Differential Privacy

Differential privacy provides plausible deniability: an adversary should not be able to tell whether any individual record was used in a given data release. Mechanisms, or algorithms, typically achieve this by adding carefully calibrated noise to query results. The amount of noise is controlled by a privacy parameter called epsilon, also known as the privacy budget. The larger the epsilon value, the weaker the privacy guarantee but the more accurate the results can be, and vice versa.
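
As an illustration, the sketch below uses the Laplace mechanism, one standard way of calibrating the noise to a query's sensitivity and epsilon; the function name, parameters, and example values are illustrative only.

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon):
    """Release a noisy answer to a numeric query.

    The noise scale is sensitivity / epsilon: a smaller epsilon
    (stronger privacy, smaller budget) means more noise and
    therefore less accurate answers.
    """
    scale = sensitivity / epsilon
    return true_answer + np.random.laplace(loc=0.0, scale=scale)

# Example: a counting query (sensitivity 1) released with epsilon = 0.5
private_count = laplace_mechanism(true_answer=42, sensitivity=1.0, epsilon=0.5)
```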

The output distribution of a differentially private algorithm is insensitive to small changes to the input database; two databases that differ only by such a small change are called neighbours. For graphs, there are two popular ways of defining neighbours: (i) edge-DP considers pairs of graphs G and G' that differ by a single edge; (ii) node-DP considers pairs of graphs that differ by a single node together with all of the edges incident to it. We assume that the total number of nodes is fixed under node-DP.
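
A minimal sketch of the two neighbouring relations, assuming a graph is stored as a set of undirected edges; the representation and function names are illustrative only.

```python
def edge_neighbour(edges, u, v):
    """Edge-DP neighbour: toggle the single edge (u, v)."""
    neighbour = set(edges)
    neighbour.symmetric_difference_update({(u, v)})
    return neighbour

def node_neighbour(edges, node):
    """Node-DP neighbour: remove every edge incident to one node.
    The node itself stays, so the total number of nodes is unchanged."""
    return {(a, b) for (a, b) in edges if node not in (a, b)}

# Example graph on nodes {0, 1, 2, 3}
G = {(0, 1), (1, 2), (2, 3)}
G_edge = edge_neighbour(G, 1, 2)   # differs from G by the edge (1, 2)
G_node = node_neighbour(G, 1)      # differs from G by node 1's incident edges
```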

For more information, please see the following resources:

Differential Privacy Resources

The Algorithmic Foundations of Differential Privacy, by Cynthia Dwork and Aaron Roth