
Welcome

What is differential privacy?

Differential privacy is a rigorous mathematical definition of privacy. Consider an algorithm that analyzes a dataset and releases statistics: the algorithm is differentially private if, by looking at the output, you cannot tell whether any individual’s data was included in the original dataset. Differential privacy achieves this by carefully injecting random noise into the released statistics to hide the effect of any single individual.
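As a concrete illustration of "carefully injecting random noise", here is a toy sketch of the Laplace mechanism, one classic way to calibrate noise: the noise scale is the statistic's sensitivity (how much one person's data can change it) divided by the privacy parameter epsilon. This is plain Python, not the OpenDP API; the function name and parameters are purely illustrative.

    import numpy as np

    def laplace_release(true_value, sensitivity, epsilon, rng=None):
        """Toy Laplace mechanism: add noise with scale = sensitivity / epsilon."""
        rng = rng or np.random.default_rng()
        scale = sensitivity / epsilon
        return true_value + rng.laplace(loc=0.0, scale=scale)

    # A count changes by at most 1 when any one person is added or removed,
    # so its sensitivity is 1; smaller epsilon means more noise and more privacy.
    noisy_count = laplace_release(true_value=1000, sensitivity=1, epsilon=1.0)
    print(noisy_count)

Because the release is randomized, an observer cannot reliably distinguish the dataset with a given individual from the same dataset without them.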

For more background on differential privacy and its applications:

Why OpenDP?

  • OpenDP is based on a solid conceptual framework for expressing privacy-aware computations.

  • OpenDP is built on a Rust core for memory safety, thread safety, and performance.

  • OpenDP has a process for independent review of algorithms and implementations.

  • OpenDP has performed well in independent security audits.

  • OpenDP supports a range of differential privacy algorithms.

  • OpenDP has bindings for Python and R, both built on the same Rust core for consistency and security (a short example of the Python bindings is sketched after this list).

  • OpenDP is a community effort and is not owned or directed by a single corporation.
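To give a flavor of the Python bindings mentioned above, the sketch below builds a Laplace measurement in the library's combinator style. The constructors used here (dp.atom_domain, dp.absolute_distance, dp.m.then_laplace) reflect recent releases of the Python API, but exact names and arguments vary across OpenDP versions, so treat this as an illustrative sketch rather than a reference.

    import opendp.prelude as dp

    # Opt in to "contrib" features that have not yet completed the vetting process.
    dp.enable_features("contrib")

    # A metric space: float scalars, with distances measured as absolute distance.
    input_space = dp.atom_domain(T=float), dp.absolute_distance(T=float)

    # Chain the space with a Laplace noise constructor to obtain a measurement.
    laplace_mechanism = input_space >> dp.m.then_laplace(scale=1.0)

    # Make a differentially private release of a sensitive value.
    print(laplace_mechanism(23.0))

    # Map a sensitivity bound (d_in) to the resulting privacy loss (epsilon).
    print(laplace_mechanism.map(d_in=1.0))

The same measurement objects are exposed in R and Rust, since all three bindings call into the shared Rust core.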

That said, OpenDP is not the best tool for every job. In particular, it is a fairly low-level interface: a number of other projects aim to make it easy to add differential privacy to existing SQL interfaces or ML frameworks. One such tool is the SmartNoise SDK, which is built on the OpenDP library.

Who is using OpenDP?

Some of the applications of OpenDP in healthcare, government, and tech include:

What next?

There are multiple tracks through the documentation:

  • New users of the library should begin with Getting Started.

  • For Python, R, and Rust references, see the API.

  • If you want to understand how the fundamentals of DP are applied in OpenDP, see Theory.

  • Finally, if you’re joining the project, see Contributing.