Dynamic Enforcement of Differential Privacy
Licentiate thesis, 2015

Following recent privacy failures in the release of personal data, differential privacy has received considerable attention in the research community. Despite its young age (Dwork, 2006), this mathematical concept has attracted many researchers for its robustness against the identification of individuals, even in the presence of background information. Besides that, its flexible definition makes it compatible with different data sources, data mining algorithms, and data release models. Its compositionality properties facilitate the design of "differential-privacy-aware" programming languages. These programming languages and frameworks empower non-experts to construct complex data mining analyses with proven differential privacy guarantees. This thesis focuses mainly on two aspects of such systems: proving the correctness of these frameworks, and improving their accuracy. Correctness matters because some frameworks (such as PINQ) deviate from the theory without proper justification. With respect to accuracy, we present an improved method of enforcing privacy that yields better data utilisation and further benefits. In this setting, individuals take control of their own privacy requirements rather than being treated merely as rows in a database. As a result, they can opt in to a database with their expected privacy level and optionally opt out later.
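The personalized setting described above can be pictured as per-individual privacy-budget bookkeeping. The following is a minimal Python sketch of that idea, assuming a toy counting query answered with Laplace noise; the names PersonalizedBudgetDB, opt_in, opt_out and noisy_count are illustrative only and are not taken from the thesis or from PINQ.

import random

class PersonalizedBudgetDB:
    # Illustrative sketch only: each participant carries an individual
    # epsilon budget, and queries are charged against the budgets of the
    # people whose records they touch (sequential composition, per person).

    def __init__(self):
        self.records = {}   # person_id -> value
        self.budgets = {}   # person_id -> remaining epsilon

    def opt_in(self, person_id, value, epsilon_budget):
        # A person joins with a record and their chosen privacy budget.
        self.records[person_id] = value
        self.budgets[person_id] = epsilon_budget

    def opt_out(self, person_id):
        # A person may later withdraw; their record is excluded from
        # subsequent queries.
        self.records.pop(person_id, None)
        self.budgets.pop(person_id, None)

    def noisy_count(self, predicate, epsilon):
        # Only records whose owners can still afford this query take part,
        # and each participant's budget is reduced by epsilon.
        participants = [p for p in self.records if self.budgets[p] >= epsilon]
        for p in participants:
            self.budgets[p] -= epsilon
        true_count = sum(1 for p in participants if predicate(self.records[p]))
        # A counting query has sensitivity 1, so Laplace noise of scale
        # 1/epsilon suffices; a Laplace sample can be drawn as the
        # difference of two exponential samples with rate epsilon.
        noise = random.expovariate(epsilon) - random.expovariate(epsilon)
        return true_count + noise

# Example use: two people opt in with different budgets; one later opts out.
db = PersonalizedBudgetDB()
db.opt_in("alice", 34, epsilon_budget=1.0)
db.opt_in("bob", 61, epsilon_budget=0.5)
print(db.noisy_count(lambda age: age > 40, epsilon=0.5))
db.opt_out("bob")

Note that silently excluding people whose budget is exhausted is itself a design decision with privacy consequences; the sketch makes no attempt to address such subtleties.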

Differential privacy

EC, EDIT building, Rännvägen 6B, Chalmers University of Technology
Opponent: Catuscia Palamidessi

Author

Hamid Ebadi Tavallaei

Chalmers, Computer Science and Engineering (Chalmers)

Areas of Advance

Information and Communication Technology

Roots

Basic sciences

Subject Categories

Computer Science

Technical report L - Department of Computer Science and Engineering, Chalmers University of Technology and Göteborg University: 134

