A Programming Framework for Differential Privacy with Accuracy Calculations
Research Project, 2020–2024
The growing availability of large-scale personal data has made protecting privacy a major challenge. The EU GDPR legislation for data protection contributed to creating such awareness, but privacy legislation is not exclusive to Europe -- e.g., the California Consumer Privacy Act became effective in 2020 (and is broadly similar to GDPR).

Differential privacy (DP) is currently the only mathematical framework for rigorously reasoning about privacy-accuracy trade-offs. A standard way to achieve DP is by adding statistical noise to the result of a data analysis. DP is compositional for privacy: there are basic building blocks and composition properties which ensure that the privacy guarantees of a composed analysis follow from the guarantees of its building blocks. Unfortunately, reasoning about accuracy is less compositional, and it is therefore not surprising that most DP tools and programming frameworks lack support for reasoning about the accuracy of composed analyses.

This proposal presents techniques to enable reasoning about accuracy in a compositional manner. We propose the novel idea of applying information-flow control techniques---originally designed to track how data flows within systems---in order to internalize the use of probabilistic bounds when calculating the accuracy of composed data analyses. The project will implement the programming framework DPella, where accuracy reasoning is automated and can be combined with reasoning about privacy.
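To make the noise-adding mechanism and the composition property concrete, the following is a minimal, hypothetical sketch (not DPella's actual API): a counting query made epsilon-DP via the Laplace mechanism, and sequential composition, under which the privacy costs of several queries simply add up.

```python
import math
import random

def laplace_sample(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_count(data, predicate, epsilon: float) -> float:
    """epsilon-DP count: a counting query has sensitivity 1, so adding
    Laplace(1/epsilon) noise to the true count satisfies epsilon-DP."""
    true_count = sum(1 for x in data if predicate(x))
    return true_count + laplace_sample(1.0 / epsilon)

def run_analyses(data, queries):
    """Sequential composition: the total privacy cost is the sum of the
    per-query epsilons. Note that the *accuracy* of the combined results
    enjoys no equally simple rule -- which is the gap this project targets."""
    total_epsilon = sum(eps for _, eps in queries)
    results = [noisy_count(data, pred, eps) for pred, eps in queries]
    return results, total_epsilon
```

The privacy-accuracy trade-off is visible in the noise scale 1/epsilon: a smaller epsilon (stronger privacy) means wider noise and less accurate answers, and the error of each query is a probabilistic quantity that does not add up as neatly as the epsilons do.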
Participants
Alejandro Russo (contact)
Chalmers, Computer Science and Engineering (Chalmers), Information Security
Funding
Swedish Research Council (VR)
Project ID: 2020-03881
Funding Chalmers participation during 2020–2024