Perspex: Flexible and Transparent Local Differential Privacy
Research Project, 2019–2022

Scientific Abstract

This proposal studies differential privacy, a rigorous, quantitative definition of privacy that guarantees a bound on the privacy loss incurred by releasing the results of statistical queries. A number of programming frameworks, based on both static and dynamic program analysis, have been designed for constructing differentially private analyses that are private by construction. These frameworks all assume that the sensitive data is collected centrally and processed by a trusted curator. However, the main examples of differential privacy applied in practice – for example, Google Chrome’s collection of browsing statistics, or Apple’s training of predictive text models – use a purely local mechanism: each user’s data is randomised on their own device, so raw sensitive data is never collected centrally at all. While this is a benefit of the local approach, in systems like Apple’s, users are nonetheless required to trust completely that the analysis running on their device has the claimed privacy properties.
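To make the local model concrete, the textbook local mechanism is randomized response: each user flips their own bit with a calibrated probability before it leaves their device, and the aggregator debiases the noisy reports. The following is a minimal Python sketch for illustration only; the function names are ours and are not taken from any of the systems mentioned above.

    import math
    import random

    def randomized_response(true_bit: int, epsilon: float) -> int:
        # Report the true bit with probability e^eps / (e^eps + 1);
        # otherwise report the flipped bit. Satisfies eps-local DP.
        p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
        return true_bit if random.random() < p_truth else 1 - true_bit

    def estimate_frequency(reports: list[int], epsilon: float) -> float:
        # Debias the aggregate: observed = (2p - 1) * true + (1 - p),
        # so solve for the true fraction of 1-bits.
        p = math.exp(epsilon) / (math.exp(epsilon) + 1)
        observed = sum(reports) / len(reports)
        return (observed - (1 - p)) / (2 * p - 1)

The aggregator sees only the perturbed bits, yet the debiased estimate converges to the true frequency as the number of users grows.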

We propose a framework for transparent local differential privacy, in which analyses can be statically verified to have their claimed differential privacy properties. Our proposal builds on a novel method, not previously used for static analysis, for abstracting the behaviour of probabilistic programs. Beyond this, we will explore new, flexible variants of local differential privacy that permit information declassification for improved accuracy.
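For a finite mechanism, the ε-local-DP property being verified is itself a directly checkable condition on the output distributions: for every pair of inputs x, x' and every output y, Pr[M(x) = y] ≤ e^ε · Pr[M(x') = y]. The brute-force check below merely spells out that definition over explicit probability tables; it is a sketch under our own representation and naming, not the static verification method the project will develop.

    import math

    def satisfies_local_dp(mechanism: dict, epsilon: float,
                           tol: float = 1e-12) -> bool:
        # mechanism[x][y] = Pr[M(x) = y], given as explicit tables.
        # Checks Pr[M(x) = y] <= e^eps * Pr[M(x') = y] for all x, x', y;
        # tol absorbs floating-point rounding.
        bound = math.exp(epsilon)
        outputs = {y for dist in mechanism.values() for y in dist}
        for x in mechanism:
            for x_alt in mechanism:
                for y in outputs:
                    p = mechanism[x].get(y, 0.0)
                    q = mechanism[x_alt].get(y, 0.0)
                    if p > bound * q + tol:
                        return False
        return True

    # Randomized response with epsilon = ln(3) reports truthfully with
    # probability 0.75; since 0.75 <= 3 * 0.25, the check passes.
    rr = {0: {0: 0.75, 1: 0.25}, 1: {0: 0.25, 1: 0.75}}
    assert satisfies_local_dp(rr, math.log(3))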

See the Swedish version for a popular science description.

Participants

David Sands (contact)

Chalmers, Computer Science and Engineering, Information Security

Funding

Swedish Research Council (VR)

Project ID: 2018-04230
Funding Chalmers’ participation during 2019–2022

Related Areas of Advance and Infrastructure

Information and Communication Technology (Areas of Advance)

Publications

2021

A Quantale of Information

Paper in proceedings
