Dynamic Enforcement of Differential Privacy
Doctoral thesis, 2018

Following recent privacy failures in the release of personal data, differential privacy has received considerable attention in the research community.
Despite its young age (Dwork et al., 2006), this mathematical concept has attracted many researchers for its robustness against the identification of individuals, even in the presence of background information.
Moreover, its flexible definition makes it compatible with different data sources, data mining algorithms and data release models.
Its compositionality properties facilitate the design of differential-privacy-aware programming languages and frameworks that empower non-experts to construct complex data mining analyses with proven differential privacy guarantees; the definition and the composition property are sketched after this paragraph.
The goal of this research is to introduce new, and improve existing, frameworks backed by differential privacy that are prominent in both utility and flexibility of use.
We study the dynamic enforcement of differential privacy both in the centralised model, in which a trusted curator processes data stored in a centralised database, and in the local model, which places no trust in a third party.
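
For reference, here are the standard definition of differential privacy and the sequential composition property that such frameworks rely on for budget tracking, following Dwork et al. (2006); this is a minimal sketch rather than the thesis's own formalisation:

```latex
% A randomised mechanism M is \varepsilon-differentially private if,
% for all databases D, D' differing in one individual's record and
% for every set S of possible outputs,
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S]
% Sequential composition: if M_1 is \varepsilon_1-DP and M_2 is
% \varepsilon_2-DP, then releasing both outputs is
% (\varepsilon_1 + \varepsilon_2)-DP, which lets a framework deduct
% each building block's cost from a global privacy budget.
```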

For the centralised model, the thesis focuses mostly on the privacy impact of the basic building blocks used in these frameworks, proving the correctness of the systems built upon them.
With respect to accuracy, we present personalised differential privacy as an improved method of enforcing privacy that provides better data utilisation, among other benefits. In this setting, individuals take control of their own privacy requirements rather than being treated merely as part of a database. As a result, they can opt in to a database with their expected privacy level and optionally opt out later; a minimal sketch of such budget tracking follows below. We further study the privacy implications of other building blocks, such as different kinds of sampling and partitioning.
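
The following Python sketch illustrates how such per-individual budgets might be tracked; the names and mechanism here are hypothetical simplifications in the spirit of the included paper "Differential Privacy: Now it’s Getting Personal", not its actual implementation:

```python
import random

class PersonalisedDatabase:
    """Toy tracker of per-individual privacy budgets (illustrative only)."""

    def __init__(self):
        self.budgets = {}  # participant id -> remaining epsilon budget
        self.records = {}  # participant id -> data value

    def opt_in(self, pid, value, epsilon):
        """A participant joins with their own expected privacy level."""
        self.budgets[pid] = epsilon
        self.records[pid] = value

    def opt_out(self, pid):
        """A participant may later leave; their data is no longer used."""
        self.budgets.pop(pid, None)
        self.records.pop(pid, None)

    def noisy_count(self, predicate, cost):
        """Count records matching `predicate`, charging `cost` epsilon.

        Only participants whose remaining budget covers the cost
        contribute; the rest are silently excluded, so no query can
        exceed anyone's personal budget.
        """
        eligible = [p for p in self.budgets if self.budgets[p] >= cost]
        for p in eligible:
            self.budgets[p] -= cost
        true_count = sum(1 for p in eligible if predicate(self.records[p]))
        # A count has sensitivity 1, so Laplace noise of scale 1/cost
        # gives cost-DP; the difference of two exponential draws is a
        # standard way to sample Laplace noise.
        noise = random.expovariate(cost) - random.expovariate(cost)
        return true_count + noise

# Example: two participants with different privacy expectations.
db = PersonalisedDatabase()
db.opt_in("alice", 34, epsilon=1.0)  # alice tolerates more disclosure
db.opt_in("bob", 51, epsilon=0.1)    # bob is more cautious
print(db.noisy_count(lambda age: age > 40, cost=0.5))  # bob is excluded
```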

For the local model, we propose a general framework in which users can verify the analyses they receive and, through a flexible policy, express their privacy preferences in different forms, such as enforcing a personalised privacy budget; a sketch of the local model appears below.
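
To illustrate the local model itself (not the proposed framework's verification or policy machinery), here is a minimal Python sketch of classic randomised response, the kind of mechanism a user's device would run locally before any data leaves it; all names are illustrative:

```python
import math
import random

def randomised_response(true_bit, epsilon):
    """Locally epsilon-DP release of one sensitive bit: answer
    truthfully with probability e^eps / (e^eps + 1), otherwise lie,
    so the collector never learns the true answer with certainty."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return true_bit if random.random() < p_truth else 1 - true_bit

def estimate_frequency(reports, epsilon):
    """Unbiased estimate of the population frequency of the bit,
    inverting the noise that each user added locally."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed + p - 1) / (2 * p - 1)

# Example: 10,000 users each hold a bit that is 1 with probability
# 0.3; the untrusted collector still recovers roughly 0.3.
truth = [1 if random.random() < 0.3 else 0 for _ in range(10_000)]
reports = [randomised_response(b, epsilon=1.0) for b in truth]
print(estimate_frequency(reports, epsilon=1.0))
```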

Keywords

differential privacy

HA2, Hörsalvägen 4, Chalmers University of Technology
Opponent: Marco Gaboardi, Assistant Professor, Department of Computer Science and Engineering, University at Buffalo, SUNY.

Author

Hamid Ebadi Tavallaei

Chalmers, Computer Science and Engineering, Information Security

Differential Privacy: Now it’s Getting Personal

ACM SIGPLAN Notices, Vol. 50 (2015), pp. 69-81

Journal article

Sampling and Partitioning for Differential Privacy

Privacy, Security and Trust Conference (2016), pp. 664-673

Paper in proceedings

Featherweight PINQ

Journal of Privacy and Confidentiality, Vol. 7 (2016), pp. 159-164

Journal article

Computers learn to recognise patterns and automate tasks by looking at
data. Data collection and processing usually involves sensitive personal data
such as location, age, medical information, opinions, or the way people behave.
That is why many people are concerned about their privacy when their
data is processed in different analyses. Common methods for protecting
people’s privacy include removing sensitive information such as names, email
addresses, or birth dates. However, in many cases, simply removing this type
of information is not enough to protect the data, since sensitive information
can sometimes be guessed from the rest of the data. Differential privacy
is a stronger method for gathering information that has recently become more
common. It protects participants’ privacy by making the data a little less
accurate. This means that researchers or companies can still collect and
analyse data, but they cannot track individual responses. This thesis focuses
on tools that help data scientists to build complex analyses that guarantee
differential privacy.
This thesis presents tools for data scientists who do not know much
about privacy. Analyses are created from building blocks that obey the
rules of differential privacy; the analyst can therefore combine them in
any way without compromising privacy.
Google uses differential privacy in the Chrome browser, and Apple has
started using it on its computers and mobile phones. Hopefully, a broader
adoption of these techniques will improve users’ privacy while allowing the
use of data in analyses that previously were not possible because of privacy
concerns.

Subject categories

Other computer and information science

Areas of Advance

Information and communication technology

ISBN

978-91-7597-685-3

Doctoral theses at Chalmers University of Technology. New series: 4366

Publisher

Chalmers


More information

Last updated

2018-04-27