Programming Language Techniques for Differential Privacy

Public presentation by Marco Vassena
on Wed. 10 May 2017 at 10:00-12:00 in room EL41
Data analysts mine large databases and crunch data in order to extract statistics and interesting patterns. However, whenever a database contains private data, people's privacy is jeopardized in the process. *Differential privacy* has recently emerged as an appealing, rigorous definition of privacy: by adding noise to query results, it protects the individuals in a database while still allowing data analysts to learn facts about the underlying population. Unfortunately, proving that a program is differentially private is a difficult and error-prone task. In this paper, we survey state-of-the-art applications of programming-language techniques that provide principled approaches and tool support to ease the analysis and verification of probabilistic, differentially private programs.
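To make the "adding noise to queries" idea concrete, the following is a minimal sketch of the classic Laplace mechanism (a standard construction from the differential-privacy literature, not code from the surveyed papers): a counting query has sensitivity 1, so adding Laplace noise with scale sensitivity/ε makes the released count ε-differentially private. The function name and parameters are illustrative.

```python
import numpy as np

def laplace_mechanism(true_answer, epsilon, sensitivity=1.0, rng=None):
    """Release `true_answer` with epsilon-differential privacy.

    Adds noise drawn from Laplace(0, sensitivity / epsilon); for a
    counting query ("how many rows satisfy P?") the sensitivity is 1,
    since one individual can change the count by at most 1.
    """
    rng = rng or np.random.default_rng()
    return true_answer + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: privately release a count of 100 with a privacy budget of 0.5.
rng = np.random.default_rng(0)
noisy_count = laplace_mechanism(100, epsilon=0.5, rng=rng)
```

Smaller ε means stronger privacy but noisier answers; the surveyed type systems and program logics track exactly this kind of sensitivity and privacy-budget accounting automatically.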

Introductory papers
  • The Algorithmic Foundations of Differential Privacy (Chapters 1-2)

Advanced papers
  • Distance Makes the Types Grow Stronger (ICFP 2010)
  • Differentially Private Bayesian Programming (CCS 2016)
  • (Optional) Linear Dependent Types for Differential Privacy (POPL 2013)
  • (Optional) Higher-Order Approximate Relational Refinement Types for Mechanism Design and Differential Privacy (POPL 2015)