ABSTRACT
A scheme that publishes aggregate information about sensitive data must resolve the trade-off between utility to information consumers and privacy of the database participants. Differential privacy [5] is a well-established definition of privacy: it is a universal guarantee against all attackers, whatever their side information or intent. Can we have a similarly universal guarantee for utility?
There are two standard models of utility considered in decision theory: Bayesian and minimax [13]. Ghosh et al. [8] show that a certain "geometric mechanism" gives optimal utility to all Bayesian information consumers. In this paper, we prove a similar result for minimax information consumers. Our result also holds for a wider class of information consumers, which includes Bayesian information consumers and hence subsumes the result of [8].
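As background (a sketch of our own, not code from the paper): the geometric mechanism of [8] perturbs the true count with two-sided geometric noise, where the probability of noise value z is proportional to alpha^|z| with alpha = exp(-epsilon). One way to sample such noise is as the difference of two i.i.d. one-sided geometric variables:

```python
import math
import random

def sample_geometric(p):
    """Number of failures before the first success, via inverse-CDF sampling."""
    u = random.random()
    return int(math.floor(math.log(1.0 - u) / math.log(1.0 - p)))

def geometric_mechanism(true_count, epsilon):
    """Perturb an integer count with two-sided geometric noise.

    Pr[noise = z] is proportional to alpha^|z| with alpha = exp(-epsilon),
    which yields epsilon-differential privacy for count queries.
    """
    alpha = math.exp(-epsilon)
    p = 1.0 - alpha  # success probability of each one-sided geometric
    # The difference of two i.i.d. one-sided geometrics is two-sided geometric.
    return true_count + sample_geometric(p) - sample_geometric(p)
```

The mechanism's output stays on the integers, which is what makes the discrete analysis of optimal remaps possible.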
We model information consumers as minimax (risk-averse) agents, each endowed with a loss function that models their tolerance for inaccuracy, and each possessing some side information about the query. Further, information consumers are rational in the sense that they actively combine information from the mechanism with their side information so as to minimize their loss. Under this assumption of rational behavior, we show that, for every fixed count query, the geometric mechanism is universally optimal for all minimax information consumers.
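To make the rationality assumption concrete, here is an illustrative brute-force sketch (our own construction, with hypothetical names, not code from the paper): a mechanism over finite alphabets is a matrix `M`, where `M[i][o]` is the probability of output `o` when the true count is `i`; a minimax consumer whose side information restricts the count to a feasible set picks the remap of outputs to actions that minimizes worst-case expected loss. For simplicity this sketch searches only over deterministic remaps:

```python
import itertools

def minimax_risk(M, remap, feasible, loss):
    """Worst-case (over feasible true counts) expected loss of a remap."""
    return max(
        sum(M[i][o] * loss(remap[o], i) for o in range(len(remap)))
        for i in feasible
    )

def best_minimax_remap(M, feasible, loss, actions):
    """Brute-force search over deterministic remaps (small alphabets only)."""
    num_outputs = len(M[0])
    return min(
        itertools.product(actions, repeat=num_outputs),
        key=lambda t: minimax_risk(M, t, feasible, loss),
    )
```

For example, with two feasible counts, a symmetric mechanism that reports the true count with probability 0.75, and absolute-error loss, the identity remap is minimax optimal.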
Additionally, our solution makes it possible to release query results to information consumers at different privacy levels in a collusion-resistant manner.
REFERENCES
- A. Blum, K. Ligett, and A. Roth. A learning theory approach to non-interactive database privacy. In Proceedings of the 40th Annual ACM Symposium on Theory of Computing (STOC), pages 609--618, 2008.
- I. Dinur and K. Nissim. Revealing information while preserving privacy. In Proceedings of the 22nd ACM SIGACT-SIGMOD-SIGART Symposium on Principles of Database Systems (PODS), pages 202--210, 2003.
- C. Dwork. Differential privacy. In Proceedings of the 33rd Annual International Colloquium on Automata, Languages, and Programming (ICALP), volume 4051 of Lecture Notes in Computer Science, pages 1--12, 2006.
- C. Dwork. Differential privacy: A survey of results. In 5th International Conference on Theory and Applications of Models of Computation (TAMC), volume 4978 of Lecture Notes in Computer Science, pages 1--19, 2008.
- C. Dwork, F. McSherry, K. Nissim, and A. Smith. Calibrating noise to sensitivity in private data analysis. In Third Theory of Cryptography Conference (TCC), volume 3876 of Lecture Notes in Computer Science, pages 265--284, 2006.
- C. Dwork, F. McSherry, and K. Talwar. The price of privacy and the limits of LP decoding. In Proceedings of the 39th Annual ACM Symposium on Theory of Computing (STOC), pages 85--94, 2007.
- C. Dwork and K. Nissim. Privacy-preserving datamining on vertically partitioned databases. In 24th Annual International Cryptology Conference (CRYPTO), volume 3152 of Lecture Notes in Computer Science, pages 528--544, 2004.
- A. Ghosh, T. Roughgarden, and M. Sundararajan. Universally utility-maximizing privacy mechanisms. In Proceedings of the 41st Annual ACM Symposium on Theory of Computing (STOC), pages 351--360, 2009.
- M. Hardt and K. Talwar. On the geometry of differential privacy. CoRR, abs/0907.3754, 2009.
- M. Hay, V. Rastogi, G. Miklau, and D. Suciu. Boosting the accuracy of differentially-private queries through consistency. To appear in Proceedings of the 36th International Conference on Very Large Data Bases (VLDB), 2010.
- S. P. Kasiviswanathan, H. K. Lee, K. Nissim, S. Raskhodnikova, and A. Smith. What can we learn privately? In Proceedings of the 49th Annual IEEE Symposium on Foundations of Computer Science (FOCS), pages 531--540, 2008.
- S. P. Kasiviswanathan and A. Smith. A note on differential privacy: Defining resistance to arbitrary side information. http://arxiv.org/abs/0803.3946v1, 2008.
- G. Loomes and R. Sugden. Regret theory: An alternative theory of rational choice under uncertainty. Economic Journal, 92(368):805--824, December 1982.
- A. Mas-Colell, M. D. Whinston, and J. R. Green. Microeconomic Theory. Oxford University Press, New York, 1995.
- F. McSherry and K. Talwar. Mechanism design via differential privacy. In Proceedings of the 48th Annual IEEE Symposium on Foundations of Computer Science (FOCS), pages 94--103, 2007.
- A. Narayanan and V. Shmatikov. Robust de-anonymization of large sparse datasets. In Proceedings of the 2008 IEEE Symposium on Security and Privacy (SP), pages 111--125, 2008.
- K. Nissim, S. Raskhodnikova, and A. Smith. Smooth sensitivity and sampling in private data analysis. In Proceedings of the 39th Annual ACM Symposium on Theory of Computing (STOC), pages 75--84, 2007.
- California Department of Public Health. H1N1 flu--data tables. http://www.cdph.ca.gov/HealthInfo/discond/Documents/H1N1-Data-Table-CA-Cases-by-County-102409.pdf, October 2009.
- D. G. Poole. The stochastic group. American Mathematical Monthly, 102:798--801, 1995.
- L. Sweeney. k-anonymity: a model for protecting privacy. International Journal on Uncertainty, Fuzziness and Knowledge-based Systems, 10(5):557--570, 2002.
- Wikipedia. AOL search data scandal. http://en.wikipedia.org/wiki/AOL_search_data_scandal.
- M. Wu, W. Trappe, Z. J. Wang, and K. J. R. Liu. Collusion-resistant fingerprinting for multimedia. IEEE Signal Processing Magazine, 21(2):15--27, 2004.
- X. Xiao, Y. Tao, and M. Chen. Optimal random perturbation at multiple privacy levels. In Proceedings of the 35th International Conference on Very Large Data Bases (VLDB), pages 814--825, 2009.
Universally optimal privacy mechanisms for minimax agents