1
October 2012
CCS '12: Proceedings of the 2012 ACM conference on Computer and communications security
Publisher: ACM
Bibliometrics:
Citation Count: 0
Downloads (6 Weeks): 3, Downloads (12 Months): 47, Downloads (Overall): 211
Full text available:
PDF
The need for privacy-aware policies, regulations, and techniques has been widely recognized. This workshop discusses the problems of privacy in the global interconnected societies and possible solutions. The 2012 Workshop, held in conjunction with the ACM CCS conference, is the eleventh in a yearly forum for papers on all the ...
Keywords:
privacy
Title:
11th workshop on privacy in the electronic society
CCS:
Privacy policies
Security and privacy
Human and societal aspects of security and privacy
Keywords:
privacy
Abstract:
The need for privacy-aware policies, regulations, and techniques has been widely recognized. This workshop discusses the problems of privacy in the global interconnected societies and possible solutions. The 2012 ... yearly forum for papers on all the different aspects of privacy
Primary CCS:
Privacy policies
Security and privacy
Human and societal aspects of security and privacy
References:
D. Dittrich, M. Bailey, and S. Dietrich. Building an Active Computer Security Ethics Community. IEEE Security and Privacy, 9(4):32--40, 2011.
Full Text:
11th Workshop on Privacy in the Electronic Society. Nikita Borisov, University of Illinois at Urbana-Champaign, Urbana, IL, nikita@illinois.edu. ABSTRACT: The need for privacy-aware policies, regulations, and techniques has been widely recognized. This workshop discusses the problems of privacy in the global interconnected societies and possible solutions. The 2012 ... electronic society. Categories and Subject Descriptors: K.4.1 [Computers and Society]: Public Policy Issues - Privacy; K.6.m [Management of Computing and Information Systems]: Miscellaneous - Security. General Terms: Algorithms, ... becoming easier to collect, exchange, access, process, and link information. Privacy issues have been the subject of public debates, and the need for privacy-aware policies, regulations, and techniques has been widely recognized. Since the workshop's ... well as experimental studies of fielded systems. During this time, privacy-related research has seen tremendous growth and can be found in ... to cyber-physical systems. The workshop takes a cross-cutting approach to privacy, bringing people from various communities together and in addition ... Raleigh, North Carolina, USA. ACM 978-1-4503-1651-4/12/10. Topics include: anonymity, pseudonymity, and unlinkability; data privacy; economics of privacy; electronic commerce privacy; health information privacy; identity management; location privacy; personally identifiable information; privacy and anonymity in the Web; privacy and confidentiality management; privacy and data mining; privacy and human rights; privacy enhancing technologies; privacy in health care and public administration; privacy in mobile computing; privacy in pervasive and ubiquitous computing; privacy in social networks; privacy in the cloud systems; privacy in the electronic records; privacy metrics; privacy policies; privacy threats; privacy vs. security; privacy-aware access control; privacy-preserving computation; public records and personal privacy; traffic analysis; unobservability.
2. PAPERS: The workshop received 48 submissions, of ... recent attention due to concerns about user tracking and the corresponding privacy impact; 3 papers in the program address various aspects of this problem. Another key privacy issue is ensuring that users' privacy expectations are met; the ... contains papers that study users' expectations, evaluate the usability of tools, highlight privacy risks, and help users make sense of policies. Finally, several papers propose new privacy enhancing technologies that enable privacy-preserving genomic tests, allow old data to disappear from the Internet, ... understand computer systems, we often gather data that puts users' privacy at risk. The last few years have seen a growing amount ... entire workshops held on the topic. As a workshop focused on privacy, the committee felt that WPES should maintain high standards of ...
... the University of Illinois at Urbana-Champaign. His research interests are online privacy, network security, and Internet-scale distributed systems. The program committee is ... S. Dietrich. Building an Active Computer Security Ethics Community. IEEE Security and Privacy,
2
November 2009
CIKM '09: Proceedings of the 18th ACM conference on Information and knowledge management
Publisher: ACM
Bibliometrics:
Citation Count: 12
Downloads (6 Weeks): 3, Downloads (12 Months): 29, Downloads (Overall): 318
Full text available:
PDF
User search query logs have proven to be very useful, but have vast potential for misuse. Several incidents have shown that simple removal of identifiers is insufficient to protect the identity of users. Publishing such inadequately anonymized data can cause severe breach of privacy. While significant effort has been expended ...
Keywords:
privacy
CCS:
Theory of database privacy and security
Security and privacy
Keywords:
privacy
Abstract:
... Publishing such inadequately anonymized data can cause severe breach of privacy. While significant effort has been expended on coming up ... such as the diversity of queries and the causes of privacy breach. This necessitates the need to design privacy models and techniques specific to this environment. This paper takes ... to achieve such anonymization. We analyze the inherent utility and privacy
Primary CCS:
Theory of database privacy and security
Security and privacy
References:
A. Cooper. A survey of query log privacy-enhancing techniques from a policy perspective. TWEB, 2(4), 2008.
C. Dwork. Differential privacy. In 33rd International Colloquium on Automata, Languages and Programming (ICALP 2006), pages 1--12, Venice, Italy, July 9-16 2006.
L. Sweeney. k-anonymity: a model for protecting privacy. Int. J. Uncertain. Fuzziness Knowl.-Based Syst., 10(5):557--570, 2002.
Full Text:
... Publishing such inadequately anonymized data can cause severe breach of privacy. While significant effort has been expended on coming up with ... aspects, such as the diversity of queries and the causes of privacy breach. This necessitates the need to design privacy models and techniques specific to this environment. This paper takes ... techniques to achieve such anonymization. We analyze the inherent utility and privacy trade-off, and experimentally validate the performance of our techniques. Categories and ... to develop anonymization methods to publish query log data without breaching privacy or diminishing utility. Indeed, query logs differ in several important characteristics, and their anonymization problem involves new challenges. First, the causes of privacy breach are not the same. Privacy breaches occur in released tabular data mainly due to the ... In the query logs, however, one of the big causes of privacy leaks is that the queries issued by a particular user unintentionally ... given query log QL into another query log QL' satisfying some privacy requirements. The framework of k-anonymity [11, 12] guarantees that every individual is ... applicability is widely accepted, and it does provide a measure of privacy. In this paper, we first follow the spirit of k-anonymity ...
... a large-scale query log to evaluate our approach and examine both privacy and utility behavior. 4.1 Clustering and Anonymization: Figure 2(a) shows that it ...
... data utility. Cooper [6] provides a survey on the query log privacy-enhancing techniques from a policy perspective, including log deletion, hashing queries, identifier ... select and publish a subset of search query logs providing sound privacy guarantees (based on Differential Privacy [7]). However, we resolve this problem in a different way, and ... and Sweeney [11] first introduced the concept of k-anonymity to protect privacy. k-anonymity is the standard privacy model used, for example, by HIPAA and the European Union Data ... statistical analyses prove useful, releasing them without proper anonymization may compromise privacy of persons as query logs can potentially contain users' personal information. ... have several limitations, we will explore the applicability of other privacy models to this domain. 7. REFERENCES: [1] E. Adar. User 4xxxxx9: Anonymizing ... pages 875--883. ACM, 2008. [6] A. Cooper. A survey of query log privacy-enhancing techniques from a policy perspective. TWEB, 2(4), 2008. [7] C. Dwork. Differential privacy. In 33rd International Colloquium on Automata, Languages and Programming (ICALP ... NY, USA, 1998. ACM. [12] L. Sweeney. k-anonymity: a model for protecting privacy.
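The excerpt leans on k-anonymity [11, 12]: every individual must be indistinguishable among at least k records. A minimal sketch of checking that property over generalized quasi-identifiers; the helper name and toy "query log" rows below are illustrative, not from the paper:

```python
from collections import Counter

def is_k_anonymous(rows, quasi_ids, k):
    """Check k-anonymity: every combination of quasi-identifier
    values must be shared by at least k rows (Sweeney, 2002)."""
    counts = Counter(tuple(row[q] for q in quasi_ids) for row in rows)
    return all(c >= k for c in counts.values())

# Toy query-log rows with already-generalized quasi-identifiers.
log = [
    {"zip": "100**", "age": "20-30", "query": "flu symptoms"},
    {"zip": "100**", "age": "20-30", "query": "k-anonymity"},
    {"zip": "100**", "age": "20-30", "query": "weather"},
    {"zip": "481**", "age": "40-50", "query": "privacy"},
]

print(is_k_anonymous(log, ["zip", "age"], k=3))       # False: the 481** group has only 1 row
print(is_k_anonymous(log[:3], ["zip", "age"], k=3))   # True: one group of exactly 3 rows
```

As the excerpt notes, query logs break this tabular picture: the queries themselves can identify a user, so the quasi-identifier set is not fixed in advance.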
3
August 2003
KDD '03: Proceedings of the ninth ACM SIGKDD international conference on Knowledge discovery and data mining
Publisher: ACM
Bibliometrics:
Citation Count: 164
Downloads (6 Weeks): 9, Downloads (12 Months): 141, Downloads (Overall): 1,736
Full text available:
PDF
Privacy and security concerns can prevent sharing of data, derailing data mining projects. Distributed knowledge discovery, if done correctly, can alleviate this problem. The key is to obtain valid results, while providing guarantees on the (non)disclosure of data. We present a method for k -means clustering when different sites contain ...
Keywords:
privacy
CCS:
Theory of database privacy and security
Security and privacy
Keywords:
privacy
Abstract:
Privacy and security concerns can prevent sharing of data, derailing data ...
Title:
Privacy-preserving k-means clustering over vertically partitioned data
References:
D. Agrawal and C. C. Aggarwal. On the design and quantification of privacy preserving data mining algorithms. In Proceedings of the Twentieth ACM SIGACT-SIGMOD-SIGART Symposium on Principles of Database Systems, pages 247--255, Santa Barbara, California, USA, May 21--23 2001. ACM.
R. Agrawal and R. Srikant. Privacy-preserving data mining. In Proceedings of the 2000 ACM SIGMOD Conference on Management of Data, pages 439--450, Dallas, TX, May 14--19 2000. ACM.
W. Du and M. J. Atallah. Privacy-preserving statistical analysis. In Proceedings of the 17th Annual Computer Security Applications Conference, New Orleans, Louisiana, USA, December 10--14 2001.
A. Evfimievski, R. Srikant, R. Agrawal, and J. Gehrke. Privacy preserving mining of association rules. In The Eighth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 217--228, Edmonton, Alberta, Canada, July 23--26 2002.
M. Kantarcioglu and C. Clifton. Privacy-preserving distributed mining of association rules on horizontally partitioned data. In The ACM SIGMOD Workshop on Research Issues on Data Mining and Knowledge Discovery (DMKD'02), pages 24--31, Madison, Wisconsin, June 2 2002.
M. Kantarcioĝlu and C. Clifton. Privacy-preserving distributed mining of association rules on horizontally partitioned data. IEEE-TKDE, submitted.
M. Kantarcioglu and J. Vaidya. An architecture for privacy-preserving mining of client information. In C. Clifton and V. Estivill-Castro, editors, IEEE International Conference on Data Mining Workshop on Privacy, Security, and Data Mining, volume 14, pages 37--42, Maebashi City, Japan, Dec. 9 2002. Australian Computer Society.
X. Lin and C. Clifton. Privacy preserving clustering with distributed EM mixture modeling. Knowledge and Information Systems, submitted.
Y. Lindell and B. Pinkas. Privacy preserving data mining. In Advances in Cryptology - CRYPTO 2000, pages 36--54. Springer-Verlag, Aug. 20--24 2000.
S. J. Rizvi and J. R. Haritsa. Maintaining data privacy in association rule mining. In Proceedings of 28th International Conference on Very Large Data Bases, pages 682--693, Hong Kong, Aug. 20--23 2002. VLDB.
J. Vaidya and C. Clifton. Privacy preserving association rule mining in vertically partitioned data. In The Eighth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 639--644, Edmonton, Alberta, Canada, July 23--26 2002.
Full Text:
Privacy-Preserving K-Means Clustering over Vertically Partitioned Data ... and protection; H.2.4 [Database Management]: Systems - Distributed databases. General Terms: Security. Keywords: Privacy. 1. INTRODUCTION: Data mining and privacy are often perceived to be at odds, witness the recent U.S. ... a "Data Mining Moratorium Act" [11]. Data mining results rarely violate privacy, as they generally reveal high-level knowledge rather than disclosing instances of data. However, the concern among privacy advocates is well founded, as bringing data together to support data ... to an individual's entire financial history. This raises justifiable concerns among privacy advocates. Privacy and data mining can coexist. The problem with the above ... effective regulations on what transactions must be reported could actually improve both privacy and the ability to detect criminal activity. While obtaining globally meaningful ... how results from secure multiparty computation can be used to generate privacy-preserving data mining algorithms. We assume vertically partitioned data: The data ...
... kept private; if sharing them is desired, an evaluation of privacy/secrecy concerns can be performed after the values are known. At first ... the k-means algorithm on its own data. This would preserve complete privacy. Figure 1 shows why this will not work. Assume we ... discussion of related work, as well as suggestions for future research. 2. PRIVACY PRESERVING K-MEANS ALGORITHM: We now formally define the problem. Let r ...
... a threshold? This is shown formally in Algorithm 1. Algorithm 1 Privacy Preserving k-means clustering. Require: r parties, k clusters, n points. 1: for ...
... Multiparty Computation. 3.1 Secure Multi-Party Computation: To prove that our k-means algorithm preserves privacy, we need to define privacy preservation. We use the framework defined in Secure Multiparty Computation. Yao first ... private two-party computation in the semi-honest model is given below. Definition 1 (privacy w.r.t. semi-honest behavior) [14]: Let f : {0, 1}* x {0, 1}* ...
... with the minimum row sum). To prove this algorithm is privacy preserving, we must show that each party can construct a polynomial ...
... be secure, application of the composition theorem proves that Algorithm 3 preserves privacy. 3.4 Stopping Criterion: Before analyzing the security of the entire k ...
... reputation. Similar legal and reputation safeguards could be enforced for privacy-preserving data mining. In addition, if there were not at least ...
... data to one site. This gives O(n) bits in one round. Privacy adds a factor of O(r+k) rounds and O(rk) bit communication ... CONCLUSIONS: There has been work in distributed clustering that does not consider privacy issues, e.g., [6, 19]. Generally, the goal of this work is ... to compute the global patterns. However, sharing local patterns inherently compromises privacy. Our work ensures reasonable privacy while limiting communication cost. There has recently been a surge in interest in privacy-preserving data mining. One approach is to add "noise" to the ... the data mining results [2, 1, 10, 27]. The approach of protecting privacy of distributed sources was first addressed for the construction of decision ... followed the secure multiparty computation approach discussed below, achieving "perfect" privacy, i.e., nothing is learned that could not be deduced from ... work makes trade-offs between efficiency and information disclosure, all maintain provable privacy of individual information and bounds on disclosure, and disclosure is limited ... data. An important extension to our work would be to allow privacy preserving computation of such estimators, giving higher confidence in clustering results. ... discussion can be found in [8]. Currently, assembling these into efficient privacy-preserving data mining algorithms, and proving them secure, is a ... be combined to implement a standard data mining algorithm with provable privacy and information disclosure properties. Our hope is that as the library ... grow, standard methods will develop to ease the task of developing privacy-preserving data mining techniques. 7. ACKNOWLEDGMENTS: We thank Patricia Clifton for comments, corrections, ...
... Agrawal and C. C. Aggarwal. On the design and quantification of privacy preserving data mining algorithms. In Proceedings of the Twentieth ACM SIGACT-SIGMOD-SIGART Symposium ... USA, May 21-23 2001. ACM. [2] R. Agrawal and R. Srikant. Privacy-preserving data mining. In Proceedings of the 2000 ACM SIGMOD Conference on ... Volume 1759, pp. 245-260, 2000). [7] W. Du and M. J. Atallah. Privacy-preserving statistical analysis. In Proceedings of the 17th Annual Computer Security Applications ... Sons, 1973. [10] A. Evfimievski, R. Srikant, R. Agrawal, and J. Gehrke. Privacy preserving mining of association rules. In The Eighth ACM SIGKDD International Conference ... Theory of Computing, pages 218--229, 1987. [16] M. Kantarcioglu and C. Clifton. Privacy-preserving distributed mining of association rules on horizontally partitioned data. In The ... 24--31, Madison, Wisconsin, June 2 2002. [17] M. Kantarcioĝlu and C. Clifton. Privacy-preserving distributed mining of association rules on horizontally partitioned data. IEEE-TKDE, submitted. [18] ... and V. Estivill-Castro, editors, IEEE International Conference on Data Mining Workshop on Privacy, Security, and Data Mining, volume 14, pages 37--42, Maebashi City, ... San Francisco, California, 2001. ACM Press. [21] X. Lin and C. Clifton. Privacy preserving clustering with distributed EM mixture modeling. Knowledge and Information Systems, submitted. [22] Y. Lindell and B. Pinkas. Privacy preserving data mining. In Advances in Cryptology - CRYPTO 2000, pages 36--54. ...
... Kong, Aug. 20-23 2002. VLDB. [28] J. Vaidya and C. Clifton. Privacy preserving association rule mining in vertically partitioned data. In The Eighth ACM ...
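The vertical-partitioning assumption above can be illustrated in the clear: the squared Euclidean distance from a point to a cluster center decomposes into per-site shares over each site's own attributes, and the paper's protocol combines such shares with secure multiparty computation so that no site learns another's partial sum. A plain-Python sketch of the decomposition only (names are illustrative; no cryptography is shown):

```python
# Squared Euclidean distance decomposes over vertical partitions:
# each site computes its share over the attributes it holds. In the
# paper the shares are combined via secure summation; here the
# combination is done in the clear purely to show the decomposition.

def site_share(point_part, center_part):
    """One site's contribution over the attributes it holds."""
    return sum((x - c) ** 2 for x, c in zip(point_part, center_part))

# A 4-dimensional point split across two sites (2 attributes each).
site_a_point, site_b_point = (1.0, 2.0), (3.0, 4.0)
site_a_center, site_b_center = (0.0, 0.0), (0.0, 0.0)

shares = [site_share(site_a_point, site_a_center),
          site_share(site_b_point, site_b_center)]
total = sum(shares)   # what a secure summation would reveal
print(total)          # 30.0, the full squared distance 1+4+9+16
```

Because the cluster-assignment step only needs the minimum of such totals across the k centers, the protocol never has to expose individual attribute values.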
4
May 2017
SIGMOD '17: Proceedings of the 2017 ACM International Conference on Management of Data
Publisher: ACM
Bibliometrics:
Citation Count: 0
Downloads (6 Weeks): 76, Downloads (12 Months): 178, Downloads (Overall): 178
Full text available:
PDF
Many modern databases include personal and sensitive correlated data, such as private information on users connected together in a social network, and measurements of physical activity of single subjects across time. However, differential privacy, the current gold standard in data privacy, does not adequately address privacy issues in this kind ...
Keywords:
differential privacy, privacy, pufferfish privacy
Title:
Pufferfish Privacy Mechanisms for Correlated Data
CCS:
Security and privacy
Human and societal aspects of security and privacy
Privacy protections
Keywords:
differential privacy
privacy
pufferfish privacy
Abstract:
... of physical activity of single subjects across time. However, differential privacy, the current gold standard in data privacy, does not adequately address privacy issues in this kind of data. This work looks at a recent generalization of differential privacy, called Pufferfish, that can be used to address privacy in correlated data. The main challenge in applying Pufferfish is ... computationally efficient. Our experimental evaluations indicate that this mechanism provides privacy and utility for synthetic as well as real data in ...
Primary CCS:
Security and privacy
Human and societal aspects of security and privacy
Privacy protections
References:
R. Bassily, A. Groce, J. Katz, and A. Smith. Coupled-worlds privacy: Exploiting adversarial uncertainty in statistical data privacy. In FOCS, 2013.
R. Chen, N. Mohammed, B. C. Fung, B. C. Desai, and L. Xiong. Publishing set-valued data via differential privacy. VLDB Endowment, 2011.
C. Dwork and J. Lei. Differential privacy and robust statistics. In STOC, 2009.
C. Dwork and A. Roth. The algorithmic foundations of differential privacy. TCS, 9(3--4):211--407, 2013.
A. Ghosh and R. Kleinberg. Inferential privacy guarantees for differentially private mechanisms. arXiv preprint arXiv:1603.01508, 2016.
X. He, A. Machanavajjhala, and B. Ding. Blowfish privacy: tuning privacy-utility trade-offs using policies. In SIGMOD '14, pages 1447--1458, 2014.
S. Kessler, E. Buchmann, and K. Böhm. Deploying and evaluating pufferfish privacy for smart meter data. Karlsruhe Reports in Informatics, 1, 2015.
D. Kifer and A. Machanavajjhala. Pufferfish: A framework for mathematical privacy definitions. ACM Trans. Database Syst., 39(1):3, 2014.
C. Li, M. Hay, V. Rastogi, G. Miklau, and A. McGregor. Optimizing linear counting queries under differential privacy. In PODS '10.
C. Li and G. Miklau. An adaptive mechanism for accurate query answering under differential privacy. VLDB, 2012.
C. Liu, S. Chakraborty, and P. Mittal. Dependence makes you vulnerable: Differential privacy under dependent tuples. In NDSS 2016, 2016.
A. Machanavajjhala, D. Kifer, J. Abowd, J. Gehrke, and L. Vilhuber. Privacy: Theory meets practice on the map. In ICDE, 2008.
F. McSherry and K. Talwar. Mechanism design via differential privacy. In FOCS, 2007.
A. Sarwate and K. Chaudhuri. Signal processing and machine learning with differential privacy: Algorithms and challenges for continuous data. Signal Processing Magazine, IEEE, 30(5):86--94, Sept 2013.
Y. Xiao and L. Xiong. Protecting locations with differential privacy under temporal correlations. In Proceedings of the 22nd ACM SIGSAC CCS.
B. Yang, I. Sato, and H. Nakagawa. Bayesian differential privacy on correlated data. In SIGMOD '15.
Full Text:
Pufferfish Privacy Mechanisms for Correlated Data. Shuang Song, UC San Diego, shs037@eng.ucsd.edu; Yizhen Wang, UC San Diego, yiw248@eng.ucsd.edu; Kamalika ... across time. However, differential privacy, the current gold standard in data privacy, does not adequately address privacy issues in this kind of data. This work looks at a ... and is computationally efficient. Our experimental evaluations indicate that this mechanism provides privacy and utility for synthetic as well as real data in two separate domains. CCS Concepts: Security and privacy → Privacy protections. Keywords: privacy, differential privacy, Pufferfish privacy. 1. INTRODUCTION: Modern database applications increasingly involve personal data, ... data while still preserving privacy. For the past several years, differential privacy [8] has emerged as the gold standard in data privacy, and there is a large body of work on differentially ... building management, also increasingly involve a different setting: correlated data, privacy issues in which are not as well-understood. Consider, for example, a ... human activities change slowly over time. What is a good notion of privacy for this example? Since the data belongs to a single subject, differential privacy is not directly applicable; however, a modified version called entry-privacy [15] applies. Entry-privacy ensures that the inclusion of a single time-series entry does not ... continue for several time points. A related notion is group differential privacy [9], which extends the definition to participation of entire groups of ... individuals or entries. Here, all entries are correlated, and hence group differential privacy will add O(T) noise to a histogram over T measurements, thus destroying all utility.
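The noise blow-up described above is easy to see numerically: treating T fully correlated measurements as one group raises a count query's sensitivity from 1 to T, so the Laplace noise scale grows from 1/ε to T/ε. A tiny illustrative sketch; the parameter values and function name are made up for illustration:

```python
# Group differential privacy over T fully correlated entries treats the
# whole length-T series as one "group": a count query's sensitivity
# grows from 1 to T, and the Laplace noise scale grows proportionally.

def laplace_scale(sensitivity, epsilon):
    """Scale parameter of the Laplace noise for a given sensitivity."""
    return sensitivity / epsilon

epsilon, T = 1.0, 10_000            # e.g. activity measurements over time
print(laplace_scale(1, epsilon))    # 1.0: independent-entry setting
print(laplace_scale(T, epsilon))    # 10000.0: group DP, noise swamps any count
```

This is the gap the paper's Pufferfish mechanisms aim to close: calibrating noise to how far correlations actually propagate, rather than to the worst-case group size.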
Thus to address privacy challenges in this kind of data, we need a different privacy notion. A generalized version of differential privacy called Pufferfish was proposed by [20]. In Pufferfish, privacy requirements are specified through three components: S, a set ...
... hide private values against correlation across multiple entries/individuals; second, unlike group privacy, it also allows utility in cases where a large number ... of the mechanism for the physical activity measurement example. We provide privacy and utility guarantees, establish composition properties, and finally demonstrate the practical ... instantiation and is a generalization of the Laplace mechanism for differential privacy. We call this the Wasserstein Mechanism. Since the above mechanism ... multiple times over the same database leads to a gracefully decaying privacy parameter. This makes the mechanism particularly attractive as Pufferfish privacy does not always compose [20]. We derive a simplified and ... see surveys [30, 9]. As we explain earlier, differential privacy is not the right formalism for the kind of applications we consider. A related framework is coupled-worlds privacy [2]; while it can take data distributions into account through a ... suitable for our applications. We remark that while mechanisms for specific coupled-worlds privacy frameworks exist, there is also no generic coupled-worlds privacy mechanism. Our work instead uses Pufferfish, a recent generalization of differential privacy [20]. [20, 18] provide some specific instances of Pufferfish frameworks ... not apply to our physical activity monitoring example. [24] designs Pufferfish privacy mechanisms for distribution classes that include Markov Chains. Their mechanism adds ...
... "only" is known. [33] releases time-varying location trajectories under differential privacy while accounting for temporal correlations using a Hidden Markov Model ... different frequencies. In concurrent work, [14] provide an alternative algorithm for Pufferfish privacy when data can be written as X = (X1, . . . ... single subjects as in our work. 2. THE SETTING: To motivate our privacy framework, we use two applications: physical activity monitoring of ... release (an approximation to) Σi Xi, the number of infected people, while ensuring privacy against an adversary who wishes to detect whether a particular person ... workplace-level or school-level), so individuals do not control their participation. 2.1 The Privacy Framework: The privacy framework of our choice is Pufferfish [20], an elegant generalization of differential privacy [8]. A Pufferfish framework is instantiated by three parameters: a ... adversary may hold about the data, and the goal of the privacy framework is to ensure indistinguishability in the face of these beliefs. Definition 2.1. A privacy mechanism M is said to be ε-Pufferfish private in a ... ≠ 0, P(sj | θ) ≠ 0. Readers familiar with differential privacy will observe that unlike differential privacy, the probability in (1) is with respect to the ... and sj, compared to the initial belief. [20] shows that Differential Privacy is a special case of Pufferfish, where S is the set ... wisely; if Θ is too restrictive, then we may not have privacy against legitimate adversaries, and if Θ is too large, then the resulting privacy mechanisms may have little utility. Finally, Pufferfish privacy
that the privacy guarantees may not decay gracefully as more computations are carried out on the same data. However, some of our privacy mechanisms themselves have good composition properties; see Section 4.3 for more details. A related framework is Group Differential Privacy [9], which applies to databases with groups of correlated records. Definition 2.2 (Group Differential Privacy): Let D be a database with n records, and let ... subsets Gi ⊆ {1, . . . , n}. A privacy mechanism M is said to be ε-group differentially private with ... next illustrate how instantiating these examples in Pufferfish would provide effective privacy and utility. Example 1: Physical Activity Measurement. Let A be the set ... , [[0.9, 0.1, 0.0], [0.0, 0.9, 0.1], [0.1, 0.0, 0.9]]}. In this example: differential privacy does not directly apply since we have a single person's data; entry differential privacy [15, 20] and coupled worlds privacy [2] add noise with standard ... equal to the mixing time over Θ, and thus offer both privacy and utility for rapidly mixing chains. Example 2: Flu Status over Social ... |Ci|. Similarly, for the flu status example, we have: both differential privacy and coupled worlds privacy add noise with standard deviation ≈ 1/ε ... to hide evidence of Alice's flu status. Note that unlike differential privacy, as the decision to participate is made at group level, ... cannot argue it is enough to hide her participation. Group differential privacy will add noise proportional to the size of the largest connected ... "average spread" of flu is low. Again, we can achieve Pufferfish privacy by adding noise proportional to the "average spread" of flu, which may be less noise than group differential privacy. For a concrete numerical example, see Section 3.2.
2.3 Guarantee Against Adversaries: A natural question is what happens when we offer Pufferfish privacy with respect to a distribution class Θ, but the adversary's belief ... Θ. Our first result is to show that the loss in privacy is not too large if Θ̃ is close to Θ conditioned ...
... into a scalar, we design mechanism M that satisfies ε-Pufferfish privacy in this instantiation and approximates F(X). Our proposed mechanism is inspired by the Laplace mechanism in differential privacy; the latter adds noise to the result of the ...
... in Theorem 3.2. Observe that when Pufferfish reduces to differential privacy, then the corresponding Wasserstein Mechanism reduces to the Laplace mechanism; it ... Algorithm 1 Wasserstein Mechanism (Database D, query F, Pufferfish framework (S, Q, Θ), privacy parameter ε): for all (si, sj) ∈ Q and all θ ... Lap(4/ε) noise, which gives worse utility. 3.2 Performance Guarantees: Theorem 3.2 (Wasserstein Privacy): The Wasserstein Mechanism provides ε-Pufferfish privacy in the framework (S, Q, Θ). Utility: Because of the extreme generality of ... a Pufferfish framework, and let G be the corresponding group differential privacy framework (so that G includes a group G for each set ... equal to the global sensitivity of F in the G-group differential privacy framework. 4. A MECHANISM FOR BAYESIAN NETWORKS: The Wasserstein Mechanism, while general, ...
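The special case the excerpt names, where Pufferfish reduces to differential privacy and the Wasserstein Mechanism reduces to the Laplace mechanism, can be sketched directly. A minimal, illustrative implementation of the standard ε-DP Laplace mechanism (function names are my own; the paper's general mechanism would replace `sensitivity` with the worst-case Wasserstein distance between the conditional distributions P(F(X)|si,θ) and P(F(X)|sj,θ)):

```python
import math
import random

def laplace_noise(scale, rng):
    """Draw a Lap(0, scale) sample via inverse-CDF sampling."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(true_value, sensitivity, epsilon, seed=0):
    """eps-DP Laplace mechanism: add Lap(sensitivity / epsilon) noise
    to the query result. Seeded here only to make the sketch repeatable."""
    rng = random.Random(seed)
    return true_value + laplace_noise(sensitivity / epsilon, rng)

count = 42  # counting queries have sensitivity 1
print(laplace_mechanism(count, sensitivity=1.0, epsilon=0.5))
```

Averaged over many runs the output is centered on the true count, with spread growing as ε shrinks.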
... Details are presented in Algorithm 2, and Theorem 4.3 establishes its privacy properties. Vector-Valued Functions: The mechanism can be easily generalized to vector-valued functions. ... each coordinate of F guarantees ε-Pufferfish privacy. Theorem 4.3 (Markov Quilt Mechanism Privacy): If F is L-Lipschitz, and if each SQ,i contains the ... XR = ?), then the Markov Quilt Mechanism preserves ε-Pufferfish privacy ... in the instantiation (S, Q, Θ) described in Section 4.1. 4.3 Composition: Unlike differential privacy, Pufferfish privacy ... does not always compose [20], in the sense that the privacy parameter may not decay gracefully as the same data (or ...
... Z, where Z ∼ Lap(1) ... data) is used in multiple privacy-preserving computations. We show below that the Markov Quilt Mechanism does ... Mk(D) denote the Markov Quilt Mechanism that publishes Fk(D) with ε-Pufferfish privacy under (S, Q, Θ) using Markov Quilt sets {SQ,i} for i = 1, . . . , n. Then publishing (M1(D), . . . , MK(D)) guarantees Kε-Pufferfish privacy under (S, Q, Θ). To prove the theorem, we define active Markov Quilt ... and transition matrix ([0.8, 0.2], [[0.9, 0.1], [0.4, 0.6]]). Suppose we want to guarantee Pufferfish privacy with ε = 10. Consider the middle node X2. The possible ... use the same active Markov Quilt. This holds automatically if the privacy parameter ε and Markov Quilt set SQ,i are the same for all Mk. If different levels of privacy {εk} for k = 1, . . . , K are required, we can guarantee K·max_k εk-Pufferfish privacy as long as the same SQ,i is used across Mk. 4.4 Case ... searching over all θ ∈ Θ. The improvement preserves ε-Pufferfish privacy, but might come at the expense of some additional loss ...
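The composition guarantee quoted above (K releases at ε each compose to Kε, and to K·max_k εk when the levels differ but the Markov Quilt sets are shared) can be sketched as a toy privacy-budget accountant. The function names are illustrative, not from the paper:

```python
# Sequential composition as stated in the excerpt: publishing
# (M_1(D), ..., M_K(D)), each eps-Pufferfish private under the same
# framework and Markov Quilt sets, yields K*eps-Pufferfish privacy.

def composed_budget(epsilons):
    """Total privacy parameter after one release per mechanism."""
    return sum(epsilons)

# Three releases at eps = 10 each, as in the excerpt's Markov chain example:
print(composed_budget([10, 10, 10]))      # 30

# With differing per-release levels, the quoted guarantee is K * max_k eps_k:
def worst_case_budget(epsilons):
    return len(epsilons) * max(epsilons)

print(worst_case_budget([2, 5, 10]))      # 30
```

The contrast with generic Pufferfish, which need not compose at all, is the point: here the budget degrades linearly and predictably.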
... up thealgorithm considerably.Algorithm 3 MQMExact(Database D, L-Lipschitz queryF , ?, privacy parameter ?, maximum Markov Quilt sizeparameter `)for all ? ? ...
... queryF , ? containing Markov chains of length T , privacy param-eter ?, maximum Markov Quilts length `for all Xi doSQ,i ... ? consists of rapidly mixing chains, then Algorithm 4provides both privacy
... this section is to address the following questions:
1. What is the privacy-utility tradeoff offered by the Markov Quilt Mechanism as a function of the privacy parameter ε and the distribution class Θ?
2. How does this tradeoff compare against existing baselines, such as [14] and group differential privacy?
3. What is the accuracy-run time tradeoff offered by the MQMApprox ... that are representative of three different privacy regimes: ε = 0.2 (high privacy), 1 (moderate privacy), 5 (low privacy). All run times are reported for a desktop with a 3.00 GHz processor and 8 GB memory.
Algorithms. Our experiments involve four mechanisms that guarantee ε-Pufferfish privacy: GroupDP, GK16, MQMApprox and MQMExact. GroupDP is a simple baseline that ... patterns. While in theory this task can be achieved with differential privacy, this gives poor utility as the group sizes are small. ...
... detailed study of how Pufferfish may be applied to achieve privacy in correlated data problems. We establish robustness properties of Pufferfish against ... data. Our results demonstrate that Pufferfish offers a good solution for privacy in these problems. We believe that our work is a first step towards a comprehensive study of privacy ... privacy problems, such as privacy of users connected into social networks and privacy of spatio-temporal information gathered from sensors. With the proliferation of sensors and "internet-of-things" devices, these privacy problems will become increasingly pressing. We believe that an important line of future work is to model these problems in rigorous privacy frameworks such as Pufferfish and design novel mechanisms for these models.
7. ... 2002. [2] R. Bassily, A. Groce, J. Katz, and A. Smith. Coupled-worlds privacy: Exploiting adversarial uncertainty in statistical data privacy. In FOCS, 2013. [3] K. Chaudhuri, D. Hsu, and S. ... Wiley & Sons, 2012. [7] C. Dwork and J. Lei. Differential privacy and robust statistics. In STOC, 2009. [8] C. Dwork, F. McSherry, K. ... 2006. [9] C. Dwork and A. Roth. The algorithmic foundations of differential privacy. TCS, 9(3-4):211--407, 2013. [10] K. Ellis et al. Multi-sensor physical ... 2015. [18] X. He, A. Machanavajjhala, and B. Ding. Blowfish privacy: tuning privacy-utility trade-offs using policies. In SIGMOD '14, pages 1447--1458, 2014. [19] S. Kessler, E. Buchmann, and K. Böhm. Deploying and evaluating pufferfish privacy for smart meter data. Karlsruhe Reports in Informatics, 1, 2015. [20] D. Kifer and A. Machanavajjhala. Pufferfish: A framework for mathematical privacy definitions. ACM Trans. Database Syst., 39(1):3, 2014. [21] D. Koller and N. ... Rastogi, G. Miklau, and A. McGregor. Optimizing linear counting queries under differential privacy. In PODS '10. [23] C. Li and G. Miklau. An ... Liu, S. Chakraborty, and P. Mittal. Dependence makes you vulnerable: Differential privacy under dependent tuples. In NDSS, 2016. [25] A. Machanavajjhala, D. Kifer, J. Abowd, J. Gehrke, and L. Vilhuber. Privacy: Theory meets practice on the map. In ICDE, 2008. [26] S. ... 3(160037):1--12, 2016. [27] F. McSherry and K. Talwar. Mechanism design via differential privacy. In FOCS, 2007. [28] K. Nissim, S. Raskhodnikova, and A. ... Sarwate and K. Chaudhuri. Signal processing and machine learning with differential privacy: Algorithms and challenges for continuous data. IEEE Signal Processing Magazine, 30(5):86--94, ... arXiv:1411.5428, 2014. [33] Y. Xiao and L. Xiong. Protecting locations with differential privacy ... Data Management. [35] B. Yang, I. Sato, and H. Nakagawa. Bayesian differential privacy on correlated data. In SIGMOD '15.
APPENDIX A. PUFFERFISH PRIVACY DETAILS. Proof (of Theorem 2.4). Since Θ is a closed set, ... framework G is defined as Δ_G F = max_{k ∈ {1,...,m}} Δ_{G_k} F. Analogous to differential privacy, adding to the result of query F Laplace noise with ...
... a grid search over Θ, which further improves efficiency.
Contents: Introduction; Related Work; The Setting; The Privacy Framework; Properties of Pufferfish; Examples; Guarantee Against Close Adversaries; Additional Notation; A General Mechanism; The Wasserstein ... Markov Chains; Exploiting Structural Information; Approximating the max-influence; Experiments; Methodology; Simulations; Real Data; Physical Activity Measurement; Electricity Consumption; Discussion; Conclusion; Acknowledgments; References; Pufferfish Privacy Details; Wasserstein Mechanism Proofs; Comparison with Group DP; Markov Quilt Mechanism; General Properties; Markov Chains; Fast ...
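The Kε sequential-composition guarantee quoted in Section 4.3 of the snippet can be sketched as a budget accountant wrapped around a Laplace release. This is a minimal illustration, not code from the paper; `laplace_release` and `BudgetAccountant` are assumed names.

```python
import random

def laplace_release(value, sensitivity, eps, rng):
    """Laplace mechanism: value + Laplace(sensitivity/eps) noise.
    (The Wasserstein Mechanism reduces to this when Pufferfish
    reduces to differential privacy.)"""
    magnitude = rng.expovariate(eps / sensitivity)   # Exp with mean sensitivity/eps
    return value + rng.choice([-1, 1]) * magnitude   # symmetrized -> Laplace

class BudgetAccountant:
    """Tracks total privacy cost: K releases at eps each cost K*eps."""
    def __init__(self):
        self.spent = 0.0

    def release(self, value, sensitivity, eps, rng):
        self.spent += eps
        return laplace_release(value, sensitivity, eps, rng)

rng = random.Random(0)
acc = BudgetAccountant()
answers = [acc.release(10.0, 1.0, 0.5, rng) for _ in range(4)]
# Four releases at eps = 0.5 consume a total Pufferfish budget of 2.0.
assert abs(acc.spent - 2.0) < 1e-9
```

With heterogeneous budgets {ε_k}, the quoted guarantee weakens to K·max_k ε_k, so an accountant in that setting would charge max_k ε_k per release instead.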
5
June 2014
SIGMOD '14: Proceedings of the 2014 ACM SIGMOD International Conference on Management of Data
Publisher: ACM
Bibliometrics:
Citation Count: 11
Downloads (6 Weeks): 18, Downloads (12 Months): 138, Downloads (Overall): 524
Full text available:
PDF
Privacy definitions provide ways for trading off the privacy of individuals in a statistical database for the utility of downstream analysis of the data. In this paper, we present Blowfish, a class of privacy definitions inspired by the Pufferfish framework, that provides a rich interface for this trade-off. In particular, ...
Keywords:
blowfish privacy, privacy, differential privacy
CCS:
Privacy policies
Security and privacy
Human and societal aspects of security and privacy
Abstract:
Privacy definitions provide ways for trading off the privacy of individuals in a statistical database for the utility of ... data. In this paper, we present Blowfish, a class of privacy definitions inspired by the Pufferfish framework, that provides a rich ... trade-off. In particular, we allow data publishers to extend differential privacy using a policy, which specifies (a) secrets, or information ... We show that there are reasonable policies under which our privacy mechanisms for k-means clustering, histograms and range queries introduce significantly ... less noise than their differentially private counterparts. We quantify the privacy-utility trade-offs for various policies analytically and empirically on real ...
Title:
Blowfish privacy: tuning privacy-utility trade-offs using policies
References:
A. Blum, C. Dwork, F. McSherry, and K. Nissim. Practical privacy: the sulq framework. In PODS, 2005.
K. Chatzikokolakis, M. E. Andrés, N. E. Bordenabe, and C. Palamidessi. Broadening the scope of differential privacy using metrics. In Privacy Enhancing Technologies. 2013.
B.-C. Chen, D. Kifer, K. Lefevre, and A. Machanavajjhala. Privacy-preserving data publishing. Foundations and Trends in Databases, 2 (1--2):1--167, 2009.
C. Dwork. Differential privacy. In ICALP, 2006.
S. R. Ganta, S. P. Kasiviswanathan, and A. Smith. Composition attacks and auxiliary information in data privacy. In KDD, pages 265--273, 2008.
X. He, A. Machanavajjhala, and B. Ding. Blowfish privacy: Tuning privacy-utility trade-offs using policies. CoRR, abs/1312.3913, 2014.
D. Kifer and B.-R. Lin. Towards an axiomatization of statistical privacy and utility. In PODS, 2010.
D. Kifer and A. Machanavajjhala. No free lunch in data privacy. In SIGMOD, pages 193--204, 2011.
D. Kifer and A. Machanavajjhala. A rigorous and customizable framework for privacy. In PODS, 2012.
D. Kifer and A. Machanavajjhala. Pufferfish: A framework for mathematical privacy definitions. To appear ACM Transactions on Database Systems, 39(1), 2014.
C. Li, M. Hay, V. Rastogi, G. Miklau, and A. McGregor. Optimizing histogram queries under differential privacy. In PODS, pages 123--134, 2010.
X. Xiao, G. Wang, and J. Gehrke. Differential privacy via wavelet transforms. In ICDE, pages 225--236, 2010.
Full Text:
mod784-he. Blowfish Privacy: Tuning Privacy-Utility Trade-offs using Policies. Xi He (Duke University, Durham, NC, USA; hexi88@cs.duke.edu), Ashwin Machanavajjhala (Duke University, Durham, NC, ... data. In this paper, we present Blowfish, a class of privacy definitions inspired by the Pufferfish framework, that provides a rich interface ... constraints. We show that there are reasonable policies under which our privacy mechanisms for k-means clustering, histograms and range queries introduce significantly less noise than their differentially private counterparts. We quantify the privacy-utility trade-offs for various policies analytically and empirically on real datasets.
Categories and Subject Descriptors: H.2.8 [Database Applications]: Statistical Databases; K.4.1 [Computers and Society]: Privacy. Keywords: privacy, differential privacy, Blowfish privacy.
1. INTRODUCTION. With the increasing popularity of "big-data" applications which ... of our life, ensuring that these applications do not breach the privacy of individuals is an important problem. The last decade has seen the development of a number of privacy definitions and mechanisms that trade off the privacy of individuals in these databases ... (or accuracy) of data analysis (see [4] for a survey). Differential privacy [6] has emerged as a gold standard not only because ... since it provides a simple knob, namely ε, for trading off privacy for utility. While ε is intuitive, it does not sufficiently capture the diversity in the privacy-utility trade-off space. For instance, recent work has shown two seemingly contradictory results. In certain applications (e.g., social recommendations [17]) differential privacy is too strong and does not permit sufficient utility. Next, ... learn sensitive information [12]. Subsequently, Kifer and Machanavajjhala [13] proposed a semantic privacy framework, called Pufferfish, which helps clarify assumptions underlying privacy definitions, specifically the information that is being kept secret and the adversary's background knowledge. They showed that differential privacy is equivalent to a specific instantiation of the Pufferfish framework, where ... We believe that these shortcomings severely limit the applicability of differential privacy to real-world scenarios that either require high utility, or deal ... Inspired by Pufferfish, we seek to better explore the trade-off between privacy and utility by providing a richer set of "tuning knobs". We explore a class of definitions called Blowfish privacy. In addition to ε, which controls the amount of information ... may be known publicly about the data. By extending differential privacy using these policies, we can hope to develop mechanisms that permit ... We introduce and formalize sensitive information specifications, constraints, policies and Blowfish privacy. We consider a number of realistic examples of sensitive information ... how to adapt well-known differential privacy mechanisms to satisfy Blowfish privacy, and using the example of k-means clustering illustrate the gains ... several practical scenarios.
Organization: Section 2 introduces the notation. Section 3 formalizes privacy policies. We define Blowfish privacy, and discuss composition properties and its relationship to prior work in ... 7) under Blowfish policies without constraints and empirically evaluate the resulting privacy-utility trade-offs on real datasets. We show how to release histograms ... does not change. Hence we will use the indistinguishability notion of differential privacy [7]. We will denote the set of possible databases using I_n, ... or the set of databases with |D| = n.
Definition 2.1 (Differential Privacy [6]). Two datasets D1 and D2 are neighbors, denoted by (D1, D2) ∈ N, ... the value of one tuple. A randomized mechanism M satisfies ε-differential privacy if for every set of outputs S ⊆ range(M), and every ... Pr[M(D1) ∈ S] ≤ e^ε Pr[M(D2) ∈ S]. (1)
Many techniques that satisfy differential privacy use the following notion of global sensitivity. Definition 2.2 (Global Sensitivity). ... formally, S(f) = max_{(D1, D2) ∈ N} ||f(D1) - f(D2)||_1. (2)
A popular technique that satisfies ε-differential privacy is the Laplace mechanism [7], defined as follows. (In Sec. 3 we briefly discuss how to generalize our results to other differential privacy notions by relaxing this assumption.) Definition 2.3. The Laplace mechanism, M_Lap, privately computes ... data. We will use this policy specification as input in our privacy
Sensitive Information. As indicated by the name, Blowfish privacy is inspired by the Pufferfish privacy framework [13]. In fact, we will show later (in Section 4.2) that Blowfish privacy is equivalent to specific instantiations of semantic definitions arising from the Pufferfish framework. Like Pufferfish, Blowfish privacy also uses the notions of secrets and discriminative pairs of secrets. ... pairs. For instance, we can model an individual who is privacy agnostic and does not mind disclosing his/her value exactly by ...
... counts can be reconstructed; in this way tuples are correlated. Differential privacy allows answering all the count queries c(r_i) by adding independent noise ... of Q.
4. BLOWFISH PRIVACY. In this section, we present our new privacy definition, called Blowfish privacy. Like differential privacy, Blowfish uses the notion of neighboring datasets. The key difference is ... and has the smallest size (of 1). Neighboring datasets in differential privacy correspond to neighbors when G is a complete graph. For policies having ... different in terms of discriminative pairs and tuple changes.
Definition 4.2 (Blowfish Privacy). Let ε > 0 be a real number and P ... be a policy. A randomized mechanism M satisfies (ε, P)-Blowfish privacy if for every pair of neighboring databases (D1, D2) ∈ N(P), ... we have Pr[M(D1) ∈ S] ≤ e^ε Pr[M(D2) ∈ S]. (8)
Note that Blowfish privacy takes in the policy P in addition to ε as an ... releases. We will study two kinds of composition for Blowfish privacy: sequential and parallel composition. Sequential composition ensures that a sequence of computations that each ensure privacy in isolation also ensures privacy. This allows breaking down computations into smaller building blocks. Parallel ... of randomness that satisfy (ε1, P)- and (ε2, P)-Blowfish privacy, resp. Then an algorithm that outputs both M1(D) ... [10].
4.2 Relation to other definitions. In this section, we relate Blowfish privacy to existing notions of privacy. We discuss variants of differential privacy [6] (including restricted sensitivity [1]), the Pufferfish framework [13], privacy axioms [11], and a recent independent work on extending differential privacy with metrics [3].
Differential Privacy [6]: One can easily verify that a mechanism satisfies ε-differential privacy (Definition 2.1) if and only if it satisfies (ε, P)-Blowfish privacy, where P = (T, K, I_n), and K is the complete graph on the domain. Thus, Blowfish privacy is a generalization of differential privacy that allows a data curator to trade off privacy vs. utility by controlling sensitive information G (instead of K) and auxiliary ... for tuple i over T. Then a mechanism satisfies (ε, S_pairs, D)-Pufferfish privacy if and only if it satisfies (ε, P)-Blowfish privacy. Theorem 4.3. ... constraints (we conjecture the sufficiency of Blowfish as well). Thus Blowfish privacy policies correspond to a subclass of privacy definitions that can be ... using Pufferfish. Both Pufferfish and Blowfish aid the data publisher to customize privacy definitions by carefully defining sensitive information and adversarial knowledge. However, Blowfish improves ... Thus, we can't compare Blowfish and Pufferfish experimentally. Second, all Blowfish privacy policies result in composable privacy definitions. This is not true for the Pufferfish framework. Finally, we ... than the Pufferfish framework for data publishers who are not privacy experts. For instance, one needs to specify adversarial knowledge as sets of ... one only needs to specify conceptually simpler publicly known constraints.
Other Privacy Definitions: Kifer and Lin [11] stipulate that every "good" privacy definition should satisfy two axioms: transformation invariance and convexity. We can show that Blowfish privacy satisfies both these axioms. Recent papers have extended differential privacy to handle constraints. Induced neighbor privacy [12, 13] extends the notion of neighbors such that neighboring databases ... 0^n or 1^n. A very recent independent work suggests extending differential privacy using a metric over all possible databases [3]. In particular, given ... S and all instances X and Y. Thus differential privacy corresponds to a specific distance measure, Hamming distance. The sensitive ... → R^d, outputting f(D) + η ensures (ε, P)-Blowfish privacy if η ∈ R^d is a vector of independent random numbers ... = (T, G, I_n)), (ε, P)-Blowfish differs from ε-differential privacy only in the specification of sensitive information. Note that every pair (D1, ... the following result trivially holds. Lemma 5.2. Any mechanism M that satisfies ε-differential privacy also satisfies (ε, (T, G, I_n))-Blowfish privacy for all ... fact that ε-differential privacy is equivalent to (ε, (T, K, I_n))-Blowfish privacy, where K is the complete graph. In many cases, we can do better in terms of utility than differentially private mechanisms. It is easy to see that S(f, P) is ...
... with more utility (less error) than mechanisms that satisfy differential privacy. In k-means clustering we will see that using Blowfish policies helps ... the sums for two clusters by at most d(T). Under Blowfish privacy policies, the policy-specific sensitivity of qsum can be much smaller than |T| under differential privacy (i.e., the complete graph G_full for Blowfish policies). Since qsize is the ... policy-specific sensitivity for qsum (from Lemma 6.1) and thus satisfy privacy under the Blowfish policy while ensuring better accuracy.
6.1 Empirical Evaluation. We empirically ... accuracy of k-means clustering for (ε, (T, G, I_n))-Blowfish privacy on three data sets. The first two datasets are real-world datasets ... twitter, GP. Figure 1: K-means: error under the Laplace mechanism vs. Blowfish privacy for different discriminative graphs. ... for the Laplace mechanism in Figure 1(c) ...
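The gap between global sensitivity and policy-specific sensitivity that drives these k-means gains can be illustrated on a sum query. The 4-value domain and line-graph policy below are hypothetical, chosen only to mirror the qsum discussion above; this is a sketch, not the paper's code.

```python
from itertools import combinations

domain = [0, 1, 2, 3]   # a small totally ordered domain T

def sum_query_sensitivity(edges):
    """Sensitivity of f(D) = sum of tuple values when neighboring
    databases may only swap one tuple's value along an edge (u, v)
    of the policy graph G: the sum then changes by exactly |u - v|."""
    return max(abs(u - v) for u, v in edges)

complete_graph = list(combinations(domain, 2))   # differential privacy
line_graph = [(0, 1), (1, 2), (2, 3)]            # adjacent values only

global_sens = sum_query_sensitivity(complete_graph)  # max - min = 3
policy_sens = sum_query_sensitivity(line_graph)      # 1

# Laplace noise scales with sensitivity/eps, so this policy cuts noise 3x.
assert global_sens == 3 and policy_sens == 1
```

The same computation with a larger domain widens the gap: the complete graph's sensitivity grows with |T|, while the line graph's stays at 1.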
... policies allow us to effectively improve utility by trading off privacy. In certain cases, we observe that Blowfish policies attain ... that Theorem 5.1 already ensures that releasing the noisy counts s̃_i satisfies (ε, P1)-Blowfish privacy. Furthermore, observe that the counts in S_T(...) are in ... to obtain a much smaller error by trading utility with privacy under G_{d,1}. Next, we describe the ordered hierarchical mechanism that works ... called Ordered Hierarchical Structure OH for (ε, (T, G_{d,θ}, I_n))-Blowfish privacy. As shown in Figure 2(a), OH has two types ... answered as q[x_i, x_j] = q[x_1, x_j] - q[x_1, x_{i-1}].
Privacy Budgeting Strategy. Given total privacy budget ε, we denote the privacy budget assigned to all the S nodes by ε_S and to ... is going to be the only tree to have all the privacy budget. This is equivalent to the hierarchical mechanism for differential privacy. Theorem ... for the Ordered Hierarchical Mechanism with (ε, (T, G_{d,θ}, I_n))-Blowfish privacy on two real-world datasets, adult and twitter. The adult data ... the adversary cannot distinguish between all the domain values (same as differential privacy). Figure 2(c) considers 4 threshold values θ = {full, 500km, ... for releasing the complete histogram while ensuring (ε, P)-Blowfish privacy.
8.2 Applications. The problem of calculating ... of G_P ...
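The range-query identity above, q[x_i, x_j] = q[x_1, x_j] - q[x_1, x_{i-1}], is plain prefix-sum arithmetic. A minimal sketch with hypothetical counts (not data from the paper):

```python
def prefix_counts(hist):
    """Cumulative counts: c[j] = q[x1, xj], with c[0] = 0 by convention."""
    c = [0]
    for h in hist:
        c.append(c[-1] + h)
    return c

hist = [5, 2, 7, 1]            # counts for domain values x1..x4
c = prefix_counts(hist)

def range_query(c, i, j):
    """q[xi, xj] from two cumulative counts (1-indexed, inclusive)."""
    return c[j] - c[i - 1]

assert range_query(c, 2, 4) == 2 + 7 + 1   # any range costs two lookups
```

This is presumably why a structure built over cumulative counts is attractive here: once the cumulative values are released with noise, every range query is answered from just two of them.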
... = 2(maxcomp(Q) + 1).
9. CONCLUSIONS. We propose a new class of privacy definitions, called Blowfish privacy, with the goal of seeking better trade-offs between privacy and utility. The key feature of Blowfish is a policy, where ... how to tune utility using reasonable policies with weaker specifications of privacy. For the latter, we develop strategies that are more accurate ...
6
June 2011
SIGMOD '11: Proceedings of the 2011 ACM SIGMOD International Conference on Management of data
Publisher: ACM
Bibliometrics:
Citation Count: 66
Downloads (6 Weeks): 28, Downloads (12 Months): 235, Downloads (Overall): 1,376
Full text available:
PDF
Differential privacy is a powerful tool for providing privacy-preserving noisy query answers over statistical databases. It guarantees that the distribution of noisy query answers changes very little with the addition or deletion of any tuple. It is frequently accompanied by popularized claims that it provides privacy without any assumptions about ...
Keywords:
differential privacy, privacy
CCS:
Privacy policies
Security and privacy
Human and societal aspects of security and privacy
Abstract:
Differential privacy is a powerful tool for providing privacy-preserving noisy query answers over statistical databases. It guarantees that ... It is frequently accompanied by popularized claims that it provides privacy without any assumptions about the data and that it protects ... but one record. In this paper we critically analyze the privacy protections offered by differential privacy. First, we use a no-free-lunch theorem, which defines non-privacy as a game, to argue that it is not possible to provide privacy and utility without making assumptions about how the data are ... Then we explain where assumptions are needed. We argue that privacy of an individual is preserved when it is possible to ... cases the notion of participation varies, the use of differential privacy can lead to privacy breaches, and differential privacy ...
Title:
No free lunch in data privacy
References:
A. Blum, C. Dwork, F. McSherry, and K. Nissim. Practical privacy: the SuLQ framework. In PODS, 2005.
A. Blum, K. Ligett, and A. Roth. A learning theory approach to non-interactive database privacy. In STOC, 2008.
K. Chaudhuri and N. Mishra. When random sampling preserves privacy. In CRYPTO, 2006.
I. Dinur and K. Nissim. Revealing information while preserving privacy. In PODS, 2003.
C. Dwork. Differential privacy. In ICALP, 2006.
C. Dwork, K. Kenthapadi, F. McSherry, and I. Mironov. Our data, ourselves: Privacy via distributed noise generation. In EUROCRYPT, 2006.
C. Dwork and M. Naor. On the difficulties of disclosure prevention in statistical databases or the case for differential privacy. JPC, 2(1), 2010.
C. Dwork and A. Smith. Differential privacy for statistics: What we know and what we want to learn. JPC, 1(2), 2009.
A. Friedman and A. Schuster. Data mining with differential privacy. In KDD, 2010.
S. R. Ganta, S. P. Kasiviswanathan, and A. Smith. Composition attacks and auxiliary information in data privacy. In KDD, 2008.
G. Ghinita, P. Karras, P. Kalnis, and N. Mamoulis. A framework for efficient data anonymization under privacy and accuracy constraints. ACM TODS, 34(2), 2009.
M. Hardt and K. Talwar. On the geometry of differential privacy. In STOC, 2010.
S. P. Kasiviswanathan and A. Smith. A note on differential privacy: Defining resistance to arbitrary side information. http://arxiv.org/abs/0803.3946, 2008.
D. Kifer and B.-R. Lin. Towards an axiomatization of statistical privacy and utility. In PODS, 2010.
A. Machanavajjhala, D. Kifer, J. Abowd, J. Gehrke, and L. Vilhuber. Privacy: Theory meets practice on the map. In ICDE, 2008.
F. McSherry and I. Mironov. Differentially private recommender systems: building privacy into the net. In KDD, 2009.
F. McSherry and K. Talwar. Mechanism design via differential privacy. In FOCS, 2007.
I. Mironov, O. Pandey, O. Reingold, and S. Vadhan. Computational differential privacy. In CRYPTO, 2009.
V. Rastogi, M. Hay, G. Miklau, and D. Suciu. Relationship privacy: Output perturbation for queries with joins. In PODS, 2009.
Full Text:
... Free Lunch in Data Privacy. Daniel Kifer (Penn State University, dan+sigmod11@cse.psu.edu), Ashwin Machanavajjhala (Yahoo! Research, mvnak@yahoo-inc.com).
ABSTRACT. Differential privacy is a powerful tool for providing privacy-preserving noisy query answers over statistical databases. It guarantees that the ... tuple. It is frequently accompanied by popularized claims that it provides privacy without any assumptions about the data and that it protects against ... all but one record. In this paper we critically analyze the privacy protections offered by differential privacy. First, we use a no-free-lunch theorem, which defines non-privacy as a game, to argue that it is not possible ... Then we explain where assumptions are needed. We argue that privacy of an individual is preserved when it is possible to limit ... cases the notion of participation varies, the use of differential privacy can lead to privacy breaches, and differential privacy does not always adequately limit inference about participation.
Categories and Subject Descriptors: H.2.8 ... fee. SIGMOD '11, June 12--16, 2011, Athens, Greece. Copyright 2011 ACM 978-1-4503-0661-4/11/06 ... $10.00.
1. INTRODUCTION. Data privacy is an expanding sub-field of data management whose goal is to answer queries over sensitive datasets without compromising the privacy of the individuals whose records are contained in these databases. One recent breakthrough in this field is a privacy definition called differential privacy [10, 8]. Query answering algorithms that satisfy differential privacy must produce noisy query answers such that the distribution of query ... has shown that the resulting query answers can enable very accurate privacy-preserving statistical analyses of sensitive datasets [25, 13, 5, 18]. There are two flavors of differential privacy, which we call unbounded and bounded. The formal definitions are:
Definition 1.1 (Unbounded Differential Privacy [8]). A randomized algorithm A satisfies unbounded ε-differential privacy if P(A(D1) ∈ S) ≤ e^ε P(A(D2) ∈ S) ... by either adding or removing one tuple.
Definition 1.2 (Bounded Differential Privacy [10]). A randomized algorithm A satisfies bounded ε-differential privacy if P(...) ... D2 by changing the value of exactly one tuple.
Bounded differential privacy derives its name from the fact that all of the datasets D1, D2 have a fixed size n, while unbounded differential privacy has no such restriction. Additional popularized claims have been made about the privacy guarantees of differential privacy. These include: it makes no assumptions about how data ... These are loose, though popular, interpretations of formal guarantees provided by differential privacy. In this paper, we critically analyze the guarantees of differential privacy. We start with a no-free-lunch theorem, which defines non-privacy as a game, and which states that it is impossible to provide privacy and utility without making assumptions about the data. This is a ...
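The two neighbor relations in Definitions 1.1 and 1.2 can be made concrete with multisets. The `Counter` encoding below is an illustrative sketch, not the paper's notation.

```python
from collections import Counter

def unbounded_neighbors(d1, d2):
    """Unbounded DP: D2 obtained from D1 by adding or removing one tuple."""
    a, b = Counter(d1), Counter(d2)
    diff = (a - b) + (b - a)          # symmetric multiset difference
    return sum(diff.values()) == 1

def bounded_neighbors(d1, d2):
    """Bounded DP: same size n; one tuple's value is changed."""
    if len(d1) != len(d2):
        return False
    a, b = Counter(d1), Counter(d2)
    return sum(((a - b) + (b - a)).values()) == 2  # one tuple out, one in

assert unbounded_neighbors(["flu", "cancer"], ["flu"])       # one removal
assert bounded_neighbors(["flu", "cancer"], ["flu", "flu"])  # one change
assert not bounded_neighbors(["flu"], ["flu", "flu"])        # sizes differ
```

The fixed-n restriction is exactly why bounded differential privacy behaves differently in the discriminant analysis that follows: the dataset size itself is not protected information under the unbounded variant.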
assumptions about the data are needed for differential privacy to avoid this game-theoretic notion of non-privacy. For comparison, we present a variation of differential privacy which does guarantee privacy without assumptions but which provides very little utility. Then we argue that a major criterion for a privacy definition is the following: can it hide the evidence of ... Formalizing the notion of participation is extremely difficult, but as with non-privacy, it is easier to identify cases where evidence of ... the applicability of differential privacy. Relying on the principle that the privacy of an individual is preserved whenever we can prevent inference ... individual's participation, we see that the main guarantee of differential privacy (i.e., the distribution of noisy query answers changes very little with ... and 4); this addresses the first popularized claim, that differential privacy requires no assumptions about the data.
The second popularized claim is better interpreted as stating that differential privacy limits (up to a multiplicative factor) the probabilistic inference of an ... this latter assumption is flawed by (1) creating variations of differential privacy where the corresponding attackers have more knowledge yet less noise ... more private information to attackers with less prior knowledge. Thus when choosing privacy definitions based on resistance to certain classes of attackers, the right choice of attackers is crucial for meaningful privacy guarantees, but the right choice is by no means obvious. We ... are not always the biggest threats. The third claim, that differential privacy is robust to arbitrary background knowledge, has been formalized and studied by [19]. More accurately, differential privacy is robust when certain subsets of the tuples are known by an attacker. However, many reasonable types of background knowledge can cause privacy breaches when combined with differential privacy (in other words, differential privacy composes well with itself [14] but not necessarily with other privacy definitions or data release mechanisms). Such background knowledge includes previously released exact query answers. We demonstrate a privacy breach using this background knowledge in Section 4 and we ... goal of this paper is to clear up misconceptions about differential privacy and to provide guidelines for determining its applicability. In particular ... of Dwork and Naor [8, 11] in order to show how privacy relies on assumptions about data generation, (2) propose a participation-based ... "participation in the data-generating process" ... for determining whether differential privacy is suitable for a given application, (3) demonstrate that differential privacy does ... arbitrary social networks and show how this can result in a privacy breach, (4) demonstrate that differential privacy does not meet this guideline when applied to tabular data when an attacker has aggregate-level background knowledge (and show a potential privacy breach), and (5) propose a modification of differential privacy that avoids privacy breaches for tabular data with aggregate-level background knowledge.
The outline of this ... We discuss the no-free-lunch theorem and the game-theoretic definition of non-privacy in Section 2, where we also show that more knowledgeable attackers do not necessarily present the greatest risks to privacy. We specialize this discussion to social networks in Section ... where experiments with several popular social network models show that differential privacy does not adequately limit inference about the participation of an edge in ... already provided by the released statistics. We propose a generalization of differential privacy to take into account those statistics to limit further privacy breaches, and algorithms that satisfy our generalizations.
2. ANALYSIS OF ATTACKERS. 2.1 The ... accurately. Definition 2.1 (Discriminant). Given an integer k > 1, a privacy-infusing query processor A, and some constant c, we say that ... about how many cancer patients are in the data. If the privacy-infusing query processor A can answer this query with reasonable accuracy, ... get useful answers to any query. Since the goal of modern privacy definitions is to answer queries relatively accurately, they all allow the ... 2.1. The discriminant of the Laplace Mechanism for unbounded differential privacy is 1 for any k. For bounded differential privacy, the discriminant of the Laplace Mechanism becomes arbitrarily close to ...
= 1 for unbounded differential privacy. The result for bounded differential privacy follows since we can make the difference from 1 arbitrarily small by making n ...
... [8, 11] on the impossibility of Dalenius' vision of statistical privacy [6], we define a game between an attacker and a data ... P be a data-generating mechanism. The game proceeds as follows. Definition 2.2. (Non-Privacy Game). The data curator obtains a database instance D from ... data curator provides the attacker with the output A(D) of a privacy-infusing query processor A. The attacker then guesses the value of q(D). If the guess is correct, the attacker wins and non-privacy is achieved. The following no-free-lunch theorem states that if there are no restrictions on the data-generating mechanism and if the privacy-infusing query processor A has sufficient utility (discriminant close to 1) ... a sensitive query with k possible outcomes. Let A be a privacy-infusing query processor with discriminant ω(k,A) > 1 − δ (for some δ ... assumptions are needed: without restrictions on P, we cannot guarantee that the privacy-infusing query processor A avoids non-privacy. 2.1.3 No-Free-Lunch and Differential Privacy. We now explain how the no-free-lunch theorem relates to differential privacy and evidence of participation and how an assumption that P generates ... "how many cancer patients are there?" and to satisfy 0.1-differential privacy, the query processor A can add Laplace(10) noise to the ... cancer patients. Thus under the assumption of independence of records, differential privacy would be safe from this attack. In Sections 3 and 4 ... which are not artificial and for which the use of differential privacy can lead to privacy breaches. Before concluding this section, note that there do exist privacy definitions ... make no assumptions about the data. One example is: Definition 2.3. (Free-Lunch Privacy). A randomized algorithm A satisfies ε-free-lunch privacy if for any pair of database instances D1, D2 (not just ...
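The Laplace-noise step described in this passage can be sketched in a few lines (an illustrative sketch only, not the paper's code; the function names and the example count are ours). For a counting query with sensitivity 1, ε-differential privacy is obtained by adding Laplace noise of scale 1/ε, so ε = 0.1 gives the Laplace(10) noise mentioned above:

```python
import math
import random

def laplace(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) by inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    return -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))

def noisy_count(true_count: int, epsilon: float) -> float:
    """epsilon-DP answer to a counting query (sensitivity 1):
    add Laplace noise of scale 1/epsilon."""
    return true_count + laplace(1 / epsilon)

# For epsilon = 0.1 the scale is 10, matching Laplace(10) in the text.
answer = noisy_count(42, 0.1)
```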
see that the discriminant ω(k,A) of any algorithm A satisfying ε-free-lunch privacy is bounded by e^ε/(k − 1 + e^ε). Notice that this is bounded away ... of the database or not. In contrast, virtually all noise-infusing privacy definitions (differential privacy, randomized response [32], etc.) have discriminants that become arbitrarily close to 1 as the data size increases. Thus with free-lunch privacy, we would have difficulty distinguishing between small and large databases (let ... pay for making no assumptions about the data-generating mechanism. 2.2 Knowledge vs. Privacy Risk. Before discussing privacy breaches in social networks and tabular data (with pre-released exact query ... present in this section a general guideline for determining whether a privacy definition is suitable for a given application. This guideline is based ... by attackers that are less knowledgeable than those considered by differential privacy (i.e. attackers who know about all but one tuple in a ... more knowledgeable attackers can sometimes leak more sensitive information than privacy definitions that protect against less knowledgeable attackers. Then we present our ... of participation" guideline. Let us start with two more forms of differential privacy: attribute and bit differential privacy. Definition 2.4. (Attribute Differential Privacy). An algorithm A satisfies ε-attribute differential privacy if for every pair D1, D2 of databases such that ... (A(D2) ∈ S) (for any set S). Definition 2.5. (Bit Differential Privacy). An algorithm A satisfies ε-bit differential privacy if for every pair D1, D2 of databases such that D2 ... ≤ e^ε P(A(D2) ∈ S) (for any set S). Attribute differential privacy corresponds to an attacker who has full information about the database ... the database except for one attribute of one record. Bit differential privacy corresponds to an attacker who knows everything except one bit in ... about the remainder of the data. Clearly the attacker for bit-differential privacy knows more than the attacker for attribute-differential privacy, who knows more than the corresponding attacker for bounded-differential privacy (Definition ...
is easy to see that an algorithm A that satisfies bounded-differential privacy also satisfies attribute-differential privacy, and an algorithm that satisfies attribute-differential privacy also satisfies bit-differential privacy. In these cases, an algorithm that protects against (i.e. limits ... 00 10. There are 16 possible database instances with 1 tuple. Bounded-differential privacy allows an algorithm A1 to output Bob's tuple with probability e^ε/(15 + e^ε) and any other tuple with probability 1/(15 + e^ε). Attribute-differential privacy allows an algorithm A2 to output Bob's record with probability e^(2ε)/(9 + 6e^ε + e^(2ε)) (each of ... can be output with probability 1/(9 + 6e^ε + e^(2ε))). The allowable distribution for bit-differential privacy is more complicated, but Bob's record can be output with probability ... outputs Bob's record with the highest probability, followed by attribute-differential privacy, followed by bounded-differential privacy (which we associate with the least knowledgeable attacker). The intuition for why bit-differential privacy leaked the most private information is that, since the ...
... as ε (x-axis) varies, for bit-differential privacy (top line), attribute-differential privacy (middle line), and bounded-differential privacy (bottom line). ... attacker who knows all 9 family members have this ... correlations between edges in a social network can lead to a privacy breach, and in Section 4 we show that prior release of ... growth of pathogens (i.e. other relevant aspects of data generation). Thus if privacy guarantees are based on limiting the inference of a class ... be chosen wisely. We argue that an important consideration when evaluating a privacy definition is the following question: how well can it hide the ... all evidence of its participation. In such a case, differential privacy is appropriate since it guarantees that differentially private query answers are barely ... deletion may not hide evidence of participation (in which case differential privacy may not be suitable). In general, it is the data-generating mechanism ... Thus the definition of participation, and hence the suitability of various privacy definitions, actually depends on how the data are generated. It is difficult ... we consider special cases. In Section 3, where we show that differential privacy does not always hide the participation of an edge in a ... for such entities: nodes and edges. A standard application of differential privacy would try to prevent inference about whether or not a node ...
... for practical applications [17], and so it is common to sacrifice privacy for more utility by using differentially private algorithms Aedge for ... 1, and someone else in Community 2. Let us define a privacy breach to be the event that the attacker can determine ... existed. Deleting Bob's edge simply changes the answer by 1. Differential privacy would guarantee that if the true answer were either X or ... did not, is a measure of the degradation of the differential privacy guarantees. It is also a measure of the causal influence of Bob's edge. A naïve application of ε-differential privacy would add Laplace(1/ε) noise (with variance 2/ε²) to the query "how ... We modify it slightly to make it more fair to differential privacy (in the original version, if there was no edge between two ...
... was ≈ 240. This is a significant difference, and ε-differential privacy would need an extremely small ε value to hide the participation ... 240 − 119 = 121 times smaller than a naïve application of differential privacy would suggest. In addition, this difference in number of cross-community ... information about the social network. Thus, in order to use differential privacy to provide meaningful information about social networks, we would have ... a forest fire model. Alternatively, a naïve application of differential privacy which did not take into account possible large effects of causality would ... to the forest fire model, but makes it difficult for differential privacy to limit inference about Bob's initial edge. The reason is that ... small (thus destroying utility), or to ignore causality (i.e. naïvely apply differential privacy) and possibly reveal the participation of Bob's edge. MVS Model [24]. ...
... allowed to reach its steady state (Figure 5). Thus differential privacy may be a reasonable choice for limiting inference about the participation ... forest fire model, for example). 4. CONTINGENCY TABLES. In this section we consider privacy-preserving query answering over a table for which some deterministic ... addition to motivating this problem, we show that applications of differential privacy in this setting can lead to a privacy breach beyond the possible breach caused by the initial release of those statistics. We then propose a solution by modifying differential privacy to take into account such prior deterministic data releases. The U.S. Census ... technology. It both collects and disseminates data about the U.S. population. Privacy is an important issue, so many of the datasets released to ... in the data. In some cases, utility is more important than privacy. One example is in the release of population counts which ... accuracy, as long as they do not introduce an additional privacy ... breach (note that this is a special case of the dual privacy problem where the goal is to provide as much privacy subject ... this approximate chi-squared value should not lead to additional leakage of privacy (when combined with the previously released histograms). Differential privacy
... perturbed version of the chi-squared statistic. In cases like these, privacy guarantees may degrade. We provide a simple demonstration in Section 4.1. 4.1 A Privacy Leak. We now show that when deterministic statistics have been previously released, applying differential privacy to subsequent data releases is not enough to prevent additional ... and we are one exact query answer away from a complete privacy breach (i.e. reconstruction of the table). Now, differential privacy would allow us to answer the following k queries by ... table T is reconstructed with very high probability, thus causing a complete privacy breach. This was only possible because additional noisy queries were answered ... meaningful before any query answers were released, so the differential privacy guarantees for future query answers would not be as compelling to ... this idea is the relationship between bounded and unbounded differential privacy. Unbounded differential privacy (Definition 1.1) guarantees that the distribution of noisy query answers will ... stay the same. This is the approach taken by bounded differential privacy (Definition 1.2), which guarantees that the distribution of future noisy query ...
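The reconstruction idea in this passage rests on a simple fact: independently noised answers to the same query can be averaged, and the Laplace noise cancels out. A minimal sketch (our own illustration; the count, ε, and number of repeats are invented):

```python
import math
import random

def laplace(scale: float) -> float:
    """Inverse-CDF sample from the Laplace(0, scale) distribution."""
    u = random.random() - 0.5
    return -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))

true_answer = 57   # the sensitive count an attacker wants to learn
epsilon = 0.1      # privacy budget spent on EACH noisy release

# Each release individually satisfies 0.1-differential privacy...
releases = [true_answer + laplace(1 / epsilon) for _ in range(5000)]

# ...but Laplace noise has mean 0, so averaging many independent
# releases recovers the exact answer with high probability.
estimate = sum(releases) / len(releases)
```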
... of her tuple to male to maintain consistency. Our extension to differential privacy, which accounts for previously released exact query answers, follows these ... any way that is still consistent with previously answered queries. 4.3 Differential Privacy Subject to Background Knowledge. In this section we modify differential privacy to account for previously released exact query answers. In Section 4.4 ... release of query answers (see Sections 2 and 3). ... discuss algorithms for privacy-preserving query answering. First, we need the following definitions: Definition 4.1. (Contingency ... us to instantiate different versions of differential privacy. Definition 4.3 (Generic Differential Privacy). A randomized algorithm A satisfies ε-differential privacy if for any set S, P(A(Ti) ∈ S) ≤ e^ε P(A(Tj) ∈ S) whenever ... neighbors. This definition can be specialized into bounded and unbounded differential privacy (Definitions 1.2 and 1.1) through the following choice of neighbors. • Unbounded ... of unbounded neighbors N1 in Definition 4.3 results in unbounded differential privacy. The use of bounded neighbors N2 in Definition 4.3 results in bounded differential privacy (compare to Definition 1.2) because changing the value of one ... is assumed to be known in the case of bounded differential privacy). Another way to think about bounded neighbors N2 in the context of differential privacy is that Ti and Tj are neighbors if and only ... subject to constraints imposed by previously released exact query answers. Bounded differential privacy then guarantees that an attacker who knows that the true table ...
... this query, and now Drew would like a variant of differential privacy to limit any possible damage due to future (noisy) query ... each other. Unbounded neighbors N1, used in the definition of unbounded differential privacy (see Definitions 1.1 and 4.3), do not work here; removing or ... the absence of additional correlations (as discussed in Sections 2 and 3), privacy is guaranteed by ensuring that an attacker has difficulty distinguishing between ... by exact query answers (Definition 4.4). Plugging this into generic differential privacy (Definition 4.3), we arrive at the version of differential privacy that takes into account previously released exact query answers. Before presenting Definition ... induced neighbors NQ (which can now be plugged into generic differential privacy in Definition 4.3) with the following example. Consider contingency table ...
... Privacy. In this section we adapt two algorithms for (generic) differential privacy that use the concept of induced neighbors NQ (Definition 4.4) to guarantee privacy. We also show that in general this problem (i.e. accounting ... [26], we can show that this will guarantee 2ε-generic differential privacy (Definition 4.3, using NQ as the definition of neighbors). The exponential mechanism ...
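For readers unfamiliar with the exponential mechanism of McSherry and Talwar [26] that this passage invokes, a minimal sketch follows (our own illustration; the candidate set and scoring function are invented, and a real instantiation here would score tables drawn from the induced neighbors):

```python
import math
import random

def exponential_mechanism(candidates, score, epsilon, sensitivity=1.0):
    """Sample a candidate r with probability proportional to
    exp(epsilon * score(r) / (2 * sensitivity)); this selection
    satisfies epsilon-differential privacy."""
    weights = [math.exp(epsilon * score(r) / (2 * sensitivity))
               for r in candidates]
    total = sum(weights)
    pick = random.random() * total
    acc = 0.0
    for r, w in zip(candidates, weights):
        acc += w
        if pick <= acc:
            return r
    return candidates[-1]  # guard against floating-point rounding

# Toy usage: candidate "c" has a much higher score, so it is
# selected with probability close to 1 for moderate epsilon.
choice = exponential_mechanism(["a", "b", "c"],
                               {"a": 0.0, "b": 0.0, "c": 10.0}.get,
                               epsilon=2.0)
```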
... smallest set of moves is as described in Figure 7. 5. RELATED WORK. Differential privacy was developed in a series of papers [1, 10, 8] following ... original database almost exactly. Since then, many additional variations of differential privacy have been proposed [9, 4, 29, 2, 23, 20, 28]. With the ... 27, 30]. Rastogi et al. [30] proposed a relaxation of differential privacy for social networks by proving an equivalence to adversarial privacy (which makes assumptions about the data), and then adding further constraints ... non-privacy. Dwork and Naor [8, 11] have several results stating that any privacy ... definition which provides minimal utility guarantees cannot even safeguard the privacy of an individual whose records are not collected by the data ... can be combined with query answers to create a threat to privacy). Thus Dwork proposed the principle that privacy definitions should only guarantee privacy for individuals in the data. Kasiviswanathan and Smith [19] formalized the ... one record in a table and showed an equivalence to differential privacy. 6. CONCLUSIONS. In this paper we addressed several popular misconceptions about differential privacy. We showed that, without further assumptions about the data, its privacy guarantees can degrade when applied to social networks or when ... We proposed a principle for evaluating the suitability of a privacy definition to an application: can it hide evidence of an individual's ... is equivalent to hiding evidence of participation, in which case differential privacy is a suitable definition to use. Providing privacy for correlated data is an interesting direction for future work. 7. REFERENCES. [1] ... Ligett, and A. Roth. A learning theory approach to non-interactive database privacy. In STOC, 2008. [3] P. J. Cantwell, H. Hogan, and K. ... 58(3):203–212, 2004. [4] K. Chaudhuri and N. Mishra. When random sampling preserves privacy. In CRYPTO, 2006. [5] K. Chaudhuri, C. Monteleoni, and A. ... 15, 1977. [7] I.
Dinur and K. Nissim. Revealing information while preserving privacy. In PODS, 2003. [8] C. Dwork. Differential privacy. In ICALP, 2006. [9] C. Dwork, K. Kenthapadi, F. McSherry, and I. Mironov. Our data, ourselves: Privacy via distributed noise generation. In EUROCRYPT, 2006. [10] C. Dwork, F. McSherry, ...
statistical databases or the case for differential privacy. ... JPC, 2(1), 2010. [12] C. Dwork and A. Smith. Differential privacy for statistics: What we know and what we want to learn. ... 2008. [20] D. Kifer and B.-R. Lin. Towards an axiomatization of statistical privacy and utility. In PODS, 2010. [21] R. Kumar, P. Raghavan, S. ... A. Machanavajjhala, D. Kifer, J. Abowd, J. Gehrke, and L. Vilhuber. Privacy: From theory to practice on the map. In ICDE, 2008. [24] ... 101(6), 2004. [25] F. McSherry and I. Mironov. Differentially private recommender systems: building privacy into the Netflix prize contenders. In KDD, 2009. [26] F. McSherry and K. Talwar. Mechanism design via differential privacy. In FOCS, 2007. [27] D. J. Mir and R. N. ... I. Mironov, O. Pandey, O. Reingold, and S. Vadhan. Computational differential privacy. In CRYPTO, 2009. [29] K. Nissim, S. Raskhodnikova, and A. ... Association, 1965.
7
June 2011
SIGMOD '11: Proceedings of the 2011 ACM SIGMOD International Conference on Management of data
Publisher: ACM
Bibliometrics:
Citation Count: 31
Downloads (6 Weeks): 8, Downloads (12 Months): 83, Downloads (Overall): 696
Full text available:
PDF
Prior work in differential privacy has produced techniques for answering aggregate queries over sensitive data in a privacy-preserving way. These techniques achieve privacy by adding noise to the query answers. Their objective is typically to minimize absolute errors while satisfying differential privacy. Thus, query answers are injected with noise whose ...
Keywords:
differential privacy, privacy
Title:
iReduct: differential privacy with reduced relative errors
CCS:
Theory of database privacy and security
Security and privacy
Keywords:
differential privacy
privacy
Abstract:
<p>Prior work in differential privacy has produced techniques for answering aggregate queries over sensitive data in a privacy-preserving way. These techniques achieve privacy by adding noise to the query answers. Their objective is typically to minimize absolute errors while satisfying differential privacy. Thus, query answers are injected with noise whose scale ...
Primary CCS:
Theory of database privacy and security
Security and privacy
References:
B. Barak, K. Chaudhuri, C. Dwork, S. Kale, F. McSherry, and K. Talwar. Privacy, accuracy, and consistency too: a holistic solution to contingency table release. In Proc. of ACM Symposium on Principles of Database Systems (PODS), pages 273--282, 2007.
A. Blum, K. Ligett, and A. Roth. A learning theory approach to non-interactive database privacy. In Proc. of ACM Symposium on Theory of Computing (STOC), pages 609--618, 2008.
K. Chaudhuri and C. Monteleoni. Privacy-preserving logistic regression. In Proc. of the Neural Information Processing Systems (NIPS), pages 289--296, 2008.
G. Cormode. Individual privacy vs population privacy: Learning to attack anonymization. Computing Research Repository (CoRR), abs/1011.2511, 2010.
C. Dwork. Differential privacy. In International Colloquium on Automata, Languages and Programming (ICALP), pages 1--12, 2006.
C. Dwork, M. Naor, T. Pitassi, and G. N. Rothblum. Differential privacy under continual observation. In Proc. of ACM Symposium on Theory of Computing (STOC), pages 715--724, 2010.
A. Friedman and A. Schuster. Data mining with differential privacy. In Proc. of ACM Knowledge Discovery and Data Mining (SIGKDD), pages 493--502, 2010.
A. Ghosh, T. Roughgarden, and M. Sundararajan. Universally utility-maximizing privacy mechanisms. In Proc. of ACM Symposium on Theory of Computing (STOC), pages 351--360, 2009.
M. Götz, A. Machanavajjhala, G. Wang, X. Xiao, and J. Gehrke. Publishing search logs - a comparative study of privacy guarantees. IEEE Transactions on Knowledge and Data Engineering (TKDE), 99, 2011.
M. Hardt and K. Talwar. On the geometry of differential privacy. In Proc. of ACM Symposium on Theory of Computing (STOC), 2010.
C. Li, M. Hay, V. Rastogi, G. Miklau, and A. McGregor. Optimizing linear counting queries under differential privacy. In Proc. of ACM Symposium on Principles of Database Systems (PODS), pages 123--134, 2010.
A. Machanavajjhala, D. Kifer, J. M. Abowd, J. Gehrke, and L. Vilhuber. Privacy: Theory meets practice on the map. In Proc. of International Conference on Data Engineering (ICDE), pages 277--286, 2008.
F. McSherry and I. Mironov. Differentially private recommender systems: Building privacy into the netflix prize contenders. In Proc. of ACM Knowledge Discovery and Data Mining (SIGKDD), pages 627--636, 2009.
F. McSherry and K. Talwar. Mechanism design via differential privacy. In Symposium on Foundations of Computer Science (FOCS), pages 94--103, 2007.
A. Roth and T. Roughgarden. Interactive privacy via the median mechanism. In Proc. of ACM Symposium on Theory of Computing (STOC), pages 765--774, 2010.
X. Xiao, Y. Tao, and M. Chen. Optimal random perturbation at multiple privacy levels. Proc. of Very Large Data Bases (VLDB) Endowment, 2(1):814--825, 2009.
X. Xiao, G. Wang, and J. Gehrke. Differential privacy via wavelet transforms. In Proc. of International Conference on Data Engineering (ICDE), pages 225--236, 2010.
Full Text:
iReduct: Differential Privacy with Reduced Relative Errors. Xiaokui Xiao, Gabriel Bender, Michael Hay, Johannes ... University, Singapore, xkxiao@ntu.edu.sg; Department of Computer Science, Cornell University, Ithaca, NY, USA, {gbender,mhay,johannes}@cs.cornell.edu. ABSTRACT. Prior work in differential privacy has produced techniques for answering aggregate queries over sensitive data in a privacy-preserving way. These techniques achieve privacy by adding noise to the query answers. Their objective is typically to minimize absolute errors while satisfying differential privacy. Thus, query answers are injected with noise whose scale ... been the subject of research for decades. The current state-of-the-art paradigm for privacy-preserving data publishing is differential privacy. Differential privacy requires that the aggregate statistics reported by a data publisher should ... infer much about any single tuple in the input, and thus privacy is protected. In this paper, we consider the setting in which ...
reduce relative errors while still ensuring differential privacy. Our main contribution is the iReduct algorithm (Section 4). In ... minimizing relative errors. Ordinarily, iterative resampling would incur a considerable privacy "cost" because each noisy answer to the same query leaks additional ... queries in Q on T using an algorithm that satisfies ε-differential privacy, a notion of privacy defined based on the concept of neighboring datasets. DEFINITION 1 (NEIGHBORING DATASETS). ... the same cardinality but differ in one tuple. □ DEFINITION 2 (ε-DIFFERENTIAL PRIVACY [9]). A randomized algorithm G satisfies ε-differential privacy, if for any output O of G and any neighboring ... notations that will be frequently used in this paper. 2.2 Differential Privacy via Laplace Noise. Dwork et al. [9] show that ε-differential privacy can be achieved by adding i.i.d. noise to the result of ... the noise scale) is a parameter that controls the degree of privacy protection. A random variable that follows the Laplace distribution has ... as defined in Equation 6. Table 1: Frequently Used Notations. The resulting privacy depends both on the scale of the noise and the sensitivity ... adding Laplace noise of scale λ leads to (S(Q)/λ)-differential privacy. PROPOSITION 1 (PRIVACY FROM LAPLACE NOISE [9]). Let Q be a sequence of queries ... follows a Laplace distribution of scale λ. Then, G satisfies (S(Q)/λ)-differential privacy. □
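Proposition 1 can be sketched directly in code (an illustrative sketch under our own assumptions: the answers, sensitivity, and ε are invented, and we take S(Q) as given rather than deriving it from the queries):

```python
import math
import random

def laplace(scale: float) -> float:
    """Inverse-CDF sample from Laplace(0, scale)."""
    u = random.random() - 0.5
    return -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))

def laplace_mechanism(answers, sensitivity, epsilon):
    """Add i.i.d. Laplace noise of scale S(Q)/epsilon to each answer,
    so that the whole sequence satisfies epsilon-differential privacy."""
    scale = sensitivity / epsilon
    return [a + laplace(scale) for a in answers]

# Two counting queries; one tuple changes each count by at most 1,
# so the sensitivity of the pair is S(Q) = 2.
noisy = laplace_mechanism([120, 75], sensitivity=2, epsilon=1.0)
```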
... that achieves uniform expected relative errors but fails to satisfy differential privacy, while the second is a first attempt at a ... adding Laplace noise with unequal scale leads to GS(Q,λ)-differential privacy. PROPOSITION 2 (PRIVACY FROM UNEQUAL NOISE [32]). Let Q = [q1, . . . ... noise follows a Laplace distribution of scale λi. Then, G satisfies GS(Q,λ)-differential privacy. □ For convenience, we use LaplaceNoise(T, Q, λ) to denote the above algorithm. ... worst-case relative error. Unfortunately Proportional is flawed because it violates differential privacy. This is because the scale of the noise depends ... on one dataset than the other. The following example illustrates the privacy defect of Proportional. EXAMPLE 1. Let ε = 1 and ... + 101/1.2) = exp(535/42) > exp(ε). This demonstrates that Proportional violates differential privacy. □
... respectively). The following proposition states that TwoPhase ensures ε-differential privacy. PROPOSITION 3 (PRIVACY OF TwoPhase). TwoPhase ensures ε-differential privacy when its input parameters ε1 and ε2 satisfy ε1 + ε2 ... two invocations of the LaplaceNoise mechanism. The first invocation satisfies ε1-differential privacy, which follows from Proposition 1 and the fact that λi ... at most ε2, therefore by Proposition 2 it satisfies ε2-differential privacy. Finally, differentially private algorithms compose: the sequential application of algorithms {Gi}, each satisfying εi-differential privacy, yields (Σi εi)-differential privacy [24]. Therefore TwoPhase satisfies (ε1 + ... (but private!) answers. While this is an improvement over Proportional from a privacy perspective, TwoPhase has two principal limitations in terms of utility. The ... decrease the noise scale of any answer without violating differential privacy.
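The sequential-composition step used in the proof sketch above can be illustrated as follows (our own minimal example; the counts and the budget split are invented):

```python
import math
import random

def laplace(scale: float) -> float:
    """Inverse-CDF sample from Laplace(0, scale)."""
    u = random.random() - 0.5
    return -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))

def noisy_count(count: int, epsilon: float) -> float:
    """epsilon-DP answer to a sensitivity-1 counting query."""
    return count + laplace(1 / epsilon)

# Sequential composition: an eps1-DP release followed by an eps2-DP
# release on the same data is (eps1 + eps2)-DP overall.
eps1, eps2 = 0.3, 0.7
phase1 = noisy_count(50, eps1)   # coarse first-phase estimate
phase2 = noisy_count(50, eps2)   # refined second-phase estimate
total_epsilon = eps1 + eps2      # overall guarantee: 1.0-DP
```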
... only if ε ≥ 1/λ′ + 1/λ. In other words, the privacy "cost" of publishing independent estimates of Y′ and Y is ... 1 − y| − |c − y|)) = exp(1/λ′ + 1/λ). In contrast, the privacy cost of publishing Y′ alone is 1/λ′. That is, we ... once we generate Y′1. Intuitively, the reason for this excess privacy cost is that both Y and Y′ leak information about ... (5) To see how this restriction allows us to use the privacy budget more efficiently, let us again consider any two neighboring datasets T1 ... This does reduce the expected error, but it still has excess privacy cost compared to NoiseDown, as explained in Appendix A. This allows us to derive an upper bound on the privacy cost of an algorithm that outputs Y followed by Y′. ...
of just 1/λ′, i.e., no privacy budget is wasted on Y. In summary, if Y′ ... perform the desired resampling procedure without incurring any loss in the privacy budget. We now define the conditional probability distribution of Y ...
... whether the conservative setting of the noise scale guarantees ε-differential privacy (Line 3). This is done by measuring the generalized sensitivity of ... iReduct checks whether the revised scales are permissible given the privacy budget. This is done by measuring the generalized sensitivity given ... subset of queries whose noise can be reduced without violating the privacy constraint. At this point, it outputs the noisy answers Y. THEOREM 2 (PRIVACY OF iReduct). iReduct ensures ε∗-differential privacy whenever its input parameter ε satisfies ε ≤ ε∗. □ 5. CASE ... present an instantiation of the TwoPhase and iReduct algorithms for generating privacy-preserving marginals. Section 5.1 describes the problem and discusses existing solutions and ...
... not be able to tolerate the random noise added for privacy. Instead, we therefore publish a set M of low ... 2|M|, i.e., Laplace noise of scale 2|M|/ε suffices for privacy, as shown in Proposition 1. However, adding an equal amount ... balance the quality of M1 and M2 without degrading their overall privacy guarantee. We measure the utility of a set of noisy marginals ... that Σ_Mi (λi/|Mi| · Σ_{j=1..|Mi|} 1/max{δ, qij(T)}) is minimized, subject to the privacy constraint that the marginals should ensure ε-differential privacy, i.e., Σ_{i=1..|M|} 2/λi ≤ ε. Using a Lagrange multiplier, it ...
... the decrease in the noise scale of M′j) but the privacy guarantee degrades. Ideally, running NoiseDown on the selected marginal should lead ... in the overall error and a small increase in the privacy overhead. To identify good candidates, we adopt the following methods to quantify the changes in the overall error and privacy overhead that are incurred by invoking NoiseDown on a marginal M′j ... and 13. In light of this, we quantify the cost of privacy entailed by applying NoiseDown on M′j as g′ − g = ... between the estimated decrease in overall error and the estimated increase in privacy cost. [Figure: overall error as ε1/ε varies, for TwoPhase.] ... expected overall error. Although Oracle does not conform to ε-differential privacy, it provides a lower bound on the error incurred by ...
privacy algorithms. The second and third methods are the TwoPhase and iResamp ... the performance of the TwoPhase algorithm depends on how the fixed privacy budget is allocated across its two phases. This means that before ...
... 5, 9–12, 14, 15, 17–20, 22, 26–28, 32] for enforcing ε-differential privacy in the publication of various types of data, such as relational ...
... for Dwork et al.'s method. Besides the aforementioned work on ε-differential privacy, there is also a technique by Xiao et al. [31] ... for sampling [7]. However, a challenge remains: the noise introduced for privacy may produce marginals that are infeasible, meaning that it is impossible ... K. Chaudhuri, C. Dwork, S. Kale, F. McSherry, and K. Talwar. Privacy, accuracy, and consistency too: a holistic solution to contingency table ...
... Ligett, and A. Roth. A learning theory approach to non-interactive database privacy. In Proc. of ACM Symposium on Theory of Computing (STOC), pages 609–618, 2008. [5] K. Chaudhuri and C. Monteleoni. Privacy-preserving logistic regression. In Proc. of the Neural Information Processing Systems (NIPS), pages 289–296, 2008. [6] G. Cormode. Individual privacy vs population privacy: Learning to attack anonymization. Computing Research Repository (CoRR), abs/1011.2511, ... for sampling from conditional distributions. Annals of Statistics, 1998. [8] C. Dwork. Differential privacy. In International Colloquium on Automata, Languages and Programming (ICALP), pages 1–12, ... C. Dwork, M. Naor, T. Pitassi, and G. N. Rothblum. Differential privacy under continual observation. In Proc. of ACM Symposium on Theory of ... 30(4):888–928, 2005. [14] A. Ghosh, T. Roughgarden, and M. Sundararajan. Universally utility-maximizing privacy mechanisms. In Proc. of ACM Symposium on Theory of Computing (STOC), ... Machanavajjhala, D. Kifer, J. M. Abowd, J. Gehrke, and L. Vilhuber. Privacy: Theory meets practice on the map. In Proc. of International ... 2008. [23] F. McSherry and I. Mironov. Differentially private recommender systems: Building privacy into the netflix prize contenders. In Proc. of ACM Knowledge Discovery ... 627–636, 2009. [24] F. McSherry and K. Talwar. Mechanism design via differential privacy. In Symposium on Foundations of Computer Science (FOCS), pages 94–103, ... Data (SIGMOD), pages 735–746, 2010. [28] A. Roth and T. Roughgarden. Interactive privacy via the median mechanism. In Proc. of ACM Symposium on Theory of ... Xiao, Y. Tao, and M. Chen. Optimal random perturbation at multiple privacy levels. Proc. of Very Large Data Bases (VLDB) Endowment, 2(1):814–825, 2009. [32] ... X. Xiao, G. Wang, and J. Gehrke. Differential privacy
... generating y′i is not cost-effective since, intuitively, it incurs a privacy overhead of 1/λ′i + 1/λi (i.e., the combined cost of ... next iteration (Line 17). To show that the algorithm guarantees ε-differential privacy, it suffices to show that the set of all noisy ... of the resulting geometric series, it can be verified that the privacy cost of producing y(1)i, . . . , y(k)i ... The above argument concludes our proof of the following theorem: THEOREM 3 (PRIVACY OF iResamp). iResamp ensures ε-differential privacy
8
May 2012
PODS '12: Proceedings of the 31st ACM SIGMOD-SIGACT-SIGAI symposium on Principles of Database Systems
Publisher: ACM
Bibliometrics:
Citation Count: 18
Downloads (6 Weeks): 4, Downloads (12 Months): 83, Downloads (Overall): 556
Full text available:
PDF
In this paper we introduce a new and general privacy framework called Pufferfish. The Pufferfish framework can be used to create new privacy definitions that are customized to the needs of a given application. The goal of Pufferfish is to allow experts in an application domain, who frequently do not ...
Keywords:
differential privacy, privacy
CCS:
Privacy policies
Security and privacy
Human and societal aspects of security and privacy
Abstract:
<p>In this paper we introduce a new and general privacy framework called Pufferfish. The Pufferfish framework can be used to create new privacy definitions that are customized to the needs of a given application. The goal of Pufferfish is to allow experts in an application domain, who frequently do not have expertise in privacy, to develop rigorous privacy definitions for their data sharing needs. In addition to this, the Pufferfish framework can also be used to study existing privacy definitions.</p> <p>We illustrate the benefits with several applications of this privacy framework: we use it to formalize and prove the statement that differential privacy assumes independence between records, we use it to define and ...
Title:
A rigorous and customizable framework for privacy
References:
A. Blum, K. Ligett, and A. Roth. A learning theory approach to non-interactive database privacy. In STOC, pages 609--618, 2008.
K. Chaudhuri and N. Mishra. When random sampling preserves privacy. In CRYPTO, 2006.
B.-C. Chen, D. Kifer, K. LeFevre, and A. Machanavajjhala. Privacy-preserving data publishing. Foundations and Trends in Databases, 2(1--2):1--167, 2009.
C. Clifton, M. Kantarcioglu, and J. Vaidya. Defining privacy for data mining. In Proc. of the NSF Workshop on Next Generation Data Mining, 2002.
C. Clifton and D. Marks. Security and privacy implications of data mining. In Proceedings of the ACM SIGMOD Workshop on Data Mining and Knowledge Discovery, 1996.
Y. Duan. Privacy without noise. In CIKM, 2009.
C. Dwork. Differential privacy. In ICALP, 2006.
C. Dwork and M. Naor. On the difficulties of disclosure prevention in statistical databases or the case for differential privacy. JPC, 2(1), 2010.
B. C. M. Fung, K. Wang, R. Chen, and P. S. Yu. Privacy-preserving data publishing: A survey on recent developments. ACM Computing Surveys, 42(4), 2010.
S. R. Ganta, S. P. Kasiviswanathan, and A. Smith. Composition attacks and auxiliary information in data privacy. In KDD, 2008.
J. Gehrke, E. Lui, and R. Pass. Towards privacy for social networks: A zero-knowledge based definition of privacy. In TCC, 2011.
D. Kifer and B.-R. Lin. An axiomatic view of statistical privacy and utility. To appear in Journal of Privacy and Confidentiality.
D. Kifer and B.-R. Lin. Towards an axiomatization of statistical privacy and utility. In PODS, 2010.
D. Kifer and A. Machanavajjhala. No free lunch in data privacy. In SIGMOD, 2011.
B.-R. Lin and D. Kifer. A framework for extracting semantic guarantees from privacy definitions. Technical report, Penn State University, 2012.
A. Machanavajjhala, D. Kifer, J. Abowd, J. Gehrke, and L. Vilhuber. Privacy: From theory to practice on the map. In ICDE, 2008.
F. D. McSherry. Privacy integrated queries: An extensible platform for privacy-preserving data analysis. In SIGMOD, pages 19--30, 2009.
I. Mironov, O. Pandey, O. Reingold, and S. Vadhan. Computational differential privacy. In CRYPTO, 2009.
S. R. M. Oliveira and O. R. Zaiane. Algorithms for balancing privacy and knowledge discovery in association rule mining. In International Database Engineering and Applications Symposium, 2003.
V. Rastogi, M. Hay, G. Miklau, and D. Suciu. Relationship privacy: Output perturbation for queries with joins. In PODS, pages 107--116, 2009.
S. Sankararaman, G. Obozinski, M. I. Jordan, and E. Halperin. Genomic privacy and limits of individual detection in a pool. Nature genetics, 41(9):965--967, September 2009.
C. C. Aggarwal, J. Pei, and B. Zhang. On privacy preservation against adversarial data mining. In KDD, 2006.
V. S. Verykios, E. Bertino, I. N. Fovino, L. P. Provenza, Y. Saygin, and Y. Theodoridis. State-of-the-art in privacy preserving data mining. SIGMOD Rec., 33(1), 2004.
E. T. Wang and G. Lee. An efficient sanitization algorithm for balancing information privacy and knowledge discovery in association patterns mining. Data & Knowledge Engineering, 65(3), 2008.
K. Wang, B. Fung, and P. Yu. Template-based privacy preservation in classification problems. In ICDM, 2005.
X. Xiao and Y. Tao. Anatomy: Simple and effective privacy preservation. In VLDB, 2006.
S. Zhou, K. Ligett, and L. Wasserman. Differential privacy with compression. In ISIT, 2009.
Full Text:
... Machanavajjhala, Yahoo! Research. ABSTRACT: In this paper we introduce a new and general privacy framework called Pufferfish. The Pufferfish framework can be used to create new privacy definitions that are customized to the needs of a given application. ... in an application domain, who frequently do not have expertise in privacy, to develop rigorous privacy definitions for their data sharing needs. ... we use it to formalize and prove the statement that differential privacy assumes independence between records, we use it to define and study ... need for different parties to share datasets, the field of statistical privacy is seeing an unprecedented growth in importance and diversity of applications. These applications include protecting privacy and confidentiality in (computer) ... network data collections [32], protecting privacy and identifiability in genome-wide association studies (GWAS) [19], protecting confidentiality in ... In each case, the goal is to release useful information (i.e. privacy-preserving query answers or a sanitized version of the dataset) while ... data-generating process, such as correlations between records, play an important role. Assumptionless privacy definitions are a myth: if one wants to publish useful, privacy-preserving sanitized data then one must make assumptions about the original ... [22, 14]. Thus application domain experts, who are frequently not experts in privacy, cannot simply use a single generic privacy definition -- they must develop a new privacy definition or customize an existing one. Without guidance this has resulted in ...
by proposed fixes, followed by new attacks. At the same time, privacy experts need new methods for analyzing new and existing privacy definitions in order to evolve the state of the art. Many ... datasets; we show how these approaches frequently obscure the semantics of the privacy ... guarantees that are provided. In this paper we present a new privacy framework, called Pufferfish. This framework can be used to study existing privacy definitions like differential privacy [12], to study important concepts like composition [17], and to ... The Pufferfish framework follows modern design guidelines such as adherence to privacy axioms [21, 20] and making assumptions as explicit as possible. The contributions of this paper are: a new privacy framework which provides rigorous statistical semantic privacy guarantees; an application of the framework to differential privacy which formalizes and then proves the statement that differential privacy "assumes" independence between records; another application to differential privacy showing how to modify it in response to prior release of ... information (naive applications of differential privacy can lead to a privacy breach [22]); an application of the framework that allows us to expand the notion of composition between privacy
... related work in Section 4. We show that Pufferfish satisfies fundamental privacy axioms in Section 5. Subsequent sections present a variety of applications of this framework. We use Pufferfish to analyze differential privacy and prove its record-independence assumption in Section 6. We show how ... to study composition in Section 8. We show how to provide privacy while accounting for prior data releases in Section 9. Proofs can ... Data represents a random variable. The data curator will choose a privacy definition and a privacy mechanism (algorithm) M that satisfies that privacy definition. The data curator will then apply M to the data ... data equals Di is P(Data = Di | θ). M: a privacy mechanism, i.e. a deterministic or randomized algorithm (often used in the context of a privacy definition). Table 1: Table of Notation. ... the record for individual hi ...
... of Dk and sj is true of D` -- the resulting privacy definition (instantiated by Definition 3.1) will often be too strong ... assumptions are absolutely necessary -- privacy definitions that can provide privacy guarantees without making any assumptions provide little utility beyond the default ... of specifying these distributions in Section 6 where we analyze differential privacy. Below we give some examples of possible choices of D ... Most importantly, the domain expert is no longer required to be a privacy expert. Definition 3.1 (Pufferfish Privacy). Given a set of potential secrets S, a set of discriminative pairs Spairs, a set of data evolution scenarios D, and a privacy
ε-PufferFish(S, Spairs, D) privacy if, for all possible outputs ω ∈ range(M), for all ... P(sj | θ) ≠ 0)). The Pufferfish framework differs from differential privacy [12] and its variants [11, 30, 5, 3, 22, 21, ... trying to distinguish between whether si or sj is true. 3.2 Example: Privacy with no Assumptions. As a warmup, we use Pufferfish to create a privacy definition with no assumptions (a re-interpretation of no-free-lunch privacy [22]). Let T be the domain of tuples and let H ... where ta and tb range over all possible tuple values). To get privacy with no assumptions, we must make the set of distributional assumptions ...
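The excerpted definition bounds, for every output and every discriminative pair, the ratio of the mechanism's output probabilities given si versus sj. A minimal sketch of such a check (the `mech_dist` interface and the randomized-response example are illustrative assumptions, not from the paper):

```python
import math

def satisfies_pufferfish_pair(mech_dist, epsilon):
    """Check e^{-eps} <= P(M=w | s_i, theta) / P(M=w | s_j, theta) <= e^{eps}
    for every output w, for one discriminative pair (s_i, s_j) and one
    data evolution scenario theta (both baked into mech_dist here)."""
    p_i = mech_dist("s_i")  # output distribution of M given secret s_i
    p_j = mech_dist("s_j")  # output distribution of M given secret s_j
    return all(
        math.exp(-epsilon) <= p_i[w] / p_j[w] <= math.exp(epsilon)
        for w in p_i
    )

# Randomized response: report the true bit with probability 0.75.
rr = lambda s: {"0": 0.75, "1": 0.25} if s == "s_i" else {"0": 0.25, "1": 0.75}

print(satisfies_pufferfish_pair(rr, math.log(3) + 0.01))  # True: ratio is 3
print(satisfies_pufferfish_pair(rr, 1.0))                 # False: e^1 < 3
```

Randomized response with a 3:1 odds of reporting the truth needs epsilon of at least ln 3 before the bound holds, which matches the odds-ratio reading of the definition.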
... formal equivalence between Pufferfish with no assumptions and a strawman privacy definition called no-free-lunch privacy that was used in [22] as an ... of a privacy definition without utility. 4. RELATED WORK. 4.1 Relation to Differential Privacy Variants. One can view the Pufferfish framework as a substantial generalization of differential privacy. Thus we explain the differences between Pufferfish and differential privacy [12] and its variants [11, 30, 5, 3, 22, 21, 25, 27, 18, 41, 33]. A related framework, known as adversarial privacy [33], allows domain experts to plug in various data generating distributions. While there is no known equivalence between adversarial privacy and differential privacy, Rastogi et al. [33] have proved an equivalence between a certain instantiation of adversarial privacy and a variant of differential privacy known as ε-indistinguishability (the main difference with differential privacy is the requirement that neighboring databases have the same size) ... more flexible. We can prove equivalences between instantiations of Pufferfish and differential privacy as well as ε-indistinguishability. Pufferfish also allows more fine-grained protection ... is the result of a different philosophy about how to phrase privacy definitions. Differential privacy uses concepts like neighboring databases (i.e. those that differ on only ... to be independent). This allows us to compare Pufferfish to differential privacy and to show that differential privacy is an instantiation of the Pufferfish framework under an assumption of ... 22, 21, 25, 27, 41, 18]. Of the remaining work, adversarial privacy [33] only seeks to protect inference about the presence of a ...
protect inference about the presence of a tuple, while BLR distributional privacy [3] does not indicate what secrets it protects. One of the ... allow the applications described in this paper. 4.2 Relationship to Other Work. Other privacy frameworks also exist. A large class, of which k-anonymity [34] is ... to outputs: P(M(Di) = ω). Because of this, the resulting privacy definitions tend to be less secure and are often subject to ...
... follows the syntactic paradigm and it is often unclear what rigorous privacy guarantees can be provided or what data assumptions are needed for ensuring privacy. 5. PUFFERFISH AND PRIVACY AXIOMS. Research in statistical privacy has been moving away from ad-hoc privacy definitions and towards formal and rigorous privacy definitions. The reason is that rigorous privacy definitions offer the promise of ending the endless cycle of discovering a vulnerability in a privacy definition, proposing a fix, finding a vulnerability in the "fixed" version, ... examples). To this end, recent research has started examining the properties that privacy definitions need to have [21, 20]. Modern design guidelines for privacy definitions include two fundamental axioms known as transformation invariance and convexity [20]. ... the ideas contained in the axioms have been accepted by the privacy community for a long time, only recently has there been an insistence that privacy definitions actually satisfy them (in fact, many of the vulnerabilities ... satisfying those axioms [24]). In this section we show that every privacy definition in the Pufferfish framework satisfies both fundamental axioms, thus ensuring that ... 5.1 (Transformation Invariance [20]). If an algorithm M satisfies a privacy definition and A is any algorithm such that (1) its domain ... then runs A on the output should also satisfy the same privacy definition. The justification for the transformation invariance axiom is that A is ... the output of M; thus it would be strange if a privacy definition implied the output ω of M was safe to release, ... the statistical analysis on this output ω were not (many existing privacy definitions fail to satisfy this property [20]). Axiom 5.2 (Convexity [20]). If ...
property [20]). Axiom 5.2 (Convexity [20]). If M1 and M2 satisfy a privacy definition, and p ∈ [0, 1], then the algorithm Mp which ... p and M2 with probability 1 − p should also satisfy the privacy definition. The convexity axiom says that a data curator is allowed ... into the creation of sanitized data). Again, most proposed existing privacy definitions fail to satisfy this property. The following theorem confirms that the Pufferfish framework satisfies modern privacy design guidelines. Theorem 5.1. For every S, Spairs, D, and ε ... axioms of convexity and transformation invariance. 6. PUFFERFISH ANALYSIS OF DIFFERENTIAL PRIVACY. Differential privacy [12] is a state-of-the-art privacy definition which has been very influential in modern privacy research. It is formally defined as: Definition 6.1 (Differential Privacy [12]). Given a privacy parameter ε > 0, an algorithm M satisfies ε-differential privacy if for all ω ∈ range(M) and all pairs of datasets ... much about the remaining tuple. Because neither the definition of differential privacy nor its interpretations mentioned any data-generating distributions, many believed that it ... what data evolution scenarios D is ε-differential privacy equal to the privacy definition ε-PufferFish(S, Spairs, D)? We will see that the appropriate D ...
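Definition 6.1 above can be instantiated concretely. A minimal sketch of the standard Laplace mechanism for a counting query (sensitivity 1); this generic mechanism is from the differential-privacy literature, not specific to this paper, and the function names are illustrative:

```python
import math
import random

def laplace_noise(scale):
    # Inverse-CDF sample from Laplace(0, scale).
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def dp_count(records, predicate, epsilon):
    # A counting query has sensitivity 1: adding or removing one record
    # changes the count by at most 1, so Laplace(1/epsilon) noise gives
    # P(M(D1) = w) <= e^{epsilon} * P(M(D2) = w) for neighboring D1, D2.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Shrinking epsilon increases the noise, trading accuracy for a tighter bound on how much any single record can shift the output distribution.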
... then the resulting Pufferfish instantiation is strictly stronger than ε-differential privacy (Theorem 6.2); alternatively, under correlations between records, ε-differential privacy is not strong enough to guarantee that the changes in attacker's ... bounds on an attacker's inference that result from uses of ε-differential privacy degrade under correlations). To proceed, we must first specify the potential ... and fi). The following theorem says that under this probabilistic model, ε-differential privacy becomes an instantiation of the Pufferfish framework. Theorem 6.1. Let S and ... CONTINUOUS ATTRIBUTES AND AGGREGATE SECRETS. One of the difficult problems in privacy-preserving data publishing is protecting the values of continuous variables that are ... large values (such as income). For example, many algorithms for differential privacy do not work in the first case (i.e. no a priori ... protect aggregate secrets (Section 7.2). Finally, we use Pufferfish to provide approximate privacy semantics for constrained ε-differential privacy [41] and ZLW distributional privacy [41] (Section 7.3); those two definitions were also designed for continuous ... clear. 7.1 Protecting Continuous Attributes. As we saw in Section 6, differential privacy is designed to make it difficult to distinguish between the case ...
... of utility is unacceptable, the data curator may want to relax privacy by stating requirements such as (1) an attacker should not be ... Both of these requirements can be handled in the Pufferfish framework. 7.1.1 Privacy via Absolute Error. For ease of explanation, suppose that records belonging to ... to each other with high probability. In contrast, satisfying differential privacy by adding noise to the sum would require a distribution with ... the relaxation created using Pufferfish allows more utility while clearly describing the privacy lost (i.e. income is inferable to an absolute error of ... X + Σ_{i=1}^n t_i, where X has density (ε/8k)e^(−ε|x|/4k), satisfies ε-PufferFish(S, Spairs, D). 7.1.2 Privacy via Relative Error. We can extend these ideas to protect against ...
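The closing theorem sketch above says that releasing the sum plus noise with density (ε/8k)e^(−ε|x|/4k), i.e. Laplace noise of scale 4k/ε, suffices for the absolute-error relaxation. A sketch under that reading (the function name and parameters are illustrative assumptions):

```python
import math
import random

def release_sum_abs_error(values, k, epsilon):
    # Noise with density (eps/8k) * exp(-eps*|x|/(4k)) is a Laplace
    # distribution with scale 4k/eps; because the relaxed Pufferfish
    # instantiation only hides each value up to an absolute error of ~k,
    # the noise scale depends on k, not on the full range of the data.
    scale = 4.0 * k / epsilon
    u = random.random() - 0.5
    noise = -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)
    return sum(values) + noise
```

Under plain ε-differential privacy the same sum query would need noise proportional to the largest possible value, which is the contrast the excerpt draws.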
... there is little focus in the literature on rigorous and formal privacy guarantees for business data. In some cases a business may have ... 7.3 Constrained and Distributional Privacy. Zhou et al. [41] also proposed two privacy definitions that can be used with continuous variables. Those definitions were introduced solely for the study of utility, and their precise privacy semantics (i.e. what inferences do they protect against) were neither explained nor explored. As with differential privacy, they are phrased in terms of databases that should be indistinguishable. As a result the privacy guarantees and the conditions under which they hold are not clear. We show an approximate equivalence between those privacy definitions and instantiations of Pufferfish, so that the Pufferfish framework ... nonrigorous approaches are rare for business data. ... and gives them clearer privacy semantics. We start with those definitions: Definition 7.1 (Constrained and ZLW-Distributional Privacy) ... tuple and (D1, D2) is in the constraint set, then algorithm M satisfies constrained ε-differential privacy. If those conditions hold when (1) D1 and D2 ... > 0 be constants. An algorithm M satisfies (ε, δ)-modified ZLW privacy if for every ω ∈ range(M) and every pair of databases ... ε-PufferFish(S′, Spairs′, D) then it also satisfies (ε, δ)-modified ZLW privacy; conversely, if M satisfies (ε, δ)-modified ZLW privacy then it satisfies the definition 4ε-PufferFish(S′, Spairs′, D) (i.e. up to ... of semantic guarantees in terms of odds-ratio). Thus although the precise privacy
... use the prefix ZLW to distinguish it from the distributional privacy definition introduced in [3]. This condition is achieved, for example, by ... is less than the radius of I. 8. COMPOSITION. Given a privacy definition, the notion of composition [17] refers to the degradation of privacy due to two independent data releases. For example, Alice may choose ... can each run an algorithm MAlice and MBob (possibly satisfying different privacy definitions) on their own dataset and output the result. It is known that privacy can degrade in those instances. For example, two independent releases of k-anonymous tables can lead to a privacy breach [17]. Also, a differentially private release combined with a release of ... can also lead to a breach [22]. On the other hand, differential privacy composes well with itself: if Alice uses an ε1-differentially private algorithm ... guarantees that the combination of their data releases will not breach privacy. 8.1 Pufferfish View of Composition. Since the Pufferfish framework provides a ... to the Pufferfish framework). Alice would like to examine the consequences to privacy that can occur when they both release sanitized data using their chosen privacy definitions. In order for Alice to study how her privacy definition composes with possible data releases from Bob, she needs ... = ω) E[P(A(DataBob) = ω′) | DataAlice = D, θ]. Thus to study the privacy properties of this joint data release, Alice only needs to ... all M satisfying her privacy definition, and all A satisfying Bob's privacy
... M_{ω,A,M} (for all choices of ω, M, A) satisfies ε′-PufferFish(S, Spairs, D) (i.e. her privacy definition with a different privacy parameter ε′). 8.2 Self-composition. In this section, we study a special case ... this special case self-composition. This is a helpful property for privacy definitions to have since it is useful in the design ... = D) = 1). Thus, Alice has a dataset Data, announces a privacy definition ε-PufferFish(S, Spairs, D) and chooses two algorithms M1 and M2 ... satisfies Alice's chosen instantiation of the Pufferfish framework with some privacy parameter ε′. This brings up the notion of linear self-composition: Definition ... simpler algorithms M1, . . . , Mk and allocate her overall privacy budget ε among them [26]. 8.2.1 Sufficient conditions for self-composition. In general, ... additional constraint that an algorithm M must satisfy. If we have a privacy definition PufferFish(S, Spairs, D) that self-composes linearly, it can happen that ... θ is a universally composable evolution scenario for Spairs if the privacy definition PufferFish(S, Spairs, D ∪ {θ}) self-composes linearly whenever PufferFish(S, Spairs, D) ... ω) with randomness only depending on M. In the case of differential privacy, those universally composable evolution scenarios θ are those for ... of the other distributions that generate records independently without change to the privacy guarantees, and Theorem 6.2 says that differentially private algorithms may leak ...
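The linear self-composition discussed above is what lets a curator split a single privacy budget across sub-algorithms. A toy accountant sketch (the class and its API are illustrative, assuming k releases at ε1, ..., εk jointly cost ε1 + ... + εk):

```python
class PrivacyBudget:
    """Toy accountant for linear self-composition: releases at
    eps_1, ..., eps_k jointly satisfy (eps_1 + ... + eps_k)-privacy,
    so refuse any release that would exceed the total budget."""

    def __init__(self, total_epsilon):
        self.total = total_epsilon
        self.spent = 0.0

    def charge(self, epsilon):
        if self.spent + epsilon > self.total:
            raise RuntimeError("privacy budget exhausted")
        self.spent += epsilon

budget = PrivacyBudget(1.0)
budget.charge(0.5)  # first release
budget.charge(0.5)  # second release; the budget is now fully spent
```

Any further `charge` call on this budget raises, mirroring the excerpt's point that composition degrades privacy linearly in the parameters of the individual releases.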
PRIVACY WITH DETERMINISTIC CONSTRAINTS. It was shown in [22] that differential privacy does not compose well with deterministic data constraints such as those caused ... about the data and subsequently publishes additional information using ε-differential privacy, the combined data releases can leak much more information than ... [22] proposed a modification of differential privacy, called induced neighbors privacy [22], to account for prior deterministic data releases. As with many variants of differential privacy, it was a "neighbors-based" definition that tried to ... we have shown in Sections 6 and 7, this can obscure privacy guarantees and the conditions under which the guarantees hold. Viewed through ... bound the attacker's odds ratio (the main reason being that neighbor-based privacy definitions often cannot explicitly mention what secrets to protect). In this section ... 8 to show how to use Pufferfish to modify differential privacy in a way that takes into account arbitrary deterministic constraints (not ... by prior deterministic releases of data). The result is a privacy definition with precise semantic guarantees and clearly specified assumptions under which ... they hold. We also show some conditions under which induced neighbors privacy [22] is actually equivalent to an instantiation of the Pufferfish framework, thus providing induced neighbors privacy with precise semantic guarantees in those situations. 9.1 Preliminaries. Several types of constraints ... help illustrate the benefits of Pufferfish and distinguish it from neighbor-based privacy definitions. Suppose there are n students with ID numbers ranging ... not possible to release meaningful information in this situation, but differential privacy
induced-neighbors privacy, Definition 9.2). Induced neighbors privacy [22] uses the following definitions: Definition 9.1 (Move [22]). Given a ... can transform Da into some Dc ∈ IQ. Definition 9.3 (Induced Neighbors Privacy). An algorithm M satisfies induced neighbors privacy with constraint Q if, for each output ω ∈ range(M) and ... that the Laplace mechanism from Example 9.1 also satisfies induced neighbors privacy for this particular scenario since all induced neighbors are pairs (Di, ... ZQ = P(Q(Data) = 1). We show that ε-induced neighbors privacy is a necessary condition for guaranteeing ε-PufferFish(S, Spairs, D_Q), for any general ... if M satisfies ε-PufferFish(S, Spairs, D_Q) then M satisfies ε-induced neighbors privacy with respect to Q. However, the next example shows ε-induced neighbors ... the Laplace mechanism, which satisfies both ε-differential privacy and ε-induced neighbors privacy, does not satisfy ε-PufferFish(S, Spairs, D_Q). Consider a θ of ... P(M(Data) = n | σ(1,0), θ). Therefore satisfying ε-differential privacy or induced neighbors privacy in this situation does not bound an attacker's odds-ratio to the ... like partitioning and microaggregation [1], or bucketization algorithms based on syntactic privacy notions like k-anonymity (with say k = 10,000), ℓ-diversity, etc. ...
... an attacker could make. In fact, for those cases, ε-induced neighbors privacy becomes an instantiation of the Pufferfish framework (Theorems 9.1 and ... Then M satisfies ε-PufferFish(S, Spairs, D_Quni) if M satisfies ε-induced neighbors privacy with respect to Quni. Thus the algorithms proposed in [22] can ... question that was left open in [22] is whether induced neighbors privacy is linearly self-composable. Theorems 9.1 and 9.2 allow us to answer this question. Since ε-PufferFish(S, Spairs, D_Quni) and induced neighbors privacy (for univariate histograms) are equivalent definitions, it is easy to see ... and general framework that allows application domain experts to develop rigorous privacy definitions for their data sharing needs. The framework allows the domain experts to customize privacy to the specific set of secrets and data evolution scenarios ... We used our general framework to prove the statement that differential privacy assumed independence between records, and to define and study notions of composition ... Ligett, and A. Roth. A learning theory approach to non-interactive database privacy. In STOC, pages 609--618, 2008. [4] P. J. Cantwell, H. Hogan, ... 58(3):203--212, 2004. [5] K. Chaudhuri and N. Mishra. When random sampling preserves privacy. In CRYPTO, 2006. [6] B.-C. Chen, D. Kifer, K. LeFevre, ... Data Mining, 2002. [9] C. Clifton and D. Marks. Security and privacy implications of data mining. In Proceedings of the ACM SIGMOD Workshop on ... approach to sensitive classification rule hiding. In SAC, 2010. [11] Y. Duan. Privacy without noise. In CIKM, 2009. [12] C. Dwork. Differential privacy. In ICALP, 2006. [13] C. Dwork, F. McSherry, K. Nissim, ... difficulties of disclosure prevention in statistical databases or the case for differential privacy. JPC, 2(1), 2010. [15] S. E. Fienberg. Confidentiality and disclosure ... KDD, 2008. [18] J. Gehrke, E. Lui, and R. Pass. Towards privacy for social networks: A zero-knowledge based definition of privacy. In TCC, 2011. [19] N. Homer, S. Szelinger, M. Redman, D. ... appear in Journal of Privacy and Confidentiality. [21] D. Kifer and B.-R. Lin. Towards an axiomatization of statistical privacy and utility. In PODS, 2010. [22] D. Kifer and A. Machanavajjhala. ... Lin and D. Kifer. A framework for extracting semantic guarantees from privacy definitions. Technical report, Penn State University, 2012. [25] A. Machanavajjhala, D. Kifer, J. Abowd, J. Gehrke, and L. Vilhuber. Privacy: From theory to practice on the map. In ICDE, 2008. [26] F. D. McSherry. Privacy integrated queries: An extensible platform for privacy-preserving data analysis. In SIGMOD, pages 19--30, 2009. [27] I. Mironov, O. Pandey, O. Reingold, and S. Vadhan. Computational differential privacy. In CRYPTO, 2009. [28] G. V. Moustakides and V. S. ... S. R. M. Oliveira and O. R. Zaiane. Algorithms for balancing privacy and knowledge discovery in association rule mining. In International Database Engineering ... S. Sankararaman, G. Obozinski, M. I. Jordan, and E. Halperin. Genomic privacy and limits of individual detection in a pool. Nature Genetics, 41(9):965--967, September ... Wang and G. Lee. An efficient sanitization algorithm for balancing information privacy and knowledge discovery in association patterns mining. Data & Knowledge Engineering, 65(3), ... privacy with compression. In ISIT, 2009.
9
January 2014
ACM Transactions on Database Systems (TODS): Volume 39 Issue 1, January 2014
Publisher: ACM
Bibliometrics:
Citation Count: 13
Downloads (6 Weeks): 12, Downloads (12 Months): 197, Downloads (Overall): 886
Full text available:
PDF
In this article, we introduce a new and general privacy framework called Pufferfish. The Pufferfish framework can be used to create new privacy definitions that are customized to the needs of a given application. The goal of Pufferfish is to allow experts in an application domain, who frequently do not ...
Keywords:
Privacy, differential privacy
CCS:
Privacy policies
Security and privacy
Human and societal aspects of security and privacy
Abstract:
<p>In this article, we introduce a new and general privacy framework called Pufferfish. The Pufferfish framework can be used to create new privacy definitions that are customized to the needs of a given application. The goal of Pufferfish is to allow experts in an application domain, who frequently do not have expertise in privacy, to develop rigorous privacy definitions for their data sharing needs. In addition to this, the Pufferfish framework can also be used to study existing privacy definitions.</p> <p>We illustrate the benefits with several applications of this privacy framework: we use it to analyze differential privacy and formalize a connection to attackers who believe that the ... data records are independent; we use it to create a privacy definition called hedging privacy, which can be used to rule out attackers whose ...
Title:
Pufferfish: A framework for mathematical privacy definitions
References:
Raghav Bhaskar, Abhishek Bhowmick, Vipul Goyal, Srivatsan Laxman, and Abhradeep Thakurta. 2011. Noiseless database privacy. In Proceedings of the 17th International Conference on the Theory and Application of Cryptology and Information Security (ASIACRYPT).
Avrim Blum, Katrina Ligett, and Aaron Roth. 2008. A learning theory approach to non-interactive database privacy. In Proceedings of the 40th Annual ACM Symposium on Theory of Computing (STOC). ACM, New York, NY, 609--618.
Kamalika Chaudhuri and Nina Mishra. 2006. When random sampling preserves privacy. In Proceedings of the 26th Annual International Cryptology Conference on Advances in Cryptology (CRYPTO). Lecture Notes in Computer Science, vol. 4117, Springer-Verlag, Berlin, 198--213.
Bee-Chung Chen, Daniel Kifer, Kristen LeFevre, and Ashwin Machanavajjhala. 2009. Privacy-preserving data publishing. Foundations and Trends in Databases 2, 1--2, 1--167.
C. Clifton, M. Kantarcioglu, and J. Vaidya. 2002. Defining privacy for data mining. In Proceedings of the NSF Workshop on Next Generation Data Mining.
Chris Clifton and Don Marks. 1996. Security and privacy implications of data mining. In Proceedings of the ACM SIGMOD Workshop on Data Mining and Knowledge Discovery. ACM, New York, NY.
Irit Dinur and Kobbi Nissim. 2003. Revealing information while preserving privacy. In Proceedings of the 22nd ACM SIGMOD-SIGACT-SIGART Symposium on Principles of Database Systems (PODS).
Yitao Duan. 2009. Privacy without noise. In Proceedings of the 18th ACM Conference on Information and Knowledge Management (CIKM).
Cynthia Dwork. 2006. Differential privacy. In Proceedings of the 33rd International Colloquium on Automata, Languages and Programming (ICALP).
Cynthia Dwork. 2008. Differential privacy: A survey of results. In Proceedings of the 5th International Conference on Theory and Applications of Models of Computation (TAMC). Lecture Notes in Computer Science, vol. 4978, Springer-Verlag, Berlin, 1--9.
Cynthia Dwork, Krishnaram Kenthapadi, Frank McSherry, Ilya Mironov, and Moni Naor. 2006a. Our data, ourselves: Privacy via distributed noise generation. In Proceedings of the Advances in Cryptology (EUROCRYPT). Lecture Notes in Computer Science, vol. 4004, Springer-Verlag, Berlin, 486--503.
C. Dwork, F. McSherry, and K. Talwar. 2007. The price of privacy and the limits of LP decoding. In Proceedings of the 39th Annual ACM Symposium on Theory of Computing (STOC). 85--94.
Cynthia Dwork and Moni Naor. 2010. On the difficulties of disclosure prevention in statistical databases or the case for differential privacy. J. Privacy Confidentiality 2, 1, Article 8.
Cynthia Dwork, Moni Naor, Toniann Pitassi, and Guy N. Rothblum. 2010a. Differential privacy under continual observation. In Proceedings of the 42nd ACM Symposium on Theory of Computing (STOC).
B. C. M. Fung, K. Wang, R. Chen, and P. S. Yu. 2010. Privacy-preserving data publishing: A survey on recent developments. Comput. Surv. 42, 4, Article 14.
Srivatsava Ranjit Ganta, Shiva Prasad Kasiviswanathan, and Adam Smith. 2008. Composition attacks and auxiliary information in data privacy. In Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD). ACM, New York, NY, 265--273.
Johannes Gehrke, Michael Hay, Edward Lui, and Rafael Pass. 2012. Crowd-blending privacy. In Proceedings of the 32nd Annual Cryptology Conference on Advances in Cryptology (CRYPTO). Lecture Notes in Computer Science, vol. 7417, Springer-Verlag, Berlin, 479--496.
Johannes Gehrke, Edward Lui, and Rafael Pass. 2011. Towards privacy for social networks: A zero-knowledge based definition of privacy. In Proceedings of the 8th Theory of Cryptology Conference (TCC). Lecture Notes in Computer Science, vol. 6597, Springer-Verlag, Berlin, 432--449.
Arpita Ghosh, Tim Roughgarden, and Mukund Sundararajan. 2009. Universally utility-maximizing privacy mechanisms. In Proceedings of the 41st Annual ACM Symposium on Theory of Computing (STOC). 351--360.
Robert Hall, Larry Wasserman, and Alessandro Rinaldo. 2012. Random differential privacy. J. Privacy Confidentiality 4, 2.
Shiva Prasad Kasiviswanathan and Adam Smith. 2008. A note on differential privacy: Defining resistance to arbitrary side information. http://arxiv.org/abs/0803.3946.
Daniel Kifer. 2009. Attacks on Privacy and de Finetti's theorem. In Proceedings of the ACM SIGMOD International Conference on Management of Data (SIGMOD). 127--138.
Daniel Kifer and Bing-Rong Lin. 2010. Towards an axiomatization of statistical privacy and utility. In Proceedings of the 29th ACM SIGMOD-SIGACT-SIGART Symposium on Principles of Database Systems (PODS). ACM, New York, NY, 147--158.
Daniel Kifer and Bing-Rong Lin. 2012. An axiomatic view of statistical privacy and utility. J. Privacy Confidentiality 4, 1.
Daniel Kifer and Ashwin Machanavajjhala. 2011. No free lunch in data privacy. In Proceedings of the ACM SIGMOD International Conference on Management of Data (SIGMOD). ACM, New York, NY, 193--204.
Daniel Kifer and Ashwin Machanavajjhala. 2012. A rigorous and customizable framework for privacy. In Proceedings of the 31st ACM SIGMOD-SIGACT-SIGART Symposium on Principles of Database Systems (PODS). 77--88.
Ashwin Machanavajjhala, Daniel Kifer, John Abowd, Johannes Gehrke, and Lars Vilhuber. 2008. Privacy: From theory to practice on the map. In Proceedings of the 24th International Conference on Data Engineering (ICDE). 277--286.
Frank D. McSherry. 2009. Privacy integrated queries: An extensible platform for privacy-preserving data analysis. In Proceedings of the ACM SIGMOD International Conference on Management of Data (SIGMOD). ACM, New York, NY, 19--30.
Ilya Mironov, Omkant Pandey, Omer Reingold, and Salil Vadhan. 2009. Computational differential privacy. In Proceedings of the 29th Annual International Cryptology Conference on Advances in Cryptology (CRYPTO). Lecture Notes in Computer Science, vol. 5677, Springer-Verlag, Berlin, 126--142.
Stanley R. M. Oliveira and Osmar R. Zaiane. 2003. Algorithms for balancing privacy and knowledge discovery in association rule mining. In Proceedings of the 7th International Database Engineering and Applications Symposium. 54--63.
Vibhor Rastogi, Michael Hay, Gerome Miklau, and Dan Suciu. 2009. Relationship privacy: Output perturbation for queries with joins. In Proceedings of the 28th ACM SIGMOD-SIGACT-SIGART Symposium on Principles of Database Systems (PODS). ACM, New York, NY, 107--116.
Charu C. Aggarwal, Jian Pei, and Bo Zhang. 2006. On privacy preservation against adversarial data mining. In Proceedings of the 12th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD). ACM, New York, NY.
Sriram Sankararaman, Guillaume Obozinski, Michael I. Jordan, and Eran Halperin. 2009. Genomic privacy and limits of individual detection in a pool. Nature Genetics 41, 9, 965--967.
Vassilios S. Verykios, Elisa Bertino, Igor Nai Fovino, Loredana Parasiliti Provenza, Yucel Saygin, and Yannis Theodoridis. 2004a. State-of-the-art in privacy preserving data mining. SIGMOD Rec. 33, 1, 50--57.
En Tzu Wang and Guanling Lee. 2008. An efficient sanitization algorithm for balancing information privacy and knowledge discovery in association patterns mining. Data Knowl. Eng. 65, 3, 463--484.
Ke Wang, Benjamin Fung, and Philip Yu. 2005. Template-based privacy preservation in classification problems. In Proceedings of the 5th IEEE International Conference on Data Mining (ICDM). 466--473.
Xiaokui Xiao and Yufei Tao. 2006. Anatomy: Simple and effective privacy preservation. In Proceedings of the 32nd International Conference on VLDB (VLDB). 139--150.
Shuheng Zhou, Katrina Ligett, and Larry Wasserman. 2009. Differential privacy with compression. In Proceedings of the IEEE International Symposium on Information Theory (ISIT). 2718--2722.
Full Text:
... the data is protected by strong notions, such as differential privacy [Dwork et al. 2006b; Dwork 2006]). Variability in sensitive information comes in ... values) need to be protected, and certain individuals may require more privacy protection than others. Finally, in some applications, a certain level of ... how should one account for such information disclosure when planning future privacy-preserving data releases? A related concern is that an organization can only justify the expense of a privacy-preserving public data release if it provides a certain level ... are fundamental limits on the utility permitted by current state-of-the-art privacy mechanisms [Dwork et al. 2007; Machanavajjhala et al. 2011]. It ... 2011] that no privacy mechanism can provide both high utility and privacy against attackers with arbitrary background knowledge or beliefs about the data. ... customizable framework called Pufferfish that makes it easier to generate new privacy definitions with rigorous statistical guarantees about the leakage of sensitive ... Outline. The contributions of this article are as follows.
- A new Bayesian privacy framework which provides rigorous privacy guarantees against many types of attackers.
- A new privacy definition called hedging privacy ... and a histogram release algorithm for it. The goal of hedging privacy is to protect against reasonable attackers while automatically ruling out those ... whose beliefs about the data are implausible.
- An analysis of differential privacy within our framework. We present crisp Bayesian semantics for differential privacy in terms of data-generating distributions that model records independently.
- A principled extension of differential privacy that accounts for prior releases of non-differentially private information (without ...
... Article 3, Publication date: January 2014. Pufferfish: A Framework for Mathematical Privacy Definitions 3:3
- An application of the framework to study the notion ... of applications of this framework. We use Pufferfish to analyze differential privacy and clarify its connections to data distributions that model records independently in Section 6. We present hedging privacy and a histogram publishing algorithm in Section 7. We show ... study composition in Section 9. We show how to provide privacy while accounting for prior data releases in Section 10. We explain ...
... denote the record values (tuples). The data curator will choose a privacy definition and a privacy mechanism (algorithm) M that satisfies that privacy definition. The data curator will then apply M to the ... these three components the framework generates a rich class of privacy definitions. The set of potential secrets S is an explicit specification ... data equals Di is P(Data = Di | θ).
M: a privacy mechanism (a deterministic or randomized algorithm, often used in the context of a privacy definition)
add a statement s to the potential secrets S if ... not in the table'). The discriminative pairs allow for highly customizable privacy guarantees. For example, we can specify such discriminative pairs (e.g., ...
... is true of Dk and sj is true of D', the resulting privacy definition (instantiated by Definition 3.4) will often be too strong, ... Note that assumptions are absolutely necessary: privacy definitions that can provide privacy guarantees without making any assumptions provide ... specifying these distributions in Section 6 where we analyze differential privacy. Below we give some examples of possible choices of D ... in Section 6 the close connection between this example and differential privacy. Note that records here are still independent, so this choice ... importantly, the domain expert is no longer required to be a privacy expert.
Definition 3.4 (Pufferfish Privacy). Given a set of potential secrets S, a set of ... Spairs, a set of data evolution scenarios D, and a privacy parameter ε > 0, a (potentially randomized) algorithm M satisfies ε-PufferFish(S, Spairs, D) privacy if (i) for all possible outputs ω ∈ range(M), (ii) for all ...
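For a finite mechanism, the quantifier structure of Definition 3.4 can be checked by brute force. The sketch below is illustrative rather than from the article: it assumes the conditional output distributions P(M(Data) = ω | s, θ), already marginalized over a single data evolution scenario θ, are supplied as dictionaries, and the randomized-response mechanism over one bit is a hypothetical example.

```python
import math

def pufferfish_holds(output_dists, secret_pairs, eps):
    """Exhaustively check the epsilon-Pufferfish condition for a discrete
    mechanism: for every output omega and every discriminative pair
    (s_i, s_j), the two conditional probabilities of omega must be within
    a factor of exp(eps) of each other."""
    bound = math.exp(eps)
    for s_i, s_j in secret_pairs:
        for omega in output_dists[s_i]:
            p_i = output_dists[s_i][omega]
            p_j = output_dists[s_j][omega]
            if p_i > bound * p_j or p_j > bound * p_i:
                return False
    return True

# Hypothetical example: randomized response on one bit, reporting the
# true bit with probability 3/4 (so the output-probability ratio is 3).
rr = {"bit=1": {1: 0.75, 0: 0.25}, "bit=0": {1: 0.25, 0: 0.75}}
pairs = [("bit=1", "bit=0")]
print(pufferfish_holds(rr, pairs, math.log(4)))  # True (ratio 3 <= 4)
print(pufferfish_holds(rr, pairs, math.log(2)))  # False (ratio 3 > 2)
```

In the full definition the check must also range over every θ in D; the brute-force loop above simply gains one more nesting level.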
... is true. Note that it is still possible to create 'bad' privacy definitions using this framework. To get a bad definition in terms of privacy ... semantics, one should specify a single prior (in which case the privacy definition is sensitive to attackers who deviate from the prior) or ... instantiations due to the complex nature of Bayesian computations.
3.2. Example: Privacy with No Assumptions
As a warmup, we use Pufferfish to create a privacy definition with no assumptions (a reinterpretation of no-free-lunch privacy [Kifer and Machanavajjhala 2011]). Let T be the domain of tuples ... σi to be the statement 'the record belonging to ... where ta and tb range over all possible tuple values). To get privacy with no assumptions, we must make the set of evolution ...
... a formal equivalence between Pufferfish with no assumptions and a strawman privacy definition called no-free-lunch privacy that was used in Kifer and Machanavajjhala [2011] as an example of a privacy definition without utility.
4. RELATED WORK
This article is an extension of ... of ε-indistinguishability [Dwork et al. 2006b] (a variant of differential privacy) in Section 6.2; a detailed critique of neighboring-table semantics for privacy definitions with respect to Bayesian semantics in Section 6.3; a new instantiation of Pufferfish that we call hedging privacy and corresponding histogram publishing algorithms in Section 7; and a discussion ... can be found in the online Appendix.
4.1. Relation to Differential Privacy Variants
Differential privacy [Dwork et al. 2006b; Dwork 2006] represented a breakthrough in ideas and algorithms for protecting privacy. Informally, it states that the distribution of an algorithm's output ... of formalizing what 'barely affected' means. In the case of differential privacy, it is P(M(D) ∈ S) ≤ e^ε · P(M(D′) ∈ S) ... where n is the number of records (thus always causing a privacy breach although the risk is spread out among individuals). It ... al. 2008; Kasiviswanathan and Smith 2008], the possibility of an intentional privacy breach reduces the viability of this variation in real applications. Note ... that do work under this relaxed model actually satisfy a stronger privacy definition (it is not clear what this stronger definition is). One can ... we explain the differences between Pufferfish and other variations of differential privacy [Duan 2009; Nissim et al. 2007; Chaudhuri and Mishra 2006; Blum et ... 2009; Rastogi et al. 2009]. A related framework, known as adversarial privacy [Rastogi et al. 2009], allows domain experts to plug in ... data generating distributions.
While there is no known equivalence between adversarial privacy and differential privacy, Rastogi et al. [2009] have proved an equivalence between a ... certain instantiation of adversarial privacy and ε-indistinguishability [Dwork et al. 2006b]. Adversarial privacy only seeks to protect the presence/absence of a tuple in the ... and 10). Zhou et al. [2009] present two variants of differential privacy (α-constrained and ZLW distributional) whose privacy semantics are unclear. We show approximate equivalences to instantiations of ... 8.3, thereby providing approximate semantics for those definitions as well. BLR distributional privacy [Blum et al. 2008] is a definition whose goal is ... be barely affected). The concept of neighboring databases arose in differential privacy
... a Bayesian formulation of semantic security to analyze variants of differential privacy (with the proofs appearing in Kasiviswanathan and Smith [2008]). Their definition ... considers the effect of an individual's participation in the ... been publicly released (we discuss this situation in Section 10). Noiseless privacy [Duan 2009; Bhaskar et al. 2011] is a variation of ... answers do not leak too much information about individuals. Noiseless privacy cannot be used for histogram release for many choices of ... information in cells with counts that are 0 or 1. Hedging privacy, which we present in Section 7, tries to only ... sometimes output true cell counts (and certify that they are exact). Crowd-blending privacy ... [Gehrke et al. 2012] is a cryptographically inspired variant of differential privacy. It is another relaxation of differential privacy with the goal of explaining why moderately large deterministic counts can still preserve privacy (and thus has similar goals to our exploration of hedging privacy). The definition is rather complex but depends on the idea ... It treats datasets as multisets. An algorithm A with blending parameter k satisfies crowd-blending privacy if for every database D and individual t ∈ D, one ... output of A) between the databases D and D \ {t}. This privacy definition can be sensitive to background knowledge such as the ... of tuples in D not equal to 'Bob'). M satisfies crowd-blending privacy because of the following.
- M blends any two tuples that are ... number of such tuples).
The use of the second condition of crowd-blending privacy allows us to induce a correlation between the presence of ... the second condition, which was crucial to our attack, strengthens crowd-blending privacy to the point where it is equivalent to differential privacy
... Kifer and A. Machanavajjhala
Also related to the spirit of hedging privacy is the notion of random differential privacy [Hall et al. 2012]. In hedging privacy, we try to rule out attackers whose priors are inconsistent ... the data provided by a sanitizing algorithm M. In random differential privacy, the differential privacy constraints must hold except possibly for datasets that are unrepresentative of ... this exception). Thus, in the case of histogram publishing, random differential privacy can sometimes release (without noise) the cells whose counts are 0. Zero knowledge privacy [Gehrke et al. 2011] is another cryptographically-inspired variant of differential privacy and is strictly stronger. It essentially ensures that an attacker ... consider traditional data releases here, there are other variations of differential privacy for which time plays a crucial role. This includes answering ... [Chan et al. 2010; Dwork et al. 2010a] and providing privacy even if attackers have access to snapshots of a mechanism's internal state [Dwork et al. 2010b, 2010a].
4.2. Relationship to Other Work
Other privacy frameworks also exist. A large class, of which k-anonymity [Samarati 2001] ... inputs to outputs: P(M(Di) = ω). Because of this, the resulting privacy definitions tend to be less secure and are often subject to ... be provided or what data assumptions are needed for ensuring privacy.
5. PUFFERFISH AND PRIVACY AXIOMS
Research in statistical privacy has been moving away from ad-hoc privacy definitions and towards formal and rigorous privacy definitions. The reason is that rigorous privacy definitions offer the promise ... ending the endless cycle of discovering a vulnerability in a privacy definition, proposing a fix, finding a vulnerability in the 'fixed' version, ...
To this end, recent research has started examining the properties that privacy definitions need to have [Kifer and Lin 2010, 2012]. Modern ... accepted by the privacy community for a long time, only recently has there been an insistence that privacy ... definitions actually satisfy them. In this section, we show that every privacy definition in the Pufferfish framework satisfies both fundamental axioms, thus ensuring ... Invariance [Kifer and Lin 2012]). If an algorithm M satisfies a privacy definition and A is any algorithm such that (1) its ...
... runs A on the output should also satisfy the same privacy definition. The justification for the transformation invariance axiom is that A ... the statistical analysis on this output, were not (many existing privacy definitions fail to satisfy this property [Kifer and Lin 2012]).
Axiom 5.2 ... [Kifer and Lin 2012]). If M1 and M2 satisfy a privacy definition, and p ∈ [0, 1], then the algorithm Mp which ... p and M2 with probability 1 − p should also satisfy the privacy definition. The convexity axiom says that a data curator is allowed to choose any algorithm M that satisfies the curator's chosen privacy definition and that this choice can be randomized (thus adding further ... uncertainty into the creation of sanitized data). Again, many existing syntactic privacy definitions fail to satisfy this property [Kifer and Lin 2010]. The ... For every S, Spairs, D, and ε > 0, the privacy definition ε-PufferFish(S, Spairs, D) satisfies the axioms of convexity and transformation ... B in the online Appendix.
6. PUFFERFISH ANALYSIS OF DIFFERENTIAL PRIVACY
Differential privacy [Dwork 2006] is a state-of-the-art privacy definition which has been very influential in modern privacy research. It is formally defined as follows.
Definition 6.1 (Differential Privacy [Dwork 2006]). Given a privacy parameter ε > 0, an algorithm M satisfies ε-differential privacy if for all ω ∈ range(M) and all pairs of datasets ... output regardless of whether Bob's tuple was used in the computation. Differential privacy does not mention any prior probabilities although it is recognized that it ... of this section is to state a precise relationship between differential privacy, data generating distributions, and bounds on an attacker's inference. In ... correlations cause additional information leakage. We present our analysis of differential privacy ... in Section 6.1 and we discuss ε-indistinguishability (a variation of differential privacy) [Dwork et al. 2006b] in Section 6.2. In Section 6.3, ...
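Definition 6.1 is most commonly satisfied with the Laplace mechanism. A minimal sketch (illustrative, not one of the article's algorithms) for a count query, whose sensitivity is 1 because adding or removing one record changes the count by at most 1:

```python
import random

def laplace_noise(scale, rng=random):
    """Sample Laplace(0, scale) as the difference of two independent
    exponential variates with mean `scale` each."""
    return rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)

def dp_count(db, predicate, eps, rng=random):
    """Release a count under epsilon-differential privacy by adding
    Laplace noise with scale sensitivity/eps = 1/eps."""
    true_count = sum(1 for row in db if predicate(row))
    return true_count + laplace_noise(1.0 / eps, rng)

# Hypothetical toy dataset and query.
db = [{"age": 34}, {"age": 51}, {"age": 29}]
noisy = dp_count(db, lambda r: r["age"] >= 30, eps=0.5)
```

With eps = 0.5 the noise has variance 2/eps^2 = 8 (standard deviation about 2.83), so small counts are heavily masked; this utility cost for near-empty cells is what the article's weaker definitions try to avoid.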
in Section 6.2. In Section 6.3, we discuss why we believe that privacy semantics in terms of Bayesian inference can shed more light on a privacy definition than privacy semantics in terms of indistinguishable pairs of neighboring databases.
6.1. Analysis of ...
... THEOREM 6.2. Let S and Spairs be defined as ... data evolution scenarios Dother such that Dother ⊋ D, then ε-differential privacy is not equivalent to ε-PufferFish(S, Spairs, Dother) (i.e., with the same ε-parameter) ... for ε-indistinguishability.
Definition 6.2 (ε-indistinguishability [Dwork et al. 2006b]). Given a privacy parameter ε > 0, an algorithm M satisfies ε-indistinguishability if ... et al. [2009] proved an equivalence between ε-indistinguishability and ε-adversarial privacy when using a set of probability distributions called PTLM [Rastogi et al. 2009], but there is no known equivalence between ε-differential privacy and ε-adversarial privacy. However, there are connections between instantiations of the Pufferfish framework and both ε-indistinguishability and ε-differential privacy (the latter was shown in Section 6). The reason for ... data. The probabilistic model is almost the same as with ε-differential privacy. The only difference is that ε-indistinguishability only considers datasets of ... straightforward and tedious yet almost identical to the proof about differential privacy (Theorem 6.1, with proof in Section C of the electronic ... Semantics
One of the most compelling arguments in favor of ε-differential privacy is that for small ε, it guarantees that any output ω ...
... [2008] and Kasiviswanathan and Smith [2008] studied a variant of differential privacy known as δ-approximate ε-indistinguishability [Dwork et al. 2006]. This privacy definition also requires that changes in one tuple cause small changes ... [Ganta et al. 2008; Kasiviswanathan and Smith 2008] of this privacy definition showed that in order to prevent an attacker from learning ... us with a new estimate whose variance is 2/(kε²). Neighboring table privacy semantics do not let us reason about this situation easily. However, ... useful here because they give an easy way of adapting a privacy definition to account for such side information. We illustrate this point ... in Section 10. In particular, Kifer and Machanavajjhala [2011] proposed a privacy definition with neighboring table semantics to address the issue of ... deterministic query answers. On the surface, it seems like a reasonable privacy definition, but in Section 10, we show that it too ... that his or her information be deleted prior to ... data release). In this case, differential privacy is often an appropriate privacy definition to use to ensure attackers cannot detect whether the ... records, so users may want to cooperate to limit inferences).
7. HEDGING PRIVACY: A CASE STUDY IN PUBLISHING HISTOGRAMS
In this section, we consider algorithms for publishing histograms under privacy definitions that are weaker than differential privacy. ... For illustrative purposes, we first begin with a very weak privacy definition, which we call single-prior privacy, and then we strengthen it to a privacy definition we call hedging privacy. In single-prior privacy, the data curator specifies only one data-generating distribution. We present ...
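The averaging attack behind that variance figure is plain arithmetic: each ε-differentially private answer to a sensitivity-1 query carries independent Laplace(1/ε) noise with variance 2/ε², so the mean of k answers to the same query has variance 2/(kε²). A quick sketch with illustrative numbers:

```python
def laplace_variance(eps):
    """Variance of Laplace noise with scale 1/eps, as added by one
    epsilon-differentially private release of a sensitivity-1 query."""
    return 2.0 / eps ** 2

def averaged_variance(eps, k):
    """Variance of an attacker's estimate after averaging k independent
    noisy answers to the same query: 2 / (k * eps^2)."""
    return laplace_variance(eps) / k

print(laplace_variance(1.0))      # 2.0
print(averaged_variance(1.0, 8))  # 0.25
```

This is why repeated answers to the same query must be budgeted jointly rather than each treated as a fresh ε-private release.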
... released with the guarantee that the answers are exact). Hedging privacy is a stronger definition. Although the data curator still specifies one ... evolution scenarios contains many other distributions. An interesting feature of this privacy definition is that it can provide utility (and even partially deterministic ... settings, the corresponding version of Pufferfish, which we call single-prior privacy, has the following goal: make it difficult for an ... look at how to answer one counting query under this privacy definition (Section 7.1.1), extend it to publishing a histogram (Section 7.1.2), ... then modify the algorithm to satisfy a much stronger notion of privacy (Section 7.2).
7.1.1. Answering One Counting Query. In this section, we consider how to answer one counting query under single-prior privacy. Note that this is equivalent to possessing a database of ... (Lines 3-5). On the other hand, if the true ...
... every cell of a histogram when n is known achieves 2ε-differential privacy (since it is an output with sensitivity 2 [Dwork et ... Algorithm 2 adds less noise than this. Thus with this weaker privacy definition, we can sanitize a histogram with less noise than under differential privacy. However, single-prior privacy is a very weak privacy definition. First, because there is only one probability distribution in D, ... of the data. We address these concerns with a more powerful privacy definition in Section 7.2.
7.2. Hedging Privacy
Hedging privacy is a strengthening of single-prior privacy that protects data publishers whose prior beliefs about the data are ...
... In such a case, providing privacy protections against this attacker and against this mixture distribution are concordant ... distribution behaves like the probability under f∗. Although we provide privacy protections against this mixture distribution, we only need the protections that ... answering a single count query and publishing a histogram under hedging privacy are shown in Algorithm 3 and 4, respectively. Note that the only difference with single-prior privacy is the computation of khi and klo, but now we have stronger privacy protections. Thus sometimes the algorithm is allowed to output exact counts ... that those counts are exact.
THEOREM 7.3. Algorithm 3 satisfies ε-hedging privacy and Algorithm 4 satisfies 2ε-hedging privacy.
For proof see Section G ... algorithm that uses the mixture parameter λ also works for hedging privacy definitions that use larger mixture parameters λ′ ∈ (λ, 1/2), meaning that ... more weight than f∗ (i.e., the whole point of hedging privacy is to get the data, not prior knowledge, to overrule an ...
8. CONTINUOUS ATTRIBUTES AND AGGREGATE SECRETS
One of the difficult problems in privacy-preserving data publishing is protecting the values of continuous variables that ... large values (such as income). For example, many algorithms for differential privacy do not work in the first case (i.e., no a ...
... aggregate secrets (Section 8.2). Finally, we use Pufferfish to provide approximate privacy semantics for α-constrained ε-differential privacy [Zhou et al. 2009] and ZLW distributional privacy [Zhou et al. 2009] (Section 8.3); those two definitions were ...
8.1. Protecting Continuous Attributes
As we saw in Section 6, differential privacy is designed to make it difficult to distinguish between the case when ... with standard deviation proportional to 10^6 in order to satisfy differential privacy (thus potentially masking out the signal in the data). If this loss of utility is unacceptable, the data curator may want to relax privacy by stating requirements, such as (1) an attacker should not be ... Both of these requirements can be handled in the Pufferfish framework.
8.1.1. Privacy via Absolute Error. For ease of explanation, suppose that records ...
... close to each other with high probability. In contrast, satisfying differential privacy by adding noise to the sum would require a distribution with ... the privacy lost (i.e., income is inferable to an absolute error of ... PufferFish(S, Spairs, D). For the proof, see Section H of the online Appendix.
8.1.2. Privacy via Relative Error. We can extend these ideas to protect ... business may decide that letting the public learn about ...
... α-Constrained and Distributional Privacy
Zhou et al. [2009] also proposed two privacy definitions that can be used with continuous variables. Those definitions ... were introduced solely for the study of utility, and their precise privacy semantics (i.e., what inferences do they protect against) were not explained ... should be indistinguishable. We show an approximate equivalence between those privacy definitions and instantiations of Pufferfish so that the Pufferfish framework (approximately) subsumes those definitions and gives them clear Bayesian privacy semantics. We start with those definitions.
Definition 8.2 (Constrained and ZLW-Distributional Privacy [Zhou et al. 2009]). Let Δ be a metric on databases ... and Δ(D1, D2) ≤ α, then algorithm M satisfies α-constrained ε-differential privacy, if those conditions hold when (1) D1 ... > 0 be constants. An algorithm M satisfies (α, ε)-modified ZLW privacy if for every ω ∈ range(M) and every pair of databases ... M satisfies ε-PufferFish(S*, Spairs*, D), then it also satisfies (α, ε)-modified ZLW privacy; conversely, if M satisfies (α, ε)-modified ZLW privacy, then it satisfies the definition 4ε-PufferFish(S*, Spairs*, D) (i.e., up to ... online Appendix. Thus although the precise privacy semantics of the ZLW privacy definitions [Zhou et al. 2009] are unknown, the ideas discussed in ... Theorem 8.1 show that the Pufferfish framework can give the ZLW privacy definitions some approximate semantics in terms of an odds-ratio bound ...
... (as defined by the metric ?) of ?. 9. COMPOSITION. Given a privacy definition, the notion of composition [Ganta et al. 2008] refers to the degradation of privacy due to two independent data releases. For example, Alice may choose ... can each run an algorithm MAlice and MBob (possibly satisfying different privacy definitions) on their own dataset and output the result. It is known that privacy can degrade in those instances. For example, two independent releases of k-anonymous tables can lead to a privacy breach [Ganta et al. 2008]. (Footnote 10: We use the prefix ZLW to distinguish it from the distributional privacy definition introduced in Blum et al. [2008]. Footnote 11: This condition is achieved, for ...) On the other hand, differential privacy composes well with itself: if Alice uses an ε1-differentially private ... private algorithm. We have the following. THEOREM 9.1 (COMPOSITION OF DIFFERENTIAL PRIVACY [MCSHERRY 2009; GANTA ET AL. 2008]). Let Data denote the unknown input database, ... let Mi(·), Mi(·, ·) be randomized algorithms which satisfy εi-differential privacy. Then we have the following. (1) Serial Composition. An algorithm ... = M1(Data) and ω2 = M2(Data, ω1) satisfies (ε1 + ε2)-differential privacy. (2) Parallel Composition. For datasets Data1 ∩ Data2 = ∅, an ... it is important to study composition between algorithms within the same privacy definition and also to study composition between algorithms from different privacy definitions. Properly handling multiple data releases requires a mixture of policy ... guarantees that the combination of their data releases will not breach privacy. 9.1. Pufferfish View of Composition. Since the Pufferfish framework provides a
wide variety of privacy definitions, it allows us to study composition more generally. The key ... and Bob has a dataset DataBob. Alice announces that she will use a privacy definition PufferFish(S, Spairs, D) to publish a sanitized version of her data and Bob announces that he will use a privacy definition Priv (which need not belong to the Pufferfish framework). Alice would like to examine the consequences to privacy that can occur when they both release sanitized data using their chosen privacy definitions. In order for Alice to study how her privacy definition composes with possible data releases from Bob, she needs ...
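The serial-composition rule in Theorem 9.1 above can be illustrated with a short sketch. This is a hedged illustration, not code from the article: the `laplace_count` helper and the budget split are illustrative choices, and the Laplace sampler uses the standard inverse-CDF method.

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def laplace_count(data, epsilon):
    """epsilon-differentially private count: a counting query has
    sensitivity 1, so Laplace noise with scale 1/epsilon suffices."""
    return len(data) + laplace_noise(1.0 / epsilon)

def two_releases(data, eps1, eps2):
    """Serial composition: releasing omega1 and omega2, each computed on
    the same database, consumes eps1 + eps2 of privacy budget overall."""
    omega1 = laplace_count(data, eps1)   # M1(Data)
    omega2 = laplace_count(data, eps2)   # M2(Data, omega1); here M2 ignores omega1
    return omega1, omega2                # the pair is (eps1 + eps2)-DP
```

Parallel composition, by contrast, lets disjoint partitions Data1 and Data2 each be queried with the full budget, since any one record influences only one partition.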
... = ω) E[P(A(DataBob) = ω′) | DataAlice = D, θ]. Thus to study the privacy properties of this joint data release, Alice only needs to study ... (for all choices of θ ∈ Cond, all M satisfying her privacy definition, and all A satisfying Bob's privacy definition). In particular, she ... Mθ,A,M (for all choices of θ, M, A) satisfies ε′-PufferFish(S, Spairs, D) (i.e., her privacy definition with a different privacy parameter ε′). The preceding discussion generalizes to ... this special case self-composition. This is a helpful property for privacy definitions to have, since it is useful in the design of ... D) = 1). Thus, Alice has a dataset Data, announces a privacy definition ε-PufferFish(S, Spairs, D), and chooses two algorithms M1 and M2 (with independent ... range(MM1,M2) = range(M1) × range(M2) such that for ... simpler algorithms M1, . . . , Mk and allocate her overall privacy budget ε among them [McSherry 2009]. As in the previous ... as the output ω1 = M1(Data). This is because, like differential privacy, all Pufferfish instantiations satisfy transformation invariance (Axiom 5.1), which ensures that postprocessing the output of a mechanism does not degrade the privacy guarantee. 9.2.1. Sufficient Conditions for Self-Composition. In general, not all instantiations ... constraint that an algorithm M must satisfy. If we have a privacy definition PufferFish(S, Spairs, D) that self-composes linearly, it can happen that adding ...
... ω), with randomness only depending on M. In the case of differential privacy, those universally composable evolution scenarios θ are those for which ... the other distributions that generate records independently without change to the privacy guarantees, and Theorem 6.2 says that differentially private algorithms may leak too much ... we include any other distributions (i.e., those with correlated records). 10. DIFFERENTIAL PRIVACY WITH DETERMINISTIC CONSTRAINTS. It was shown in Kifer and Machanavajjhala [2011] ... answers about the data and subsequently publishes additional information using ε-differential privacy, the combined data releases can leak much more information ... Decennial Census). Kifer and Machanavajjhala [2011] proposed a modification of differential privacy, called induced neighbors privacy, to account for prior deterministic data releases. As with many variants of differential privacy, it was a "neighbors-based" definition that tried to make certain ... Section 9 to show how to use Pufferfish to modify differential privacy in a way that takes into account arbitrary deterministic constraints ... caused by prior deterministic releases of data). The result is a privacy definition with precise semantic guarantees and clearly specified assumptions under ... they hold. We also show some conditions under which induced neighbors privacy [Kifer and Machanavajjhala 2011] is actually equivalent to an instantiation of ...
... Univariate Histograms. These are a special kind of count ... illustrate the benefits of Pufferfish and distinguish it from neighbor-based privacy definitions. Suppose there are n students with ID numbers ranging ... has not taken the test. The Laplace mechanism, which satisfies ε-differential privacy, would add noise with density f(x) = (1/2b) e^(−|x|/b) ... differential privacy does not warn us about it (neither will induced-neighbors privacy, Definition 10.3). Induced neighbors privacy [Kifer and Machanavajjhala 2011] uses the following definitions. Definition 10.2 (Move [Kifer ... Definition 10.4 (Induced Neighbors Privacy [Kifer and Machanavajjhala 2011]). An algorithm M satisfies induced neighbor privacy with constraint Q if, for each output ω ∈ range(M) and ... the Laplace mechanism from Example 10.1 also satisfies induced neighbor privacy for this particular scenario since all induced neighbors are pairs (Di, ...
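The univariate-histogram scenario above can be sketched in code. This is a hedged illustration under assumed details (function names and score bins are hypothetical, not from the article): a histogram of test scores released with per-cell Laplace noise.

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_score_histogram(scores, possible_scores, epsilon):
    """Release a univariate histogram of test scores with the Laplace
    mechanism. Adding or removing one student changes exactly one cell
    by 1, so the histogram query has sensitivity 1 and per-cell
    Laplace(1/epsilon) noise yields epsilon-differential privacy."""
    hist = {s: 0 for s in possible_scores}
    for s in scores:
        hist[s] += 1
    return {s: count + laplace_noise(1.0 / epsilon) for s, count in hist.items()}
```

The section's warning applies here: if an exact statistic about the same data was already published as a deterministic constraint Q (say, the exact number of students who took the test), an attacker can combine it with the noisy histogram to sharpen inferences, which is the leakage that induced-neighbors-style definitions attempt to account for.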
Pufferfish: A Framework for Mathematical Privacy Definitions. DANIEL KIFER, Penn State University. ASHWIN MACHANAVAJJHALA, Duke University. In this article, we introduce a new and general privacy framework called Pufferfish. The Pufferfish framework can be used to create new privacy definitions that are customized to the needs of a given ... an application domain, who frequently do not have expertise in privacy, to develop rigorous privacy definitions for their data sharing needs. In addition to this, the Pufferfish framework can also be used to study existing privacy definitions. We illustrate the benefits with several applications of this privacy framework: we use it to analyze differential privacy and formalize a connection to attackers who believe that the ... the data records are independent; we use it to create a privacy definition called hedging privacy, which can be used to rule out attackers whose prior ... Society]: Public Policy Issues–Privacy. General Terms: Theory. Additional Key Words and Phrases: Privacy, differential privacy. ACM Reference Format: Daniel Kifer and Ashwin Machanavajjhala. 2014. Pufferfish: A framework for mathematical privacy definitions. ACM Trans. Datab. Syst. 39, 1, Article 3 (January ... growth in importance and diversity of applications. These applications include protecting privacy and confidentiality in computer network data collections [PREDICT 2005], protecting privacy and identifiability in genome-wide association studies (GWAS) [Homer et al. 2008], ... In each case, the goal is to release useful information (i.e., privacy-preserving query answers or a sanitized version of the dataset) while ... the data release. This is achieved by using algorithms, called privacy mechanisms, whose outputs allow researchers to learn statistical properties of the ...
... of these privacy mechanisms is governed by privacy definitions, which limit the amount of information disclosed about individual records. A number of privacy definitions and mechanisms have been proposed in the literature (see surveys [Adam ...
... P(Q(Data) = 1 | θ). We show that ε-induced neighbors privacy is a necessary condition for guaranteeing ε-PufferFish(S, Spairs, DQ), for any general ... by Equations (23), (24), and (25)), then M satisfies ε-induced neighbors privacy with respect to Q. For the proof, see Section K of the online Appendix. However, the next example shows ε-induced neighbors privacy is not sufficient; hence, it does not guarantee an attacker's odds ... 10.1, we show that the Laplace mechanism, which satisfies both ε-differential privacy and ε-induced neighbors privacy, does not satisfy ε-PufferFish(S, Spairs, DQ). Consider a θ of the form ... Therefore satisfying ε-differential privacy or induced neighbors privacy in this situation does not bound an attacker's odds-ratio to the ... an attacker could make. In fact, for those cases, ε-induced neighbor privacy becomes an instantiation of the Pufferfish framework (Theorems 10.1 and 10.2). THEOREM ... Then M satisfies ε-PufferFish(S, Spairs, DQuni) if M satisfies ε-induced neighbors privacy with respect to Quni. For the proof, see Section L of ... left open in Kifer and Machanavajjhala [2011] is whether induced neighbor privacy is linear self-composable. Theorems 10.1 and 10.2 allow us to answer this question. Since ε-PufferFish(S, Spairs, DQuni) and induced neighbor privacy (for univariate histograms) are equivalent definitions, it is easy to ...
... Note that learning about the true distribution is not considered a privacy breach [Dwork 2006]. However, if the true distribution can be used ... additional information will be leaked and this can be considered a privacy breach. As an illustration of these ideas, consider a dataset such ... chance that Bob has cancer. This would not be considered a privacy breach, since that inference is based on estimates of the underlying ... However, if Bob is a 48-year-old male smoker who ... cancer went from 0.05 to 0.84) and this has caused a privacy breach. We next address whether a Pufferfish privacy definition can prevent such an attack. 11.2. Resistance to the Attack. The ...
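The 0.05 to 0.84 jump described above is a plain Bayesian update. The sketch below reproduces it with hypothetical likelihoods: it assumes the released, Bob-linkable attribute pattern is about 100 times likelier among cancer patients than among everyone else. These numbers are illustrative, not from the article.

```python
def posterior(prior, lik_if_cancer, lik_if_no_cancer):
    """Bayes' rule: Pr[cancer | released information]."""
    joint = prior * lik_if_cancer
    return joint / (joint + (1.0 - prior) * lik_if_no_cancer)

# Population baseline: a random person has cancer with probability 0.05.
# Assume (hypothetically) the released record pattern is observed with
# probability 1.0 among cancer patients and 0.01 among everyone else --
# a 100:1 likelihood ratio, chosen to match the 0.05 -> 0.84 example.
updated = posterior(0.05, lik_if_cancer=1.0, lik_if_no_cancer=0.01)
print(round(updated, 2))  # 0.84
```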
... general framework that allows application domain experts to develop rigorous privacy definitions for their data-sharing needs. The framework provides crisp Bayesian semantics and allows the domain experts to customize privacy to the specific set of secrets and data evolution scenarios that ... 515--556. Charu C. Aggarwal, Jian Pei, and Bo Zhang. 2006. On privacy preservation against adversarial data mining. In Proceedings of the 12th ACM ... Bhowmick, Vipul Goyal, Srivatsan Laxman, and Abhradeep Thakurta. 2011. Noiseless database privacy. In Proceedings of the 17th International Conference on the ... 405--417. Kamalika Chaudhuri and Nina Mishra. 2006. When random sampling preserves privacy. In Proceedings of the 26th Annual International Cryptology Conference on ... 198--213. Bee-Chung Chen, Daniel Kifer, Kristen LeFevre, and Ashwin Machanavajjhala. 2009. Privacy-preserving data publishing. Found. Trends Databases 2, 1--2, 1--167. Chris Clifton. 2000. ... 4, 281--307. C. Clifton, M. Kantarcioglu, and J. Vaidya. 2002. Defining privacy for data mining. In Proceedings of the NSF Workshop on Next Generation Data Mining. Chris Clifton and Don Marks. 1996. Security and privacy implications of data mining. In Proceedings of the ACM SIGMOD Workshop ... (SAC). Irit Dinur and Kobbi Nissim. 2003. Revealing information while preserving privacy. In Proceedings of the 22nd ACM SIGMOD-SIGACT-SIGART (PODS). Yitao Duan. 2009. Privacy without noise. In Proceedings of the 18th ACM Conference on Information and Knowledge Management (CIKM). Cynthia Dwork. 2006. Differential privacy. In Proceedings of the 33rd International Colloquium on Automata, Languages and Programming (ICALP). Cynthia Dwork. 2008. Differential privacy: A survey of results. In Proceedings of the 5th ... Frank McSherry, Ilya Mironov, and Moni Naor. 2006a. Our data, ourselves: Privacy via distributed noise generation.
In Proceedings of the Advances in ... Berlin, 265--284. C. Dwork, F. McSherry, and K. Talwar. 2007. The price of privacy
... of disclosure prevention in statistical databases or the case for differential privacy. J. Privacy Confidentiality 2, 1, Article 8. Cynthia Dwork, Moni Naor, Toniann Pitassi, and Guy N. Rothblum. 2010a. Differential privacy under continual observation. In Proceedings of the 42nd ACM Symposium on ... Fung, K. Wang, R. Chen, and P. S. Yu. 2010. Privacy-preserving data publishing: A survey on recent developments. Comput. Surv. 42, ... Srivatsava Ranjit Ganta, Shiva Prasad Kasiviswanathan, and Adam Smith. 2008. Composition attacks and auxiliary information in data privacy. In Proceedings of the 14th ACM SIGKDD International Conference on Knowledge ... Gehrke, Michael Hay, Edward Lui, and Rafael Pass. 2012. Crowd-blending privacy. In Proceedings of the 32nd Annual Cryptology Conference on Advances ... 7417, Springer-Verlag, Berlin, 479--496. Johannes Gehrke, Edward Lui, and Rafael Pass. 2011. Towards privacy for social networks: A zero-knowledge-based definition of privacy. In Proceedings of the 8th Theory of Cryptography Conference ... (STOC). 351--360. Robert Hall, Larry Wasserman, and Alessandro Rinaldo. 2012. Random differential privacy. J. Privacy Confidentiality 4, 2. Nils Homer, Szabolcs Szelinger, Margot Redman, ... Prasad Kasiviswanathan and Adam Smith. 2008. A note on differential privacy: Defining resistance to arbitrary side information. http://arxiv.org/abs/0803.3946. Daniel Kifer. 2009. Attacks on privacy and de Finetti's theorem. In Proceedings of the ACM SIGMOD International ... Kifer and Bing-Rong Lin. 2010. Towards an axiomatization of statistical privacy and utility. In Proceedings of the 29th ACM SIGMOD-SIGACT-SIGART Symposium on ... Kifer and Bing-Rong Lin. 2012.
An axiomatic view of statistical privacy and utility. J. Privacy Confidentiality 4, 1. Daniel Kifer and Ashwin Machanavajjhala. 2011. No free lunch in data privacy. In Proceedings of the ACM SIGMOD International Conference on Management ... and Ashwin Machanavajjhala. 2012. A rigorous and customizable framework for privacy. In Proceedings of the 31st ACM SIGMOD-SIGACT-SIGART Symposium on Principles ... private? Proc. VLDB Endow. 4, 7, 440--450. Frank D. McSherry. 2009. Privacy integrated queries: An extensible platform for privacy-preserving data analysis. In Proceedings of the ACM SIGMOD International Conference ...
... M. Oliveira and Osmar R. Zaiane. 2003. Algorithms for balancing privacy and knowledge discovery in association rule mining. In Proceedings of the ... Rastogi, Michael Hay, Gerome Miklau, and Dan Suciu. 2009. Relationship privacy: Output perturbation for queries with joins. In Proceedings of ... Loredana Parasiliti Provenza, Yucel Saygin, and Yannis Theodoridis. 2004a. State-of-the-art in privacy preserving data mining. SIGMOD Rec. 33, 1, 50--57. Vassilios S. Verykios, ... 3, 463--484. Ke Wang, Benjamin Fung, and Philip Yu. 2005. Template-based privacy preservation in classification problems. In Proceedings of the 5th IEEE International ... 466--473. Xiaokui Xiao and Yufei Tao. 2006. Anatomy: Simple and effective privacy preservation. In Proceedings of the 32nd International Conference on VLDB (VLDB). ... (ICDE). 116--125. Shuheng Zhou, Katrina Ligett, and Larry Wasserman. 2009. Differential privacy with compression. In Proceedings of the IEEE International Symposium on Information ...
10
November 2013
CCS '13: Proceedings of the 2013 ACM SIGSAC conference on Computer & communications security
Publisher: ACM
Bibliometrics:
Citation Count: 8
Downloads (6 Weeks): 17, Downloads (12 Months): 165, Downloads (Overall): 806
Full text available:
PDF
We introduce a novel privacy framework that we call Membership Privacy. The framework includes positive membership privacy, which prevents the adversary from significantly increasing its ability to conclude that an entity is in the input dataset, and negative membership privacy, which prevents leaking of non-membership. These notions are parameterized by ...
Keywords:
privacy notions, membership privacy, differential privacy
Title:
Membership privacy: a unifying framework for privacy definitions
CCS:
Privacy policies
Security and privacy
Human and societal aspects of security and privacy
Keywords:
privacy notions
membership privacy
differential privacy
Abstract:
We introduce a novel privacy framework that we call Membership Privacy. The framework includes positive membership privacy, which prevents the adversary from significantly increasing its ability ... an entity is in the input dataset, and negative membership privacy, which prevents leaking of non-membership. These notions are parameterized ... the ability to choose different distribution families to instantiate membership privacy. Many privacy notions in the literature are equivalent to membership privacy with interesting distribution families, including differential privacy, differential identifiability, and differential privacy under sampling. Casting these notions into the framework leads to ... The framework also provides a principled approach to developing new privacy notions under which better utility can be achieved than what is possible under differential privacy.
Primary CCS:
Privacy policies
Security and privacy
Human and societal aspects of security and privacy
References:
Standard for privacy of individually identifiable health information. Federal Register, 67(157):53181--53273, Aug. 2002. http://www.hhs.gov/ocr/privacy/hipaa/administrative/privacyrule/index.html.
A. Blum, C. Dwork, F. McSherry, and K. Nissim. Practical privacy: the SuLQ framework. In PODS, pages 128--138, 2005.
G. Cormode. Personal privacy vs population privacy: learning to attack anonymization. In KDD, pages 1253--1261, 2011.
I. Dinur and K. Nissim. Revealing information while preserving privacy. In PODS, 2003.
C. Dwork. Differential privacy. In ICALP, pages 1--12, 2006.
C. Dwork and M. Naor. On the difficulties of disclosure prevention in statistical databases or the case for differential privacy. Journal of Privacy and Confidentiality, 2(1):8, 2010.
C. Dwork and K. Nissim. Privacy-preserving datamining on vertically partitioned databases. In CRYPTO, pages 528--544. Springer, 2004.
J. Gehrke, M. Hay, E. Lui, and R. Pass. Crowd-blending privacy. In CRYPTO, pages 479--496, 2012.
J. Gehrke, E. Lui, and R. Pass. Towards privacy for social networks: a zero-knowledge based definition of privacy. In TCC, pages 432--449, Berlin, Heidelberg, 2011. Springer-Verlag.
D. Kifer and B.-R. Lin. Towards an axiomatization of statistical privacy and utility. In PODS, PODS '10, pages 147--158, New York, NY, USA, 2010. ACM.
D. Kifer and A. Machanavajjhala. No free lunch in data privacy. In SIGMOD, pages 193--204, 2011.
N. Li, T. Li, and S. Venkatasubramanian. t-closeness: Privacy beyond k-anonymity and l-diversity. In ICDE, pages 106--115, 2007.
N. Li, W. Qardaji, and D. Su. On sampling, anonymization, and differential privacy or, k-anonymization meets differential privacy. In ASIACCS, pages 32--33, 2012.
A. Machanavajjhala, J. Gehrke, D. Kifer, and M. Venkitasubramaniam. ℓ-diversity: Privacy beyond k-anonymity. In ICDE, page 24, 2006.
A. Machanavajjhala, D. Kifer, J. M. Abowd, J. Gehrke, and L. Vilhuber. Privacy: Theory meets practice on the map. In ICDE, pages 277--286, 2008.
F. McSherry and K. Talwar. Mechanism design via differential privacy. In FOCS, pages 94--103, 2007.
L. Sweeney. k-anonymity: A model for protecting privacy. Int. J. Uncertain. Fuzziness Knowl.-Based Syst., 10(5):557--570, 2002.
Full Text:
Membership Privacy: A Unifying Framework for Privacy Definitions. Ninghui Li, Wahbeh Qardaji, Dong ... of Computer Science and CERIAS, Purdue University. {ninghui,wqardaji,su17,wu510,yang469}@cs.purdue.edu. ABSTRACT. We introduce a novel privacy framework that we call Membership Privacy. The framework includes positive membership privacy, which prevents the adversary from significantly increasing its ability to ... that an entity is in the input dataset, and negative membership privacy, which prevents leaking of non-membership. These notions are parameterized by ... in the ability to choose different distribution families to instantiate membership privacy. Many privacy notions in the literature are equivalent to membership privacy with interesting distribution families, including differential privacy, differential identifiability, and differential privacy under sampling. Casting these notions into the framework leads to deeper ... other. The framework also provides a principled approach to developing new privacy notions under which better utility can be achieved than what is ... under differential privacy. Categories and Subject Descriptors: K.4.1 [COMPUTERS AND SOCIETY]: Privacy. Keywords: Differential Privacy; Privacy Notions; Membership Privacy. 1. INTRODUCTION. The spate of privacy-related incidents [30, 3, 27, 17] has spurred a long line of research in privacy ... notions for data publishing and analysis [30, 29, 24, 21]. A privacy notion that is increasingly gaining acceptance is differential privacy [7, 10]. Informally, differential privacy requires any individual entity in a dataset to have only a ... a bounded multiplicative factor. There are two major flavors of differential privacy, depending on ... In [19], these were referred to as unbounded and bounded differential privacy. In Unbounded Differential Privacy (UDP), T and T′
are neighbors if T can be ... T′ by adding or removing an entity. In Bounded Differential Privacy (BDP), T and T′ are neighbors if T can be ... by replacing one entity in T′ with another entity. Because privacy is a social notion with many facets, there is a long ... the research community to examine the various technical formulations of privacy in order to understand their strengths and weaknesses. Several researchers have questioned whether differential privacy provides sufficient protection, and how to choose the ε parameter. In ... Machanavajjhala argued that it is incorrect to claim that differential privacy ... is robust to arbitrary background knowledge. In [5], Cormode argued that differential privacy does not prevent inferential disclosure. That is, from differentially private output, ... about an individual; and this does not match the legal definition of privacy, which requires protection of individually identifiable data. Lee and Clifton ... identifiability. At the same time, it has been recognized that differential privacy may be too restrictive in some settings, and there are several efforts that aim at relaxing it, including differential privacy under sampling [22] and crowd-blending privacy [15]. This paper is motivated by these lines of work. Our aim is to gain a deeper understanding of privacy both as a social concept and in terms of technical formulations. We begin by analyzing the recent privacy
society often views as a privacy breach is the ability of an adversary to either re-identify or ... membership of an individual in a supposedly "anonymized" dataset. Hence, a privacy measure needs to protect everyone in the anonymized dataset against membership disclosure. Such a privacy definition, however, is incomplete without specifying the adversary's prior knowledge ... need to consider this background knowledge is indeed apparent from recent privacy breaches [3, 27]. We combine these two requirements and introduce a novel privacy framework that we call Membership Privacy. This framework comprises two notions: Positive Membership Privacy (PMP), which prevents an adversary from significantly improving its confidence that an entity is in the input dataset; and Negative Membership Privacy (NMP), which prevents an adversary from significantly improving its confidence ... D and γ. The first parameter captures an adversary's prior knowledge. ... increase in confidence of accurate membership assertion.

Table 1 (Distributions for which membership privacy is considered in this paper, and their equivalent privacy notions in the literature):
- DU: includes all distributions over 2^U; all other families are sub-families of this. Equivalent notion: privacy with no utility.
- DI: includes all mutually independent (MI) distributions. Equivalent notion: Unbounded Differential Privacy [8].
- D2I: sub-family of DI; includes MI distributions that have ... i.e., all entities that may appear have a fixed probability. Equivalent notion: Differential Privacy Under Sampling [22].
- DB: includes distributions that are the conditional distributions ... all datasets with non-zero probability have the same size. Equivalent notion: Bounded Differential Privacy [10].
- D2B: sub-family of DB; includes distributions where Pr[T ... all subsets of U, i.e., all entities have probability 0.5. Equivalent notion: new privacy notion (this paper).

The power of the membership privacy framework is demonstrated by the fact that many privacy notions in the literature are equivalent to membership privacy with interesting distribution families. These notions and their corresponding distribution families are given in Table 1. For example, Unbounded Differential Privacy (UDP) is equivalent to membership privacy in DI, the family of all mutually independent (MI) distributions. Similarly, Bounded Differential Privacy (BDP) is equivalent to membership privacy under DB, the family that includes those obtained by conditioning ... non-zero probability have the same size. Differential identifiability [20] and Differential Privacy under Sampling [22] are also instantiations of membership privacy. Identifying the family under which a privacy notion guarantees membership privacy provides a deeper understanding of the power and limitations of the privacy notion. For example, this framework enables us to show that under ... that it is strictly weaker. We stress that almost all privacy notions make some assumptions about the adversary's background knowledge. For example, ... is independence, as also pointed out in [19]. The only membership privacy notion without any assumption is the one under DU, ... the "no free lunch" result in [19, 8, 12]. As all practical privacy notions require some assumptions on the allowed distributions, it makes sense ... a notion is appropriate for a given setting, and choose a privacy notion that is neither too strong nor too weak, in order to maximize utility.
Our membership privacy framework enables such analysis. One could develop privacy notions that are stronger than differential privacy
privacy for beyond DI), as well as ones that are weaker ... sub-family of DI). It has often been recognized that differential privacy can be too strong to satisfy in some settings, and there are some efforts aiming at relaxing it. Our membership privacy framework provides a principled way to conduct this. For example, one ... the paper is organized as follows. In Section 2, we analyze privacy incidents and motivate membership privacy. We then introduce the membership privacy framework in Section 3, show how differential privacy fits in the framework in Section 4, and consider several other ... work in Section 6 and conclude in Section 7. 2. WHAT IS PRIVACY? Similar to other contexts in security and privacy, the concept of privacy is easier to define by identifying what privacy breaches are. Privacy can then be simply defined by requiring that no privacy breach occurs. As privacy is a social concept, any formalization of privacy violation must be based on what the society perceives as privacy breaches. In this section, we examine several well-publicized privacy incidents in data publishing in recent years, and identify the common ... with high confidence that t's data is in the original dataset. 2.1 Privacy Incidents. An early and well-publicized privacy incident is from the supposedly anonymized medical visit data made ... without access to the public voter registration list, the same privacy breaches can occur. Many individuals' birthdate, gender and zip code are public ... users share seemingly innocuous personal information to the public. Another well-known privacy incident came from publishing web search logs. In 2006, AOL ... released three months of search logs involving 650,000 users. The only privacy protection technique used was replacing user ids with random numbers. This ... ratings. While the data was anonymized in order to protect users'
privacy, Narayanan and Shmatikov [27] showed that an adversary who has ... at least 2 of them also appear in the Netflix dataset. Another privacy incident targeted the Genome-Wide Association Studies (GWAS). These studies normally ...
... establish the likely SNP frequencies in the general population. 2.2 Lessons from Privacy Incidents. From these incidents, we learn the following lessons. Re-identification matters. In ... fact alone is sufficient for the society to agree that privacy is breached. It does not matter whether an adversary has learnt ... concerns and serious consequences (e.g., the AOL case). This suggests that privacy protection must apply to every individual. A method that on average offers good protection, but may compromise some individuals' privacy, is not acceptable. No separation of Quasi-Identifiers and Sensitive Attributes. Much of database privacy research assumes the division of all attributes into quasi-identifiers (QIDs) and ... these individuals' records can be re-identified, it is still a serious privacy breach. The same difficulty is true for publishing any kind of ... that individual. 2.3 Positive Membership Disclosure. The discussions above suggest that a privacy breach is a positive assertion of membership for some entity t ... on datasets that do not include t. Some authors have considered privacy breaches as attribute disclosures, i.e., the ability to infer one's ... In [5], it has been shown that while satisfying differential privacy, one could still build a reasonably accurate classifier to learn sensitive ... some entity. We argue that attribute disclosure is problematic as a privacy notion. As shown by Dwork and others [8], attribute disclosure may ...
... attribute disclosure a privacy violation is incorrect. Under this interpretation of privacy, an individual could claim privacy violation if there is any data about anyone with some common ... (e.g., is of the same gender) as the individual.
3. THE MEMBERSHIP PRIVACY FRAMEWORK
In this section, we introduce our framework for formalizing membership privacy. More specifically, we introduce the notion of Positive Membership Privacy (PMP), which prevents Positive Membership Disclosure. To enable establishing a clear connection with differential privacy, we also introduce Negative Membership Privacy (NMP), which prevents an ... a physical entity that exists in the physical world and needs privacy protection. For example, in many scenarios, a physical entity corresponds to ... underlying dataset we allow the adversary to have. For perfect privacy, one would desire D to include all possible distributions over ... of the underlying dataset.
3.2 Positive Membership Privacy
While the notion of membership privacy has been alluded to in several papers, e.g., [19, 20], it ... the adversary. We now provide such a formalization.
DEFINITION 3.1 [Positive Membership Privacy ((D, γ)-PMP)]: We say that a mechanism A provides γ-positive membership privacy under a family D of distributions over 2^U, i.e., ((D, ... For example, setting γ = 1.2 might seem a reasonably strong privacy protection. However, if Pr[t] = 0.85, then Equation (1) will bound ...
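The caveat in the excerpt can be checked numerically. A minimal sketch, assuming (as the elided Equation (1) suggests) that the γ-PMP guarantee bounds the adversary's posterior by Pr[t ∈ T | S] ≤ γ · Pr[t ∈ T]; the function name is illustrative, not from the paper:

```python
def pmp_posterior_bound(gamma: float, prior: float) -> float:
    # Assumed form of the PMP guarantee: posterior <= gamma * prior,
    # capped at 1 since the posterior is a probability.
    return min(1.0, gamma * prior)

# gamma = 1.2 looks strong, but with prior Pr[t] = 0.85 the bound is vacuous:
high = pmp_posterior_bound(1.2, 0.85)  # 1.0: the adversary may become certain
low = pmp_posterior_bound(1.2, 0.10)   # ~0.12: meaningful for a low prior
```

This illustrates why the absolute value of the parameter alone does not tell you how much protection an entity with a high prior actually gets.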
... also key to establish the relationship of PMP with differential privacy. To start, note that by Bayes' theorem, we have Pr[t|S] ... < p. The following theorem is helpful for establishing equivalence of other privacy notions with PMP.
THEOREM 3.6. Given a mechanism A, γ, and ... also provides (D1, γ)-PMP. Therefore, to provide the maximum level of privacy, it is desirable to provide (D, γ)-PMP for as ... for the vast majority of cases, satisfying PMP is sufficient for privacy protection, it is possible that in some unusual situations one ... non-membership as well as inference of membership.
DEFINITION 3.8 [Negative Membership Privacy ((D, γ)-NMP)]: We say that a mechanism A provides γ-negative membership privacy under a distribution family D ((D, γ)-NMP), if and only if ... of the proof for Theorem 3.6, and is omitted here.
3.6 Privacy Axioms
In [18], it is suggested that all privacy notions should satisfy the two axioms: the Privacy Axiom of Choice and the Axiom of Transformation Invariance. The following ... results hold for NMP; we omit explicitly stating them here.
4. DIFFERENTIAL PRIVACY AS MEMBERSHIP PRIVACY
Informally, differential privacy requires that the output of a data analysis mechanism is not ... affected by any single tuple in the input dataset.
DEFINITION 4.1 (ε-DIFFERENTIAL PRIVACY [8, 10]). A mechanism A gives ε-differential privacy if for any pair of neighboring datasets T and T′, ... Pr[A(T′) ∈ S].
There are two major flavors of differential privacy, depending on the condition under which two datasets are considered ... In [19], these were referred to as unbounded and bounded differential privacy. In Unbounded Differential Privacy (UDP), T and T′ are neighbors if T can be ... T′ by adding or removing an entity. In Bounded Differential Privacy (BDP), T and T′ are neighbors if T can be ...
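For a concrete sense of Definition 4.1, the classic mechanism satisfying ε-differential privacy for a counting query is the Laplace mechanism. The excerpt does not show it; the sketch below assumes the standard sensitivity-1 setting (adding or removing one entity, as in the unbounded neighbor relation, changes a count by at most 1), and the function names are illustrative:

```python
import random

def laplace_noise(scale: float) -> float:
    # A Laplace(0, scale) sample: the difference of two i.i.d.
    # exponential variables with mean `scale` is Laplace-distributed.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(dataset, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1, so adding Laplace noise
    # with scale 1/epsilon yields epsilon-differential privacy.
    true_count = sum(1 for row in dataset if predicate(row))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: a noisy count of records with age >= 40 under epsilon = 0.5.
noisy = dp_count([25, 31, 47, 52], lambda age: age >= 40, epsilon=0.5)
```

Smaller ε means larger noise and a stronger guarantee; the membership-privacy analysis in the excerpt asks what such a guarantee actually buys against adversaries with different prior distributions.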
... these two differential privacy notions are instantiations of the membership privacy framework, by choosing particular families of distributions. This relationship enables a clear understanding of the power and limitations of differential privacy.
4.1 Unbounded Differential Privacy
We first establish the relation between UDP ... (POSITIVE AND NEGATIVE UDP). A mechanism A gives ε-positive unbounded differential privacy if and only if for any dataset T, any entity ... ) ∈ S]. (8) A mechanism A gives ε-negative unbounded differential privacy if and only if for any dataset T, any entity ... ∈ S]. (9) As it turns out, UDP corresponds to membership privacy under the family of mutually independent distributions.
DISTRIBUTION FAMILY 4.3. DI: ... equivalent to (D2I, e^ε)-PMP. The elegance and power of differential privacy lies in the fact that while positive UDP directly achieves PMP ...
Bounded Differential Privacy
We now consider how BDP relates to membership privacy. ... We first note that unlike the unbounded version of differential privacy, BDP has a symmetry to it and cannot be decomposed ... proof for the existence of such a distribution for the bounded privacy case is technically more challenging as we need to show the ... to each other according to the definition of bounded differential privacy, and therefore Pr[S | T2 ∪ {t}] / Pr[S | T1] ≤ e^ε. This directly implies that (11) ... network. The full proof appears in Appendix A.3.
5. OTHER INSTANTIATIONS
The membership privacy framework enables one to design and choose privacy notions suitable for particular situations by choosing appropriate families of ... , the family that includes all possible distributions, results in a privacy notion that is likely too strong. We have also shown ...
... all bounded mutually independent distributions. In this section, we explore membership privacy under three other families of distributions. The first one is ... to differential identifiability [20], and the third one corresponds to differential privacy under sampling [22].
5.1 PMP Under the Uniform Distribution
The flexibility offered ... PMP enables data publishers to have more choices in trading off privacy versus utility. Here we show that it is possible to satisfy ... better utility than is known to be possible under differential privacy. We consider PMP under the following family.
DISTRIBUTION FAMILY 5.1. DN ... data publishing scenarios. We note that mechanisms that satisfy syntactic privacy notions such as k-anonymity generally do not satisfy (DN, γ)-PMP, ... to satisfy DN while providing significantly more utility than satisfying differential privacy. As an example, we consider the universe U to be ... 1/m > e^ε Pr[1|{t1}] = 0 for any ε.
5.3 Differential Privacy Under Sampling
In [22], Li et al. proposed a relaxation to differential privacy that exploits the adversary's uncertainty about the dataset. While the original definition of differential privacy assumes that the adversary has precise knowledge of all the tuples ... addition statistical information about the rest of the dataset D. The privacy notion should prevent such an adversary from substantially distinguishing between D ... A to the sampled dataset. Similar to our analysis of differential privacy, we focus on positive DPS. It turns out that ... is given in Appendix A.4.
6. RELATED WORK
A number of syntactic privacy definitions have been proposed over the years; the most prominent ones include ... for a survey. Instead, Dwork argues that we should consider privacy protection problems in a more rigorous and formal way. This is the motivation of the research of differential privacy. The notion of differential privacy was developed in a series ... developed [8, 11, 26, 28]. In [5], Cormode argued that differential privacy does not prevent inferential disclosure. It is shown that, from differentially ... may be inappropriate to use prevention of attribute disclosure as the privacy objective. In this paper, we formalize membership disclosure, and show that it closely matches the social and legal definitions of privacy. Lee and Clifton [20] proposed the notion of ρ-differential identifiability, ... adversarial background knowledge. This notion is a special case of membership privacy, and we analyze it in Section 5.2. Li et ... adversary's background knowledge by sampling the dataset prior to applying the privacy mechanism. We analyze differential privacy under sampling and its relation to our privacy notion in Section 5.3.
In [19], Kifer and Machanavajjhala argued that it is not possible to provide privacy and utility without making assumptions about how the data are generated. ... their finding. Kifer et al. also questioned whether the differential privacy ... when data points are correlated. Our analysis suggests that differential privacy only guarantees membership protection under distributions where data points are mutually independent. Kifer and Lin [18] proposed the privacy axiom of choice and the axiom of transformation, which we show ... They also introduce a generalization of differential privacy, called generic differential privacy, which follows the syntactic structure of differential privacy, but allows more flexible definition of neighboring datasets and the ... taken by Machanavajjhala et al. [23], which introduced a framework called ε-privacy, which limits the impact the inclusion of one entity can ... attribute value. Gehrke et al. [16] introduced a zero-knowledge based definition of privacy, which defines a mechanism to be private if its ... instantiate the privacy notion. Gehrke et al. [15] then introduced crowd-blending privacy, which combines safe k-anonymization and differential privacy. We also generalize differential privacy; however, our approach differs in that we formalize membership privacy, which is justified from analysis of privacy incidents, and in that our notion is parameterized by families of dataset distributions. In an attempt to make differential privacy more amenable to more sensitive queries, several relaxations have been developed, including (ε, δ)-differential privacy [6, 13, 4, 11]. Machanavajjhala et al. [25] introduced a variant of (ε, δ)-differential privacy called (ε, δ)-probabilistic differential privacy. Roughly, all these relaxations use δ to bound the ... allow such an error probability.
7. CONCLUSIONS
Through analysis of the recent privacy incidents,
we have concluded that what society often views as a privacy breach is the ability of an adversary to either re-identify ... in a supposedly "anonymized" dataset. Thus we introduce the membership privacy framework. We have demonstrated that differential privacy and several other related privacy ... notions are instantiations of the framework. Identifying the family under which a privacy notion guarantees membership privacy provides deeper understanding of the power and limitation of the privacy notion. In particular, it identifies the assumptions that are made by the privacy notion. As all practical privacy notions require some assumptions on the ... a notion is appropriate for a given setting, and choose a privacy notion that is neither too strong nor too weak, in order to maximize utility. Our framework enables the development of new privacy notions. We believe that the membership privacy framework opens doors for future research at developing new privacy notions, understanding and comparing privacy notions, and designing mechanisms for satisfying different privacy notions.
Acknowledgement. This paper is based upon work supported by the ... free movement of such data. Official Journal L, 281(23/11):0031--0050, 1995.
[2] Standard for privacy of individually identifiable health information. Federal Register, 67(157):53181--53273, Aug 2002. http://www.hhs.gov/ocr/privacy/hipaa/administrative/privacyrule/index.html.
[3] ... the SuLQ framework. In PODS, pages 128--138, 2005.
[5] G. Cormode. Personal privacy vs population privacy: learning to attack anonymization. In KDD, pages 1253--1261, 2011.
[6] I. Dinur and K. Nissim. Revealing information while preserving privacy. In PODS.
[7] C. Dwork. Differential privacy. ... In ICALP, pages 1--12. Springer, 2006.
[8] C. Dwork. Differential privacy. In ICALP, pages 1--12, 2006.
[9] C. Dwork. An ad ... in statistical databases or the case for differential privacy. Journal of Privacy and Confidentiality, 2(1):8, 2010.
[13] C. Dwork and K. Nissim. Privacy-preserving data mining on vertically partitioned databases. In CRYPTO, pages 528--544. Springer, 2004.
[14] ... 479--496, 2012.
[16] J. Gehrke, E. Lui, and R. Pass. Towards privacy for social networks: a zero-knowledge based definition of privacy. In TCC, pages 432--449, Berlin, Heidelberg, 2011. Springer-Verlag.
[17] N. Homer, S. ... 2008.
[18] D. Kifer and B.-R. Lin. Towards an axiomatization of statistical privacy and utility. In PODS '10, pages 147--158, New York, NY, ... Li, W. Qardaji, and D. Su. On sampling, anonymization, and differential privacy or, k-anonymization meets differential privacy. In ASIACCS, pages 32--33, 2012.
[23] A. Machanavajjhala, J. Gehrke, ... 2009.
[24] A. Machanavajjhala, J. Gehrke, D. Kifer, and M. Venkitasubramaniam. ℓ-diversity: Privacy beyond k-anonymity. In ICDE, page 24, 2006.
[25] A. Machanavajjhala, D. Kifer, J. M. Abowd, J. Gehrke, and L. Vilhuber. Privacy: Theory meets practice on the map. In ICDE, pages 277--286, 2008.
[26] F. McSherry and K. Talwar. Mechanism design via differential privacy. In FOCS, pages 94--103, 2007.
[27] A. Narayanan and V. ...
11
October 2016
WPES '16: Proceedings of the 2016 ACM on Workshop on Privacy in the Electronic Society
Publisher: ACM
Bibliometrics:
Citation Count: 0
Downloads (6 Weeks): 3, Downloads (12 Months): 51, Downloads (Overall): 51
Full text available:
PDF
Service providers are often reluctant to support anonymous access, because this makes it hard to deal with misbehaving users. Anonymous blacklisting and reputation systems can help prevent misbehaving users from causing more damage. However, by the time the user is blocked or has lost reputation, most of the damage has ...
Keywords:
cryptography, privacy, privacy-enhancing technologies
CCS:
Security and privacy
Privacy-preserving protocols
Human and societal aspects of security and privacy
Privacy protections
Keywords:
privacy
privacy-enhancing technologies
Primary CCS:
Security and privacy
Privacy-preserving protocols
References:
S. A. Brands. Rethinking Public Key Infrastructures and Digital Certificates: Building in Privacy. PhD thesis, 2000.
M. Manulis, N. Fleischhacker, F. Günther, F. Kiefer, and B. Poettering. Group Signatures: Authentication with Privacy. Technical report, Bundesamt für Sicherheit in der Informationstechnik, 2012.
Full Text:
... this system is suitable to combat malicious Wikipedia editing.
CCS Concepts: Security and privacy → Privacy-preserving protocols; Pseudonymity, anonymity and untraceability; Privacy protections.
This research is supported by the research program Sentinels (www.sentinels.nl) as project "Revocable Privacy" (10532). Sentinels is being financed by Technology Foundation STW, the ... Dutch National Program COMMIT. This research is conducted within the Privacy and Identity Lab (www.pilab.nl). Permission to make digital or hard copies of ... Publication rights licensed to ACM. ISBN 978-1-4503-4569-9/16/10 ... $15.00. DOI: http://dx.doi.org/10.1145/2994620.2994634
Keywords: Privacy; privacy-enhancing technologies; cryptography
1. INTRODUCTION
Many websites either completely disallow anonymous access ...
... A. Brands. Rethinking Public Key Infrastructures and Digital Certificates: Building in Privacy. PhD thesis, 2000. [6] J. Camenisch, M. Drijvers, and A. Lehmann. Anonymous ... Fleischhacker, F. Günther, F. Kiefer, and B. Poettering. Group Signatures: Authentication with Privacy. Technical report, Bundesamt für Sicherheit in der Informationstechnik, 2012. [24] T. ...
12
February 2016
SIGCSE '16: Proceedings of the 47th ACM Technical Symposium on Computing Science Education
Publisher: ACM
Bibliometrics:
Citation Count: 0
Downloads (6 Weeks): 12, Downloads (12 Months): 91, Downloads (Overall): 129
Full text available:
PDF
A basic understanding of online privacy is essential to being an informed digital citizen, and therefore basic privacy education is becoming ever more necessary. Recently released high school and college computer science curricula acknowledge the significantly increased importance of fundamental knowledge about privacy, but do not yet provide concrete content ...
Keywords:
online privacy, privacy curriculum, privacy education
Title:
The Teaching Privacy Curriculum
CCS:
Security and privacy
Human and societal aspects of security and privacy
Keywords:
online privacy
privacy curriculum
privacy education
Abstract:
<p>A basic understanding of online privacy is essential to being an informed digital citizen, and therefore basic privacy education is becoming ever more necessary. Recently released high school ... curricula acknowledge the significantly increased importance of fundamental knowledge about privacy, but do not yet provide concrete content in the ... over the past two years, we have developed the Teaching Privacy Project (TPP) curriculum, http://teachingprivacy.org, which educates the general public about online privacy issues. We performed a pilot of our curriculum in a ... that it was effective: weeks after last being exposed, students' privacy attitudes had shifted. In this paper, we describe our curriculum, ... of it in the classroom, and our vision for future privacy
Primary CCS:
Security and privacy
Human and societal aspects of security and privacy
References:
Center on Law and Information Policy at Fordham Law School. Fordham CLIP volunteer privacy educators program. http://law.fordham.edu/assets/CLIP/2013_CLIP_VPE_Complete.pdf, 2013.
S. Egelman and E. Peer. Predicting privacy and security attitudes. SIGCAS Comput. Soc., 45(1):22--28, Feb. 2015.
A. Lenhart, M. Madden, S. Cortesi, U. Gasser, and A. Smith. Where teens seek online privacy advice. Technical report, Pew Research Internet Project, 2013. http://www.pewinternet.org/2013/08/15/where-teens-seek-online-privacy-advice/.
N. K. Malhotra, S. S. Kim, and J. Agarwal. Internet users' information privacy concerns (IUIPC): The construct, the scale, and a causal model. Information Systems Research, 15(4):336--355, December 2004.
Markkula Center for Applied Ethics, Santa Clara University. Your privacy online. http://www.scu.edu/ethics-center/privacy/.
S. B. Wicker. Wiretaps to big data: Privacy and surveillance in the age of interconnection. https://www.edx.org/course/wiretaps-big-data-privacy-surveillance-cornellx-engri1280x#.VJFpwCdGzDw.edX.
Full Text:
The Teaching Privacy Curriculum
Serge Egelman¹,², Julia Bernd¹, Gerald Friedland¹, and Dan Garcia²
¹International Computer ... USA {egelman,jbernd,fractor}@icsi.berkeley.edu
²University of California, Berkeley, CA, USA {egelman,ddgarcia}@cs.berkeley.edu
ABSTRACT
A basic understanding of online privacy is essential to being an informed digital citizen, and therefore ... science curricula acknowledge the significantly increased importance of fundamental knowledge about privacy, but do not yet provide concrete content in the area. ... Project (TPP) curriculum, http://teachingprivacy.org, which educates the general public about online privacy issues. We performed a pilot of our curriculum in a ... found that it was effective: weeks after last being exposed, students' privacy attitudes had shifted. In this paper, we describe our curriculum, our evaluation ... the classroom, and our vision for future privacy education.
CCS Concepts: Security and privacy → Human and societal aspects of security and privacy; Applied computing → Education.
1. INTRODUCTION
Despite heightened attention to privacy issues in the popular media, accounts abound of people oversharing ... information, nor on the steps they can take to manage their privacy. We believe that in this day and age, a basic understanding of online privacy is key to both good cybersecurity practices and to becoming ... they are eager to provide their students with guidance on online privacy, but feel unqualified to do so. In fact, a survey ... olds found that 70% had sought outside advice on managing online privacy [11]. To fill this need, we began developing an online privacy curriculum to aid teachers in being able to offer their students actionable advice on how to better protect their personal privacy online. The Teaching Privacy Project (TPP) is a privacy education curriculum centered around ten principles and offers students descriptions of ... may be putting themselves at risk online, current threats to personal privacy, interactive demonstrations that illustrate the concepts, and guidance on what they can do to protect themselves.
Our online privacy curriculum is targeted at lay audiences, including high school and undergraduate ... designed to not just convey comprehensive information about current threats to privacy, but to also empower students to do something about it. ... they can make their own informed choices about their online privacy. We have been integrating and evaluating parts of TPP in ... on teachers, we are developing the Teachers' Resources for Online Privacy ... to teach young people about why and how to protect their privacy online. This way, teachers can easily integrate our curriculum into ... works, knowledge of effective techniques they can use to protect their privacy, and the motivation to use those techniques when interacting online.
In this paper, we provide an overview of other privacy education efforts (Section 2) and show that our curriculum is unique ... number of existing providers that offer some classroom materials on online privacy, but none offer a comprehensive curriculum per se. Common Sense Media's Digital Citizenship Curriculum [6] includes among its offerings some privacy-related resources for elementary, middle-school, and high-school classrooms, such as videos and posters and lesson plans on oversharing, identity theft, and privacy policies. Fordham Law School's Center on Law and Information Policy has ... plans on privacy that include material on the relationship between security and privacy and the relationship between privacy and reputation management [5]. At the college level, Santa Clara University's Your Privacy Online resources cover privacy threats and privacy management from a law and ethics standpoint [13]. These materials are all ... not offer the combination of technical depth, comprehensive coverage of online privacy issues, and focus on U.S.-specific issues and high-school computer-science curriculum standards ... have come from Canada and the European Union, such as the privacy lesson plans from MediaSmarts [14] (funded by the Privacy Commissioner of Canada) and Teachtoday [19] (funded by Deutsche Telekom). The resources mentioned above come closest to the Teaching Privacy offerings in quality, scope, and technical grounding, but there are ... [1] online-safety classroom materials, which provide some (relatively shallow) coverage of privacy. Several general providers of curriculum content also include some one-off lessons about one aspect of online privacy or another, but are not comprehensive.
In an attempt to locate online privacy curricula aimed at broad audiences, we also examined Massive Open Online Courses ... (e.g., iversity, Udemy, and OpenLearn). There are several courses that cover privacy from legal, ethical, and policy perspectives (e.g., [20]), or that touch on privacy as it relates to best practices for technology designers (e.g., [18]) ... structure and cover some security topics, they do not include significant privacy content. In fact, among major providers, the content most similar to ... faculty consultant Dr. Dan Garcia and contains some of the Teaching Privacy content. But again, privacy is one topic among many, not the primary focus of the ... section, we describe the content of our curriculum.
3. THE CURRICULUM
The Teaching Privacy
... for Online Privacy that describe at a high level how online privacy works, technically and socially. These principles form the basis of ... be drawn from putting that collective information together. Guidance: Periodically check your privacy settings and update them to limit unintentional sharing.
2. There's No Anonymity: Description: ... tomorrow due to changes in terms of service, public policy, or privacy settings. Guidance: Monitor your information footprint.
7. Online Is Real: Description: Your online ... Privacy Requires Work: Description: Most Internet technology is not designed to protect the privacy of those who use it; in fact, most technology providers make money by leveraging your private information. "Privacy policies" are generally written to protect providers from lawsuits, not to protect users' privacy. Laws and regulations cover only certain aspects of privacy and vary from place to place. So, like it or not, your privacy is your own responsibility, and requires your constant attention. Guidance: Encourage policymakers to develop comprehensive privacy regulations, educate yourself and others, and be proactive about protecting your ... a whole, the principles demonstrate the general types of threats to privacy, how they occur, why organizations may exploit them, what ... consequences are, and what people can do about it. The Teaching Privacy website, http://teachingprivacy.org/, features a separate page for each principle (Figure 1). ... to teach young people about why and how to protect their privacy online. Each of the TROPE teaching modules is centered around one of the Ten Principles for Online Privacy and includes flexible lesson elements that can be used "out of ... as activities built around the interactive apps on the Teaching Privacy website.
Explanation: Teachers use provided slide decks, written materials, and ...
... than 175 teachers in the course's Piazza forum), so the Teaching Privacy Project curriculum is already influencing high schools at a national ... we evaluated our curriculum by measuring its impact on students' privacy attitudes. We describe our method and results.
4.1 Methodology
During the Fall 2014 ... TPP curriculum and examined if they had an effect on students' privacy attitudes at the end of the course. Our hypothesis was ... To test this hypothesis, we asked students to complete a privacy attitudinal scale before and after being exposed to our materials. ... online survey. This initial survey provided a baseline metric of their privacy attitudes. During the final week of classes (week 14), we emailed ... course of the semester, after being exposed to our materials. Our privacy materials were presented to students during weeks 9 and 11, which ... exposed to our materials in the classroom. Our surveys consisted of a privacy attitudinal scale and an unrelated psychometric scale that is known to ... test (pre = 3.52, post = 3.50, p < 0.826). Examining participants' privacy scale responses, we did observe a statistically significant increase, as well ... may suggest an increased desire to understand the contents of website privacy policies and/or control if and when companies send them marketing communications.
5. ... results show that three weeks after being exposed to our privacy education curriculum, students' attitudes about online privacy had remained changed: they stated an increased desire to have transparency ... we provide to other educators through our Teachers' Resource for Online Privacy Education (TROPE). Eventually, we hope to develop a full-fledged MOOC so ... interactive lab at our university's open house. We also conducted a privacy education workshop at the 2015 ACM SIGCSE conference, and this past ... digital citizenship curriculum. https://www.commonsensemedia.org/educators/curriculum. Accessed 12/1/2015.
[7] S. Egelman and E. Peer. Predicting privacy and security attitudes. SIGCAS Comput. Soc., 45(1):22--28, Feb. 2015.
[8] N. C. for ... S. Cortesi, U. Gasser, and A. Smith. Where teens seek online privacy advice. Technical report, Pew Research Internet Project, 2013. http://www.pewinternet.org/2013/08/15/where-teens-seek-online-privacy-advice/.
[12] N. K. Malhotra, S. S. Kim, and J. Agarwal. Internet users' information privacy concerns (IUIPC): The construct, the scale, and a causal model. Information Systems ... December 2004.
[13] Markkula Center for Applied Ethics, Santa Clara University. Your privacy online. http://www.scu.edu/ethics-center/privacy/.
[14] MediaSmarts, 2013. http://mediasmarts.ca/.
[15] National Security Agency. NSA's Cyber Camps ... TeachToday, 2015. http://www.teachtoday.eu/.
[20] S. B. Wicker. Wiretaps to big data: Privacy and surveillance in the age of interconnection. https://www.edx.org/course/wiretaps-big-data-privacy-surveillance-cornellx-engri1280x#.VJFpwCdGzDw. edX.
APPENDIX
A. PRIVACY PREFERENCES SCALE
1. Online Fraud
- I'm concerned that if I use ... companies ask me for personal information.
3. Transparency and Control
- Consumer online privacy is really a matter of consumers' right to exercise control ... their information is collected, used, and shared.
- A good consumer online privacy policy should have a clear and conspicuous disclosure.
- Companies seeking information ... data are collected, processed, and used.
- I believe that online privacy is invaded when control is lost or unwillingly reduced as a ...
13
May 2017
SIGMOD '17: Proceedings of the 2017 ACM International Conference on Management of Data
Publisher: ACM
Bibliometrics:
Citation Count: 0
Downloads (6 Weeks): 87, Downloads (12 Months): 193, Downloads (Overall): 193
Full text available:
PDF
Differential privacy has emerged as an important standard for privacy preserving computation over databases containing sensitive information about individuals. Research on differential privacy spanning a number of research areas, including theory, security, database, networks, machine learning, and statistics, over the last decade has resulted in a variety of privacy preserving ...
Keywords:
differential privacy
Title:
Differential Privacy in the Wild: A Tutorial on Current Practices & Open Challenges
CCS:
Security and privacy
Human and societal aspects of security and privacy
Privacy protections
Usability in security and privacy
Keywords:
differential privacy
Abstract:
<p>Differential privacy has emerged as an important standard for privacy preserving computation over databases containing sensitive information about individuals. Research on differential privacy spanning a number of research areas, including theory, security, database, ... over the last decade has resulted in a variety of privacy preserving algorithms for a number of analysis tasks. Despite maturing research efforts, the adoption of differential privacy by practitioners in industry, academia, or government agencies has so ... complex data types, and identify research challenges in applying differential privacy
Primary CCS:
Security and privacy
Human and societal aspects of security and privacy
Privacy protections
Usability in security and privacy
References:
M. E. Andrés, N. E. Bordenabe, K. Chatzikokolakis, and C. Palamidessi. Geo-indistinguishability: Differential privacy for location-based systems. In CCS, 2013.
A. Blum, C. Dwork, F. McSherry, and K. Nissim. Practical privacy: The SuLQ framework. In PODS, 2005.
J. Brickell and V. Shmatikov. The cost of privacy: Destruction of data-mining utility in anonymized data publishing. In KDD, 2008.
K. Chaudhuri and A. D. Sarwate. Differential privacy for signal processing and machine learning. In WIFS, 2014.
G. Cormode. Building blocks of privacy: Differentially private mechanisms, 2013. Invited tutorial talk at Privacy Preserving Data Publication and Analysis (PrivDB) workshop.
I. Dinur and K. Nissim. Revealing information while preserving privacy. In PODS, 2003.
C. Dwork. Differential privacy. In ICALP, 2006.
C. Dwork and A. Roth. The algorithmic foundations of differential privacy. Foundations and Trends in Theoretical Computer Science, 2013.
U. Erlingsson, V. Pihur, and A. Korolova. RAPPOR: Randomized aggregatable privacy-preserving ordinal response. In CCS, 2014.
M. Fredrikson, E. Lantz, S. Jha, S. Lin, D. Page, and T. Ristenpart. Privacy in pharmacogenetics: An end-to-end case study of personalized warfarin dosing. In USENIX Sec, 2014.
S. Haney, M. Kutzbach, M. Graham, J. Abowd, and L. Vilhuber. Formal privacy protection for data products combining individual and employer frames. In UNECE, 2015.
M. Hay, K. Liu, G. Miklau, J. Pei, and E. Terzi. Privacy-aware data management in information networks. In SIGMOD, 2011.
X. He, A. Machanavajjhala, and B. Ding. Blowfish privacy: Tuning privacy-utility trade-offs using policies. In SIGMOD, 2014.
S. P. Kasiviswanathan, K. Nissim, S. Raskhodnikova, and A. Smith. Analyzing graphs with node differential privacy. In TCC, 2013.
D. Kifer and A. Machanavajjhala. No free lunch in data privacy. In SIGMOD, 2011.
D. Kifer and A. Machanavajjhala. Pufferfish: A framework for mathematical privacy definitions. ACM Trans. Database Syst., 2014.
C. Li, M. Hay, G. Miklau, and Y. Wang. A data- and workload-aware algorithm for range queries under differential privacy. VLDB, 2014.
C. Li, M. Hay, V. Rastogi, G. Miklau, and A. McGregor. Optimizing linear counting queries under differential privacy. In PODS, 2010.
N. Li, W. Qardaji, D. Su, and J. Cao. PrivBasis: Frequent itemset mining with differential privacy. VLDB, 2012.
K. Liu, G. Miklau, J. Pei, and E. Terzi. Privacy-aware data mining in information networks. In KDD, 2010.
A. Machanavajjhala, X. He, and M. Hay. Differential privacy in the wild: A tutorial on current practices & open challenges. Proc. VLDB Endow., 2016.
M. Abadi, A. Chu, I. J. Goodfellow, H. B. McMahan, I. Mironov, K. Talwar, and L. Zhang. Deep learning with differential privacy. CoRR, abs/1607.00133, 2016.
A. Machanavajjhala, D. Kifer, J. Abowd, J. Gehrke, and L. Vilhuber. Privacy: Theory meets practice on the map. In ICDE, 2008.
A. Machanavajjhala, D. Kifer, J. Gehrke, and M. Venkitasubramaniam. L-diversity: Privacy beyond k-anonymity. KDD, 2007.
F. McSherry and K. Talwar. Mechanism design via differential privacy. In FOCS, 2007.
A. D. Sarwate and K. Chaudhuri. Signal processing and machine learning with differential privacy: Algorithms and challenges for continuous data. IEEE Signal Processing Magazine, 2013.
Y. Yang, Z. Zhang, G. Miklau, M. Winslett, and X. Xiao. Differential privacy in data publication and analysis. In SIGMOD, 2012.
Full Text:
Differential Privacy in the Wild: A Tutorial on Current Practices & Open Challenges. Ashwin ... NC, USA, ashwin@cs.duke.edu; Xi He, Duke University, Durham, NC, USA, hexi88@cs.duke.edu; Michael Hay, Colgate University, Hamilton, NY, USA, mhay@colgate.edu. ABSTRACT: Differential privacy has emerged as an important standard for privacy preserving computation over databases containing sensitive information about individuals. Research on differential privacy spanning a number of research areas, including theory, security, database, ... over the last decade has resulted in a variety of privacy preserving algorithms for a number of analysis tasks. Despite maturing research efforts, the adoption of differential privacy by practitioners in industry, academia, or government agencies has so far ... applications on complex data types, and identify research challenges in applying differential privacy to real world applications. 1. TUTORIAL OVERVIEW. Privacy concerns are a major ... increasing data collection and powerful new analysis techniques. The goal of privacy-preserving algorithms is to permit data mining and analysis to be ... the data with little disruption to their methods and results. Differential privacy [9] has emerged as an important standard for protection of individuals' ... machine learning, security, programming languages, statistics and economics. An algorithm satisfies ε-differential privacy if its output on a database of individuals is statistically indistinguishable ... DOI: http://dx.doi.org/10.1145/3035918.3054779. These algorithms work by infusing noise into query answers, and more privacy (smaller ε values) requires the infusion of larger amounts of noise. ... and itemset mining. Additionally, recent work has also considered applying differential privacy to more complex data types like graphs and sequential data. Despite
its success in the research community, the adoption of differential privacy by practitioners in academia, industry, or government agencies has been ... problem setup, a lack of understanding of the semantics of differential privacy for complex data types in terms of research, and a lack ... well as the challenges faced in interpreting and enforcing differential privacy in real applications that deal with complex data types. Thus, ... tutorial will attract non-experts who would like to learn about differential privacy, as well as experts who may understand differential privacy, but are looking for new research problems in making differentially ...
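The overview's point that more privacy (smaller ε values) requires larger amounts of noise is concrete in the Laplace mechanism, one of the building blocks this tutorial covers. A minimal sketch for a sensitivity-1 counting query; the function names are illustrative, not taken from the tutorial:

```python
import math
import random

def laplace_sample(scale: float) -> float:
    # Inverse-CDF sampling from Laplace(0, scale).
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_count(true_count: int, epsilon: float) -> float:
    # A counting query has sensitivity 1: one individual changes the
    # count by at most 1, so Laplace noise with scale 1/epsilon gives
    # epsilon-differential privacy.
    return true_count + laplace_sample(1.0 / epsilon)
```

With ε = 0.1 the noise scale is ten times that of ε = 1, which is exactly the trade-off the excerpt describes.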
... first three modules on "Defining privacy," "Building blocks for differential privacy" and "Answering counting queries on tabular data" will focus on the theoretical and practical challenges faced in both defining privacy and designing algorithms in real world settings that involve complex data ... concrete example exercise in the section on "Building blocks for differential privacy" (Section 2.2). The topics covered in each of the modules ... outlined in Table 1. 2.1 Defining Privacy. In this module, we motivate privacy in databases using examples of known privacy attacks on sensitive individual data. We formalize the database privacy problem and distinguish it from related technologies like query answering on ... do not work, and motivate the need for formal guarantees of privacy that ensure (1) security without obscurity, (2) privacy under post-processing, and (3) composition. We will define ε-differential privacy [9, 12] and show that it satisfies these privacy desiderata. 2.2 Building Blocks for Differential Privacy. In this module, we will ... to compute sensitivity of queries and how to derive bounds on privacy loss and error. Composition theorems including sequential composition, parallel composition and ... to build the algorithm using the aforementioned building blocks and prove privacy using composition theorems. 2.3 Answering Counting Queries on Tabular Data. This module will give ... been developed for answering counting queries over tabular data under differential privacy. Tabular data is a common type of data format, which ... starts with a description of two success stories of differential privacy, where these techniques are currently in use in live ... discuss some of the issues that arise when deploying differential privacy, such as choosing a value for ε, dealing with limits ... 5, 14, 23, 39, 41, 42] and itemset mining [29, 44]). 2.5 Privacy
tabular data. The standard definition of differential privacy is best suited for tabular data where each row corresponds to ... in streaming data). We will describe work that critically analyzes the privacy guarantees by differentially private algorithms in terms of information disclosed about sensitive ... and characterize necessary and sufficient conditions when differentially private algorithms ensure privacy under this model. We will discuss the No Free Lunch Theorem in data privacy [25], and present alternate privacy definitions that can be customized to match the structure of the ... the state-of-the-art, challenges, and open questions in deriving algorithms with formal privacy guarantees for complex data types like networks [22], data with multiple entities [15], and trajectories [2, 21]. Using trajectories and location privacy, we will also highlight how users may require the different ... The tutorial will not assume prior knowledge of cryptography or differential privacy. The tutorial will assume some background in databases and ... of ε-differential privacy. Building blocks for DP: Laplace mechanism, randomized response, exponential mechanism, bounds on privacy loss and error, composition theorems, smooth sensitivity. Answering counting queries. Example: histograms & range ... conclude with a short exercise. The session will focus on extending differential privacy to applications on different data types (networks, trajectories, etc.). 5. PRESENTERS. Ashwin ... Yahoo! Research. His primary research interests lie in algorithms for ensuring privacy in statistical databases and augmented reality applications. He is a recipient of ... ℓ-diversity [34] has been very influential in the field of data privacy and has been cited over 2500 times (according to Google Scholar). ... one of the first real data publication powered by formal privacy guarantees in collaboration with the US Census Bureau in 2008 [33]. ...
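Randomized response, one of the building blocks named in the module outline above, can be stated end to end: report the true bit with probability e^ε/(e^ε + 1), flip it otherwise, then debias the observed frequency. A small sketch of that standard formulation (the function names are mine, not the tutorial's):

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    # Keep the true bit with probability p = e^eps / (e^eps + 1).
    # The ratio of report probabilities for the two inputs is exactly
    # e^eps, i.e. eps-local differential privacy.
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_keep else 1 - bit

def debias(observed_frac_ones: float, epsilon: float) -> float:
    # Invert E[observed] = f*(2p - 1) + (1 - p) to recover the true
    # fraction f of ones from the noisy aggregate.
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return (observed_frac_ones - (1.0 - p)) / (2.0 * p - 1.0)
```

Each respondent gets plausible deniability; the analyst still recovers an unbiased population estimate.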
PODS, SIGMOD, VLDB, ICDE, WWW and WSDM, and has given tutorials on privacy at IEEE SSP 2009, ICDE 2010, VLDB 2016, and on entity ... at the Computer Science Department, Duke University. Her research interests lie in privacy-preserving data analysis and security. She has also received an M.S. from ... published in SIGMOD and VLDB, and has given a tutorial on privacy at VLDB 2016. Michael Hay is an Assistant Professor in the ... his PhD at UMass Amherst in 2010. His research interests include privacy-
... the Best Student Paper award. He has given a tutorial on privacy and graphs at SIGMOD 2011 and VLDB 2016. 6. HISTORY & RELATED ... identified five tutorials [6, 7, 18, 31, 43] on differential privacy in the past five years, which are mainly from SIGMOD, KDD, ... techniques, as well as focus on the application of differential privacy to real problems and complex data types. While the building blocks ... a larger scope of understanding the promise and limitations of differential privacy in real applications. While [6, 18, 31] only focused on one ... trajectories. Moreover, we will also show how to customize differential privacy to meet the privacy requirements of these applications with complex data. Acknowledgements: This work is supported ... McMahan, I. Mironov, K. Talwar, and L. Zhang. Deep learning with differential privacy. CoRR, abs/1607.00133, 2016. [2] M. E. Andrés, N. E. Bordenabe, ... Mach. Learn. Res., 2011. [6] K. Chaudhuri and A. D. Sarwate. Differential privacy for signal processing and machine learning. In WIFS, 2014. [7] G. Cormode. Building blocks of privacy: Differentially private mechanisms, 2013. Invited tutorial talk at Privacy Preserving Data ... Analysis (PrivDB) workshop. [8] I. Dinur and K. Nissim. Revealing information while preserving privacy. In PODS, 2003. [9] C. Dwork. Differential privacy. In ICALP, 2006. [10] C. Dwork. A firm foundation for ... 2006. [12] C. Dwork and A. Roth. The algorithmic foundations of differential privacy. Foundations and Trends in Theoretical Computer Science, 2013. [13] U. Erlingsson, V. Pihur, and A. Korolova. RAPPOR: Randomized aggregatable privacy-preserving ordinal response. In CCS, 2014. [14] M. Fredrikson, E. Lantz, S. Jha, S. Lin, D. Page, and T. Ristenpart. Privacy in pharmacogenetics: An end-to-end case study of personalized warfarin dosing. In ... Haney, M. Kutzbach, M. Graham, J. Abowd, and L. Vilhuber.
Formal privacy protection for data products combining individual and employer frames. In UNECE, 2015. [16] ... 2015. [21] X. He, A. Machanavajjhala, and B. Ding. Blowfish privacy: Tuning privacy-utility trade-offs using policies. In SIGMOD, 2014. [22] V. Karwa, S. Raskhodnikova, ... 2011. [26] D. Kifer and A. Machanavajjhala. Pufferfish: A framework for mathematical privacy
... Y. Wang. A data- and workload-aware algorithm for range queries under differential privacy. VLDB, 2014. [28] C. Li, M. Hay, V. Rastogi, G. ... D. Su, and J. Cao. PrivBasis: Frequent itemset mining with differential privacy. VLDB, 2012. [30] N. Li, W. Yang, and W. Qardaji. ... 2013. [31] K. Liu, G. Miklau, J. Pei, and E. Terzi. Privacy-aware data mining in information networks. In KDD, 2010. [32] A. Machanavajjhala, ... A. Machanavajjhala, D. Kifer, J. Abowd, J. Gehrke, and L. Vilhuber. Privacy: Theory meets practice on the map. In ICDE, 2008. [34] A. Machanavajjhala, D. Kifer, J. Gehrke, and M. Venkitasubramaniam. ℓ-diversity: Privacy beyond k-anonymity. KDD, 2007. [35] F. McSherry and K. Talwar. Mechanism design via differential privacy. In FOCS, 2007. [36] A. Narayanan and V. Shmatikov. Robust ... Sarwate and K. Chaudhuri. Signal processing and machine learning with differential privacy: Algorithms and challenges for continuous data. IEEE Signal Processing Magazine, 2013. [40] ... Yang, Z. Zhang, G. Miklau, M. Winslett, and X. Xiao. Differential privacy in data publication and analysis. In SIGMOD, 2012. [44] C. Zeng, J. ...
14
October 2016
CCS '16: Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security
Publisher: ACM
Bibliometrics:
Citation Count: 0
Downloads (6 Weeks): 52, Downloads (12 Months): 644, Downloads (Overall): 644
Full text available:
PDF
Recommender systems typically require users' history data to provide a list of recommendations and such recommendations usually reside on the cloud/server. However, the release of such private data to the cloud has been shown to put users at risk. It is highly desirable to provide users high-quality personalized services while ...
Keywords:
privacy-preserving recommendation, privacy paradox, differential privacy
CCS:
Security and privacy
Human and societal aspects of security and privacy
Keywords:
privacy-preserving recommendation
privacy paradox
differential privacy
Abstract:
... desirable to provide users high-quality personalized services while respecting their privacy. In this paper, we develop the first Enhanced Privacy-built-In Client for Personalized Recommendation (EpicRec) system that performs the data perturbation on the client side to protect users' privacy. Our system needs no assumption of trusted server and ... design of EpicRec system incorporates three main modules: (1) usable privacy control interface that enables two user preferred privacy controls, overall and category-based controls, in the way they understand; (2) user privacy level quantification that automatically quantifies user privacy concern level from these user understandable inputs; (3) lightweight data ... perturbs user private data with provable guarantees on both differential privacy and data utility. Using large-scale real world datasets, we show that, for both overall and category-based privacy controls, EpicRec performs best with respect to both perturbation quality ... with negligible computational overhead. Therefore, EpicRec enables two contradictory goals, privacy preservation and recommendation accuracy. We also implement a proof-of-concept EpicRec system to demonstrate a privacy-preserving personal computer for movie recommendation with web-based privacy controls. We believe EpicRec is an important step towards designing ... user data using high quality personalized services with strong provable privacy
Primary CCS:
Security and privacy
Human and societal aspects of security and privacy
References:
L. Bonomi, L. Xiong, and J. J. Lu. Linkit: privacy preserving record linkage and integration via transformations. In SIGMOD, pages 1029--1032, 2013.
J. Canny. Collaborative filtering with privacy. In IEEE Symposium on S&P, pages 45--57, 2002.
C. Dwork. Differential privacy: A survey of results. In TAMC, pages 1--19, 2008.
B. C. M. Fung, K. Wang, R. Chen, and P. S. Yu. Privacy-preserving data publishing: A survey of recent developments. ACM Comput. Surv., 42(4):14:1--14:53, 2010.
B. Heitmann, J. G. Kim, A. Passant, C. Hayes, and H.-G. Kim. An architecture for privacy-enabled user profile portability on the web of data. In HetRec, pages 16--23, 2010.
G. Kellaris and S. Papadopoulos. Practical differential privacy via grouping and smoothing. VLDB, 6(5):301--312, Mar. 2013.
B. Liu and U. Hengartner. ptwitterrec: A privacy-preserving personalized tweet recommendation framework. In Proceedings of ASIA CCS, pages 365--376, 2014.
A. Machanavajjhala, D. Kifer, J. Abowd, J. Gehrke, and L. Vilhuber. Privacy: Theory meets practice on the map. In ICDE, pages 277--286, 2008.
F. McSherry and I. Mironov. Differentially private recommender systems: Building privacy into the net. In KDD, pages 627--636, 2009.
V. Nikolaenko, S. Ioannidis, U. Weinsberg, M. Joye, N. Taft, and D. Boneh. Privacy-preserving matrix factorization. In CCS, pages 801--812, 2013.
H. Polat and W. Du. Privacy-preserving collaborative filtering using randomized perturbation techniques. In ICDM, pages 625--628, 2003.
Y. Shen and H. Jin. Privacy-preserving personalized recommendation: An instance-based approach via differential privacy. In ICDM, pages 540--549, 2014.
Y. Xin and T. Jaakkola. Controlling privacy in recommender systems. In NIPS, pages 2618--2626. 2014.
B. Zhang, N. Wang, and H. Jin. Privacy concerns in online recommender systems: Influences of control and user data input. In SOUPS, pages 159--173, 2014.
Full Text:
... respecting their privacy. In this paper, we develop the first Enhanced Privacy-built-In Client for Personalized Recommendation (EpicRec) system that performs the data perturbation on the client side to protect users' privacy. Our system needs no assumption of trusted server and no ... use. The design of EpicRec system incorporates three main modules: (1) usable privacy control interface that enables two user preferred privacy controls, overall and category-based controls, in the way they understand; ... that perturbs user private data with provable guarantees on both differential privacy and data utility. Using large-scale real world datasets, we show that, for both overall and category-based privacy controls, EpicRec performs best with respect to both perturbation quality and personalized recommendation, with negligible computational overhead. Therefore, EpicRec enables two contradictory goals, privacy preservation and recommendation accuracy. We also implement a proof-of-concept EpicRec system to demonstrate a privacy-preserving personal computer for movie recommendation with web-based privacy controls. We believe EpicRec is an important step towards designing ... 2016 ACM. ISBN 978-1-4503-4139-4/16/10. $15.00. DOI: http://dx.doi.org/10.1145/2976749.2978316. Keywords: Privacy-Preserving Recommendation; Differential Privacy; Privacy Paradox. 1. INTRODUCTION. The last few decades have witnessed wide applications of recommender ... services. However, many user concerns about recommender systems have been raised from privacy perspectives due to the release of users' private data. Consumer fears over privacy continue to escalate. Based on Pew Research, 68% of consumers think that current ...
current laws are insufficient to protect their privacy and demand tighter privacy laws; and 86% of Internet users have taken proactive steps to ... to remove or mask their digital footprints. Responding to increasing user privacy concerns, governments in the US/EU are strengthening and enforcing existing regulations. LG TV, ... users' private data. To resolve the tensions between business intelligence and user privacy, it is critically desirable to develop technologies that can preserve and control user data privacy while still allowing intelligence and personalization business. Without ... and companies will not be able to deploy services due to privacy law constraints and user concerns. A majority of existing methods [5, ...
... in recommendation algorithm. All these methods attempted to protect user privacy when the server releases user data to third party applications and business partners. Unfortunately, in such device-cloud based recommender systems, there are many other privacy attacks (as shown in [Figure 1: Attacking Model, covering cloud intrusion & data leakage, user data eavesdropping, and insider attacks on an untrusted recommender system]) that cannot ... developed under such untrusted server settings, including cryptography techniques [3, 21], differential privacy-based techniques [24], and randomization techniques [23]. Unfortunately, these approaches cannot ... the existence of EpicRec on the user's device not only satisfies users' privacy needs but also requires no assumption of trusted server and no ... Our contributions are summarized as follows. We design the first privacy-preserving EpicRec framework on the user client for personalized recommendation. EpicRec collects user private data from various devices, provides usable privacy control interfaces, quantifies user privacy control input and uses it to perturb user data. EpicRec enables ... lightweight data perturbation algorithm to preserve the category aggregates with both privacy and utility theoretical guarantees, which significantly improves the existing approach [24] from both privacy and utility aspects. We conduct extensive experiments to evaluate the ... EpicRec on large-scale real-world datasets. The results show that, from both privacy and utility perspectives, our proposed S-EpicRec and M-EpicRec systems consistently outperform other ... proof-of-concept EpicRec system for personalized movie recommendation with web-based overall and category-based privacy concern controls. The rest of the paper is organized as follows. Section ...
... releasing to recommender systems. However, their method does not have provable privacy guarantees and it was later identified that using a clustering method on their ... In the meanwhile, cryptography-based approaches [21, 3] are proposed with privacy guarantees. However, these approaches require a trusted third party (Cryptographic Service ... [24] are based on the state-of-the-art differential privacy notion, with both privacy and utility guarantees. Unfortunately, in addition to the above limitations, all existing approaches largely ignore the usable privacy control ... [Table 1: Comparison between privacy-preserving recommendation approaches under untrusted server settings; columns: no change of service provider, no need of trusted third party, user privacy control, user-friendly privacy control interface, privacy quantification, utility guarantee, privacy guarantee; rows include Polat et al. ... and category-based differential privacy] ... such that users cannot provide their privacy ... concerns in a way they understand. In addition, there are some other privacy-preserving recommendation approaches under trusted server settings [19] or some particular ... a comprehensive solution at all levels, towards a practical and usable privacy-preserving client for untrusted recommender systems, with strong privacy and utility guarantees. Differential Privacy. Differential privacy [7, 8] has become the de facto standard for privacy preserving data analytics. Dwork et al. [8] established the guideline to guarantee differential privacy for individual aggregate queries by calibrating the Laplace noise to each ... relaxation was proposed by Machanavajjhala et al. [18], called probabilistic differential privacy.
This novel differential privacy notion allows privacy preservation with high probability, thereby improving the flexibility of global sensitivity ... sensitivity, which allows the injection of admissible noise to ensure differential privacy. Unfortunately, all these approaches require the strict satisfaction of perturbed ... address the above constraints, by perturbing data to guarantee both differential privacy ... and recommendation quality. Our data perturbation module in the EpicRec framework provides better privacy and utility than [24] from both theoretical and empirical perspectives. 3. EpicRec ... this section, we present the framework design of our proposed Enhanced Privacy-built-In Client for Recommendation (EpicRec) system. The goal of the EpicRec system is three-fold: (1) enable user-friendly privacy concern control on their private data in a way they understand; (2) quantify the user's privacy level input from layman and user-understandable language to a quantified private budget ...
... interacted services. The results show that most participants raised their highest privacy concerns about their watching content history for either personalized program recommendation ... behaviors? We recruited 505 participants through MTurk and studied 15 different privacy control mechanisms with different levels and types of control. The ... maturity rating, watching time). The results show that the overall privacy control and the finer-grained category-based privacy control are the best two control interfaces the participants selected among ... is to perform data perturbation on user private data with user-specified privacy concern levels, such that the format of perturbed data remains the same and recommendation results remain accurate. [Figure 3: Architecture of EpicRec, showing components C-1 public data input, C-2 user private data input, C-3 user privacy control input, C-4 data perturbation, C-5 privacy quantification, and C-6 recommendation output, with perturbed user data flowing to the existing recommender system.] ... our proposed EpicRec system provides users their most preferred overall and category-based privacy controls. The goal of data perturbation is to protect user concerned ... user and EpicRec. A user inputs his privacy levels using the user privacy control input (C-3) and provides his private data on device ... is to preserve the quality of perturbed data while avoiding any privacy breach that could be derived using public information. C-2. User Private Data ... history of watched TV programs, movies on smart TV, etc. C-3. User Privacy Control Input: provide a user interface to obtain the user's privacy concern levels. Motivated by our user study results discussed in Section ...
discussed in Section 3.1, C-3 provides the following two granularities of privacy control interfaces. Overall (Single-level) Privacy Control: provide users a single input of privacy concern level. Category-based (Multiple-level) Privacy Control: provide users inputs of privacy
three privacy concern levels: "No Release" as releasing no information, "Perturbed Release" as releasing ... in category-based privacy control). The designs of the EpicRec system with these two privacy controls are later presented in Section 4 and Section 5 ... such, this data perturbation module is associated with two corresponding notions: a privacy ... notion and a utility notion. Examples of privacy notions can be differential privacy, k-anonymity, information gain, etc., while examples of utility notions ... can be mean absolute error, root mean square error, TopK, etc. C-5. Privacy Quantification: quantify user specified privacy concern levels to mathematical privacy parameters to be used in the data perturbation (C-4) component. The examples ... on different privacy notions can be as follows: (1) the privacy budget ε in differential privacy; (2) the value of k in k-anonymity privacy; (3) ... obtained from the service provider using perturbed user data. 4. DESIGN OF S-EpicRec: SINGLE-LEVEL PRIVACY CONTROL. In this section, we focus on the design of Single-level EpicRec (S-EpicRec) to enable overall privacy control. In the rest of this section, we first introduce our ... the detailed design of the main components (the data perturbation (C-4) and privacy quantification (C-5) components) in S-EpicRec. 4.1 Privacy & Utility Notions. 4.1.1 Privacy Notion. We consider using the state-of-the-art privacy notion, Differential Privacy [8], which not only provides a strong privacy guarantee but also allows ... is insensitive to any particular record in the dataset. Definition 1 (ε-Differential Privacy). Let ε > 0 be a small constant. A randomized ... coin tosses of A. The parameter ε > 0 is called the privacy budget, which allows the user to control the level of privacy. A smaller ε suggests more limit posed on the influence ... to stronger privacy protection. [Notation table: ... of size n; dp: user's perturbed item vector of size n; PT: privacy concern level; ε: quantified privacy budget.]
More importantly, the application of differential privacy ensures the perturbed data is independent of any auxiliary knowledge [7] the ...
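Definition 1's indistinguishability guarantee can be checked numerically for the Laplace mechanism: on neighboring databases, a sensitivity-1 query's output densities differ by a factor of at most e^ε at every point. An illustrative check, not code from the paper:

```python
import math

def laplace_pdf(x: float, mean: float, scale: float) -> float:
    # Density of Laplace(mean, scale).
    return math.exp(-abs(x - mean) / scale) / (2.0 * scale)

def dp_ratio_bound_holds(epsilon: float, sensitivity: float = 1.0) -> bool:
    # Neighboring databases shift the true answer by at most
    # `sensitivity`; with scale = sensitivity/epsilon the density
    # ratio stays within [e^-eps, e^eps] by the triangle inequality.
    scale = sensitivity / epsilon
    for x in [i / 10.0 for i in range(-100, 101)]:
        ratio = laplace_pdf(x, 0.0, scale) / laplace_pdf(x, sensitivity, scale)
        if not (math.exp(-epsilon) - 1e-12 <= ratio <= math.exp(epsilon) + 1e-12):
            return False
    return True
```

The bound is what makes the output statistically indistinguishable regardless of whether any one individual's record is present.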
User's privacy concern level, denoted as PT, belongs to one of the following ... not belong to the user's perturbed data. C-5: ε is the quantified privacy parameter (the privacy budget in differential privacy) when PT is selected as Perturbed Release. For simplicity and consistency, ... to meet both privacy preservation and recommendation quality, specifically via the privacy and utility notions in Section 4.1. The rest of this subsection ... with public item set I, a public item-category correlation matrix C, and privacy budget ε > 0. The objective is to generate the user's perturbed item vector dp such that (1) (privacy goal) the category aggregates (the number of items belonging to each category) of the perturbed data satisfy ε-differential privacy with the presence or absence of an individual item, to defend against privacy leakage via public category information; (2) (utility goal) the quality of ... CR and CP. Remarks: Our defined S-Perturbation problem targets a stronger privacy guarantee (ε-differential privacy rather than the (ε, δ)-differential privacy in [24]) and relaxes the objective (discard the maximization of ... data perturbation. 4.3.2 Challenges. Large Magnitude of Noise for Achieving ε-Differential Privacy. One of the most widely used mechanisms to achieve ε-differential privacy is the Laplace mechanism [8] (Theorem 1), which adds random noise to ...
... it imposes the ε-differential privacy guarantee as later shown in the privacy analysis (Theorem 2); second, it captures the correlation between categories from the ... MAE error using ... [Algorithm (S-DPDP). Input: private user data dr, item-category matrix C, privacy budget ε. Output: perturbed user's data dp. Phase 1: noise calibration ...] [Figure: MAE error for categories 1-5 when privacy budget ε = ...; example item-category matrix.] ... entry drp(i) to 1 with probability drp(i). 4.3.4 Theoretical Analysis. We theoretically analyze privacy and utility, as well as time complexity. Privacy Analysis. We show the differential privacy guarantee of the S-DPDP algorithm. Theorem 2 (S-DPDP Privacy Analysis). The S-DPDP algorithm enjoys ε-differential privacy. Proof. We observe that there is no privacy loss in phase 2 of the S-DPDP algorithm as it is considered ... data. Since any post-processing of the answers cannot diminish this rigorous privacy guarantee according to Hay et al. [12], we only need to focus on analyzing the privacy guarantee in phase 1 of the S-DPDP algorithm. Let D1, D2 be ...
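The proof sketch above leans on post-processing invariance: phase 2 of S-DPDP only randomly rounds the already-perturbed vector, so it consumes no additional privacy budget. A deliberately simplified rounding step in that spirit (my simplification, not the paper's exact phase 2):

```python
import random

def randomized_round(noisy_vector):
    # Post-processing: clip each noisy entry into [0, 1] and set it to
    # 1 with that probability. Because this only transforms the output
    # of the differentially private phase, the epsilon guarantee from
    # the noise-calibration phase carries over unchanged.
    out = []
    for v in noisy_vector:
        p = min(1.0, max(0.0, v))
        out.append(1 if random.random() < p else 0)
    return out
```

The result is a valid 0/1 item vector, which is why the perturbed data keeps the same format the recommender expects.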
... practice. The last rounding phase takes another O(n) time. 4.4 Design of Privacy Quantification (C-5). In this subsection, we design the privacy quantification (C-5) component to quantify the 'Perturbed Release' level to a privacy budget ... perturbation component (C-4) discussed above. Specifically, we devise a novel Single Privacy Budget Quantification (S-PBQ) algorithm to select a privacy budget ε that optimizes the utility of perturbed data. The idea of ... of perturbed data will not significantly decrease any more when the privacy budget ε is larger than some threshold. In detail, the S-PBQ algorithm ... noise magnitude to each category aggregate and then searches for the optimal privacy budget after which the utility loss is negligible. Next, we ... z* as the reciprocal of each entry in z*I. Phase 2: Privacy budget optimization. This phase determines the privacy budget ε using the above z*I, z*. Based on the idea ... : item-category matrix C, learning step rate. Output: quantified optimal privacy budget ε. // Phase 1: Noise magnitude determination. 1 Solve mathematical programming ... z* ← reciprocal of each entry in zI; // Phase 2: Privacy budget calculation. 3 Formulate (4.7) with sampled S and RandAs; 4 foreach ...
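The S-PBQ idea stated above is that utility loss stops improving meaningfully beyond some budget threshold, so the smallest such budget is a sensible choice. An illustrative sketch of that search over a candidate grid (this is the plateau-detection idea only, not the paper's exact optimization; names and the tolerance parameter are assumptions):

```python
def quantify_budget(utility_loss, eps_grid, tol=0.01):
    """Pick the smallest budget on an increasing grid after which the
    marginal improvement in utility loss drops below `tol`.

    utility_loss : callable eps -> expected utility loss, assumed
    monotonically non-increasing in eps."""
    losses = [utility_loss(e) for e in eps_grid]
    for i in range(1, len(eps_grid)):
        if losses[i - 1] - losses[i] < tol:   # loss curve has flattened
            return eps_grid[i - 1]
    return eps_grid[-1]
```

For a loss curve like 1/ε, the search stops as soon as consecutive grid points differ by less than the tolerance, returning the budget at the knee of the curve.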
... positive constant k [16]. In addition, in order to avoid potential privacy leakage caused by ε quantification, we do not use the user's ... algorithm. At last, the averaged ε is chosen as the quantified privacy budget. Time Complexity Analysis. The S-PBQ algorithm is similar to the S-DPDP algorithm ... ) similar to that of solving relaxed (4.5). To provide a strong privacy guarantee, we consider ε ≤ 1 and therefore the repeat ... using the S-PBQ algorithm. Privacy Guarantee. The perturbed data using S-EpicRec satisfies ε-differential privacy where ε is determined by the S-PBQ algorithm (Algorithm 2). Utility Guarantee. ... solving relaxed (4.5) as discussed in [2]. 5. DESIGN OF M-EpicRec: MULTI-LEVEL PRIVACY CONTROL. In this section, we further design the M-EpicRec framework to enable category-based privacy concern controls. The idea of M-EpicRec is extended from S-EpicRec proposed ... the only notation differing from S-EpicRec is in the user privacy control component (C-3). Specifically, we define the vector of privacy concern levels PT = {PT(1), ..., PT(c)} to replace a single PT. Each entry PT(j) is the user-specified privacy concern level for category j, which still belongs to one of ... 'Perturbed Release', 'All Release'}. Correspondingly, we define the vector privacy budget ε = {ε(1), ..., ε(c)}, where ε(j) ... these categories. Moreover, we define three vectors Ln, Lp, La with respect to the three privacy concern levels {'No Release', 'Perturbed Release', 'All Release'}, based on ... 0, 1, 1, 0). Note that each category has one privacy tolerance level and Ln + Lp + La = 1. 5.2 ... following aspects: (1) the M-Perturbation problem takes two different inputs: the category-based privacy concern levels Ln, Lp, La (derived from PT) and a privacy budget ε for categories with 'perturbed release' privacy concern level; (2) the M-Perturbation problem targets maintaining the quality of category ...
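The three vectors Ln, Lp, La above act as per-category one-hot indicators of the user's chosen level, which is why they sum to the all-ones vector. A small sketch of that encoding under the assumed semantics (the helper name is illustrative):

```python
def level_indicators(PT):
    """Encode a per-category level assignment PT into 0/1 indicator
    vectors Ln, Lp, La with Ln + Lp + La = 1 entrywise."""
    levels = ("No Release", "Perturbed Release", "All Release")
    assert all(l in levels for l in PT)
    Ln = [1 if l == "No Release" else 0 for l in PT]
    Lp = [1 if l == "Perturbed Release" else 0 for l in PT]
    La = [1 if l == "All Release" else 0 for l in PT]
    # each category carries exactly one privacy tolerance level
    assert all(n + p + a == 1 for n, p, a in zip(Ln, Lp, La))
    return Ln, Lp, La
```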
raw data only associated with 'all release' categories. (Note that we choose to prioritize privacy
... an item if one of its categories is set to the 'perturbed release' privacy concern level and none of its categories is set to 'no release'; ... ≥ 0 (5.1) where the first constraint imposes the satisfaction of differential privacy on those categories with 'perturbed release' privacy level; the next two constraints ensure no noise calibration into ... address the constraints of categories with 'no release' and 'all release' privacy levels, we select 'all release' user data (denoted as dar) ... is 1) and all of its associated categories have 'all release' privacy levels. Second, we inject Lap(z(j)) into the jth category for those categories with 'perturbed release' privacy level. Then, we reformulate (4.7) as follows: minimize (1/2)‖l + r‖₂² subject to ... = 0 (5.3) where NA(j) = Σ_{i∈I} c_ij·dr(i) for categories with 'all release' privacy level; NA(j) = 0 for categories with 'no release' privacy level; and NA(j) = Σ_{i∈I} c_ij·dr(i) + Lap(z(j)) for categories with 'perturbed release' privacy level. The second constraint imposes that only the raw data in categories with 'all release' privacy level will be released; the last two constraints guarantee the equivalence for categories with 'all release' and 'no release' privacy levels. 5.2.4 Theoretical Analysis. Theorem 4 (M-DPDP Privacy Analysis). The M-DPDP algorithm enjoys ε-differential privacy. Theorem 5 (M-DPDP Utility Analysis). The ... of M-DPDP is also similar to S-DPDP and omitted. 5.3 Design of Privacy Quantification (C-5). We propose the M-PBQ approach in this section, extending from S-PBQ ... from Laplace distribution Lap(1) if this category has 'perturbed release' privacy level and 0 otherwise; RandAm is the category aggregates on randomly ... in which Randm(j) = 0 for categories with 'no release' privacy level. The time complexity of M-PBQ is similar to S-PBQ and ...
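The three-way definition of NA(j) above releases aggregates exactly for 'all release' categories, zeroes them for 'no release' categories, and Laplace-perturbs them for 'perturbed release' categories. A hedged sketch of just that per-category dispatch (the function name, level labels, and the noise-scale vector z are illustrative assumptions):

```python
import numpy as np

def noisy_category_aggregates(d_r, C, levels, z, rng=None):
    """Compute NA(j)-style per-category outputs: exact for 'all',
    zero for 'no', Laplace-perturbed with scale z[j] for 'perturbed'.

    d_r : (n,) 0/1 item vector; C : (n, c) item-category matrix;
    levels : per-category labels in {'all', 'no', 'perturbed'}."""
    if rng is None:
        rng = np.random.default_rng()
    agg = d_r @ C                          # Σ_i c_ij · d_r(i)
    NA = np.zeros(C.shape[1])
    for j, lvl in enumerate(levels):
        if lvl == "all":
            NA[j] = agg[j]                 # released as-is
        elif lvl == "perturbed":
            NA[j] = agg[j] + rng.laplace(0.0, z[j])
        # 'no release' categories stay 0
    return NA
```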
... As this paper is the first attempt at designing a privacy-preserving system to enable user-understandable privacy concern control, there is no existing work to fairly compare ... phase 1 for noise calibration only, which first uses our quantified privacy budget ε from the S-PBQ/M-PBQ algorithms and then sanitizes data by phase ... ε (ε > 1), whereas our proposed EpicRec focuses on stronger privacy protection with ε ≤ 1. Settings. We conduct the classic recommender ... descent algorithm for collaborative filtering. In the M-EpicRec case, we randomly select the privacy levels for each category for each user. 6.2 Evaluation Results. Perturbation Quality. ... [Figure 5: S-EpicRec Results (L: MovieLens; R: Yelp) — panels plot recommendation MAE loss (%) against privacy budget ε for S-EpicRec, Pseudo-LPA, and Pseudo-GS, with the quantified optimal ε marked; (b) Recommendation Accuracy.]
... to 5% and 3% in the MovieLens and Yelp datasets with strong privacy guarantees (ε = 0.2 in MovieLens and ε = 0.3 in ... Release' categories. One may ask: what if a user has a privacy concern about some category rather than about particular items in this category? ... [Figure residue: recommendation MAE loss (%) vs. privacy budget ε for M-EpicRec, Pseudo-LPA, and Pseudo-GS, with the quantified optimal ε marked; (b) Recommendation Accuracy (Top: 'Perturbed Release' categories; Bottom: ...).] [Figure 7: Running Time (L: MovieLens; R: Yelp) — running time (ms) vs. privacy budget ε for S-EpicRec and M-EpicRec.] larger than that on ... (C-2) component. Then, we design and implement user interfaces for user privacy
... to 1 if a movie's title exists in the history file. User Privacy Control Input (C-3): We designed the user interface for user privacy control input (see C-3 in Figure 8), in which a user ... selected, the user can further select whether he wants different privacy concern levels for different categories of movies. If so, a list of categories is popped up with privacy concern level drop-down boxes. Privacy Quantification (C-5) & Data Perturbation (C-4): If ... 'perturbed release' and does not check the box to set category-based privacy concern levels, we call the C-4 and C-5 components in the S-EpicRec system. ... and C-5 components in the M-EpicRec system to support user-specified category-based multiple privacy concern levels. Recommendation Output (C-6): We simply use a Netflix-style output ... personalized recommendation via state-of-the-art differential privacy. EpicRec provides users a privacy control interface such that users can control their privacy concerns in a way they understand and at their preferred granularity, either overall or category-based. EpicRec further quantifies these layman privacy concern levels to a privacy budget, which is next used as input to the data perturbation algorithm via differential privacy. With these key components, EpicRec can also work with ... [Figure 8: Proof-of-Concept Implementation of EpicRec for Movie/TV Recommendation — private data input history from various resources (C-2) and user privacy control input (C-3) on the user's device, on-device implementations of the data perturbation (C-4) and privacy quantification (C-5) components with the quantified privacy budget passed between them, and the recommendation service provider connected over SSL/TLS with output to the client (C-6).] towards designing a practical privacy-preserving system for personalized recommendation. Future work.
We will extend EpicRec into ... user private data; and allow users to iteratively adjust their privacy levels for trading off privacy and recommendation quality. 9. REFERENCES. [1] L. Bonomi, L. Xiong, and J. J. Lu. LinkIT: privacy-preserving record linkage and integration via transformations. In SIGMOD, pages 1029--1032, 2013. [2] ... Convex Optimization. Cambridge University Press, 2004. [3] J. Canny. Collaborative filtering with privacy. In IEEE Symposium on S&P, pages 45--57, 2002. [4] T.-H. H. ...
... private machine learning. In NIPS, pages 2652--2660, 2013. [7] C. Dwork. Differential privacy: A survey of results. In TAMC, pages 1--19, 2008. [8] C. ... Kim, A. Passant, C. Hayes, and H.-G. Kim. An architecture for privacy-enabled user profile portability on the web of data. In HetRec, ... and finance. 2001. [17] B. Liu and U. Hengartner. pTwitterRec: A privacy-preserving personalized tweet recommendation framework. In Proceedings of ASIA CCS, pages 365--376, ... A. Machanavajjhala, D. Kifer, J. Abowd, J. Gehrke, and L. Vilhuber. Privacy: Theory meets practice on the map. In ICDE, pages 277--286, 2008. [19] F. McSherry and I. Mironov. Differentially private recommender systems: Building privacy into the Netflix Prize contenders. In KDD, pages 627--636, 2009. [20] N. Megiddo. Linear ... S. Ioannidis, U. Weinsberg, M. Joye, N. Taft, and D. Boneh. Privacy-preserving matrix factorization. In CCS, pages 801--812, 2013. [22] K. Nissim, S. ... analysis. In STOC, pages 75--84, 2007. [23] H. Polat and W. Du. Privacy-preserving collaborative filtering using randomized perturbation techniques. In ICDM, pages 625--628, 2003. [24] Y. Shen and H. Jin. Privacy-preserving personalized recommendation: An instance-based approach via differential privacy. In ICDM, pages ... IUI, pages 299--304, 2016. [27] Y. Xin and T. Jaakkola. Controlling privacy in recommender systems. In NIPS, pages 2618--2626, 2014. [28] J. Xu, Z. ... pages 32--43, 2012. [29] B. Zhang, N. Wang, and H. Jin. Privacy concerns in online recommender systems: Influences of control and user data input. ...
15
November 2014
CCS '14: Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security
Publisher: ACM
Bibliometrics:
Citation Count: 0
Downloads (6 Weeks): 3, Downloads (12 Months): 22, Downloads (Overall): 112
Full text available:
PDF
The 13th Workshop on Privacy in the Electronic Society is held on November 3, 2014 in Scottsdale, Arizona, USA in conjunction with the 21st ACM Conference on Computer and Communications Security. The goal of this workshop is to discuss the problems of privacy in global interconnected societies and possible solutions ...
Keywords:
privacy protection
CCS:
Privacy policies
Security and privacy
Human and societal aspects of security and privacy
Abstract:
The 13th Workshop on Privacy in the Electronic Society is held on November 3, 2014 ... goal of this workshop is to discuss the problems of privacy in global interconnected societies and possible solutions to them. The ... and 9 short papers on a diverse set of exciting privacy topics selected from a set of 67 total submissions. Specific areas covered include but are not limited to healthcare privacy, censorship circumvention, anonymous communication, web tracking, location and social network privacy.
Title:
WPES 2014: 13th Workshop on Privacy in the Electronic Society
Full Text:
WPES 2014: 13th Workshop on Privacy in the Electronic Society. Gail-Joon Ahn, Arizona State University, 1475 ... USA, +1 (412) 268-4254, danupam@cmu.edu. ABSTRACT: The 13th Workshop on Privacy in the Electronic Society is held on November 3, 2014 ... goal of this workshop is to discuss the problems of privacy in global interconnected societies and possible solutions to them. The ... and 9 short papers on a diverse set of exciting privacy topics selected from a set of 67 total submissions. Specific areas covered include but are not limited to healthcare privacy, censorship circumvention, anonymous communication, web tracking, location and social network privacy. Categories and Subject Descriptors: K.4 [Computers and Society]: Privacy. General Terms: Measurement, Experimentation, Security, Human Factors, Languages, Theory, Legal Aspects, Verification. Keywords: Privacy protection. 1. INTRODUCTION. The increased power and interconnectivity of computer ... resulted in an increasing degree of awareness with respect to privacy. Privacy issues have been the subject of public debates, and the need for privacy-aware policies, regulations, and techniques has been widely recognized. The ... goal of this workshop is to discuss the problems of privacy in global interconnected societies and possible solutions to them. 2. ... novel research on all theoretical and practical aspects of electronic privacy, as well as experimental studies of fielded systems. We ... the following: • anonymity, pseudonymity, and unlinkability • crowdsourcing for privacy and security • data correlation and leakage attacks • data security and privacy • electronic communication privacy •
• economics of privacy • information dissemination control • models, languages, and techniques for big data protection • personally identifiable information • privacy-aware access control • privacy and anonymity on the Web • privacy in cloud and grid systems • privacy and confidentiality management • privacy and data mining • privacy in digital business • privacy in electronic records • privacy enhancing technologies • privacy in health care and public administration • privacy and human rights • privacy metrics • privacy in mobile systems • privacy in online education • privacy in outsourced scenarios • privacy policies • privacy vs. security • privacy in social networks • privacy threats • privacy and virtual identity • privacy through accountability • public records and personal privacy • user profiling • wireless privacy. http://dx.doi.org/10.1145/2660267.2660383. CCS'14, November 3--7, 2014, Scottsdale, Arizona, USA. ACM 978-1-4503-2957-6/14/11. 3. PROGRAM. The workshop program includes 17 full papers and 9 short papers on a diverse set of exciting privacy topics selected from a set of 67 total submissions. Specific areas covered include but are not limited to healthcare privacy, censorship circumvention, anonymous communication, web tracking, location and social network privacy. 4. ORGANIZERS. WPES 2014 was organized by the following ... consisted of 38 researchers working in the research area of privacy.
16
November 2016
MoMM '16: Proceedings of the 14th International Conference on Advances in Mobile Computing and Multi Media
Publisher: ACM
Bibliometrics:
Citation Count: 0
Downloads (6 Weeks): 14, Downloads (12 Months): 61, Downloads (Overall): 61
Full text available:
PDF
Location tracking applications, which receive frequent updates of a moving object's position, collect numerous moving objects' location data. Public transit agencies can make use of tracking data to optimize traffic control strategies, while improper use of trajectory data could cause individuals' privacy leakage. However, existing privacy-preserving techniques are unable to ...
Keywords:
Privacy-preserving, differential privacy, trajectory privacy
CCS:
Security and privacy
Abstract:
... strategies, while improper use of trajectory data could cause individuals' privacy leakage. However, existing privacy-preserving techniques are unable to provide sufficient privacy protection. In this paper, we propose a data-dependent differentially private ...
References:
R. Chen, N. Mohammed, B. C. M. Fung, and L. Xiong. Publishing set-valued data via differential privacy. VLDB Endowment, 4(11):1087--1098, 2011.
C. Li, G. Miklau, M. Hay, A. McGregor, and V. Rastogi. The matrix mechanism: optimizing linear counting queries under differential privacy. International Journal on Very Large Data Bases, 24(6):757--781, 2015.
B. Palanisamy, L. Liu, K. Lee, A. Singh, and Y. Tang. Location privacy with road network mix-zones. In Proc. MSN 2012, Chengdu, China, December 2012.
Full Text:
... control strategies, while improper use of trajectory data could cause individuals' privacy leakage. However, existing privacy-preserving techniques are unable to provide sufficient privacy protection. In this paper, we propose a data-dependent differentially private ... Management]: General — Security, integrity, and protection; D.4.6 [Software Engineering]: Security and Protection. Keywords: Privacy-preserving, trajectory privacy, differential privacy. 1. INTRODUCTION. The prevalence of various location-aware devices makes location ... 978-1-4503-4806-5/16/11 $15.00. DOI: http://dx.doi.org/10.1145/3007120.3007149 ... of these data may cause individuals' privacy leakage. Motivated by the traffic congestion analysis application, in this paper, ... of moving objects at each timestamp may also cause individuals' privacy leakage. Examples are shown in Figure 1: suppose we divide the ... grids of equal size, with moving objects scattered in these grids. Location privacy leakage still exists even if we only publish count values of ... also shown in Figure 1. [Figure 1: An example of privacy leakage — panels (a) and (b) show grids g1, g2 at timestamps ti and ti+1.] Sparse location attack. Suppose a moving object locates in an ... Thus, its moving trajectory between ti and ti+1 is exposed. Existing trajectory privacy-preserving techniques are mainly based on the concept of k-anonymity, which ... adversaries know and what they do not know [3]. Fortunately, differential privacy is widely accepted as one of the strongest known unconditional privacy ...
... conduct constrained inferences, which helps to boost the data utility after the privacy-preserving process. 4. We experimentally evaluate the sanitization algorithm for range count ... of each node may change over time. We employ a uniform privacy budget allocation scheme to add Laplace noise to the count values, ... in Algorithm 1. Algorithm 1: NoisyRTConstruction. Input: A snapshot moving object dataset Dt; privacy budget ε; road network constraint RN. Output: Noisy R-tree RT. ...
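Uniform budget allocation, as described above, splits the total budget evenly across the levels of the noisy tree and perturbs every node count with Laplace noise at the resulting per-level scale. A minimal sketch under those assumptions (a count has sensitivity 1; the function name and signature are illustrative, not the paper's algorithm):

```python
import numpy as np

def noisy_counts_uniform(counts, eps_total, height, rng=None):
    """Perturb node counts at one tree level under uniform allocation:
    each of `height` levels receives eps_total / height, so the Laplace
    scale per count (sensitivity 1) is height / eps_total."""
    if rng is None:
        rng = np.random.default_rng()
    eps_level = eps_total / height
    return [c + rng.laplace(0.0, 1.0 / eps_level) for c in counts]
```

Splitting the budget this way means deeper trees get noisier per-node counts, which is the usual accuracy trade-off for hierarchical releases.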
... assumption. 2.3 Consistency Processing. Due to the noise added to ensure differential privacy, we may not be able to obtain a meaningful and ...
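Consistency processing of this kind typically enforces structural invariants (e.g., a parent count equaling the sum of its children) on the noisy values via constrained inference. A sketch of the simplest such repair, assuming equal noise on all counts so the least-squares fix spreads the discrepancy evenly (this is an illustrative instance of the general idea, not the paper's exact procedure):

```python
def enforce_consistency(parent, children):
    """Least-squares adjustment so that parent == sum(children):
    shift the parent down and every child up by gap / (n + 1),
    which minimizes the total squared change under the constraint."""
    gap = parent - sum(children)
    n = len(children)
    adj = gap / (n + 1)
    new_parent = parent - adj
    new_children = [c + adj for c in children]
    return new_parent, new_children
```

After the adjustment the invariant holds exactly, and because it is pure post-processing of the noisy outputs it does not consume any additional privacy budget.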
... size. The average relative error decreases as the privacy budget ε increases. This is because the noise we added decreases, ... and V. Rastogi. The matrix mechanism: optimizing linear counting queries under differential privacy. International Journal on Very Large Data Bases, 24(6):757--781, 2015. [3] B. Palanisamy, L. Liu, K. Lee, A. Singh, and Y. Tang. Location privacy with road network mix-zones. In Proc. MSN 2012, Chengdu, China, December ...
17
November 2008
SPRINGL '08: Proceedings of the SIGSPATIAL ACM GIS 2008 International Workshop on Security and Privacy in GIS and LBS
Publisher: ACM
Bibliometrics:
Citation Count: 2
Downloads (6 Weeks): 1, Downloads (12 Months): 13, Downloads (Overall): 226
Full text available:
PDF
With the growth of wireless and mobile technologies, we are witnessing an increase in location-based services (LBS). Although LBS provide enhanced functionalities, they open up new vulnerabilities that can be exploited to cause security and privacy breaches. Specifically, location data of individuals that are used by such services must be ...
Keywords:
access control, location privacy, privacy
Title:
Towards a scalable model for location privacy
CCS:
Security and privacy
Abstract:
... new vulnerabilities that can be exploited to cause security and privacy breaches. Specifically, location data of individuals that are used by ... used by such services must be protected from security and privacy breaches. Such services will require new models for expressing privacy preferences for location data and mechanisms for enforcing them. We identify the factors on which location privacy depends and propose a scalable model for expressing privacy that can be used for LBS and other applications where the privacy
References:
B. Gedik and L. Liu. Location Privacy in Mobile Systems: A Personalized Anonymization Model. Proceedings of the 25th International Conference on Distributed Computing Systems, pages 620--629, 2005.
M. F. Mokbel, C. Y. Chow, and W. G. Aref. The New Casper: Query Processing for Location Services without Compromising Privacy. Proceedings of the 32nd International Conference on Very Large Data Bases, pages 763--774, September 2006.
L. Palen and P. Dourish. Unpacking "privacy" for a networked world. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 129--136, New York, NY, USA, 2003. ACM Press.
E. Snekkenes. Concepts for personal location privacy policies. In Proceedings of the 3rd ACM conference on Electronic Commerce, pages 48--57, New York, NY, USA, 2001. ACM Press.
Full Text:
... open up new vulnerabilities that can be exploited to cause security and privacy breaches. Specifically, location data of individuals that are used by such services must be protected from security and privacy breaches. Such services will require new models for expressing privacy preferences for location data and mechanisms for enforcing them. We identify the factors on which location privacy depends and propose a scalable model for expressing privacy ... that can be used for LBS and other applications where the privacy of location information must be protected. Categories and Subject Descriptors: K.6.5 [Management of ... to or using such services must be protected against security and privacy breaches. Models are needed that will allow individuals to express their privacy preferences, and technologies are needed to enforce them. Location-Based Services (LBS) ... into various categories based on the services they provide. Providing location privacy for all these different types of service may not make adequate ... the identity of the user as well as his/her exact location. Privacy protection becomes critical for such applications. The notion of privacy varies from one individual to another. One individual may be ... not want to do so. The key question in location privacy is who should have access to what location information and ... model that will allow different users to express their location privacy preferences and mechanisms for enforcing them. Moreover, for reasons of implementation, ... have identified some factors that we feel are important for location privacy. These factors form the basis of our location privacy model. We propose three different models that use these factors for expressing privacy preferences.
The models differ with respect to the computation requirements and the granularity with which privacy preferences can be expressed. Finally, we also discuss implementation issues pertaining ... privacy and proposes techniques for quantifying them. Section 5 discusses how location privacy can be enforced. Section 6 concludes the paper and mentions some future work. 2. RELATED WORK. The IETF Geographic Location Privacy (GEO-PRIV) working group [3] addresses privacy and security issues pertaining to location information. They specify how location ...
... it does not address the issue of how to protect the privacy of the end user. Gruteser and Grunwald [5] propose a spatio-temporal cloaking ... granular location information that must be disclosed. New Casper uses a privacy-aware query processor to return a list of candidate query results ... approach still incurs high computation cost. Kido et al. propose decentralized privacy protection with Dummy Location [6]. The main idea is that the ... searching may breach the location privacy of other devices. In relation to privacy policy, several social studies [2, 8, 9] were conducted with regard to ... disclosure of private information. Palen et al. [8] found that privacy management is a dynamic response to circumstances rather than a ... The results of these social studies are used in our proposed privacy preference model. Perhaps the closest work related to this paper is Einar Snekkenes's location privacy model [10]. Snekkenes identifies five components that play a major ... time, and velocity, and proposes a lattice-based approach for location privacy. Since the complete lattice containing all information pertaining to the domains ... the preference on certain circumstances. This motivates us to develop a privacy preference model with primitive requirements so that the full policy can ...
... that computes the spatial information and is responsible for respecting the privacy of the location of the requested object. The policy owner is the entity who decides the location privacy of the requested object. In this paper, we focus our attention ... and security of the individuals. The location provider should respect the privacy of the individual owner and provide information to the requester. ... structure helps determine the location granularity. A user, when specifying his privacy preference, can choose the level of granularity at which he wishes ... to respond to the query. The policy owner must provide his location privacy preferences. Location privacy depends on several factors (described in detail in Section 4). ... store information that specifies the details about location disclosure. Definition 1 [Location Privacy Preference]: The policy owner specifies the privacy preference as a set of tuples of the form < c, ... satisfy the location granularity and details that are specified in the privacy preference. Definition 2 [Privacy Preserving Location Response]: Let the context associated with the given query ... the location information associated with context c in the policy owner's privacy preference. Let loco be the most specific actual location of ... ⊆ locr. 4. FACTORS INFLUENCING LOCATION PRIVACY. In order to enforce location privacy, one needs to understand the factors that influence the ... the usage information may also play a role in location privacy. A user may be willing to disclose his location information ... vacation. Fourth, location itself plays an important role in location privacy.
... his location information when he is in the theater. What makes location privacy a complex problem is the fact that the factors mentioned above are really not independent. Instead, location privacy depends on the combination of these factors. For example, a ... the context of each query has to be matched against the privacy preference of the user, we need a mechanism to represent each ... we propose using the role of the requester for determining location privacy. We identify certain important roles for location privacy. Examples include close relatives, close friends, neighbors, co-workers, employers, adversaries, strangers, ... Time. The temporal attribute is also an important factor in location privacy. Time can also be represented in the form of a ... next level, we have working hours and non-working hours. Since location privacy is relatively less important during working hours, a value close to 1 is assigned. For non-working hours location privacy may be extremely important and a value close to 0 ...
... for each factor specified in the query context. The level of privacy preference is computed as follows: Lp = wi·vi + wu·vu + wt·vt + wl·vl. The granularity at ... which location information can be disclosed is a function of the privacy preference. Higher values of Lp correspond to specifying location information at finer granularity. The level of privacy, Lp, is an input to the blurring function. The blurring ... is a little different. Among all the factors that influence location privacy, the role of the requester is perhaps the most important. We ... with requester i and location m. The level of location privacy Lp is given by Lp = wu·pu_ij + wt·pt_ik + wl·pl_im. Here again, we use ... functionalities. Improper usage of location information may compromise the security and privacy of an individual. Moreover, a user must be allowed to ... circumstances. Towards this end, we investigate the factors influencing location privacy, suggest techniques for quantifying them, and propose different approaches for ... with respect to the storage requirements and the granularity of privacy preference. A lot of work remains to be done. First, we ...
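The weighted score Lp described above is just a weighted sum of per-factor values, each in [0, 1], for identity/role, usage, time, and location. A minimal sketch of that computation (the dictionary-based signature and factor names are illustrative assumptions, not the paper's notation):

```python
def privacy_level(weights, values):
    """Compute Lp = Σ_f w_f · v_f over matching factor keys, mirroring
    Lp = wi·vi + wu·vu + wt·vt + wl·vl for the four factors."""
    assert weights.keys() == values.keys()
    return sum(weights[f] * values[f] for f in weights)
```

For instance, with weights {identity: 0.4, usage: 0.2, time: 0.2, location: 0.2} and values {1.0, 0.5, 1.0, 0.0}, the score comes out to 0.7, and a higher Lp would let the blurring function disclose the location at a finer granularity.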
... by. wg/geopriv. html, 2003. [4] B. Gedik and L. Liu. Location Privacy in Mobile Systems: A Personalized Anonymization Model. Proceedings of the 25th ... W. G. Aref. The New Casper: Query Processing for Location Services without Compromising Privacy. Proceedings of the 32nd International Conference on Very Large Data Bases, ... pages 763--774, September 2006. [8] L. Palen and P. Dourish. Unpacking "privacy" for a networked world. In Proceedings of the SIGCHI Conference on ...
18
December 1993
Communications of the ACM: Volume 36 Issue 12, Dec. 1993
Publisher: ACM
Bibliometrics:
Citation Count: 30
Downloads (6 Weeks): 3, Downloads (12 Months): 50, Downloads (Overall): 2,079
Full text available:
PDF
Keywords:
information privacy
CCS:
Privacy policies
Security and privacy
Human and societal aspects of security and privacy
Title:
Privacy policies and practices: inside the organizational maze
Primary CCS:
Privacy policies
Security and privacy
Human and societal aspects of security and privacy
References:
Association for Computing Machinery (ACM). ACM urges government action to protect privacy. ACMember-Net (a supplement to Commun. ACM) 34, 7 (July 1991), 1, 9.
Bennett, C.J. Regulating Privacy: Data Protection and Public Policy in Europe and the United States. Cornell University Press, Ithaca, N.Y., 1992.
Equifax, Inc. Harris-Equifax Consumer Privacy Survey 1991. National opinion survey conducted by Equifax, Inc., L. Harris and Assoc. and A.F. Westin, 1991.
Flaherty, D. Protecting Privacy in Surveillance Societies. University of North Carolina Press, Chapel Hill, N.C., 1989.
Fried, C. Privacy [a moral analysis]. In Philosophical Dimensions of Privacy, F.D. Schoeman, Ed., Cambridge University Press, Cambridge, England, 1984.
Katz, J. and Tassone, A.R. Public opinion trends: Privacy and information technology. Public Opinion Q. 54 (Spring 1990), 125-143.
Rule, J.B., McAdam, D., Stearns, L., and Uglow, D. The Politics of Privacy: Planning for Personal Data Systems as Powerful Technologies. Elsevier, New York, 1980.
Schoeman, F. Privacy: Philosophical dimensions of the literature. In Philosophical Dimensions of Privacy, F.D. Schoeman, Ed., Cambridge University Press, Cambridge, England, 1984.
Warren, S.D. and Brandeis, L.D. The right to privacy. Harvard Law Rev. 4 (1890), 193-220.
Westin, A.F. Privacy and Freedom. Atheneum, New York, 1967.
Westin, A.F. Consumer privacy protection: Ten predictions. Mobius (Feb. 1992), 5-11.
Woodman, R.W., Ganster, D.C., Adams, J., McCuddy, M., Tolchinsky, P. and Fromkin, H. A survey of employee perceptions of information privacy in organizations. Acad. Manage. J. 25, 3 (1982), 647-663.
Full Text:
Privacy policies and practices: inside the organizational maze. Business Computing. H. Jeff Smith ... information? This article establishes a new foundation for examining information privacy issues by reviewing the process through which information privacy policies and practices are created in corporations and then reviewing corporate approaches to information privacy in light of implied societal expectations. The findings of this ... executives rarely take a proactive stance in creating information privacy policies in this ambiguous environment. They wait until some external ... the organizational mores surrounding information use. A growing concern. Information privacy may prove to be the most important ethical issue of ... life have resulted in public concern about threats to personal privacy reaching an all-time high. In 1991 surveys of American public ... were "very concerned" or "somewhat concerned" about threats to personal privacy, as compared to a figure of 64% in 1978 [9]. ... [8]. This suggests a disparity between society's concerns about privacy and industry's response. This disparity is even more alarming in light of the increasing potential for information privacy intrusions in today's society. This is due to several ...
... basis. Flaherty [10] and Bennett [3] considered the role of privacy protection boards in several countries. A few other authors have ... [35]), little of the recent effort has been directed toward privacy issues in the commercial sector. As businesses increasingly use personal information to gain strategic advantage, new privacy concerns naturally emerge. Given the current state of research on information privacy, an exploratory study was conducted. It addressed ... medical information, information about one's purchases)? Definition of Personal Information Privacy ... Various definitions have been proposed for the concept of privacy. One of the earliest, "the right to be let ... their weaknesses. Rule et al. [26] distinguish between aesthetic privacy ("restriction of personal information as an end in itself") and ... other end"). This study accepts the broader concept of aesthetic privacy and modifies Schoeman's [27] preferred definition ("a [condition] of limited access to a person") by noting that, although privacy in general encompasses both information about a person ... physical access to a person, the immediate concern is "information privacy." Thus, in this study, information privacy is defined as follows: A condition of limited access to ... identifiable information about individuals. If, as Fried [12] proposes, privacy is "moral capital" which is expended in forming relationships, then ... society has an important incentive to provide for information privacy. Indeed, most cultures do value intimacy in ... a society to function. Thus, any definition of information privacy must be set in the context of a workable socie- ... effective functioning of society and 2) individuals' right to privacy ... in which "reasonable" steps are taken to protect that privacy.
Such an objective would likely be embraced in the ... survey [8] 75% of credit granters claimed that consumers' privacy rights in credit reporting are adequately protected by law ... areas. Interviews addressed both the process of crafting information privacy policies as well as the current policies and practices ...
... personal information developing within corporations? The cycle. The information privacy policymaking cycle at the sites was a consistent one: 1) ... degree on individuals' own interpretations of the concept of "privacy"; the specific definition of "information privacy" was not provided to the interviewees or survey ...
... insurance organizations was similar. HealthIns A had an omnibus privacy policy statement from 1980, and HealthIns B had one ...
... HealthIns B, it appeared that its reconsideration of the privacy policies would be more difficult. The objective of both organizations ...
... some marketing practices. However, no cohesive statement regarding information privacy had been formulated or debated. External threat. Until, that is, another credit card issuer began to advertise a "privacy protection plan" for its cards, under which, it promised, ... assigned a vice president the responsibility for creating a privacy policy for the organization. At the time of the ... drafts, in which he detailed a "utopian" view of privacy, were rejected by many of his colleagues. He said: ... threat, so they had not yet entered a period of privacy policymaking. Even so, the "drift" stage had been under way ... an outside vendor, had resulted in a large number of privacy complaints from customers, and new practices had been embraced (but ...
... aren't their own . . . There definitely are some privacy implications here. No attempts to create new, cohesive policies for ... in particular--was receiving little media or legislative attention regarding privacy at the time of the study. Thus, the "drift" stage ... Thus, senior management's attention was devoted to information privacy only in a reactive sense: as a response to the ... organization and a leadership vacuum with respect to information privacy. Emotional Dissonance. Because the corporations drifted until they ... banks and CredCard, utilized a strategic definition of privacy and expressed a disbelief that anyone could object to ...
... In another situation, a marketer at a bank defined privacy in an unusual way: We do not violate people's ... interested . . .' So, we don't really violate their privacy. We don't say how we found out about them. ... executives here at the bank if we are violating customers' privacy, they will probably say "no." But some of the ...
... makers. And, even then, few executives wanted to be information privacy leaders in their industry. Followers, not leaders. As noted at ... executives in a recent opinion survey [8] wanted to adopt privacy policies only after there was a clear consensus in their ... to have their company be viewed as a leader in privacy issues. One said: I have come to think that it ... tive than most other health insurance companies in this regard [privacy], anyway, so why not get out in front ... we can convince people that we are doing a better privacy job, and if people who are buying insurance care about ... more common, however, was a desire to avoid leadership in privacy issues. One bank executive stated pointedly: Why be a leader in privacy? You will only lose money in the short run, ... true, in most of the study's sample, for the information privacy policies at a corporate level. Most executives are comfortable ... negative implications for the effective management of information privacy
... the wandering process is that it leaves large holes in privacy policies and leads to gaps between those policies and actual ... the "societal expectations" for handling personal information, privacy advocates' writings, the U.S. federal law, professional codes, and ... of "Fair Information Practices" [31] were assessed; interviews with privacy and consumer advocates were conducted; and focus groups of ... judgment. In drawing conclusions regarding the various corporate approaches to privacy in each of these areas, it is helpful to distinguish ... ACM members shall "always consider the principle of the individual's privacy and seek . . . to minimize the data collected ...
... objectives, the banks had no policy statements regarding the privacy implications of this additional collection of information. Information used ... the banks' and CredCard's actions was not always to protect privacy, per se. Rather, they seemed equally or more concerned ...
... these items were encountered. Internal access. Of great concern to privacy advocates and to consumers in the focus groups was improper ... data" as a responsibility of members in protecting individuals' privacy. However, the interpretation of which individuals have, and ...
... they then view the process as offensive. Concerns about privacy are then exacerbated. The use of automated vs. customized ... A and one at HealthIns B) saw these items as privacy concerns, but no specific policies addressing these items had been ... reiterated in the preceding discussion, societal concerns, as expressed by privacy advocates, in federal law, and in consumers' viewpoints, are ... carries some exposures that have nothing to do with information privacy, such as loss of competitive advantage. Likewise, deliberate errors ... corporate attention that is not linked directly to information privacy. While it is unreasonable to draw significant infer- ... corporations have directed extensive attention to addressing information privacy concerns, per se. It should also be noted from Table ...
... of gaps between policy and practice, of leadership vacuums on privacy issues, and of a corporate climate that inhibits discussions of privacy concerns--until an external threat is perceived. Since this ... the HealthIns B executives seemed more willing to take a privacy leadership role in the future than were the executives at ... extensive concern to all who value the "moral capital" of privacy. It is incorrect to conclude, though, that this situation ... responded quickly and with some force to confront the privacy concerns. They rarely took overt action of their own accord, ... were not, as a group, acting as moral leaders on privacy issues within their corporate "mazes." More generally, many of the ... be avoided if executives took strong, proactive roles in identifying privacy concerns in their organizations, crafting appropriate policies to address ... sort of external prod--will likely allow their companies' information privacy
... behaviors. Consumers could, for example, mount boycotts to protest privacy violations. But Stone [30] notes four assumptions that must ... in situations in which individual consumers feel their information privacy has been violated. Consumers often do not know how personal ... government should embrace a new role in improving the information privacy environment. A reasonable path. A solution to the information privacy problem can best be effected, then, by increasing the scrutiny ... a sufficient incentive to executives to carefully consider their information privacy policies, at the same time yielding benefits to society as ... executives to begin taking an aggressive approach to information privacy concerns--in which their "voluntary" efforts yield acceptable solutions ... this plan is to increase the visibility of information privacy concerns and to ensure that the threat of negative economic ...
... mandate the responsibility for being a focal point in information privacy debates. Already in place in most European countries and Canada ... boards provide an ideal venue for providing leadership in information privacy issues. The board's composition should include a number of appointed ... well as a small staff. The board should sponsor information privacy studies, should ensure that legislators are made aware of existing ... tions as they craft their own approaches to information privacy issues. Enforcement. Of course, the board could not ... survey, few executives wished to be industry leaders in setting privacy policies, but many were willing to comply once boundaries were ... manner; consider consumer interests; recognize industry initiatives in privacy protection mechanisms; and provide a balanced viewpoint of privacy interests, industry initiatives, and overall economic issues of resource allocation ... GAAP, industry groups should develop sets of "generally accepted privacy principles (GAPP)" for their own industries. The board could offer ... industry-specific knowledge. The GAPP should address the specific information privacy issues of the industry. For example, banking GAPP would ...
... an organization's objectives" [11]. more proactive positions toward information privacy, or whether the threat of negative publicity and/or ... undoubtedly become more aware of their own organization's privacy approaches. As this happens, they should focus on the ... A proactive policy process should be the backbone of the privacy approach at corporations which utilize personal information, and the ... which employees can openly raise their concerns regarding information privacy. Just as many corporations ... and by supporting industrywide initiatives that lead to better privacy protection. It should be noted, though, that this response is ... to embrace real and substantive changes to protect individuals' privacy, or should the "least privacy-observant firms" in certain industries refuse to adhere to ... power, even if this results in a societal "cost of privacy" in an economic sense. Should this occur, it would ... signal the failure of voluntarism as a remedy for privacy problems. Far better, of course, would be the proposed ... executives take some of the lead in addressing privacy issues. Conclusion. The U.S. saw more than a ... on these applications, and questions about the protection of personal privacy are being asked. The auto industry's responses to the problem ... addition to these items, the survey contained some "baseline" privacy questions taken from existing public opinion surveys as well as ...
approach to personal privacy as it handles sensitive information about your clients ... to ensure that the societal objective--a fair trade-off between information privacy and efficient, effective societal functions--is achieved. Our path of ... Computing Machinery (ACM). ACM urges government action to protect privacy. ACMember-Net (a supplement to Commun. ACM) 34, 7 ... House of Representatives, Apr. 10, 1991. 3. Bennett, C.J. Regulating Privacy: Data Protection and Public Policy in Europe and the ... and A.F. Westin, 1990. 9. Equifax, Inc. Harris-Equifax Consumer Privacy Survey 1991. National opinion survey conducted by Equifax, Inc., ... and Assoc. and A.F. Westin, 1991. 10. Flaherty, D. Protecting Privacy in Surveillance Societies. University of North Carolina Press, Chapel ... Stakeholder Approach. Pitman, Boston, Mass., 1984. 12. Fried, C. Privacy [a moral analysis]. In Philosophical Dimensions of Privacy. F.D. Schoeman, Ed., Cambridge University Press, Cambridge, England, ... 1989. 16. Katz, J. and Tassone, A.R. Public opinion trends: Privacy and information technology. Public Opinion Q. 54 (Spring 1990), ...
... McAdam, D., Stearns, L., and Uglow, D. The Politics of Privacy: Planning for Personal Data Systems as Powerful Technologies. Elsevier, New York, 1980. 27. Schoeman, F. Privacy: Philosophical dimensions of the literature. In Philosophical Dimensions of Privacy, F.D. Schoeman, Ed., Cambridge University Press, Cambridge, England, ... 1973. 32. Warren, S.D. and Brandeis, L.D. The right to privacy. Harvard Law Rev. 4 (1890), 193-220. 33. Westin, A.F. Privacy and Freedom. Atheneum, New York, 1967. 34. Westin, A.F. Consumer privacy protection: Ten predictions. Mobius (Feb. 1992), 5-11. 35. Woodman, ... Fromkin, H. A survey of employee perceptions of information privacy in organizations. Acad. Manage. J. 25, 3 (1982), 647-663. CR Categories and Subject Descriptors: K.4.1 [Computers and Society]: Public Policy Issues--Privacy. General Terms: Security. Additional Key Words and Phrases: Information privacy. About the Author: H. JEFF SMITH is an assistant professor ...
19
November 2009
CIKM '09: Proceedings of the 18th ACM conference on Information and knowledge management
Publisher: ACM
Bibliometrics:
Citation Count: 4
Downloads (6 Weeks): 1, Downloads (12 Months): 5, Downloads (Overall): 153
Full text available:
PDF
A communication trace is a detailed record of the communication between two entities. Communication traces are vital for research in computer networks and the study of network protocols in various domains, but their release is severely constrained by privacy and security concerns. In this paper, we propose a framework in which ...
Keywords:
privacy, utility
CCS:
Privacy policies
Security and privacy
Human and societal aspects of security and privacy
Keywords:
privacy
Abstract:
... in various domains, but their release is severely constrained by privacy and security concerns. In this paper, we propose a framework ...
References:
A. Machanavajjhala, D. Kifer, J. Gehrke, and M. Venkitasubramaniam. L-diversity: Privacy beyond k-anonymity. ACM Trans. Knowl. Discov. Data, 1(1):3, 2007.
J. Mirkovic. Privacy-safe network trace sharing via secure queries. In NDA'08, pages 3--10. ACM, 2008.
A. Slagell and W. Yurcik. Sharing computer network logs for security and privacy: A motivation for new methodologies of anonymization. In SECOVAL, pages 80--89, 2005.
L. Sweeney. k-anonymity: a model for protecting privacy. Int. J. Uncertain. Fuzziness and KB Syst., 10(5):557--570, 2002.
M. Terrovitis, N. Mamoulis, and P. Kalnis. Privacy-preserving anonymization of set-valued data. VLDB, 1(1):115--125, 2008.
Full Text:
... protocols in various domains, but their release is severely constrained by privacy and security concerns. In this paper, we propose a framework in ... November 2-6, 2009, Hong Kong, China. Copyright 2009 ACM 978-1-60558-512-3/09/11 ...$10.00. ... by privacy and security concerns and the lack of available traces is a ...
... third component of our approach is the formal evaluation of privacy and utility. Because both the transformations and the utility requirements ... a precise, formal understanding of their impact on trace utility and privacy. In the remainder of the paper, we describe components of ...
... Terrovitis et al. have extended the definition of k-anonymity for the privacy-preserving publication of set-valued data containing multiple entries for the same entity [15]. Verykios et al. [16] considered the privacy of transactional data in the context of data mining where they wanted ... in [8] tries to achieve such a balance between utility and privacy but it is restricted to secure queries with aggregations. In [9], ...
... 2004. [7] A. Machanavajjhala, D. Kifer, J. Gehrke, and M. Venkitasubramaniam. L-diversity: Privacy beyond k-anonymity. ACM Trans. Knowl. Discov. Data, 1(1):3, 2007. [8] J. Mirkovic. Privacy-safe network trace sharing via secure queries. In NDA '08, pages ... Slagell and W. Yurcik. Sharing computer network logs for security and privacy: A motivation for new methodologies of anonymization. In SECOVAL, pages 80-89, 2005. [14] L. Sweeney. k-anonymity: a model for protecting privacy. Int. J. Uncertain. Fuzziness and KB Syst., 10(5):557-570, 2002. [15] M. Terrovitis, N. Mamoulis, and P. Kalnis. Privacy-preserving anonymization of set-valued data. VLDB, 1(1):115-125, 2008. [16] V. Verykios, A. ...
20
November 2012
IWGS '12: Proceedings of the 3rd ACM SIGSPATIAL International Workshop on GeoStreaming
Publisher: ACM
Bibliometrics:
Citation Count: 2
Downloads (6 Weeks): 5, Downloads (12 Months): 40, Downloads (Overall): 251
Full text available:
PDF
Location privacy and security of spatio-temporal data have come under high scrutiny in recent years. This has rekindled enormous research interest. So far, most of the research studies that attempt to address location privacy are based on the k-Anonymity privacy paradigm. In this paper, we propose a novel ...
Keywords:
differential privacy, location privacy, moving object privacy, stream privacy
CCS:
Privacy policies
Security and privacy
Human and societal aspects of security and privacy
Abstract:
Location privacy and security of spatio-temporal data has come under high scrutiny ... most of the research studies that attempt to address location privacy are based on the k-Anonymity privacy paradigm. In this paper, we propose a novel technique to ensure location privacy in stream and non-stream mobility data using differential privacy. We portray incoming stream or non-stream mobility data emanating from GPS-enabled devices as a differential privacy problem and rigorously define a spatio-temporal sensitivity function for a trajectory metric space. Privacy is achieved through path perturbation in both the space and ... achieve strong anonymity; we show that our approach provides stronger privacy even for a single moving mobile object, outliers or mobile ...
References:
C. A. Ardagna, M. Cremonini, E. Damiani, S. D. C. di Vimercati, and P. Samarati. Location privacy protection through obfuscation-based techniques. In DBSec'07, pages 47--60, 2007.
A. Blum, C. Dwork, and K. Nissim. Practical privacy: The sulq framework. In PODS'05, 2005.
A. Blum, K. Ligett, and A. Roth. A learning theory approach to non-interactive database privacy. STOC '08, pages 609--618, 2008.
M. Duckham and L. Kulik. A formal model of obfuscation and negotiation for location privacy. In Pervasive'05, pages 152--170, 2005.
C. Dwork. Differential privacy. In ICALP, 2006.
A. Friedman and A. Schuster. Data mining with differential privacy. KDD '10, pages 493--502, 2010.
S. R. Ganta, S. P. Kasiviswanathan, and A. Smith. Composition attacks and auxiliary information in data privacy. KDD '08, pages 265--273, 2008.
B. Gedik and L. Liu. Location privacy in mobile systems: A personalized anonymization model. ICDCS '05, pages 620--629, 2005.
B. Hoh and M. Gruteser. Protecting location privacy through path confusion. In SECURECOMM '05, 2005.
L. Liu. Privacy and location anonymization in location-based services. SIGSPATIAL Special, 2009.
A. Machanavajjhala, D. Kifer, J. Abowd, J. Gehrke, and L. Vilhuber. Privacy: Theory meets practice on the map. In ICDE'08, pages 277--286, 2008.
A. Machanavajjhala, D. Kifer, J. Gehrke, and M. Venkitasubramaniam. L-diversity: Privacy beyond k-anonymity. ACM Trans. Knowl. Discov. Data, 2007.
F. McSherry. Privacy integrated queries. In SIGMOD '09, 2009.
F. McSherry and K. Talwar. Mechanism design via differential privacy. FOCS '07, 2007.
M. F. Mokbel. Query processing for location services without compromising privacy. In VLDB'06, 2006.
L. Sweeney. k-anonymity: A model for protecting privacy. International Journal of Uncertainty, Fuzziness and Knowledge-based Systems, 2002.
M. Terrovitis and N. Mamoulis. Privacy preservation in the publication of trajectories. In MDM'08, 2008.
L. Wasserman and S. Zhou. A statistical framework for differential privacy. JASA, pages 375--389, 2009.
M. L. Yiu, C. S. Jensen, X. Huang, and H. Lu. Spacetwist: Managing the trade-offs among location privacy, query performance, and query accuracy in mobile services. In ICDE, 2008.
Full Text:
... Objects. Roland Assam, RWTH Aachen University, Germany, assam@cs.rwth-aachen.de; Marwan Hassani, RWTH Aachen University, Germany, hassani@cs.rwth-aachen.de; Thomas Seidl, RWTH Aachen University, Germany, seidl@cs.rwth-aachen.de. ABSTRACT: Location privacy and security of spatio-temporal data has come under high scrutiny in ... far, most of the research studies that attempt to address location privacy are based on the k-Anonymity privacy paradigm. In this paper, we propose a novel technique to ensure location privacy in stream and non-stream mobility data using differential privacy. We portray incoming stream or non-stream mobility data emanating from ... rigorously define a spatio-temporal sensitivity function for a trajectory metric space. Privacy is achieved through path perturbation in both the space and time ... achieve strong anonymity; we show that our approach provides stronger privacy even for a single moving mobile object, outliers or mobile objects ... ; K.4.1 [Computers and Society]: Public Policy Issues--Privacy. General Terms: Algorithm, Security. Keywords: Differential Privacy, Location Privacy, Stream Privacy, Moving Object Privacy. 1. INTRODUCTION. Although the mainstream adoption of GPS ... the k-Anonymity [30] and l-Diversity [21] privacy definitions. The aforementioned privacy definitions require at least two mobile objects to achieve anonymity. In ... exposed to background knowledge attacks, compositional [13] and other attacks. Privacy, especially in terms of location or mobility, cannot afford to ... context aware computing, which is seen as the worst threat to privacy. As a result of this, in this paper, we employ a privacy paradigm called differential privacy [10] and introduce a new notion ... add more semantic location context to the differential private results.
Providing privacy with a very strong privacy paradigm like differential privacy in context aware applications will be beneficial to a broad spectrum ... To the best of our knowledge, this is the first location privacy work that utilizes differential private results to provide a context aware location privacy. Motivation Example: This paper focuses only on location privacy of moving objects using GPS technology. Let's assume Ann is at ... Although k-Anonymity has provided lots of solutions to tackle location privacy, it will fail to protect Ann. How can one protect ... or any individual in a sparsely populated area using a strong privacy definition? This is the first motivation of this paper. However, ...
... k-Anonymity and which is resistant to background knowledge) called differential privacy to protect multiple users. In both cases, differential privacy is achieved in a context and non-context aware manner. As a summary, ... has two main goals. These include the use of differential privacy to ensure non-context aware privacy and, secondly, the utilization of differential private outputs to ensure context aware location privacy for outliers or multiple moving objects. We should note that the second goal is not part of differential privacy. The main rationale of the second goal is to ... to achieve very good data utility, thus ensuring a trade-off between privacy and data utility. 1.1 Our Contributions. Many research studies dealing with LBS ... [33], [19] or path obfuscation [9], [2], [15] to achieve privacy. Unlike previous works, we use differential privacy instead of k-Anonymity to guarantee the privacy of moving objects. Although the theoretical strength of differential privacy has been highly praised, it is quite difficult and challenging to ... this challenge, and provide a differential private solution for location privacy using the Laplace noise. In addition, we present a technique that accomplishes context aware location privacy by using differential private outputs for moving objects. Specifically, we propose a novel technique to enforce differential privacy in trajectory data streams by perturbing traces of a given trajectory ... before sending them to a MOD or LBS. To achieve differential privacy, we first determine a more accurate measurement of an object's ... sensitivity is then added to each Running Window to achieve differential privacy. Furthermore, we introduce a new notion of Nearest Neighbor Anchor ...
Neighbor Anchor Resource (NNAR), which is needed to achieve context aware location privacy. • Using real and synthetic datasets, we show that our technique ... relevant related works. In Section 2, some basic concepts of differential privacy are explained. Section 3 discusses how Laplace noise is used ... NNAR is introduced and utilized to accomplish context aware location privacy. Section 5 discusses the experimental results. In Section 6, a ... setup point of view; the trusted server of the former guarantees privacy through anonymization while our trusted server provides privacy through trace perturbation. While another technique called SpaceTwist [35] uses an anchor ...
... the notion of moving average and, secondly, we utilize the differential privacy paradigm, while the approach in [1] is based on (k, δ)-anonymity. Differential Privacy: Fundamental theories of differential privacy are provided in [10], [23], [6] and [5]. We also employ ... a differential private classification technique to generalize data. [20] utilized differential privacy to track commuters' patterns. Differential private frequent item and time-series ... trajectory path. Figure 1: Differential private data interface. 2.2 Basics of Differential Privacy. Differential privacy is a privacy paradigm proposed by Dwork [10] that ensures privacy through data perturbation. Differential privacy is based on the core principle that for any two datasets ... query results by a randomized mechanism; if the randomized mechanism obeys differential privacy, the ratio of the probability of the output query results ... providing privacy. This is formally given as follows. Definition 1 (ε-differential privacy [11]): A randomization mechanism A(x) provides ε-differential privacy if for ... dataset, there would be negligible change at the output, thus guaranteeing privacy. ε is the privacy parameter, called the privacy budget or privacy level. When ε is less than one, then exp(ε) ≈ 1 + ε. Sensitivity: In differential privacy, sensitivity is very critical during the process of noise derivation. ...
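Definition 1 is typically realized with the Laplace mechanism, which the excerpt goes on to use: noise with scale sensitivity/ε is added to the true answer. A minimal sketch, assuming a numeric query; the function names and the inverse-CDF sampler are ours, not the paper's implementation:

```python
# Minimal sketch (our illustration, not the paper's code) of the Laplace
# mechanism: a query with sensitivity S, answered with added
# Laplace(S / epsilon) noise, satisfies epsilon-differential privacy.
import math
import random

def laplace_noise(scale, rng):
    # Inverse-CDF sampling of the Laplace(0, scale) distribution.
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    # epsilon-DP answer: true value plus noise scaled to sensitivity/epsilon.
    rng = rng if rng is not None else random.Random()
    return true_answer + laplace_noise(sensitivity / epsilon, rng)

# A counting query has sensitivity 1: adding or removing one record
# changes the count by at most 1.
noisy_count = laplace_mechanism(true_answer=42.0, sensitivity=1.0, epsilon=0.5)
```

Smaller ε means larger noise and stronger privacy; in the paper's setting, the spatio-temporal sensitivity function plays the role of `sensitivity` when perturbing trajectory traces.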
... which is specifically tailored for a trajectory metric space.

Noise Addition: Differential privacy is achieved by adding noise to data. Three types of noise ... Sequential composition is exhibited when a sequence of computations provides differential privacy in isolation. The final privacy guarantee is said to be the sum of each ε-differential privacy. On the other hand, parallel composition occurs when the input ... sets, independent of the original data. In this case, the final privacy from such a sequence of computations depends on the worst computation ... [10], [11] of differential privacy explained how to achieve differential privacy for predicate outputs (i.e. 0 or 1). However, many real ... applications have more complex outputs. Hence, in order to achieve differential privacy for trajectories, trajectory data needs to be intensively analyzed, re-defined and ...
Figure 2: Differential privacy in each Running Window.

Definition 5. (Problem Definition): Assume that an outlier ... to produce a perturbed trace HTrj, such that the ε-differential privacy condition is fulfilled. The perturbed trace HTrj should then be sent ... location. Section 4 provides a detailed description of NNAR.

3.3 Linking Differential Privacy to Trajectory
Overview
The difficulties of practically implementing differential privacy in other ... function that captures changes in the trajectory metric space. Our privacy setting is illustrated in Figure 2. Naively adding Laplace noise within ...
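The excerpt perturbs moving averages of the trace rather than raw GPS points. A minimal trailing moving average over a single coordinate stream might look like this; the window size of 3 is an illustrative assumption, not the paper's value:

```python
def moving_average(values, window=3):
    """Trailing moving average over one coordinate stream; each
    output averages up to `window` of the most recent raw samples."""
    out = []
    for i in range(len(values)):
        lo = max(0, i - window + 1)          # clamp at stream start
        out.append(sum(values[lo:i + 1]) / (i + 1 - lo))
    return out

moving_average([1.0, 2.0, 3.0, 4.0])  # [1.0, 1.5, 2.0, 3.0]
```

Averaging first bounds the influence of any single raw point, which is what makes the subsequent Laplace perturbation tractable in the trajectory metric space.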
... algorithm. The algorithm takes in the following parameters as inputs: the privacy level ε, the dimensions d to be perturbed (space and ... averages HTrj are equivalent to differential private high precision traces.

Analysis of Privacy Guarantee: All noisy high precision traces emanating from the trusted ... 1 states that the addition of Laplace noise guarantees differential privacy. Also, Line 4 is performed only once for a given ... then according to Theorem 1, Line 4 guarantees ε-differential privacy. However, because spatio-temporal data contains three dimensions, namely the X-position, ... budget needs to be carefully managed to control the cost of privacy. Using the Sequential Composition [22] described in Section 2, the total cost of privacy in a Running Window to perturb the different dimensions is ε·|D|. ... other, it implies, following the principle of Parallel Composition [22], the privacy budget does not need to be shared across Running Windows. Hence ...
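The budget accounting above can be sketched as follows. `perturb_window` is a hypothetical name, and the even per-dimension split and unit sensitivity are assumptions for illustration, not the paper's Algorithm 1:

```python
import numpy as np

def perturb_window(points, epsilon_total, sensitivity=1.0, rng=None):
    """Perturb one Running Window of (x, y, t) rows. By sequential
    composition the window budget is split across the |D| perturbed
    dimensions; by parallel composition, disjoint Running Windows
    need not share budget with one another."""
    rng = rng or np.random.default_rng()
    pts = np.asarray(points, dtype=float)
    eps_per_dim = epsilon_total / pts.shape[1]   # |D| = number of dims
    scale = sensitivity / eps_per_dim
    return pts + rng.laplace(0.0, scale, size=pts.shape)
```

Under this split, each dimension consumes ε/|D|, so the whole window stays within the overall budget ε provided by the data miner.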
... addition by the Laplace mechanism is ε·|D|-differentially private. Therefore, if an overall privacy budget ε is provided by the data miner, for ε = ... 1 is ε-differentially private.

Algorithm 1: Differential Private Trace Perturbation
Input: Dataset T1, privacy budget ε, number of dimensions d, number of raw GPS points ...

It could optionally be used to immensely enhance data utility while preserving privacy.

4.1 Context Aware Location Privacy
Context Aware computing motivations were described ... and Section 4 as a whole is not part of differential privacy. The main motive of this section is to improve ... intends to analyze this data for trend analysis without hurting Ann's privacy, the miner will conclude that Ann went to (or might ... to a restaurant without revealing which restaurant. However, on one hand, Ann's privacy has been preserved and little knowledge could be gained from the ... which could be gained by the data miner without compromising Ann's privacy. Moreover, the result might even become completely meaningless if a ... the context of their locations is an intrusion to their privacy. In contrast, there are other users who would like to ... of) the context of their locations which do not hurt their privacy. As a result of this, NNAR is user-driven in ... order to put a user in the driver's seat of her privacy. The user is given the privilege to specify multiple categories ...
... previous works as NWA and PPC, respectively. In the k-Anonymity privacy paradigm, k denotes the number of indistinguishable objects. We should note ... throughout this section, our technique, which is based on differential privacy, does not compare k (from k-Anonymity) to ε (from differential privacy). Instead, in order to demonstrate that our technique preserves the privacy of outliers, we compare and highlight from time to time ... location traces was used.

5.2 Quantifying User's Privacy
We utilized two location privacy metrics to analyze the privacy obtained by a user during perturbation. ...
... through our randomized mechanism with low ε values, very strong privacy is guaranteed.

5.3 Quality and Utility of Perturbed Trace
Range Query Distortion: ... [1] at Section 4C with k = 140. We used different privacy levels ε and a radius ranging from 300 to 4000. Figure ... as ε increases, the range query distortion decreases. That is, as the privacy increases (low ε) more uncertainty is injected to prevent an adversary ...
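The range-query experiment can be sketched as below. The paper's exact distortion formula is not visible in this excerpt, so the relative-difference metric and function names here are assumptions:

```python
def range_query(points, center, radius):
    """Count trace points within `radius` of `center` (Euclidean)."""
    cx, cy = center
    return sum(1 for (x, y) in points
               if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2)

def query_distortion(original, perturbed, center, radius):
    """Relative difference between the range-query answers on the
    original and perturbed traces; an assumed formulation, since the
    paper's metric is elided in this excerpt."""
    q0 = range_query(original, center, radius)
    q1 = range_query(perturbed, center, radius)
    return abs(q0 - q1) / max(q0, 1)
```

A larger ε injects less Laplace noise into the trace, so the perturbed answer tracks the original more closely and the distortion falls, matching the trend reported above.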
... notion of Nearest Neighbor Anchor Resource, which ensures context-aware location privacy by capturing and storing the location context of an object in ... an object in an MOD or LBS, yet guaranteeing strong privacy. We demonstrate empirically that our technique protects outliers. Differential private ... Ligett, and A. Roth. A learning theory approach to non-interactive database privacy. STOC '08, pages 609–618, 2008.
[7] J. Bond. An investigation on ... L. Kulik. A formal model of obfuscation and negotiation for location privacy. In Pervasive '05, pages 152–170, 2005.
[10] C. Dwork. Differential privacy. In ICALP, 2006.
[11] C. Dwork, F. McSherry, K. Nissim, ... pages 265–284, 2006.
[12] A. Friedman and A. Schuster. Data mining with differential privacy. KDD '10, pages 493–502, 2010.
[13] S. R. Ganta, S. ... '08, pages 265–273, 2008.
[14] B. Gedik and L. Liu. Location privacy in mobile systems: A personalized anonymization model. ICDCS '05, pages 620–629, 2005.
[15] ... kinematic positioning. In Journal of Global Positioning Systems, 2008.
[19] L. Liu. Privacy and location anonymization in location-based services. SIGSPATIAL Special, 2009.
[20] A. Machanavajjhala, D. Kifer, J. Abowd, J. Gehrke, and L. Vilhuber. Privacy: Theory meets practice on the map. In ICDE '08, pages 277–286, 2008.
[21] A. Machanavajjhala, D. Kifer, J. Gehrke, and M. Venkitasubramaniam. L-diversity: Privacy beyond k-anonymity. ACM Trans. Knowl. Discov. Data, 2007.
[22] F. McSherry. Privacy integrated queries. In SIGMOD '09, 2009.
[23] F. McSherry and K. Talwar. Mechanism design via differential privacy. FOCS '07, 2007.
[24] N. Mohammed, R. Chen, B. C. ... 2011.
[25] M. F. Mokbel. Query processing for location services without compromising privacy. In VLDB '06, 2006.
[26] M. E. Nergiz, M. Atzori, and ...
... Uncertainty, Fuzziness and Knowledge-based Systems, 2002.
[31] M. Terrovitis and N. Mamoulis. Privacy preservation in the publication of trajectories. In MDM '08, 2008.
[32] Y.-H. Tsai, ... 2008.
[34] L. Wasserman and S. Zhou. A statistical framework for differential privacy. JASA, pages 375–389, 2009.
[35] M. L. Yiu, C. S. ...