Encoding Privacy: Sociotechnical Dynamics of Data Protection Compliance Work

How do developers shape data protection regulations when they are passed from the policy arena to technical teams for compliance? This study explores data protection compliance work (DPCW) as a sociotechnical process mediated by developers’ attitudes and experiences. We draw on 14 semi-structured interviews with individuals responsible for GDPR and/or CCPA compliance to examine how developers approach DPCW and the resulting implications for user privacy. We highlight three key ways in which developers can shape compliance: by creatively interpreting ambiguous regulatory requirements; by exploiting expectations of technical expertise and low accountability; and by reducing DPCW to a one-time project. We conclude by discussing the implications for both researchers and practitioners and by recommending how to conceptualize and conduct DPCW otherwise. This article adds specificity to understanding why and how developers’ attitudes and experiences affect data protection regulations in the field.


INTRODUCTION
In September 2019, a team of developers for a user-facing mobile app with approximately 500,000 daily active users1 worked with a product manager to implement new data protection and opt-out features to comply with the California Consumer Privacy Act (CCPA). Initially, they were excited to shape privacy safeguards for their users. However, there were many unresolved questions: What constitutes "selling" data? Should they roll these privacy features out to all users or only in California? How discoverable should they be? Should they audit opt-outs to confirm compliance or simply expect it to work? As they pored over the legal guidance from the privacy lawyers, they grew increasingly frustrated by shifting feedback, poor communication, and insufficient resources to instrument the features that they thought were necessary.
Soon, it became clear that the paramount objective for the company was to implement something-anything, really-within just a few weeks, before the law would take effect in early 2020, to show that the company made a good faith effort at compliance. Under pressure and frustrated by poor communication and corporate bureaucracy, the developers revised their scope, sacrificing features and usability to meet their deadline. What they launched looked very different from what they had planned just a few weeks earlier.
On one hand, this outcome is unsurprising, since prior research has found that quality software development practices may be disregarded in the face of resource constraints or market pressures [13]. However, the stakes for data protection laws are unique because they are implemented via technical features launched by liable companies-and the stakes are growing given the acceleration of regulatory action in recent years. In particular, both the European Union's General Data Protection Regulation (GDPR) and the CCPA, passed in 2016 and 2018, respectively, have played outsized roles in setting global standards for data privacy laws [5,34]. These standards have developed because both states and liable companies have adopted GDPR's and CCPA's principles as de facto regulatory approaches to achieve harmonization [3,32]. It is therefore important to understand how companies-and their workers-have responded to GDPR and CCPA through compliance practices that are likely to become increasingly standardized and globally widespread.
Thus, this study asks: How have developers shaped GDPR and CCPA when they were passed from the policy arena to technical teams for compliance? It pursues this question by building on previous research that approaches developer practices in general, and privacy work in particular, as a sociotechnical process informed by developers' attitudes and experiences. We focus specifically on this process as our empirical object and call it data protection compliance work (DPCW): the labor of translating data protection regulations from policy into code. As we demonstrate, encoding privacy laws is not a straightforward process of conversion, but rather a sociotechnically contingent system that brings together a broad network of actors whose relations are often mediated through developers.
Following a review of related literature about privacy work in software development and a description of methods, we summarize themes and representative quotes from 14 interviews with software developers and adjacent workers that we collectively label data technicians (see Section 3). These findings are organized around four guiding questions: How do developers approach compliance work? How is compliance work situated within organizations? How do developers make decisions about compliance work? What do developers think about the lasting effects of GDPR and CCPA?
This article expands upon prior research by describing developers' unique capacity to shape data protection compliance in three ways: by creatively interpreting ambiguous regulatory requirements; by exploiting expectations of technical expertise and low accountability; and by reducing DPCW to a one-time project. At the same time, the developers interviewed conceived of their impact and responsibility as contingent, ambivalent, and dependent upon networks of external actors. This discrepancy demonstrates the significance of these findings, which highlight the central role that developers play as mediators whose attitudes and experiences carry implications for data protection regulations in the field. More broadly, these findings demonstrate the importance of studying data protection compliance work (DPCW) as a sociotechnical phenomenon, which includes unpacking expectations of developer expertise and boundary work across fields, and imagining alternative promises for data protection and digital privacy.

STUDYING PRIVACY IN SOFTWARE DEVELOPMENT

2.1 Developers as Key Actors
This study builds on previous research that has investigated how developers approach privacy. This literature begins with the premise that privacy values can be embedded in technical design and development, a framework called Privacy by Design [4]. Many researchers have been inspired by this framework to explore the decisions developers make when designing and developing software, especially when implementing privacy features.
One challenge with Privacy by Design is "translating the general abstract notion and the meaning of informational privacy (or, in its European term, data protection) into concrete guidelines" [22]. This has opened up a line of research about how developers approach privacy work in software development because they are delegated responsibility to interpret and design privacy frameworks [22,43]. In particular, Greene and Shilton [16] argued that Privacy by Design "positions developers and mobile companies as ethical agents" and that, accordingly, "platforms emerge as de facto regulators" (p. 1643). This effectively positions platforms such as iOS and Android not only as sites of privacy mediation but also as privacy co-regulators themselves, especially because iOS and Android developers maintain different conceptualizations of privacy [16,35,47].
Given developers' role as ethical agents, several studies have explored how developers conceptualize and approach their relationship to privacy work. For example, Tahaei et al. [41] found that both computer science students and professional developers share similar attitudes toward privacy and security, including "hacker and attack mindsets", inexperience with security instrumentation, and a tendency to trust other developers' solutions while applying minimal scrutiny. In response, many developers turn to online forums for advice on approaching privacy work, which means that online forums serve as important spaces where developers deliberate ethics and values [35]. However, these spaces and the discourse and actions they cultivate are not neutral. For example, Tahaei et al. [47] analyzed privacy-related questions on Stack Overflow using both topic modeling and qualitative analysis and found that developers seeking answers to questions about privacy derived their answers from platforms (i.e., iOS and Android) or from the opinions of other individuals regardless of their expertise. In another study, Tahaei et al. [42] found that responses to questions on online forums can be biased toward particular privacy paradigms. Thus, developers are key actors in digital privacy not only through their software development practices but also by developing and sharing best practices beyond their teams and organizations.

Social and Institutional Factors
Collectively, these studies indicate that developers are often uncertain about how to tackle challenging privacy questions, and that they often rely on social networks and discourse to define privacy priorities and standards. But some exceptional developers are committed to privacy and step up to lead team decisions and culture. Tahaei et al. [40] characterized such individuals as privacy champions-defined as those who strongly care about advocating for privacy on their teams-and found that they play important roles in cultivating a team culture that prioritizes privacy in software development. They further found that privacy champions attempt to overcome prioritization conflicts, organizational ambiguity, and limited support through team-building practices such as informal discussions, promoting stakeholder communication, and developing documentation.
On the other hand, many developers' privacy decisions are also subject to influence from institutional sources. For example, developers tend to retain the default privacy settings on third-party services such as advertising networks [27]; meanwhile, platforms claim that developers are responsible for their own regulatory compliance-even though their configuration interfaces contain dark patterns that nudge developers toward "privacy-unfriendly defaults" [44]. However, developers are often not aware of the data collected by third-party tools due to a lack of resources, expertise, and time [2]. These findings are consequential for how and when developers think about privacy. For example, Li et al. [24] found that although developers often claim to care about privacy, they may carry a limited understanding of their own products' data collection practices, partly because of poor documentation. Instead, developers may be more likely to discuss privacy concerns in response to changing policies from platforms, app stores, or regulators than to develop new privacy features in their own products [24]. This may be because developers see a clear separation between their own code and third-party services, which they consider responsible for their own privacy practices [27], or because developers feel that additional privacy work is likely to incur more costs than benefits [25].

Implications for Data Protection Regulation Compliance
Understanding how developers approach privacy work is important because their decisions can carry material consequences for user privacy, especially when implementing GDPR and CCPA. Feng et al. [12] found that data protection features that meet the baseline standard for compliance often do not help users make informed decisions, and thus argued that compliance work should be enhanced by providing users with "meaningful privacy choices" that meet a higher standard of usability. Indeed, several studies have found that the types of privacy choices mandated by these regulations are placed in different locations on websites, some of which are difficult to find or use [19,20], and that both visual and UX design choices-such as link text, colors, iconography, and choices between banners and overlays-can affect user comprehension of and decisions about privacy choices [8,18,28,29,33]. Such studies have even led to regulatory changes and standards [7,21]. These criticisms also apply to data subject access requests, which allow users to access, modify, and delete their personal data collected by liable organizations. For example, Di Martino et al. [10] conducted audits that exposed how many large websites leaked personal information to unauthorized third parties after only minimal social engineering. Although developers are not always responsible for fulfilling requests, they are often implicated in setting up workflows to verify and execute them. Overall, while previous studies have explored how developers make privacy decisions in general, and how their technical and design decisions shape user outcomes in particular, they have not defined the specific responsibilities assigned to developers to comply with data protection regulations or how those responsibilities are situated within organizational contexts. These dynamics are important because they shape users' experiences of privacy online-in essence, to what extent the regulations fulfill their intended goals-which is increasingly important as new data protection regulations have been passed or introduced around the world. This study addresses that gap by evaluating how the communicative and team-based processes identified in the literature interact with GDPR and CCPA-and the implications of developers' decisions for what privacy regulations actually look like in practice.

METHODS
This study focuses primarily on developers because the literature has already demonstrated that they play a key role mediating between many stakeholders: they operationalize regulators' statutes, remain accountable to privacy lawyers, learn from and negotiate with platforms, and produce features for end-users [16]. However, this study also explores developers' experiences with data protection regulation compliance work (DPCW) as collaborative processes with diverse organizational actors, including lawyers, executives, and other technical workers. Thus, it departs from previous research, which has focused either on developers alone (excluding adjacent functional areas) or on the outputs of their work (i.e., code, features, and frameworks), by including workers from data, product, design, and operations teams.
We call this broader category of workers data technicians: workers who operationalize the key processes of data collection, administration, management, design, and analysis. Software developers are key data technicians, but they also collaborate or work alongside data analysts, designers, product managers, operations specialists, and others. Thus, data technicians are collectively responsible for managing the mediating data infrastructure between organizations (e.g., "data controllers" and "processors" as defined by GDPR) and individuals (e.g., "data subjects" or "consumers" as defined by GDPR and CCPA, respectively). This expansive approach to interviewing data technicians beyond developers alone was chosen in order to account for the diverse contributions of workers besides software developers. For example, in the course of complying with data privacy laws, data scientists recommend collecting or storing personal data with particular structures and attributes, and designers create particular iconography or design patterns to present to users. We included these roles in order to better understand how they contribute to DPCW alongside and in relation to software developers, who are often the only focus. However, in our study, participants largely described their compliance work in relation to decisions and processes led by software developers, so our findings below focus primarily on developers' attitudes and practices-but are enhanced by perspectives from the broader category of data technicians.

Recruitment
This study was conducted through semi-structured interviews with 14 data technicians who have been responsible for GDPR and/or CCPA implementation in a variety of organizational contexts. The key inclusion criterion was that participants should self-identify as having experience bearing some responsibility for an organization's compliance with GDPR or CCPA.
Participants were recruited using purposive sampling by drawing from personal and professional networks. Purposive sampling is a non-probability sampling technique that is most useful when researchers aim to account for specific perspectives without needing to generalize to an entire population [11]. It is appropriate in cases where the research question calls for specific perspectives or experiences that may not be adequately captured in probability samples. We adopted a purposive strategy in this study to account for recruitment challenges documented by previous research [45], which has led many qualitative studies about developers' privacy attitudes and practices to be based on small samples of 10-20 interviews [1,22,38-41,46].
Individuals were recruited based on diverse organizational experiences, including tech giants, startups, and non-profit organizations, and in different geographic regions. This allowed us to follow a maximum variation purposive sampling strategy to discern common themes and differences across experiences [11]. We posted calls for participation on LinkedIn and Facebook, which yielded three interviews, and reached out directly to 26 individuals, which yielded 11 additional interviews. We continued reaching out to individuals approximately every two weeks until saturation was reached. Saturation, or information redundancy, is a common measure to demonstrate rigor in qualitative research, but it is often undefined [17]. In this study, we operationalized a version of saturation called code saturation, or the point when no new codes (which we call themes to avoid confusion with computer code) emerge from additional data collection [23].
We faced the additional challenge of recruiting individuals with direct experience with DPCW, which is both uncommon and difficult to identify from the outside. In many organizations, compliance instrumentation may be a discrete project-or even a series of tasks within a sprint-rather than a full job description. Therefore, asking a potential participant if they have worked on privacy regulation compliance was open to interpretation and subject to recall from their project history from two (for CCPA) or four (for GDPR) years prior at the time of data collection. Even when a participant was eligible for the study, we found that they were sometimes reluctant to discuss sensitive privacy regulation compliance issues on the record. Participants voiced this concern for two reasons: privacy regulations are considered sensitive because individuals cannot speak for an entire company while potential complaints carry high fines, and because of the context of high-profile, expensive leaks from Facebook and ongoing controversies related to companies such as Twitter and Amazon. Therefore, other potential participants may have harbored fears of retribution or may have been bound by nondisclosure agreements in an industry marked by considerable asymmetry between workers and owners.

Ethics
This study's recruitment and interview procedure was approved by the University of Southern California's Institutional Review Board as study UP-21-01100. As part of the approved process, participants were provided with a PDF copy of the approved informed consent form before the interview began. Verbal consent was requested before proceeding with the interview and, if the participant agreed, with recording the audio and/or video.
Participants were offered the option for their responses to remain de-identified to avoid scrutiny of their work or negative implications for their current or prior employers and clients. Eleven out of 14 interviewees requested that their responses remain de-identified. We therefore decided to de-identify all examples and quotes to treat all participants equally. Individual companies and countries are not identified, precise job titles are not disclosed, and the gender-neutral "they" is used for all participants throughout this article. Additionally, since several participants asked to remain anonymous, the quotes we use to illustrate our points come primarily from a subset of participants. However, the experiences illustrated by the quotes were applicable to many participants, as we describe throughout the article.
Participants were notified in advance that they would be offered a digital gift card worth USD 25 (or the equivalent in their local currency) regardless of their willingness or ability to answer every question. Several participants declined the gift card, most often because of restrictions imposed by their employer; in one case, a participant asked for their payment to be donated to a privacy-oriented advocacy organization.

Interviews
The 14 interviews were scheduled for 60 minutes but lasted between 30 and 90 minutes. They were conducted between October 2021 and August 2022 on Zoom, except for one interview.1 In terms of geographic distribution, 9 participants primarily work in North America; 2 in Oceania; 2 in South Asia; and 1 in Europe. We aggregated by region to respect participants' confidentiality preferences and to minimize the risk that any individual participant could be identified. The over-representation of developers and of North America-based participants reflects two factors: our professional network and experiences, and participants' experiences being responsible for GDPR/CCPA compliance. Eleven of the 14 interviews were recorded and transcribed, yielding approximately 75,000 words of transcripts. The other three interviews were not recorded or transcribed, as requested by the participants. Instead, we took non-identifiable notes during the interviews. The transcripts and notes were analyzed using iterative open coding to identify common themes, exceptional experiences, and representative quotes.

Interview Guide
Interviews were semi-structured in order to explore participants' experiences and perspectives in depth. This was important because this study is not simply interested in answering questions about what developers do, but also about why compliance work is experienced in a particular way and how developers handle ambiguity and uncertainty. Thus, we drew from an interview guide and then probed further based on each participant's experience, recall, and interest.
We developed the interview guide (see Appendix A) based on exploratory research questions. We began by asking participants about their familiarity with GDPR and/or CCPA and asking them to recall a specific organizational and temporal context in which they were responsible for data protection compliance. This helped to validate that participants met our inclusion criteria and helped participants locate specific experiences. Next, we asked questions about how DPCW was distributed and implemented. To do this, we asked how responsibility was determined and negotiated across functional teams, about risk and uncertainty, and about tools, systems, and preparation. We repeated these questions for participants who were responsible for data protection compliance in multiple organizations (N = 2) or who recalled specific experiences with both GDPR and CCPA in the same organization (N = 3).
After that, we asked participants to reflect on their experience(s) with DPCW. We asked how their background informed their approach, about expectations of their expertise, and about the extent to which their company's compliance work would be surprising to different actors. Finally, we asked participants to reflect on their own relationship with data protection and digital privacy. This yielded several interesting insights and, in some cases, led to new or more precise reflections about DPCW.

Analysis
The interviews were analyzed using a constructivist grounded theory approach, in which data and theory are co-constructed by the researcher and participants. Grounded theory, in its original formulation by Glaser and Strauss [15], followed a positivist orientation to systematic and objective analysis, requiring the researcher to let meaning emanate from empirical data through iterative procedural rigor and with minimal interpretation or prior literature. By contrast, a constructivist approach, as advocated by Charmaz [6], embraces empirical rigor while also acknowledging that analysis is inherently interpretive; thus, she calls for bringing in the subjectivity of the researcher and situating analysis in prior literature. We embraced a constructivist approach because we understand our analytic object-DPCW-as a sociotechnical process that cannot be understood without acknowledging and interpreting social context. Thus, we followed Charmaz in conducting and transcribing interviews, iteratively coding themes using prior literature to correlate excerpts with concepts such as privacy champions, expertise, and uncertainty, and updating an analytic memo throughout the data collection process, which extended over the course of 11 months. The themes can be found in Appendix B.

FINDINGS
This section summarizes the findings from the 14 interviews, oriented around four questions: First, how did developers approach GDPR/CCPA compliance work? Second, how were developers' responsibilities situated within their organizational contexts? Third, what decisions did they make while doing compliance work? Fourth, what are the lasting effects of, and attitudes about, data protection regulations today?

How Did Developers Approach Compliance Work?
These questions were oriented toward discerning participants' attitudes toward data protection regulations and their understanding of the scope of the work.
4.1.1 Feeling Unprepared to Address Vague Regulations. Participants often expressed hesitation when describing the scope of GDPR and CCPA in detail. This was true even of several participants who were individually responsible for overseeing compliance work for a particular product or application. For example, P09, working at a large technology company, described GDPR compliance work in 2018 as such:

We were in a mad dash to ultimately be able to say we're in compliance. That was, and is in all my experiences, the principal concern-being able to say we are compliant. I don't know that anybody I've encountered-and certainly not I-has a concrete grasp of exactly what that means. (P09)

P06, who was solely responsible for GDPR compliance at a small company, provided a blunter response to a question about how prepared they felt to tackle GDPR: "Zero. I felt unprepared" (P06).
These two examples are representative of the experiences described by participants across different organizational contexts, which are also captured by the metaphors participants used to describe their experiences. For example, P06 compared DPCW to fulfilling accessibility requirements, but with "less clear guidelines", in which "a lot of people and a lot of companies don't want to implement them to the point they are recommended, or at all, until it becomes a legal issue". Similarly, P08 described DPCW as being "in the same bucket" as the US Health Insurance Portability and Accountability Act (HIPAA), but "less serious". The perception that data protection regulations are "less serious" than HIPAA was apparent in how participants reconciled their approaches to achieving compliance. Several participants used expressions such as "the spirit of GDPR/CCPA" to describe their aspirational level of compliance rather than trying to satisfy every component of the regulations.

4.1.2 Compliance Work as Risk Assessment Labor. The two exceptions among the participants-in that they felt well prepared for regulatory compliance work-were both working in operations roles supporting compliance efforts at very large technology companies. P07 "fell in" to privacy work after being pulled into a project to respond to a data breach, which provided the necessary experience to become familiar with the technical and legal nuances of compliance work: "The best preparation happens on the job. I didn't plan to work in privacy. The breach happened, and then I got into it". In addition, P10 described how they drew upon their background working in content moderation to make decisions about verifying and fulfilling data subject access requests:

In the end, it's all about risk assessment. Is giving this person their data, or removing an account, based on the information they've given us more likely to result in the most correct outcome? Or is it more likely to result in some random person getting someone's data? It's the same as trying to trust someone to accept the rules of whatever platform they're on and not harass someone in the future. You're just basically just assessing what you have in front of you. (P10)

In both cases, the participant identified prior experience with different types of risk assessment and management as a source of confidence in approaching DPCW. This contrasts with developers' feelings of unpreparedness and reluctance, which suggests that expertise in DPCW should perhaps be defined less by specific technical capabilities or proximity to data systems than by experience with managing risk.

How Is Compliance Work Situated Within Organizations?
4.2.1 High Autonomy. Participants on developer teams described high levels of autonomy with minimal oversight over their work. In several instances, developers were the first to bring GDPR or CCPA compliance requirements to the attention of their organization's leadership. P12 recalled how their autonomy came from a combination of pressure from engineering leadership and a lack of product strategy:

We had a lot of autonomy in general, and in particular with GDPR and CCPA. So this was a particular thing that engineering took upon itself to try and handle. We were responding to middle manager engineering pressure. The underlying mechanics were fairly technical in nature... So it was a hodgepodge between direction that we were getting from senior engineering leadership in the organization, and our own judgment and instincts about what the intention of the laws were. (P12)

This description is similar to the experiences articulated by other developers, including those in smaller organizations who received pressure from executive leadership instead of middle manager developers. In general, they described being saddled with-or perhaps entrusted with-conducting DPCW because of their technical expertise, even if they did not feel equipped to do so. For example, P06 described their experience instrumenting GDPR compliance as a contractor working for a large multinational company. They described how they were saddled with the responsibility of ensuring that the organization was prepared for the beginning of the enforcement period:

We were not really given clarity on what compliance was. We had a [marketing director]-she mostly just passed it off to [us] and had us kind of just figure it out... It involved Googling and seeing how to be compliant and using online resources and not necessarily getting legal interpretations of how to do it. (P06)

Since P06 was one of two contractors who each worked on separate projects for the same client, their work was not code reviewed by another developer, nor was it fully tested for quality assurance. Instead, they were independently responsible for both executing and checking their own work-all while acting as a contractor rather than a full-time employee. P06 ultimately ended their contract before the GDPR compliance work was complete, passing it back to the marketing director, who, they hoped, would find another contractor to pick up the remaining work.
Similarly, P04 described a state of confusion among developers who were left to determine their own standards for quality. They were consulting for about a dozen small to medium companies in different countries in 2017-18 as GDPR entered its enforcement period. They described how developers in different client organizations needed to solicit privacy lawyers for advice, and how they made solicitation decisions partly based on lawyers' different risk tolerances, with little or no input from other organizational leaders since GDPR compliance was deemed a technical issue.

4.2.2 Lack of Oversight. A related theme was a lack of oversight of developers' work. This was related to the high autonomy described in Section 4.2.1, but applies specifically to the extent to which developers' work was held accountable to actors in another functional area, such as a legal department. For example, P09 described how "it was ultimately up to us, the implementing team, to look at what was in the application, document all the [third-party services], and identify which ones were potentially at risk [for collecting personal data]". In addition, they recounted how the process of determining which categories of data collection were permissible without user consent according to the organization's "legitimate interests" (under the GDPR) was not verified by executives, compliance officers, or lawyers:

One of the main things that I remember is a statement that basically 'information that you collect that applies directly to the operation and function of your application isn't really subject to the same restrictions'. . . We creatively applied that in a few places where it was like, okay, I think this is operational data. You could probably argue it in a different direction, but we don't have an easy workaround here, and we don't think it's a flagrant violation. So we're going to go with it. We're just going to say this is operationally relevant data that we're collecting, it's not being used for tracking or profiling. We're just going to ignore it. (P09)

In sum, "we could have easily flaunted it entirely and nobody would have known; nobody was in a position to question us" (P09). This lack of oversight was corroborated by P01, P05, P07, and P13, who worked in different functional areas such as design and product. Each of them described providing suggestions or minimum requirements to developers, but then allowing developers to lead in developing a compliance strategy, even if it was "patched together and disorganized" on the back end (P01). P07 even described one instance where they were accosted by a developer: "There was a person who screamed at me. They were like, why are you wasting my time? I said I'm not wasting your time, I'm just telling you what you need to do" (P07). Ultimately, P07, a non-developer working in an operational role, did not have the authority to block the developer from launching their non-compliant feature. This example, while irregular, illustrates how developers' high autonomy can be upheld through institutional immunity from oversight by other functional teams.

4.2.3 Tenuous Relationships with Lawyers. Developers' high levels of autonomy and minimal oversight extended to their relationships with lawyers. In some cases, this was attributed to developers' reluctance. For example, P09, working at a large technology conglomerate, described how their team was hesitant to consult lawyers to reconcile questions about whether a particular third-party service was compliant:

Nobody up or down the chain knew exactly what the answers were. But there was generally a hesitation to take things to lawyers just because it usually ended up being more work. We would often get the outcome that we might have suspected but would prefer to avoid it in terms of the effort required or the implication. (P09)

In other cases, this tenuous relationship was attributed to lawyers' reluctance to prescribe specific instructions. For example, P08, working in a large company, had access to a lawyer and a full-time compliance officer to support GDPR and CCPA compliance, but they appeared to be more interested in upholding the "spirit of the law" in general, simply to avoid a large penalty:

It's like, "just make a good faith effort and that's good enough." That's kind of our legal take on it. . . There's a lot of gray area on what is the right thing to do. But I think the lawyer probably wrote one paragraph about it and then stopped caring about it. Because there's very little chance we're going to get sued or anything, and that's what the lawyer cares about. (P08)

In several other cases, participants described how legal expertise was unavailable to them, either by design or because of a lack of capacity. For example, P06 described their experience working with a small organization that was reluctant to consult any lawyers throughout the process:

The struggle was that nobody wanted to talk to a lawyer to give us any understanding of the things we should do. . . It was something that, uh, felt like it was actively being avoided. . . So I don't know how close to compliance we were, because I'm not a privacy expert and that was never what I was hired for. (P06)

Collectively, these examples demonstrate how developers' autonomy was upheld by constructing boundaries between technical and legal decisions and expertise, although these boundaries were drawn and negotiated differently across organizational contexts through socially and institutionally contingent distinctions.

4.3 How Did Developers Make Decisions About Compliance Work?
4.3.1 "Just a One-Time Thing". The specific regulatory features that participants focused on included implementing cookie consent notices, adding disclosures and configuration options to forms, updating privacy policies, purging data, investigating and reconfiguring third-party integrations, and updating data infrastructure. However, one of the most common themes across all interviews was a lack of auditing or follow-up to ensure high-quality compliance. This was an issue for several participants because they described facing major bugs while instrumenting initial compliance. For example, P14 described how EU visitor traffic doubled immediately after deploying a page redirect for GDPR, because new cookies were being created for existing users. However, the issue was not caught immediately, because a separate bug affecting a third-party analytics service was introduced at the same time, effectively masking the bug that inflated their traffic metrics. P08 also described a bug with cookie opt-outs during CCPA implementation that was caused by testing for one case (users opting out of personal data collection) but not for another (users opting in):

Our original code for [our third-party tracking service] wrapper was successfully turning off the cookies for people who didn't want them, but also had a bug where it was successfully turning them off for everyone else, too. We went like a week without collecting data before anyone noticed this. What happened was somebody wrote the code, somebody else ran it and opted out, and then looked at [the third-party tracking service] and confirmed that the correct data was there. That was our testing process. If that stopped working now, it would probably be months before we noticed it. We don't have any sort of regular audit. (P08)

P08 was describing an experience at a medium-sized tech company with more than 250 staff, one-fifth of whom are developers or in similar technical roles. Further, the company also handles sensitive health data protected by the Health Insurance Portability and Accountability Act in the United States.
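The asymmetry P08 describes, testing the opt-out path but never the opt-in path, is the kind of failure a pair of symmetric tests would catch. The sketch below is purely illustrative: the `TrackingWrapper` class and its methods are hypothetical stand-ins, not P08's actual code or any real analytics SDK.

```python
class TrackingWrapper:
    """Hypothetical stand-in for a wrapper around a third-party tracker."""

    def __init__(self):
        self.opted_out = False
        self.events = []

    def set_opt_out(self, opted_out):
        self.opted_out = opted_out

    def track(self, event):
        # The bug P08 describes would amount to this condition being
        # always true, silently dropping events for *every* user.
        if self.opted_out:
            return  # respect the opt-out: record nothing
        self.events.append(event)


def test_both_consent_states():
    # Case 1: opted-out users must NOT be tracked.
    w = TrackingWrapper()
    w.set_opt_out(True)
    w.track("page_view")
    assert w.events == []

    # Case 2: opted-in users MUST still be tracked. This is the case
    # P08's team never tested, masking a week of silent data loss.
    w = TrackingWrapper()
    w.set_opt_out(False)
    w.track("page_view")
    assert w.events == ["page_view"]


test_both_consent_states()
```

Run regularly (for example in continuous integration), a check of this shape would also address P08's worry that a regression "would probably be months before we noticed it".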
When asked about the current status of GDPR and CCPA compliance, participants universally admitted that they had not reviewed or audited their implementation work, including confirming that third-party cookies were caught behind a wrapper, confirming that the appropriate disclosures and configurations were available on forms, and so forth. Two participants (P06, P08) suggested that their privacy lawyers likely would not agree to a proactive compliance audit because it was not mandated; thus, in their minds, they could remain ignorant of potential bugs or violations while staying true to "the spirit of the law".
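A recurring audit of the kind participants lacked need not be heavyweight. As a hypothetical sketch (the cookie names, the allowlist, and the idea of crawling the pre-consent page are all illustrative assumptions, not practices described by participants), a scheduled job could diff the cookies actually observed before consent against a "strictly necessary" allowlist:

```python
# Hypothetical scheduled audit: flag any cookie set before the user
# consents that is not on the strictly-necessary allowlist.
STRICTLY_NECESSARY = {"session_id", "csrf_token", "consent_state"}


def audit_pre_consent_cookies(observed_cookies):
    """Return cookies that should not appear before the user consents."""
    return sorted(set(observed_cookies) - STRICTLY_NECESSARY)


# Example: a crawl of the pre-consent page observed these cookie names.
observed = ["session_id", "_ga", "csrf_token", "_fbp"]
violations = audit_pre_consent_cookies(observed)
assert violations == ["_fbp", "_ga"]  # trackers leaked past the wrapper
```

The point of the sketch is its recurrence: unlike the one-time manual checks participants described, a check like this surfaces regressions without waiting for a user complaint or a regulator.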
The mindset that regulatory compliance work was "just a one-time thing" also applied to handling data subject access requests at small and medium-sized organizations. While some participants used self-service solutions such as privacy dashboards to enable users to request or delete their personal data, other participants set up manual processes that they planned to automate in the future in order to handle higher volumes of requests. However, none of those participants had streamlined those workflows when the interviews took place 2-4 years later. Both P02 and P09 attributed this to the low volume of requests:

We wrote a "hacky" script that an admin runs, with the intention that, if we get a lot of these requests, we'll make this a self-service tool so that we don't have to spend a lot of time doing this and so that it's easier for customers. But I think we've only had three people ever ask for their data. And I want to say maybe once a month somebody asks for their data to be deleted. (P09)

4.3.2 Limited Sense of Responsibility. In addition to expressing reservations about the status of their compliance instrumentation, several participants described how they inevitably relied on several third-party services to maintain their compliance mechanisms. For example, P09 described their GDPR and CCPA compliance work in a mobile app as "flipping a series of switches" without any visibility into what actually happens to user data stored on other companies' servers:

I'm responsible for making sure the switch is there, and that it calls the service somewhere when the user taps it. Beyond that, I have no idea what happens. . . Everybody on my team and my peers and colleagues, we are doing what we believe needs to be done, by law, by intention, by principle, but without actually concrete assurance. (P09)

This reflection captures a sense of ambivalence expressed by several developers. In contrast with the expectations of expertise and responsibility described in Section 4.2.1, which led to developers enjoying relatively high autonomy in conducting DPCW, many developers saw themselves as reliant on networks of actors that may be invisible to their colleagues and broader organizations.
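The "hacky" admin-run script for access and deletion requests described in Section 4.3.1 could be as small as the following sketch. Everything here is hypothetical: the in-memory `USERS` store and the field names stand in for whatever database P09's organization actually used.

```python
import json

# Hypothetical in-memory stand-in for a user database.
USERS = {
    "u123": {"email": "a@example.com", "events": ["login", "purchase"]},
}


def export_user_data(user_id):
    """Access request: return everything held about the user as JSON."""
    return json.dumps(USERS.get(user_id, {}), indent=2)


def delete_user_data(user_id):
    """Deletion request: remove the record; True if anything was deleted."""
    return USERS.pop(user_id, None) is not None


print(export_user_data("u123"))
assert delete_user_data("u123") is True
assert export_user_data("u123") == "{}"
```

Even a script this small encodes policy decisions (what counts as "everything held about the user", whether deletion cascades to backups and third-party services) which is exactly the scope-setting work that participants described doing on their own.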

4.4 What Do Developers Think About the Lasting Effects of GDPR and CCPA?
4.4.1 Strategic Value of Regulatory Ambiguity. Several participants described how they have been able to advance personal priorities under the ambit of DPCW. For example, P03, a senior developer with a background in the free software movement, noted that they found strategic value in drawing on GDPR to advance an anti-corporate ethos. They described how GDPR provided cover for decisions to, for example, reduce dependence on "Big Tech" companies such as Meta, Google, and Amazon.
I would never be okay with uploading emails to Facebook to create targeted lists, because I imagine that Facebook is holding the data. . . We didn't really use Google Analytics for the same purpose. You put their tracker on your website and you're inviting the giants to your website and to all your users. . . These things influenced my decisions [about privacy]. For instance, I would prioritize not giving data to huge corporations. But I wouldn't, and I didn't, prioritize creating an automatic way for people to access their data [under GDPR]. I would say, they can email us and ask to deliver their data, or change it, or delete it, and that's fine. (P03)

In this example, the developer made decisions about their organization's technical systems, such as avoiding integrations from the largest tech companies, under the ambit of advancing privacy, even though those services were not actually disallowed by the GDPR. Instead, they described strategically interpreting the "spirit of the law" to advance their own privacy framework, especially their own interpretation of data minimization. Later in the interview, P03 acknowledged that their values were not shared by their company, and that once they left for another job, the organization returned to using services by Meta and Google.

While this may seem like an extreme example, P09 described a similar appreciation for the regulations providing cover for advancing privacy as a human right:

[GDPR and CCPA] are not the best thing for our business, but it is important to our values to try and deliver it to users to whom it matters and for whom it's important, because privacy as a value and a principle does matter to us as a group. It's not great for the advertising business, so we would prefer that users did not elect to opt out. But we do think it's the right thing to do to (A) be compliant because it's the law, and (B) because privacy is a fundamental right. So we are, in some ways, grateful that the mechanisms exist for us to be able to do that. (P09)

To P09, advancing privacy is an important goal that is fundamentally incompatible with their business's corporate strategy, which is premised on an advertising model. Therefore, the mandates of complying with GDPR and CCPA offer the developer team permission to exceed regulatory requirements and advance data privacy goals that may be unfavorable for their employer's profitability.

4.4.2 "Regulations Do the Bare Minimum". Participants responded to questions about the value of GDPR and CCPA with great skepticism. In general, they were not very hopeful about advancing a broader conceptualization of privacy through data protection regulations that were neither standardized nor enforced. P01 captured this sentiment, which was shared by several participants, by describing how they had realized from serving as a technical executive that "companies will do the absolute bare minimum in terms of privacy". However, several participants noted the value of compliance work in instigating important discussions and processes within their teams. For example, P08 highlighted how the regulations have promoted discussion and allowed developers on their team to signal their values to each other:

I have kind of mixed feelings about [how important the regulations are]. I think it's important that they drive conversations at companies. I appreciate hearing someone say, "let's do more than we're legally required to do for our users". I would also appreciate hearing somebody say the opposite of that, because then I would think: this isn't a company I want to work at. So I think that's probably the main benefit I see of it. I don't think the regulations do much to improve user data privacy. But I think they start conversations that wouldn't happen if the regulations weren't in place. (P08)

Similarly, P09 appreciated that data protection regulations allow them to challenge their employer's business model:
[I] think privacy is really important. . . In my work, I have tried to advance what I think is the spirit of the regulations in an environment even though it's sort of against the interests of the business that I'm working for. There's a conflict there. . . But I've tried to do what I think the intention [of the regulations] was for our users, knowing that it's not enough. It's far from it. The regulations are ambiguous on the surface, and they're ambiguously implemented, although they're well intentioned. I don't know what the net benefit is, if any. (P09)

DISCUSSION
This study has examined the empirical object of data protection compliance work (DPCW), scrutinizing neither the text of data protection regulations themselves nor their outcomes, but rather the sociotechnical process of achieving compliance. This approach is valuable for shifting focus from technology policy analysis or social analysis alone to understanding how the social and the technical are mutually constructed. A number of important analytic themes have emerged that highlight key contributions to the subfield of developer studies, as well as topics for future research.

5.1 A Unique Responsibility to Define the "Spirit of the Law"
Developers can creatively interpret specific clauses of data protection regulations. As previously discussed, developers made substantive decisions about interpreting various parameters in strategic or instrumental ways, such as adopting a flexible understanding of which data collection is exempt from data protection regulations; shifting the boundaries of their responsibilities when assessing third-party software; and advancing additional goals through compliance work. A handful of phrases recurred across the interviews to justify or frame these decisions, including a broad commitment to "the spirit of the law". But what, exactly, is "the spirit of the law"? This, too, was left open to individual interpretation. This uncertainty suggests that privacy operates as a boundary object: a concept whose ambiguous definition enables cross-functional actors to collaborate without actually agreeing upon a single definition [37]. This ambiguity enables developers to define compliance work not only in terms of implementation decisions about specific technical features but also in terms of a more general sense of the scope of work. For example, the experiences of developers strategically interpreting the scope of GDPR and CCPA to advance goals outside the explicit scope of the regulations add to Tahaei et al.'s [40] findings a new way that privacy champions influence their teams, and further suggest that privacy champions can bear outsize influence beyond developer teams, especially in smaller organizations, by shaping organizational strategy, for example by defining "the spirit of the law".
This opportunity is unique to developers. We discovered this by, at first, recruiting participants from a variety of functional areas (developers, data managers, product managers, and designers) who, on paper, seemed to have key roles in DPCW. However, our interviews confirmed that developers sit at the center of DPCW, since they hold key responsibilities and expertise, even compared to legal teams. In our study, this was manifested in lofty yet often unsubstantiated expectations of developer expertise. Participants described how colleagues from legal departments and other functional areas delegated the "technical" work of compliance to developers. Our focus on this "technical work", which we call DPCW, has demonstrated that it is neither straightforward nor clear. Instead, developers navigate a range of sociotechnical and institutional factors with material, and thus political, implications. One exemplary implication is that developers often do not meet the expectations imposed upon them: to simply translate policy into code, to see into technical systems, and to sustain a state of compliance over time.
These findings extend, and raise the stakes for, prior research on how developers approach privacy work. Such studies have often focused on understanding developers' attitudes toward [22, 41] and discourse about [16, 25, 35, 42, 47] privacy. However, this article has demonstrated that it is also crucial to explore the sociology of expectations. For example, future studies could explore how expectations of technical expertise over data systems and privacy develop and proliferate, how widely they are held and by which actors, and how developers respond to and negotiate such expectations. This would shift the focus of research that "audits" data protection regulations [8, 10, 19, 20] from applying a binary rubric of compliance to unpacking sociotechnical processes as an organizational genre [49] enacted by multiple functional teams that collectively uphold the fiction that compliance is a straightforward and feasible endeavor. Thus, this article calls for new questions about how DPCW has evolved and become standardized through contestations over professional boundaries related to skills and expertise [14, 37]. There is substantial literature on objectivity, quantification, and technocratic expertise in governments and institutional contexts [31], but data governance is often characterized as a values-centric project of advancing an ethic of privacy. How has DPCW come to constitute a kind of boundary work that privileges developers' presumed skills and experiences? And how does it manifest in different institutional and cultural contexts?

5.2 Developers as Street-Level Bureaucrats
At the same time, developers face remarkably little accountability in DPCW while wielding outsized influence on policy. While previous research has measured the influence of the "human factor" in software development in general, and privacy work in particular, this study adds a comparative dimension by interviewing data technicians in non-developer roles, including design, data, and compliance. Participants' experiences suggest that adjacent functional workers such as product managers and executives are capable of influencing these decisions, but that developers are nonetheless often uniquely autonomous.

This autonomy is evident in two types of decisions. First, developers enjoy autonomy by creatively defining the scope of their work, as previously discussed. Second, developers sustain limited accountability by deciding when their decisions should receive external input, and from whom. This was evident in the anecdotes about deciding when it was appropriate to put additional effort into reaching out to third-party companies and when to consult lawyers whose responses might increase their workload.

This finding brings prior literature on developers' privacy attitudes and behaviors into conversation with policy implementation. Specifically, it highlights how developers can and should be understood not only as technical workers within companies but also as "street-level bureaucrats" of public policy [26]. This complements internet governance research that has demonstrated how technology companies act as "central points of control" [9], especially through institutional forces [48], by highlighting the specific and unique role that developers play in shaping policy. While this study focused on data protection regulations, developers may also play a substantial role in implementing, and thus shaping, other internet governance regulations, especially those related to artificial intelligence (AI) governance and regulation. Thus, this study elevates the implications of studying software development practices in the context of privacy and policy implementation, as reviewed in Section 2.
This bears relevance for both practitioners and policymakers. We have demonstrated that DPCW is not a straightforward, objective task, but rather a complex, institutionally situated sociotechnical process that requires negotiation and resolution. The questions posed in the vignette at the beginning of this article (What constitutes "selling" data? Should they roll this out to all users or only in California? What's the discovery strategy? Should they audit opt-outs to confirm compliance or just expect it to work?) merit attention from a collaborative group of heterogeneous actors and should not just be delegated to developers to "make a good faith effort". Responsibility and accountability for compliance should be collectively held and deliberated by drawing on multiple forms of expertise.
It is tempting to conclude that data protection regulation should impose more specific software and design requirements. In some cases, this is an appropriate and even necessary approach. For example, studies have demonstrated that certain design choices can and should be standardized in order to improve clarity and use for the public [21]. However, in our findings, the line between "specific, clear requirements" and "open to interpretation" was not so easy to find, and it is not clear that such a line can exist. Instead, data protection regulations should serve as an infrastructure, a baseline set of expectations for public accountability over companies that collect personal data. Like other infrastructure, these regulations are deeply integrated into other systems in complex ways and thus require regular repair and maintenance work. However, infrastructure famously becomes most visible upon breakdown [36], which, according to our findings, can be difficult to discern in the case of data protection regulations, because data and data systems are not easily subject to public scrutiny, or even visible to developers themselves (see Section 4.3.1). How can we uphold responsibility for compliance in complex sociotechnical systems that do not reveal themselves when they fail? This issue suggests that alternative mechanisms or systems of accountability need to be imagined in order to fulfill the promises of data protection and digital privacy.

5.3 Opportunities for Accountability?
Developing a more detailed and nuanced understanding of DPCW is all the more important because developers often conceive of data protection compliance work as a one-time project. All the participants acknowledged that they had not returned to their initial compliance instrumentation to ensure high quality, update temporary settings, or audit their work, even though several participants described implementing provisional configurations that they intended to revisit in the future. Instead, they generally wait for bugs to emerge, which is unlikely as long as users rarely exercise their privacy rights or call attention to errors.

This suggests that end users can play a key role in increasing accountability for privacy work. For example, in the case of data subject access requests, several participants stated that they intended to enhance or automate their processes, but that they did not see the need to do so because of the small number of requests submitted by users. Moreover, participants agreed that few users are likely to take advantage of features such as opting out of optional cookies. Increasing pressure from end users, and making developers aware of user interest, may increase developers' motivation to reconceptualize DPCW as a continuous priority and to return to and improve data protection features. This urgency and pressure would also help persuade developers that DPCW is valuable not only for achieving regulatory compliance but also for meeting users' needs and expectations. Collectively, then, these findings suggest an alternative way to understand how developers approach privacy work: by exploring how and why developers anticipate user feedback about data privacy, and how they respond to it.

5.4 Limitations and Future Work
This study used purposive sampling and constructivist grounded theory to explore data protection compliance work (DPCW) as an object of empirical analysis. These methods have some important limitations. Our participants skewed toward North America, and we focused on common themes across a variety of organizational contexts; future studies could pursue comparative analysis or focused study of specific geographic regions (either specific regulations or liable companies in a specific region). In addition, our interviews were conducted in 2021-22, up to five years after compliance work began for GDPR. Thus, our participants had different levels of recall of specific details. For all these reasons, this study is not necessarily generalizable to all companies subject to data privacy laws in all contexts. Future research could examine DPCW soon after implementation or, ideally, in action through ethnographic study with a team.
In addition, future studies can focus on specific roles within the category of data technicians that were not recruited for this study. For example, in some companies, privacy program managers develop company-wide operational practices and mediate between functional teams such as developers and lawyers. In fact, this work is codified in the curriculum for the Certified Information Privacy Manager (CIPM) certification offered by the International Association of Privacy Professionals (IAPP) [30]. This curriculum outlines standardized expectations and processes that can be analyzed to unpack how privacy program managers navigate technical, legal, and operational expertise and boundary work, as described in Section 5.1.

CONCLUSION
This article expanded upon prior research by adding texture to an understanding of developers' unique capacity to shape data protection compliance in three ways: by creatively interpreting specific clauses such as "strictly necessary" and "personal data"; by making such decisions in a uniquely unaccountable way; and by treating data protection work as a one-time project with dubious, indeterminate value to users. At the same time, the developers interviewed conceived of their impact and responsibility as highly limited. This discrepancy highlights the significance of these findings for adding specificity to why and how developer attitudes and experiences affect data protection regulations in the field.

Overall, these findings validate the importance of understanding not only the behavior of developers and other technical workers when implementing privacy features, but also their attitudes toward that work, especially in the case of data protection regulations. Further research connecting attitudes to technical outcomes that materially affect users' privacy experiences online will clarify how developers play a highly impactful, yet under-studied, co-regulatory role in shaping privacy through subjective interpretation and implementation of data protection statutes.