Should they? Mobile Biometrics and Technopolicy Meet Queer Community Considerations

Smartphones are integral to our daily lives, providing everything from basic functions like texting and phone calls to more complex motion-based functionalities like navigation, mobile gaming, and fitness tracking. To facilitate these functionalities, smartphones rely on integrated sensors such as accelerometers and gyroscopes. These sensors provide personalized measurements that, in turn, contribute to tasks such as analyzing biometric data for mobile health purposes. In addition to benefiting smartphone users, biometric data holds significant value for researchers engaged in biometric identification research. Nonetheless, utilizing this user data for biometric identification tasks, such as gait and gender recognition, raises serious privacy, normative, and ethical concerns, particularly within the queer community. Concerns of algorithmic bias and algorithmically driven dysphoria surface from a historical backdrop of marginalization, surveillance, harassment, discrimination, and violence against the queer community. In this position paper, we contribute to the timely discourse on safeguarding human rights within AI-driven systems by outlining challenges, tensions, and opportunities for new data protections and biometric collection practices in a way that grapples with the sociotechnical realities of the queer community.


INTRODUCTION
"...Your scientists were so preoccupied with whether or not they could; They didn't stop to think if they should." -Dr.Ian Malcolm, Jurassic Park (1993) Smartphones offer a wealth of functionalities that can aid everyday life.These compact yet powerful devices are versatile tools, offering digital connectivity, entertainment, and even health monitoring.Embedded in the majority of smartphones, sensors such as accelerometers and gyroscopes offer valuable signals for monitoring and providing feedback for various daily activities, including physical activity and sleep [63].Paired with cloud technology, collecting biometric data like heart rate and daily moving patterns can be done in a systematic and comprehensive fashion.Consequently, smartphones can leverage this big data to deliver personalized user experiences at scale.This, in turn, highlights their value in enhancing daily life and subsequently, the desire for smartphone users to carry them everywhere.
Biometric data is also incredibly valuable to research involving user authentication [35] and human activity recognition more broadly [12]. Accelerometers can measure the movements and vibrations of an object, contributing to an ability to gather knowledge about a person's steps and gait. However, biometric data has also been employed in the task of predicting gender, with some works inferring this based on a smartphone's placement on a person's body [54,72,90,103]. Giving credence to a task definition intrinsically based on hegemonic definitions of gender has harmful impacts on communities that exist outside of these presuppositions. The resulting modeling therefore further marginalizes communities that do not reflect the normative assumptions codified within the machine learning (ML) and broader artificial intelligence (AI) pipeline. In this work, our paper centers the queer community in examining mobile biometric design and problem formulation that, while well-intended, risks propagating social harms if sociotechnical research assumptions remain unscrutinized.
Dr. Malcolm's famous quote anchors this position paper to the broad ethical question of whether we should continue to use data collected from our smartphones without making a concerted effort to pause and think about the societal harms and unintended consequences it might have on historically marginalized communities such as the queer community. How might access to this biometric data harm queer people and perpetuate and reinforce stigmas about queer bodies throughout our society? The technologies and data-driven processes we currently employ consume data inherently woven with societal assumptions and inductive biases [18,56].
Therefore, we reason that it behooves researchers to exercise caution by first taking a step back and evaluating the possible sociotechnical repercussions of AI design choices.
Choice is a mechanism by which individuals propagate their own views into research, design, algorithms, and the policy dictating the use of these technologies. Choices in AI design and, similarly, technopolicy design do not exist in isolation from stakeholders' viewpoints and internal social constructs [17,18,82]. Who is included in an evaluation is a function of what a researcher or policy stakeholder understands should be included. As such, leaving unexamined the inextricable link between perceived social constructs and design risks harming communities that fall outside of these dictated boundaries of consideration. To demonstrate this, we show how limiting the definition of gender to a binary translates to a similarly narrow conceptual design in mobile biometrics and automatic gender prediction systems, with unique adverse impacts on gender minorities.
Our paper critically examines the power that choice holds within the ecosystem AI resides in, consisting of both ML researchers and the technopolicy makers who govern AI usage and design. Advancements in AI technologies have outpaced policies that adequately protect users from harm. We demonstrate this through a critical detailing of U.S. technopolicy gaps that result in limited protections for historically marginalized groups, such as the queer community, leaving them vulnerable to technological harms. Because of these gaps, we advocate for more comprehensive technopolicy, including state and federal statutes, to ensure the safety, security, and privacy of all users of mobile devices that collect biometric information such as movement data and gait.
Overall, this position paper aims to outline queer community considerations in mobile biometrics and AI for practicing researchers and technopolicy makers alike. We contribute the following: (1) We detail the dangers of gender-normative assumptions in AI with respect to the queer community (§3). (2) We ground a discussion on the dangers of smartphone biometric analysis for gender prediction (§2, §4). (3) We survey existing and ongoing United States (U.S.) data privacy and gender discrimination law, contextualized in the queer community and their corresponding needs (§5). (4) We explore ways to codify inclusive law and technological praxis (§6) by adopting a reflexive approach to AI design and research processes [17,18,73,82].
Positionality Statement
All authors are people of color who center an intersectional perspective in their social and professional lives. With the exception of one author, all were formally trained as computer scientists and identify as queer. One author is formally trained in data privacy law and how technology can lead to discrimination. All authors have additional training in gender theory, critical social theories, and criminology. All authors have training in queer studies through activism and advocacy. As such, our backgrounds influence this work's posture. All authors are located in the U.S. but have diasporic links to other social contexts. Our positions arise with respect to U.S. law, though this work also has implications for AI in a global context. We write this to empower individuals across the existing tech industry and emerging technopolicy landscape to critically consider how we can collectively create mechanisms that better include and/or codify the inclusion of queer bodies. Therefore, this paper is positioned for AI industry practitioners, AI policymakers, and the queer community.

BACKGROUND
Mobile Biometrics
The term biometrics refers to the unique physical or behavioral characteristics that can be used for the automatic recognition of individuals [16,103]. Biometrics can be physical (e.g., fingerprints, faceprints, ear shape, iris, and retina) or motion-based (how someone walks or otherwise moves). Biometrics researchers have reasoned that biometrics offers an unobtrusive way to protect a user's privacy on a cellular device [71]. Researchers have further illuminated the benefits of using biometrics on smartphones to secure and protect personal information (e.g., passwords and private information) stored on devices, especially if a device is stolen or lost [7]. Therefore, mobile devices are commonly employed for identification and verification. However, researchers need accurate, abundant, and high-quality mobile data for biometrics.
Gait, a biometric based on human locomotion, is difficult for users to conceal or fake, making it preferable for tasks like authentication [25,39]. Wan et al. [102] name six properties that make gait a favorable biometric to collect. According to them, gait can be: (1) captured from far away; (2) captured at low resolution, and therefore not costly to collect; (3) collected with little instrumentation; (4) captured without the user's cooperation; (5) hard to impersonate; and (6) more "accessible". These locomotive patterns serve as valuable insights for several domains. In a clinical setting, gait analysis assists in diagnosing, preventing, and monitoring neurological, cardiac, and age-related disorders [75]. In assessing gait, clinicians may use a combination of image processing, floor sensors, and sensors located on the body [75]. Gait follows a standard pattern within the clinical literature, though importantly, variables like shoe type and posture impact the ability to measure a person's gait in a consistent manner. In computer science, gait has been studied for several applications, including authentication [102]. Gafurov [43] describes gait recognition (or biometric gait recognition) as a way for a user to authenticate simply by walking. Historically, gait data consisted solely of recorded videos. However, the advent of smartphones brought with it a paradigm shift; built-in accelerometers and gyroscopes offered a faster and less resource-intensive avenue for gait data collection.
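To make concrete how little instrumentation such collection requires, the following sketch (our own illustrative toy, not drawn from any system cited here) estimates step count and cadence from a raw accelerometer magnitude trace using naive peak detection. The sampling rate, threshold, and function names are assumptions for illustration, and the "walking" signal is synthetic.

```python
import numpy as np

def detect_steps(magnitude, fs, threshold=1.2, min_gap_s=0.3):
    """Naive step detector: local maxima of the acceleration
    magnitude above `threshold` (in g), separated by at least
    `min_gap_s` seconds. Real pipelines would filter noise first."""
    min_gap = int(min_gap_s * fs)
    peaks = []
    last = -min_gap
    for i in range(1, len(magnitude) - 1):
        if (magnitude[i] > threshold
                and magnitude[i] >= magnitude[i - 1]
                and magnitude[i] > magnitude[i + 1]
                and i - last >= min_gap):
            peaks.append(i)
            last = i
    return peaks

# Synthetic 10 s trace at 50 Hz: a 1 g gravity baseline plus a
# 2 Hz "walking" oscillation, i.e., roughly two steps per second.
fs = 50
t = np.arange(0, 10, 1 / fs)
mag = 1.0 + 0.5 * np.sin(2 * np.pi * 2.0 * t)

steps = detect_steps(mag, fs)
cadence = len(steps) / t[-1]  # estimated steps per second
```

Even this crude sketch recovers the walking cadence from nothing but a single sensor stream, which is precisely what makes gait so cheap to harvest passively.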

Automatic Gender Recognition Systems
Automatic gender recognition (AGR) systems are designed to determine an individual's gender based on various biometric features or behavioral patterns. These systems are commonly based on features including a person's gait, face, and voice. Gait-based gender prediction is described as a soft biometric that significantly increases the performance of hard biometric systems strictly based on mobile features [53]. Since work by Mantyjarvi [71], several studies have attempted to use gait-based gender recognition for tasks including, but not limited to, authentication and security. To do this, works build off Li et al. [68], who detailed that human gait could be separated into distinct components based on human silhouettes. Yu et al. [108] approach gender recognition by looking at static and dynamic human silhouettes. Other works [54,72,90,103] have focused on using built-in sensors such as accelerometers for gait and AGR prediction based on the device's location on the user's body.
However, as we detail below, several forms of biometric recognition software have demonstrated poor accuracy in identifying anyone outside of a binary gender. Scheuerman et al. [91] found that commercial facial analytics services consistently performed worse at predicting gender for trans and nonbinary individuals than for cisgender individuals. Other studies of AGR systems, like Buolamwini and Gebru [23], found reductionist views on gender that only distinguish between "man" and "woman". These are only brief illustrations of the ways accuracy issues can arise from choices that propagate cisnormative viewpoints in design. We further ground what AI-driven AGR systems mean for the queer community in the next section.

DATA-DRIVEN SYSTEMS AND AI: HISTORIES OF HARM TO THE QUEER COMMUNITY
The mainstreaming of nonnormative genders and sexualities gives the impression that queer and trans lives and experiences are incorporated into the normative world [13,92]. Yet, scholars [13,59,92,96] have cautioned that this visibility does not necessarily signify the achievement of equity throughout this cis-heteronormative sociotechnical world. The queer and trans population experiences several stressors due to the propagation of cissexist systems [86]. Gender is complex and its definitions are conservative [29]; therefore, gender cannot be confined to binary constructs such as "woman" and "man". This binary construct obscures the growing minority of those who fall inside and outside binary norms.
According to a 2022 Pew Research Center survey, 1.6% of U.S. adults are transgender or nonbinary. Those under 30 are more likely than older adults to be trans or nonbinary, at 5%, and this is more likely than not to be reflective of future trends in gender expression [22].
Similarly, algorithmic systems trained on such cis- and heteronormative dimensions of social reality go on to perpetuate hegemonies through ever-present normativities. For instance, several studies [83,88,96] describe how cis- and heteronormativity are not only propagated into AI systems but serve as foundations for mistrust, causing consequences pertaining to surveillance and communal harm to the queer community. Tomasev et al. [96] describe how censorship is possible in the form of erasing queer voices and amplifying heteronormative ones, with content censored simply for being adjacent to the queer community. Simultaneously, queer people face disproportionate online harassment [11] and the filtering out of queer content in service of anti-queer propaganda [8,36]. Such automated filtering reflects an erasure that reifies cissexist systems.
Sexual and gender identity are private components of one's identity. Tomasev et al. [96] describe how uses of AI risk infiltrating otherwise safe spaces in the name of facial recognition and overall surveillance for safety. Indeed, ample scholarship speaks to the ways such tools can further marginalize the queer community [34,87]. For instance, physiognomic and phrenologic applications use computer vision to (falsely) infer gender and sexuality, an aim antithetical to queerness and queer bodies [95]. Surveillance in these areas, colliding with poor privacy measures, leads to (1) forced outing [83]; (2) applications built upon "determining someone's gender" (e.g., recommendation systems providing binary gender-based clothing) that may induce algorithmic gender dysphoria; and (3) queer erasure through the privileging of cis bodies, the detection of bodily parts, and the propagation of physiognomic thinking [34].
Hamidi et al. [49] found that transgender individuals have overwhelmingly negative attitudes toward automated gender recognition systems due to their role in misgendering individuals. Most AGR systems had performance issues when tested against transgender individuals and did not classify nonbinary and queer people outside of cisnormative labels [91]. These AGR systems cause dysphoria by failing to accurately recognize the gender of transgender individuals and by not acknowledging nonbinary individuals, creating labels that queer users might not desire and enabling discrimination through automated decisions based on incorrect gender predictions. Such labeling has been theorized to bind concepts like "makeup" to gender identity and to lead to bias by third parties who rely on gendered appearances. Other researchers, like Mahalingam and Ricanek [70], have used video recognition algorithms to try to determine whether a person has undergone hormone replacement therapy. These efforts have been criticized for compiling their data from transgender users on YouTube without notice or consent, and for the ethical dangers of creating an algorithm with the power to distinguish between cis and trans people, enabling harassment and governmental persecution [33,99].
In the following section, we take a closer look at ways in which mobile biometric technologies have propagated these aforementioned harms through choices in their task design.

BIOMETRICS, CHOICES, AND THEIR IMPACT ON THE QUEER COMMUNITY
Gender recognition systems, at their core, hinge on the notion of extracting gender information from inherently unobservable and often superficial characteristics [96]. This problematic foundation perpetuates an understanding of gender as a static, physiognomic concept, thus failing to consider gender as a social construct. As a result, the diversity of gender identities and expressions does not exist beyond the confines of a binary. These systems, built upon the notion of distinct features reflecting a "man" and a "woman", tend to disregard the complex and nuanced reality of gender. Keyes [59] urges researchers to critically consider how they assess and conceptualize the term gender. Too often, they argue, gender is treated explicitly or implicitly as a "binary, immutable and physiologically discernible concept" in research. Yet, restricting gender to a fixed, static context erases transgender and non-binary people, cascading throughout the design and research process. Albert and Delano [5], scholars in bio-impedance technology, an intersecting field with biometrics, similarly discuss how that field excludes non-binary and trans people; their discussion is relevant to mobile devices. The researchers point out the field's assumption that sex and gender are straightforward. As we delve into the subsequent sections, our critical examination reveals how these normative assumptions not only persist but also propagate throughout the various tasks and applications of these systems, ultimately perpetuating harmful biases and inaccuracies with far-reaching consequences. These sections all converge toward questioning whether researchers should be making such design choices, followed by a critical discussion of their links to technopolicy.

Reconstructing Personal Body Data from Mobile Sensors
Biometrics studies have shown that accelerometer data can be used to reconstruct "sensitive personal data" about its users [63,110].
Although accelerometers in mobile devices predominantly serve benign applications, much like cameras and microphones, the inferences they enable about an individual's identity, demographics, personality, and activities can intrude on their privacy. Accelerometers can be combined with other biometric data-gathering devices and data sources to form multimodal signals that predict human activity more accurately than any single device [4]. Kroger et al. [63] describe accelerometers as "cheap, low in power consumption, and often invisibly embedded into consumer devices. Thus, they represent a perfect surveillance tool as long as their data streams are not properly monitored and protected from potentially untrusted parties including service providers and app developers." At the same time, safeguarding privacy is essential to uphold individual autonomy, particularly for those whose gender identity could expose them to persecution if revealed. This practice brings to light poorly regulated access to information about users' physical biometrics and identity that can be used to harm them.
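To illustrate how even coarse accelerometer statistics can serve as an identifying signal, the sketch below (an illustrative toy of ours, not the method of any cited work) extracts a minimal feature vector, consisting of the mean, standard deviation, and dominant frequency, from two synthetic traces with different cadences and stride energies. The function name, parameters, and synthetic "users" are all assumptions for demonstration.

```python
import numpy as np

def gait_features(signal, fs):
    """Toy feature vector from an accelerometer trace: mean,
    standard deviation, and dominant frequency (Hz). Even these
    coarse statistics can differ consistently across people,
    which is what makes passive sensor streams identifying."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    dominant = freqs[np.argmax(spectrum)]
    return np.array([signal.mean(), signal.std(), dominant])

fs = 50
t = np.arange(0, 8, 1 / fs)
# Two synthetic "users" with different cadence and stride energy.
user_a = 1.0 + 0.4 * np.sin(2 * np.pi * 1.8 * t)
user_b = 1.0 + 0.6 * np.sin(2 * np.pi * 2.2 * t)

feat_a = gait_features(user_a, fs)
feat_b = gait_features(user_b, fs)
```

The point of the sketch is that no machine learning is needed before the traces become separable: three summary statistics already distinguish the two walkers, underscoring why unmonitored accelerometer streams are so readily repurposed for profiling.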

Determining Gender from Clothing
Some AGR research reinforces the idea that an individual's clothing choices can be used to predict binary gender identity. Such work may contend that the types of clothing individuals wear, whether garments contain pockets [76,77] or certain footwear [32], could offer significant insights into their gender. As such, works like Friere-Obregon et al. [42] and Guan et al. [47] aim to derive user information based on clothing. However, it is essential to approach this research with caution, as it risks oversimplifying a deeply personal and multifaceted aspect of an individual's identity. Predicting gender solely from clothing data can reinforce stereotypes and undermine the lived experiences of those within the queer community and others whose gender expression may not conform to societal norms. Remarkably, multiple studies highlight the inherent variability in clothing-based gender prediction. Yet, rather than abandoning the task altogether, some researchers opt to address this variability by incorporating additional tasks to enhance the effectiveness of clothing-based AGR systems. Guan et al. [47] observed that a person's gait could change with clothing type, reducing the effectiveness of gait recognition; the researchers addressed this concern by creating a dataset encompassing a fixed set of clothing. Ghebleh and Moghaddam [46] also found difficulties in capturing human gait patterns using accelerometers due to variability in user clothing and therefore employed various outlier detection techniques for "mitigation". Choosing to build technologies on others that presuppose binary gender magnifies gender hegemonies, reinforcing the binary concept of man or woman. As we continue to detail works and their complexities with respect to gender, it becomes increasingly clear that we need a robust technopolicy framework to safeguard the well-being of the queer community.

Determining Gender from Smartphone Location on the Body
A research task that infers gender based on smartphone placement on users' bodies is infused with harmful assumptions about gender and bodily autonomy. While such endeavors might not appear harmful in theory, the task insidiously perpetuates the notions that gender can be "detected", determined by passively collecting data on someone's body, and that it should be detected. Biometric AGR works like Abuhmad et al. [2] describe gait-based user authentication methods as being "feasible in specific applications, which requires capturing the user gait traits while moving". Works also explore the usability of the "user's physical state" without detailing how this is defined or its implications for user privacy. As with the above example on clothing, users carry their smartphones in pockets, but not all users have pockets in every garment they wear; if a user wears a bra or carries a purse, they may use that garment to carry their phone [52]. However, it is crucial to recognize the inherent privacy risks and ethical concerns associated with this reductive approach. First, predicting gender from such data centers cisnormativity and bioessentialism, thereby erasing any representation that falls outside a binary. Second, it does so in an incredibly invasive manner. Rather than pursuing research that could inadvertently marginalize the queer community, it is imperative that we prioritize community concerns in development, paired with robust technopolicy that safeguards the privacy and dignity of all individuals across gender identities and expressions.

EXISTING DATA PRIVACY LAW & JURISPRUDENCE
Background
The sexual orientation and gender identity (SOGI) of LGBTQIA+ people and communities have historically been criminalized and persecuted [106]. Today, being publicly outed by a release of SOGI data can lead to being disowned, discriminated against in public accommodations, employment, and social life, and even physically threatened [105]. While more LGBTQIA+ protections have been advanced in the U.S., many states still have, or are in the process of creating, laws that prohibit LGBTQIA+ expression or rights, or have insufficient protections against discrimination [3]. In this environment, protecting one's privacy and identity is important for preventing the data privacy harms associated with outing and for identity self-determination [27]. This section briefly examines key data privacy protections in the European Union (E.U.) and the U.S. and examines the existing gaps in protecting queer people.
Guarantees for data privacy exist in several forms. The most prominent are data protection laws like the E.U.'s General Data Protection Regulation (GDPR), which offer a comprehensive set of regulations on companies that collect data, such as how consent must be acquired, what data can be shared, how data can be processed, and what rights consumers have to their data [104]. The GDPR identifies several categories of sensitive personal data, including racial or ethnic origin, political opinions, health data, sex life, and sexual orientation, which may not be processed without consent, a legitimate purpose, or another key exemption. The key feature of the GDPR is the standard requiring consent from data subjects before data processing may occur, with heightened requirements for explicit consent before sensitive personal data like biometrics can be processed [10]. While gender identity or transgender status is not explicitly included within Article 9, details about transgender status, like gender reassignment or sex life, fall within the protections of Article 9 of the GDPR [97]. Importantly for transgender people, the GDPR also includes a "right to be forgotten," a form of user control over personal information on the internet to remove or correct information stored online. This can be of particular value to transgender people who seek to have stored information updated to reflect their current gender identity [31].
Other countries have their own data protection laws. Many are modeled on the GDPR, like Brazil's General Data Protection Law (LGPD) [? ] or the California Consumer Privacy Act (CCPA) [85]. The U.S. does not have a comprehensive data protection law at the federal level. Instead, the U.S. relies on data protection laws for specific forms of information or content (e.g., the Health Insurance Portability and Accountability Act and the Illinois Biometric Information Privacy Act), classes of people (e.g., laws protecting minors like the Children's Online Privacy Protection Act), and data protection laws in individual states (e.g., the California Consumer Privacy Act).
Aside from data protection laws, other privacy laws exist as torts against the harmful disclosure of private facts and as constitutional protections. Early U.S. privacy law existed as scattered tort case law following Warren and Brandeis's harms-based approach to privacy [19,27]. Privacy torts were developed long before the modern technologies of the internet and mass data collection, and they focus on compensating for the emotional, financial, and physical harms that intrusions of privacy could cause [26]. Cases where a recording device was secretly hidden in a bedroom to spy on intimate activities, where someone's Social Security number was leaked, or where someone was assaulted because location data was leaked to a stalker would give a plaintiff grounds to sue under privacy torts. Privacy torts do not cover or redress privacy intrusions conducted by database operators, harms associated with mass data collection, or issues of government surveillance. Even in instances where a plaintiff can undertake the considerable effort of suing under a privacy tort, monetary remedies and injunctive relief are insufficient to remedy harms when data has already leaked to a broader public [51]. Additionally, courts have not always been receptive to the importance of SOGI data. LGBTQIA+ plaintiffs who file lawsuits for privacy tort offenses involving their SOGI data often fail, even with strong cases [6].
The U.S. Federal Government is limited by privacy rights granted to U.S. citizens, while agency regulators have been taking a bigger hand in promoting data privacy. The Fourth Amendment protects individuals against warrantless government surveillance that violates a reasonable expectation of privacy, a protection that has been extended to privacy interests held by third parties [57]. While there is no explicit right to privacy enumerated within the Constitution, the Supreme Court has found a penumbra emanating from the First, Third, Fourth, Fifth, and Ninth Amendments that creates a zone of privacy free from government intrusion, which has been used to justify the constitutional protection of contraceptives, same-sex intimacy, and, prior to the Dobbs decision, abortion [20]. Neither privacy tort law nor constitutional law offers protections against harms created by the mass collection or processing of data. This space has been covered by limited data protections in the form of agency regulations that promise to improve protections against unfair and discriminatory practices that could infringe on the rights of queer people. The Federal Trade Commission (FTC) has recently stepped in to use its power against unfair business practices to enforce better cybersecurity and data privacy practices in the U.S. [94]. Combined, these U.S. laws and regulations form a patchwork of data protections that is not comprehensive in geography or subject and is inconsistent in its level of protection and application. While both U.S. and E.U. laws promise certain rights for individuals, Waldman argues that the structure of the GDPR and other data protection schemes is performative, with consent used as a shield for data extraction companies to avoid accountability [101]. The presence of laws and regulations is meaningless without enforcement, and privacy harms against queer people have continued unabated in the online space.
Many of these laws and regulations categorize many forms of biometric information as sensitive information requiring greater protection, but there is no comprehensive federal data protection scheme in the U.S. that protects biometrics as used by big tech companies [24]. So far, the only consumer-centric law focusing on biometric privacy is Illinois's Biometric Information Privacy Act (BIPA). Other states, like Maryland, have introduced bills modeled on BIPA or include biometric information in broader privacy regulation [78]. This leaves a gap in regulating the collection and processing of sensitive biometric information. Some laws are also limited in their treatment of gait as biometric information, such as Texas's Capture or Use of Biometric Identifier Act, whose definition of "biometric identifier" does not include gait or accelerometer data. This can pose a challenge as technology advances and gait and other motion-based information become more powerful for identification and user analytics [24].

Considerations for the Queer Community
Insufficient data protection in an age of exhaustive data extraction forms substantial infringements on the safety and identities of LGBTQIA+ people. The gathering of biometric data like gait data, with its subsequent power for user identification and gender recognition, can expose information that users might not want third parties, tech companies, advertisers, data brokers, or governments to be aware of. By allowing these parties access to gender or other assumed SOGI data, other sensitive details can be presumed. Concealed gender history might be exposed to employers, coworkers, landlords, social groups, and other malicious parties who might discriminate against or harass a user over their queer identity, whether intentionally or subconsciously. The reconstruction of intimate personal data not only exposes users to harm but can also affect the determination of one's identity as it exists online.
Social identity and the autonomy of identity are at risk when there is a constant invasion of personal privacy [20]. As Brescia argues, the integrity of identity is threatened by digital technology and platforms that seek to exploit it [20]. This autonomy can be particularly important for queer users who wish to choose how to be identified or to whom to disclose [14,31]. Computer vision analysis performed on a physiognomic or phrenologic basis can upset that self-autonomy and out queer people [58,95]. Data brokers and algorithms that only distinguish between a binary biological origin cannot accurately represent the identity of queer users, neither in how they sort those users nor in the content they choose to provide them. Social media platforms may undesirably provide a female-to-male transgender user with content from women-exclusive social media groups or advertisements for dresses that no longer align with their gender identity.
The data that are collected and used to recreate identifiers of gender and queerness feed into algorithmic harms and biases. These harms have been observed in areas like algorithmic content moderation and demonetization [89], the promotion and social amplification of hateful content [109], the normalization of hateful content, and undesirable or inaccurate outing [9,10].
Guzman identified how the public disclosure of the SOGI data of LGBTQIA+ individuals, better known as being "outed," can lead to discrimination, harassment, and physical violence, and has historically led to the denial of rights and opportunities [48]. The denial of rights has long historical roots for LGBTQIA+ people. States could ban same-sex marriages, and the same laws banning gay marriage limited marriage for transgender people by recognizing a transgender individual as their birth gender. There was no fundamental right to marriage for same-sex couples until Obergefell v. Hodges (576 U.S. 644 (2015)) in 2015 [65]. Discrimination against trans, nonbinary, and other LGBTQIA+ people also remains very present. Transgender people commonly report being harassed, physically assaulted, and economically discriminated against in the workplace [37,50]. There has been a wave of bills targeting transgender rights: access to gender-affirming healthcare, the right to cross-dress or perform in drag, the use of bathrooms and other gender-segregated spaces, education about gender identity issues, updating ID cards, and other civil rights, such as by removing antidiscrimination provisions [60,79]. All of these areas could be alleviated by better data protection and privacy laws that help guarantee identity self-determination, thereby reducing the risks of privacy harms in the absence of more thorough equal protection laws.

Gender Equal Protection, Missing Coverage
The current federal U.S. laws protecting LGBTQIA+ people from homophobia and transphobia are limited. They are narrow, covering only certain government-regulable sectors, and often end where they might infringe on another's rights. Title VII of the Civil Rights Act of 1964 prohibited discrimination based on protected categories, including sex, but this protection did not originally extend beyond biological sex to sexual orientation or gender identity. Courts narrowly interpreted the term "sex" under Title VII to mean only a person's sex assigned at birth. After Price Waterhouse and Oncale, courts would interpret Title VII to protect transgender people under legal theories that an employer discriminating against gender-nonconforming behavior discriminates based on sex "because the discrimination would not occur but for the victim's sex." 9 Over 20 states prohibit discrimination based on transgender status and sexual orientation in some form through statute. Common areas where discrimination is prohibited include employment, housing, credit, public accommodations, and education [40,74].
In 2020, the Supreme Court held in Bostock v. Clayton County that employment discrimination based on SOGI violates Title VII of the 1964 Civil Rights Act: "An employer who intentionally treats a person worse because of sex-such as by firing the person for actions or attributes it would tolerate in an individual of another sex-discriminates against that person in violation of Title VII." 10 However, the Court did not decide whether First Amendment protections or religious freedom protections under the Religious Freedom Restoration Act (RFRA) may provide exemptions to Title VII for such discrimination. The Supreme Court also ruled that a state public accommodation antidiscrimination law could not compel speech by businesses that disagree with LGBTQIA+ clients, 11 leaving further gaps and uncertainty in transgender protections. Nor is the protection comprehensive: transgender claimants will still face the difficulty of proving discrimination because of their sexual orientation or gender identity.
Because of these gaps in coverage, data privacy laws retain significant importance in protecting the privacy of LGBTQIA+ people at the most basic level, helping to prevent insidious discrimination by blocking intrusions into private life.

Fears and Harms: Privacy, Outing, and Digitized Dysphoria
Researchers have warned that accelerometer data infringes on user privacy because it can reveal information like location and identity [63]. Privacy is a tremendous concern on its own, yet using accelerometer data to predict a gender identity associated with one's gait may be particularly distressing to some communities of marginalized users. Although gait is considered biometric data that reveals personal information about its owner, the federal government has no comprehensive laws to protect the biometric data of U.S. citizens [69]. This has led to fears of the aforementioned privacy harms from the disclosure of SOGI data, as through public outings, but also of harms derived from the use of that data: fears of algorithmic bias and a loss of identity self-determination.
As of this writing, the U.S. lacks comprehensive data privacy protections and algorithmic regulations that would help prevent the gender recognition harms discussed in this paper. Most directly, the exposure of data is linked to privacy harms, but raw data like gait also supports inferences of more sensitive attributes, like SOGI data, and further inferences built on those that begin making decisions about people: which social groups they belong to, whether they will fit in a workplace, their access to healthcare, or whether a business will refuse them service over their identity. These are all outcomes that existing data privacy and equal protection laws seek to prevent for LGBTQIA+ people [67,105]. While there is a patchwork of state laws, federal regulations, and equal protection promises, there are significant concerns about growing algorithmic bias as more sophisticated and less explainable AI becomes commonplace. Harms in housing discrimination, employment discrimination, healthcare, and economic access might all occur without any direct, conscious human decision against queer people, but rather as the result of algorithms not designed for gender nonconformity. The insidious discriminatory effects of algorithmic bias have already been observed when comparing results between men and women, and across race [61,64,100].
The lack of fairness standards, black-box designs, contextual specificity, and the sheer efficiency of algorithms could create the potential for ubiquitous, insidious biases against people at the margins of the U.S. population [66,84]. Ethnic and gender minorities are the most likely to be subject to these biases, as they fail to be accurately recognized by algorithms trained on less diverse samples of data [62,107].
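This sampling-bias mechanism can be illustrated with a small synthetic experiment. Everything below is hypothetical: the "features," group structure, and model are toy stand-ins, not any deployed biometric system. The sketch shows how a classifier fit to a training sample dominated by one group can perform well for that group while performing near chance for an underrepresented group whose signal it never learned.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_group(n, label_dim):
    """Synthetic two-feature 'gait descriptors'; each group's label depends on
    a different feature (label_dim), so the model must learn each group's
    structure from that group's own examples."""
    X = rng.standard_normal((n, 2))
    y = np.sign(X[:, label_dim] + 0.3 * rng.standard_normal(n))
    return X, y

# Training sample dominated by the majority group (950 vs. 50 examples)
X_maj, y_maj = make_group(950, label_dim=0)
X_min, y_min = make_group(50, label_dim=1)
X_train = np.vstack([X_maj, X_min])
y_train = np.concatenate([y_maj, y_min])

# Least-squares linear classifier, a stand-in for any learned model
w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

def accuracy(X, y):
    return float(np.mean(np.sign(X @ w) == y))

# Fresh, equally sized test sets per group
acc_maj = accuracy(*make_group(2000, label_dim=0))
acc_min = accuracy(*make_group(2000, label_dim=1))
print(f"majority accuracy: {acc_maj:.2f}, minority accuracy: {acc_min:.2f}")
```

Because the minority group contributes only 5% of the training data, the fitted weights are dominated by the majority group's labeling structure, and the error rates diverge sharply between groups despite identical test conditions.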
Algorithmic bias and AI discrimination are areas the U.S. government has signaled intent to regulate, potentially through interpretations of anti-discrimination law as well as recent informal guidance. The FTC has released informal guidance indicating that it intends to use its powers to target AI discrimination as illegal, unfair, or deceptive acts and practices [55]. A joint body with representatives of the FTC, the Consumer Financial Protection Bureau, the Equal Employment Opportunity Commission, and the Justice Department's Civil Rights Division has stated its intent to enforce civil rights, non-discrimination, and other legal protections [30].
The effects of algorithmic bias on queer people are understudied. However, the direct, personal effects that gender recognition and other algorithms can have are more readily apparent. The labels and classification results of automated gender recognition have the potential to directly misgender trans and queer users and to induce third parties to introduce dysphoric material, like undesirable advertising [49,59]. AI and automated decision-making systems that adhere to a cisnormative design pose the risk of digitizing dysphoria as a constant in queer users' lives.
While a few researchers [63,93] have brought attention to privacy concerns, applying biometrics to non-binary people also entails the loss of nuanced identity data [5,28]. Therefore, using smartphone sensor data for gender prediction risks overgeneralizing gender norms, exacerbating privacy concerns, and imposing dysphoric implications on queer bodies.
For decades, most research on biometric gait recognition has focused primarily on accuracy and efficiency [43], and this largely remains true today. While tremendous advances have been made in gait recognition alongside progress in machine learning and artificial intelligence, the emphasis remains on new technological capabilities, with an absence of, or glossing over, the question of whether we should do this type of work given the potential technological harms and the reinforcement of gendered stereotypes.
Revisiting Mantyjarvi et al. [71] from earlier: they found promising results in unlocking a phone by assessing movement, such as gait, through the phone's accelerometers. However, they also noted difficulties and potential drawbacks because one's gait changes based on footwear and ground surfaces, and footwear in particular can be fluid [71]. Gafurov has also pointed out that the security of biometric gait recognition has not been studied [43]. Bouchrika later argued that gait would be "a more suited modality for people recognition in surveillance and forensic scenarios...due to non-intrusively and covertly from a distance even with poor resolution imageries" [16]. These observations all reveal potential failures and inaccuracies in the creation and interpretation of gait data, which in turn produce inaccuracies in gender recognition algorithms. Privacy harms, algorithmic biases, and digitized dysphoria can all be caused by unfettered and inaccurate predictive algorithms built on gait recognition.
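To make concrete how little machinery gait inference requires, the following sketch extracts a walker's step cadence from a raw accelerometer trace. The sampling rate, signal shape, and feature set are hypothetical illustrations, not drawn from the cited studies; the point is that even naive signal processing recovers personally revealing motion features, which is precisely what makes unregulated collection of this data consequential.

```python
import numpy as np

def gait_features(accel, fs=50.0):
    """Toy gait descriptor from a 3-axis accelerometer trace.

    accel: array of shape (n_samples, 3); fs: sampling rate in Hz.
    Returns (dominant step frequency in Hz, mean magnitude, magnitude std).
    """
    mag = np.linalg.norm(accel, axis=1)        # orientation-invariant magnitude
    centered = mag - mag.mean()                # remove the gravity/DC component
    spectrum = np.abs(np.fft.rfft(centered))
    freqs = np.fft.rfftfreq(len(centered), d=1.0 / fs)
    dominant = freqs[1 + np.argmax(spectrum[1:])]  # skip the DC bin
    return float(dominant), float(mag.mean()), float(mag.std())

# Synthetic walk: ~2 steps per second riding on gravity, plus sensor noise
rng = np.random.default_rng(1)
t = np.arange(0, 10, 1 / 50.0)                 # 10 s at 50 Hz
vertical = 9.81 + 1.5 * np.sin(2 * np.pi * 2.0 * t)
accel = np.column_stack([
    0.3 * rng.standard_normal(t.size),
    0.3 * rng.standard_normal(t.size),
    vertical + 0.3 * rng.standard_normal(t.size),
])
step_hz, mean_g, std_g = gait_features(accel)
print(f"dominant step frequency: {step_hz:.2f} Hz")
```

Cadence and magnitude statistics like these are exactly the kinds of features that downstream identification or gender-prediction models consume, and nothing in this pipeline requires user awareness or consent.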

FOOD FOR THOUGHT & QUEER FUTURES

Codifying Inclusive Law
Some proposals promise to regulate algorithms and AI, focusing on algorithmic bias and discrimination, but most are only in their early stages and have not taken effect. The E.U. AI Act has been the furthest reaching. It is structured by differentiating between purposes: AI systems with low-risk purposes, like spam filters, require limited disclosure, while high-risk systems, such as AI for autonomous vehicles and medical devices, demand greater testing, transparency, and accountability. The AI Act also deems some purposes, like using AI for a social credit scoring system as in China, unacceptable [38,80]. Yet these early proposals still lack inclusion of queer communities in the international legal landscape. The U.S. has seen several proposals, with the Biden Administration releasing a Blueprint for an AI Bill of Rights that covers algorithmic discrimination and data privacy protections while incorporating other concepts of data protection [81]. Federal legislative action to curb AI harms has been proposed in the Algorithmic Accountability Act of 2019, introduced by Senator Ron Wyden and updated in 2022. While it features protections against algorithmic discrimination and greater requirements for impact assessments, neither bill has proceeded far in the legislative process [1,41].
Regulatory agencies do, at least, have a prominent role, with the FTC taking a strong position on regulating AI creation and algorithmic use under its focus on unfair business practices, as well as its credit reporting and equal protection powers. A recent FTC settlement required the deletion of a facial recognition algorithm developed from user data collected without consent [45,98]. This punitive measure has the potential to be a strong deterrent to bad data practices, but it remains to be seen whether the FTC will begin regulating biased algorithms in the same way.
The creation of such new bills and regulations would hopefully redress the queer community's growing concerns about algorithmic bias and about the unrestrained development and use of AI not designed for LGBTQIA+ people. The algorithmic and privacy harms currently theorized could escalate and rise in frequency, creating an online ecosystem where queer people face automated discrimination and digitized dysphoria when interacting with apps and websites whose algorithms are designed only for a gender binary. Regulation can shape how data are captured and how algorithms are created so that they are more equitable and acknowledge gender minorities. And while new regulations and laws in the U.S. might be inevitable, technology developers and industry need not wait for their creation before becoming more inclusive.

Building Inclusive Technologies
Our paper highlights significant gaps in gender inclusivity within biometrics and broader AI-driven technology. For instance, designing a gait recognition task based on mobile device placement on one's body is one of many examples that not only infringe upon user privacy but also perpetuate gender-normative design. Choosing to control for clothing in gait recognition is another myopic example by which binary gender assumptions propagate into the erasure of non-hegemonic identities. In addressing these gaps, an opportunity presents itself: ML researchers and technopolicy makers alike can move toward addressing privacy and representational harms through critical discourse on the social norms driving their respective sociotechnical design choices. In our examples, this looks like pausing to question biometric approaches infused with cis- and heteronormativity and contextualizing their impact on gender minorities.
Several scholars provide avenues by which wearable technology can intentionally support the existence of queer bodies [5, 15, inter alia]. These works, among many others, serve as examples for reimagining inclusive biometric technologies. Furthermore, because ML researchers and technopolicy stakeholders co-exist within the broader AI ecosystem, we strongly encourage open and honest conversation about the ways AI harms can directly result from a lack of diversity in the inclusion criteria that shape AI-driven systems. Asking these questions may be facilitated through reflexivity, a vehicle for investigating how one's own biases, values, and social locations are imparted onto a given process. Here, taking one's positionality into account can help illuminate gaps in coverage during task definition in data-driven processes and legislation alike [17,18,82].
Dr. Ian Malcolm's quote highlights a tension we have explored throughout this paper: access to large-scale mobile biometrics, the power of choice, and the consequent possibilities for harm. While mobile biometrics can add value to daily life, there are many contexts in which operationalizing biometric data erases non-hegemonic identities, as evidenced earlier by the ways gender prediction both reinforces stigmas about queer bodies and erases them. Dr. Malcolm's quote reminds us to deliberately scrutinize the assumptions behind decisions influencing AI ecosystems as a whole. Even if we can leverage mobile biometrics to infer a "gender label" via clothes-segmentation techniques and smartphone placement on the body, should we?