“That’s Kind of Sus(picious)”: The Comprehensiveness of Mental Health Application Users’ Privacy and Security Concerns

With the increasing usage of mental health applications (MHAs), there is growing concern regarding their data privacy practices. Analyzing 437 user reviews from 83 apps, we outline users' predominant privacy and security concerns with currently available apps. We then compare those concerns to criteria from two prominent app evaluation websites – Privacy Not Included and One Mind PsyberGuide. Our findings show that MHA users have myriad data privacy and security concerns, including a user's control over their own data, but these concerns seldom overlap with those of the experts behind the evaluation websites, who focus more on issues such as required password strength. We highlight this disconnect and propose how the mental health care ecosystem can better guide MHA users in choosing, and privacy, security, and mental health technology experts in evaluating, potentially useful mental health apps.


INTRODUCTION
Since 2020, at least 4 in 10 adults in the United States have encountered significant levels of psychological distress [53]. Yet, around 40% of people who have experienced mental health symptoms did not receive mental health treatment due to high cost and lack of access to services [47]. In addition, the burden of these conditions disproportionately affects populations with limited access to mental health treatment [17]. In response to rising mental health needs, mental health support has been embedded into mobile applications (apps) that provide supported care, illness management, symptom tracking, and teletherapy as a more convenient, accessible, and low-cost mental health care mechanism for people in need [6].
As the usage of mental health apps (MHAs) increases [12], there is growing attention on the protection of user privacy in MHAs due to the sensitive nature of mental health data compared to other forms of health data [8] as well as the lack of regulatory oversight [21,40]. In the United States, the Health Insurance Portability and Accountability Act (HIPAA) [3] protects a consumer's health information collected by covered entities (e.g., health care providers, health plans, health care clearinghouses) from being disclosed to any party without the consumer's consent. However, online therapy apps that are subject to HIPAA requirements, such as BetterHelp and Cerebral, sometimes fail to comply with these requirements by sharing users' personal information with third parties without users' consent [22,72]. Outside of online therapy apps, MHA development companies, such as those that create self-management apps where users input their own data within the app, are not considered covered entities, so they are not required to comply with HIPAA regulations [2]. Some states have attempted to remedy this by requiring MHAs to provide more comprehensive privacy policies. For example, the California Consumer Privacy Act (CCPA) [1] requires privacy practices to be disclosed to users for apps that collect personal data. Yet, the privacy practice disclosures for MHAs are often unclear [31,48,60], inaccessible [34] and, even more egregiously, have been shown to be misleading [45].
Privacy risks associated with stigmas surrounding mental illness can have negative impacts on individuals' well-being [11,46]. As such, for people who are vulnerable to harm due to their mental health conditions and need to seek mental health support, it is crucial to prioritize and protect their privacy when they engage in online mental health care. Moreover, vulnerable populations are more susceptible to violations of their privacy [63]. Prior work in the field of HCI has advocated for more inclusive privacy and security design for under-studied populations [42,77], including people with disabilities [28], older adults [43,58], and people with visual impairments [67]. Yet, there is still a need for a more comprehensive understanding of the data privacy and security concerns of actual MHA users to inform future privacy and security designs for people with mental health concerns.
In this paper, we aim to understand users' concerns regarding the data privacy and security of MHAs and compare them with concerns from privacy and security experts as well as mental health experts. We first qualitatively analyzed 437 user reviews pertaining to privacy and security from 83 apps on the U.S. Apple App and Google Play Stores to determine what data privacy and security aspects users are currently concerned with. We then compared these concerns with the concerns listed on two prominent app evaluation websites created and maintained by experts: Privacy Not Included and One Mind PsyberGuide. Our findings elucidate the gaps users have in understanding the potential risks of MHAs. However, we also identify a number of specific concerns that are not addressed by current assessment sites. We make recommendations for how MHAs can be better designed with privacy and security in mind, as well as how mental health practitioners can support their clients' decision-making when choosing an MHA.

RELATED WORK

Privacy and Security Risks and Concerns for People Suffering from Mental Illnesses
Data privacy is becoming an important ethical consideration for mental health technologies, as mental health data is regarded as more sensitive than other forms of health data [8]. The negative impacts on people's well-being due to stigmas surrounding mental illnesses can lead to social isolation, employment risks, negative therapeutic outcomes, aversion to treatment seeking, and more [18,19,27,38,44,62]. Several studies have delved into the risks associated with the loss of privacy for people with mental health concerns when they engage in online activities [43,46]. For instance, a study by Naslund and Aschbrenner highlighted that privacy risks for users with serious mental illnesses engaged in social media could lead to concerns about stigma, judgment from others, and cyberbullying [46]. Likewise, other studies exploring online safety and privacy for autistic youths found increasing privacy risks online that resulted in poorer well-being for autistic youth compared to non-autistic youth [41,61]. However, to the best of our knowledge, the precise consequences of privacy violations for individuals suffering from mental illnesses remain unknown. While research has demonstrated that MHAs can be effective interventions for alleviating symptoms of mental illness [23,39], concerns about MHA user data privacy and safety have arisen among clinicians and mental health practitioners [7,21,52,69]. In a report published by the American Psychological Association (APA) [21], psychologists urged other practitioners to use caution when recommending or implementing MHAs in their practices. The report warns clinicians about the risk of data breaches and the limits of data encryption, stating that, unlike information obtained and stored by the psychologist, data collected by MHAs are susceptible to outside forces such as hacking and interception, which limits a practitioner's ability to take the "reasonable precautions" of privacy and confidentiality as required
by the APA ethics code [10]. There are also concerns about what a breach in data privacy could mean for the well-being of a patient [37]. The risk to patients as well as the risk to ethical principles has led practitioners to be wary about implementing MHAs in their practice [21,40,76].
Our work attempts to understand MHA users' concerns with mental health apps that collect, exchange, and store users' personal and health data. Past work in personal informatics and health (e.g., self-tracking [35,74], mood tracking [64], intensive data monitoring [13,14], health information exchange [66], mobile apps [20,78]) has explored people's use of and perspectives on data-driven support in mental health management. However, only a few studies have specifically focused on the privacy perspectives of MHAs. In survey studies on people's perspectives on mental health data collection, researchers found that people are more willing to share health data than personal data as long as it is beneficial to their mental health [13,20,78]. On the other hand, Blair et al.'s and Shen et al.'s interview studies found that users' willingness to share health data is context-dependent and based on the level of trust they have with the people they are sharing their data with [14,66].
In this paper, we are interested in the experiences of users who engage with MHAs that collect users' data. By analyzing users' feedback on their experiences with MHAs, we are able to provide guidance to MHA users and experts from the fields of privacy and security and mental health technology to make informed decisions about the use of MHAs.

Data Privacy in MHAs
While it is important to ensure the safety and protection of data for users of MHAs, several studies found that the current MHA industry puts little effort into safeguarding users' data privacy and security while actively urging them to share information [32,50,51]. Furthermore, research has revealed that data sharing with third parties is a common practice in MHAs [31,50], posing a potential threat to users' privacy. A recent study conducted by Iwaya et al. discovered that the data collected by MHA companies could be linked to users' identities, which could potentially result in the sharing of linkable data with third parties [32].
However, users often have little or no way to be aware of such sharing practices when they engage with MHAs. While privacy policies are discoverable on MHA companies' websites, most policies require college-level reading comprehension abilities [32,54,60], and are not up-to-date [31] or transparent [48,60] about their privacy practices. HCI researchers have explored methods to help users easily understand privacy policies using nutrition labels [36], comic strips [70], and interactive dashboards [59]. However, the privacy labels on popular app stores, such as the Apple App Store, are still difficult for users to decipher [79]. Moreover, Mozilla found that the privacy labels of apps on Google Play are often inconsistent with their privacy policies [45]. Without knowing how an app shares data, users could falsely assume that their data are kept private and secure within the app. This could result in companies profiting from vulnerable communities (i.e., people with mental health conditions) by exploiting this sensitive data.
To better understand MHA users' perspectives on the data they share with the apps, we analyzed user reviews from 83 MHAs available on the Apple App and Google Play Stores. Previous work in HCI has conducted user review analyses of MHAs [15,29] to investigate users' experiences and perspectives, briefly mentioning privacy and transparency as part of their findings. To extend our understanding of MHA users' experiences as well as their concerns regarding data sharing with MHAs, we conducted a two-phase study. In Phase 1, we provide an in-depth analysis of user reviews related to data privacy. In Phase 2, we compare our findings to the reports experts provide on Privacy Not Included and One Mind PsyberGuide.

Expert Evaluation Criteria
There are two expert evaluation sites that have reviewed MHAs to provide guidance for people interested in using MHAs – Privacy Not Included and One Mind PsyberGuide. On the former site, the apps were reviewed by privacy and security research experts [26], while on the latter, they were reviewed by mental health technology experts [57]. In this paper, we refer to these two sites as "experts."

2.3.1 Privacy Not Included. Mozilla's Privacy Not Included [24] buyer's guide aims to "help you navigate this landscape by understanding what questions you should ask and what answers you should expect before buying a connected tech product." For their analysis, they explain that they "look at things like privacy policies, company websites, news reports, research whitepapers, app store listings, consumer reviews, and anything else we can find and trust to inform our research." They do not purchase or download the app to evaluate it. They then assign a *Privacy Not Included warning label to a product if it receives two or more warnings from their criteria; this also means that each criterion is weighted equally. The criteria for assigning a warning listed on their website include: Permissions (including tracking), Privacy (including use and control of one's data), and Minimum Security Standards (including encryption and privacy policies). In addition, the website lists untrustworthy artificial intelligence as a concern consumers should be cognizant of; however, at the time of our analysis, the experts did not use that criterion in their warning label assessment due to the limited information available from the product companies.
2.3.2 One Mind PsyberGuide. One Mind PsyberGuide [56] was developed by a non-profit organization and brings together experts in mental health practice, technology, and digital mental health. Expert reviewers score mental health apps on three categories: credibility (i.e., is the app likely to have a positive effect on mental health), user experience (e.g., how fun and easy is the MHA to use), and transparency (i.e., is information on data collection, storage, and exchange readily available and understandable). We specifically focus on the criteria specified in the last category. App privacy policies are rated as acceptable, questionable, or unacceptable based on whether sufficient information is provided.

PHASE 1: MHA USER REVIEW THEMATIC ANALYSIS
In the first phase of this study, we sought to understand MHA users' perspectives on data privacy and security concerns. In the following subsections, we outline the methodological approach used to collect and analyze our data, followed by the thematic findings.

Phase 1 Methods
We used an interpretivist approach to understand MHA users' experiences and perspectives on the data privacy and security of mental health apps. We conducted a qualitative analysis of user reviews from publicly available mental health apps. HCI researchers have employed user review analysis as a promising method to understand real-world users' experiences with mobile apps [15,29,30,33].
Guided by past work that conducted user review analyses of mental health apps [9,15,65,68], we leveraged publicly available data from U.S. app stores to gather qualitative feedback from MHA users across different types of mental health apps (i.e., self-care, teletherapy, peer support). While we did not identify users' demographics, nor did they leave identifying information in their reviews, we were able to collect user reviews from a wide range of individuals using apps designed for people with mental illness. The study was approved by our university's Institutional Review Board.
3.1.1 Selection of Sample Apps. We followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) four-phase process to ensure transparency in our data inclusion (see Figure 1) [49]. We identified apps from keyword searches and the top free and paid "Health and Fitness" charts on the Apple App Store and Google Play from June to July 2023. We utilized "AND" and "OR" operators with the terms "mental health", "therapy", and "care" when we performed the keyword searches on both app stores. In addition to browsing apps on both app stores, we used Google advanced search to find apps that did not show up on the app stores due to the ranking algorithms. We applied the following inclusion and exclusion criteria:
• Inclusion criteria: (1) teletherapy, self-care, and peer support apps that specifically mentioned the type of care (e.g., guided meditation) or therapy (e.g., CBT), (2) apps that target mental health conditions (e.g., anxiety, depression, addiction), and (3) apps available in English.
• Exclusion criteria: apps that (1) have fewer than 50 reviews, (2) were last updated prior to 2020, (3) do not collect data, (4) are not available in English, (5) do not explicitly state that they target mental health conditions, and (6) do not explicitly state the type of care or therapy provided.
The first author carefully screened the descriptions of every app that appeared in the app store keyword searches, top "Health and Fitness" charts, and Google advanced search, and added those that met the inclusion criteria to the corpus (Apple App Store: n = 270, Google Play: n = 201). For each identified app, the first author retrieved the number of user reviews and the date of the latest version, and collected data from the privacy labels on the app stores. We aimed to focus on apps with high and active user engagement, and thus we decided to exclude apps that have fewer than 50 reviews or were last updated prior to 2020. Since privacy labels on popular app stores might be incorrect, as found by Mozilla [45], for apps that did not indicate any data collection on the app stores, the first author went to the app companies' websites to investigate whether there is a privacy policy. Additionally, the first author reviewed the app description and user reviews on the app store's website to determine if there were any features that collected user data. Only after this thorough evaluation was a determination made as to whether an app did not collect data and thus was not part of our evaluation. The exclusion criteria were then applied, resulting in 140 discrete apps (Apple App Store: n = 87, Google Play: n = 94, with an overlap of 41 apps) that were included in the analysis (see Figure 1). We report the final list of MHAs from both app stores in Appendix A.1, Tables 1-3. We then excluded reviews that (1) were not in English, and (2) were fewer than 100 characters (according to a previous study [75], the average length of an app user review is 117 characters). We applied our exclusion criteria to ensure that the reviews we analyzed would generate meaningful insights. For apps that appeared on both the Google Play and Apple App Stores, we included user reviews from both platforms with the intention of including both Android (Google Play) and iOS (Apple App Store) users in our analysis. This resulted in 23,968 reviews from Google Play and 31,120 reviews from the Apple App Store.
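The app- and review-level exclusion criteria described above amount to a simple filtering pipeline. As an illustration only – the record field names and the precomputed language flag are our assumptions, not the authors' actual tooling – a minimal sketch in Python might look like:

```python
# Hypothetical sketch of the screening criteria described above.
# Thresholds come from the paper; data shapes are illustrative assumptions.

MIN_APP_REVIEWS = 50      # exclude apps with fewer than 50 reviews
MIN_YEAR_UPDATED = 2020   # exclude apps last updated prior to 2020
MIN_REVIEW_LENGTH = 100   # exclude reviews under 100 characters

def keep_app(app: dict) -> bool:
    """App-level exclusion: engagement, recency, and data collection."""
    return (app["review_count"] >= MIN_APP_REVIEWS
            and app["last_updated_year"] >= MIN_YEAR_UPDATED
            and app["collects_data"])

def keep_review(review: dict) -> bool:
    """Review-level exclusion: English only, at least 100 characters."""
    return (review["language"] == "en"
            and len(review["text"]) >= MIN_REVIEW_LENGTH)

# Toy records standing in for scraped app-store data.
apps = [
    {"name": "AppA", "review_count": 120, "last_updated_year": 2022, "collects_data": True},
    {"name": "AppB", "review_count": 30, "last_updated_year": 2023, "collects_data": True},
]
reviews = [
    {"app": "AppA", "language": "en", "text": "x" * 150},
    {"app": "AppA", "language": "en", "text": "too short"},
]

kept_apps = [a["name"] for a in apps if keep_app(a)]
kept_reviews = [r for r in reviews if keep_review(r)]
```

Running this toy example keeps only AppA (AppB has too few reviews) and only the first review (the second is under 100 characters), mirroring how the criteria winnow the corpus.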
Similar to previous work [9,15,29,68], we conducted a thematic analysis following Clarke and Braun's [16] six stages of thematic analysis. However, given our study's specific focus on privacy and security concerns, we included only reviews relevant to the study's scope. This approach was well-suited to identifying a wide range of privacy and security concerns raised by users.
The first two authors familiarized themselves with the data by screening randomly selected reviews. Then, we defined reviews as "relevant" or "irrelevant" based on users' experiences with data and perspectives on privacy and security. Specifically, reviews relevant to the scope of this study are those reflecting users' experiences with sharing data with the apps. This includes personal information (e.g., name, email address, credit card number, biometric data), mobile sensing data (e.g., audio recordings, location data), and health data (e.g., tracked physical or mental data, mood journal entries, PHI). Examples of "irrelevant" reviews are those pertaining to app usability issues or cost concerns, or reviews that only mention a feature without explaining the user experience.
The first two authors and an undergraduate research assistant screened reviews together from randomly selected apps based on the definition of "relevant" reviews, discussing any uncertainty about what constitutes "relevant" and "irrelevant" until a consensus was reached. Then, all three researchers independently screened all reviews collected from each app store and kept each other informed of any uncertain reviews in order to reach a consensus on them. The process of narrowing down the reviews is shown in Figure 2.
The first two authors then assigned initial codes to the identified user reviews using an inductive approach. After gathering the codes into candidate themes, all authors reviewed the themes together and conceptualized new themes and subthemes. At the end of the analytic process, we had identified 437 reviews from 83 discrete apps that were relevant to the scope of the study, with a total word count of 36,759 across all reviews and an average of 84 words per review. We then organized them into four themes: types of data being collected, third party involvement, data safety, and agency. The final set of inductive codes is available in Appendix A.1, Table 4. We report the themes as subsections in our findings. Since the user reviews contain sensitive information, we paraphrased the quotes presented in the following sections, as this is a best practice to reduce the risk of user profiles being identified [73].

Phase 1 Findings
3.2.1 Types of Data Being Collected. Our first reported theme demonstrates that users are concerned when asked to provide data linked to their identity. This includes personal information such as first and last name, date of birth, phone number, and email address, as well as biometric identifiers such as fingerprints and facial recognition. Users often referred to the collection of identifying information as "intrusive" and "unnecessary". A number of reviews also expressed concerns when the app requested protected health information (PHI) governed by HIPAA regulations. For instance, in comment R306, the user stated that they were "not comfortable with non-professionals to collect my HIPPA information." This comment and ones similar to it highlight that users are hesitant when asked for identifying information by an app which they may not fully trust because they understand the potential risks associated with providing that information.
Users expressed similar concerns about granting permission to track data on their devices, indicating that they were uncomfortable when the app tracks their browsing history and purchase history: "The amount of personal data that it collects is crazy! I get that they might collect certain data to make things run smoother. Why is it necessary to monitor my personal browsing history or purchases? Why do they need to gather so much of my personal information? You can't even use all the features without agreeing to their data collection practice." (R41) On the other hand, some users held negative views toward what they referred to as "anti-privacy" trackers collecting personal and device information, as indicated by this comment from R419: "Decent app, but has anti-privacy trackers. I'm willing to invest for such an app, but not if it uses trackers from Google and Facebook." Even though users may initially find the app helpful, their opinions shift negatively once they become aware of the presence of trackers that potentially invade their privacy.
3.2.2 Third Party Involvement. The theme of third party involvement arose from reviews indicating users' concerns regarding data being shared with or sold to other platforms for advertising purposes. User comment R263 expressed the user's concerns in an ironic manner, stating, "Expecting to receive spam emails from people who has my medical information once it's sold to the highest bidder. Shady." This user seemed particularly worried about a potential, undesirable consequence of data selling – receiving spam emails. Another user had also received emails from third party services and wrote, "They may claim they don't sell your information, but I've begun to receive emails from other mental health services. Coincidence? Definitely not." (R101) This user suspected that their data had been shared with third party advertisement companies based on the unusual emails they received. Unlike the previous quote, however, this quote suggests that the user lost their trust in the app after they began receiving spam emails from marketers.
Apart from data sharing for advertising purposes, some users identified other third parties that might collect their data for non-advertising purposes, with whom they would prefer not to share their data. For instance: "Your policies provide ample room to change your mind. My personal, sensitive data about my well-being might be sold if your company is sold out or declares bankruptcy. What if my health insurance provider or my employer gets it?" (R31) After reading the MHA company's policy, this user was particularly concerned about how their data could be sold to health insurance companies or accessed by employers. Such scenarios could potentially impact their insurance rates as well as employment status. Findings in this theme highlight user uncertainty and distrust regarding how MHA companies utilize their sensitive data.

3.2.3 Data Safety: Mishandling Data. Some reviews mentioned undesirable outcomes that made users suspicious about how the app mishandled their confidential data, specifically regarding the usage of data without users' consent. For example, in an online therapy app, one user mentioned that their therapist abused the information that was shared: "The therapist assigned to me abused my trust. They asked me to recommend solutions to another patient where they revealed the patient's name and personal information, violating patient confidentiality. I felt disregarded, manipulated, and as if I was paying to be exploited. This violates my trust and harms my mental health." (R27) It was evident that the therapist on this platform violated HIPAA regulations by disclosing another patient's personal information. This comment is particularly alarming when considering that therapists are trained to adhere to HIPAA regulations not only for legal reasons, but also to uphold the ethical principles and expectations of therapists [10]. In addition to triggering negative emotional states in users, mishandling of data can also lead to security risks in other, non-mental-health-related apps. For example, this user became suspicious about the data that they provided to the mental health app they used when they encountered a potential security breach on their personal Google account: "They request for an email and password when installing and opening the app makes me kind of suspicious. What's even more concerning is receiving a notification about an attempted login to my Google account. It could have been a coincidence, but the occurrence of both events together suggests that the creators might be accessing, hacking, or otherwise obtaining people's emails due to having access to their passwords."
While the collection of identifying data itself raised concerns, the attempted, unauthorized access to the user's personal Google account raised even more concern. Unfortunately, this review also indicated that, although the user may have been aware that sharing personal information carries risks, this awareness did not deter them from providing said personal information. It was only when the risks were realized that they started to become genuinely concerned about their online security. Fortunately, from the user review, it appeared as though the user intervened before their Google account had been accessed.
Other reviews also demonstrated user awareness and alertness about their data safety, with users consulting online information (i.e., Federal Trade Commission reports, Privacy Not Included) to assess the security of the specific app they were using. For instance, one user of the Cerebral app, an app that has been subject to intense public scrutiny, left a review stating: "They recently had a breach that leaked every patient's data, even their social security number. So that's GREAT." (R96) In this review, the user indicated that they no longer trusted this company due to their concerns about how the app misused users' personal information.
According to news reports, Cerebral shared 3.1 million users' personal information through third-party tracking technologies without users' consent [72]. Among that information was PHI – information that, under HIPAA, is expected to be kept confidential. Reviews of other apps also indicate the sharing of PHI and other personal information, and news stories continue to report data breaches within MHAs. As a result, users lose trust in MHAs as concerns for their own safety increase.

3.2.4 Data Safety: Lacking Security Measures. Among users who expressed concerns about the lack of security measures, many emphasized the importance of security settings, such as a password, passcode, or Face ID, to prevent others with access to their phones from accessing their diary entries in the app. In R184, the user suggested a security lock to ensure the privacy of their data: "I suggest to add an optional security lock feature. I wouldn't feel comfortable if anyone else could open this and see everything since it's private." Users value their privacy and want assurance that the people around them will not be able to access their mental health data entries. Therefore, a security setting that prevents others from accessing the app appears to be an essential feature when users engage with MHAs that store users' diary entries as well as other forms of health information.
Other reviews mentioned the lack of encryption of the data. Without a clear assurance of data encryption, users are unwilling to provide their personal information due to concerns about unauthorized access to their data. For example, in R153, the user wondered: "Where's the end-to-end encryption? It would be very helpful and reassuring if the app's contents are encrypted because people are storing the history of their life on this app. Without this type of encryption, the entries are visible to anyone else with access to Apple's servers." This user seemed very much aware of the risks of storing unencrypted information on the MHA's server. The lack of transparency regarding how their data were being protected within the app led the user to question the safety of their data.
Finally, concerns regarding the lack of protection for vulnerable groups, specifically minors and individuals with mental illnesses, were raised on multiple occasions. Concerns related to minors were mostly brought up in apps where users interacted with other user-generated content (e.g., peer support) or AI-generated responses (e.g., chatbots). In R57, the user mentioned: "They failed to implement any measure to protect minors from accessing the app, and they collected their data. It's as if their legal team is a bunch of drunks." Here, the user expressed worry that the app failed to consider the protection of minors in the data collection process. This user also indicated that such an occurrence may be an illegal act. A number of other reviews mentioned that the app not only failed to protect vulnerable users, but also exploited the vulnerabilities of individuals with mental illness to collect personal information for potentially nefarious reasons. For instance, this review clearly indicated a user's concerns about the private information that was gathered during the process of addressing cognitive distortions: "People use this app to sort through their most private and personal thoughts and feelings. The app developers are responsible for protecting the private information they collect, especially considering its therapeutic nature. These developers target vulnerable people, encouraging them to share their private thoughts in the app to seek help. They exploit data obtained from individuals who are uninsured and under-insured to line their own pockets. Apps like these need more federal oversight." (R142) While individuals with mental health conditions need support to alleviate their condition, this user was particularly concerned that the app exploited this "need" to collect private information. Moreover, this user explained that the app developer had a responsibility to protect such information but, instead, used the information to profit off vulnerable users.

3.2.5
Agency: Ownership of the data. Users demonstrated concern about the lack of control over their data. Specifically, they were worried about not being able to manage their data after it had been collected. This included the inability to edit and delete personal information and other data entries from the MHA, an instance sometimes referred to as "no respect for [their] privacy". When requesting to delete personal information, one user became suspicious when their request was not fulfilled: "They won't remove your information. It'll remain in their system for years, ready to be sold. They charged my credit card and refused to remove my personal data." (R101) Not being able to remove personal information upon request elevated users' concerns that the company would share their information with third parties without their consent. Similarly, when considering the risks associated with the collection of personal information, users were worried about not being able to control data sharing with third parties.
"The app may claim to be free, it actually sell your data without a way to opt out of it with 'do not sell my data' feature. This is concerning because paying a small subscription fee cost far less than having your mental health information sold to third parties. Basically, their privacy policy is very weak." (R214) As demonstrated in the above review, the user wondered why it was not possible to avoid sharing their data with third parties and suggested that they would prefer to pay the MHA developer than risk their privacy being violated by the developer selling their data for a profit. A willingness to pay for data protection and negative reactions to not being able to manage their data show how concerned users are that, once they provide their information, they will lose ownership of their personal data. In other words, the company remains in full control of a user's data.

Agency: Manipulative data practices. When asked to provide personal information, many users reported concerns about deceptive data collection practices. Specifically, apps would "force" users to input personal information to proceed with using the app, without giving them the option to skip the data collection process. Several reviews mentioned that an app compelled users to give permission to receive emails and did not offer an option to opt out, as seen in this review: "They force you to enter your email to see your results and do not provide an option to unsubscribe from future emails. I'll probably end up on every freaking spam list out there." (R131) This user indicated that the app employed a deceptive strategy to entice users to provide their email addresses to access app content while denying them the choice to opt out of the email list. Under such manipulative practices, users are more likely to provide information against their will in exchange for access to personalized app content.
In addition, users were concerned about the lack of transparency regarding data practices, even when they read the privacy policy. Without complete disclosure from the app, users often remain uninformed about the usage of their personal data. In one review (R178), a user stated: "They did not disclose whether your information is used for marketing purposes. For example, if you are trying to improve your fitness, you will start receiving with ads for gyms and personal trainers. Even after reading their privacy policy, it's still not clear." This example illustrates that the user perceived no way to find detailed information about the app's data sharing practices from publicly available content. Moreover, some reviews also mentioned the inability to contact the company regarding the privacy policy, as seen in this review: "I contacted the email address listed in their privacy policy because I have questions about it, but it came up as undeliverable." (R186) Even when the user proactively sought out more clarity on the privacy policy, they were met with roadblocks that prevented them from getting answers, perpetuating the loss of agency over their own data once provided to the app.
Lastly, users consistently brought up unethical data usage by the apps. Many users complained about unauthorized charges to their credit cards, the inability to cancel app subscriptions, and other unexpected charges after providing payment information. Oftentimes, apps allow users to access the app using a "free trial" while charging them immediately after they sign up, as seen in this review: "When I signed up and tried the free trial, they immediately charged my card. That's just sketchy to me." (R80) In this case, the app deceived users by claiming there was a "free trial" in order to access the user's payment information. Even more, many users indicated that an app will "randomly charge" them without their consent.
"They'll store your card details and randomly charge you after you've canceled your subscription. My wife canceled before the free trial ended but still got charged. We canceled a year ago but still got charged today." (R176) From this review, it becomes evident that once the user provides their payment information, an app can charge the user at its own discretion, leaving the user without any control over their finances. Several reviews also revealed how apps will employ a paywall to block users' access to their data until a payment is received, as this review stated: "I'm sure it's against the law to restrict access to users' private medical data behind a paywall. So disappointed." (R389) In this and similar instances, the app essentially held a user's data hostage, demanding money before allowing the user access to their own private data. The lack of regulations surrounding the actions of MHAs may be a contributing factor, as many app companies appear to prioritize profit over user data privacy and security.

PHASE 2: COMPARISON OF MHA USERS' PRIVACY AND SECURITY CONCERNS TO THOSE IDENTIFIED BY EXPERTS
In the second phase of this study, we collected the privacy and security criteria listed on two prominent app evaluation websites and compared them to the concerns identified in Phase 1 of the study.

Comparison of User to Expert Concerns
There are two prominent app evaluation websites that provide information to concerned users who are evaluating the privacy and security of an MHA: Privacy Not Included and One Mind PsyberGuide.
We collected the evaluation criteria listed on both websites [25,55], referred to as "expert concerns", and compared them with the concerns from the user reviews. The expert concerns are listed on the right side of Figure 4. Figure 3 presents the process of how we analyzed the user review data and compared it with the experts' concerns. From Phase 1, we had 60 low-level codes (see Appendix A.1 Table 4) to compare to the expert concerns collected from the websites. Many of these codes overlapped in concept and differed only in their specific application (e.g., data selling to marketers, data sharing with third parties). We combined these codes, referred to them as "MHA user concerns", and mapped them to the expert concerns with similar meanings, as shown in Figure 4. After this mapping exercise, we were left with user app review codes that were not addressed by any of the expert websites. Finally, we counted how often each user app review code was used and converted that count into a percentage of the total number of concerns coded. In the following section, we present the comparison between the codes and the percentage of occurrence as a heat map and then discuss the differences.
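As a minimal sketch of this counting step, assuming hypothetical codes and a hypothetical code-to-concern mapping (not the study's actual data), the aggregation behind the heat-map percentages could look like:

```python
from collections import Counter

# Hypothetical low-level codes assigned across user reviews (Phase 1).
coded_reviews = [
    "data selling to marketers", "data sharing with third parties",
    "data selling to marketers", "no encryption",
    "cannot delete data", "cannot delete data", "forced email signup",
]

# Combine codes that overlap in concept into broader "MHA user concerns".
code_to_concern = {
    "data selling to marketers": "sharing/selling data to third parties",
    "data sharing with third parties": "sharing/selling data to third parties",
    "no encryption": "lack of encryption",
    "cannot delete data": "inability to edit/delete data",
    "forced email signup": "manipulative data practices",
}

concern_counts = Counter(code_to_concern[c] for c in coded_reviews)
total = sum(concern_counts.values())

# Express each combined concern as a percentage of all coded concerns,
# which is the relative frequency shown in the heat map.
heatmap_values = {
    concern: round(100 * n / total, 1)
    for concern, n in concern_counts.most_common()
}
print(heatmap_values)
```

The combined concerns would then be matched against the expert criteria, leaving any unmatched keys as the user-only concerns discussed below.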

Phase 2 Findings
Figure 4 lists the combined concerns from these two websites and compares them to the concerns we have outlined in our Findings.
We present these findings as a heat map showing the relative frequency of each user concern that arose in our data analysis.
In the following subsections we discuss how the concerns outlined on these two expert app evaluation websites compare to our users' concerns.When comparing themes and subthemes to the categories of evaluation criteria established by Privacy Not Included and One Mind PsyberGuide, we found that a portion of users' concerns aligned with most of the expert criteria.However, there was still a mismatch between users' and experts' concerns regarding user data privacy and security.

4.2.1
Where user concerns aligned with the expert criteria. Many users mentioned privacy concerns related to the collection of their identifying information and expressed worries that the app might share or sell their data to third parties without their consent. Furthermore, users raised concerns about their inability to edit and delete their personal information, indicating a lack of control over their data. Both of these issues align with the privacy expert criteria outlined by Privacy Not Included. In addition to the privacy evaluation criteria, data tracking also emerged as a significant concern among users, particularly with regard to permission requests that were perceived as intrusive and unnecessary. When it came to security measures, a few users touched on the lack of encryption and the lack of protection for vulnerable populations (e.g., minors, people with mental illness), as well as an app's prior record of data breaches. These concerns also matched the expert criteria.

4.2.2
Where users did not mention expert criteria. Even though our user reviews covered a significant portion of the categories in the evaluation criteria, there were six expert criteria that were not mentioned or were mentioned only once or twice in the user reviews. Four of these fall into the "minimum security standard" category: the lack of automatic security updates, the lack of a requirement for strong passwords, the lack of a publicly available privacy policy, and the lack of contact information. Among all the user reviews included in this study, no reviews mentioned the lack of automatic security updates or requirements for strong passwords, and only two mentioned an inability to contact the MHA development company. The lack of user reviews about these evaluation criteria may suggest that users are not as concerned as experts with those aspects of MHAs.

4.2.3
Where users had concerns that were not being evaluated by experts. When it came to the data collection process, users specifically mentioned concerns about the confidentiality of their PHI and potential HIPAA violations by the app. However, it is unclear how well users understand what information is protected under HIPAA solely from their mentioning its violation in their reviews. Nevertheless, users mentioned HIPAA violations on multiple occasions, yet the experts did not include adherence to HIPAA guidelines as an evaluation criterion on their websites. The expert criteria also fall short in offering a comprehensive review of the control users should have over their data. Some users noted that they were unable to disable data tracking or data sharing, or to opt out of email lists, after providing their personal information. However, the expert criteria only covered a user's ability to request and delete their data. Additionally, users highly value their privacy, especially regarding control over data access by others. From user reviews, we observed concerns about the lack of security settings (e.g., password or passcode protection) for controlling access. Neither set of experts included this aspect as a criterion.
Lastly, neither set of experts addressed user experience aspects that contribute to users' concerns about data privacy and security. Users mentioned manipulative data practices by MHAs that unethically and deceptively collect users' data, causing emotional and, sometimes, monetary distress. Additionally, some users expressed concerns about companies not adhering to their own privacy policies, which also was not included in the criteria on either expert website.

DISCUSSION
This study analyzed 437 reviews from 83 MHAs from the Apple App and Google Play Stores within the scope of privacy and security concerns. We distinguished our study from prior research that explores users' perspectives and experiences with MHAs [15,29], as well as people's attitudes towards data-driven mental health management on smartphone apps [20,78], by investigating actual users' experiences with data sharing in MHAs. Furthermore, we juxtaposed those user experiences with expert criteria to compare and contrast what the experts believe users should be aware of and concerned about in terms of data privacy and security with what users are actually aware of and concerned about. While previous studies found that users are willing to share their data to receive valuable health feedback in different contexts [13,14], our findings indicate that there is still a portion of users who have a myriad of concerns when asked to provide their personal information.
Our findings also suggest that even though people who are concerned about their privacy and security are acutely aware of the collection of identifying information by MHAs, and have a similar understanding of the implications of their data being collected by MHAs as mental healthcare providers do [9], they do not seem to fully understand the potential dangers they face. In a context other than MHAs, a study by Tabassum et al. on end-user perceptions of data practices in smart homes also shows that users are not fully aware of the potential misuse of their data [71]. On the other hand, a study by Rocheleau and Chiasson on the perspectives of autistic teenagers regarding privacy and safety on social networking sites found that these teenagers are aware of privacy and safety risks, prompting them to adopt different approaches to protect themselves online [61]. This suggests that individuals who are vulnerable, such as autistic teenagers who are more susceptible to privacy and safety risks, might be more conscientious about protecting themselves online. However, data from user reviews suggests that although users are aware of data collection, they are not deterred from providing information to the app. It is only when undesirable outcomes actually occur that users become cognizant of the risks associated with data collection. Unfortunately, understanding these risks after data has been shared may be too late, as the information is typically stored on company servers where users may have limited control over it. Furthermore, even when individuals proactively sought clarification on data privacy practices, MHA companies often failed to provide full disclosures in their privacy policies, which is consistent with prior findings on the lack of transparency in privacy policies [48,60].
Our study contributes not only to understanding users' privacy and security concerns, but also to understanding how users' concerns differ from the experts' concerns that were incorporated into app evaluation websites. Our findings have important implications for MHA users making informed decisions when choosing an app, for mental healthcare practitioners who may recommend such apps to their clients, and for the app experts who are trying to provide guidance to MHA users.

Implications for Better MHA User Guidance and Education
As highlighted in Phase 2 of this study, there are publicly available expert resources, such as Privacy Not Included and One Mind PsyberGuide, to help guide users when assessing the privacy and security of an MHA. However, the former only includes a small subset of the available mental health apps, and the latter does not dive as deeply into privacy and security. This means that there is still a large gap in the resources available to support both MHA users and mental healthcare providers in their decision making around MHAs. In addition to better, more widespread education for MHA users and mental healthcare professionals, a more robust online resource is needed.
The experts behind Mozilla's Privacy Not Included and One Mind PsyberGuide have clearly put in significant effort to review, at a minimum, dense, convoluted privacy policies. In fact, Privacy Not Included goes beyond that, seeking out additional publicly available online resources, such as records of data breaches, to include in their reports on the website. However, our comparison of the user concerns derived from user reviews with the expert concerns presented on the two app evaluation websites suggests three opportunities for experts to consider in their app evaluations: adding explicit HIPAA evaluation criteria for mental health apps, weighting criteria, and incorporating the user voice in evaluations.
Despite the comments invoking HIPAA, it remains unclear to us whether MHA users have an accurate understanding of what this policy entails. In this area, experts can help clarify HIPAA regulations by adding further criteria related to HIPAA in their assessments of MHAs. Specifically, this could involve identifying whether or not the apps are considered covered entities, determining whether PHI is collected, and evaluating whether the apps adhere to HIPAA regulations if they are indeed covered entities.
However, both Privacy Not Included and One Mind PsyberGuide specifically state that they do not test the apps. Given that our analysis suggests that there are user concerns that are not currently being evaluated on these websites, incorporating user reviews or other sources that bring in the voice of the user to better understand actual in-app experiences could be critical to fully capturing the range of relevant privacy and security concerns. Privacy Not Included attempts to mitigate this disconnect by including a 'Creep-O-Meter', where consumers can read about an app (or other product) and rate its level of creepiness. However, the person rating does not necessarily need to be a user, and it is a rigid measure that does not allow individuals to explain what exactly they find creepy.
Monitoring user reviews on app stores for privacy and security concerns may currently be the best option for a large-scale understanding of user privacy and security concerns. User experiences with an MHA may be unique to the individual, but those experiences can still provide valuable input and insight that other, potential users may benefit from. It might also be helpful for users if they could assign weights to the various criteria being assessed on these websites while also customizing their Creep-O-Meter report.

Limitations
Our findings are based on the analysis of user reviews from MHAs in the Apple App Store and Google Play. One limitation of gathering data from these platforms is that there is no definitive way to determine whether the reviews are fake. As we analyzed the user reviews, we attempted to identify fake reviews and exclude them from the analysis by observing timing patterns, recurring phrases across different reviews, and the distribution of star ratings. Furthermore, the scope of the study is centered on users' experiences and perspectives related to data privacy and security, which led to the exclusion of reviews that were too generic and did not provide meaningful insights into user experiences and concerns related to data privacy and security. However, even though we excluded a number of reviews that were irrelevant to the scope of this study, we were able to conduct an in-depth qualitative analysis of all the relevant reviews related to privacy and security concerns from a significant number of MHAs (n = 83) available on the market.
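The screening signals described above were applied through manual inspection; a programmatic approximation of two of them (timing bursts and recurring phrases) might look like the following sketch, where the review record structure and all thresholds are illustrative assumptions rather than the procedure we actually used:

```python
from datetime import datetime

def flag_suspicious(reviews, burst_window_hours=24, burst_size=5, dup_phrase_len=6):
    """Flag reviews matching simple fake-review heuristics: many reviews
    posted in a short burst, or long phrases repeated verbatim across
    different reviews. Thresholds are illustrative, not validated."""
    flagged = set()

    # Heuristic 1: timing bursts -- many reviews within one time window.
    times = sorted((r["time"], r["id"]) for r in reviews)
    for i in range(len(times)):
        j = i
        while j < len(times) and (times[j][0] - times[i][0]).total_seconds() <= burst_window_hours * 3600:
            j += 1
        if j - i >= burst_size:
            flagged.update(rid for _, rid in times[i:j])

    # Heuristic 2: recurring phrases -- identical n-word runs in
    # reviews with different authors/ids.
    phrase_owners = {}
    for r in reviews:
        words = r["text"].lower().split()
        for k in range(len(words) - dup_phrase_len + 1):
            phrase = " ".join(words[k:k + dup_phrase_len])
            owner = phrase_owners.setdefault(phrase, r["id"])
            if owner != r["id"]:
                flagged.update({owner, r["id"]})
    return flagged

reviews = [
    {"id": 1, "time": datetime(2023, 1, 1, 10, 0), "text": "this app stole my data and charged my card twice"},
    {"id": 2, "time": datetime(2023, 1, 5, 10, 0), "text": "this app stole my data and charged my card twice today"},
    {"id": 3, "time": datetime(2023, 1, 9, 10, 0), "text": "helpful journaling features overall"},
]
# Reviews 1 and 2 share a long verbatim phrase and are flagged.
print(flag_suspicious(reviews))
```

A star-rating distribution check (the third signal we inspected) is omitted for brevity; any automated flags would still require manual review before exclusion.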
Another limitation of analyzing user reviews is that we might overlook input from users who do not express their privacy and security concerns through app reviews. Furthermore, unlike other qualitative methods such as interviews, this method does not allow further probing with users to uncover a deeper understanding of their concerns. However, we have identified 60 lower-level codes capturing users' privacy- and security-relevant concerns from a large sample of MHAs and users. This provides a good starting point for understanding users' concerns from a large-scale perspective for future work that could employ other methods, such as interviews or surveys, to further explore users' concerns related to privacy and security in MHAs.

Future Work
There is a discrepancy between experts' evaluations and users' concerns for several apps. While the experts may not identify any privacy and security issues based on what is stated in the privacy policy, users have raised concerns in their reviews. It is unclear whether it is the app that does not follow its privacy policy or whether users have misunderstood the situation. More research is needed to disentangle policy, end-user license agreement (EULA), practice, and perceptions, as well as to determine whether users' concerns are valid for a specific app.
Furthermore, understanding what MHA users might do when provided with guidance that includes a comprehensive list of users' privacy and security concerns would contribute to our knowledge of how we can better support their decision-making process. Such information could become the start of a series of educational materials aimed at helping users understand data privacy and security, evaluate an app's safety and privacy policies, and better protect themselves from data breaches due to deceptive in-app practices.
Finally, the comparative findings suggest that the guidance provided to users for making informed decisions on choosing an MHA may not be sufficient. Therefore, future research could explore other forms of guidance, such as input from mental health providers. It would be worthwhile to investigate what mental health providers are aware of when they recommend apps to their clients.

CONCLUSION
As the MHA market continues to grow [12] and more apps proliferate in app stores, the risk of data privacy and security breaches may increase without intervention from regulatory bodies that protect patients and MHA users, such as the Food and Drug Administration (FDA), Federal Trade Commission (FTC), and Office for Civil Rights (OCR). Researchers have demonstrated the ill effects of stigma on people when their mental health data are exposed, underlining the importance of maintaining user data privacy and security with regard to data stored on MHAs [11,18,19,27,38,43,44,46,62]. Our research further underscores this importance by providing insights from real users about the challenges they are currently experiencing with MHAs resulting from data mishandling, lack of security measures, violations of data agency, third-party involvement, and unclear data collection policies.
While there exist resources created by experts to assess the privacy and security of MHAs, our results also indicate that there is a disconnect between what those security experts warn against and what users are experiencing in real life. Users and practitioners alike should remain aware of both the consequences of data privacy violations and how to select secure and safe MHAs for use.
For the foreseeable future, it appears that the popularity and development of MHAs will not dissipate. In order to ensure user safety, the security of data collection and handling processes within MHAs must be held to a higher standard. MHAs have the potential to make mental health care more accessible to all people [7], but the move towards accessible mental health care should not come at the risk of user privacy and security.

A APPENDIX
A.1 Supplementary Table

Figure 1 :
Figure 1: Flowchart of the Sampling Process for User Review Analysis

Figure 2 :
Figure 2: Process of Narrowing Down the User Reviews Relevant to the Study

Figure 3 :
Figure 3: Process of Comparing 60 Open Codes to Experts' Concerns

Table 1 :
Heat Map of User and Practitioner Concerns Compared to Expert Evaluator Concerns (* indicates criteria listed by both Privacy Not Included and One Mind PsyberGuide)

Table 2 :
MHAs Included in the Analysis

Table 3 :
MHAs Included in the Analysis (Continued 1)

Table 4 :
MHAs Included in the Analysis (Continued 2)