52 Weeks Later: Attitudes Towards COVID-19 Apps for Different Purposes Over Time

The COVID-19 pandemic has prompted countries around the world to introduce smartphone apps to support disease control efforts. Their purposes range from digital contact tracing to quarantine enforcement to vaccination passports, and their effectiveness often depends on widespread adoption. While previous work has identified factors that promote or hinder adoption, it has typically examined data collected at a single point in time or focused exclusively on digital contact tracing apps. In this work, we conduct the first representative study that examines changes in people's attitudes towards COVID-19-related smartphone apps for five different purposes over the first 1.5 years of the pandemic. In three survey rounds conducted between Summer 2020 and Summer 2021 in the United States and Germany, with approximately 1,000 participants per round and country, we investigate people's willingness to use such apps, the apps' perceived utility, and people's attitudes towards them in different stages of the pandemic. Our results indicate that privacy is a consistent concern for participants, even in a public health crisis, and the collection of identity-related data significantly decreases acceptance of COVID-19 apps. Trust in authorities is essential to increase confidence in government-backed apps and foster citizens' willingness to contribute to crisis management. There is a need for continuous communication with app users that emphasizes the benefits of health crisis apps both for individuals and society, thus counteracting the decline in willingness to use them and in their perceived usefulness as the pandemic evolves.


INTRODUCTION
Over the course of the pandemic of Coronavirus Disease 2019 (COVID-19), caused by the Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) in early 2020, countries around the world have taken severe measures to contain the spread of the virus. These measures include government-mandated quarantines, local and national lockdowns, and contact and travel restrictions, ultimately leading to a breakdown of international travel and the global economy. Digital tools were soon developed to support these efforts, with public discussion focusing on smartphone applications.
In many countries, including Singapore [31], Israel [36], and Germany [59], apps were seen as a key factor to relieve the burden on health authorities by assisting their traditional disease management methods with automated digital contact tracing. Apps for other COVID-19-related purposes were deployed by public and private entities worldwide, including apps for quarantine enforcement [66], symptom checking [3], and easy access to COVID-19-related information [7]. As the number of people vaccinated against COVID-19 increased and countries sought to lift restrictions and reinstate international travel, the discussion had shifted towards digital health certificates that help individuals manage the results of recent tests for SARS-CoV-2 or confirm their vaccination against COVID-19. Examples include the European Union's Digital COVID certificate [23], which was implemented, for example, into the German CovPass app [58], the IATA Travel Pass Initiative [34], and New York State's Excelsior Pass [65]. While these apps serve widely different purposes, they all pursue the same common goal: to assist traditional measures in the fight against the COVID-19 pandemic. Our work revolves around such COVID-19 apps, which is the umbrella term we use to denote smartphone applications specifically designed to help combat the COVID-19 pandemic.
Depending on their concrete functionality, the effectiveness of such apps can rely on widespread voluntary adoption. This was particularly evident with digital contact tracing [25] but also applies to other COVID-19 apps such as digital health certificates, which can only significantly speed up mandatory checks in everyday life if many people use them. The deployment and promotion of such apps by governments has led to public debates about the privacy, security, and social implications of COVID-19 apps [17,54,70], which can influence people's willingness to use them.
While previous work has identified factors that promote or hinder adoption of COVID-19 apps, these studies were typically based on data collected at a single point in time [2,8,43,67] or focused solely on contact tracing apps [62]. Learning which factors promote or hinder adoption of apps for different purposes in the COVID-19 pandemic and how they evolve over time can provide insights for the development of smartphone apps designed to help combat future public (health) crises. On a more general scale, these results can inform the development of apps, particularly those issued by non-commercial entities such as governments or NGOs, that collect users' personal information not for their immediate personal benefit but that of society as a whole and, thus, would benefit from widespread adoption. Examples where such data collection could be beneficial include crisis prediction and management, improvement of local infrastructure, or scientific research.
To better understand how different factors such as app purpose, collected data, technical implications, stance towards authorities, privacy attitudes, prior experience with COVID-19, and demographics influence people's willingness to use such apps, we conduct a sequence of surveys over the course of the first 1.5 years of the COVID-19 pandemic. In three different survey rounds, each administered in both the United States and Germany between June 2020 and May 2021, we use a vignette design of hypothetical yet realistic COVID-19 apps for five different purposes (contact tracing, health certificate, information, quarantine enforcement, and symptom check) to examine the factors that influence people's willingness to use such apps and how these factors change over time, and attempt to relate the results to different pandemic events. More specifically, we investigate the following research questions:

RQ 1: How do people's willingness to use COVID-19 apps and the apps' perceived utility develop over time? We find that for most app purposes there is a slight negative trend in participants' willingness to use COVID-19 apps over time. Across all survey rounds, the perceived utility of apps for each individual purpose is slightly higher than participants' willingness to actually use that app.

RQ 2: Which factors significantly influence people's willingness to use COVID-19 apps, and which of them (i) were of continuous importance for participants or (ii) changed in influence during the first 1.5 years of the pandemic? Factors related to the data processing practices of COVID-19 apps tend to have a consistently significant influence on adoption. The same applies to apps for contact tracing, which were widely discussed and used to make an individual contribution to fighting the pandemic. The impact of other factors on participants' willingness to use apps changed over time, particularly due to individual experience with COVID-19 and external influences, such as the app purpose "health certificate" in the light of public discussion of vaccine passports in Germany.
RQ 3: How does people's subjective perception of COVID-19 apps change over time, based on 1.5 years of experience with the pandemic and actual use of these apps? Participants' perception of COVID-19 apps has become less extreme over time, with privacy concerns remaining important but being overshadowed by considerations of utility and misconceptions about the capabilities and use cases of apps, including post-vaccination use.
Our findings can help better understand how people interact with state-issued technology over the course of a public health crisis with changing external influences, including steeply rising incidence rates and containment measures that limit individual freedoms. Insights from this can inform the design of future government-issued apps that collect personal information for the benefit of society as a whole in other contexts, such as disaster prevention and management or scientific research, as opposed to providing useful functionality only for the individual. In particular, we identify establishing trust in the entities behind an app and continuous communication about its capabilities and utility as vital to achieving widespread voluntary adoption.

RELATED WORK
Our work relates to previous studies of smartphone applications that process personal health information, particularly in the COVID-19 pandemic, and of apps used in crisis situations to inform and communicate with the general public.

User Perception of COVID-19 Apps
Security and privacy research has started to investigate user perceptions of COVID-19 apps as early as spring 2020, initially focusing on apps for digital contact tracing to accompany intense public discussions about different proposed architectures and their implications for security, privacy, and society [17]. In the early phase of the pandemic in April 2020, two studies with international participants found that people were generally highly willing to use such apps, even though security and privacy concerns may be hindering factors [2,62]. In a longitudinal follow-up from April to December 2020, Simko et al. [63] identified diverse public opinions of contact tracing apps. While the general willingness to adopt such apps continuously increased, it also became apparent that some people will never use them voluntarily, a finding also confirmed in later work [32], including our own [68]. In November 2020, Li et al. [42] identified prosocialness, COVID-19 risk perception, and technology-readiness to be drivers for adoption and confirmed that privacy concerns reduce the use of contact tracing apps. By contrast, Seberger et al. [61] found that people are willing to set back their own privacy expectations when it serves public health as a greater good.
Regarding other drivers that hinder the adoption of digital contact tracing, Toch and Ayalon [67] found that people are less likely to voluntarily use such apps in environments with mass surveillance measures in place when they have positive attitudes towards such measures. Beliefs of being immune also reduced the willingness to use contact tracing apps, even among people whose suspected COVID-19 infections were unconfirmed [8]. Fast et al. [24] explored different types of incentives to increase the adoption rate of contact tracing apps and found monetary incentives to significantly increase app installations. Kahnbach et al. [38] systematically reviewed 21 European apps for digital contact tracing and recommended increasing in-app user engagement and combining multiple features (e. g., contact tracing and venue check-in) to foster future adoption of COVID-19 apps.
Beyond contact tracing apps, in our prior work [68] we studied people's willingness to use different types of COVID-19-related apps in Germany, the United States, and China and found attitudes towards governmental authorities to significantly influence the willingness to use such apps. Marhold and Fell [46] showed an increased importance of unified digital solutions to prove immunization status via digital vaccination certificates. In this context, another study by some of the authors of this work [41] indicated that users' disposition to privacy plays an important role in the willingness to adopt app-based vaccination certificates in Germany.
In health contexts not related to COVID-19, studies have found nuanced privacy attitudes in the sharing of health data depending on the type of data being shared [20] and the receivers of shared health-related data [30,49].

Use of Crisis Management Apps
In a more general context, previous research investigated mobile apps designed to provide crisis-related information and enable communication between citizens and authorities prior to the COVID-19 pandemic.
Appleby-Arnold et al. [5] studied the role of trust in authorities in disaster-related crises in Italy and Germany. They found that when using disaster apps, trust between citizens and authorities is generated through perceptions of shared responsibility and tasks. While they consider trust to be situational, the authors explain it as a cultural factor that is subject to constant change in societies. A study by Dressel [21] complements prior research on how trust is built in cultures and differentiates between the risk cultures of a given society in case of a crisis. These risk cultures have different implications for crisis management, and different conclusions need to be drawn when authorities are tasked with a developing crisis: In an individual-oriented risk culture, i. e., where trust in authorities is medium to high, crisis management needs to convey the idea that the state is there and well prepared to tackle disasters. If high or even very high trust is placed in authorities (state-oriented risk cultures), the state needs to convey the idea that despite responsible crisis management institutions, individual behavior is equally important for handling a crisis in a meaningful way.
In an online study, Reuter et al. [56] evaluated the use of crisis apps in Europe based on three popular apps focusing on warning functionality. Participants were found to have a positive attitude towards crisis apps, but one of the biggest points of contention for users was the need to install more than one app. Users also seem to be interested in being integrated into crisis management and participating as volunteers to report incidents. However, only 16 % of the participants used at least one crisis app (as of 2015). Other work [40] related to mobile warning apps in disaster communication found that the messages presented in such apps for the purpose of crowd control need to be carefully designed and presented to users to account for strong emotions such as fear that often emerge in crisis situations. The authors argue that existing apps in Germany lack quality and timing, which highlights the importance of well-chosen messages to prevent people from ignoring them due to notification fatigue. Further findings show that malfunctions in these apps can lead to a high number of user complaints. Kaufhold et al. [39] conducted a representative user study in Germany in May 2019 to investigate, among other things, the functionality and perceived usefulness of mobile crisis apps. Their results indicate that besides emergency and health-related warnings, participants value bidirectional communication in mobile crisis apps. Participants also prefer a single comprehensive app over multiple different ones and are resistant to installing more than one crisis app on their smartphones.

METHOD
Our work extends this prior research on COVID-19 apps and crisis management apps by exploring the impact of different app purposes and of various factors related or unrelated to an app's data processing practices on people's willingness to use apps in the context of a global (health) crisis, and how their perception and use of such apps developed over the course of the first 1.5 years of the pandemic. In particular, it continues the research from our first paper [68] by adding the temporal aspect. Broadly considering people's willingness to use state-supported smartphone apps for crisis management and communication over time, i. e., over different phases of a crisis, can inform the development of similar future apps issued by public actors that collect personal information for the benefit of society as a whole. It also allows us to investigate how external effects beyond app design influence people's acceptance and use of such digital tools.
To this end, we conducted three rounds of online surveys with participants from the United States and Germany, in summer 2020, late fall 2020, and spring 2021, selected to reflect different stages of the COVID-19 pandemic.The data from the first round is the data collected for our first paper [68] in these two countries, i. e., without the data from China, as we only conducted the first survey round there.
Correspondingly, our survey instrument is the questionnaire we already used in this first survey round and publication, and a more detailed description of its creation can be found in that paper [68]. In its main part, it presented participants with vignettes [26] describing (hypothetical) COVID-19 apps and asked them to rate these apps according to a set of criteria. Participants were also asked about their smartphone use, experience with the coronavirus, use of COVID-19 apps, privacy concerns, and attitudes towards governmental actions. In subsequent survey rounds we included additional questions, for example, regarding the official German app for digital contact tracing, the Corona-Warn-App [59]. However, as the focus of this paper is on changes in attitudes towards COVID-19 apps over the course of the first 1.5 years of the pandemic, we did not analyze all survey questions. In the following, we describe the theoretical framework behind our vignettes and the study protocol: questionnaire, vignette design, recruitment process, methods used for data analysis, and the limitations of our study design.

Theoretical Framework: Privacy as Contextual Integrity
Nissenbaum's theory of privacy as contextual integrity (CI) [50] has proven to be useful in identifying factors that influence individuals' privacy perceptions [10,22]. The CI theory states that privacy can be understood as a matter of appropriateness of information flows that is governed by informational and social norms consistent with a number of parameters:
• Actors (Who will send and receive the data?)
• Attributes (What type of data will be transmitted?)
• Transmission principles (Under which conditions will the data flow?)
Public discussions about anonymity, data recipients, and possible architectures of COVID-19 apps implicitly referred to these parameters. Like other previous research that studied user willingness to adopt new technology [6,22,47], we used a vignette design to describe scenarios that vary these parameters and potentially impact participants' willingness to use different variants of COVID-19 apps.

Study Protocol
At the beginning of the survey, we informed participants about the purpose of the study and the data collection, and asked for their consent before they proceeded to the actual questionnaire, which consisted of three parts: (i) a warm-up that asked participants about their smartphone use, experience with COVID-19, and their knowledge and general perception of COVID-19 apps, (ii) the vignette part, where participants saw and rated ten scenarios describing fictitious COVID-19 apps inspired by real ones from all over the globe, and (iii) COVID-19 apps and concerns, where we asked participants about their experience with real-world COVID-19 apps, attitudes towards governmental actions, and general privacy attitudes.

Warm-up Questions.
To help participants ease into the questionnaire, its first part asked about their background regarding phone use, COVID-19, and general questions about apps designed to help fight the pandemic.
Smartphone Use. First, we asked participants about their use of smartphones: if they owned one (Q1), its operating system (Q2), and how happy they were with certain aspects of their device (e. g., its battery life; Q3).
COVID-19 Experience. Personal experience with COVID-19 was shown to be a significant factor in the adoption of (hypothetical) contact tracing apps [70], so we asked about participants' experience with previous infections (Q4 and Q5), quarantine (Q6), and their concern about loved ones getting infected (Q7). We also asked participants about people at higher risk in their household (Q8) and, in the third survey round, their vaccination status (Q9).
Knowledge and Perception of COVID-19 Apps. Next, we asked participants whether they knew any app available in their country for contact tracing, symptom checks, quarantine enforcement, COVID-19 information, or health certificates (Q10), which are the five app purposes of interest in our study. In open-ended questions, we let participants name general positive (Q11) and negative (Q12) aspects of COVID-19 apps. Facing evidence of dissatisfaction with existing apps, in Round 3 we added a new question (Q13) that asked participants what functionality they would wish for in the "ideal" COVID-19 app.

App Scenarios.
The main part of the questionnaire asked participants to rate hypothetical yet realistic COVID-19 apps according to four different criteria. Each participant received a unique set of ten vignettes, short texts which each described one (hypothetical) COVID-19 app.
Vignette Design. The vignettes shown in our survey contained short textual descriptions of hypothetical COVID-19 apps based on a common text template, whose blanks were filled in to create a concrete app description. We denote such a specific app description an (app) scenario.
Our previous work [68] describes in more detail how we created the app scenarios and also provides an illustrative example.Each scenario is composed of a fixed text template with gaps for eight data processing factors, each of which is filled with one factor level.The factors are derived from the theory of contextual integrity (CI) (see Section 3.1) and the factor levels originate in an analysis of real-world COVID-19 apps conducted in April 2020, which is also described in more detail in our first paper.
The factors and factor levels, whose exact number varies by factor, are as follows:
(1) Purpose (5 levels): contact tracing, symptom checking, quarantine enforcement, information, health certificate.
(2) Data Collected (16 levels): encounter data, location data, health or activity data (excluding COVID-19 infection status), COVID-19 infection status, all combinations of two or three of them, unspecified data, no data.
(3) User Anonymity (3 levels): whether the app collects personal data that allows for unique identification of the individual, collects only demographic data, or collects only data that cannot be used to uniquely identify the user.
(4) Data Receiver (6 levels): health authorities, law enforcement, research institutions, private companies, the public, none.
(5) Data Transmission (3 levels): automatically, manually (app-type-specific wording, e. g., for health certificate: "when you request your health report"), none (in case no data is collected).
(6) Retention (3 levels): one month, until end of current coronavirus regulations, unspecified.
(7) Technical Implications (3 levels): impact on battery life, app malfunctioning (app-type-specific wording, e. g., a false positive for breaking quarantine), none.
(8) Soci(et)al Implications (4 levels): possible additional benefits in the future, more timely adjustment of local coronavirus regulations, extended personal freedom of movement or travel, none.
Scenario Sets. Combining all possible factor levels resulted in 155,520 different app scenarios. As described in more detail in our previous work [68], we applied constraints to ensure that the combinations of factor levels made sense, such as symptom check apps always requiring health or activity data or data made available to the public always being stored for an indefinite amount of time.
For each participant, we then created a set of ten scenarios that they would be presented with in the survey in random order.To create a set, scenarios were drawn randomly under two constraints: (i) we included two scenarios for each of the five app purposes, and (ii) if possible, each pair of scenarios with the same app purpose differed in all other factor levels.
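The drawing procedure can be sketched as follows. This is an illustrative simplification, not our actual implementation: the factor names and level labels are abbreviated stand-ins, and the inter-factor constraints described above are omitted.

```python
import math
import random
from collections import Counter

# Abbreviated stand-ins for the eight factors and their level counts;
# the real level texts are listed in the factor overview above.
FACTORS = {
    "purpose": ["contact tracing", "symptom check", "quarantine enforcement",
                "information", "health certificate"],                 # 5 levels
    "data_collected": [f"data_{i}" for i in range(16)],               # 16 levels
    "anonymity": ["identifying", "demographic", "anonymous"],         # 3 levels
    "receiver": ["health authorities", "law enforcement", "research",
                 "companies", "public", "none"],                      # 6 levels
    "transmission": ["automatic", "manual", "none"],                  # 3 levels
    "retention": ["one month", "end of regulations", "unspecified"],  # 3 levels
    "technical": ["battery", "malfunction", "none"],                  # 3 levels
    "societal": ["future benefits", "timely regulations",
                 "freedom of movement", "none"],                      # 4 levels
}

# 5 * 16 * 3 * 6 * 3 * 3 * 3 * 4 = 155,520 combinations before constraints
assert math.prod(len(levels) for levels in FACTORS.values()) == 155_520

def draw_scenario(purpose, rng):
    """Draw one random scenario with a fixed purpose."""
    return {name: purpose if name == "purpose" else rng.choice(levels)
            for name, levels in FACTORS.items()}

def differ_in_all_other_factors(a, b):
    """Check constraint (ii): a same-purpose pair shares no other level."""
    return all(a[k] != b[k] for k in FACTORS if k != "purpose")

def scenario_set(rng):
    """Build one participant's set: two scenarios per purpose, shuffled."""
    scenarios = []
    for purpose in FACTORS["purpose"]:
        first, second = draw_scenario(purpose, rng), draw_scenario(purpose, rng)
        while not differ_in_all_other_factors(first, second):
            second = draw_scenario(purpose, rng)  # re-draw until pair differs
        scenarios += [first, second]
    rng.shuffle(scenarios)                        # random presentation order
    return scenarios
```

For instance, `scenario_set(random.Random(0))` yields ten scenario dictionaries in which each of the five purposes occurs exactly twice.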
Scenario Questions. For each app scenario, we asked participants the same four questions. We asked how likely they were to use the described app (Q14) and to estimate the expected share of app users in their country (Q15). We also let participants assess the normative pressure to use the app (see Ajzen [1]) that they expect in their social circles (Q16) and how useful they perceived the described app to be in the fight against the pandemic (Q17).

COVID-19 Apps & Privacy Concerns.
In its third and final part, the survey investigated participants' concrete experience with COVID-19 apps, their attitudes towards public actors and anti-pandemic measures, and towards data privacy on the Internet in general.
Experience with COVID-19 Apps. After the app scenarios, we wanted to know whether the participants themselves (had) used such an app (Q18/Q19) and, if yes, which app (Q20/Q22), and if not, why they did not (or no longer) use one (Q21/Q23). We also asked if they were satisfied with the apps they used (Q24). German participants who used the national app for contact tracing, the Corona-Warn-App [59], were additionally asked how satisfied they were with the app (Q25), if they had ever been warned through it (Q26), and to assess a set of true/false statements about the app (Q27).
Attitudes Towards Governmental Actions. Since the fight against the pandemic was accompanied by restrictions of personal freedoms in both surveyed countries, we were interested in participants' opinions of and attitudes towards public and governmental institutions. We let them assess the measures applied in their region to counter the pandemic (Q28) and asked for their opinions of health authorities, law enforcement, research institutions, private companies, and federal and regional governments (Q29).
Individual Privacy Concerns. Finally, to learn participants' general privacy concerns, we used the Internet Users' Information Privacy Concerns (IUIPC) [45] constructs for Control, Awareness (of privacy practices), and Collection (Q30). The IUIPC scores would be used in two contexts: as a demographic factor to describe the sample and as an input factor for regression models.
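Computing such construct scores is straightforward. The sketch below assumes 7-point Likert responses aggregated as the mean of the items belonging to each construct; the item identifiers are hypothetical placeholders, with item counts following the ten-item IUIPC instrument (3 Control, 3 Awareness, 4 Collection):

```python
from statistics import mean

# Hypothetical item identifiers; counts follow the ten-item IUIPC
# instrument, answered on a 7-point Likert scale.
IUIPC_ITEMS = {
    "control":    ["ctrl1", "ctrl2", "ctrl3"],
    "awareness":  ["aware1", "aware2", "aware3"],
    "collection": ["coll1", "coll2", "coll3", "coll4"],
}

def iuipc_scores(responses):
    """Average a participant's Likert answers into one score per construct."""
    return {construct: mean(responses[item] for item in items)
            for construct, items in IUIPC_ITEMS.items()}

participant = {"ctrl1": 6, "ctrl2": 7, "ctrl3": 5,
               "aware1": 7, "aware2": 7, "aware3": 6,
               "coll1": 4, "coll2": 5, "coll3": 4, "coll4": 3}
scores = iuipc_scores(participant)
# e.g. scores["control"] evaluates to 6 and scores["collection"] to 4
```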

Changes to the Questionnaire. Between rounds, we aimed to minimize changes to the questionnaire to preserve comparability. To account for the development of the pandemic, we added some questions (such as Q9 about participants' vaccination status) or answer options (such as Luca [18] in the list of widespread apps possibly used by German participants in Q19).

Recruitment
We commissioned an online panel provider, Lightspeed Research (Kantar), to obtain representative samples for the general population of Germany and the US in terms of gender, age, region, and education.For each survey round, we recruited a new sample of approximately 1,000 participants.We decided against a true longitudinal design with a fixed panel of participants because the duration and further evolution of the pandemic were unpredictable, which would have made it very difficult to plan the overall duration, number, and timing of different survey rounds, as well as the required number of participants under high expected dropout rates between rounds.
We were still able to compare results across rounds because each sample was representative of the general population with regard to the above criteria in the respective country. All direct interaction with participants was handled by the panel provider, including setting representative quotas, reimbursement, and data cleaning. The latter involved discarding respondents who had completed the survey in less than 40 % of the median response time or who had responded in suspicious or repetitive patterns. The cost was 2,500 € per survey round and country plus a one-time 2,500 € for setup and implementation.
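As an illustration, a response-time filter of this kind can be expressed in a few lines; this is a simplified sketch of the rule described above, not the panel provider's actual procedure, and the completion times are hypothetical:

```python
from statistics import median

def flag_speeders(durations, threshold=0.4):
    """Flag respondents who finished in less than 40 % of the median time."""
    cutoff = threshold * median(durations)
    return [t < cutoff for t in durations]

# Hypothetical completion times in seconds; the median is 600 s,
# so the cutoff is 240 s and the 200 s and 150 s responses are flagged.
times = [600, 640, 580, 900, 200, 150, 620]
assert flag_speeders(times) == [False, False, False, False, True, True, False]
```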

Survey Timeline and Pandemic Context
To provide context for our study, we describe the pandemic situation and availability of COVID-19 apps in the surveyed countries at the time of each survey round.
Germany. We started each of the three survey rounds in Germany, as this initially presented us with the opportunity to learn about people's opinions before the release of a federal COVID-19 app.
Round 1 (June 9-11, 2020). The first round was conducted during the first wave of the COVID-19 pandemic (see Figure 1), when strict contact restrictions were in place across the country. This was just days before the launch of the Corona-Warn-App [59], Germany's official smartphone app for digital contact tracing, published on June 16, 2020. Public discussion in the preceding months had been dominated by the privacy risks of a centralized architecture for digital contact tracing [44], which ultimately led to the adoption of a decentralized system based on Google's and Apple's Exposure Notification API [4]. At the time of survey Round 1, one COVID-19 app officially recommended by German authorities was available, Corona-Datenspende ("Corona Data Donation") [57], which allowed owners of fitness trackers to voluntarily provide the Robert Koch Institute (RKI) with health and activity data to aid COVID-19 research. Pandemic-related information had also been added to Germany's official disaster alert app, NINA [13].
Round 2 (November 5-18, 2020). The second round was conducted at the beginning of the second pandemic wave in Germany, when incidence rates were steeply rising. At this point in time, the Corona-Warn-App had been available for five months, accompanied by a national advertising campaign. Subsequent updates had fixed bugs and increased usability.
Round 3 (May 19-31, 2021). The third survey round took place during the third wave of the pandemic in spring 2021. The national vaccination campaign was ongoing, with 39.0 % of the German population partially and another 12.6 % fully vaccinated [27]. New functionality had been added to the Corona-Warn-App (CWA), including a contact diary and management of test results. In addition to the publicly funded apps, Luca [18], an app for digital contact tracing developed by a private company, had been integrated into the anti-pandemic strategies of multiple German states. This app provided event hosts and restaurant owners with a QR-code-based implementation of their legal obligation to record attendance for contact tracing purposes and was directly connected to the computer systems of health authorities. It has been criticized for its centralized architecture and for privacy and security issues [64]. In addition, public discussion at the time of Round 3 revolved around app-based digital vaccine certificates to let users easily prove their COVID-19 vaccination status [16]. This functionality was ultimately integrated into the CWA on June 9 and issued as a standalone app, CovPass [58], on June 10.

United States.
Round 1 in the US was conducted approximately four weeks after Round 1 in Germany, and we followed a similar schedule for subsequent rounds.
Round 1 (July 6-14, 2020). In the US, Round 1 took place during the first wave of the COVID-19 pandemic. High incidence rates had prompted multiple states, including California and Indiana, to postpone or reverse plans to reopen their economies [27]. A coronavirus-related app available in the US at that time was Apple's now discontinued COVID-19 Screening Tool [3], which provided information about the disease and allowed users to assess their symptoms.
Round 2 (December 7-17, 2020). The second round was conducted during the second pandemic wave, with an exponential increase in infection rates. Starting in November 2020, this development had led several states, such as Michigan, Washington, and California, to impose strict measures, including the closure of high schools and restaurants [9]. Unlike in Germany, there had not been any plans in the US to roll out a contact tracing app on the federal level. Instead, from August 2020, US states, territories, and Washington, D.C. started to issue their own apps, some with a common code base and the ability to interoperate [51,60]. Examples included Virginia's COVIDWISE app, published on August 5, and California's CA Notify [14], available since December 10. December 2020 also marked the start of mass vaccinations against COVID-19 in the US.
Round 3 (June 7-23, 2021). The final survey round was conducted at a time when the vaccination campaign in the US had reached about half of the total population, with approximately 50.9 % having received at least one dose of the vaccine and 41.

Data Analysis
Our data analysis process comprised quantitative and qualitative methods.We focus on changes over time and only briefly describe differences between countries, which we already analyzed in our previous work [68].

Statistical Analysis.
In our evaluation, we first investigated participants' overall willingness to use COVID-19 apps. To examine differences in the willingness to use apps (Q14) across three rounds and two countries, we used analyses of variance (ANOVA). To understand the effects of different app scenario factors on participants' willingness to use an app, we performed a regression analysis with the cumulative link mixed models (CLMM) module of the R package ordinal [15]. We were specifically interested in how effect sizes change over time, so we included interactions between several factors and the survey round, with Round 1 as the baseline. Since a cross-combination of all factors would have resulted in a model too large to compute, we first conducted a manual visual review of the interactions between each factor and the answers to Q14 to determine which factors yielded interesting results. The resulting models were still quite large and included 43 individual factors and 27 (Germany) / 29 (US) interactions, respectively. We followed best practice [71] to improve the models and iteratively removed non-significant factors, keeping a factor or interaction only if its removal increased the Akaike information criterion (AIC) of the model.
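The AIC-guided pruning described above can be sketched as a backward-elimination loop. The study fitted CLMMs in R (package ordinal); the sketch below substitutes a simple statsmodels OLS fit on simulated data, so the model class, variable names, and coefficients are all invented for illustration, and only the selection criterion (drop a factor when doing so lowers the AIC) matches the text.

```python
# Sketch of AIC-guided backward elimination; uses an OLS stand-in for the
# CLMM fitted in R, and hypothetical factor names on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "purpose_tracing": rng.integers(0, 2, n),
    "battery_drain": rng.integers(0, 2, n),
    "noise": rng.normal(size=n),
})
# Simulated willingness depends on the first two factors only.
df["willingness"] = (0.8 * df["purpose_tracing"]
                     - 0.5 * df["battery_drain"]
                     + rng.normal(scale=1.0, size=n))

def backward_eliminate(df, response, factors):
    """Drop factors one at a time while doing so lowers the AIC."""
    current = list(factors)
    best_aic = smf.ols(f"{response} ~ " + " + ".join(current), df).fit().aic
    improved = True
    while improved and len(current) > 1:
        improved = False
        for f in list(current):
            reduced = [g for g in current if g != f]
            aic = smf.ols(f"{response} ~ " + " + ".join(reduced), df).fit().aic
            if aic < best_aic:  # removal improved the model; keep it removed
                best_aic, current, improved = aic, reduced, True
                break
    return current, best_aic

kept, aic = backward_eliminate(
    df, "willingness", ["purpose_tracing", "battery_drain", "noise"])
```

Factors whose removal would raise the AIC (here, the two with real simulated effects) survive the loop.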
3.5.2 Qualitative Analysis. Our survey contained seven open-ended questions (Q11-Q13 and Q20-Q23, see Appendix B). For analysis, we applied Mayring's mixed-methods approach with qualitative and quantitative elements [48]: We first used an iterative open coding procedure to create a codebook for each question and subsequently applied it to the data. In a second step, we determined and compared code frequencies across countries and survey rounds.
The coding procedure was performed by two of the authors. First, each coder independently examined the first 300 answers to each question to inductively identify and categorize recurring themes and create an initial set of codes. These were compared and discussed to create a first codebook for each question. Each answer could be assigned one or multiple codes.
To validate the codebooks, the next 200 answers to each question were independently coded by both researchers. This accounts for roughly 20 % of the data, which falls within the recommended range of 10-25 % for determining inter-coder reliability (ICR) [52]. As a metric, we used ReCal2 [28] to compute Krippendorff's alpha for each code and, for each codebook, report the mean alpha, weighted by how often the respective code occurred in the ICR data.
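The reliability metric described above can be sketched in a few lines. The study used ReCal2; the implementation below is a minimal version of Krippendorff's alpha for nominal data with two coders and no missing values, plus the frequency-weighted mean, on hypothetical codings.

```python
# Minimal sketch: Krippendorff's alpha (nominal data, two coders, no
# missing values) and a frequency-weighted mean over codes.
# The paper used ReCal2; all coder data below is hypothetical.
from collections import Counter

def krippendorff_alpha(coder1, coder2):
    """Nominal alpha = 1 - D_o / D_e for two complete codings."""
    assert len(coder1) == len(coder2)
    n_units = len(coder1)
    # Observed disagreement: share of units the coders code differently.
    d_o = sum(a != b for a, b in zip(coder1, coder2)) / n_units
    # Expected disagreement from the pooled value frequencies.
    pooled = Counter(coder1) + Counter(coder2)
    n = 2 * n_units
    d_e = sum(n_c * (n - n_c) for n_c in pooled.values()) / (n * (n - 1))
    return 1 - d_o / d_e

def weighted_mean_alpha(alphas, frequencies):
    """Mean alpha over codes, weighted by code frequency in the ICR data."""
    return sum(a * f for a, f in zip(alphas, frequencies)) / sum(frequencies)

# Hypothetical binary codings of five answers for one code:
a = krippendorff_alpha([1, 1, 0, 0, 1], [1, 1, 0, 1, 1])  # one disagreement
```

With one disagreement in five units and pooled frequencies of 7 and 3, alpha comes out to 4/7 ≈ 0.57.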
The remaining 500-600 answers to each question were then split in half, with each half coded by a single coder.
In subsequent survey rounds, we used the first 200 answers to each question to determine whether the previous codebooks still applied or needed additions. In Round 3, the ongoing COVID-19 vaccination campaign led us to add "vaccination" as a reason named by participants to not (or no longer) use an app. Otherwise, we applied the same coding procedure as in Rounds 1 and 2. For open-ended responses to Q20/Q22, which asked which apps participants used, we used a simplified process: Using Internet searches and app stores to identify any app previously unknown to us, we mapped each answer to one of the following categories: COVID-19 app, other health app, disaster app, other non-health app, unnamed app (e. g., when participants could not remember the name of an app), a non-app tool (such as a website), none (when participants contradicted themselves and wrote into the text field that they did not use any app or tool), or unusable (incomprehensible).

Research Ethics
Our department does not have an institutional review board. Still, we followed best practices of human-subjects research and data protection guidelines. The study was approved by our institution's data protection officer. We complied with the rules of the EU General Data Protection Regulation (GDPR) and obtained participants' informed consent at the beginning of the study. Our panel provider, Kantar, follows a self-commitment to the ICC/ESOMAR International Code on Market and Social Research [35].

Limitations
The various factors and factor levels that could occur in a COVID-19 app were created based on a set of apps from the beginning of the pandemic. Because the study investigates developments over time, we could not make major changes to the questionnaire or app scenarios to reflect more recent events. In particular, the factors for the CLMM model had to stay the same to allow for an analysis of changes over time. We therefore tried to account for possible future developments from the beginning. Nevertheless, the app scenarios presented in the vignettes are theoretical and simplified and thus unlikely to accurately represent existing apps.
Still, in our evaluations we tried to account for changing circumstances over the course of the pandemic, such as social restrictions, newly released apps, and ongoing public discussions about COVID-19 apps. To this end, we included questions about measures taken to fight the pandemic and participants' opinions of state institutions. However, we cannot establish cause and effect beyond doubt for external factors not included in our study, such as political changes like the 2020 presidential election in the US. We did not collect data on political opinions in our questionnaire because comparing political views across countries is not trivial, and this was also not the focus of this work. As we could only speculate about the effect of these external factors, we refrain from discussing them.

RESULTS
Our results are based on the data collected in a total of six different survey rounds, conducted in Germany and the US at three distinct points in time during the first 1.5 years of the COVID-19 pandemic.

Participants
Over three survey rounds in two countries, we recruited a total of 6,124 participants, 3,049 from Germany (DE) and 3,075 from the US, in six independent samples, each with roughly 1,000 participants, as shown in Table 1. Target representativity quotas were met with average discrepancies of 2.5 % (DE, Round 1), 1.0 % (DE, R2), 2.4 % (DE, R3), 3.4 % (US, R1), 4.5 % (US, R2), and 4.2 % (US, R3). Across all rounds and countries, IUIPC scores were above the midpoint of the 7-point Likert scale (4). The scores for the three IUIPC dimensions were similar across all rounds in the US, with Collection scoring lowest. In Germany, IUIPC scores decreased from Round 1 to Round 3 for all dimensions, with Awareness scoring lowest.
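The "average discrepancy" from the representativity quotas can be read as the mean absolute difference, in percentage points, between the target and the achieved share per quota category. A minimal sketch, with entirely hypothetical categories and numbers:

```python
# Sketch: average quota discrepancy as the mean absolute difference in
# percentage points. Categories and shares below are hypothetical.
def avg_quota_discrepancy(target, achieved):
    """target/achieved: dicts mapping category -> share in percent."""
    return sum(abs(target[k] - achieved[k]) for k in target) / len(target)

target = {"female": 50.8, "age 18-29": 20.0, "age 30-49": 33.0}
achieved = {"female": 52.0, "age 18-29": 18.5, "age 30-49": 34.0}
d = avg_quota_discrepancy(target, achieved)  # mean over categories
```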

COVID-19 Apps
Used by Participants. Table 1 shows that the share of German participants who reported use of COVID-19 apps (3.99 % in Round 1) increased almost tenfold with the availability of the Corona-Warn-App in Round 2 (38.9 %) and continued to grow in Round 3 (42.7 %). In the US, usage first increased between Round 1 (6.2 %) and Round 2 (11.3 %) but then dropped between Rounds 2 and 3 (8.6 %). One possible reason could be the quickly progressing COVID-19 vaccination campaign in the US, as our qualitative analysis in Sections 4.4.1 and 4.4.2 hinted at some people thinking an app was no longer necessary once they had received the vaccine.
In Germany in Round 1 (R1), 40 participants reported use of a COVID-19 app, but when asked which, only 11 named a dedicated COVID-19 app, most frequently (7 times) Corona-Datenspende. In R2, when 396 participants (38.90 %) had indicated in the multiple-choice Q19 that they used some kind of COVID-19 app, 325 reported use of the Corona-Warn-App (CWA), 44 […]

Willingness to Use COVID-19 Apps Over Time (RQ 1)
The first part of our evaluation focuses on participants' general willingness to use different types of COVID-19 apps over time.
General Willingness to Use Apps Over Time. First, we look at how willing participants were across survey rounds to use any kind of COVID-19 app, irrespective of purpose. Table 2 shows the average responses on the 7-point Likert scale of Q14 (willingness to use the presented hypothetical COVID-19 apps), along with standard deviations, across all app scenarios by survey round and country.
Overall, participants' willingness to use an app is rather low in both countries across all three survey rounds: average responses are always below the neutral response, which is 4 on the 7-point Likert scale. Values change over the course of the pandemic in Germany and the US. For Germany, we found pairwise statistically significant differences (p < 0.05) between R1 and R2, as well as between R2 and R3. This shows that in Germany the willingness to use COVID-19 apps decreased significantly from R1 to R2 and increased significantly again from R2 to R3. There is no significant difference in the willingness to use any app between R1 and R3. In the US, the willingness to use any COVID-19 app dropped significantly in R3 compared to both other rounds: the differences between R1 and R3, as well as between R2 and R3, were significant (p < 0.05). We did not observe statistically significant differences between R1 and R2. Pairwise comparisons of the average willingness between both countries for each round (i. e., R1-DE vs. R1-US) always yield significant differences between German and US participants.
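A comparison of this kind can be sketched with a one-way ANOVA across rounds followed by Bonferroni-corrected pairwise tests. The data below is simulated 7-point Likert data with invented means; this is an illustrative stand-in, not the authors' exact procedure.

```python
# Sketch of the round-wise comparison on simulated 7-point Likert data:
# one-way ANOVA across rounds, then Bonferroni-corrected pairwise t-tests.
# Sample sizes, means, and spreads are invented for illustration.
import numpy as np
from itertools import combinations
from scipy import stats

rng = np.random.default_rng(1)
rounds = {
    "R1": np.clip(rng.normal(3.4, 1.8, 1000).round(), 1, 7),
    "R2": np.clip(rng.normal(2.9, 1.8, 1000).round(), 1, 7),
    "R3": np.clip(rng.normal(3.3, 1.8, 1000).round(), 1, 7),
}

f_stat, p_anova = stats.f_oneway(*rounds.values())

pairwise = {}
n_tests = 3  # three pairwise comparisons
for a, b in combinations(rounds, 2):
    t, p = stats.ttest_ind(rounds[a], rounds[b])
    pairwise[(a, b)] = min(p * n_tests, 1.0)  # Bonferroni correction
```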
COVID-19 Apps by Purpose. Next, we examined the different types of COVID-19 apps more closely and evaluated participants' willingness to use these apps (Q14) and their perceived utility (Q17) for each distinct purpose: contact tracing, symptom check, quarantine enforcement, health certificate, and information. In this analysis we focused on the percentage of positive responses to the two questions, i. e., all responses higher than the neutral response (> 4) on the 7-point Likert scales. Table 3 shows these percentages by country, survey round, and app purpose. Corresponding mean response values for the two questions can be found in Tables 9 and 10 in Appendix C. Across both countries and all three survey rounds, participants' willingness to use (Q14) a contact tracing app is higher than for any other app purpose. We speculate that this is an effect of apps for digital contact tracing being prominently discussed and subsequently launched in many countries across the globe. In Germany, the willingness to use an app of any type decreases between R1 and R2. The subsequent increase in R3 is less pronounced, resulting in an overall negative trend across all survey rounds. In the US, there is a continuous negative trend over all rounds. We suspect that the temporary decrease in German participants' willingness to use apps across all purposes in R2 is related to lockdowns that were in place at that time due to high incidence rates (see Figure 1). In this situation, participants could have considered COVID-19 apps as unable to make a distinct contribution to containing the pandemic and as offering neither societal nor individual benefits [11]. Similarly, the subsequent increase in R3 could reflect the turn in public policy at that time: lifted lockdowns and lower incidence rates prompting people to socialize more frequently again. While this development can be observed for all types of apps, it is especially prominent for health certificate apps in Germany, where the willingness to use this type of app in R3 (31 %) was even higher than in R1 (26 %). A possible reason could be that German participants associated this app purpose, initially inspired by the Health Code systems deployed in multiple Chinese municipalities [68], with vaccination and test certificates. At the time of R3, there were public discussions about the introduction of (digital) "vaccine passports" and whether they should be made mandatory for various aspects of public life in Germany [37,53,69].
In Figure 2 we investigate how participants' COVID-19 infection status affected their willingness to use different types of COVID-19 apps over time. We observe noticeable differences between participants who had reportedly tested positive and those who stated that they had tested negative: In both countries, participants who had already been confirmed to have the coronavirus exhibited a higher willingness to use COVID-19-related apps (top figures). Among participants who had not yet contracted COVID-19 (bottom figures), the willingness to use apps is noticeably lower. What stands out is that for negatively tested participants in Germany, the willingness to use contact tracing apps is higher than for the other app purposes, while for participants with a negative test result in the United States, the willingness to use both contact tracing and symptom check apps is higher (both bottom figures) compared to those participants who reported having tested positive (top figures). We hypothesize that for individuals who had tested positive, the motivation to further use contact tracing apps in particular was lower than for those who had so far only tested negative and were still trying to avoid an infection. This supports our hypothesis that the perceived individual benefits of COVID-19-related apps influence the decision whether to use an app or not. Overall, our results indicate that participants who had already had a negative experience with the coronavirus (i. e., had been infected) or who had higher concerns about becoming infected (see Figure 8 in Appendix E) tended to be more willing to use COVID-19 apps across all survey rounds. For the perceived utility of COVID-19 apps (Q17), we make similar observations (see the right part of Table 3). Interestingly, the percentages of participants with positive responses for utility are slightly higher than for the willingness to use the respective type of app. This indicates that a certain share of participants consider these apps useful in general but are not actually willing to use the respective app themselves.
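The positive-response share used throughout this analysis (the fraction of Likert answers above the neutral midpoint of 4) can be sketched in a few lines, with hypothetical example responses:

```python
# Share of positive responses: answers above the neutral midpoint (4)
# on the 7-point Likert scale. The example responses are hypothetical.
def positive_share(responses, neutral=4):
    return 100 * sum(r > neutral for r in responses) / len(responses)

share = positive_share([7, 5, 4, 2, 6, 1, 3, 5])  # in percent
```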
Summary RQ 1: Participants' willingness to use COVID-19 apps is generally rather low, with average responses below the neutral midpoint. For contact tracing, it is higher than for any other purpose in both Germany and the US. Over time, there is a slight negative trend in the willingness to use apps for all purposes, with the exception of health certificate apps in Germany. People with higher concern about contracting the coronavirus are more likely to use apps for digital contact tracing. Participants' perceived utility of COVID-19 apps is slightly higher than their willingness to use them.

Factors Influencing the Willingness to Use COVID-19 Apps Over Time (RQ 2)
As described in Section 3, we used ordinal regression analysis to measure the impact of each app scenario factor on participants' willingness to use COVID-19 apps and how this influence evolved over time. In this analysis we distinguish between scenario-specific factors that originate in the ten app scenarios shown to participants and non-scenario factors, which are the responses to the survey questions outside the app scenarios plus demographic information provided by Kantar. In presenting the influence of factors, we further differentiate how significant this influence is over time:
• factors that significantly influence the willingness to use apps across all survey rounds, i. e., that are of continuous importance to participants (highlighted red in our models), and
• factors that have significant influence in one or multiple survey round(s) but not in others, i. e., that change in interaction with the survey round (highlighted blue in our models).
Table 4 presents selected significant factors from the two final CLMM models, one for each country, and Table 5 in Appendix A shows the comprehensive models with all factors that were included in the respective model.

Willingness to Use Apps Over Time.
As shown in the first block of rows in Table 4, the influence of time, i. e., the factor "Survey Round," on participants' willingness to use any of the presented COVID-19-related smartphone apps changed over the course of the pandemic in both Germany and the US, though in different ways: Among German participants, time had a significant positive effect on the willingness to use any COVID-19 app (R2: estimate 0.51, p < 0.05; R3: estimate 0.95, p < 0.01; compared to R1 as baseline). In the US, time had a significant negative effect on the willingness to use an app in both R2 (estimate -1.41, p < 0.15) and R3 (estimate -0.86, p < 0.05) compared to R1.
Since these results may seem counter-intuitive when compared to the overall willingness to use apps (see Section 4.2), we emphasize that the two measure different effects. The estimates in Table 4 present the isolated effect of time on the willingness to use an app (which is positive in Germany), whereas the overall results in Table 2 show the cumulative effects of all factors, resulting in a significantly lower willingness to use apps in R2 in Germany compared to R1.
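Because a cumulative-logit model reports estimates on the log-odds scale, the magnitudes can be made more tangible by exponentiating them into odds ratios; for example, for the German round effects quoted above:

```python
# Converting cumulative-logit estimates into odds ratios: an estimate b
# on the log-odds scale corresponds to an odds ratio of exp(b) for
# responding in a higher willingness category.
import math

or_r2_de = math.exp(0.51)  # Germany, R2 vs. R1 baseline
or_r3_de = math.exp(0.95)  # Germany, R3 vs. R1 baseline
```

Holding all other factors fixed, the R3 estimate of 0.95 thus corresponds to roughly 2.6 times the odds of a higher willingness response than in R1.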

Influence of Factors on the Willingness to Use COVID-19 Apps Over Time in the US.
As mentioned above, when investigating the influence of different scenario-based and other factors on US participants' willingness to use COVID-19 apps over time, we differentiate between factors with continuous significant influence on app adoption and those that were not always significant.
Consistently Influential Factors. Considering only app scenario factors that had consistently significant influence in every survey round (red), we observe in Table 4 that contact tracing was the only app purpose with a positive influence on participants' willingness to use an app (Q14) in the US (estimate 0.21, p < 0.05). Also consistently significant, but with negative estimates, were apps reducing phone battery life (estimate -0.12, p < 0.05) and both identification data levels: demographic data (estimate -0.18, p < 0.05) and data that allowed for unique identification of an individual (estimate -0.25, p < 0.01). Participants who were worried about sharing their data with private companies and law enforcement were also less likely to use any COVID-19-related app (estimates -0.18, p < 0.05 and -0.25, p < 0.01, respectively). As shown by the black entries in the full US model in Table 5 in Appendix A, most app scenario factors (e. g., purpose, data transmission, or retention) did not have significant influence on US participants' willingness to use an app in interaction with the survey round (i. e., over time). For the factors highlighted red (e. g., whether data can be used to identify an individual, or the data receiver), we found significant influence on the willingness to use an app in every survey round, which means that these factors were of continuous importance for US participants' decision whether to use an app.
Factors with changing influence over time. Further, our analysis identified factors that did not have significant influence on US participants' willingness to use COVID-19 apps in every survey round (blue). Regarding app scenario factors, we found such varying influence only for the payload data type "location", which had a significant effect on the willingness to use an app in interaction with both survey rounds R2 and R3 (estimates 0.14 and 0.15, p < 0.05). This significant influence could not be observed in R1 and thus constitutes a change in significance over time.
Non-scenario factors (i. e., those taken from parts of the survey outside the app scenarios) more frequently had significant influence on participants' willingness to use an app in interaction with the survey round. One example is the reported education level: In R2 and R3, participants with high school or higher education were more likely to use any app than in R1, compared to those with secondary school or less. This effect was stronger for participants with a higher formal education level and also stronger in R3 than in R2, as indicated by the higher estimates.
Another influential factor in interaction with time was participants' own experience with COVID-19 (Q4), which had an effect in interaction with R3. Here, participants who had tested positive were less likely to use any app (estimate -0.77, p < 0.001), and those who had not yet been tested but suspected they had already been infected were more likely to use any app (estimate 0.34, p < 0.05). The right-hand side of Figure 3 shows the (simplified) dependency between US participants' reported infection status and their willingness to use an app over time.
The analysis over time also shows that in R3, participants of higher age exhibited significantly less interest in using any app, although the effect size is rather small (estimate -0.01, p < 0.001). In R3, those with an unfavorable opinion of health authorities (estimate -0.77, p < 0.001) or the federal government (estimate -0.40, p < 0.001) were less likely to use any app, as were those with a favorable view of their state government (estimate -0.39, p < 0.001).
The influence of US participants' privacy concerns, as indicated by their IUIPC scores, also changed significantly over time. Both the Collection and the Awareness dimension had a significant influence on participants' willingness to use a COVID-19 app in R2 and R3 compared to R1: a positive effect for Collection and a negative one for Awareness (see Table 4). Overall, US participants' willingness to use any variant of COVID-19 app related to the IUIPC dimension Collection decreased significantly over the survey rounds.

Influence of Factors on the Willingness to Use COVID-19 Apps Over Time in Germany.
As in the US analysis, for Germany we also differentiate between factors with consistently significant influence and those that were significant in some survey round(s) but not in others.
Consistently influential factors. Our CLMM model for Germany shows that almost all app scenario factors with continuous significant influence on participants' willingness to use an app in the US also had such consistent influence over all rounds in Germany, though effect sizes (estimates) and significance levels (p-values) varied: The factor levels "contact tracing" and "reduced battery life" again revealed a continuous positive and negative influence, respectively (estimates 0.27, p < 0.01 and -0.09, p < 0.01). Although contact tracing had a continuous significant influence across all three survey rounds in both countries, Figure 4 shows noticeable differences between them: During survey rounds R1 and R3, the mean willingness to use an app is higher in Germany, which we attribute to the general availability of COVID-19-related apps, mainly provided by the federal government in Germany; this aligns with the actual usage of COVID-19 apps (see Table 1). Conversely, in survey round 2, both countries exhibit comparable mean values for the app purpose "contact tracing". As discussed in Section 4.2, we hypothesize that this observation, i. e., the lower willingness to use apps in R2 in Germany, correlates with the governmental enforcement of a stringent lockdown caused by a rapid increase in COVID-19 infection rates (see Figure 1), resulting in an overall reduced perceived utility of contact tracing apps due to limited social contacts.
Any app used to (uniquely) identify an individual also negatively impacted participants' willingness to use it over all survey rounds. As in the US, the same reduced willingness to use any COVID-19 app emerged for certain app data receivers, namely private companies (estimate -0.19, p < 0.01) and law enforcement (estimate -0.26, p < 0.01). Unlike in the US, the public serving as the data receiver also consistently had a negative influence over all three survey rounds (estimate -0.20, p < 0.01). Moreover, apps with the societal implication "faster update of local COVID-19 regulations" had a significant impact in all rounds in Germany (estimate 0.12, p < 0.01). German participants' IUIPC scores for both the dimensions Collection (estimate 0.11, p < 0.01) and Awareness (estimate 0.05, p < 0.05) also showed a positive influence on the willingness to use any app in all three rounds.
Factors with changing influence over time. Similar to the US data, the influence of most factors from app scenarios did not change over time. One notable exception is the app purpose "health certificate" in R3, for which German participants' willingness to use a health certificate app significantly increased (estimate 0.29, p < 0.01), as shown in Figure 4. This development coincides with the increased availability of vaccines at the time of R3 in Germany and ongoing discussions in the media about privileges for vaccinated people and the planned introduction of digital vaccine passports.
Beyond app scenarios, the model for Germany in Table 5 also shows significant interactions between the survey round and participants' experience with COVID-19. In R2 and R3, participants who had already tested negative were significantly more likely to install an app than in R1 (p < 0.001 for both interactions; estimates 0.64 in R2 and 0.61 in R3). At the same time, those who had not been tested but suspected they had already been infected were less likely to use an app in R2 (estimate -0.27, p < 0.05); this interaction was not significant in R3. A likely explanation for the increase in the willingness to use an app among already tested participants is that the Corona-Warn-App was not yet available at the time of the first survey round. By the time of R3, it had been upgraded with the ability to register one's COVID-19 tests and quickly receive the results digitally. This could hint at people who were willing (or required) to get tested considering the app a good digital support tool. For participants who assumed they had already been infected and had not been tested, this functionality did not provide any use case.
The model also shows a significant interaction for education level over time. A medium education level (high school or associate degree) in interaction with the second survey round negatively impacted participants' willingness to use an app (compared to R1). This effect was not present in R3 and might be due to increased app skepticism in this group during the winter 2020 wave. We also observed a significant positive interaction between people's satisfaction with the battery life of their phone and their willingness to use an app in R3. This survey round took place in May 2021, with fewer restrictions in place than at the time of R2, so people might have spent more time outside their homes, rendering their phone's battery life more important.
Further, participants who found pandemic measures too lenient were more likely to install an app in R1 but less likely to do so in later survey rounds. People who found the measures against the spread of the coronavirus about right were more willing to use an app in R2 than in R1. We did not observe this effect in R3. This reflects that more effective methods against the spread of COVID-19, namely vaccines, had become available by the time of survey round 3.
Summary RQ 2: We identified factors that impact people's willingness to use COVID-19-related smartphone apps and change significantly over the course of the first 1.5 years of the pandemic, including increased acceptance of health certificate apps in Germany. Other factors remain consistently significant across all three survey rounds in both countries, including the app purpose contact tracing, unique identification of individuals or transmission of their demographic data, and private companies or law enforcement as data recipients. Factors related to the data processing practices of apps less frequently have significant influence on people's willingness to use them than individual experience with the pandemic and stances towards certain public and private organizations.

Perception of COVID-19 Apps Over Time (RQ 3)
Our qualitative analysis of the open-ended responses in our questionnaire provides insights into how participants perceived COVID-19 apps over the course of the first 1.5 years of the pandemic.

Reasons Not to Use COVID-19 Apps. We found participants' reasons to delete previously downloaded apps (Q23) to be a subset of those not to use COVID-19 apps in the first place (Q21), so we applied the same codebook. Further, the answers to these questions were similar to the perceived negative aspects of such apps (Q12), leading to codebooks with a large overlap, shown as the y-axis labels in Figure 5. For all open-ended questions, each answer was assigned one or multiple codes. Figure 5 shows how often each code was assigned. For the codebook for reasons not to use an app or to delete a previously installed app, the weighted Krippendorff's alpha value to determine inter-rater agreement (see Section 3.5.2) was 0.90. Values for individual codes ranged from 0.52 for the "unclear" code to 1.
For the reasons not to use a COVID-19 app in the first place (Q21; Figure 5 (a)), we received a varying number of open-ended responses over time, as the availability of COVID-19 apps in the respective country, and consequently the number of app users, changed: in Germany 685, 310, and 412 answers in Rounds 1-3, and in the US 702, 634, and 786, respectively. Correspondingly, we observe a drop in the prevalence of "no app" or "no suitable app" as a reason: numbers dropped to almost zero in Germany in Rounds 2 and 3 after the Corona-Warn-App had become available, while in the US we observe a drop by approximately 50 % in R2, when participants had also started to name state-specific COVID-19 apps in Q20 (see Section 4.1.2). At the same time, with the availability of the Corona-Warn-App (see Section 3.4.1), the number of German participants who referred to technical issues as the reason not to adopt a COVID-19 app increased in R2, as hands-on experience with COVID-19 apps had become more widespread.

Perceived Negative Aspects of COVID-19 Apps.
In Q12, all participants were asked which aspects they generally found negative about smartphone apps designed to help fight the pandemic. The resulting codebook, shown in Figure 5 (b) and Appendix C, contained 26 codes. Its weighted Krippendorff's alpha value was 0.94, with values for individual codes ranging from 0.49 to 1.
As with the reasons not to use an app, over the course of the first 1.5 years of the pandemic we observe a steady decrease in the number of people who worried about government surveillance and their personal autonomy. For privacy, this holds true only between R1 and R3: both countries exhibited their lowest number of participants with privacy worries in R2, when infection rates were about to rise to new all-time highs in both countries and lockdowns were in place throughout Germany. Note that this effect was not observed in the reasons not to use an app (see Section 4.4.1). There, privacy was still one of the most frequently named reasons but was overshadowed by the notion that the app was unnecessary, i. e., notions of individual utility.
In Germany, where national government-backed COVID-19 apps had been available since Round 2, we find both an increase in the number of participants who mentioned malfunctions ("false alarms" [R3-DE-815], "outdated information" [R2-DE-3]) and aspects revolving around an app's user base ("incorrect information supplied by users" [R3-DE-408], "many do not enter their positive test result" [R2-DE-408]). By contrast, in the US these numbers slowly decreased across survey rounds. An explanation could be that widespread or prolonged use of an app made people more aware of its drawbacks in everyday use.
Across rounds, we also find an increase in the number of participants who stated that apps designed to help fight the COVID-19 pandemic had no negative aspects.This was also the case for perceived positive aspects (see Figure 6 (a)), so it appears that over the course of the pandemic people became more indifferent in their stance towards COVID-19 apps.

Perceived Positive Aspects of COVID-19 Apps. For positive aspects of COVID-19 apps, we assigned 24 different codes, as shown in Figure 6 (a) and Appendix C. The weighted Krippendorff's alpha for this codebook was 0.91, with individual positive values ranging from 0.63 to 1. The code "health certificate" produced a negative alpha value because it was relatively rare and used by only one coder in the data used to compute inter-rater agreement.
As just mentioned in Section 4.4.3, we found an increasing number of people who perceived COVID-19 apps to have no positive aspects, which adds to the evidence of growing indifference towards such apps. Between R1 and R2 we also observe a noticeable drop in the number of people who provided generic statements that these apps were a useful tool against the pandemic ("it will keep the pandemic down" [R3-US-230], "it can slow down the spread" [R1-DE-263], "the pandemic is over sooner" [R2-DE-985]). This could be an effect of the prolonged duration of the pandemic dampening people's expectations of what an app can actually contribute.
Looking at concrete aspects, the use of apps for contact tracing was regarded as positive by a slowly increasing number of participants in Germany, while US numbers steadily declined between rounds. This could be an effect of Germany having issued a national contact tracing app and several of its states having used the privately developed app Luca as part of their anti-pandemic strategy (see Section 3.4.1), while such apps have only been rolled out at the state level in the US, resulting in less hands-on experience with them.

4.4.5
The Ideal COVID-19 App. Since many participants in R2 expressed dissatisfaction with existing COVID-19 apps, we added Q13 in R3, asking which functionalities they desired in smartphone apps designed to help fight the pandemic. We received 1026 answers from participants in Germany and 1052 from the US. Many answers described functionality already found and perceived to be positive in existing apps, so we used the codebook for perceived positive aspects as a starting point, adding new (sub)codes that occurred frequently and removing those that did not relate to an app's functionality. The final codebook, shown as the y-axis labels in Figure 6 (b) and in Appendix C, comprised 24 codes. The weighted Krippendorff's alpha for this codebook was 0.88, with individual positive values ranging from 0.45 for the "unclear" code to 1.
The high number of "Don't know" answers (DE: 257, US: 222) indicates that participants found this question difficult to answer. Another large share of participants (DE: 199, US: 268) provided negative sentiments towards COVID-19 apps ("none" [R3-US-5], "one to uninstall" [R3-DE-14], "I don't want an app" [R3-DE-274], "rubbish" [R3-US-67]), which we coded as the ideal COVID-19 app being "None." As with perceived positive aspects, we found patterns of participants desiring functionality they were already familiar with from existing apps: contact tracing in both countries, and vaccine passports and functionality to manage test results in Germany, where another 34 participants directly referred to the functionality of existing apps. Some participants also wished for functionality that, with the current state of technology, is impossible to provide by a smartphone app. Most notable (DE: 31, US: 34) was the wish for real-time detection of the virus or infected people in proximity to the app user: "alert of any confirmed infected people that may be within a certain radius" (R3-US-611), "alert me if a sick person is moving towards me" (R3-DE-128), "location of positive cases" (R3-US-1038). A smaller number of participants, particularly in the US, wished for direct treatment ("treat and diagnosing the virus" [...]).

DISCUSSION
Many of our findings generalize beyond the COVID-19 pandemic. In particular, as the factor levels used in the app scenarios were originally derived from real-world COVID-19 apps issued or backed by national governments [68], our findings can inform the design of future apps issued by public actors that collect and process users' personal information for the benefit of society as a whole and would benefit from widespread voluntary adoption. Possible use cases could be apps to tackle future health crises on a more general scale, apps for the information and management of natural disasters like tornadoes and floods, or apps that simply collect citizen-provided data to inform projects to improve local or national infrastructure.

The Continuous Importance of Data Privacy
Our analyses revealed multiple factors of continuous importance for people's willingness to adopt COVID-19 apps and their perception of such apps, most of which relate to data privacy.
In our CLMM models, we found less privacy-preserving implementations to consistently reduce people's willingness to use an app. For example, apps that transmit data allowing unique identification of the individual, or that merely collect demographic information, had a continuously negative impact on both US and German participants' willingness to use a COVID-19 app. Note that the role of user anonymity depends on cultural norms: our earlier work [68] found the exact opposite effect for China, where identifiability of the individual had a significant positive influence on people's willingness to use a COVID-19 app.
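For readers unfamiliar with cumulative link models, the following sketch (with hypothetical threshold and coefficient values, not estimates from our data) illustrates how a negative coefficient translates into answer probabilities on a 7-point willingness scale:

```python
import math

def logistic(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def category_probs(eta: float, thresholds: list) -> list:
    """Category probabilities under a cumulative logit model.

    eta: the linear predictor x'beta for one respondent/scenario.
    thresholds: ordered cut points theta_1 < ... < theta_{J-1} between
    the J ordinal response categories (the 7-point willingness scale).
    """
    # Cumulative probabilities P(Y <= j) = logistic(theta_j - eta).
    cum = [logistic(t - eta) for t in thresholds]
    # Convert cumulative probabilities to per-category probabilities.
    probs = [cum[0]]
    probs += [cum[j] - cum[j - 1] for j in range(1, len(cum))]
    probs.append(1.0 - cum[-1])
    return probs
```

With thresholds placed symmetrically around zero, e.g., `[-2.5, -1.5, -0.5, 0.5, 1.5, 2.5]`, lowering the linear predictor by a negative coefficient (say, for a scenario that collects identifying data) shifts probability mass away from "very likely" towards "very unlikely", which is exactly what the negative CLMM estimates express.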
In the same vein, sharing data with third parties who are not directly concerned with pandemic control (i.e., private companies or law enforcement) also significantly reduced people's willingness to use COVID-19 apps. Participants were also more willing to use apps for less invasive purposes, e.g., apps for contact tracing and information as opposed to apps enforcing quarantine or for symptom checks that prompted for sensitive health information.
Combining these observations, we can deduce that people expect the data flows of an app to be appropriate to the context in which they occur, confirming Nissenbaum's theory of privacy as contextual integrity [50].
These findings are supported by the open-ended survey responses: privacy was the most frequently mentioned negative aspect of COVID-19 apps in both countries and even a frequent reason not to use COVID-19 apps at all. The still comparably high adoption of actual apps in Germany can also be explained by the theory of contextual integrity: while privacy has high value for a participant in a general context, its importance can be re-evaluated in a specific situation, such as a global pandemic in which some actors tie fundamental rights to app use.
This also explains our finding that in both countries privacy concerns were voiced as negative aspects of COVID-19 apps more frequently in the first and third survey rounds than in the second. The pandemic state around R2 coincided with the highest incidence rate across all three survey rounds and severe contact and travel restrictions in an attempt to slow down rising infection rates, particularly in Germany. This again implies that people may still consider privacy important but are more willing to trade it for a greater good under extraordinary circumstances, temporarily adjusting what they consider "appropriate" data flows.
Taking all of this into account, we recommend that future state-issued apps intended for widespread adoption follow a "privacy by design" approach and only collect the data necessary to implement the desired functionality. As the discussion of different architectures for digital contact tracing has shown, such requirements can lead to the development of innovative new technology that reconciles public actors' need to collect information with citizens' legitimate privacy concerns.

Changing Influences: External Factors and Experience with COVID-19 and Apps
We also identified factors whose influence on people's willingness to adopt a COVID-19 app changed over time. Looking at commonalities between these factors, we find that most relate to external events that are subject to change over a pandemic and have nothing to do with the inherent properties of an app: a person's individual experience with the disease, such as test results or suspected past infections; how they rate measures taken in their community for pandemic containment and control; and their opinion of actors in the pandemic, including health authorities and federal and local governments. The latter directly relates to the importance of trust in times of public crisis (see Section 5.3).
For app designers, such externalities are difficult or impossible to predict and to consider in the app design process, and updates that try to accommodate such changes may come too late.
It appears more attainable to consider the various events that may occur in the type of crisis an app is designed for and to account for the changes in user stance and behavior that these events may cause. One such example is the effect of lockdowns or stay-at-home orders: if people stay home more, they have fewer use cases for digital contact tracing apps that record their recent encounters with other people or health certificate apps that provide access to certain locations or services. Apps could display information that points out the importance of their continued use (see also Section 5.4) to encourage people to keep using them despite temporarily changed circumstances.
Beyond such externalities, we found that the availability of apps and experience with them, be it first-hand or reported, naturally changed over time and had a major influence on people's stance towards COVID-19 apps and their reported and actual willingness to adopt them.
The actual use of COVID-19-related smartphone apps in Germany continuously increased over the first 1.5 years of the pandemic (up to 43 % in R3), while in the US the adoption rate remained comparatively low (peaking at 11 % in R2). Accompanying these developments in actual app use, our analysis in Section 4.3 found similar changes for both countries in the influence of the three survey rounds on people's willingness to use (hypothetical) apps (see Table 5). While in Germany the willingness to use apps was significantly higher in R2 and R3 compared to R1, we observe the exact opposite effect in the US: later survey rounds had significantly negative effects on people's willingness to use COVID-19 apps. We assume that these diverging developments can be partially attributed to the availability of a federal COVID-19 app in the respective country. While in Germany the government-backed Corona-Warn-App was widely used, no comparable contact tracing solution was available at the federal level in the US, which may be a reason for the comparably low adoption in practice. A lack of hands-on experience with a given technology and its benefits may negatively impact people's stance towards it and, subsequently, their willingness to use it.
Again, timely open public discussion of apps to be issued by public entities for wide adoption in crisis situations can help familiarize prospective users with their functionality and benefits to foster understanding and the willingness to use them.
Our qualitative analysis found misconceptions underlying people's negative stance towards COVID-19 apps and their suggestions for future functionality, likely caused by incorrect assumptions about the purpose of an app or what is possible with the respective technology. Missing feedback from an app (contact tracing apps, for example, only visibly change their user interface when a risk encounter has been registered) might cause misconceptions or the perception that the app does not work properly. Kotthaus et al. [40] found similar effects for the perception of crisis apps in 2016.
For the development of future government-issued apps whose efficiency depends on widespread voluntary adoption, we recommend describing the intended purpose and functionality of the app more accurately and highlighting personal and societal benefits to motivate individuals to use it, while also clearly communicating its boundaries. Explaining what an app can and cannot do could avoid raising false expectations that are never met. As we found people's perception of COVID-19 apps to change over time, particularly under the influence of external effects such as trust in institutions and public policy, this communication must consider such changes: the aforementioned information should not just be provided once to motivate app installation but be continuously maintained and adjusted throughout the lifetime of an app to clearly communicate the benefits and importance of its continued use.

The Role of Trust in Actors and Institutions in Crisis Management
In both the US and Germany, we identified a favorable attitude towards state authorities to have a significant positive impact on participants' willingness to use COVID-19 apps across all three survey rounds. As discussed in more detail in Section 2.2, Dressel [21] has shown in work conducted before the COVID-19 pandemic that people in high-trust states (state-oriented risk cultures) are willing to contribute individually in a crisis, are interested in being integrated into crisis management, and that trust in authorities is an important factor for the use of mobile crisis apps. For the particular case of the Corona-Warn-App for digital contact tracing released by the German government and health authorities, the research findings of Reuter et al. [56] were confirmed in a real crisis: citizens have a positive attitude toward mobile crisis apps, implying that government-issued apps are trusted in times of crisis. By contrast, data being sent to private companies or law enforcement had a significant negative impact on reported app adoption. Thus, public actors issuing apps designed for widespread adoption in public crises should carefully consider who they partner with for the development of such apps, what third-party components are used, and who has access to the collected data. The purpose and data processing practices of the app must be clearly outlined beforehand to foster people's trust that the app only does what it claims to do. Ideally, the source code is made publicly available to allow for review by security and privacy experts.
Once established, trust must be continuously maintained: institutions need to take care not to succumb to "function creep" [12] and exceed the capabilities and data processing practices they originally promised an app to have. Though we found trust in institutions to have varying influence on people's willingness to use COVID-19 apps, real-world events such as the surrender of contact tracing data from the app Luca to the police [55] have shown that trust in an app can quickly be damaged. Similarly, ongoing discussions about the future of existing COVID-19 apps and what will happen to the collected data face the argument that these apps were introduced as tools for a very specific purpose and a limited period of time in a global crisis, a mandate that should not be extended or overstepped retroactively. This again highlights the importance of trust in an app and the entities behind it.

Societal vs. Individual Benefits
Real-world adoption of COVID-19 apps and related research have shown that people generally are willing to install and use smartphone apps for societal benefit, though not without limits.
First, people's opinions appear to be shaped by what they are already familiar with through media discussions or actual use in their country. In our CLMM models, only the app purpose contact tracing had a significant positive influence on participants' willingness to use apps over all three survey rounds in both countries, confirming the results of Dressel [21] that people are motivated to make an individual contribution to help overcome a public crisis. The same applies to the significant change in participants' willingness to use health certificate apps in Germany in Round 3, a finding confirmed for actual use of such apps by other work from our research group [33], which found in a representative study in Germany in December 2021 that 70 % of participants actually used an app of this type (digital COVID certificate).
Notably, the high reported user bases for these apps, particularly health certificates, could partly be due to the individual benefits they granted to users: for digital contact tracing apps, the ability to estimate one's own risk of recent exposure to COVID-19, and for health certificates, access to locations or services only available to people with a certain health status, such as vaccination or a recent negative test result. We found evidence in our data that some people were not or no longer willing to use COVID-19 apps if they did not consider them to provide sufficient individual benefit: across all three survey rounds, the most prominent reason participants named for not using a COVID-19 app was such apps being unnecessary in a generic way, and multiple US participants in R3 reported having deleted a COVID-19 app because they no longer considered it necessary after receiving the vaccine. For the development of future apps designed for widespread adoption in emergency or crisis situations, this means that designers should keep communicating to users how they, as individuals, could benefit from (continued) use of an app that was mainly designed to benefit society as a whole rather than to provide individual short-term advantages.
Despite the above observation, there were real-life COVID-19 apps that were downloaded and used even though they did not provide an immediate personal benefit to users. This is supported by our CLMM models, in both of which the promise of individual benefits in the "societal implications" factor did not have a significant influence on participants' decision to use a COVID-19 app in any of the survey rounds. One example of a COVID-19 app with no immediate personal benefit was the German Corona-Datenspende ("Corona Data Donation") [57], which allowed people to contribute their health and activity data for scientific research. As it could only be used by people who owned a fitness tracker, its user base was nowhere near as large as those of national contact tracing apps, so there are limited insights into users' decisions for (prolonged) use of the app. Future work could investigate under what circumstances people would be willing to adopt apps for purely altruistic purposes.
In the EU, and in Germany in particular, public debates about the privacy, security, and societal implications of government-backed health applications arose during their development phase and in advance of their release. This ultimately led to the more privacy-friendly approach of a decentralized, Bluetooth-proximity-based design, e.g., the Corona-Warn-App for digital contact tracing in Germany, which enjoys comparatively high acceptance.

CONCLUSION
The COVID-19 pandemic has demonstrated how smartphone apps can make a meaningful contribution in a public (health) crisis and help slow down the spread of certain types of infectious diseases, but at times only when they are widely adopted. We conducted online surveys in Germany and the United States at three distinct points in time during the first 1.5 years of the COVID-19 pandemic to learn how people's attitudes towards smartphone apps designed to aid the fight against COVID-19 developed over the course of the ongoing pandemic. Fears of surveillance and perceived risks for privacy, security, and personal autonomy became less pronounced over time, while people's adoption of, but also indifference to, such apps increased. We found external factors such as pandemic state and containment measures to influence the willingness to use COVID-19 apps. Still, we identified future opportunities to foster the use of these and similar apps: respecting users' expectations of privacy and the appropriateness of data flows, along with focusing on perceived utility and user understanding of the functionality of an app and the importance of its continued use through transparency and explanations.

B QUESTIONNAIRE
Please note: Questions marked with * indicate those for which we collected responses but did not report results in this paper for reasons of scope. Still, we report these questions here for the sake of completeness.

Welcome Text
The current situation with the novel coronavirus (SARS-CoV-2) and the disease it causes (COVID-19) has sparked an intense debate about the use of smartphone apps to better understand and contain the spread of the virus.This study investigates how people perceive apps that promise to help fight the COVID-19 pandemic and what they expect from them.

Phone Use
First we would like to ask you a few questions about the smartphone you mainly use.
Q1: Do you own a smartphone? [single choice: Yes / No / Prefer not to answer]
Q2: If "Yes" in Q1: What is your phone's operating system? [single choice: Android/Google, iOS/Apple, Other (please specify:), Don't know, Prefer not to answer]
Q3: If "Yes" in Q1: How satisfied are you with the following aspects of your smartphone? [array of single-choice questions for: Battery life, Location accuracy (GPS), Camera quality, Speed (of apps)] [answer options for each: Very satisfied, Satisfied, Neither satisfied nor dissatisfied, Dissatisfied, Prefer not to answer]

Coronavirus Experience
Now we would like to ask you some questions about your experience with the novel coronavirus. This study uses the following terminology:
• "coronavirus": the novel coronavirus (SARS-CoV-2) that has caused a global pandemic in early 2020;
• "COVID-19": coronavirus disease 19, the respiratory disease caused by this virus;
• "corona apps": smartphone apps specifically developed to help limit the spread of the COVID-19 pandemic.
Q4: Are you or have you been infected with the novel coronavirus? [single choice; answer options: I was tested for coronavirus and at least one of the results was positive. / I was tested for coronavirus and all results were negative. / I was not tested for coronavirus and I do not think I have been infected. / I was not tested for coronavirus, but I suspect that I might have been infected. / Prefer not to answer]
Q5: Is there a person in your social circle who is or has been infected with the novel coronavirus? [single choice: Yes / No / Prefer not to answer]
Q6: Have you been quarantined or did you quarantine yourself because of coronavirus? [single choice: Yes / No / Prefer not to answer]
Q7: How concerned are you that you or someone you are close to will become infected with the coronavirus? [single choice: Not at all concerned / Slightly concerned / Somewhat concerned / Moderately concerned / Extremely concerned / Prefer not to answer]
Q8: To the best of your knowledge, is there a person at higher risk in your household, i.e., an older adult or a person of any age who has a serious underlying medical condition? [single choice: Yes / No / Prefer not to answer]
Q9: Round 3 only: Are you vaccinated against COVID-19? [single choice; answer options: Yes, I am fully vaccinated / Yes, I am partially vaccinated / No, but I plan to get vaccinated in the future / No, and I do not plan to get vaccinated in the future / Prefer not to answer]

Perception of Corona Apps
Note: In Round 1, this section was displayed before the app scenarios. We moved these questions for Rounds 2 and 3 to reduce bias due to participants' prior exposure to the app scenarios. For each app, you will be asked a few questions. Please consider the app's purpose and functionalities while answering the questions. Always assume that you are free to choose whether or not you install and use the app.

App Scenarios
Note: This question group was displayed 10 times, only with different scenarios.
Sample Scenario. Imagine an app that provides information about the novel coronavirus and the disease it causes, COVID-19. The app uses your current or past location(s) and your COVID-19 infection status. In addition, the app collects general statistical data about you. This data is sent to research institutions when you request information and it will be stored for an unspecified period of time. It is possible that the app cannot download and display current information due to technical problems. In the future there may be other possible benefits of using this app.
Q14: How likely are you to use this app on your smartphone? [single choice: 7-point scale with end points "Very unlikely" and "Very likely", plus "Prefer not to answer"]
Q15: How many people in (Germany | the United States) do you expect to use this app on their smartphones? [single choice: 7-point scale with end points "No one" and "Everyone", plus "Prefer not to answer"]
Q16: Please complete the following statement: Most people who are important to me think that I ... [single choice: 7-point scale with end points "should not use this app" and "should use this app", plus "Prefer not to answer"]
Q17: How useful do you rate this app in helping contain the spread of the COVID-19 pandemic?

RQ1: How does people's willingness to use COVID-19 apps for different purposes change over time and what is their perceived utility?

Fig. 1. Survey timeline for Germany and the US, including COVID-19 incidence rates (new infections per 100,000 people in the last 7 days) [29].

4.2 Willingness to Use and Perceived Utility of COVID-19 Apps Over Time (RQ1)

Fig. 2. Willingness to use COVID-19 apps, differentiated by participants who had tested positive (top) or negative (bottom) in Germany (left) and the United States (right).

Fig. 3. Willingness to use any app in Germany (left) and the United States (right) in relation to the reported COVID-19 status over time.

Fig. 4. Willingness to use health certificate (left) and contact tracing apps (right) over time.

(a) and (b). The codebook for reasons to not (or no longer) use an app comprised 31 codes, that for negative aspects 26. Unique to the former were reasons rooted in individual behavior or availability, and statements that merely expressed unwillingness to use apps without providing any reason. Only the codebook for negative aspects contained codes for generic negative statements and "none." Vaccination was added as a new reason to both codebooks only in Round 3. The full codebooks with examples for each code are available in Appendix C.

• Information types (What types of data are concerned?)
• Transmission principles (What are the means of the data transmission?)

[...] 5 % being fully vaccinated [27]. Several US states had issued bans against basing access to certain goods and services on COVID-19 vaccination status [19]. Only the state of New York had issued a vaccine passport app as of US Round 3, Excelsior Pass (Plus) [65]. Additionally, private entities had started to issue health certificate apps, such as the IATA Travel Pass published on April 15, 2021 (and decommissioned by the time of writing), which could be used to provide proof of vaccination for air travel and was recognized by 290 different airlines [34].

Table 1. Participant demographics, with population statistics as reported by our panel provider. Table 1 further shows the demographics of our participants (gender, age, and education as reported by our panel provider), the percentage of participants who used at least one COVID-19-related app, and the mean values of their IUIPC scores.
of Corona-Datenspende, and 111 of other apps, including contact tracing apps of other countries when on vacation. 64 participants indicated having deleted a previously used app (50 CWA, 13 Datenspende, and 7 another app). In R3, 439 participants (42.70 %) used a COVID-19 app, broken down into concrete apps as follows: 369 CWA, 70 Datenspende, 172 Luca, and 82 other, including vaccine passports and apps showing current COVID-19 regulations. 107 participants had deleted an app (72 CWA, 22 Datenspende, 24 Luca, and 17 another app). In the US, 62 participants (6.18 %) indicated using a COVID-19 app (Q18) in R1, but only 7 out of 47 open-ended answers to Q20 named a COVID-specific app, most frequently (3 times) the app How We Feel. R2 yielded 113 (11.30 %) self-reported app users, and out of 96 open-ended answers 30 used designated COVID-19 apps, most frequently state-specific contact tracing apps like Covid Alert NY, CA Notify, or Slow Covid NC. For R3 we counted 92 (8.58 %) affirmative answers to Q18 and as many open-ended responses, with 19 answers referring to COVID-specific apps, again mostly state-specific apps. The majority were contact tracing apps, but one participant named New York State's vaccine passport app.

Table 2. Mean response values (± sd) related to the general willingness to use COVID-19 apps (Q14). Data marked with an asterisk is significantly lower (p < 0.05) than in the two other rounds in the same country.

Table 4. Estimates from the cumulative link mixed models (CLMMs) for participants' willingness to use COVID-19 apps for survey round, app scenario factors (e.g., app purpose), and non-app scenario factors (IUIPC). The statistics for factors that are consistently significant in all survey rounds are red, i.e., factors that keep having a significant (positive or negative) impact on the willingness to use apps over time. The blue suffixes R2 and R3 indicate a significant change compared to round R1, i.e., a change in interaction with the first survey round. Significance levels are indicated with stars (* p < .05, ** p < .01).


Table 5
Q10: Do you know of any app recommended by the public authorities in (the United States | Germany) that can be used to ... [array of single-choice questions; answer options for each: Yes, No, Unsure, Prefer not to answer]
• ... get information about the novel coronavirus and its spread?
• ... check if you have coronavirus-related symptoms?
• ... enforce quarantine?
• ... identify people you have been in close contact with and alert them in case you tested positive for coronavirus?
• ... provide information about your health and needs to be shown if you want to visit a certain place?
Q11: In general, what do you consider positive aspects of smartphone apps to help limit the spread of the COVID-19 pandemic? [free text]
Q12: In general, what do you consider negative aspects of smartphone apps to help limit the spread of the COVID-19 pandemic? [free text]
Q13: Round 3 only: Imagine you could wish for a smartphone app to help contain the COVID-19 pandemic. What functionality would this app have? [free text]

Introduction to App Scenarios
In the following, you will be shown different apps to find out what kind of corona apps you would prefer to use. The presented apps are fictional and have different purposes and implement different functionalities.

Table 7. Codebook for positive aspects of COVID-19 apps (Q11)

Use of Corona Apps
Q18: If "Yes" in Q1, for US in all rounds, for Germany only in Round 1: Do you use any (kind of) corona app(s) on your smartphone? [single choice; answer options: Yes / No / Rounds 2 and 3: I had previously installed an app but I have since deleted it / Don't know / Prefer not to answer]