Privacy Concerns of Student Data Shared with Instructors in an Online Learning Management System

Learning management systems facilitate communication between instructors and students, the dissemination of lecture materials, and the grading of assignments. They collect large amounts of student data, necessary or otherwise, with or without explicit consent from students. Furthermore, they make these data visible to instructors, which could have significant implications for students' grades and classroom experience. In this study, we interviewed 31 students enrolled at a large public university about their privacy concerns regarding different data sharing practices related to the learning management system used at their university, Canvas. Two researchers analyzed the data from the study using inductive thematic analysis. The results show concerns about misrepresentation, the justification for information being visible, and discrimination. We present the implications of this study for instruction, the design of learning management systems, and policy.


INTRODUCTION
Learning Management Systems (LMSs), such as Canvas, have become essential to higher education and have significantly changed the interactions between students, instructors, and course content [12,45,60]. LMSs are used for both in-person and online education to facilitate communication between instructors and students, disseminate course materials, collect and grade assignments, and administer online assessments. Alongside these critical functions, LMSs collect a large volume of student data to enhance learning analytics (LA) [15,52], which offers the possibility of using big data to optimize learning [5,43,44]. Since LMSs contain confidential student data, they have become targets for an increasing number of cyberthreats [42,62]. Despite their prominence in the day-to-day lives of students [18] and their value to external actors, limited past work has analyzed privacy issues in LMSs [32]. LMS companies like Instructure (Canvas' parent company) and Moodle also acknowledge that such data are sensitive and valuable.¹ Nevertheless, the focus has been on addressing privacy threats to student data from external sources; the privacy concerns that exist within LMS platforms (between students, instructors, advisors, and administrators) have not been sufficiently identified and understood.
When LMSs were first introduced, researchers compared them to the panopticon [23,31], a prison in which a centrally located watchtower could surveil many separated prisoners, who never knew when the jailer might be watching [16]. Students are required to use the LMS but never know when or how the instructor, who is in a position of power over them, is monitoring them and using information about them or their activity data to grade their performance. This can create a surveillance relationship in which students fear misrepresentation and may not see the data collection as justified. The risks of misrepresentation and fairness in surveillance by online proctoring technologies captured the media's attention during the COVID-19 pandemic [6,24,71], but the risks of misrepresentation emerging from more overlooked LMS data have yet to be reckoned with.
Much of the focus on privacy and surveillance in educational technology has been on privacy issues related to LA, because of the noticeable privacy risks that students are exposed to, often without their awareness [34,38,47,64,65]. However, one LMS data sharing practice that has not been analyzed in prior work is the sharing of large amounts of "raw" student data directly with instructors without any meaningful student consent [32]. For instance, information such as when an individual student last logged in, when a student turned in an assignment, or when a student clicked on a page in the LMS is made available to instructors. This LMS data sharing often lacks an obvious purpose, which distinguishes it from LA data that is deliberately used for understanding and optimizing education. While misuse of student data aggregated for developing LA could have broad and systemic impacts on students, direct sharing of student data with instructors could have more immediate and marginalizing effects. So, what are students' perspectives and concerns regarding the amount of personal and detailed information visible to instructors and others in positions of power, such as teaching assistants, in an LMS like Canvas? This work aims to answer this central research question. The student data we are concerned with in this work is personal (not aggregate), very detailed, and largely overlooked. Our contributions are three-fold:

• We identify specific concerns that students expressed when informed about the several kinds of information typically shared with instructors of their courses on the LMS. We found that students were concerned about being misrepresented by the data and felt the effects of surveillance. They expressed the need for justification for data collection and worries about power imbalances.

• We highlight overlooked student concerns about privacy on LMSs (e.g., Canvas) used for engaging with coursework in higher educational institutions (HEIs), in contrast with data aggregated for developing LA. We describe how detailed raw data in LMSs is as concerning, and as worthy of research attention, as big data privacy in LA.

• Finally, based on our findings, we support designing LMSs that foster transparency and mitigate power dynamics, minimize data collection, and enable privacy autonomy.

BACKGROUND AND RELATED WORK

History of Privacy in LMS Adoption
Privacy in LMSs has been an issue from the very beginning of their adoption. Papers from as early as 1991 detail concerns about instructors reading and monitoring student work without making it clear that they were doing so [23] and about an instructor surveilling and disciplining a student based on his log-on and log-off times in the electronic classroom technology [31]. The combination of instructors being able to see activity on the LMS and students being unaware of what instructors could see about them began to draw comparisons to the panopticon [16,23,31]. In this paper we argue that the panoptic nature of the LMS continues to put students under surveillance. Regardless of privacy concerns, the adoption of LMSs continued through the 2000s. Instructure, the parent company of Canvas, entered the market in 2009 and grew to become the market leader in LMS technology, serving as the primary LMS at 36% of US and Canadian HEIs as of 2022 [25]. When the COVID-19 pandemic necessitated a move to remote learning, LMSs were rapidly adopted without proper investigation of their privacy impacts [11,53]. Our investigation of the Canvas LMS thus has potential impact for the many users and decision-makers who may have adopted this platform without proper consideration of privacy.

Student perspectives and concerns about learning analytics
Overall, the body of work on students' privacy perceptions of LA suggests that students may be unaware of or unconcerned about LA's implications for their privacy [22,38,70]. Consistent with people's privacy perceptions in other contexts, students tend to report lower apprehension about their privacy when they believe that the collection and sharing of their data are justified, noninvasive, and conducive to their learning [27,34,39]. Interestingly, the level of concern appears notably lower for data collection and sharing within educational platforms than in other domains such as e-commerce or social media [34,38,70]. The trust students place in their educational institutions, and the belief that those institutions always act in students' best interests, could lead them to trust in the potential benefits of data collected for LA. However, HEIs sharing data with stakeholders outside the classroom and with third parties was concerning to students [70]. Therefore, the context in which students' data are collected and used can determine students' concern, or the lack thereof. LA researchers have proposed frameworks to ensure the ethical implementation of LA [14,73]; no similar frameworks exist for the distinct issues of surveillance in LMSs. This study aims to analyze students' concerns in the context of the data collected and shared by the LMS with instructors, which can be used to track individual students' behaviors and engagement with content on the LMS.
Although most research suggests a general lack of concern about privacy in educational contexts, students nevertheless expect to be informed about data collection practices and the justifications for the data collected [47,65,70]. They have consistently reported being unable to recall giving consent for their data being used for LA or shared directly with instructors [34,55,74]. When queried about their values and boundaries for the collection of identifiable data in one study, students reported valuing the collection of data helpful to the instructor but were concerned that the data would not be objective because of "gaming of the system" [47]. Students also reported discomfort at being treated as data points instead of as people [47]. Beyond privacy notices, students have expressed desires to become privacy aware, to have control over their data, and to limit who has access to their data [34,65,70]. Since LMSs serve as a platform for data collection, one approach would be to provide a dashboard that enables students to review and control the data collected, aggregated, and shared. Research on designing privacy dashboards is newly emerging [33].

Privacy issues around power imbalance relationships
There is an inherent power differential between students and instructors because the instructor is the source of knowledge and the student is "unknowing" [41]. Additionally, instructors who are professors have the power to affect students' near- and long-term futures by awarding grades, writing letters of recommendation, and holding influence in the classroom [61,72]. Digital communication, such as email, social media, and LMSs, has the advantage that students may feel more comfortable engaging with their instructor, but boundaries are necessary for online contact to ensure the relationship benefits both the student and the instructor [26]. There are hierarchical divides in expectations of instantaneous online communication. Students may be assumed to be available to go online at any time, whereas instructors may have varying expectations of themselves and their students [61]. Instructors must make decisions about their online availability, response time, and the profile information available to students on the LMS. The level of self-disclosure determines another power differential, one that is lessened with more self-disclosure and increased through stricter boundaries [61]. LMSs add a virtual dimension to the power differential between students and instructors, which our study seeks to understand.
The nature of the systems collecting data is such that both students and instructors are subject to the decisions of HEIs and of the software developers who determine what data collection is possible in LMSs. HEIs and third parties both have access to various student data but fail to effectively uphold student privacy [59]. Students are assumed to have agency over their data, but HEIs make choices about when to require consent for data release, favoring their own needs over students' [9]. Despite not being in control of, or sometimes even fully aware of, the data collection, instructors should be mindful of their position as the ones responsible for informing students of surveillance capabilities [7]. Instructors hold powers of surveillance regardless of their desire or intention to use education technology for these purposes. Our paper is one of the first to investigate student sentiments about instructors' access to technological surveillance, but studies of other workplaces can teach us how surveillance functions in power differential relationships. Domestic workers experiencing household surveillance technology report that surveillance indicates a lack of trust [3,8]. Power differentials are further complicated by intersections of marginalization in society, which have been shown to affect security and privacy concerns [17]. Given the nature of privacy and hierarchical relationships, differentials in power are integral to how we conceptualize privacy in this paper.

Theoretical Frameworks
The theoretical frameworks of intellectual privacy theory, contextual integrity, and panopticism are integral to understanding privacy in LMSs.
Intellectual privacy theory centers on freedom of thought, a right firmly upheld in Western legal tradition. The main tenets of intellectual privacy are "spatial privacy, the right of intellectual exploration, and the confidentiality of communications" [55]. Students require spaces, including online spaces, that are free from surveillance in order to practice freedom of thought. Ideas that are not fully formed should be shareable with trusted parties without concern for becoming public. As soon as technology was introduced into education, new concerns about intellectual privacy appeared. Starting in the 1970s, when audio/visual recordings and computers entered the classroom, the classroom's status as a safe space for learning was threatened by the possibility of recordings ending up outside the classroom and being used for unintended purposes, especially without students' knowledge [19]. Today, cyberattacks stealing data continue to validate this concern [42]. LMSs are online spaces that have an imperative to uphold intellectual privacy in order to create a comfortable learning environment.
What information students and instructors choose to share about themselves is context dependent. Nissenbaum's framework of contextual integrity describes privacy according to norms of appropriateness and norms of flow or distribution [49]. In our work, norms of appropriateness refer to what is appropriate to share in an academic setting. Norms of flow or distribution refer to the relationship with the instructor and the interaction with the interface of the LMS. Nissenbaum also distinguishes between the types of information being shared as parameters of the norms [50]. In this paper, students are asked about various types of information to understand these distinctions. We hypothesize that Canvas does not uphold the norms of contextual integrity because it takes a one-size-fits-all approach to privacy. To combat this approach, some individual instructors may implement improvised LMSs for their classes that are more responsive to their needs, but these still have privacy implications [28]. Regardless of the LMS, individualized control over privacy could align these systems with contextual integrity theory. The implications of our study offer design and policy suggestions to accomplish more contextually appropriate data sharing.
Finally, Foucault's theory of panopticism describes surveillance through a prison of individual cells arranged around a watchtower, so the warden can see into every cell. Importantly, a bright light makes it impossible to tell whether a warden is in the watchtower, causing every prisoner to feel surveilled regardless of whether they are actually being watched [16]. Panopticism has been used to describe education technologies such as online proctoring [54], learning analytics [76], and LMSs [23,31]. For example, students individually browse the LMS, but the instructor has the power to see all the students' activities. Students submit assignments, yet it is unknown to the student when the instructor will view their work. Login times, submission times, and other data that track when a student is active on the LMS contribute to the panoptic dynamic of one instructor surveilling many students.

METHODS
We conducted 32 semi-structured interviews in the spring of 2023 at a large public university in the USA about privacy in the Canvas LMS. We conducted a pilot study of 4 interviews and a focus group to develop the interview protocol, which can be found in the Supplementary Material. Transcripts of the interviews were coded using inductive thematic analysis [58]. The analysis of the themes in this paper leads us to recommendations about privacy for instructors, HEIs, and LMS designers. This study was classified as exempt by the University's Institutional Review Board (IRB).

Recruitment
Academic advisors in various departments at the university were given the opportunity to email their undergraduate students a recruitment flier. The flier advertised that students would participate in a study about their privacy and browsing activity on Canvas during a 60-minute interview, with compensation of 15 USD. The inclusion criteria were: being a student at the university who uses Canvas, never having used Canvas as an instructor or teaching assistant, and being 18 years old or older.

Demographics
After completing the interview, participants answered a demographic questionnaire. The genders of the participants were: 22 women, 7 men, 1 nonbinary woman, and 1 gender-fluid participant. Tables 1 through 3 in the Appendix offer a detailed breakdown of the participants' demographic information, including race, academic major, and number of years in school.

Interview Process
The interview process began when potential participants contacted the research team using the online form. Preliminary screenings, based on self-reported data, were conducted to confirm that participants met the inclusion criteria. The interviews were conducted over a span of 2 months, from May 2023 to June 2023. One researcher conducted the interviews via Zoom, except for two that took place in person. The interviews (without setup and the follow-up questionnaire) lasted between 26:38 and 58:00 (median = 36:07). We started the interviews by obtaining informed consent via an online consent form and an introduction with information about the study. We built rapport with the participants using a few icebreaker questions before starting the interview. In the Context section of the interview, students were prompted to think about privacy in general and on Canvas, as well as whether they had been informed about their privacy. In the Exploration section, students were asked about specific types of information that instructors can view on Canvas and the types and contexts in which students' privacy concerns would vary. In the Reflection and Solutions section, students were asked to self-reflect on their opinions and interactions depending on identity, control, information, and proposed student Canvas privacy protections.
This paper studies the responses to five questions from the Exploration section, each asked with a preface that participants should explain their concerns, or lack of concerns, about the given information in Canvas. We stopped the recording at the conclusion of the interview and sent the participant a link to an online demographic questionnaire, to be filled out with the interviewer's guidance. Participants chose pseudonyms and pronouns. They were asked what minoritized population(s) they belong to in their field, if any, and whether they would be interested in a follow-up interview on this topic. Open-ended questions asked for the participants' gender, sexual orientation, race, major, and year in school. Appendix A.1 contains this demographic information. Future studies will investigate the data in relation to students' minoritized status and other demographic traits.

Data Analysis Methods
We analyzed the transcripts of 31 semi-structured interviews (one was omitted for not meeting the inclusion criteria) via inductive thematic analysis [58]. The transcripts were created using Rev's automatic transcription service and cleaned by the research team. We anonymized transcripts and notes to remove identifiable details. Subsequently, we permanently deleted the original audio recordings for confidentiality. Two researchers conducted open coding on a random sample of 5 interviews with a focus on finding themes in the participants' explanations of their privacy concerns. The researchers met to compare thematic codes and reach agreement on a set of codes and their definitions: a codebook. This process was repeated iteratively.
After 5 rounds of individual coding in NVivo, of 2-5 interviews per round, with consensus building on those interviews, we narrowed the scope and came to a consensus on a final codebook. Inter-rater reliability was measured using Cohen's kappa, calculated in NVivo, then averaged across all codes, weighted by the word count of each code, in a sample of 5 interviews. We achieved a Cohen's kappa value of 0.77 between the two researchers coding with the final codebook. The remaining 16 interviews were split between the two researchers for coding. We also coded the responses based on which question or piece of information was being discussed.
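To make the reliability measure concrete, Cohen's kappa compares two coders' observed agreement against the agreement expected by chance from each coder's label frequencies. The sketch below is a minimal, illustrative implementation with invented labels; it is not the NVivo calculation used in the study, which additionally averages over codes and weights by word count:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two raters assigning one label per item."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items both raters labeled the same.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: product of each rater's marginal label frequencies.
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    expected = sum(
        (counts_a[c] / n) * (counts_b[c] / n)
        for c in set(labels_a) | set(labels_b)
    )
    return (observed - expected) / (1 - expected)

# Hypothetical example: two coders labeling 10 statements.
coder1 = ["Concerned", "Concerned", "Not", "Not", "Concerned",
          "Not", "Concerned", "Not", "Concerned", "Not"]
coder2 = ["Concerned", "Not", "Not", "Not", "Concerned",
          "Not", "Concerned", "Not", "Concerned", "Concerned"]
print(round(cohens_kappa(coder1, coder2), 2))  # → 0.6
```

A kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance; values in the range of the 0.77 reported above are conventionally read as substantial agreement.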
Each coded theme in the transcripts was analyzed by finding subthemes and their relationships to the questions. The results are presented by category of concern and by theme within each category. The themes and subthemes informed the Discussion and Implications.

RESULTS
We sorted the participants' statements as Concerned or Not Concerned and then sorted them according to the seven themes defined by our codebook. Quotes are attributed to the participant and question in parentheses. The questions in the attributions are abbreviated as Photo (P), Last Login (LL), Submission Time (ST), Post-submission Review (PR), and Engagement Activities (EA). The number of references to the themes for each question can be found in Figure 1.

Concerned Versus Not Concerned
Concerned and Not Concerned are the initial categorizations of each statement made by a participant. Most participants started their responses by saying outright whether they would be concerned about the given information being shared. Sometimes participants were neutral or conflicted, so we categorized the statement by the sentiment of the explanation the participant gave. Their mixed opinions are reflected in the data, with 47% of the coded statements in the Concerned category and 53% in the Not Concerned category.
Students reported being more concerned about their last login and their engagement activities being visible. They expressed less concern about their photo and submission time. Comparisons of concern by question can be seen in Figure 2. Participants were rarely consistent about their level of concern across questions, or even within each response. One explanation would be that many participants had never thought about their privacy on Canvas. One student described feeling conflicted as follows: "So it goes both ways. So one feeling would be, oh they, they're aware that I'm active, I'm going online and doing something, doing my assignments and on the other way, oh someone is watching me. So it may go into different ways. One is very positive and the other one is questions in your mind" (P10, LL).

Concerned
The Concerned category (ref=129) contains the Consent, Misrepresentation, Not Necessary, and Vulnerability themes.

Consent.
We defined Consent (ref=9) as a student expressing a desire to know more about how and what information is being shared, as well as describing not knowing this information as a concern. Students said they "should," "wish," and "would like to" know more about what information is being shared (P13, P24, P17). They also used the words "informed," "consent," and "permission" (P24, P26, P8, P17).
" [It] makes me uncomfortable that [instructors] can see something that I didn't explicitly give permission to." (P8, P) Although discussions of consent were rarer than the other six codes, they appeared distinctly.It is surprising that more participants didn't express that they want to be more informed and to choose whether to be tracked because consent is quite prominent in the education technology privacy literature [70].Those who did mention consent were very clear that they valued informed consent in their perspectives on privacy.One participant (P24, EA) described consent as "a fundamental of basic human society" and that "collecting information in the dark [is] [. ..] kind of unethical." Informed consent is an important value, but less espoused by our participants than expected.The academic context might be such that students put trust in the HEI when consenting to be a user of institution-approved education technology, so they report concerns about consent less frequently.

Misrepresentation.
When participants felt that the information available to instructors might create a bad or inaccurate impression, we considered this a concern about Misrepresentation (ref=47). Misrepresentation was mostly discussed in relation to Last Login (ref=20), Submission Time (ref=13), and Engagement Activities (ref=9) (Figure 1).
Participants imagined that instructors could use data to gauge their level of effort and understanding, but that a lack of context would result in an inaccurate assessment (ref=12). For example, the data would not reflect instances such as group work where one person submits the assignment (P23), individuals scheduling when to work on a certain course (P26), or a student being a fast reader who views assignments for shorter periods of time (P21). One participant described the concern about being misrepresented as follows: "If my professor could see how long I was on each page, I guess there's just a level of self-consciousness, I guess, that would come with that and be like, 'oh, am I spending too much time on this? Are they going to think that I don't understand the material or something?'" (P2, EA)

Students concerned about disapproval for procrastinating (ref=12) mentioned this entirely in relation to the Last Login and Submission Time questions. Students articulated this as a fear of being considered a procrastinator: "I tend to do my assignments very late, most of the time procrastinating, I guess. I did not want to see my professors seeing when I logged in or not for returning in an assignment 30 minutes before it's due at 5:00 AM, not really, I don't want them to be looking at me like that." (P9, LL) "If someone [...] usually turns things in later, it would be nicer that they don't know. Cause then they don't kind of associate a pattern with you." (P30, LL)

Unprompted by the interviewer, many students used the word "judged" (ref=10). Some felt they might be judged for their effort, understanding, procrastination, or habits. Some students, like the following, spoke more generally about feeling unfairly judged: "I think they could just make guesses and claims about you without actually hearing what's going on. So there's just kind of basing information off of stuff that you don't really know about and you can't really control too much." (P3, LL)

One student brought up that instructors may not be aware that they are making judgments about students (P1), and another mentioned discrimination (P10) in this context. Unconscious bias and discrimination can be considered types of misrepresentation, but the larger sample of students discussing discrimination appeared in the Vulnerability theme. Misrepresentation via education technology that leads to discrimination has been documented in proctoring software [71]. Our sample was mostly made up of White and Asian students, but it would be interesting to see whether other students of color are more concerned about being misrepresented by the technology.
Participants thought that grading (ref=6) would be harsher if they were perceived to be procrastinating or less active on Canvas, but grading perceptions weren't always consistent. When we asked about Last Login information being visible, one participant said students who log in more might receive extra credit. However, she contradicted herself with the concern that if an instructor sees that a student made a post-submission review, they may be tougher on grading because "they might think you're overthinking" (P21, PR). Concerns about the exposure of habits (ref=6) centered on students feeling disapproved of for working late at night or early in the morning. As with procrastination, participants were concerned about the impression they were making on the professor: "I think the opinion[s] of my professors really matter to me. I guess I don't really know why, but I don't want them to think badly of me just cause I'm doing my work at 3:00 AM as opposed to normal person hours." (P7, LL)

Students worried about being perceived as cheating (ref=4) because of suspicious behavior, regardless of whether they were cheating. The Misrepresentation theme is rooted in students perceiving the information available to instructors as telling an inaccurate or unflattering story about their academic performance. Misrepresentation captures two types of concern: (1) the data creating an inaccurate portrayal of the student and (2) an instructor trusting the data and forming impressions about the student based on the observed data.
The data lacking sufficient context, and possibly not being accurate, contributed to the first concern about an inaccurate portrayal. Students were concerned that instructors would lack the context to appropriately gauge their effort and understanding. Students worried that their habits would be revealed, and that the habits revealed could misrepresent them as procrastinating or cheating. The second concern about misrepresentation is based on instructors' trust in the LMS and the data it collects. Students expressed apprehension that their instructors could make judgements, implicit or explicit, about their private habits, procrastination tendencies, or other information that online surveillance would reveal about them. These concerns about misrepresentation stem from the realization that such inaccurate behavioral assessments could affect their grades and academic standing.
The lack of purpose and narrative behind the given data in the LMS, in contrast with LA, leaves it open to interpretation. The Misrepresentation theme also implies a power differential in the relationship between students and instructors. Instructors are in a position of authority that includes awarding grades, which can have a significant impact on a student's future. This explanation is consistent with the concerns students described about how an instructor's personal perception of them might affect them. Procrastination and working late at night or early in the morning are behaviors that students see as stigmatized by their instructors, leading the students to feel judged. The combination of large amounts of information about sensitive topics and the creation of a persona based on this information indicates that students feel surveilled, as shown in their discussions of misrepresentation.

Not Necessary.
The Not Necessary code (ref=37) is defined as the student considering the information being shared as unnecessary, too much information, or having no purpose for the student. This code appeared almost evenly between Engagement Activities (ref=13), Post-Submission Review (ref=12), and Last Login (ref=11).
The words "(un)necessary" (ref=14) and "why" (ref=14) were most prominently brought up unprompted in this code.Those words were used in the following contexts when discussing that an instructor sees when a student looked at a submitted assignment: "Yeah, that would be a privacy concern.I don't know why they would need to see that.I don't see a reason to have that.So I feel like that's just another piece of data that's unnecessarily shared." (P8, PR) "I think that goes back to why is that information being collected?That just concerns me because I don't understand why that's necessary and why that information is being collected." (P13, PR) Not Necessary appears in both the Concerned and Not Concerned categories (ref=5).In this context, the nothing to hide sentiment refers to an apathy toward privacy.These participants generalized that nobody was doing anything bad on Canvas, so surveillance is unnecessary and intrusive.For example: "I don't think they should be able to see [my engagement activities] because why do we just, maybe we want to see what to do in that assignment or just read instructions.We're just trying to get our work done.We're not doing anything bad.So yeah, I don't think they should care about those things" (P12, EA) Students also described the information as unrelated (ref=7).One student expressed the following about the instructor's relationship to the last login time: "It just has nothing to do with them and it's unrelated to their job." 
(P4, LL) The findings in Not Necessary include a combination of confusion, lack of justification, and questioning of the necessity and relevance of data sharing. Students repeatedly wonder why data is being shared. Their inability to discern the need for data collection, having not considered it before, made students more concerned. In contrast, students who could think of a justification expressed a lack of concern. Students were concerned when they didn't know why the data collection was necessary, considered the data itself as irrelevant, or considered the information to be unrelated to learning. This is also a form of information asymmetry [1] because of the difference in information available between students and the platforms that collect and present their data, and the instructors who use that data to track progress. The imbalance in knowledge could result in students becoming self-conscious and conservative in their participation on the LMS. Conversely, students may also engage more consistently, thus unknowingly exposing personal or sensitive information. This underscores the importance of responsible data collection and sharing practices within educational technology platforms.

4.2.4 Vulnerability.
We used the Vulnerability theme (ref=34) to capture specific feelings of intrusion and details that are uncomfortable to share. It is especially apparent in Photo (ref=12) and Engagement Activities (ref=9) (Figure 1).
Surveillance (ref=10) emerged as a subtheme because students described their Last Login and Engagement Activity visibility as revealing work and sleep habits (P19, P23, P30), being watched (P23), feeling "too close" (P25), "too much information" (P3), and being "a more intimate activity" (P29). What makes surveillance unique is that these statements are less concerned about judgment and more concerned about exposure. One student expressed concern that the instructor would take this position lightly: "I think it would be kind of rude, especially if it became like, oh, a joke. A professor thinks they're a comedian and is like, oh, da da da da da. You're all submitting [assignments] super late. I don't know. It would be, it would get my hackles up. Cause I'm like, well, at least I turned it in." (P27, ST) Some students felt vulnerable about their photo being unflattering (ref=5). In one instance this was connected to appearance discrimination, which an international student described as a major issue in Korea. The other types of discrimination, each mentioned once, were based on being an international student, pronouns, disability, and wearing a hijab. Multiple students brought up that their photo could be printed and used to identify students, which wasn't considered a problem unless that paper was lost. The lack of security made one participant feel vulnerable.
Students considered lower performance (ref=4) in the class as a vulnerability factor making them more concerned about their privacy: "I think especially if I'm not as on top of things for a certain class, if I knew that that was something they could see […] I would feel a little bit exposed." (P17, LL) An interesting nuance in the conversation about vulnerability is that students may feel disempowered, and thus unable to fully express their concerns: "Oh, maybe just when someone is doing something a little bit creepy, but then you don't really want to create more trouble out of it. It's like, okay, that was creepy, but you know what, I'm not going to think about it. Yeah." (P19, PR) Along these lines, a few students who were graduating shortly expressed not being as concerned because they would no longer have to use Canvas. Otherwise, students felt especially vulnerable because of a combination of "feeling watched" and aspects they would feel vulnerable about offline such as appearance, minority status, or low performance. The Vulnerability theme describes the emotional reaction to being surveilled by the LMS. Surveillance and privacy have many similarities in the context of education [57], so we consider the feelings of vulnerability as pointing toward a surveillance dynamic between students and instructors. In the metaphor of the panopticon, students feel vulnerable because the technology they use leads to an uncomfortable surveillance relationship with their instructors. Students feel uncomfortable being watched, especially when there are aspects of themselves (LGBTQ, race, or other nonmembership in the dominant group at the HEI) that they feel are vulnerabilities in the HEI context.

Not Concerned
The Not Concerned category (ref=145) contains the Genuine Need, Normalized, and Unaffected themes.

Genuine Need.
We define the Genuine Need theme (ref=53) to describe students' perceptions of a useful, legitimate, and usually academic reason for the information to be visible to the instructor. Submission Time (ref=18) and Engagement Activities (ref=14) were the questions where this theme emerged most frequently (Figure 1).
We noticed a shared sentiment in discussions of accountability, effort, and leniency (ref=14). These could be described as fairness or meeting student expectations of monitoring. Students saw information being shared on Canvas as a method of being accountable to their professor. This is linked to effort because the instructor could attribute patterns of Canvas activity to effort in the class. The following statements describe what students find legitimate: "If your advisor, teacher or professor wanted to see if you actually watched the video, I feel like they could see if you actually clicked the tab of that. I feel like that's pretty valid." (P16, EA) "I guess then they can see the efforts that a student has made to their class. I think that's good for them, just knowing how often the student checks their course." (P19, EA) The above quotes take the perspective of the instructor, so these participants are linking the benefit to the instructor to a lack of personal concern about their information being seen. However, some participants linked accountability and effort to the personal benefit of leniency. The information available to instructors could build a reputation of showing effort, and there could be leniency when issues arose: "The professor can use your previous track record and be like, okay, you're a really good student. You're on top of your stuff. I'll let you, I'll excuse this one for you, or whatever it may be. So I can see how that would be helpful for both sides." (P24, ST)
The closeness of the monitoring could also lead to leniency, as in the example of a student who submitted something just barely after the deadline but was not penalized for it (P24, P8).
Deadlines (ref=10) were considered to be legitimate reasons for information to be shared with instructors. The responses in this subtheme linked deadlines entirely to the Submission Time question. As the following quotation states, some students felt that the early or late scenarios were either not concerning or legitimate: "So if someone turned it in a week early, then I don't see that as being bad. And if someone turned it in late, you would probably want to know that." (P18, ST)
The information visible to the instructor could be used as a teaching tool (ref=10) in a few ways: getting to know the students (P10, EA), improving the Canvas page (P17, EA), knowing when to help a student (P18, LL) (P2, EA), analytics on the class as a whole (P21, EA) (P28, ST), and the printed photos used as an identifying tool as mentioned in the Vulnerability section (P25, P) (P8, P).
Policing for academic dishonesty (ref=7) was seen as a genuine purpose for the information to be visible. The types of academic dishonesty mentioned were plagiarism, collaboration on assignments, and looking up the questions during an online quiz. Students' opinions split between being concerned that their data would make them look like they were cheating (as in the Misrepresentation section) and not being concerned about sharing information to uphold academic integrity. One student, however, made the following statement about her values: "I know that academic integrity is a really important part of any education system, and a lot of faculty and students are working hard to make sure that it's enforced, which I think is totally valid and it makes sense to me. But their privacy and safety is a bigger priority in my opinion than enforcing academic integrity." (P24, EA) In contrast with Not Necessary, students who could think of a genuine need for data sharing expressed more comfort. Students imagined data being useful for holding them accountable for their work, gauging their effort, and giving opportunities for instructor leniency. They saw this data as useful for enforcing deadlines and detecting academic dishonesty. Finally, students thought of ways the instructor could use the data as a tool. Genuine Need was the most articulate explanation for a lack of privacy concern. Students trusted the institution and their instructor to use the technology in ways that were required for the students to receive a quality education. Here the surveillance of the LMS was seen as contributing positively to the students' experiences of learning and education technology.

Normalized.
We capture the sentiment that the information is already commonly shared, or that the instructor will know the information anyway, in the Normalized theme (ref=28). This theme is almost entirely composed of responses about the Photo (ref=23), with the remainder about Submission Time (ref=5) (Figure 1).
Students didn't mind the photo being visible on Canvas because the instructor would see them in person in class anyway (ref=8).The justification given by many participants was that their in-person appearance matches their photo online.Students also said that the photo is official and expected to be shown (ref=8) because it is the same one on their identification card.
The comments related to the visibility of Submission Time expressed that students were comfortable because it was a norm throughout college or already implemented (ref=5). These were both existing concepts that students expected would be visible. This result is a sharp contrast to instructors' viewing of submission times being compared to the panopticon in the 1990s [31]. The ability of instructors to know individual submission times as a standard on every assignment is a trait that mostly appeared with the proliferation of LMSs. For example, a pre-LMS method of collecting homework would be to bring it to class, where everyone submitted it at the same time, and the professor didn't know if the student had finished their work just before they arrived in class or days in advance.
The participants were comfortable sharing certain kinds of information, such as photos and assignment submission times, because it was an accepted practice. Students who described themselves as comfortable with time-stamped submissions had accepted this as a standard part of the experience of submitting homework. The ubiquity of identification cards and online profiles was also a reason for acceptance. However, accepting the photo as not concerning implies a lack of concern about discrimination (as seen in the Vulnerability theme). Most students were comfortable because they felt the photo would not affect how they would be seen in person, but this isn't true for everybody. For example, transgender students may appear differently in a photo from earlier in their transition process. This is an example of how minoritized groups may have different and greater privacy concerns [17]. This concept was rarely mentioned by the participants.

Unaffected.
We define the Unaffected theme (ref=57) as when students think the information is useless. Students may feel unaffected by the sharing of information. This theme appeared relatively equally between the questions, with the most frequent being Post-Submission Review (ref=16) (Figure 1).
The nothing to hide sentiment (ref=15) also emerged in this subtheme. Students describe always turning assignments in on time, logging in every day, being active on Canvas, and not doing the behaviors linked with the information. One student discussed permissiveness with their information because it wasn't anything they felt was worth hiding: "I guess there's generally not anything on Canvas specifically that I would particularly mind being shared because Canvas is relatively… niche isn't the right word, but it… you're not putting random information on there that you're like, it is just your assignments." (P31, EA) The types of academic data being discussed are not considered something that would affect privacy. In contrast with some of the students worried about their habits being revealed (Vulnerability), some participants considered habits to not be concerning data: "I'm not that concerned because I don't think they'll know where you live. They'll just know kind of your routine and what time you're up maybe." (P21, ST) Some participants considered the impact (ref=10) of the information to be low. They didn't know how it would be used, so they felt permissive about it being viewed: "I don't see any use of that information. So I mean, if they want to check it, sure, go ahead. Or if they want to know the time that I look at it, sure. But I don't see any point." (P19, PR) Sometimes the data being viewed was meaningless because the data itself was considered unreliable or useless (ref=8). The participants pointed out that having multiple tabs open or even multiple computers could make the data not reflect what students were doing. Some participants said that they logged in and out frequently or never logged out, so it would be hard to draw meaning from the data.
Regardless of whether the data was meaningful, some students thought that the data would have no impact (ref=13). Instructors were not "going to do anything with it" (P21, P) or "be paying attention to that anyways" (P27, PR). It was considered "not a big deal" (P22, P23). A somewhat large, but more nebulous, subtheme is Indifference (ref=12), which captures participants' initial reactions of lack of concern. Students don't offer much explanation. They say they just don't care (P10, P16) or generally say they have no worries or concerns. These types of responses were usually followed up on by the interviewer, and the elaborated responses fell under different themes, so salient quotations don't emerge as they do for other subthemes. It is informative that many initial reactions were of feeling unaffected.
Students cited feeling unaffected by the data sharing because they thought the information was useless, unimpactful, or unreliable. They also described themselves as having nothing to hide. This theme also captured people having initially dismissive reactions to being asked about concerns. The nature of the data being generated and used in an academic environment is a common factor in this theme. Some students were explicit about this, and for others we infer they might be responding this way based on the context of being in an educational space. The fact that this is the largest theme implies that there is a sentiment of trust and a lack of privacy concern about data in education technology.

Emergent Perspectives
Synthesizing the results and the individual themes that were inferred from them, our study reveals the emergence of two overarching perspectives cutting across these different themes: Surveillance and Justification.

Surveillance.
Participants spoke about feeling surveilled in the Vulnerability theme, but the theory of panopticism [16] leads us to additionally draw surveillance parallels to the Consent (Section 4.2.1), Misrepresentation (Section 4.2.2), and Not Necessary (Section 4.2.3) themes. Students reported feeling watched, which is the main effect of panoptic surveillance [16]. In Misrepresentation (Section 4.2.2) and Not Necessary (Section 4.2.3), students discuss not knowing how and why, respectively, their data is being viewed. Students reported discomfort because of the lack of transparency about the level of surveillance. Students who discussed consent wanted more transparency and information. The lack of knowledge of whether the instructor is checking information, how the instructor is interpreting information, and ways in which the instructor might be biased or discriminatory could position the instructor's online presence as a disciplinarian. This is an outlook on educators that students likely learned from previous schooling experience. The LMS facilitates a personal relationship with an imbalance in power based on what students know is visible about them and how the information will be (mis)interpreted.

Justification.
The justification for sharing information was another theme that consistently emerged in students' concerns. Students show concern about the necessity of the information, the reason for having it, and its relevance in the Not Necessary theme (Section 4.2.3). On the other hand, students who are not concerned see a Genuine Need (Section 4.3.1) for the information. The two sides of justification demonstrate why "raw" LMS information is distinct from LA data. LA serves specific objectives, while the data and context we explored prompted students to reflect on justifications from their own perspectives and experiences. Submission Time was readily justified because it was accepted as a way of enforcing deadlines. However, Last Login and Engagement Activities were almost evenly contested between Not Necessary and Genuine Need. We interpret this to mean that students were uncertain whether they could think of a justification for these types of information being shared. If there is a legitimate way for instructors to learn how to provide a better education to their students, our participants see more validity and less concern in being surveilled.

Awareness, Misrepresentation, and Justification
Our work demonstrates that students expressed feeling surveilled and misrepresented within LMS platforms, which is consistent with related literature that describes power dynamics and misrepresentation in education technology. Data collected and shared within LMS platforms creates a power asymmetry because students are less aware that instructors can see certain data, yet overestimate how much instructors use the data [32]. The focus of our study is student concerns rather than awareness, but in the Not Necessary theme we see the concepts are linked because students' lack of a clear awareness of why data is being collected leads to concern. The students who were not concerned saw a justification for the instructor having the power to surveil them, as seen in the Genuine Need theme. Notably, many of the comments under Genuine Need were also related to representation, in this case a more positive representation (quotes by P19, P24 in Section 4.3.1). Student awareness and justification for data use are addressed in the LA literature through frameworks such as DELICATE and SHEILA [14,73]. In line with our findings, the DELICATE framework emphasizes openness, consent, and legitimacy to address concerns corresponding to our themes of awareness, misrepresentation, and justification [14]. There is work needed on creating frameworks specific to the distinct concerns related to LMS data.

Surveillance and Panopticism
Concerns about LMS data are encapsulated by the concept of surveillance because private information is shared without meaningful consent and awareness from students. Per our results, this causes varying reactions in both the concerned and not concerned categories, which is consistent with apathetic reactions to surveillance in other literature [2,22]. The tendency of LMSs to lead instructors to see students as a "corpus of texts" [13] combines with how modern surveillance often takes the form of dataveillance [10]. The danger of dataveillance is that the large amounts of data about students' online activity and assignment submissions could create a digital persona of the student, which can be inaccurate and misrepresent the student. Misrepresentation and vulnerability as risks of surveillance were communicated by the students in our study. Findings from our work could lay down the foundations for developing a framework for studying student concerns about data collected and shared within LMS platforms.
Although not within the scope of our interviews, the use of proctoring software for policing academic dishonesty during online exams has parallels in the feelings of surveillance between students and instructors. Like the students in our study, when asked about their privacy on online proctoring tools, students expressed valuing deterrence of cheating but described types of surveillance where they felt "watched" [6]. Additionally, faculty are also displeased with their observational role in the online proctoring system [37]. Some of the similarities between perceptions of online proctoring and LMS surveillance can be attributed to proctoring tools being one of the many potential third-party integrations in the LMS [59]. The normalization of education technology (especially due to the COVID-19 pandemic) leads students to be undiscerning about the privacy risks of various platforms [20]. Some students who were concerned about privacy feared being accused of academic dishonesty and having their grades based on effort as perceived through LMS data. The students' concerns mirror the self-regulation of prisoners in the panopticon [16,23,31]. This positions the LMS as a disciplinary system, but the acceptability of using LMS data surveillance for detecting academic dishonesty is debated. Some universities and statements from Instructure prohibit or dissuade the use of Canvas to detect academic dishonesty [29,68,77]. The New York Times claims to have found inaccuracies in Canvas' logs of student data during its reporting on the Dartmouth cheating scandal of 2021, which points to why LMS data should not be relied on for serious investigations [63]. Regardless, researchers at other universities are still using Canvas data for academic misconduct detection [48,51]. Additionally, it is nontrivial to predict grades from LMS activity [4,69], but the field of LA is finding models that are successful in doing so [e.g., 4,29]. Beyond academic dishonesty and grading, discussions of misrepresentation in the LA literature focus on data and broad implications such as accurate visualization and ethics [75], consent and data ethics [64], trustworthiness of LA [66], and equity [64], but not the interpersonal relationships between student and instructor. Despite academic dishonesty and grading being the most well-documented concerns about misrepresentation in education technology, they have a small number of references in our study compared to the concerns about instructors gauging effort and understanding, procrastination, judgment, and exposure of habits. This data has a personal nature of direct surveillance by an instructor, who is in a panoptic position to hold this covert, power-imbalanced relationship with all of their students. The negative educational effects of self-regulation are documented in one study of AI classroom monitoring that described the behavioral and emotional impacts of surveillance; notably, students fear showing poor performance because of a lack of transparency about the system [21]. What makes LMS surveillance a concern is the damage to the direct relationship between student and instructor. Instructors have the power to use the LMS to discipline and observe, much like the all-seeing jailer in the panopticon [16].

Nothing to Hide
The nothing to hide sentiment is the reasoning that privacy is not a concern unless there is something illegal or wrong to uncover. At first glance, participants describing themselves as having nothing to hide invokes Solove's writings on the tradeoff between government surveillance and security, and the societal value of privacy regardless [67]. One participant discussed not being concerned about privacy due to their uncontroversial identities but being concerned about privacy for their transgender sibling (P25), which is consistent with Solove's argument that privacy is needed on a societal level for more vulnerable people regardless of individual sentiments [67]. However, this argument was not common among other students who described having nothing to hide. Other authors have described how students, who may not have much choice about using a technology, may cite having "nothing to hide" due to feeling a lack of choice and thus resignation to privacy violations [2,46]. Teenagers on social media describe feeling apathetic about their privacy because they have "nothing to hide" [2]. Adorjan and Ricciardelli go on to propose that this may be out of the need to "adapt to the demands of authority figures, including parents and educators" [2]. We argue the perceived power differential between educators and students carries over to college because of how students have seen educators in their primary and secondary education. Importantly, there is evidence of this in how students see education technology and their privacy once they enter college [20]. We observe this in our participants who feel unconcerned due to the normalization of certain data being shared on LMSs. The combination of normalization and students adapting well to their authority figures (by turning in their homework on time, etc.) leads them to say they have nothing to hide and thus that they are not concerned about their privacy. However, this doesn't explain the students who were concerned but described themselves as having nothing to hide under the Not Necessary theme. This phenomenon can be compared to James' "privacy as forsaken" [30]. Academic spaces are expected to uphold intellectual privacy [56], but students may assume that, regardless of their actions on the LMS, instructors have the power to observe them.
Adopting principles from intellectual privacy theory could foster a learning environment where students feel a sense of empowerment regarding their privacy that allows them to freely explore, express, and develop their ideas without feeling the need to hide [56]. When students feel their intellectual privacy is violated, it can lead to self-censorship, inhibiting them from freely sharing their ideas or opinions, or from seeking help when needed.

DESIGN AND POLICY IMPLICATIONS

6.1 Fostering Transparency and Mitigating Power Dynamics
Educating instructors, instructors implementing course policies, and HEIs supporting these efforts are key practices to foster transparency and mitigate power dynamics in LMSs. Instructors may not be aware that they may be perceived as playing the role of a panoptic disciplinarian on the LMS. Instructors usually have the best intentions and may themselves lack awareness of the information that is visible to them online, but that is not obvious to all students. It is critical to educate instructors who use LMSs to be aware of the conclusions they draw from data gathered online, and of the fact that being misrepresented is a concern for students. Instructors can be personally proactive in seeking this information, but we encourage HEIs to offer education on LMS data sharing. Students who participated in our research expressed concerns about data (last login, post-submission reviews) collected and shared with instructors that may be irrelevant to their learning. They highlighted the potential for this information to influence instructors' judgments of their habits and behaviors without sufficient context. Once instructors are educated about these concerns, one approach to alleviating them would be for instructors to establish policies and expectations that outline how effort, procrastination, habits, and potential cheating will be judged.
Being explicit about how student data will be perceived has the potential to lessen the power differential that results from students not knowing, and fearing, how they will be perceived. Not knowing if an instructor is looking at the data leads to the instructor creating the environment of a technological panopticon. One way to combat this dynamic is by communicating surveillance policies using a syllabus statement [35]. This would lessen privacy concerns in both the short term and the long term because communication is held as a value, as also evidenced by the Consent theme. Although instructor education and course policies are important for adapting privacy to context, institutional-level policies offer uniformity and accountability. HEIs should ensure that instructors are supported in efforts to learn about and mitigate their role in surveillance through education and support for instructor policies. As further discussed in the limitations section, students may have varying levels of trust in an instructor's self-report of the way data is observed, so institutional powers can offer credibility to the policies established by individual instructors.

Designing for Minimization
The participants in our study often expressed the need to know more about what information is being collected and shared with instructors. The Consent and Not Necessary themes centered on questions about the necessity and relevance of some of the data shared with instructors through the LMS. The designers of LMSs should justify the data they choose to collect and present to instructors. Per the Genuine Need theme, accountability, deadlines, and academic dishonesty could be legitimate uses. However, there should be a critical eye on what is strictly necessary for these purposes. One approach is applying privacy engineering and "privacy by design" principles to critically analyze what data is collected and shared between roles within institutions and courses, how that data is used by different roles, and, importantly, the utility of the data toward student learning and academic achievement. Students cited the use of data as a teaching tool as a genuine need. Designers of LMSs wishing to expand teaching tool functionality should do so while minimizing the visibility of individual data. Beyond the scope of this paper, the gathering of this data in the first place should be investigated. Designers of LMSs have underlying motivations for the LMS to collect more data to increase the value of the educational technology company [78]. Our work shows that, beyond the major issues with data being gathered at all, the micro-scale of what information is visible to the instructor needs to be considered for a more comfortable and equal relationship between the student and instructor. One approach to doing this would be involving students, and not just instructors and academic administrators, in designing LMS functionalities that involve sharing of student data directly with instructors [70].
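As a hypothetical sketch of such a "privacy by design" audit (the field names and purposes below are illustrative, drawn from our themes rather than from any actual Canvas schema), a designer could require that every instructor-visible field carry an explicit justification and flag any field that lacks one as a candidate for minimization:

```python
# Hypothetical data-minimization audit: each instructor-visible field must
# map to a stated purpose (echoing the Genuine Need theme); fields with no
# documented purpose are flagged. Field names are illustrative only.
FIELD_PURPOSES = {
    "submission_time": "enforcing deadlines",
    "photo": "identifying students",
    "last_login": None,                 # no purpose articulated by students
    "post_submission_review": None,     # no purpose articulated by students
    "engagement_activities": "gauging class-wide engagement",
}

def unjustified_fields(purposes):
    """Return the fields that lack a documented justification, sorted."""
    return sorted(f for f, why in purposes.items() if why is None)

flagged = unjustified_fields(FIELD_PURPOSES)
# Fields without a stated purpose become candidates for removal or hiding.
assert flagged == ["last_login", "post_submission_review"]
```

Such an audit would make the justification for each data flow explicit at design time, rather than leaving students to guess at it, as they did in the Not Necessary theme.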

Enabling Privacy Autonomy
The contextual integrity of communicating information with the instructor and the LMS platform would benefit from customizable controls for various types of information and class contexts [50].
A suggested solution for this is the creation and implementation of privacy dashboards in LMSs [33]. Through privacy dashboards, both students and instructors could be enabled to choose what types of student information are collected and shared with instructors and others during the period of a certain course. For example, a student may choose to share submission time but disapprove of sharing "views" or "attempts" because of concerns of misrepresentation or irrelevance to the course, whereas the same student may approve the sharing of "views" if the student believes they would benefit from feedback on their engagement. Similarly, access to privacy dashboards could enable instructors to customize the information they and other students in the class receive based on the requirements of a particular course or to respect the concerns of some students. Students and instructors choosing what is best for their learning goals, class dynamics, relationships, and personal preferences would remedy some of the feelings of surveillance and uncertainty reported by students. This approach is in line with Nissenbaum's contextual integrity because it would adjust the norms and information flow based on the context of the class or instructor [49]. Importantly, it would emphasize the importance of privacy within educational institutions by shifting the locus of control from a centralized system to the students, fostering a sense of agency and autonomy. Students and instructors could collaboratively make decisions on the information shared, creating a more inclusive and respectful classroom environment. It could also lead to trusted adoption and use of LMSs by students and instructors, alike.
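The per-course, per-field controls described above could be modeled, in a minimal hypothetical sketch (not any existing LMS API), as a default-deny visibility policy in which each student opts individual data types in or out of instructor view:

```python
from dataclasses import dataclass, field

# Data types from this study that Canvas exposes to instructors
# (illustrative labels, not Canvas identifiers).
DATA_TYPES = {"photo", "last_login", "submission_time",
              "post_submission_review", "engagement_activities"}

@dataclass
class SharingPolicy:
    """Hypothetical per-course privacy-dashboard settings for one student."""
    # Default-deny: nothing is visible unless the student opts in.
    shared: set = field(default_factory=set)

    def allow(self, data_type: str) -> None:
        if data_type not in DATA_TYPES:
            raise ValueError(f"unknown data type: {data_type}")
        self.shared.add(data_type)

    def revoke(self, data_type: str) -> None:
        self.shared.discard(data_type)

    def instructor_can_see(self, data_type: str) -> bool:
        return data_type in self.shared

# Example: a student shares submission times (seen as a Genuine Need for
# deadlines) but withholds engagement activities (a Misrepresentation worry).
policy = SharingPolicy()
policy.allow("submission_time")
assert policy.instructor_can_see("submission_time")
assert not policy.instructor_can_see("engagement_activities")
```

The default-deny design choice mirrors the shift in the locus of control discussed above: visibility becomes something the student grants per context rather than something the platform assumes.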

LIMITATIONS AND FUTURE WORK

Limitations
The initial binary categorization of the participants' statements as either concerned or not concerned does not capture the magnitude of the participants' concern. Many participants were not firm in their beliefs because the interview was the first time they had heard about what data about them is visible to instructors. More time to reflect on their concerns might have yielded different results.
The qualitative methodologies of semi-structured interviewing and qualitative coding have strengths and limitations. No two interviews are identical in the semi-structured model, so some participants were asked more follow-up questions (beyond the ones listed in the interview protocol) than others. Follow-up questions were mostly used to elicit elaboration on brief answers, but interviewer bias could mean that not all participants' responses were interrogated equally deeply. Our bias and positionality as coders also limit our perspectives in analyzing the results.
Although a large majority of the participants interviewed are women, we did not target recruitment for this study toward minoritized students. This paper does not discuss how students belonging to minoritized groups may have unique concerns. For example, a study of African American and Non-White Hispanic students called for HEIs to offer more privacy and security support [40]. Some of the smaller categories, like Consent and Discrimination, might have been more prominent had we focused on the experiences of people who are generally more affected by these topics.
The study takes place at a single university with one LMS, Canvas. Different HEIs and different LMSs may differ in what data are visible to an instructor. Like any technology, LMSs change their features, so by the time this work is published there may be more or fewer types of information available.

Future Work
We only asked about five types of information visible to the instructor on the Canvas interface, but these are just a handful of the types of information available on Canvas. Per our Related Work section, other researchers are investigating students' comfort with some LA features, but there are many surveillance features for which student perceptions have not been documented. We noticed students were more concerned about their privacy during online quizzes, but none of our questions were directed at information visible to instructors during or about quizzes. Along with expanding the list of student information to examine, the distinction between quizzes and other activities would be a fruitful direction for study.
If all the implications of this study were implemented, it is unclear whether students would trust the communication and control they would be offered. In parts of the interview outside the scope of this paper, students mentioned that expressing their desire for privacy could negatively affect their relationship with their instructor. How instructors perceive students who are less willing to share data could be a hurdle to allowing students to choose the data they share. In addition, there is ongoing work on making online privacy policies more accessible to everyday readers [36]. Similar work is needed for educational technology: for instance, making surveillance policies more accessible to students so that they feel more trust and confidence in the LMS.
More theoretical studies would be useful for understanding the responses given by our participants. A formal study is needed of the phenomenon of students saying they have nothing to hide, especially in relation to how well students adapt to the academic environment. To what extent do students see their instructors and HEIs as disciplinary bodies, and to what extent do their academic integrity or class behavior lead to these sentiments? More information is needed as to why students cite having nothing to hide in their expressions of both concern and lack of concern.

CONCLUSION
The results of this study show that students consider themselves concerned or not concerned based on the various types of student information available to the instructor in the Canvas LMS. Concerned students cited consent, misrepresentation, and unnecessary data collection, and expressed their vulnerabilities. Those not concerned cited genuine needs for the data, normalization, and being unaffected. Power dynamics, panoptic surveillance, and the nothing-to-hide sentiment explain these privacy opinions. We recommend establishing and communicating surveillance policies, educating instructors, "privacy by design" for minimal data sharing, and the implementation of privacy dashboards in the LMS. Future work on LMS information sharing, not just LA, is needed on more types of information, students' reactions to our recommendations, and what leads students to say they have nothing to hide.

Figure 1: References in each question by theme

Figure 2: Concerned or Not Concerned response by number of references per question

"Maybe when it comes to tests or quizzes, I, I'd be a little bit concerned just cause if it's an open note exam, of course I will be clicking out and checking [...] So I guess I'd be a little bit wary for those." (P15, EA)

Table 4: Number of references by question for Concerned and Not Concerned categories

Table 5: Number of references by question and theme