Communication and Algorithmic Decision Making in a Virtual Healthcare Context
Extended Abstract

Lillian Campbell, Marquette University, United States, lillian.campbell@marquette.edu
Amrita George, Marquette University, United States, amrita.george@marquette.edu
Shion Guha, University of Toronto, Canada, shion.guha@utoronto.ca

This qualitative study draws on interviews and observations with nurses working in a virtual intensive care unit and using algorithms to track patient progress. It overviews how health practitioners navigate algorithmic systems to build relationships with other providers and patients, with attention to strategies for accountability and advocacy in virtual healthcare contexts.

CCS Concepts: • Applied computing → Health care information systems;

KEYWORDS: automated patient care, healthcare, algorithmic systems

ACM Reference Format:
Lillian Campbell, Amrita George, and Shion Guha. 2021. Communication and Algorithmic Decision Making in a Virtual Healthcare Context: Extended Abstract. In The 39th ACM International Conference on Design of Communication (SIGDOC '21), October 12–14, 2021, Virtual Event, USA. ACM, New York, NY, USA, 4 pages. https://doi.org/10.1145/3472714.3473669

1 INTRODUCTION

While there is a growing body of research on how practitioners negotiate algorithmic systems [1,2,3], attention to accountability and advocacy in their communication practices has been limited. This project draws on interviews and observations with five nurses working in a Virtual Intensive Care Unit (VICU) to report how health practitioners in virtual contexts navigate algorithmic systems to build relationships with other providers and patients.

2 AFFORDANCES AND LIMITATIONS OF ALGORITHMIC PATIENT CARE

In virtual contexts, practitioners lack the intuitive cues that come from physical and verbal patient interaction [4] and thus often make decisions about patient care based on algorithms that track patient status. These decisions are then communicated to other providers and documented in an electronic health record. When an algorithmic system mediates interpersonal communication, the subjective element of person-to-person exchange can be removed. This allows decision-making to rely less on individual personalities or relationships, but it also risks alarm fatigue [5, 6]. Indeed, practitioners now face an ever-growing number of alarms and learn to ignore them in order to keep focusing on patient care [7].

In addition to the interpersonal impacts of implementing virtual patient care, healthcare algorithms themselves have both affordances and limitations. By using appropriate IT systems, providers can improve the health of individuals through the temporal displacement of care [8]. While the use of such algorithms helps organizations achieve reputational, operational, and economic value [3], biases embedded in the algorithms can undercut the benefits derived from their use [9, 10]. In addition, algorithmic risks arising from flawed input data, algorithm design, usage, technical implementation, and security [11] can hinder the adoption of algorithm-enabled IT systems in any organizational setting. Thus, understanding how virtual healthcare practitioners engage with algorithmic systems is vital for charting a pathway to successful virtual care.

3 METHODS

Research began in January 2021 and is part of a one-year NSF pilot grant to study algorithmic decision-making across several clinical contexts. Research for this presentation included observations and interviews with five nurses working in a VICU during May and June 2021. Nursing participants were recruited through attending VICU staff meetings, sharing details of the project, and asking staff to complete a consent form to opt in or out of interviews or observations; no patient data was collected for this stage of the project. The project received Human Subjects approval from both the researchers’ institution and the institution where it took place.

4 AN EXTENDED EXAMPLE

This section discusses how VICU nurses navigate the changing landscape of algorithmic systems in their workplace, negotiate care alongside other providers and practitioners, and communicate with their patients.

4.1 Changing Algorithmic Systems

As in many healthcare workplaces, the VICU staff face frequent changes in the virtual platforms they are expected to master. During the course of a single staff meeting, the staff were trained to use a new virtual scheduling system to sign up for shifts; reminded to prompt their patients to complete a new educational module program targeted to high-risk patients; and congratulated for their rapid progress ordering Covid-19 tests for patients on a new e-platform. A larger change for the VICU staff occurred this past winter, when administrators decided not to renew their contract with the unit's patient warning system, the Rothman Index.

The Rothman Index [1, 2] is a proprietary, closed-source algorithm that uses vital signs, lab results, and other health-related information to predict patient deterioration. In the VICU, nurses would receive alerts from the Rothman Index and decide whether to pass those alerts on to providers on the hospital floor. Administrators are considering a transition to Epic's early warning system, the Epic Deterioration Index, which is already available and free through the hospital's Epic platform. Administrators cite the limited efficacy of the Rothman in improving patient health outcomes as a key reason for the transition.
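To make this triage workflow concrete, the sketch below shows, in deliberately simplified form, how an early-warning index of this kind might fold recent observations into a single deterioration score and flag patients for VICU review. The fields, weights, and thresholds are illustrative assumptions for the purposes of this discussion, not the proprietary logic of the Rothman Index or the Epic Deterioration Index.

# Illustrative sketch only: a toy early-warning score, NOT the proprietary
# Rothman Index or Epic Deterioration Index. Fields, weights, and thresholds
# are invented for demonstration.
from dataclasses import dataclass

@dataclass
class Observation:
    heart_rate: int          # beats per minute
    respiratory_rate: int    # breaths per minute
    spo2: int                # oxygen saturation, percent
    creatinine: float        # mg/dL, from the most recent labs

def deterioration_score(obs: Observation) -> int:
    """Sum simple penalty points for out-of-range values (hypothetical ranges)."""
    score = 0
    if obs.heart_rate > 110 or obs.heart_rate < 50:
        score += 2
    if obs.respiratory_rate > 24:
        score += 2
    if obs.spo2 < 92:
        score += 3
    if obs.creatinine > 1.5:
        score += 1
    return score

def should_alert_vicu(obs: Observation, threshold: int = 4) -> bool:
    """Flag the patient for VICU review; a nurse still decides whether to escalate to the floor."""
    return deterioration_score(obs) >= threshold

if __name__ == "__main__":
    obs = Observation(heart_rate=118, respiratory_rate=26, spo2=90, creatinine=1.2)
    print(deterioration_score(obs), should_alert_vicu(obs))  # 7 True

Even in this toy form, the sketch highlights the communication pattern discussed above: the algorithm produces an alert, but the decision to contact floor providers remains with the VICU nurse.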

4.2 Practitioner Communication and Accountability

As the VICU team transitions away from the Rothman Index, practitioner communication and accountability are also undergoing changes. The director of the VICU noted that without the Rothman alerts, VICU staff have fewer regular opportunities for outreach to floor personnel, reducing both practitioner accountability and opportunities for patient advocacy. The director also noted that relationships with practitioners on the floor are “always a little dicey,” echoing prior research finding that workers tend to resist the experience of being monitored by tele-ICU staff [12, 13]. However, the team has been working strategically to develop a consultative, mentoring relationship with floor nurses, especially new hires. Positioning themselves as a resource for new nurses can facilitate a more collaborative relationship while also helping VICU staff provide support and accountability for those most in need.

VICU nurses also reported that when they receive a request from a floor nurse to monitor a particular patient, they often open their database only to find that the patient's vitals were last entered many hours earlier. They noted that vitals are taken and recorded on the physician's schedule, typically every 8 hours, which does not reflect the needs of the VICU team. Thus, they can do little to support a nurse who has requested monitoring but has not entered recent vital information. These findings echo recent research on early warning systems, which found that one of the largest detriments to success was delayed information input by nurses [14], and demonstrate how tele-ICUs can disrupt communication workflows [15].
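A minimal sketch of the staleness check implied by this workflow mismatch follows. The eight-hour cutoff mirrors the charting cadence nurses described; the function and field names are hypothetical, not drawn from the unit's actual systems.

# Illustrative sketch, assuming an 8-hour charting cadence: remote monitoring
# is of little help when the most recent vitals are too old to act on.
from datetime import datetime, timedelta, timezone
from typing import Optional

STALE_AFTER = timedelta(hours=8)  # hypothetical cutoff matching the charting cadence

def vitals_are_stale(last_charted: datetime, now: Optional[datetime] = None) -> bool:
    """Return True when the most recently charted vitals are older than the cutoff."""
    now = now or datetime.now(timezone.utc)
    return now - last_charted > STALE_AFTER

# Example: a monitoring request arrives, but vitals were last charted 9 hours ago.
last_charted = datetime.now(timezone.utc) - timedelta(hours=9)
if vitals_are_stale(last_charted):
    print("Vitals out of date; remote monitoring cannot proceed until fresh vitals are charted.")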

4.3 Patient Communication and Advocacy

In the wake of the Covid-19 pandemic, practitioners in the VICU have found their work shifting substantially toward direct outreach to ambulatory patients, especially Covid-19 patients who are recovering at home. Primary duties include intake of information and entry into digital tools, as well as regular monitoring of patients through daily protocols and checklists to determine the need for hospital readmission. Nurses report that their one-on-one patient contact has increased significantly as a result.

This increased patient contact has also provided opportunities for patient advocacy. One VICU nurse frequently directed floor nurses to reduce ambient noise from telemetry alerts when it was clearly agitating a mental health patient. During a staff meeting, another VICU nurse argued for revised protocols for patients at risk of self-harm, noting that following many of these patients into the bathroom seemed unnecessary. The team discussed building more flexibility into protocols to empower floor nurses to make contextual decisions about a patient's needs. While neither of these examples emerges from algorithmic decision-making per se, they demonstrate how virtual monitoring, alongside algorithmic monitoring, can provide more contextual patient information that enables situated patient advocacy and care and counteracts algorithmic biases [9, 10].

5 CONCLUSION

As the general population ages and life expectancy increases in the United States, demand for health care providers is also on the rise. The next several decades will see a rise in automated patient care and the widespread use of data-driven warning systems. This research points to the important role that these systems can play in relationship building between virtual and on-the-floor providers, as a motivator to initiate and maintain contact as well as to support collaboration on protocol revisions and calls for change. While earlier research has pointed to the risks of alarm fatigue [5,6] and algorithmic bias [9,10] with algorithmic patient care, qualitative field research like this project can help administrators to better understand and prepare for the experiences of providers as they adapt to new modes of establishing accountability and patient advocacy.

ACKNOWLEDGMENTS

This research is funded by a National Science Foundation “Future of Work” Grant, grant no 2026607.

REFERENCES

[1] Brian C. Wengerter, Kevin Y. Pei, David Asuzu, and Kimberly A. Davis. 2018. Rothman Index variability predicts clinical deterioration and rapid response activation. The American Journal of Surgery 215, 1 (Jan 2018), 37-41. DOI: https://doi.org/10.1016/j.amjsurg.2017.07.031
[2] Meredith C. Winter, Sherri Kubis, and Christopher P. Bonafide. 2019. Beyond reporting early warning score sensitivity: The temporal relationship and clinical relevance of "true positive" alerts that precede critical deterioration. Journal of Hospital Medicine 14, 3 (March 2019), 138-143. DOI: https://doi.org/10.12788/jhm.3066
[3] Stavros Polykarpou, Michael Barrett, Eivor Oborn, Torsten Oliver Salge, David Antons, and Rajiv Kohli. 2018. Justifying health IT investments: A process model of framing practices and reputational value. Information and Organization 28, 4 (Dec 2018), 153-169. DOI: https://doi.org/10.1016/j.infoandorg.2018.10.003
[4] Lillian Campbell and Elizabeth L. Angeli. 2019. Embodied healthcare intuition: A taxonomy of sensory cues used by healthcare providers. Rhetoric of Health & Medicine 2, 4 (Fall 2019), 353-383. https://www.muse.jhu.edu/article/744859
[5] Simon Cooper, Tracy McConnell-Henry, Robyn Cant, Jo Porter, Karen Missen, Leigh Kinsman, Ruth Endacott, and Julie Scholes. 2011. Managing deteriorating patients: Registered nurses’ performance in a simulated setting. The Open Nursing Journal 5 (Nov 2011), 120. DOI: https://doi.org/10.2174/18744346011050100120
[6] Thomas C. Bailey, Yixin Chen, Yi Mao, Chenyang Lu, Gregory Hackmann, Scott T. Micek, Kevin M. Heard, Kelly M. Faulkner, and Marin H. Kollef. 2013. A trial of a real-time alert for clinical deterioration in patients hospitalized on general medical wards. Journal of Hospital Medicine 8, 5 (May 2013), 236-242. DOI: https://doi.org/10.1002/jhm.2009
[7] John Asger Petersen, Lars S. Rasmussen, and Susan Rydahl-Hansen. 2017. Barriers and facilitating factors related to use of early warning score among acute care nurses: A qualitative study. BMC Emergency Medicine 17, 1 (Dec 2017), 1-9. DOI: https://doi.org/10.1186/s12873-017-0147-0
[8] Steven Thompson, Rajiv Kohli, Craig Jones, Nick Lovejoy, Katharine McGraves-Lloyd, and Karl Finison. 2015. Evaluating health care delivery reform initiatives in the face of “cost disease”. Population Health Management 18, 1 (2015), 6-14. DOI: https://doi.org/10.1089/pop.2014.0019
[9] Janna Anderson, Lee Rainie, and Alex Luchsinger. 2018. Artificial intelligence and the future of humans. Pew Research Center 10 (Dec 2018), 12.
[10] Henriette Cramer, Jean Garcia-Gathright, Aaron Springer, and Sravana Reddy. 2018. Assessing and addressing algorithmic bias in practice. Interactions 25, 6 (2018), 58-63. DOI: https://doi.org/10.1145/3278156
[11] Dilip Krishna, Nancy Albinson, and Yang Chu. 2017. Managing Algorithmic Risks: Safeguarding the Use of Complex Algorithms and Machine Learning. Deloitte Risk and Financial Advisory.
[12] Trudi B. Stafford, Mary A. Myers, Anne Young, Janet G. Foster, and Jeffrey T. Huber. 2008. Working in an eICU unit: Life in the box. Critical Care Nursing Clinics of North America 20, 4 (Dec 2008), 441-450. DOI: https://doi.org/10.1016/j.ccell.2008.08.013
[13] Jane Moeckli, Peter Cram, Cassie Cunningham, and Heather Schacht Reisinger. 2013. Staff acceptance of a telemedicine intensive care unit program: A qualitative study. Journal of Critical Care 28, 6 (Dec 2013), 890-901. DOI: https://doi.org/10.1016/j.jcrc.2013.05.008
[14] Anne Watson, Chantel Skipper, Rachel Steury, Heather Walsh, and Amanda Levin. 2014. Inpatient nursing care and early warning scores: A workflow mismatch. Journal of Nursing Care Quality 29, 3 (July/Sept 2014), 215-222. DOI: https://doi.org/10.1097/NCQ.0000000000000058
[15] Clemens Scott Kruse, Priyanka Karem, Kelli Shifflett, Lokesh Vegi, Karuna Ravi, and Matthew Brooks. 2018. Evaluating barriers to adopting telemedicine worldwide: A systematic review. Journal of Telemedicine and Telecare 24, 1 (2018), 4-12. DOI: https://doi.org/10.1177/1357633X16674087

This work is licensed under a Creative Commons Attribution International 4.0 License.

SIGDOC '21, October 12–14, 2021, Virtual Event, USA

© 2021 Copyright held by the owner/author(s).
ACM ISBN 978-1-4503-8628-9/21/10.
DOI: https://doi.org/10.1145/3472714.3473669