Delicate Decisions at the Intersection of Intensive Care and Machine Learning - How Human Information Needs inform the Development of Decision Support

A hospital intensive care unit (ICU) is a complex, dynamic, high-stakes environment in which an array of health professionals monitor, make decisions and work together to keep critically ill patients alive. An ICU is also an information-rich environment where large amounts of digital information intermingle with information in the form of human observations, knowledge and communications. Clinical decision support systems (CDSS) that harness machine learning (ML), artificial intelligence (AI) and other information technologies are seen as potentially powerful allies to support clinicians. While there is a significant and growing body of research on ICU applications of ML and AI, relatively little is understood about the decision-making needs and values of the highly trained professionals responsible for decisions in that challenging information environment. We interviewed 31 ICU clinicians from 9 (mostly Queensland) institutions on their decision needs and values. This paper presents a comprehensible slice of our findings about information needs in the ICU; discusses implications for AI/ML CDSS development; and concludes with the view that clinicians and technologists face delicate decisions about the extent and nature of AI/ML CDSS in the ICU.


INTRODUCTION
Intensive care units (ICU) depend on information to enable humans to deliver safe, timely, and effective care to critically ill patients with complex and dynamic health needs. There is certainly great interest in using machine learning (ML) and artificial intelligence (AI) in clinical decision support software (CDSS), and in their potential to support clinical decision-making, but the actual impact of AI/ML CDSS on clinical decisions is unclear [11,26,27]. It is tempting to assume that AI/ML CDSS enhances decision-making and to implicitly trust the perceived authority of data, numbers and predictive AI/ML, but there are cases in which technology has undermined vital (and uniquely human) attributes: empathy, critical thinking, and intuition [5,6]. Data transformed and presented via AI/ML CDSS can carry a certain influence, impinge on clinicians' autonomy and introduce its own non-human bias [8]. At the same time, intensive care faces increasing challenges on many fronts: from aging populations, pandemics, and health policies that exacerbate already demanding work, to the volume, variety and velocity of information to consider when treating patients in time-critical, high-stakes situations [14-17]. Thus, investigating AI/ML CDSS becomes a pressing obligation. From a technical standpoint, the time is now ripe to develop CDSS that offer meaningful support for clinicians [20]. From a clinical standpoint, that technical development seems remote from the clinical context [8,12,20,25].
When we speak about 'decision support', we must not forget that a decision is typically made within an environment, embedded in a decision-making process, following pre-existing decision pathways to achieve specific outcomes. ICU clinicians are highly trained for that environment, and their exceptional ability to assimilate complex information from different sources while communicating effectively with colleagues and families must be acknowledged, as must the mental and emotional load they can experience. While AI/ML has demonstrated remarkable success at some tasks in some environments, the decision-making environment of the ICU (and the needs and values of the humans who work there) merit careful consideration before launching into AI/ML "solution" development.
Creating AI/ML that truly supports ICU decision-makers requires a deep understanding of two interdependent issues: (1) what do decision-makers need? and (2) how do we define the problem we want AI/ML to address?
This understanding is fundamental to clearly defining tasks that AI/ML CDSS could be designed to tackle, both for algorithm development and for achieving a high degree of effectiveness in the ICU context. We have sought that understanding by asking 'If AI/ML CDSS could provide another piece or system of information (let's call it ), what would clinicians need and value? What is just the right  for a certain decision in the ICU?' This paper presents work-in-progress to investigate the nature of ICU decision-making, including clinicians' needs and values, and how these influence , the information or information systems that would genuinely support decision-making in the ICU.
We describe how we have involved clinicians, elicited their insights about their needs, and then meaningfully organised those insights so as to inform AI/ML CDSS development. We conducted semi-structured interviews with clinicians (ICU doctors and ICU nurses) from 9 different Australian institutions. Each interview was rich and revealing, and collectively they covered an extensive variety of relevant topics. This paper presents a comprehensible slice of our findings and discussions to date. We found that asking clinicians about their values and needs allows us to define the potential strengths of AI/ML CDSS with more clarity. This approach opens the way to a better understanding of the problems that data-driven approaches can tackle. That deeper understanding helps us approach the delicate decision of defining a problem area with enough precision to examine it in isolation with data-driven approaches, while also taking into account the requirements of the human actors in the target environment: the ICU.

BACKGROUND: MAKING DECISIONS IN THE ICU
Artificial intelligence and machine learning have the potential to greatly impact decision-making processes in the intensive care environment. ICU clinicians deal with large amounts of information and need to make many judgements about what is and is not important in caring for critically ill patients. Sudden, unexpected change makes the ICU a dynamic environment that complicates decision-making [9]. ICU patients are demographically diverse; their conditions are etiologically heterogeneous; and their responses to treatment for similar conditions can vary significantly [16]. Moreover, the absence of specific guidelines for the host of possible scenarios makes decision-making highly dependent on skill and experience, resulting in significant variation in decision-making and care practices [16,28]. In principle, AI/ML CDSS could help clinicians deal with their mental load by organizing and analyzing data more efficiently, potentially leading to better treatment outcomes and reduced medical errors [20,25]. AI/ML CDSS might also be of great benefit in settings that lack certain clinical skills or resources [18]. However, there are several challenges to the use of AI/ML CDSS in the ICU setting [1]. While the need for high-quality data and sound ethical and legal practices is acknowledged, it is unclear how clinicians would like to use AI/ML CDSS. How clinicians make decisions mirrors the dynamic, complex and variable character of the ICU: it is non-linear, iterative and might omit steps of an idealised decision-making process [24]. Decision support systems that increase demands on clinicians will not be accepted [20]. Previous research has investigated factors that clinicians deem important for decision-making, but has not explored the interactions between these factors, or their high dependency on context [9].
To date, the majority of AI/ML CDSS research has focused on early identification of clinical events, as well as outcome prediction and prognosis [10,13,16]. Although the prediction of outcomes (e.g., mortality) is a technically well-posed problem, single outcomes are not the only factors that influence clinical decisions [17]. Modelling diagnostic and prognostic factors remains complex. Van Der Vegt et al. [26] showed that information retrieval systems' effectiveness does not always translate to better decision-making outcomes, but that document interpretation is the most critical factor impacting decision correctness; they suspected that better integration of already existing knowledge may yield better clinical decisions than further research on models. Therefore, although AI/ML CDSS could improve decision-making, the way it is implemented, its adaptability to workflows and to the diversity of professionals, and its quality are critical determinants of its success [15,22,27]. Moreover, previous studies have shown mixed effects of AI/ML CDSS on clinicians' decision-making [11,26,27].
Van Der Vegt et al. [27] investigated AI implementation barriers, enablers and key decisions with a focus on sepsis, and reported that 66% of post-implementation studies did not report the real-world performance of the implemented tool; those studies that did report it did not always show an improvement in clinical decision-making, and focused instead on how the tool performed in the environment rather than on how decisions were affected in the overall environment. Van Der Vegt et al. [27] demonstrate how difficult it is to draw a causal link between AI models and mortality outcomes because of study limitations, especially non-randomised study designs with many confounders, and infrequently reported, non-standard algorithm performance metrics post-implementation. Thus, they concluded that 'it remains unclear whether [machine learning algorithms] were responsible or needed for improved mortality' and that the choice of outcome definition is critical 'as it can directly influence algorithm performance measures, particularly specificity' [27].
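To make the point about outcome definitions concrete, the toy sketch below (our illustration, not taken from [27]; all patients, alerts and outcome labels are fabricated) scores the same set of hypothetical model alerts against two hypothetical sepsis outcome definitions and obtains different specificity values from identical model behaviour.

```python
# Toy illustration (fabricated data): the same alerts, scored against two
# different outcome definitions, yield different specificity.

def specificity(y_true, y_pred):
    """True-negative rate: TN / (TN + FP)."""
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tn / (tn + fp)

# Hypothetical alerts fired by a sepsis model for ten patients (1 = alert).
alerts = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]

# Outcome definition A: sepsis confirmed within 48 h of the alert.
sepsis_48h = [1, 0, 0, 0, 1, 0, 0, 0, 0, 0]
# Outcome definition B: broader, sepsis at any time during the ICU stay.
sepsis_stay = [1, 1, 0, 0, 1, 0, 0, 1, 1, 0]

print(specificity(sepsis_48h, alerts))   # 0.75: two alerts count as false positives
print(specificity(sepsis_stay, alerts))  # 1.0: those same alerts now count as true positives
```

Under definition A, the alerts for the second and eighth patients are false positives; under definition B they are true positives, so measured specificity rises from 0.75 to 1.0 without any change to the model, which is exactly why the outcome definition must be reported alongside the metric.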
At this time, and despite tremendous technical advances, relatively little seems to be known about what ICU clinicians may need and value from AI/ML, and the extent to which methods from our increasingly technically advanced AI/ML toolbox might address those needs and values.

METHOD: APPROACHING DEEPER UNDERSTANDING

Positionality: A Humanistic Perspective
Despair and grief are very close to hope and joy in the ICU. Technical research and development aimed at ICU clinicians needs to pay attention to their circumstances, which requires an empathy-based approach grounded in genuine interest [2]. We have adopted a humanistic perspective by engaging with participants based on their values, recognizing that clinicians are experts when it comes to their own needs [23]. While data-driven approaches like AI/ML seem objective and reproducible, their development and implementation often lack a comprehensive understanding of human needs in care-oriented environments [3]. Connection with clinicians' needs and their perception of what is beneficial is essential to AI/ML CDSS development that prioritizes intensive care over intensive data [21, p. 92].

Approach: Navigating Complexity
The environment in which clinicians operate is dynamic, uncertain, relational, situational, and high-risk, and as such highly complex. These ICU-specific characteristics create many practical constraints, which is perhaps why there is not more qualitative inquiry preceding AI/ML CDSS research endeavors. Clinicians have vital, demanding work to perform in caring for critically ill patients; we presumed that, by the end of a long shift, they would have limited energy for a thorough conversation about their decision-support needs. Interviews were entirely voluntary and conducted with ethics approval from QUT Human Research Ethics (5737) and the GCHHS Ethics Committee (HREC/2022/QGC/88658) for Metro North Health sites.
The primary challenge we encountered whilst planning semi-structured interviews was that we could not ask participants directly about our main interest, their needs regarding AI/ML CDSS, because (1) we could not expect participants to have a background in AI/ML or its possibilities, (2) participants may exhibit a positive or negative reaction toward AI/ML which could obscure their actual needs, and (3) such direct questioning could convey the impression that we were more interested in pushing technology than in understanding their needs or concerns. With this in mind, we developed a simple interview guide (Table 1) that gives priority to questions that (1) build trust and rapport, (2) demonstrate our genuine curiosity, and (3) stimulate participants' imagination about what they need. We aimed to keep questions simple, considering that interviews would take place during participants' shift breaks when they might be tired. The interview guide, piloted with three ICU professionals, underwent revisions based on feedback. Questions were adapted to each participant's background during interviews.
The interviews were conducted by the first author from April to November 2023, giving participants flexibility in choosing the time, location, and duration. Recruitment targeted diverse medical and nursing roles, seeking participants through local hospitals, Australian College of Critical Care Nurses (ACCCN) journal announcements, and LinkedIn profiles. Participation was entirely voluntary. We interviewed 31 participants from 9 institutions, predominantly in Queensland. Participants represented a range of institutions, from small to large, and had backgrounds in nursing (n = 17) and medicine (n = 14). Experience in the ICU varied from less than 1 year to over 20 years, with most having worked in different roles and hospitals. Initial transcriptions were made with otter.ai, followed by manual proofreading and correction. The thematic analysis involved iterative discussions after each interview [4,7,19], then the creation of mind maps to organize informative details into coherent and relatively distinct themes. Clinical experts were consulted to avoid misinterpretations and to confirm that our thematic organisation was comprehensive and reasonable.

INITIAL FINDINGS: DECISIONS, VALUES AND PREDICTIONS
Clinicians emphasized the importance of understanding not just patient information, but also their colleagues' decision-making capabilities, demonstrating that decision needs are not limited to EHR data, but rather encompass the entire ICU environment.

Technology Development should account for Five Values
(1) Autonomy: Clinicians' appreciation for autonomy speaks to their responsibility and power in making decisions and taking action. The view that 'humans care for humans' was seen as a vital part of care.
(2) Experience: Autonomy and experience are closely related. Experience determines whether a clinician is qualified to make a decision or whether it needs escalation. Critical thinking, data interpretation, reflection, and handling high mental and emotional loads were seen as aspects of experience.
(3) Adaptability: Participants emphasized the importance of  being adaptable to their individual needs and those of their patients, rather than requiring them to change their work routine or develop workarounds: 'Customization based on clinician preference and individual patient is the holy grail.'
(4) Caution ('healthy mistrust'): Clinicians need to know how much they can rely on data and information, especially if using 'someone else's source of information.'
(5) Time: Clinicians want  seamlessly integrated into their workflow without increasing mental workload. They highlighted processes that took too much time (e.g., logging into multiple systems to source information).

Prediction Needs have Many Faces
Participants had mixed opinions about the usefulness of  for decision-making. They were generally skeptical about changes to their current workflow and decision-making processes, but interested in adaptable technology. Less experienced participants seemed to navigate information from detail to the whole picture, while more experienced participants seemed to use information about the whole picture to navigate to useful detail. Interestingly, most participants did not mention predictions (e.g., about mortality, vital trends, and risk scores) as desirable without being prompted. However, some participants showed interest in optimizing medication administration and nutrition. Less experienced clinicians expressed an interest in patient summary tools but still wanted to double-check the information's correctness, completeness, and origin, which could be made easier if  had a direct link to the source. At the same time, participants expressed concerns about the risk of false negatives and false positives. Participants concluded that any risk prediction might, at best, be used as an indicator ('Patient p is now at an increased risk of condition d.') or a non-intrusive prompt ('Have you considered issue i?') and, at worst, might confuse or manipulate decision-making, especially for clinicians who had not mastered setting detailed patient information into the overall context, or in an emergency, where a predictive score might become an easy fallback.
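Participants' framing of prediction as, at best, an indicator or a non-intrusive prompt can be read as a presentation-layer requirement. The function below is our hypothetical sketch of that idea (the thresholds, wording and `present_risk` name are our assumptions, not something participants described): a risk score is surfaced as a statement or a question, never as a directive.

```python
# Hypothetical sketch: surface a model's risk score as an indicator or a
# non-intrusive prompt, never as an instruction. Thresholds are illustrative.

def present_risk(patient_id, condition, risk):
    if risk >= 0.7:
        # Indicator: states elevated risk, leaves interpretation to the clinician.
        return f"Patient {patient_id} is now at an increased risk of {condition}."
    if risk >= 0.4:
        # Non-intrusive prompt: a question, not a directive.
        return f"Have you considered {condition}?"
    return None  # Below threshold: stay silent rather than add noise.

print(present_risk("p17", "acute kidney injury", 0.82))
# prints: Patient p17 is now at an increased risk of acute kidney injury.
```

Even such a restrained layer inherits the concerns participants raised: in an emergency, or for clinicians who have not yet mastered setting detail into context, the indicator could still become an easy fallback.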
Some participants questioned the usefulness of using  in the ICU for patients who were already seriously ill, suggesting that it would be more beneficial to use AI/ML to prevent patients from needing intensive care. Additionally, some recommended using AI/ML CDSS in areas with a lack of clinical skills, such as regional and remote hospitals, or ICUs with less experienced medical teams. Beyond that, participants expressed an interest in learning more about their patients' clinical and social details ('I wish I could look into the patient'). They also suggested having a live storyboard to track their patients' injuries. Additionally, they discussed the potential benefits of  for administrative tasks such as staffing, payroll, bed allocation, scheduling family visits, acuity at a hospital level, and research.

DISCUSSION: IMPLICATIONS FOR AI RESEARCH
Our preliminary findings suggest nuanced relationships between clinicians and their information environment. The following discussion highlights a few areas of concern and aims to encourage deeper thinking and questioning. We focus the discussion on CDSS for direct patient care, since AI/ML CDSS for administration or research carries relatively less risk for individual patient outcomes.

Needs limit how AI/ML CDSS might be used
What clinicians need and value in their decision-making gives a direction on what  can support. It needs to be considered that  supports not only clinicians but, by extension, their clinical skills and experience. Table 2 connects the needs identified in the previous section to specific considerations for AI development (here called 'AI component'), and column 4 elucidates questions that clinicians might ask about those needs in relation to , the new software, information or technology proposed to address those needs. The elements presented in Table 2 stand in relationship to each other. Figure 1 depicts the current map of needs and values in ICU decision-making. The benefit that  can bring to clinical decisions is limited by how well it can provide accurate, available, and understandable information while also respecting clinicians' expertise, autonomy, and time. Although the impact of  on clinicians' decision-making is uncertain, it is clear that  will be subject to clinicians' skills, experience, and caution.

The Loss of Delicate Decisions
In many instances, research has to follow a reductionist approach in which we tackle a narrowly scoped problem area to investigate its underlying concept. The reductionist approach is limited when narrowly scoped research areas intersect with the subjective experience of reality. For instance, a problem that is frequently modeled is 'predicting mortality of patient p based on data d'. Even though mortality is also used as a proxy in clinical studies, Laupland et al. [14] find that mortality risk significantly underestimates the difficulty of conditions and biases the assessment of critical illness. What may not be considered (anymore) by clinicians when  is used? Predicting mortality with statistical accuracy is certainly interesting from a data-driven perspective, but what value does that prediction have for clinicians working on the floor? If we zoom out, we can ask what value  =  has for ward management, hospital management, or for statewide resource management. Without considering the environment in which  =  is to be used, it is difficult to reach situational accuracy and to make the delicate decision of defining the problem appropriately from a data perspective. The key question is how we can meaningfully utilize AI/ML capabilities and data-driven approaches. To do so, we require a deep understanding of the intended environment. Compared with the variety and nuances of needs and situations that concern clinicians, this research is like dipping toes into an ocean of innumerable and (in)discernible currents. A participant expressed that 'you often see that software is designed with people who aren't clinically oriented. It does potentially just change your problem as opposed to make it easier or match it with our needs.'

Limitations and Future Research
Whilst we have consulted clinical experts, our work is limited by the researchers' background. That limitation also allowed us to elicit results that may be obvious to clinical researchers but very informative for the data science community. An observational study would be a great addition to the research conducted here. Additionally, our data gathering is limited to Australia, with a high emphasis on Queensland and excluding rural areas, which might be of special interest. Workflows and guidelines in other countries differ vastly, especially regarding time spent at the bedside, software use, and team dynamics. Further research is necessary to explore how  could be designed to address the needs and values outlined here and to determine trade-offs. Our approach has the potential to be applied to other industries.

CONCLUSION
When it comes to the development of AI/ML CDSS, the ICU is a special environment, as it combines multiple challenging characteristics that contribute to its complexity. Clinicians and technologists face delicate decisions about the extent and nature of AI/ML CDSS () in the ICU. There is not just one thing that can support or improve decision-making in the ICU. Rather,  relates to various needs, values, and processes. Special attention needs to be paid to clinicians: their experience, clinical skills, and not least their communication skills are the glue, so to speak, that holds the relevant information together and pieces it into the decision that is most appropriate for a patient at a given time. The time seems ripe to transfer AI/ML CDSS capabilities to the ICU environment.
For that, clinicians' needs and values are an illuminating source to clarify how to shape  so that it fits into the complexity of the ICU environment.

Figure 1:
Relationships between needs/values in ICU decision-making.  is the 'thing' (AI/ML CDSS) that aims to support clinicians' decision-making. A clinician has experience, clinical skills, and communication skills, and operates with caution. A clinician cares for patients and makes clinical decisions with a certain degree of autonomy, depending on their amount of experience and skills. The amount of time available influences patient care and the decision-making process regarding the amount and depth of information considered. Experience and skills influence the interpretation of information. The clinical decision depends on the patient's information, which needs to be relevant, comprehensible, available and accurate. Clinicians determine information/data accuracy by setting it into context with their experience and skills. Asynchrony between decision and action exists in Figure 1, as the decision-maker may not be the same person executing the action, and some decisions are made in collaboration rather than individually.

Table 1:
This question guideline for our semi-structured interviews incorporates feedback from ICU professionals. Prompts used to explore further: Why? Could you explain? How does that relate to...? Is there anything else you would like to share or ask?

Here we present a summary of the decision-making needs and what is valued for decision-making, articulated in our interviews with ICU clinicians and organised under a manageable set of thematic headings. While this presentation necessarily trades detail for broader meaning, each point can be traced back to one or more interview responses. (Direct quotes are shown in 'italics'.)
(1) Patient story: To interpret data and information in context, clinicians need to understand the patient's story and identify important elements. Additionally, clinicians need to know why a certain decision/action was made: 'Because if you don't ask why something has happened, that's when things happen.'
(2) Accurate information: Inaccuracy can have severe consequences in the ICU. Clinicians validate data and information for accuracy and relevance, e.g., by double-checking Electronic Health Record (EHR) data and information communicated orally. 'Numbers don't necessarily translate into what my patient is presenting as physically.'
(3) Available information: Clinicians get information and data from various systems and sources. A major frustration is the time spent navigating people and systems for information, especially during emergencies.
(4) Comprehensible information: Data and information need to be communicated in an easy-to-understand way, including the origin of the piece of information, appropriate to the clinician's role and level of experience. Conversations revealed a high degree of variability related to role and experience.
(5) Clinical skills have a high impact on patient outcomes and include knowledge and skills in conducting clinical procedures and assessments, as well as the ability to synthesize and identify relevant clinical information.
(6) Effective communication: Clinicians need to navigate different types of interactions effectively and sensibly, especially with families. Participants tend to prefer face-to-face and oral communication over digital communication. Notably, participants highlighted that they 'would prefer someone with clear communication as opposed to someone with more clinical skills'.
Clinicians reported that their clinical skills and experience are major factors in their decisions. They commented that the EHR is used for documentation, to help remember certain facts, and to spot clinical trends.

Table 2:
Mapping decision needs and values to data analysis and AI considerations.