Just a Breath Away: Investigating Interactions with and Perceptions of Mediated Breath via a Haptic Cushion

Feeling another person’s breathing is an intimate encounter that can promote deep connection, communicate affective information, and influence our physiological state. Emerging technologies are exploring the mediation of biosignals, such as breathing, between people to yield meaningful interactions. However, little is known about people’s subjective and physiological responses to mediated breath and the implications for designing these mediatory interfaces. We contribute to this space with three studies investigating subjective and physiological responses to mediated breath. We mediate breath with a custom pneumatic haptic cushion and offer strategies for simulating diverse breathing patterns. Our studies investigate three aspects of mediated breath interactions: physiological responses to regular breathing patterns (n=21); recognition of expressive breathing patterns (n=21); and subjective responses to mediated breath between partners in combination with video, audio or in isolation (n=8). We discuss our key findings and highlight areas of consideration for designers curating mediated breath interactions with tangible interfaces.


INTRODUCTION
A recent wave of technologies has been exploring a new paradigm of interaction by mediating biosignals between partners. This sharing of explicit or implicit biosignal data can foster meaningful exchanges and convey a sense of physical presence or connectedness [15,50]. At this nascent stage in the design and deployment of such devices, the field is still developing a formative understanding of how people respond to mediated biosignals and what it means to have the 'invisible made visible' in our interactions with others. As highlighted by Stepanova et al. [51], we need to better understand the challenges and affordances of sharing biosignals to support the design of effective mediatory technologies. Existing studies in this space have focused primarily on transmitting heart rate and skin conductance data (see Figure 6 in [15]), while other biosignals remain underexplored. In this work, we focus on one such biosignal, the breath, which we argue has unique benefits for mediated interactions that warrant further study and attention.
Why the breath? Breathing is much more than a regulatory system responding to metabolic demands; it is simultaneously tightly coupled to our nervous system activity [65], influenced by our emotions [22] and responsive to external stimuli or people in our environment [11,36]. Moreover, breath is an integral part of our vocalisation mechanism, constantly adapting to enable vocal expression and functions such as speech, laughter, coughing or sneezing. Unlike other biosignals, we can consciously control our breathing, enabling greater possibilities for active and passive communication via mediated breath. Furthermore, as with one's heartbeat or body temperature, breathing can be felt when in close physical contact with another person, offering a unique opportunity for mediated breath to elicit a sense of physical intimacy and presence, a quality identified by Hassenzahl et al. as valuable but challenging to achieve in mediatory devices [20]. This nuanced multi-dimensionality of breath affords a potentially rich interaction space for mediated breath experiences but also raises questions of whether these factors may conflate one another, be misinterpreted or 'lost in translation', or elicit unexpected responses when the breath is felt. Few works have explored mediated breath between partners [3,13,26,30,32-34,53], and the studies conducted with these interfaces do not focus on such factors.
We contribute to this space by conducting a series of behavioural studies to investigate specific aspects of mediated breath interactions between partners. We focus on three questions for this initial investigation: 1. Do rhythmic breathing patterns mediated by the device elicit changes in, or synchronisation of, breathing, and what are people's subjective experiences of this? 2. How do people interpret and perceive non-regular breathing patterns such as laughter or talking when mediated through the device? And 3. How is mediated breath perceived differently when contextualised by accompanying audio or visual communication compared to when presented as a standalone channel of connection? We mediate breath with a huggable pneumatic cushion, affording an embodied kinesthetic interaction with the breath reminiscent of hugging a loved one and feeling their breathing. We develop strategies for mapping dynamic breathing patterns onto the cushion's inflation to simulate diverse breathing patterns and expressions. Our three user studies offer a preliminary investigation into the questions posed above; study 1 investigates physiological and subjective responses to regular breathing patterns, study 2 explores participants' recognition and perception of expressive breathing patterns, and study 3 gathers subjective responses of couples to real-time sharing of their breath in combination with video, audio or in isolation.
Thus, we make a three-fold contribution to this field: firstly, we present Embreathe, a haptic cushion system for mediating dynamic breath data from one partner to another in real-time; secondly, we conduct a series of three user studies with Embreathe to gain insight into interactions afforded by and participants' perceptions of mediated breath; finally, we discuss our key findings and highlight areas of consideration for designers curating mediated breath interactions.

RELATED WORK

Interactions with Biodata in HCI
Recent years have seen an increase in HCI research exploring the interaction space afforded by biodata. Existing work has captured and translated a wide range of physiological markers such as breath [10], heart rate (incl. heart rate variability) [60], muscle activity [19], temperature [18], brain activity [59] and skin conductance [1]. Interaction strategies include: tangibly displaying biosignals that remain otherwise hidden or unconscious, for example, SingingKnit provides singers with sonification of their laryngeal muscle activity [40]; bringing awareness to biosignals, for example, the Breathing Scarf brings attention to the breath when an aroused state of stress is detected [10]; biofeedback and feedforward guidance for biosignals we can consciously control, for example, ViBreathe provides feedforward guidance to support breath regulation using heart rate variability data [64]; gamifying biosignals for entertainment and/or health and well-being, for example, DEEP encourages children with anxiety to do breathing exercises by engaging them in a virtual reality underwater world that responds to their breath [56]; augmenting communication with biodata, for example, Significant Otter enables partners to send emojis that represent heart rate data [29]; and sharing biosignals between people to promote a sense of connectedness, for example, JeL fosters connection by encouraging breath synchronisation between partners in an immersive VR experience [52]. These works showcase the versatility of biodata as a fundamental building block with which to design meaningful interactions.
We focus in this work on the latter strategy: mediating biosignals between partners. Biodata can be transmitted between partners in a multitude of ways, encompassing a spectrum from symbolic representation (e.g. a visual animation of the partner's heart rate data [28]) to simulation or replication of the data (e.g. simulating the heat of a partner's hand for remote hand holding [17]). Simulations of biodata allow partners to interpret each other's raw data in a form that is 'close' to the original scenario, for example, a pneumatic representation of breathing patterns [26], in contrast to data processed to a numerical value, such as breathing rate, or to a symbolic representation, such as an animation [30].
We use the strategy of simulation for Embreathe, by mimicking the expansion and contraction of the chest and abdomen while breathing via inflation and deflation of a cushion interface. Simulations do not interpret the biodata for the user. Rather, the aim is to relay the biodata with minimal modification. This opens up possibilities for rich remote interactions, potentially as rich as the close physical interactions being simulated.
Interpreting Biodata. Biodata represent complex processes in the body and can be challenging to interpret in both real and mediated scenarios. Individuals construct interpretations of biosignals differently, influenced by social expectations and contextual factors [31], and biodata cannot be separated from cultural and social context. While some technologies aim to make interpretations for the user, for example displaying discrete affective states based on biodata, on-going conversations in HCI have critiqued this approach [6,23]. Boehner and colleagues call for systems to support users in 'understanding, interpreting, and experiencing emotion in its full complexity and ambiguity' [6]. The ambiguity in mediated biosignals can offer opportunities for reflection and open-ended discussion between partners [23], but can also cause worry and concern when negative affect is interpreted from a partner without being able to contact them to confirm this or provide support [42]. In the case of breathing data, few studies have touched on the implications of interpreting and misinterpreting diverse breathing patterns. Karpashevich and colleagues [24] articulate their experiences of distortions, individual understandings and dynamic relationships with mediated breath during their soma design engagement with the Breathing Shell, a corset containing pneumatic pillows that can sense and 'mirror' the breathing patterns of the wearer. In this case, however, the wearer engages with their own breath and can draw on their self-awareness when interpreting the interaction. When feeling the breathing of a partner, the user may not have any other feedback or shared context with which to support their interpretations. We explore how participants respond to and interpret mediated breath in various scenarios, with the aim of expanding our understanding of these interactions between partners.
The modality and context of use are important factors in the interaction supported by tangible devices. For example, with Heart in your Hands [49], a person's heartbeat is simulated in a red silicone device shaped to replicate a real heart, thereby evoking intimacy through vulnerability by embodying the experience of holding the other person's real heart in one's hands. Corsetto [25] translates the somatic movements of a singer to audience members in real-time via a pneumatic wearable system, generating sensations on the body in locations that mirror the somatic experiences of the singer to give the wearer an empathetic awareness of the singer's body. With PillowTalk [41], partners hear each other's heartbeat via a speaker placed under their pillow, recreating the sensation of laying their head on the other partner's chest while in bed and hearing their heartbeat. The embodied experience of the sound is very different in this scenario from hearing the heartbeat via, for example, headphones in a public space, which contextually would not simulate the intimate interaction of cuddling a partner in bed. Similarly, Dodge's The Bed [12] seeks to mediate intimacy and physical presence between remote partners by exploring the bed environment, a space 'loaded' with meaning. They use tangible artifacts unique to the environment - i.e. pillows and curtains - to convey personal data from one partner to the other; a huggable pillow heats up and simulates a heartbeat to elicit a feeling of holding the partner, while curtains flutter in response to qualities of the remote partner's breath and voice. In both of these works, the materiality and cultural meaning of the pillow as a familiar, intimate, and comforting object contributes to and is inseparable from the interaction afforded. Schiphorst and colleagues explore this in a different way, taking the pillow out of the bedroom context and into urban social spaces to encourage connection and social intimacy between multiple people in a social setting [45]. Their move.me pillows sense touch and movement data, using this to interpret the current social context and adapt their behaviour (via varied audio, visual and haptic outputs) to encourage new social behaviours. Here, the familiarity, comfort and playfulness (e.g. childhood 'pillow fights') associated with pillows is employed as a way to engage social intimacy.
We take inspiration from these works to produce meaningful interactions with Embreathe by using a tangible cushion interface that embraces the qualities of familiarity and comfort inherent in the materiality and cultural associations of this form factor and embodies the intimate experience of feeling a partner's breathing when hugging them.

Interactions with Simulated and Mediated Breath
The majority of breath-related tangible interfaces focus on simulating regular breathing patterns in the context of health and well-being, for example: breathing guides for relaxation or to reduce stress [2,4,8,38,46,58,64], devices for supporting sleep [48,55,62], and devices for reducing anxiety or fear [14,21,63]. These interfaces typically present a regular breathing pattern to the user at a lower frequency than their own breathing to encourage their breath to slow down and subsequently activate their parasympathetic nervous system, associated with rest and relaxation [65].
Several studies demonstrate that users' breathing rates can entrain to the breathing rate presented via a tangible device. For example, with the wearable pendant Breeze [16], a significant effect of breathing pace (realised with haptic, audio or visual cues) on participants' breathing rates was observed. Ban et al. [4] conducted a study (n = 16) with participants hugging a cushion that mechanically simulates breathing, and found strong evidence that participants' breathing rates entrained to the presented breathing rates (8 bpm, 15 bpm and 24 bpm).
There is a small but growing number of tangible mediated breath interfaces that transmit breathing data between partners [3,13,26,30,32-34,53]. These explore multiple factors, for example, emotional connection or communication between partners [26,53], social factors such as performativity or vulnerability when publicly sharing breath data [13], cultivating empathy with a partner in a one-on-one tutorial context [30] and playful sharing of breath in an interactive experience [34]. One of these projects, the BreathingFrame by Kim and colleagues [26], explores the remote sharing of dynamic breathing patterns (i.e. not simplified to breathing rate) between romantic partners for telepresence. They observed that the pneumatic representation of breath in their device elicited interesting engagements between the partners through this dynamic behaviour, for example sparking curiosity about the current state of the other partner from their breathing pattern, delivering intentional and playful breaths to each other, and mimicking or synchronising their breathing with their partner. We expand on this work by conducting behavioural studies to investigate specific aspects of such interactions with dynamic mediated breath.

EMBREATHE: ENABLING INTERACTIONS THROUGH BREATH
Embreathe is a huggable cushion interface that simulates breathing via pneumatic actuation. The interface is an adaptation of an existing artifact by Haynes and colleagues [21], modified here for the purpose of investigating interactions that can be afforded with mediated breath. The original artifact is a breathing cushion interface targeted at easing anxiety and providing comfort, which was designed and fabricated through an iterative design process.
In a behavioural study (n = 129) [21], it was found that simulating a slow breathing rate of 10 breaths per minute (bpm) through the cushion was effective at reducing induced exam anxiety in students.
During the design stages and evaluative study, people frequently described the felt breath as "life-like", reminding them of holding a loved one (such as a partner, child or pet), which prompted us to investigate mediated breath with the device. We name this new interface Embreathe, combining the words Embrace and Breathe to reflect the physical interaction afforded by the device. However, capturing and reproducing dynamic breathing patterns in real-time is a non-trivial task and first required a modification of the system's hardware and software.

Design Rationale
Our primary design criteria for modifying the cushion system's hardware and software were: (1) Capturing dynamic breathing patterns: we incorporated a sensing system to capture breathing data in real-time and store or transmit that data. (2) Reproducing dynamic breathing patterns via the cushion: we upgraded the driving mechanism to be capable of simulating diverse non-sinusoidal breathing patterns and developed a mapping from the collected breathing data onto the available range of motion of the cushion. (3) Operating remotely in real-time: we included remote communication between the breath sensor and cushion driving mechanism with minimal delay between sensing and subsequent simulation of breathing.

Embreathe Cushion Mechanism
The updated cushion mechanism is shown in Figure 2 (left). We adapted the mechanism from [21] by replacing the external motorised crank and slider with a linear actuator (12 V, 5 A, 20% duty cycle, 6-inch stroke) to actuate the pump. This allows for specialised control and simulation of non-sinusoidal breathing patterns. The position of the linear actuator is tracked by a potentiometer that feeds back to a proportional controller. A microcontroller handles the proportional control of the pneumatic system and drives the linear actuator via a motor driver using a Pulse Width Modulated (PWM) signal.
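As a rough illustration, proportional position control of this kind maps the error between the desired and measured actuator position to a signed PWM command. The sketch below is a simplified Python model, not the firmware: the gain `kp` and limits are hypothetical stand-ins for the tuned values, and the zero-output band reflects the fact that the motor stalls at low PWM magnitudes.

```python
def proportional_pwm(setpoint, position, kp=2.0, pwm_min=40, pwm_max=255):
    """Map position error to a signed motor PWM command.

    kp, pwm_min and pwm_max are illustrative values, not the paper's
    tuned parameters. Commands weaker than pwm_min are clamped to zero
    because the motor stalls at low duty cycles.
    """
    error = setpoint - position
    pwm = kp * error
    magnitude = min(abs(pwm), pwm_max)
    if magnitude < pwm_min:
        return 0  # too weak to move the motor: stop instead
    return int(magnitude) if pwm > 0 else -int(magnitude)
```

In practice such a controller would run at each control tick (here, every 100 ms) with the potentiometer reading as `position`.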

Sensing and Transmitting Breathing Data
To collect breathing data in real-time, a wearable sensor (Figure 2 (middle)) was fabricated, consisting of an adjustable waistband that holds a force sensing module (a 3D printed case containing a Force Sensitive Resistor (FSR)) against the ribcage to detect expansion and contraction of the torso. Pressure on the 5 cm by 5 cm front face of the module presses directly onto the FSR. Other methods were tested, such as conductive rubber and e-textile stretch sensors, but the FSR module was found to be the most reliable during testing. The 3D printed module holding the FSR is similar to the system developed by Vanegas et al. [57], who demonstrated that this sensing method can effectively capture breathing data. Breathing data is collected every 100 ms by an ESP32 microcontroller board that transmits the data to the cushion control system either via serial communication (during the testing and user study stages) or via WiFi (for remote communication).

Simulating Dynamic Breathing Patterns
To develop a mapping from the raw breathing data captured by the sensor to the cushion driving mechanism, we collected a large sample of data from the breathing sensor while it was worn during conversation and animated expression (the wearer was female, 27 years old, 172 cm tall). We annotated the data and extracted clips of resting breathing, deep breathing, and expressions such as gasping, laughing and talking. We included in the recording a large inhale and exhale for calibration.
Our initial strategy was to map the maximum inhale and exhale onto the maximum and minimum range of the linear actuator position and scale the breathing data proportionally onto that range. However, we found that this mapping resulted in an exaggerated scaling down of data in the amplitude range of resting breathing, making it challenging to perceive this through the cushion. Since full inhalations and exhalations were rare in the data, even in expressive content, we investigated strategies for scaling up the simulated amplitude of the central - and most frequently occurring - range of breathing amplitudes. For real-time transmission and simulation of breath, we achieve this by having a calibration phase in which the resting breathing pattern of the sender is recorded for 10 seconds and the range from minimum to maximum of this data is mapped proportionally onto the central 50% of motion of the linear actuator driving the cushion, as illustrated in Figure 2 (right). To eliminate erratic behaviour due to noisy sensor data, we use a smoothing function for the proportional control that calculates the Setpoint at time-step t as S_t = 0.4 D_t + 0.6 S_{t-1}, where D_t is the desired position at time-step t calculated from the breathing data and S_{t-1} is the previous Setpoint. To compensate for shifts or drifts in the data over time due to postural changes and movement of the person wearing the breathing sensor, we keep track of the mean amplitude across a 10 s interval and update the minimum and maximum mapping by increments of 20 if the mean shifts by ±80 (units based on the 0 to 4095 range of the Arduino function analogRead() used to capture breathing sensor data). This behaviour and compensation strategy can be observed in the partner breathing data presented later in the study 3 results (Figure 7).
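The calibration mapping and setpoint smoothing can be sketched in Python as follows. This is a simplified model under assumed units; the function names and the actuator range are illustrative, while the 0.4/0.6 smoothing weights are those used for the proportional control.

```python
def make_calibrated_mapper(resting_samples, actuator_range=1000):
    """Build a raw-sensor-to-actuator-position mapping.

    The min-max span of a 10 s resting-breathing recording is scaled
    onto the central 50% of actuator travel (cf. Figure 2 (right)).
    actuator_range and units are illustrative assumptions.
    """
    lo, hi = min(resting_samples), max(resting_samples)
    centre_lo = actuator_range * 0.25    # start of central band
    centre_span = actuator_range * 0.50  # width of central band

    def to_position(raw):
        frac = (raw - lo) / (hi - lo)    # 0..1 within the resting range
        return centre_lo + frac * centre_span

    return to_position

def smooth_setpoint(desired, previous):
    """Exponential smoothing of the control setpoint:
    S_t = 0.4 * D_t + 0.6 * S_{t-1} (weights as in the text)."""
    return 0.4 * desired + 0.6 * previous
```

Values outside the calibrated resting range would extrapolate beyond the central band, which is what motivates the drift compensation and, for pre-recorded clips, the compression strategy described below.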
We use the above mapping and strategy for real-time breathing transmission between partners (e.g. in study 3) and found it effective at conveying the dynamic behaviour of the breath. However, we identified three key limitations of the strategy: first, mapping breathing sensor data (corresponding to chest circumference) directly onto linear actuator position (corresponding to cushion volume) assumes a linear relationship between volume and circumference, which is not the case; second, the strategy maps the resting breathing of all people to the same inflation amplitude of the cushion, which is not representative of different bodies and breathing styles; and third, some higher amplitude expressive content or movements are clipped due to the proportional scaling.
We developed and trialed strategies to address these limitations when producing the 6 expressive stimuli presented in study 2. First, instead of mapping the breathing data directly onto the linear actuator position, we map onto the circumference of the cushion body measured by placing the breathing sensor on the cushion; Figure 3A shows the relationship between linear actuator position and cushion expansion, obtained by recording breathing sensor values with the sensor placed on the cushion during inflation and deflation. Using a polynomial line of best fit to estimate the relationship between the two, we effectively map from the expansion of the person's abdomen while breathing to the equivalent expansion of the cushion body. Second, rather than scaling the resting breathing data to the central 50% of linear actuator motion, we scale it in relation to the difference in abdomen expansion between resting and deep breathing. For example, in the test data, the amplitude of resting breathing as recorded by the breathing sensor was approximately 20% of the displacement during deep breathing. Deep breathing was mapped onto the largest displacement of the cushion (tidal volume 450 ml) and the resting breathing to 20% of the displacement (tidal volume of 169 ml). Therefore, the scaling range (see Figure 2 (right)) was derived such that the amplitude of resting breathing mapped to a linear actuator range of motion that equated to a tidal volume of 169 ml. Third, to address clipping, we use a simple compression function for each expressive clip that linearly compresses all breathing data outside the central range of resting breathing such that the maximum inhalation and exhalation recorded in the clip are mapped to the maximum and minimum inflation of the cushion, while the central range of data remains the same, as shown in Figure 3B.
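The third step, clip compression, can be sketched as a piecewise-linear function: samples inside the resting range pass through unchanged, while excursions outside it are compressed so the clip's extremes land exactly on the cushion limits. This is a simplified stand-in for the actual implementation, and all range parameters are illustrative.

```python
def compress_clip(data, rest_lo, rest_hi, out_lo, out_hi):
    """Piecewise-linear compression of a breathing clip (cf. Figure 3B).

    Samples within [rest_lo, rest_hi] are unchanged; samples outside it
    are linearly compressed so the clip's maximum and minimum map to the
    cushion limits out_hi and out_lo. Names and units are illustrative.
    """
    clip_min, clip_max = min(data), max(data)
    out = []
    for x in data:
        if x > rest_hi and clip_max > rest_hi:
            # compress the upper excursion into [rest_hi, out_hi]
            frac = (x - rest_hi) / (clip_max - rest_hi)
            out.append(rest_hi + frac * (out_hi - rest_hi))
        elif x < rest_lo and clip_min < rest_lo:
            # compress the lower excursion into [out_lo, rest_lo]
            frac = (rest_lo - x) / (rest_lo - clip_min)
            out.append(rest_lo - frac * (rest_lo - out_lo))
        else:
            out.append(float(x))
    return out
```

Because the compression is computed per clip from that clip's extremes, it suits pre-recorded stimuli rather than real-time streaming, where the future maximum is unknown.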

Performance Characterisation
The capacity of Embreathe to simulate dynamic behaviour is defined by the speed and torque of the linear actuator when connected to the pump mechanism. The flow rate output by the pump in both directions - inflation and deflation - is linearly related to motor speed, with a maximum flow rate of 18 slm for inflation and 16 slm for deflation. The motor stalls at PWM values below ~40, meaning that some slow and subtle changes cannot be captured by the system. This did not pose a concern for the performance of the interface, since inflation or deflation of this amplitude is unlikely to occur for long periods in breathing data or be perceptible through the cushion for short periods due to the damping of the system. The maximum inflation rate of the cushion mechanism is influenced by pressure feedback from the cushion interface, resulting in a maximum inflation rate of approximately 0.43 L/s when the pump system is unattached, 0.36 L/s when the system is inflating the cushion and 0.35 L/s when the system is inflating the cushion while the cushion is being hugged. The system was capable of reaching maximum inflation volume even when tightly hugged, so the impact of varied hugging styles was on the inflation rate alone. Figure 4 provides examples of the system simulating expressions via Embreathe while being hugged. These indicate that some higher frequency data is lost due to the maximum inflation rate of the system, but the general dynamic shape of the expressions was recreated, and frequencies in the range of typical breathing rates (deep, resting, and shallow) were replicated well by the system. This data was collected from one person firmly hugging the cushion during activation. During the experiment we recorded the actual inflation of the cushion for each stimulus and verified them by visual inspection to ensure the stimuli were presented with the same or higher degree of accuracy as shown in Figure 4.
We found no visible differences between the stimuli presented to participants that would indicate the way they held the cushion substantially altered the presented stimuli, although the felt experience of the cushion's behaviour may vary when hugging it more or less tightly. We asked participants in all studies to hold the cushion consistently throughout each study.

Investigating Mediated Breath with Embreathe
We developed Embreathe for the purpose of investigating interactions with mediated breath. The following three studies present our first steps in an on-going exploration of mediated breath; at this stage we focus on individual aspects of the interaction separately, providing a foundation for subsequent investigations into bi-directional and situated mediated breath interactions. The three studies focus on the following research questions: study 1 investigates how rhythmic breathing patterns mediated by the device elicit changes in, or synchronisation of, breathing, and participants' subjective experiences of this; study 2 explores participants' recognition and perception of expressive breathing patterns such as laughter or talking when mediated through the device; and study 3 gathers subjective responses of couples to real-time sharing of their breath, comparing mediated breath interactions presented as a standalone channel versus in combination with audio or video.

USER STUDY 1: BREATHING ENTRAINMENT
In our first study, we investigated whether participants' breathing rates are influenced by breathing simulated by the cushion. As previously discussed, several existing works have demonstrated the phenomenon of breathing entrainment with devices. We conduct this study with Embreathe to verify this effect in our device prior to having partners use the device together in real-time.

Participants
In total, 21 participants took part in this study (12 identified as female, 9 as male) in the age range 24-49 years old (M = 30.4, SD = 6.5). They were a mix of postgraduate students and university staff recruited through word of mouth. All participants gave informed consent prior to participation. Participants with serious respiratory or heart conditions were excluded from participation. This study was approved by the Faculty of Engineering Research Ethics Committee at the University of Bristol (reference number 0019) and participants received an online voucher worth £10 to thank them for their time.

Stimuli
Breathing stimuli were presented to participants in 5-minute sessions, each composed of 1 minute of stillness, followed by 3 minutes of cushion activation, and then a further 1 minute of stillness. Participant breathing was recorded for the full 5-minute duration. Four breathing rates (6, 12, 16 and 20 bpm) were simulated using sinusoidal waveforms with a constant minute ventilation of 2.7 litres per minute. These rates were selected to cover the typical range of resting breathing rates for adults, approximately 12 to 20 bpm [9], centred around a mean of 16 bpm (study results vary, e.g. 16.6 ± 2.8 bpm [54], 14 bpm [39] and 15.7-15.8 bpm [47]). With these stimuli we explore the physiological responses of receivers when feeling the resting breathing of a potential sender.
The user study protocol is shown in the Appendix. The breathing rates of 12, 16, and 20 bpm were presented to participants first in a randomised order, with 6 bpm presented last since it lies well outside the typical resting breathing rate of adults. This rate was added to explore how participants respond to a breathing rate expected to be highly incongruent with their normal resting rate. We use a low breathing rate of 6 bpm since this slow rate reportedly corresponds to a range of psychological and physiological benefits [65]. With this breathing rate we consider partner scenarios such as providing support to a loved one when stressed.
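Holding minute ventilation constant at 2.7 L/min means the sinusoid's amplitude (tidal volume) scales inversely with rate: 0.45 L per breath at 6 bpm but 0.135 L at 20 bpm. The following Python sketch illustrates this stimulus construction; the sampling rate and function name are assumptions for illustration.

```python
import math

def breathing_stimulus(bpm, minute_ventilation=2.7, sample_hz=10, seconds=180):
    """Sinusoidal breathing stimulus at a given rate.

    Constant minute ventilation (L/min) implies tidal volume
    = minute_ventilation / bpm litres per breath. Returns cushion
    volume in litres above the deflated baseline; a sketch with
    illustrative sampling parameters.
    """
    tidal_volume = minute_ventilation / bpm  # litres per breath
    samples = []
    for i in range(int(sample_hz * seconds)):
        t = i / sample_hz
        phase = 2 * math.pi * (bpm / 60.0) * t
        # raised cosine: 0 (fully exhaled) .. tidal_volume (fully inhaled)
        samples.append(tidal_volume * 0.5 * (1 - math.cos(phase)))
    return samples
```

Equalising minute ventilation this way keeps the total 'air moved' comparable across rates, so slower stimuli feel deeper rather than simply smaller.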

Method
Participants were welcomed and seated in a quiet room for the study. They wore the breathing sensor, which was adjusted to fit around their lower ribcage, and were asked to sit in a relaxed position hugging the cushion to their torso. The experimenter and motor mechanism were situated in a separate sound-insulated room, and a short test was conducted prior to starting to check the signal readings. The participant was given instructions by the experimenter via a video call, with video and audio switched on for the participant throughout the experiment and off for the experimenter except when communicating with the participant.
The behaviour of the cushion was controlled via pre-recorded control signals, saved as ASCII files of linear motor positions on an SD card connected to the microcontroller. Position data was read at 100 ms intervals, and the motor position updated using proportional control. The current timestamped position was returned to the experiment laptop at 100 ms intervals for verification. Breathing sensor data was sampled at 20 ms intervals, timestamped and transmitted to the experiment laptop in batches of 5 readings every 100 ms.
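The sample-and-batch telemetry scheme can be sketched as a generator that collects timestamped readings and emits them in groups of five (i.e. one batch per 100 ms at a 20 ms sampling interval). This is an illustrative Python model; `read_sensor` stands in for the real ADC read and the timing values are those stated above.

```python
import time

def batch_readings(read_sensor, sample_interval=0.02, batch_size=5):
    """Yield timestamped sensor readings in fixed-size batches.

    Mirrors the telemetry scheme described above: sample every 20 ms,
    transmit batches of 5 (every 100 ms). read_sensor is a stand-in
    for the real sensor read; a sketch, not the firmware.
    """
    batch = []
    while True:
        batch.append((time.monotonic(), read_sensor()))
        if len(batch) == batch_size:
            yield batch  # hand the batch off for transmission
            batch = []
        time.sleep(sample_interval)
```

Batching trades a small, bounded latency (at most one batch period) for fewer, larger transmissions over the serial or WiFi link.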

Data Processing
Breathing data was analysed for 17 of 21 participants (data for participants 1, 12, 18 and 21 was discarded due to errors in recording). The breathing data was resampled to have a constant sampling rate and smoothed using a Savitzky-Golay (polynomial) smoothing filter with order 1 and frame length 17 to reduce noise. Maxima and minima (peak exhalation and inhalation respectively) were identified using the built-in Matlab functions 'findpeaks' and 'islocalmin', with a minimum prominence value of 40. Once the breathing minima and maxima were identified (and verified by visual inspection), the data was separated into three sections: one minute prior to (0-60 s), during (120-180 s) and post (240-300 s) cushion activation. The breathing rate was approximated from the mean period between minima and between maxima within each timeframe.
A repeated measures ANOVA (with Greenhouse-Geisser correction due to violation of sphericity) found no statistical difference in the starting breathing rates (prior to cushion activation) of participants for each of the four simulated breathing rates, both in the order presented to participants (F(2.5, 39.3) = .39, p = .72, η²p = .15) and according to stimulus (F(3.0, 47.9) = .23, p = .87, η²p = .05, corresponding to data labelled 12A, 16A, 20A and 6A in Figure 5). This indicates that resting breathing rates of participants generally did not vary significantly throughout the experiment. A further repeated measures ANOVA found a significant difference in the breathing rates of participants during cushion activation across the four simulated breathing rates (F(3, 48) = 16.8, p < .001, η²p = .68), indicating that the movement of the cushion had an impact on the participants' breathing.
Figure 5 shows participants' breathing rates prior to, during and post cushion activation. Highlighted regions indicate breathing rates within ±2 bpm of the cushion breathing rate. For breathing rates of 20 bpm and 6 bpm, we see that a small sample of participants' breathing rates tended towards half or double the frequency of the cushion's rate. This may be a reason why entrainment was not statistically found for these stimuli, and suggests that we should take these harmonic frequencies into account when studying entrainment behaviours in general.
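One simple way to account for the harmonic effect noted above is to classify a participant's rate against the cushion rate and its half/double harmonics, using the same ±2 bpm tolerance as the highlighted regions in Figure 5. This is an illustrative sketch, not the paper's analysis.

```python
# Classify a participant breathing rate against a cushion rate,
# allowing for half- and double-frequency harmonic entrainment.


def entrainment_class(participant_bpm, cushion_bpm, tol=2.0):
    """Return which harmonic (if any) the participant rate falls within."""
    if abs(participant_bpm - cushion_bpm) <= tol:
        return "fundamental"
    if abs(participant_bpm - cushion_bpm / 2) <= tol:
        return "half"
    if abs(participant_bpm - cushion_bpm * 2) <= tol:
        return "double"
    return "none"
```

For example, a participant breathing at 12.5 bpm against the 6 bpm stimulus would be flagged as double-frequency entrainment rather than counted as non-entrained.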

Subjective Responses.
Participants were asked to give feedback after each stimulus on how it felt when the cushion was active. The cushion breathing rate of 12 bpm most frequently received comments that the participants synchronised with the breathing rate of the cushion. The slower pace (compared to the 16 and 20 bpm) was frequently associated with being close to another being, for example, "when I was a kid I'd sleep on my dad's belly, and it reminded me of that", "cadence similar to someone falling asleep, at early stage it felt like when you wake up early and normally would find it hard to get back into sleep, but if you are cuddling someone sleeping next to you, you slip back into sleep straight away", "calming, it felt like having a child or cat on your lap going to sleep". These descriptions were consistent with the concept of larger bodies having slower breath and smaller bodies having faster breath; for example, the participant that was reminded of their father said "It reminded me of sleeping on my dad because I couldn't synchronise my breathing with him, he took much bigger and deeper breaths than I could." For faster breathing rates (16 bpm and 20 bpm), some participants were reminded of smaller beings: "it was more like a cat or baby", "when it started, I was like 'Oh! It's like my cat'", "this one was shallow and faster like a dog, not as comforting", while other participants perceived the breathing as that of adults with elevated breathing rates: "like someone is breathing too fast", "quicker, almost slightly distressing", "I thought the pillow might be hyperventilating".
Participants described the 6 bpm cushion rate as having slower and deeper breaths, and their perceptions of this breathing rate were clearly divided. For some participants it was unsettling: "feels wrong, not like a real person breathing", "didn't feel real, didn't match size of cushion, felt like a horse", "it unnerved me with the force of the pillow expanding", "felt like it was forcing a breathing pattern on me", while other participants perceived it as slow or meditative: "felt like a breathing exercise for meditation", "too slow and deep to sync up with, like in a yoga session when they guide you in deep breathing", "more calming, slower and deeper breaths were more encouraging making deeper breaths, if feeling bad day or anxious it would be good".
In this experiment, the breathing was not presented to participants as being from another person, yet participants still frequently anthropomorphized the cushion, for example "[the cushion] felt in sync with me more and I felt attached to it", "it feels like it comes to life - It's alive! - but at end feels like something parted", "strange, felt myself becoming more protective over it", "felt weird when it stopped like it had died". After the experiment, when asking participants if they would like to use the cushion to feel the breathing of a loved one when apart, responses were mixed. Some were positive about the interaction: "it would be nice if you missed them, you would feel close", "when I had long-distance relationship, I would have got one". Some participants highlighted specific scenarios for use, for example "if it was during a call or something" or "if a friend who is anxious and has panic attacks she might want someone there". Other participants felt that it would reinforce the absence of the other: "feel like it would hammer home their absence even greater", "feels lonely feeling someone who's far away".

USER STUDY 2: INTERPRETING EXPRESSIONS
In this study, we explore participant perceptions of expressive breathing patterns simulated by Embreathe. When considering mediated experiences of breath between people, it is expected that people will exhibit other behaviours aside from rhythmic breathing. If the cushion was used to augment an audio or video call, for example, we would expect users to exhibit behaviours common to conversation such as talking and laughing. And in non-conversational contexts (e.g. watching a movie together or going to sleep) we would still expect to occasionally observe behaviours such as coughing, sighing or yawning. What does it mean to feel these expressions mediated by the cushion? We are not aware of other studies exploring this; with the Breathing Shell [24], the first-person experience reports mention such expressions, for example, one colleague hiccups while wearing the interface and feels their hiccup replayed to them, reassuring them that the interface is functioning.
In BreathingFrame, participants explore being playful with their breath to connect to their partner, but specific expressions are not investigated. Therefore, in this study, we take a first step to investigate how such expressions are perceived and interpreted when simulated by Embreathe.

Stimuli
To gain a preliminary understanding of how irregular breathing expressions translate through the cushion, we selected six common everyday expressions to present to participants. These were: talking, laughing, giggling, coughing, sighing and gasping, as shown in Figure 4. Section 3.4 details how these expressions were collected and mapped onto the cushion. These were presented to participants in two formats: with three breaths included before and after in the clips (stimuli 1-6) and as isolated clips (stimuli 7-12).

Participants and Method
The participants and ethical approval were the same as for study 1, and the two studies were conducted on the same day. All participants had completed study 1 before participating in study 2. In study 2, participants held the cushion in the same setup as before but without the breathing sensor. The session was split into two parts (the user study protocol is shown in the Appendix). Participants were informed that they would be presented with different expressions through the cushion that had been recorded from a real person. In the first part, participants were presented with six expressions through the cushion in a randomised order with no further context or information and asked 'what did that expression of the cushion feel like?'. In the second part, they were presented with the same expressions and asked to identify each expression from a list of all expressions. Questions were asked by the experimenter and shown on screen for the participants. Their responses were given verbally and recorded in writing by the experimenter.

Initial Interpretations of Expressions.
Participants struggled to recognise the expressions in the cushion, and very few participants made correct identifications: gasping (3 participants), sighing (2), talking (2), laughing (1), giggling (1), coughing (0). This indicates that such expressions could be misinterpreted by receiving partners when no additional context is provided in an interaction. Some expressions received multiple incorrect but consistent identifications, for example, 7 participants described the coughing stimulus as crying, and 4 participants identified giggling as hyperventilating or panting. This may suggest that there are features of the expressions that are capable of eliciting consistent interpretations between participants, but a new mapping is required to detect and display the expressions correctly.

Identifying Expressions from a List.
When given a list of expressions to identify from, accuracy was 0.27 across all participants and stimuli. This is higher than chance accuracy (0.17) but still low. Figure 6 shows the confusion matrix of perceived expressions, which shows that sighing was the most frequently correctly recognised (52.4% correct) and talking the least (7.1%). For every expression, at least 33% of participants agreed on an interpretation, which for sighing and gasping were correct, but for laughing, giggling, coughing and talking were incorrect. Laughing, giggling and coughing were frequently cross-identified, which we posit is due to their similar pattern of rapid inhalations and exhalations. [...] with the cushion". Others reported that the expressions "[felt] like natural expressions" and "always felt alive", even if they struggled to recognise the behaviour. Of the 21 participants, 17 commented that the expressions were difficult to identify or differentiate and 5 mentioned that there was not enough context or that additional information such as sound would help them to interpret the cushion's expression. Four participants commented on not knowing what a real person's breathing would feel like during these expressions; one discussed trying to imagine the feeling: "I was thinking about being in a womb and feeling what those expressions would be like", while for others the feel of the expressions was different to their expectation: "how I would do those [expressions] is very different to someone else". It may be that people would have better recognition of these expressions when recorded from someone they have close familiarity with, or by learning to recognise the patterns specific to their partner's expressions through extended periods of use together.
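The recognition analysis above (overall accuracy, chance level of 1/6, and a confusion matrix of presented versus perceived expressions) can be sketched as follows. The toy response pairs are illustrative only, not the study data.

```python
# Sketch of the forced-choice recognition analysis: overall accuracy
# and a confusion matrix over the six expressions.
from collections import Counter

EXPRESSIONS = ["talking", "laughing", "giggling",
               "coughing", "sighing", "gasping"]


def confusion(pairs):
    """Count (presented, perceived) pairs into a nested tally."""
    matrix = {e: Counter() for e in EXPRESSIONS}
    for presented, perceived in pairs:
        matrix[presented][perceived] += 1
    return matrix


def accuracy(pairs):
    """Fraction of trials where the perceived label matched the stimulus."""
    correct = sum(1 for presented, perceived in pairs
                  if presented == perceived)
    return correct / len(pairs)


chance = 1 / len(EXPRESSIONS)  # ~0.17 with six response options
```

Row-normalising the confusion counts gives the per-expression recognition rates reported in Figure 6 (e.g. 52.4% for sighing).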

USER STUDY 3: FEELING THE BREATHING OF A PARTNER
In our third study, we investigate how couples respond to and interpret each other's breathing when mediated in real-time by Embreathe, and whether additional context (such as video or audio) is supportive. At this stage, we separate the experiences of sending breathing data and receiving breathing data to explore perceptions of each.

Participants
The participants were four couples that had experienced long-distance periods apart in their relationships (see Table 1). The participants were recruited by email and word of mouth. Couple 3 had their baby with them during the study and the study order was adapted to allow for them to tend to the baby. All participants provided written consent to take part in the study and the study was approved by the Faculty of Engineering Research Ethics Committee at the University of Bristol (reference number 0019). Participants were each given an online voucher worth £10 to thank them for their time.

Method
The user study protocol is shown in the Appendix. This study was again conducted in a lab environment, with the experimenter and cushion mechanism in a separate room. At the start of the study, the partners were labelled A and B and assigned to separate rooms, the sending partner wearing a breathing sensor and the receiving partner hugging the cushion (in the second half, the partners switched to experience both). When presenting the cushion to the couples, we framed it as a device to support remote interactions during long-distance periods in their relationship. As before, the cushion mechanism was in a separate room with the experimenter, but now the breathing sensor data from the sending partner was transmitted in real-time to the cushion as outlined in Sections 3.3 and 3.4.
For the duration of the study both participants and the researcher (each sat in a separate sound-proof room) had a video call connection set up between them. The first part was the no visual/audio contact scenario, simulated by turning all video and audio off and partners just interacting via Embreathe. Second, to simulate an audio call, the audio was enabled with video switched off. Finally, the video call scenario was simulated by switching on both audio and video. Then the partners switched roles and the process was repeated. The researcher had their video and microphone turned off during each scenario, only switching the microphone on to give instructions. They were always able to hear the participants via the audio connection, enabling participants to unmute and ask for assistance or stop the study at any time, but also allowing for conversation relating to the interface to be noted.
Participants provided written responses throughout the experiment in a booklet containing all of the study questions. Before experiencing Embreathe, participants were asked (in the booklet) for demographic information, basic information about their relationship to their partner (including number and length of long-distance periods experienced), and to rate the experience and level of presence they feel when having no visual/audio contact, an audio call, or a video call with their partner during long-distance periods in their relationship. The couples were then taken through each stage of the study and after each scenario (no visual/audio contact, audio call or video call) in each condition (holding the breathing cushion or wearing the breathing sensor) participants were asked to give feedback on the interaction in the booklet and then verbally afterwards. We chose to use the booklet rather than verbal responses throughout to minimise the effect of partners influencing each other's responses. For each of the stages, participants were asked to rate each experience based on the level of presence they felt with their partner on a scale from 0 ('No sense of presence. Don't feel connected with them at all') to 10 ('High sense of presence. Feel very connected with them') and their enjoyment of the interaction on a scale from -5 ('Negative: "I hate this way of interacting"') to 5 ('Positive: "I love this way of interacting"'). After each stage they were also asked for any free-form comments or feedback on the interaction.

On inspection, there is a marked difference in breathing pattern between the scenarios in which participants are talking to one another (audio and video call) and when there is no visual/audio contact, which is similar across the other three couples. Both breathing traces in the no visual/audio contact scenario show a rhythmic and consistent breathing pattern. During the audio call and video call the couple were in conversation throughout and their breathing trace varies dynamically. The control signal captures the core information but cannot reproduce the higher frequency details. The occasional sharp minima in the breathing trace indicate when the participant wearing the breathing sensor has sharply exhaled, shifted position, or made a sudden movement. The other couples exhibited similar behaviours. No participants commented on these shifts in position, although it is likely that they were noticeable. We noted that the sudden shifts occurred most frequently during the audio and video call scenarios (since participants were more animated and likely to move when engaged in conversation) and rarely during the no contact scenario. Therefore, it is possible that partners were distracted by the audio or visual context and did not notice the shifts, or the additional context helped them to interpret the shifts.

Presence and Affect.
Figure 8 shows the interaction and presence ratings that participants gave each scenario (no visual/audio contact, audio call or video call) in each of the three contexts explored in the study (their previous long-distance experiences, wearing the breathing sensor and holding the breathing cushion). We note that participants rated their previous long-distance experiences from memory, which was recent for couple 4 but further in the past for couples 1, 2 and 3. We see that participant responses to each scenario when holding the breathing cushion (yellow) are consistently equal to or higher than the other scenarios for both the experience of interaction and level of presence, with the most favoured interaction being a video call with Embreathe. In the final set of questions, Embreathe was rated as a positive addition to video calls; participants gave the cushion a mean score of 2.5 using a Likert scale from -5 ('No, it detracted from the video call experience.') to 5 ('Yes, it enhanced the video call experience'). The ratings indicate that Embreathe markedly enhanced the participants' experience of interaction and sense of presence compared to having no contact with their partner. Wearing the breathing sensor also increased ratings compared to no contact, but to a lesser degree.

Subjective Responses to Embreathe.
Throughout the experiment, data was collected from various sources: participants' written comments, their verbal feedback to the experimenter, and the conversations couples had during the different phases of the experiment. Using a qualitative description approach [43], we explore this data in relation to our two research questions: how do couples respond to and interpret each other's breathing mediated by the cushion, and is additional context such as video or audio supportive? We present the results below, first addressing question one by presenting the responses of participants relating to their general experience of Embreathe, their specific experiences of sending and receiving breath, and the effect of the system on their breathing. We go on to address question two by presenting participant responses relating to contextual factors of the different scenarios.
General Perceptions of Embreathe: All 8 participants described their experience with the cushion as positive ('excellent', 'nice', 'pleasant', 'really loved [...] cushion', etc.) and 4 participants described it as calming ('soothing', 'relaxing', 'comforting', etc.). The interaction modality was new to all participants and they frequently commented on the sense of physical presence it afforded, for example, participant 4A said that '[the cushion] brought memories and made me imagine holding my partner in my arms'. Participant 4B commented 'It was a fun yet calming way to interact with my partner that I have never experienced before (especially whilst not physically being present with each other)', and when first feeling her partner's breathing, laughed and exclaimed 'It's so cute! Feel like I have a baby version of you in my arms'. For participant 3B, the interaction reminded her of 'those times you were away in Switzerland and I made a pillow 'you' with your t-shirts on', when she was missing her partner's physical presence.
The Experience of Sending Breath: Though the future vision for Embreathe is a bi-directional interaction between partners, in this study we were interested to first explore the individual experiences of sending and receiving breath. When partners were sending breath, we did not expect them to experience an increased sense of presence or connectedness, since the breathing sensor was designed purely to collect data and did not provide any interactivity for the sending partner. For some participants, this was the case; partner 3A wrote 'there was no way of knowing if he felt my breathing, so I didn't really feel any connection' and 3B wrote 'wearing the sensor without sound or video isn't really interacting! I guess I was aware we were doing the same thing at the same time'. However, for other participants, the knowledge that their partner felt their breathing was enough to elicit a sense of connection, for example, 1B said 'the knowledge that your partner is sensing your breathing is really soothing. Even though I wasn't the one receiving an input I still felt connected.' When the audio and video channels were included, some partners commented on feeling more connected since they had some feedback from the receiving partner, for example, participant 4A said 'It was nice to be able to see my partner react to deep breaths'.
The Experience of Receiving Breath: For receiving partners, the interactions with the cushion varied within and across participants. From the results in Figure 8, we expected that using Embreathe during video calls would be the preferred mode of interaction with the interface. However, 6 of the 8 participants (1A, 1B, 2A, 3A, 4A, 4B) said that they preferred using the cushion without any other contact. Participants commented that this interaction made them more fully 'aware' or 'focused' on the breathing of the cushion / their partner. Participant 1B said 'I feel like this was the most intimate. I really loved it!'. The physicality of interacting with the cushion was highlighted by many participants, for example, participant 4B commented 'I particularly liked how you did not even need the screen for this. We rely so much on our visual interactions or audio communication we often forget the physical elements of dialogues in long distance. [...] [The cushion made me] more aware of [my partner's] physical presence.' Participant 1B wrote 'it's such an intimate sensation, as though the only other time I would be so aware of my partner's breathing is when we fall asleep holding each other' and 1A said it 'feels cosy holding it, feel like I'm snuggled on the sofa; already private and intimate behaviour.' The feel and behaviour of the cushion interface was a significant factor in the interaction, sometimes more so than the partner connection, for example, participant 2A said 'It felt quite soothing... but not necessarily due to knowing that it was my partner's breathing; it just felt quite relaxing and comforting. Knowing it was his breathing made it feel cute.' Whereas other participants focused on the communication it afforded with their partner, for example, 4B described it as 'like a conversation through breath. I actually laughed when my partner was speeding up his breathing / being playful with it.'
When the audio and video channels were added, the interaction with Embreathe was significantly altered. Talking affected the breathing patterns of participants, which many found challenging to perceive through the cushion, and the additional channels of communication were often described as distracting. Participant 1A said 'I felt overall it was less soothing because the conversation we were having made it harder to focus on my and my partner's breathing. It felt a bit like we were just having a normal conversation and the cushion was just pulsing', and participant 2B commented that 'as soon as you have the visual aspect that is what I was focussing on and I forgot about the breathing'. Participant 4A (who had preferred the no visual/audio contact scenario) discussed that they would potentially use the cushion for audio or video calls in situations 'where the focus was elsewhere, not on talking'. Participant 1B highlighted that, in conversation, 'subject matter could impact how much the pillow influences how we feel. Our conversation was jovial and about normal life things, and so that intimate connection wasn't necessary. Intimate conversations may be different.' Participants highlighted some experiences that were augmented by the cushion, for example participant 1A said 'it was nice to feel [my partner] laugh through the cushion' and participant 1B commented 'knowing we had more than just our images on screen was nice'. Participant 3B commented positively that 'in lulls in conversation, I would focus on the breathing'.
The Effect of Embreathe on Partners' Breathing: Multiple participants commented that they changed their breathing as the sending partner. Participant 1A said 'knowing my breathing was being monitored made it easier to concentrate on my breathing and slow it down.' Couples 1, 2 and 4 had exchanges in which one partner would purposefully make the cushion behave differently to communicate with their partner, for example, participant 3B would 'occasionally take a deeper breath to make my presence felt'. When receiving breath, some participants commented on feeling that their breath had synchronised to the breathing of the cushion. When couple 3 discussed this in conversation, 3A said 'I try to synchronise my breathing with you sometimes when we're in bed, but we just have different rates of breathing and I end up getting annoyed by it', which she thought might translate to the cushion too.
Holding the cushion appeared to influence participants' perception of their experience. Participants described being more aware of their connection to their partner as well as being more conscious of their own and their partners' breathing. Participant 1B described using Embreathe as 'another way of putting yourself in the moment. It signals that this is time for us to be together and feel connected.' Participant 3B commented 'I found the more I thought about what was going on - that the movement really was driven by my partner - the stronger the presence I felt.'

Context Requirements: Although participants typically preferred using Embreathe without other communication channels, their comments indicated that some level of shared context was desirable. For example, participant 3A said that, while they liked the no visual/audio contact scenario best with the cushion, 'a short 10 second conversation or even a text to know [I] was connected [to my partner] and that they felt the breathing would have helped', and 2A said 'I think if I was long distance without other communication and the cushion deflated and stopped I would be scared' and so wanted some additional communication channel to avoid such scenarios. When discussing how they would most like to use Embreathe, all participants described scenarios in which there was shared context that wasn't centered around conversation; they discussed using Embreathe for connecting to their partner before bed or for going to sleep (7 participants), while watching a film together (4 participants) and for snuggling on the sofa or taking naps together (3 participants). These scenarios could include audio, but were primarily occasions in which the participants most missed the physical presence of the other, for example, 1B described the cushion as good for such activities in which you are 'doing nothing but knowing you are not alone'. Participant 4B said that they would want it for bedtime because 'this is when I feel most apart from [my partner] just after we would call to say goodnight'; they were also curious about using the cushion while watching a movie because 'maybe I would sense when [my partner] found something scary or funny etc as we watched the film'. Participant 1B also thought it 'would be fun to watch a movie together with it and feel reactions.' Participant 2B described using the cushion with audio as feeling 'a bit like bedtime, when you can't see each other but can talk to each other and feel each other's breathing'.
Although participants generally found the interaction challenging during conversation, this may have been in part due to the lab setup and the style of conversation. Couple 4 discussed how different it would be to experience the cushion at home compared to in the lab, and later 4B commented 'I would love to spend more time with it and interact in our own environments to understand the difference.' Participant 1B highlighted that the context of conversation would change the interaction: '[the] subject matter influences the effect [of the cushion], if you were missing each other it could be more important whereas this is just everyday chat.'

DISCUSSION
Our studies with Embreathe provide various insights into interactions with mediated breath. We discuss below the key findings that arose from these studies that we believe to be of value for curating mediated breath interactions.

Rhythmic Breath vs Dynamic Breath
Existing works that mediate breath have primarily focused on conveying breathing rate rather than reproducing dynamic breathing patterns. Study 1 highlighted that simulating breathing with a simple sinusoidal wave function in the frequency range of resting breathing was enough to elicit associations of interacting with a living being (i.e. hugging an adult, child or pet) and even convey emotional states from the breathing (whether these emotional states are correctly interpreted is another question). Similarly, in study 3, participants had positive reactions to feeling their partners' uninterrupted resting breathing through the cushion, and associated the experience with holding their partner and being physically close to them. This indicates that simulating regular breathing patterns based on breathing rate and amplitude alone is a valuable interaction modality, one that can provide a sense of presence between partners and some degree of communication of affective state.
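Simulating regular breathing from rate and amplitude alone, as described above, amounts to generating a sinusoidal control signal. The sketch below is illustrative (the raised-cosine form, sample interval and names are assumptions, not the paper's implementation); it produces position samples that start from the deflated state.

```python
# Generate a sinusoidal 'breathing' control signal for a given
# breathing rate (bpm) and amplitude, sampled every dt seconds.
import math


def sine_breath_signal(bpm, amplitude, duration_s, dt=0.1):
    """Raised-cosine breathing positions, starting deflated at 0."""
    freq_hz = bpm / 60.0
    n = int(duration_s / dt)
    return [amplitude * 0.5 * (1 - math.cos(2 * math.pi * freq_hz * i * dt))
            for i in range(n)]
```

For example, `sine_breath_signal(12, 100, 60)` yields one full inflate-deflate cycle every five seconds, matching the 12 bpm stimulus of study 1.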
In contrast, the dynamic breathing patterns that participants shared in study 3 could afford a greater level of interactivity, for example playing with their breath. As found in study 2, expressive content such as laughter was very challenging to interpret through the cushion without any other context, and in study 3, several participants noted situations or specific moments in which the cushion made movements that they could not understand. Moreover, the perception of Embreathe's dynamic behaviour was highly influenced by the accompanying interaction; for example, participants in study 3 found the breathing patterns challenging to perceive during conversation on the video call and their comments suggest that it did not add clear benefit to the interaction. Given the extensive challenges present in accurately sensing and simulating dynamic breath in real-time, we suggest that designers carefully consider the context of use for a proposed interface and whether the advantages of using dynamic breath outweigh the disadvantages in comparison to simulating regular breathing.

Preferred Contexts for Mediated Breath
Interacting with Embreathe during a video call or audio call received high scores on the Likert scales for experience of interaction and sense of presence. However, participant feedback highlights that in both of these scenarios, conversation was a challenging interaction with Embreathe. Participants found the cushion's movement during speech more difficult to perceive and interpret than for normal breathing. As seen in Figure 7, this may be because the breathing patterns captured by the sensor are significantly altered by conversation and less accurately simulated by the cushion. Moreover, participants' attention became split between the conversation (seeing and listening to their partner) and the cushion's behaviour, further reducing their capacity to perceive and process the cushion's movement. We conclude that the current iteration of Embreathe is not suited to interactions centered around talking, but it is possible that improved mappings from expressive breathing data to the cushion could improve the interaction with Embreathe while talking; in study 2 some expressions elicited consistent responses across participants, indicating that there may be features of the expressions that can effectively convey the expressive content.
For couples in study 3, the desired context of use for interacting with Embreathe was primarily during quiet times of the day, such as when going to sleep or watching TV. From their comments, we suggest that ideal use cases for Embreathe have three key components: first, they are situationally congruent with the real interaction being simulated (for example, Embreathe simulates the experience of hugging one's partner, therefore lying in bed or on the sofa, the primary environments in which these interactions happen, are well suited for Embreathe interactions); second, there is some shared context that both partners are synchronously engaged in (for example, watching TV at the same time gives the partners some context to support them in interpreting behaviours such as laughter); and third, an additional audio channel can support their interaction if it is not centered around conversation (for example, saying good night to each other and then falling asleep while interacting with Embreathe).

Mediated Breath is Situated
Participant feedback highlighted that mediating breath with Embreathe offered them a unique interaction modality for physically connecting with their partner that current communication technologies do not afford. It became clear throughout the studies that participant comments were inextricably entangled with the materiality of the cushion and their somaesthetic experience of hugging the interface, which in some instances was given more attention than the feeling and interpretation of the partner's breathing. Therefore, the findings of these studies may not translate to other mediated breath interfaces. Moreover, although participants speculated about using the interface at home, the experiences of the participants in our lab setting cannot be directly translated to their experience of Embreathe in the physical context of their home environment or in the emotional context of being separated from their partner.

LIMITATIONS AND FUTURE WORK
We have explored the physiological and subjective responses of participants to using Embreathe with the aim of extending our understanding of mediated breath interactions. However, as mentioned above, the experience of mediated breath cannot be decontextualised or separated into parts and 'put back together'. These studies were conducted in a lab setting and explored uni-directional interactions with a prototype system. This limits our ability to investigate the rich interaction space of mediated breath, and these limitations in turn suggest promising avenues for future work.
The Embreathe system, as presented here, has several limitations. On the hardware side, the maximum inflation rate of the driving mechanism limits the system's capacity to simulate dynamic patterns, and the bulky pump mechanism reduces the practicality of using Embreathe in home environments. While the inflation rate could be improved with a more sophisticated linear actuator and pump mechanism, in future work we will look first at revising the whole system to provide an untethered and more portable version of the interface. This could be achieved using a system similar to Kitmuti and colleagues' design [27], if care is taken to ensure that the strength of the system is adequate, the noise and vibration from the motors are minimised, and the soft, huggable quality of the cushion is retained. We also experienced limitations with the breathing sensor: the system was sensitive to changes in posture or accidental touches from the participants and only captured one dimension of breathing (the expansion and contraction of the lower ribs), which excludes the fuller range of breathing expressed through other areas of the belly, torso and shoulders. Our next version of the breathing sensor could make use of textile stretch sensors embedded into a wearable top to collect richer breathing data from across the torso. Machine learning could then be used to interpret the data and isolate breathing patterns from postural changes, enabling greater comfort and freedom of movement for the wearer while capturing the relevant data.
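Before resorting to machine learning, a simpler first step could separate slow postural drift from the faster breathing oscillation. The sketch below is illustrative only: the function name, window size and approach are our assumptions, not part of the Embreathe system. It subtracts a long moving-average baseline from the raw sensor trace, leaving the breathing oscillation.

```python
def isolate_breathing(samples, baseline_window=50):
    """Remove slow postural drift from a breathing-sensor trace by
    subtracting a long moving-average baseline (a hypothetical
    pre-processing step, not the authors' method).

    samples: list of raw sensor readings
    baseline_window: half-width of the averaging window in samples;
    it should span several breath cycles so breathing averages out.
    """
    n = len(samples)
    detrended = []
    for i in range(n):
        lo = max(0, i - baseline_window)
        hi = min(n, i + baseline_window + 1)
        # Local baseline: mean over a window much longer than one breath.
        baseline = sum(samples[lo:hi]) / (hi - lo)
        detrended.append(samples[i] - baseline)
    return detrended
```

A learned model, as proposed above, would likely outperform this on abrupt posture shifts; the moving average only handles drift that is slow relative to the window.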
On the software side, our proposed strategies for mapping breathing onto the cushion have not been formally evaluated, and the system has a delay of approximately 200 ms between sensing breathing data and actuating the cushion (no participants commented on this, but it may become apparent in bi-directional interactions, particularly if partners attempt to synchronise their breathing). It is not clear from our studies whether improving the mapping from breathing sensor data to cushion behaviour would improve the experience for partners. The limited inflation rate of the cushion may explain why participants struggled to identify expressions through the cushion in study 2: sighing, the most frequently recognised expression, was accurately simulated by the cushion, whereas expressions such as coughing and giggling involved rapid inhalations and exhalations that were less accurately replicated by the driving mechanism and were rarely correctly identified. Alternatively, these expressions may be challenging to recognise through the cushion because we rarely feel them physically in another person and would equally struggle to recognise the felt sensation of them in a real scenario. An additional factor is that the expressions used in study 2 were recorded from one individual and, although they were mapped to the size of the cushion, we do not know how unique these expressions are to that individual; for example, we expect that breathing style, body size, gender and cultural background may influence the physicality of these expressions. In future work, it will be important to conduct experiments comparing expressions recorded from a variety of people and processed with different mapping strategies. This would enable us to identify the best strategies for scaling and translating breathing patterns onto the cushion.
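As one illustration of the kind of scaling-and-compression mapping discussed here (and in Figure 3), the following sketch scales resting-range excursions linearly and linearly compresses larger ones so the sample maximum lands exactly at the actuator limit. All names and parameter values are hypothetical; this is a sketch of the general technique, not the authors' implementation.

```python
def map_breath_to_actuator(x, x_rest, x_max, L_mid, L_max, s):
    """Piecewise mapping sketch (all symbols hypothetical):
    x       breathing excursion above the actuator midpoint
    x_rest  maximum amplitude of resting breathing
    x_max   maximum amplitude in the current sample
    L_mid   actuator midpoint position
    L_max   maximum actuator position
    s       linear scale factor chosen to give an appropriate tidal volume
    """
    if x <= x_rest:
        # Resting-range breathing: plain linear scaling around the midpoint.
        return L_mid + s * x
    # Larger excursions: compress [x_rest, x_max] into the remaining
    # actuator travel so the sample maximum maps onto L_max.
    knee = L_mid + s * x_rest
    if x_max <= x_rest:
        return min(knee, L_max)
    t = (x - x_rest) / (x_max - x_rest)
    return knee + t * (L_max - knee)
```

A design consequence worth noting: the compression keeps quiet breathing perceptually consistent (same gain every session) while still letting sighs or laughter reach, but never exceed, full inflation.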
All interactions with Embreathe were uni-directional and conducted in a lab setting. This provides a foundation for understanding participants' responses to sending and receiving breath in isolation, but the next step is to explore these responses in dynamic and situated scenarios. For example, in study 1 we found evidence to suggest that the majority of participants' breathing rates were influenced by that of the cushion. In future work, we plan to explore whether, in a bi-directional interaction, this would result in both partners' breathing rates converging to some intermediate rate, one partner's breathing entraining to the other's, or neither partner's breathing entraining. When investigating interactions between partners, participants often described the interaction as intimate and wanted to use the cushion in private environments such as on the sofa or in bed, very different from the lab environment with an experimenter listening to their conversation. In future work, we will conduct in-the-wild studies to understand how participants respond to the interaction in the privacy of their own homes and during periods of long distance in their relationships. In this situation, participants could engage with the prototypes independently of a researcher and over an extended time period, with data collected via diary studies and periodic interviews with both partners. This would give us the opportunity to understand how the device is engaged with and perceived in a real-life setting and how the partners' relationship with and through the device changes over time. In particular, it affords insight into interesting factors such as whether increased usage of the interface deepens the partners' connection and improves their ability to interpret each other's breathing expressions. These future studies require an improved Embreathe system with a more wearable and reliable breathing sensor and a smaller, ideally fully untethered, driving mechanism for the cushion, as discussed previously. They also require careful planning and consideration of factors such as: ensuring data privacy when sharing participants' personal data remotely; ensuring physical safety for participants and a robust system for the interface; and addressing ethical concerns regarding how and when data is collected. While these studies pose challenges, they are worth engaging in to understand the practical application of these technologies and to explore further questions, for example, whether partners wish for asynchronous as well as synchronous interactions with mediated breath.
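Entrainment analyses like the one in study 1 depend on estimating a breathing rate from the sensor trace. A minimal sketch is shown below, counting inhalation peaks with a refractory gap between detections; the function name, parameters and approach are illustrative assumptions, not the analysis pipeline used in the studies.

```python
def breathing_rate_bpm(trace, sample_hz, min_gap_s=1.5):
    """Estimate breaths per minute from a breathing trace by counting
    inhalation peaks (a simple sketch, not the authors' method).

    trace: sensor readings with peaks indicating inhalation
    sample_hz: sampling rate in Hz
    min_gap_s: minimum time between counted peaks, to ignore jitter
    """
    mean = sum(trace) / len(trace)
    min_gap = int(min_gap_s * sample_hz)
    peaks, last = 0, -min_gap
    for i in range(1, len(trace) - 1):
        # A peak: a local maximum above the trace mean, at least
        # min_gap samples after the previously counted peak.
        if (trace[i] > mean and trace[i] >= trace[i - 1]
                and trace[i] > trace[i + 1] and i - last >= min_gap):
            peaks += 1
            last = i
    duration_min = len(trace) / sample_hz / 60.0
    return peaks / duration_min
```

Applied to a clean 12 bpm trace, this recovers the rate to within about one breath per minute; real FSR data would likely need smoothing first.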

CONCLUSION
We have introduced Embreathe, a pneumatic cushion interface and breathing sensor system capable of conveying dynamic breathing patterns from one partner to another in real time. We offered multiple strategies for simulating dynamic breathing patterns via the cushion and characterised the interface's capabilities. Through a series of three user studies, we investigated individual aspects of the mediated breath interactions afforded by Embreathe. We observed that participants' breathing showed signs of entraining to breathing rates of 12 bpm and 16 bpm presented through the cushion. When presenting expressive breathing patterns such as laughter, talking and coughing through the cushion, we found that participants frequently misinterpreted the expressive content and found it difficult to recognise the behaviour of the cushion. When exploring real-time sharing of breath with couples, we found that partners enjoyed using Embreathe to connect with each other and that the interface increased their sense of presence in a laboratory-simulated remote interaction. Using the interface with no other contact (audio or visual) was the preferred mode of interaction for the majority of participants, who wanted to use it during quiet times such as bedtime or snuggling on the sofa. Participant feedback highlighted that Embreathe offers a physical connection that current communication technologies do not afford and is desirable for periods of separation. Future work will investigate bi-directional interactions via Embreathe with couples in their home environments and real long-distance scenarios.
In conclusion, we have extended our understanding of physiological and subjective responses to mediated breath by exploring some aspects of the rich interaction space afforded by Embreathe.

Figure 2: Left: The Embreathe interface and modified mechanism used in the user studies; A) Embreathe cushion, B) Microcontroller (encased ESP32 board) collecting data from breathing sensor, C) FSR module of breathing sensor, D) Breathing sensor waistband (elasticated with hook-and-loop fastening), E) Linear actuator driving the pneumatic mechanism, F) Connection to power cable, G) Motor driver board, H) Microcontroller (Arduino Uno for the user study, ESP32 for remote communication), I) Potentiometer connected to linear actuator arm via pulley, J) 500 ml capacity syringe acting as pneumatic piston, K) Pump and cushion are connected via a 10 m long tube. Middle: Adult wearing the breathing sensor. Right: Illustration of generating a mapping from a 10 s sample of breathing data (blue) onto the full range of motion of the pump mechanism. Max. amplitude of breathing data (x) indicated with black dash-dotted lines, max. displacement of the linear actuator (L) indicated with red dashed lines.

Figure 3: A) The polynomial relationship between linear actuator position and breathing sensor data when the sensor is worn on the cushion (data contains 30 sets of readings separated into inflation (red) and deflation (blue) when running the linear actuator at 3 PWM levels of 150, 200 and 250 from min. to max. inflation). B) Scaling and compression function: all data between the midpoint of linear actuator position and the maximum amplitude of resting breathing (x_b) is scaled by a factor of s to provide the appropriate tidal volume. If the remaining data in the sample is ≤ x_1 (where x_1 is the max. linear actuator position), the data can be linearly scaled by factor s; otherwise, all data ≥ x_b is compressed by a linear compression function based on mapping the maximum amplitude (x_2) to the maximum amplitude of the linear actuator (x_1). A1) Sample clip of giggling before (grey) and after (blue) the polynomial mapping in A. B1) Sample clip of giggling before (orange) and after (blue) the scaling with a compression function as defined in B.

Figure 4: Performance of Embreathe when simulating the six expressive clips used in study 2. 'FSR reading' (blue) is the raw breathing sensor data mapped onto potentiometer position using the mapping described in Section 3.4. 'Desired control' (dotted red) is the FSR reading sampled at 100 ms intervals as received by the microcontroller, used as the setpoint for the proportional control. 'Potentiometer feedback' (yellow dashed) is the recorded potentiometer reading at each time step and the Process Variable for the proportional control.
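The proportional control described in this caption can be sketched as follows, with the potentiometer reading as the Process Variable and the sampled FSR value as the setpoint. The gain and PWM range are illustrative assumptions, not the tuned values used in the Embreathe system.

```python
def proportional_pwm(setpoint, potentiometer, kp=2.0, pwm_max=255):
    """One step of a proportional controller sketch for the linear
    actuator. setpoint is the sampled FSR value; potentiometer is the
    Process Variable (actuator position feedback). kp and pwm_max are
    hypothetical. Returns a signed PWM command (positive = inflate)."""
    error = setpoint - potentiometer
    pwm = kp * error  # drive proportional to the position error
    # Clamp to the motor driver's PWM range.
    return max(-pwm_max, min(pwm_max, pwm))
```

Run at the 100 ms setpoint interval described above, this drives the actuator toward each new FSR sample, with the clamp limiting inflation speed, one plausible source of the lag visible for rapid expressions in Figure 4.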

Figure 5: Breathing rate of all 17 participants (P1, P12, P18 and P21 excluded) prior to (A), during (B) and after (C) cushion activation for each stimulus (cushion breathing rates of 12, 16, 20 and 6 bpm). Cushion breathing rates and entrainment windows of ±2 bpm are highlighted (blue), as well as the entrainment windows of harmonic frequencies to which some participants exhibit entrainment (red).

Figure 6: Confusion matrix of participant responses for all stimuli (1-12) in part 2 of Section 2 of the experiment. Data for all 21 participants is included, excluding 3 clips that were not presented due to a problem with the serial connection.
Figure 7 shows the recorded breathing trace of Couple 1 during each stage of the experiment and the resulting control signal sent to the cushion interface. Through visual

Figure 7: The recorded breathing traces of Couple 1 in each of the six sections. The top row shows the breathing trace (BT) and subsequent cushion control (CC) for the breathing of partner B during sections 1A-3A. The bottom row shows the BT and CC for the breathing of partner A during sections 1B-3B. BT is the reading from the breathing sensor (analog input in the range 0 to 4095), inverted so that peaks indicate inhalation. In each BT plot, dashed lines indicate the calculated minimum and maximum mapped onto the minimum and maximum of the cushion inflation, which is continuously updated as discussed in Section 3.4. CC is the target position given to the linear actuator controller at each time step, with a maximum range of 45 to 665.

Figure 8: For each scenario, no visual/audio contact (top), an audio call (middle) or a video call (bottom), participant ratings are given for three contexts: their previous long-distance experiences (L-D), interactions when wearing the breathing sensor (B-S) and interactions when holding the Embreathe cushion (E). Ratings are given using Likert scales for interaction [from -5 ('Negative: "I hate this way of interacting"') to 5 ('Positive: "I love this way of interacting"')] and level of presence [from 0 ('No sense of presence. Don't feel connected with them at all') to 10 ('High sense of presence. Feel very connected with them')]. Participants are identified by individual markers indicating their couple (1 to 4) and label (A, unfilled symbol, or B, filled). Shaded regions are box plot representations of the data with interquartile ranges shaded darker.

Figure 9: User study 1 and 2 protocol. Shaded stages indicate when participants are asked to provide verbal responses to questions. Data collected at each stage is listed (left), corresponding to the stage with matching colour and symbol.

Figure 10: User study 3 protocol. Shaded stages indicate when participants are asked to provide written responses to questions.

Table 1: Participant Information for Each Couple and Rela-