breatHaptics: Enabling Granular Rendering of Breath Signals via Haptics using Shape-Changing Soft Interfaces

Feeling breath signals from the digital world holds significant value in remote settings. These signals have been visually or audibly represented in previous research, but recent advances in wearable technology now enable us to simulate breath signals via haptics, an intimate and intuitive form of non-verbal interaction. Prior works relied on low-resolution methods of breath signal rendering, leaving a limited understanding of the associated haptic perceptions. Addressing this gap, our research introduces breatHaptics, a wearable that offers a high-resolution haptic representation of breath signals. By combining extracted breath data, a mapping algorithm model, and finely-tuned soft actuated materials, we deliver a granular simulation of human breath. Through a perception study involving force discrimination testing and haptic experience evaluation, we demonstrate breatHaptics' ability to create a rich, nuanced tactile sensation of feeling breath haptically. Our work illustrates the promising role of breatHaptics as part of wearable technologies offering well-being support.


INTRODUCTION
From infants in their mothers' arms feeling the rise and fall of their mothers' chests, to yoga learners guided by a teacher through inhaling and exhaling rhythms to reach a balance between mind and body, breathing is not just a facet of human physiology: it also reflects our emotions, values, and needs [29,33], and intertwines with our daily interactions. Given the unique value of feeling breath signals in daily interactions, being able to reproduce that direct, detailed, and nuanced sensation of breath in the digital realm holds significance.
Haptic interfaces, providing direct tactile feedback, offer unique opportunities for direct and intimate interactions beyond traditional visual or auditory channels. Recent research has delved into wellness-focused haptic wearables [11,51], and into enhancing tactile resolution for user engagement in daily notifications [17,35] and motor guidance [24,66]. While most breath-related systems are visually or auditorily based, we recognize that haptic delivery of breath signals presents unique value and potential.
Rendering breath signals with high resolution goes beyond basic parameters like frequency or amplitude. Instead, it delves into providing a more detailed and nuanced sensation of breath. A "fine-grained rendering" ensures that breath signals are captured, processed, and simulated with high fidelity, mirroring the original breath curve as closely as possible. The resulting granular tactile feedback is valuable for many applications where users need to feel and interpret breath in detail, such as remote bonding, breath training techniques, and directed breathing for emotional support.
Yet, the challenge remains in encoding breath signals into detailed haptic feedback and delivering it to users. Current haptic devices that simulate breath signals often capture only basic parameters such as amplitude and frequency, resulting in low-resolution tactile sensations. This overlooks the nuanced complexity of breath data. Consequently, the richness of breath haptic interactions, especially considering the pivotal role of breath in our daily experiences, is undermined. There is a noticeable gap in achieving fine-grained haptic simulation of breath signals and a corresponding evaluation of haptic perception of such signals.
To address this challenge, we developed our breath haptic wearable device using soft actuators, designed an algorithm model for high-resolution haptic representation of breath data, and conducted evaluations involving haptic force discrimination and haptic experience. Additionally, we showcased several applications to demonstrate the wide-ranging potential of our approach.
Our work advances this field by: 1. Introducing a haptic wearable capable of fine-grained rendering of breath signals, facilitated by soft actuators. 2. Evaluating the effectiveness of our device through a perception study on haptic force discrimination and haptic experience. 3. Discussing applications to highlight how a rich and nuanced tactile sensation of breath signals enabled by our wearables can open new avenues for well-being applications.

RELATED WORK

Leveraging Human Breath in HCI
Human breath, as one of the physiological signals, can indicate social presence and enhance interpersonal connection [9]. Breathing patterns correlate with cognitive and affective states such as emotion and attention [29]. Breath signal parameters such as amplitude, frequency, and regularity correlate with anger, joy, fear, and other emotions [56]. Breathing can also serve as a social signal [30]. The changes and patterns of one's breathing can inform oneself or others about the person's feelings [5,29,30]. Thus, a rising number of works in the HCI field have attempted to build breath-sharing systems that communicate someone's affective states to others or themselves.
Most breath-sharing systems are visually or auditorily focused, with less emphasis on haptics [57]. Yet, the muscular movement of breathing, linked with interoceptive perception, can create intimacy and lower cognitive load through passive touch [15,19,64], bypassing the need for visual or auditory signal translation [38]. Hence, we propose exploiting breath for haptic interaction techniques, enhancing remote health and well-being.

Haptic Wearables and Soft Actuators
Haptic wearables, as an emerging domain in HCI, have gained popularity for well-being applications [18,23,35,48,69]. In developing natural and intuitive haptic interfaces worn on the user's body, prior literature has identified the rigidity of a "machine-like" touch in existing haptic devices, which differs from direct human-to-human touch [32]. This rigidity in the haptic experience links back to the materials touching the user.
Soft machines, with comfort and flexibility akin to skin touch, are preferable for haptic applications due to their similarity to human skin's Young's modulus, unlike most metals [40]. Reliable and robust soft actuators are suited for future haptic devices offering richer feedback [74]. Dielectric elastomer actuators achieve high frequency and strain but need high voltages and yield low forces [7,37,41]. They hence typically have a thin single-film design to reduce the required actuation voltage, resulting in lower force outputs, often around 10 mN [16]. Electromagnetic actuators, though fast and controllable, are often bulky [6,55,71]. This type of actuator often utilizes cables or twisted strings that are rigid compared to human skin, and is predominantly used in exoskeletons for kinesthetic haptic feedback [74]. Given our purpose of rendering cutaneous forces on the skin, we found that pneumatic elastomeric actuators made of silicone meet our design criteria. Silicone elastomer's Young's modulus is about 10^5 Pa, matching that of human skin (<10^6 Pa) [40]. It is soft and safe for skin, and the actuators are non-electric, low-cost, and easy to fabricate [2,26,39,44]. Considering the need to resolve rigidity and provide natural, skin-like haptic experiences, pneumatic elastomeric actuators have the potential to become the breatHaptics material of choice.

Review of Haptic Representations of Breath Signal
Current breath representation methods primarily extract several breath parameters (frequency, amplitude, and/or movement along a sine wave [57]) to influence interactive outputs, like changes in visual elements [12], alterations in sound patterns, and tactile patterns illustrated in Fig. 2 below. However, these approaches represent only part of the breath data due to the breath signal's complex, high-dimensional, and noisy nature. Many wearable systems strive to simulate human breathing using force haptics (Fig. 2), but the breath representation remains limited to producing breath tempo or a low-fidelity breath signal, not a detailed waveform simulation. Subsequently, there is also a lack of research on how these haptically represented breathing signals are differentiated and interpreted by users. This simplification and limited resolution invariably reduce the range of sensations conveyed and felt, imposing constraints on many well-being applications where nuance is valued in conveying subtle changes and emotions.
For these reasons, most of the applications are limited to the following functions: 1. Tactile Notification: Wearables like neck pendants [22], animal toys [1], and bracelets [52] use vibrotactile feedback to deliver breath-like patterns to users, but these messages are often rudimentary and lack the intricacies of breath sensations. 2. Simple Breathing Guidance: Devices such as vibrotactile units [43,59] guide users in breathing exercises but often lack the depth and granularity of actual breath patterns. 3. Breathing Pacer for Relaxation: These primarily fetch breath frequencies to provide relaxation and lack the intricate variations and nuances that come with a full breath cycle. Examples include pneumatic devices like chairs [73] and fabrics [67,68].

Our work achieves a complete waveform simulation of the breath signal, enriches tactile sensations of breath, and unlocks new opportunities for applications (Fig. 11). One technical challenge faced in existing works is that it is nontrivial to render the granular breath signal on a haptic device, because both the breath signal and the actuation of the device are noisy, multi-dimensional, and highly non-linear. Our proposed model, detailed in the implementation section, addresses this by mapping the complete breath curve onto the force curve used for tactile sensation, accounting for the non-linear complexities of breath data. In this work, we hope to provide a fine-grained tactile experience of breath signals and explore its potential for well-being applications.

Fabrication and Hardware
Pneumatically activated structures offer a broad spectrum of shapes and dynamic actuation modes. We utilized the 'Baromorphs' pneumatically actuated structure as our actuation mechanism, as described by Siéfert et al. [63]. Its quick transformation from flat to 3D, unrestricted by size, allows for versatile use as a haptic wearable on any body part. We designed our lightweight form factor for the skin, inspired by previous soft actuated wearables such as [31,42] (please refer to Supplementary Figure 1 for the fabrication procedure). We first assessed each actuator's force output under varying pulse width modulation (PWM) using a Vernier Breath Force Sensing belt. We actuated each shape with a calculated airflow from the air pump (Section 3.3), controlling airflow speed by alternating the pump and valve states using PWM. Higher PWM duty cycles increased pneumatic pressure, inflating the actuator and boosting the force sensor's output. We then measured differing PWM ranges for inflation and deflation (inflation force outputs from 40% to 100% PWM and deflation from 70% to 100% PWM). For inflation and deflation, the sensor could not detect any force below 40% and 70% PWM, respectively. Example results are shown below in Fig. 5.
We used three example geometries (Triangle, Ellipse, Square) and an auto-control measurement algorithm, adjusting airflow speed from 40% to 100% PWM during inflation and deflation.These measurements yielded a force characterization set to construct actuation signal control models in Section 3.3.
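The characterization loop described above can be sketched as follows. `set_pwm` and `read_force` are hypothetical stand-ins for the actual pump/valve driver and the Vernier force sensor readout; the duty-cycle ranges come from the measured detection thresholds (40% for inflation, 70% for deflation).

```python
# Minimal sketch of the PWM sweep used to build the force characterization set.
INFLATE_RANGE = (40, 100)   # % duty cycle with detectable force output
DEFLATE_RANGE = (70, 100)

def sweep(mode, step=10, set_pwm=None, read_force=None):
    """Sweep PWM duty cycles over the detectable range for `mode`
    ('inflate' or 'deflate') and return (duty, force) pairs."""
    lo, hi = INFLATE_RANGE if mode == "inflate" else DEFLATE_RANGE
    samples = []
    for duty in range(lo, hi + 1, step):
        set_pwm(mode, duty)                   # drive pump/valve at this duty cycle
        samples.append((duty, read_force()))  # log the resulting force reading
    return samples
```

Repeating this sweep for each actuator geometry yields the (PWM, force) data that the mapping model in Section 3.3 is trained on.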

Mapping Algorithm
Our algorithm pipeline is shown in Fig. 6. The mapping algorithm generates output actuation signals (PWM vs. time) from a breath signal input, aiming to achieve a highly granular simulation of the breath signal. We obtain the breath signal using a breath sensor (the Vernier Breath Force Sensing Belt, worn on the upper chest area, which measures the force sensed by the belt upon chest inflation). This signal is used as input to our algorithm, which then controls the inflation and deflation of the soft actuators to simulate the breath signal. Our motivation for the mapping is to approximate the human chest movements during breathing through the wearable's actuation upon pumping air in.
To approximate an input breath signal, the device needs to achieve a specific inflation amplitude and speed at every time step. To do so, we need to correctly adjust the PWM on the fly, which requires an accurate and fast mapping from inflation amplitude and inflation speed to the PWM. However, the actuation of our device inflates a soft membrane with an irregular shape and interconnected channels using PWM-controlled airflow. This is a noisy, multi-dimensional, and highly non-linear process that traditional fitting methods, such as polynomial fitting, failed to fit in our initial experiments. This led us to use a multilayer perceptron neural network [53], a universal function approximator that can fit a wide range of high-dimensional functions with noise and non-linearity.

Model Architecture
We adopted the PyTorch framework [54] to architect a Multilayer Perceptron (MLP) model. This model was specifically trained on data sourced from the mechanical characterization of pneumatic actuators, as depicted in Fig. 5. The model accepts two input parameters: 1. the force magnitude F, and 2. the derivative of the force magnitude, F′, originating from actuation. The primary output of the model is the Pulse Width Modulation (PWM) value, crucial for actuator control. More specifically, to yield a target force magnitude F(t) and its derivative F′(t) at an instant t, the actuator is instructed to use a PWM value of PWM(t).
The MLP architecture features two hidden layers. The first layer contains 64 neurons, while the second houses 32 neurons. This configuration is visually represented in Fig. 6b. To fine-tune our model, we employed the AdaGrad optimization technique available within PyTorch [54]. Our training dataset comprises triplets (F(t), F′(t), PWM(t)), where: 1. (F(t), F′(t)) are the input features, and 2. PWM(t) is the ground truth output. During the training process, inflation and deflation were treated distinctly in separate loops. The PWM outputs from both loops were combined to generate a continuous sequence of inflation and deflation events. For each distinct actuator, we trained its own mapping model. The dataset for each actuator contains approximately 4400 data points, capturing both inflation and deflation dynamics at varied PWM settings.
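As a minimal sketch, one such per-actuator model could be set up in PyTorch as below. The layer sizes and the AdaGrad optimizer follow the description above; the learning rate, epoch count, and MSE loss are our own illustrative assumptions, not reported values.

```python
import torch
import torch.nn as nn

# Per-actuator mapping model: inputs (F, F') -> output PWM duty cycle.
model = nn.Sequential(
    nn.Linear(2, 64), nn.ReLU(),   # first hidden layer: 64 neurons
    nn.Linear(64, 32), nn.ReLU(),  # second hidden layer: 32 neurons
    nn.Linear(32, 1),              # single PWM output
)
optimizer = torch.optim.Adagrad(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

def train(features, pwm_targets, epochs=200):
    """features: (N, 2) tensor of (force, force-derivative) pairs;
    pwm_targets: (N, 1) tensor of ground-truth PWM values."""
    for _ in range(epochs):
        optimizer.zero_grad()
        loss = loss_fn(model(features), pwm_targets)
        loss.backward()
        optimizer.step()
    return loss.item()
```

In the described system, this training would be run twice per actuator, once on the inflation samples and once on the deflation samples, and the two resulting PWM streams stitched into one continuous actuation sequence.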
The details of the multilayer perceptron model are shown in Fig. 6b and Fig. 6c. Here we used the measurement data from the Triangle as an example. Once the model is established (Fig. 6c), given a breath signal (which can be obtained from a human subject via a breath sensor, as shown in Fig. 6d), the corresponding control signals (PWM) vs. time (Fig. 6e) will be generated. We thus established a corresponding mapping model for each actuator in our library.
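Once a mapping model exists, generating the control signal for a recorded breath curve reduces to evaluating the model at each time step. A minimal sketch, in which the `pwm_model` callable stands in for a trained per-actuator mapping and the 0-100% duty-cycle clamp is our own assumption:

```python
import numpy as np

def breath_to_pwm(breath_force, dt, pwm_model):
    """Turn a sampled breath-force curve into a PWM-vs-time control signal.
    `pwm_model` takes (force, force_derivative) and returns a duty cycle."""
    deriv = np.gradient(breath_force, dt)            # approximate F'(t)
    pwm = [pwm_model(f, fp) for f, fp in zip(breath_force, deriv)]
    return np.clip(pwm, 0.0, 100.0)                  # keep duty cycle in 0-100%
```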

Validation.
We evaluated the performance of our algorithm model, and an example of its output is visualized in Fig. 7. We calculated the Symmetric Mean Absolute Percentage Error (sMAPE) [27,28] for our scale-independent time-series data, given by: sMAPE = (100/n) Σ_t |F_t − A_t| / ((|A_t| + |F_t|)/2), where A_t is the breath signal and F_t the actuation force signal at time step t. The resulting sMAPE is approximately 11.3% between the actuation force signal and the breath signal. Fig. 7 also shows that the peaks and valleys line up between both sets of data, and that the actuation curve captured the same shapes as the breath curve. This indicates a relatively precise mapping from the algorithm, which enables us to simulate the breath signal with significant granularity. Our system, as depicted in Fig. 4, incorporates Control Model Adjusting. Considering that each application scenario carries unique haptic design requirements, such as actuator shape, signal intensity, and applied body location, our system offers a GUI slider bar (Fig. 4) to scale the Actuation Signal Output to the user's liking, based on a percentage from 0% to 100% (Fig. 7). The importance of this adaptability feature is underscored by discrepancies between the mechanical properties measured on actuators and users' haptic perceptions. Design goals and haptic criteria for applications are often descriptive, with minor quantification. Therefore, our Control Model Adjusting function aids users in achieving their desired tactile sensation, a concept further demonstrated in the subsequent perception study section.
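The sMAPE comparison between the breath curve and the rendered force curve can be computed directly from the two sampled series:

```python
def smape(actual, predicted):
    """Symmetric mean absolute percentage error, in percent.
    Assumes corresponding samples are never both zero."""
    n = len(actual)
    return (100.0 / n) * sum(
        abs(f - a) / ((abs(a) + abs(f)) / 2.0)
        for a, f in zip(actual, predicted)
    )
```

A perfect reproduction yields 0%, and the metric is symmetric in the two series, which makes it suitable for comparing signals on the same arbitrary force scale.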

PERCEPTION STUDY
To address our second research question, understanding the effectiveness of our approach in delivering a rich and nuanced sensation of the breath signal, we first assess our device's primitive affordance of force discrimination from cutaneous haptic feedback (4.1.1), which is important for achieving granular tactile sensation of breath. Then, we evaluate how users perceive our fine-grained simulation of the breath signal, which is crucial for applications (4.1.2).

Method and Procedure
Our study involved 18 participants, aged 18 to 35, who were randomized into three groups, with each group assigned to a case study prototype (Fig. 8). This 40-minute study, which compensated participants with $15 each, tested the prototypes on three body locations. The chosen wearing positions (Palm, Abdomen, Upper Back) were selected to span areas from lower spatial acuity (like the palm) to higher spatial acuity (like the abdomen and upper back) [36]. The three locations are also easily accessible, can comfortably accommodate wearable devices, and represent areas where individuals might commonly wear devices or sensors.
We opted for a between-subjects design, which ensures that each participant is exposed to only one condition, eliminating potential biases or carryover effects from one condition to another and thereby minimizing individual differences and ordering effects in the study results. We conducted an ANOVA test on pre-survey data (Weight: p=0.74>0.05, F=0.3; Height: p=0.32>0.05, F=1.20; Familiarity with Haptic Devices: p=0.11>0.05, F=2.5; Experience in Design: p=0.64>0.05, F=0.45) to verify that the groups were statistically similar, supporting the validity of our study design. The p-values obtained were greater than 0.05, confirming that the group differences were not statistically significant.

The first step is our Force Discrimination Testing, which serves as a primitive psychophysical evaluation of how the wearer's body distinguishes the cutaneous force from the pneumatic actuation. This section includes a Free Magnitude Estimation Test, followed by an analysis of the test result. Developed by Zwislocki & Goodman, the Free Magnitude Estimation Test is a psychophysical method adopted by a range of haptic perception studies to quantify the user's ability to discriminate the magnitude of object properties, such as force [75]. Since the force output from our actuators is transformed into cutaneous haptic force feedback, we focused on force amplitude as the free magnitude estimation parameter. Our computer algorithm randomly ordered five fixed actuator inflation force amplitudes from 0% to 100%, controlled by the computer and hardware module and presented at the same frequency (about 0.3 Hz). We presented one level of actuation at a time and waited for the participant's response. After they reported a free magnitude number for the current level, we proceeded to the next level, with the constraint that reported numbers be non-negative [47].
For data analysis, we fit each participant's reported numbers onto the same scale by calculating the logarithms of each number relative to their mean [47], as shown in Fig. 9. To avoid priming, we did not debrief participants on the concept of breath haptics until the end of this discrimination test. We then demonstrated the three prototypes, used self-recorded breath signal samples, and presented our Control Model Adjusting module.
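One way the per-participant rescaling can be implemented is sketched below. The base-10 logarithm and the exact centering step are our reading of the cited procedure, not code from the study.

```python
import math

def normalize_estimates(reports):
    """Put one participant's free magnitude estimates on a shared scale:
    take base-10 logs and center them on the participant's mean log
    (equivalent to the log of each report over the geometric mean)."""
    logs = [math.log10(r) for r in reports]
    mean_log = sum(logs) / len(logs)
    return [x - mean_log for x in logs]
```

Centering on the mean log removes each participant's idiosyncratic number scale, so estimates from different participants can be aggregated as in Fig. 9.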
Participants were asked to adjust the GUI slider bar at the beginning to find the intensity level of best fit. We did not ask them any questions during the process, so as not to influence their attention while experiencing the haptic sensation. The Likert-scale survey questions were given only after participants, having spent time experiencing the haptic effect, declared that they had arrived at their desired intensity level. At the end, we interviewed them with questions about their general reflections on the experience and collected subjective feedback on the sensations of breath from our demonstrations.

Fig. 9 shows the force discrimination result: our participants could clearly distinguish a significant range of different forces from actuation, at force amplitude levels from 20% to 100%, showing the device's efficacy in providing fine-grained tactile sensation. A larger standard deviation was observed at the 20% amplitude level compared to higher values such as 100%, indicating larger individual variation at the lower end. We noticed varying perceptions based on body locations and actuator shapes. Triangle-shaped actuators had a lower discrimination range than Square ones, while Ellipse-shaped actuators showed the best force level discrimination. This is likely due to factors like body location perception, actuator surface area, and geometry (for detailed actuator parameters, refer to Supplementary Figure 2). Overall, our results indicate that our device affords granular haptic perception: when its actuation is controlled to simulate breath data in high resolution, it can offer granular tactile sensation.

Result and Discussion
While this psychophysical method is often used with haptic devices of other mechanisms to characterize force perception before building applications, such as [35,62], to our knowledge there is limited prior work, not only on breath haptic systems but on pneumatic soft actuators in general, that quantifies users' haptic force discrimination. Hence, our new scientific data also demonstrate the promising role of pneumatic soft actuators for haptic devices.

Participants' responses underscore a rich and nuanced feeling of the breath signal. In Fig. 10, the results show that haptically rendering breath signals with high granularity impacted the five HX standards differently. In particular, granularity itself is demonstrated in being able to differentiate nuanced patterns in Expressivity [58]. Shapes with higher granularity (Expressivity) tend to evoke a higher rating of realistic breath sensation (Realism) and perceived engagement (Immersion), while the shape with lower Expressivity (Square) elicits more feel-good (Autotelics) and harmonious (Harmony) responses. The ratings also align with the force discrimination testing results (Fig. 9): the shape with a higher discrimination range (Ellipse) also rates higher on granular patterns (Expressivity). Notably, a higher resolution (greater discrimination range) does not necessarily translate into a more positive breath haptic experience.
The Square shape, despite having the least discrimination range, yields the highest Autotelics scores. Moving forward, our results imply that customizing actuators for versatile applications holds great benefits.
Through the Control Model Adjusting phase, participants were able to achieve their desired haptic effect. Fig. 10 illustrates the empirical results of intensity customization from participants, with a standard deviation (SD) of about 0.15 across the three prototypes. The effectiveness of this adjusting process was emphasized by many participants, who sought to strike a balance between comfort and explicit haptic signals (P1-P4, P6-P10, P13-P18). Some participants sought to avoid an artificial feeling, noting that higher percentage values would be distracting or overwhelming, but lower values would be too weak for their liking (P2, P10). Going forward, designing applications that also enable user adjustment toward a desired breath haptic effect can serve as a guiding principle.
In the Overall Experience feedback collected from the post-interview, responses align with our goal of creating an intimate, rich, and nuanced tactile sensation of breath. The actuation patterns were described as organic and gentle, and evoked a sense of realism through their irregular, lifelike rhythms (P1, P3, P8, P17). Using breatHaptics felt like an intimate moment, reminiscent of feeling someone's breath (P3, P4, P8, P10, P12, P17). In their feedback on the breatHaptics device, almost all participants landed on descriptions of softness and tenderness (P1-P18). The textures of the material drew comparisons to sashimi and a second skin, and were praised for their sleek, minimal, and elegant qualities (P2, P3, P6, P16). In a few instances of negative comments, some participants expressed feelings of weirdness when first exposed to the actuation patterns (P4, P8), but reported gradually adapting and even feeling comfortable over time. Future work could further explore this adaptation process to support user acceptance of this non-traditional sensory feedback.
To summarize, a fine-grained simulation of breath signals on our prototypes evoked diverse user perceptions.High resolution rendering coupled with its control model adjustments in the breath haptic interactions contributed to the haptic effects.Participants value the ability to tailor actuators and the control adjustments to their needs.Our overall findings suggest that breatHaptics offered participants a unique, intimate, and sensory-rich experience for their perception and interaction with breath signals.

APPLICATION DISCUSSION
The subjective feedback collected in our study enabled us to pinpoint users' thoughts about the breatHaptics demonstration and its future potential in delivering granular breath signals in practical settings. Participants related their haptic experience to a form of remote communication between people, such as "holding someone's hand" (P4) and "remotely hugging somebody" (P6). This feeling of connectedness even extended to non-humans, such as "petting a breathing cat" (P10). Their experience of interpreting a breath signal extended to feeling the remote presence of someone (P4, P5, P9, P12, P17).
Various HCI studies have explored methods of promoting remote co-presence: "the sense of being together with another" [3,49]. Recognizing the limitation of traditional media such as text, visuals, or audio that lack physical intimacy, we see value in the fine-grained simulation of breatHaptics, which participants described as realistic and immersive in creating the sensation of feeling someone's breath. This capability not only fosters connectivity between individuals in a tangible and mobile way, but also opens a new mode of remote co-presence.
2. An Engaging Form for Online Breath Training (Fig. 11.d,e,f). Some participants mentioned the influence of the actuation on their breathing style, suggesting it mildly aids in refining their own breathing. The haptic feedback, as per some users, was more effective for focusing on breathing than visual or auditory signals (P13, P16, P17). Increased attention from the tactile feedback was reported, which improved concentration on breathing (P7, P8).
Breathing practices, such as yoga and singing, require the individual to be self-aware of their body and actively control their breath using diaphragmatic muscle movements [61]. In traditional learning, instructors guide students through breath synchronization and sometimes touch the target muscle area to direct students' attention. To support online breathing practices, previous studies have explored displays, wearable devices, and virtual reality environments [21,45,70]. We advocate a more expressive and direct approach to enhance these practices (for which the Ellipse patches could be a candidate). The pads, placed on the diaphragmatic muscle area, provide a high-resolution breath signal for clear guidance in learning specific breathing techniques.
3. A Symbiotic Companion for Affective Support (Fig. 11.g,h,i). Many participants suggested that our breath haptic signal could be used to alleviate stress and regulate emotions through slower breath rhythms (P2, P4, P7, P8, P17). P12 suggested the paced inflating-deflating pattern could help manage strong emotions or unhappiness, while P5 found the experience unobtrusive. The feeling of relaxation was a recurring theme in the feedback we received, and participants found the breatHaptics prototypes comfortable and appealing. These highlight the potential of breatHaptics as a tool for affective support.
The link between breath patterns and emotions is well-studied, with research showing that physically mirroring someone's breathing can provide calmness and emotional support [8,56,60]. Our wearable could function symbiotically, reflecting the user's emotional state via synchronized breathing. By initiating a slow, calming breath cycle and delivering rich and nuanced tactile sensations, it intervenes in disrupted breath patterns caused by negative emotions like anger or stress. This shared breathing activity between user and wearable could enable valuable interventions.

LIMITATION AND FUTURE WORK
Although participants' feedback provides meaningful insights into breatHaptics applications, we acknowledge that future field studies or participatory design are needed to further understand each application scenario. It could be worthwhile to conduct in-the-wild testing and evaluation workshops to capture the user experience when the devices are used in applications.
Currently, we pre-made the actuator geometries. With advancements in computational simulation and mechanical analysis [50], we propose developing an inverse design tool for personalized actuator geometries based on the user's body shape and haptic feedback requirements.
Since we focus on high-resolution simulation of breath, real-time breath signal connection, though not explored here, is a practical extension and a matter of implementation achievable via Bluetooth sensor integration. Further advancements could also incorporate the breath sensors directly into the haptic wearable, replacing external sensing belts. Stretchable force sensors, such as liquid-metal-based sensors [72], may allow for a lighter [14], more efficient system in future designs.

CONCLUSION
In this study, we presented breatHaptics, a haptic wearable enabling fine-grained tactile sensations of breath through soft actuators. We implemented a haptic wearable device that supports multiple actuator geometries to achieve a high-resolution emulation of the breath signal, through a mapping algorithm model with customizable control model adjusting. Our system was evaluated through a perception study examining how users distinguish cutaneous forces from the actuation and how they interpret the breath signals delivered to them. We also discussed application scenarios that our device could support, inspired by the subjective feedback received. Finally, we discussed how our approach could be improved to further strengthen the role of wearable haptic devices in offering well-being support.

Figure 2 :
Figure 2: Related Work of Breath Haptic Interactive Systems

Figure 6 :
Figure 6: The Algorithm Pipeline (Triangle as an example). 6a: Force Amplitude Measurements from Fig. 5 at Varying PWM; b: Multilayer Perceptron on Force Amplitude Measurements; c: The Breath Signal Input, whose force amplitude is then scaled to match the actuator's amplitude range; d: PWM vs. Actuation Force Dynamics, the model built for mapping the Breath Signal to the Actuation Signal; e: The Actuation Signal Output.

4.1.2
Breath Haptic Experience Evaluation. In the second step, we adopted the Haptic Experience (HX) Model developed by Sathiyamurthy et al.: Autotelics, Harmony, Expressivity, Immersion, and Realism [58]. To better evaluate in our context, we constructed the following key questions as our Likert-scale questions: Autotelics: This experience is (1 Not pleasant at all - 7 Very Pleasant); Realism: I feel like I experienced someone's breathing (1 Not at all - 7 Very Well); Harmony: The feeling from this experience is (1 Very Distracting - 7 Not Distracting at all); Expressivity: I can feel the differences in inflation patterns (1 Not at all - 7 Very Well). For Immersion, we included two questions covering the internal and external aspects of breath-signal sharing: I could follow or be guided by this breathing pattern (1 Not at all - 7 Very Well); If this were a breath signal from your friend, I could feel their body states (1 Not at all - 7 Very Well).

Figure 9 :
Figure 9: Result of Force Discrimination Testing, adopted from Zwislocki & Goodman -Y-axis: Participant's Free Magnitude Estimate of Perceived Amplitude; X-axis: Percentage of Max Force Amplitude controlled in trials.

Figure 10 :
Figure 10: Breath Haptic Experience. a: Five dimensions of the HX Model by Sathiyamurthy et al.; b: Participants' Control Model Adjusting.

1. A New Expression of Remote Co-presence (Fig. 11.a,b,c)

Figure 11 :
Figure 11: breatHaptics as a unique way of building well-being applications. We demonstrate some examples: a: Connecting parent and child who live apart; b: Feeling lifelike pets by touching their breathing bodies in virtual reality; c: Collecting the breath memoir of a specific moment; d: Assisting online yoga classes to boost learning and engagement; e: Instructing breathing techniques in singing practice; f: Breathing guidance in strength training; g: Providing an ambient rhythmic breathing guide; h: Promoting calmness to alleviate stress before an important event; i: Mirroring one's own breath to provide companionship.