Augmenting Embodied Learning in Welding Training: The Co-Design of an XR- and tinyML-Enabled Welding System for Creative Arts and Manufacturing Training

Metal welding is a craft manufacturing skill that can be unusually difficult to externalize and represent to novices. Building competency requires an apprentice to iteratively practice embodied skills and sensitize themselves to a sensorially complex practice. To explore these challenges, we organized a series of co-design workshops with a youth program in welding and fabrication. Working with eight instructors and four students, we identified opportunities for mixed reality, sensing, and tinyML processes to augment welding training and practice. This resulted in an extended reality (XR) welding helmet and torch that enhances the embodied learning of welding in three key ways: biometric sensing enhances mindfulness and stress management in sensorially challenging environments; acoustic sensing focuses learner attention on non-visual cues of weld performance; and combined motion-sensing and visual XR feedback helps improve proprioceptive and embodied learning. These features are assessed, and we offer design implications for augmenting novice learning of craft practice with XR approaches.


INTRODUCTION
Many industries rely on skilled welders to produce the products we use daily. Attracting and retaining novice welders within training opportunities has, however, become increasingly challenging. The American Welding Society predicts a deficit of 375,000 welders in the United States by 2024 [20]. The shortage of welding operators can be attributed to two main factors: negative perceptions surrounding the profession and the challenge of replacing retirees with younger generations [20]. This suggests a need for increased creativity in training young welders. Training welders requires developing complex embodied knowledge across eye-hand coordination, movement, proprioception, and sound. This embodied knowledge is acquired through in-situ apprenticeship and hands-on interactions with tools and materials [7]. These attributes can be complex to understand and hard to replicate in training scenarios. As such, systems are needed that can support the development of this embodied knowledge in a more experientially efficient and rich way than simply "welding more".
Prior research and commercial products have explored XR (extended reality) and VR (virtual reality) as possibilities for welding augmentation [23,29,31,33,34], but these systems are generally designed for classrooms or settings without active welding. Presently, the emphasis primarily lies on enhancing eye-hand coordination and psychomotor skills.
Relatedly, the HCI community is interested in employing technologies such as XR, gesture tracking, computer vision, and machine learning to understand nuanced embodied behaviors. These technologies enable instantaneous monitoring and feedback that would otherwise require close observation by an expert craftsperson, and such instructors may not always be accessible in these shrinking crafts. Additionally, welding presents a demanding environment both for trainees and for embodied interactive technologies. Welding is extremely loud; this can be off-putting for novices and challenge their attention during welding. UV emissions, IR radiation, and extreme heat can challenge sensors and require protective helmets, gloves, and clothing. Protective gear can restrict sensory-motor perception and limit opportunities for haptic and sensory feedback from systems. As a result, systems that support and augment welding's rich embodied practices are underexplored.
Our work responds to this need by creating and evaluating a functional XR- and ML- (machine learning) enabled welding system that supports augmented welding training. We approach this work as an iterative co-design process with an after-school program that provides welding training opportunities to regional youth. Workshops drove need finding, and we used their environment and programmatic context to ground scenario making. This led to the development of an integrated software and hardware system that supports augmentative interactions in XR for welding training. Contributions of this work include this system and our approach to augmenting embodied understanding - through eye-hand coordination, active listening, and mindfulness - to help novice learners manage and navigate a challenging craft practice. We additionally share the design and findings of experiential workshops on assistive XR with training welders, and discuss the use of sensing, ML, and XR to understand and supplement the learning of embodied crafts.

RELATED WORK

Embodied Learning
The rise of mixed-reality technology has had a profound impact on facilitating immersive experiences that extend beyond real-world environments. This has resulted in diverse approaches to exploring and utilizing such opportunities, ranging from entertainment to educational applications [10]. As evidence of this trend, prior research has been conducted on guidelines for mixed reality and the definition of the virtual self. These endeavors seek to deepen our comprehension of the relationship between the virtual world and physical embodiment [1,11,17].

Mixed reality training for Hand-Crafts
The current teaching techniques for crafts and hands-on skills, including welding, often rely on traditional methods rather than utilizing modern technology. There have been some approaches leveraging XR simulations to accelerate training in traditional crafts [6] and to enable students to enter the workforce more rapidly [8].

Mixed reality training for Welding
Previous research has created AR, VR, and specialized peripherals for welding simulators that allow users to experience a cohesive and immersive environment for setting up, executing, and validating welding processes [23,29,31,34]. Publicly available products such as the Miller AugmentedArc, Lincoln Electric VRTEX 360, Soldamatic, and guideWELD® VR welding simulator help beginners develop muscle memory safely and effectively, but use costly and abstracted systems. Currently, research and products are limited to non-in-situ training and focus predominantly on psychomotor skills [12]. As White points out, the acoustic consistency of welding arcs provides critical feedback to experienced welders, allowing them to adjust factors like gun height and angle when vision is limited and force feedback is minimal during the actual welding process [32]. Sound is overlooked, but it is an essential sensory cue for novice students learning welding. White et al. developed a system for training students that includes sound data; however, this data is not used to provide feedback to students [33].

We worked closely with the Industrial Arts Workshop (IAW), a non-profit youth welding training program, to inform our approach, which was grounded in co- and participatory design. A series of three on-site community workshops were conducted to explore welding instructor and student needs, achieve a working understanding of learning contexts, and identify design opportunities along key moments of the curriculum.

Approach
From the outset, we chose to emphasize a participatory, co-design process rather than more classically theory-based or researcher-led approaches. With this practice, we consider our community partners as co-researchers and designers, rather than research subjects or test users [19].
The goal is for design partners to become more than "a passive object of study" [28], but in order for them to act as an "expert of their experiences" [30] and shape design research, suitable tools and processes must be offered.
However, tools and techniques for co-designing mixed reality experiences are limited. Though methods range in fidelity, from those inspired by paper prototypes [13,21] to highly developed electronic software [2], work aimed at reducing time and technical barriers to augmented and mixed reality prototypes is largely focused on professional design teams. Rapid prototyping tools for AR also lack transferability due to their bespoke nature [3,15,16,22]. As a result, we opted to focus on storyboarding and scenario making as an accessible means to co-design.
Our approach takes form in three participatory sessions spanning the course of four months. To fully value the partner site and understand the embodied experience of metalwork, the entire team (10 members) took courses and training from instructors at IAW, deferring to their experience in this domain. Workshops, focused on co-design and community-led activities in local contexts, contributed to ideation, validation, and determination of design direction.

Workshop 1: Evaluative Focus Groups
An initial two-hour diagnostic workshop was conducted with 6 instructors and staff at IAW to understand their current curriculum, learning objectives, and challenges faced by students and instructors. Participants were asked to map out each phase of the current instructional experience, along with its related goals, required skills, success metrics, challenges, and opportunities. Material flexibility afforded emergent, group-initiated methods of organization, with participants altering the provided template to create new ways of visualizing and mapping, and displaying a preference for collaborative activities. Insights provided working context from which we identified five key opportunity spaces for further exploration.

Mindfulness practices within skill building
A notable characteristic of the partner site's pedagogy was the emphasis on soft skills and self-development.
Many instructors expressed the importance of confidence-building and mindfulness in their teaching philosophy: "Having the mindfulness to be able to recognize when one is suffering and then give that suffering compassion...when a student is frustrated with a difficult welding objective, if that student is able to stop and recognize that frustration, anger, or whatever they are feeling. This is mindfulness." The presence of meditation and positive self-talk training in IAW's programming was emphasized to be beneficial for both task-specific and holistic student outcomes.

Creativity and collaboration
Creative expression and excitement for both technical and artistic aspects of welding were prioritized in the partner site's pedagogy. Instructors cited the arts-based curriculum and opportunities for social engagement to "make friends and work closely with peers" as unique strengths of the program. These priorities informed the exploration of scenarios and features that build upon existing creative practices and notable aspects of the curriculum, such as collaborative design refinement and composition.

Orientation and recruitment
Current recruitment methods occur off-site, with limited capacity for prospective students to immerse in or identify with the trade.The ability to bring welding to prospective students through simulated experiences would expand the partner site's reach to students who might otherwise lack understanding or interest.

Sensory sensitivity exposure
The welding environment is sensorially rich but can be overwhelming to beginner welders, with instructors citing this as a key barrier: "One of the first challenges is decreasing the chaos of the experience (reconciling and managing it) to establish focus on a very small space". Beginners do become accustomed to these sensory stimuli with repeated encounters over time. This suggested valuable opportunities for system-mediated regulation and adaptation of physiological responses.

Explicit skill-based scenarios
Technical proficiency and safety were stated to be pivotal to further training, with instructors emphasizing the building of muscle memory and consistency as key goals across the curriculum. Supporting and enhancing proprioception through mastery of task-specific processes - such as maintaining weld travel speed and distance to plate - is a key opportunity in welding training.

Workshop 2: Welding Training Immersion
To gain first-hand exposure to the embodied acts of welding, five designers and researchers from our team participated in a six-hour welding workshop led by five IAW instructors. The workshop was a condensed version of IAW's after-school program; the program typically takes place three days a week over the course of ten weeks. As our team had limited-to-no experience with welding, a beginner's mindset afforded understanding of the technical, mental, and emotional nuances of learning metalwork as might be encountered by the site's students (Figure 1). This experience reinforced the need to foster an integration of holistic sensory modalities, embodied practice, and mind-body connection through a technical system.

Scenario Development
Based on these workshops, open, iterative rounds of scenario construction simultaneously defined and built out the five opportunity spaces. Through vignette sketches and storyboards, we explored various productive scenarios and attempted to reflect key needs derived from the workshops. In tandem with this process, the resulting scenarios served as the basis for rapid XR prototyping. Four representative scenarios were selected, with each responding to a distinct set of needs. These were refined for review and discussion with community partners in the next workshop. Prototype development allowed the instructional team to experience and respond to XR technologies in situ at their work site, and also informed and validated future technical directions.

Workshop 3: Experiential Co-Design
The final four-hour workshop took place with six IAW instructors, three of whom were former students of the program. Past students were included to better recognize and prioritize student needs and experiences. We began with mutual context-setting, group discussion, and a silent brain-writing warm-up. The workshop was framed by field research and visualization techniques (collage, 3D modeling, storyboarding), with materials and procedure reflecting methods such as bodystorming, experience prototyping, and the use of creative toolkits [19].
Workbooks with supplementary reference materials and open prompts were given to participants to annotate and sketch concepts (Figure 2). Sessions were facilitated to encourage participants to guide, and ultimately decide on, topics and activities of focus. Scenarios were presented as a basis for ideation and validation of design directions. A series of low-fidelity XR prototypes were also presented, including a Meta Quest Pro headset demo and Adobe Aero mockups on iPad (Figure 3).
Activities aimed at individual and collaborative ideation invited community collaborators into prototyping key scenarios in-situ. In-context stations with toolkits of laser-cut viewfinders and prepared craft materials allowed for visual expressions of creativity through collage, drawing, and role-playing scenarios (Figure 4). These open-ended forms of experience prototyping helped participants build upon scenario prototypes in intuitive, accessible ways. Designing conditions for participants to encounter the potential system in-situ created greater understanding of how XR experiences would work in their contexts of use. By involving partners in on-site inquiry and development, their tacit knowledge and bodily, lived experiences were embedded into valued design directions.
The workshop resulted in the co-design and validation of three key opportunities, which emerged over time through the input of multiple working ideas and the diverse perspectives represented. These built on shared inspiration and experiences co-created through collaborative engagements with the partner site.

Visual XR Guides and Integrated Motion Sensing
The highly immersive and embodied nature of welding practice makes shared experiences between instructors and students difficult to facilitate. The detailed supervision of specific weld performance in real time is constrained by time, instructor-student ratios, material resources, and perceptual factors. It is exceptionally difficult to visually monitor a student's process during a weld and provide feedback to the student in a timely, safe, and audible manner.
Likewise, externalizing and representing nuanced hands-on welding skills is a challenge for instructors, who expressed that "the most difficult part to learn or teach is (how) to pay attention to the details". Information currently resides in written worksheets and documentation, or in post-weld evaluation and feedback - channels in which both instructors and students expressed a need for improvement: "There is nothing to compare from the textbook (to tell) whether they are doing a right weld".
Based on these challenges, opportunities to support contextual knowledge transfer and standardization of performance feedback were further developed through the design of XR-supported visual guides and an integrated welding device system.

Sensing Sonic Cues during Welding Practice
Due to hazardous work conditions and requisite welding personal protective equipment, sensory perception is either greatly limited or overwhelmed by environmental stimuli. Temperature, visibility, sound, and dexterity are all affected. Vision in particular is restricted by the auto-darkening feature of welding helmets and the brightness of the welding arc. These constraints create the need and conditions to explore other forms of perceptual support.
Though it is common practice for welders to evaluate welds visually, refined visual diagnosis of welds is only possible after the fact. This creates an opportunity for other sensory modalities to play a greater role during the act of welding itself. Based on student and instructor insights, as well as first-hand experience gleaned through the participatory welding workshops, we learned that experienced welders are able to assess welds through active listening. This auditory-based method of weld diagnosis was emphasized by instructors - "(a good speed) should sound like sizzling bacon, not popcorn" - and informed a proof-of-concept sound-based sensing system.

Pre-Welding Meditation
All three workshops emphasized the importance of mind-body connection and self-regulation in moments of stress: "For most students (success and interest) depends on if you're good in moments of chaos." This informed an exploration of physiologically informed measures to enhance welding skill performance. Biometric signals such as heart rate, activity, and brainwaves (EEG) were considered but presented many challenges, for example, integration into protective equipment. We instead build on existing breathwork practices used on-site. Respiratory rate capture could be performed with low-cost sensors embedded in the welding helmet. Since breath can be consciously regulated, exhalations can be measured in relation to system prompts. This allows for a responsive feedback loop and for physiologically mediated improvements in focus and attention.

VISUAL XR FEEDBACK AND GUIDANCE
Building upon our rationale and feedback from our workshop sessions, we created a series of visual feedback mechanisms for the XR display. We use two separate XR indicators for work and travel angle in order to make slight changes and adjustments visible. The indicators, along with status icons presented near the top of the viewport, allow users to see feedback without taking focus away from an active weld. The status icons, by contrast, can give a much clearer overview of performance for instructors or users viewing live playback, particularly when focus on the tip of the welding gun is not as important.
To calibrate the XR representation of the weld to a real workpiece, the start and end points of the weld line can be set by the user, using the welding gun at the start and end of the path. This allows the position of the scrolling guide line and inch grid to be adjusted. The angle indicators use the direction of this line to determine their axes, which keeps visuals correct even if a weld is not perfectly level. In establishing our UI feedback systems, we leveraged the workshop responses and instructor feedback to determine which values count as satisfactory and unsatisfactory.
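To illustrate how weld-line calibration can drive the two angle indicators, the sketch below builds a coordinate frame from the user-placed endpoints and decomposes the gun's pointing direction into travel and work angles. This is our own minimal reconstruction; the frame construction, function names, and sign conventions are illustrative assumptions, not the system's actual implementation.

```python
import numpy as np

def weld_axes(start, end, up=(0.0, 0.0, 1.0)):
    """Build a travel/side/up frame from the two user-placed weld endpoints.

    Assumes the weld line is not parallel to the `up` direction."""
    t = np.asarray(end, float) - np.asarray(start, float)
    t /= np.linalg.norm(t)               # travel axis along the weld line
    u = np.asarray(up, float)
    u = u - np.dot(u, t) * t             # strip the travel component
    u /= np.linalg.norm(u)               # up axis, orthogonal to travel
    s = np.cross(t, u)                   # side axis completes the frame
    return t, s, u

def gun_angles(gun_dir, t, s, u):
    """Decompose the gun direction into travel and work tilt (degrees).

    Zero on both axes means the gun points straight down at the plate."""
    g = np.asarray(gun_dir, float)
    g /= np.linalg.norm(g)
    travel = np.degrees(np.arctan2(np.dot(g, t), -np.dot(g, u)))
    work = np.degrees(np.arctan2(np.dot(g, s), -np.dot(g, u)))
    return travel, work
```

Because the axes are derived from the calibrated line rather than the world frame, the same computation stays correct for welds that are not perfectly level, as described above.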

SENSING WELDING PRACTICE THROUGH SOUND
Through our interviews, we learned that welding instructors often use sounds to guide their teaching.
Because welding space is restricted and instructors are limited, they sometimes need to stand away from the student and infer the student's mistakes through sound alone. They informed us that the most recognizable sound a beginner error creates is that of the gun tip being held too far from the welding plate, and that many incorrect settings also generate distinct sounds.
This highlights how welding is a multi-sensory practice. Yet, prior VR welding training systems [31,34] only track hand movement. To address this, we employ tinyML-enabled sound detection to recognize key factors such as settings and tip distance. The ML model is trained and deployed onto a Seeed microcontroller, which relays that information to the UI and provides visual feedback.
In this way, ML-detected sonic cues are used to enrich training and cultivate non-visual perception among novices in welding tasks.

Signaling Welding Start and End
To detect the beginning and end of welds, we collected nineteen minutes of welding sound from our partner site. Recordings were made using five different devices simultaneously: the on-board microphone (MP34DT05) of an Arduino Nano 33 BLE Sense (Rev1), the on-board microphone (MSM261D3526H1CPM) of a Seeed Studio XIAO nRF52840 Sense, two Samsung smartphones (A52 and S8), and a USB microphone. In addition, nineteen minutes of inactive samples were prepared by collecting ambient noise recorded by the phones and microcontrollers in combination with other datasets [5,24]. We trained a classification system with 97.67% accuracy to detect welding. This classifier replaces the need to electromechanically detect interactions with the physical button on the welding gun. The classifier is used as an input for XR feedback to initiate welding tracking and to make the system more portable.
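One way such a start/stop classifier could gate the XR tracking is to debounce its per-window probabilities into discrete events, so a single misclassified window does not toggle tracking. The thresholds, hold length, and event scheme below are illustrative assumptions, not the deployed pipeline:

```python
def weld_events(probs, on=0.8, off=0.3, hold=3):
    """Turn per-window 'welding' probabilities into start/stop events.

    Hysteresis (separate on/off thresholds) plus a `hold` of consecutive
    qualifying windows suppresses spurious toggles from noisy windows."""
    events, welding, run = [], False, 0
    for i, p in enumerate(probs):
        crossing = (p >= on) if not welding else (p <= off)
        run = run + 1 if crossing else 0
        if run >= hold:
            welding = not welding
            events.append(("start" if welding else "stop", i))
            run = 0
    return events
```

For example, a run of confident "welding" windows followed by a run of quiet windows yields one start and one stop event, which could then trigger and end motion tracking.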

Sensing Welding Mistakes
Metal Inert Gas (MIG) welding, the technique we are analyzing, involves extruding a metal wire through the tip of the welding gun, shielding the wire with inert gas, and using the heat generated by short-circuit current between the wire and the workpiece to fuse the two metals together. Incorrect settings of this system would result in poor-quality welds, as shown in the figure on the right. For example, if the amperage of the welder is set too low, it will result in an excessively thin weld bead and lead to inconsistent penetration of the working plate.
Different settings result in changes to the welding sound, which offers potentially important training feedback. This was noted both by prior research [32] and by instructors from our partner site. Building on this, we invited an experienced welder to repeatedly perform the same welding movement, changing only one setting per weld - Table 1 shows each of the settings. We collected a total of 20 minutes and 39 seconds of audio data, evenly distributed across the categories, for training and testing a tinyML classification model. We designed the model so that it can alert beginners to common errors, such as incorrect settings and gun tip distance. This proof-of-concept model is deployed to a microcontroller and connected to the augmented helmet to provide feedback.

Training
We used the Edge Impulse platform to train the model, extracting Mel-filterbank energy (MFE) features from every second of the audio data we collected. The subsequent layers are: 1) a 1D convolution and pooling layer with 8 neurons and a kernel size of 3; 2) a dropout layer with a 0.25 rate; 3) a 1D convolution and pooling layer with 16 neurons and a kernel size of 3; 4) another dropout layer with a 0.25 rate; 5) a layer to flatten the outputs; and finally 6) an output layer over the seven classes. We ran 500 epochs of the architecture above, with a learning rate of 0.005. The final accuracy over the validation set is 92.5%.
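As a sanity check on this architecture, the tensor shapes can be traced layer by layer. The 49-frame by 40-band MFE input and the 'same'-padded convolutions with stride-2 max-pooling are assumptions based on common Edge Impulse defaults, not confirmed settings of our model:

```python
def model_shapes(frames=49, mel_bands=40, classes=7):
    """Trace tensor shapes through the architecture described above.

    Assumes 'same' padding (length preserved by convolution) and
    max-pooling with stride 2 (length halved); dropout layers do not
    change shapes, so they are omitted here."""
    shapes = [(frames, mel_bands)]   # 1 s of MFE features: frames x bands
    t = frames // 2                  # Conv1D(8, kernel 3) + MaxPool(2)
    shapes.append((t, 8))
    t //= 2                          # Conv1D(16, kernel 3) + MaxPool(2)
    shapes.append((t, 16))
    shapes.append((t * 16,))         # flatten
    shapes.append((classes,))        # 7-way softmax output
    return shapes
```

Under these assumptions, the flattened vector feeding the output layer has 12 x 16 = 192 values.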

Challenges
We encountered challenges with both sound data collection and the ML model design.The extreme heat, light, and sound conditions in the welding space all contributed to the difficulty of using precision audio equipment to build a comprehensive model.Additionally, different setups of each welding session made it difficult to both record and standardize comparative samples.
To combat this, we recorded with multiple devices from different angles and distances to reduce overall data collection time and improve the generalizability of the data. The tuning of the ML model relied on visualized sound spectrograms and a trial-and-error approach.

Results
Overall, our model achieved an accuracy of 77.62% on the test dataset, under a confidence threshold of 0.4.
Based on the confusion matrix in Table 2, we observed that the model often confused accurate welding settings with too-high amperage, because the smaller welder we sampled our data from could not exceed the recommended amperage by a noticeable amount. This result validated that sound contains useful information for welding training.
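The 0.4 confidence threshold acts as a simple gate on the classifier's output: a window is only reported as a specific condition when the top class clears the threshold. A minimal sketch of this gating, with hypothetical label names for illustration:

```python
def alert_label(probabilities, labels, threshold=0.4):
    """Report the top class only when its confidence clears the threshold,
    so low-confidence windows do not flash spurious alerts at the trainee."""
    best = max(range(len(probabilities)), key=probabilities.__getitem__)
    return labels[best] if probabilities[best] >= threshold else "uncertain"
```

In a feedback UI, the "uncertain" case would typically show nothing rather than a possibly wrong correction.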

PRE-WELDING MEDITATION
The environment in which welding takes place can be overwhelming due to loud noises, sparks, heat, and burning smells. There have been approaches to help practitioners become accustomed to these conditions in a safe environment by creating simulations of the welding experience as part of the XR learning process. These approaches have been limited to non-in-situ environments [14].
During our workshops, we observed students engaged in regular meditation sessions. The brief meditation primarily involved breathing exercises that took place immediately before starting to weld. By encouraging learners to breathe, instructors aimed to induce relaxation and foster a sense of focus on the task, materials, and proprioception. Students were encouraged to use these meditation techniques whenever they felt mentally fatigued during the welding process to help self-regulate attention. We explored how this can be scaffolded and augmented with technology-mediated support.

Meditation and Breath
Meditation and attention to mental well-being have seen a remarkable surge in popularity. This trend is evident not only in industry but also in the academic sphere. Mobile applications such as Headspace and Aura, as well as dedicated devices such as the Muse headband or the Core Meditation trainer, focus on breath practice and provide tools for mindfulness practices that contribute to the quality of daily life. Additionally, prior research has tried to understand the relationships between mindfulness exercise and skill performance. For example, Khng found that taking a few deep breaths before taking tests results in better academic performance [10]. Philippot demonstrated that different emotional states can also be triggered by specific breathing patterns [25]. In addition, there have been multiple projects that leverage technology to create systems that support the awareness of breathing [26] and further explore the enhancement of mindfulness design through breathing in mixed reality environments [27].

Approach
In order to enhance the breathing exercises and mindfulness of welders, we leverage the welders' breath as an input for our AR system (see below). We used an off-the-shelf anemometer, placed near the mouth and nose inside the welding helmet, to track breathing. It measures the wind speed of exhaled breath and tracks the breathing pattern of welders over time. Illustrated (left) is the breathing rate of a professional welder as they performed a welding task; it depicts a consistent and regular breathing pattern.
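A simple way to turn the anemometer stream into a breathing rate is to count threshold crossings of the exhale signal. The sketch below is an illustrative assumption: the function name, threshold value, and sampling rate are our own, and a production system would need smoothing and sensor-specific calibration.

```python
def breaths_per_minute(wind_speed, sample_hz, threshold=0.5):
    """Estimate respiratory rate from in-helmet anemometer samples.

    Each rising crossing of the exhale threshold counts as one breath;
    the threshold and units depend on the sensor and its placement."""
    breaths, above = 0, False
    for v in wind_speed:
        if v >= threshold and not above:
            breaths += 1
        above = v >= threshold
    minutes = len(wind_speed) / sample_hz / 60.0
    return breaths / minutes
```

Because exhalation is what the sensor registers, this rate can be compared against the system's breathing prompts to close the feedback loop described above.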
To monitor the trainees' mental well-being, we gather real-time data on their breathing rates and present it within the XR UI. The platform begins by encouraging each trainee to engage in breathing exercises before the welding session. These strategies help learners practice and adopt mindfulness techniques that benefit task performance and eye-hand coordination. This scaffolding also helps to emphasize the significance of mindful breathing throughout welding training. To provide support for and to evaluate these experiences, we created a functional device system that integrates the sound and breath sensing with a Meta Quest Pro headset and a standard MIG welding helmet and gun. As these devices are designed for use in active welding scenarios, protecting them from possible extreme temperatures, sparks, and brightness led much of the mechanical design.

Helmet: Design and Implementation
Our design uses a modified welding helmet fixed to the Meta Quest Pro XR headset. We interfaced the Seeed ESP32S3 board to a Unity program running the XR display, using the Quest Pro's USB-C port and the Serial Port Utility Pro plugin [35]. Quest's native framework for headset/controller tracking and passthrough enabled rapid and reliable prototyping in Unity, although we encountered setbacks when integrating serial data for our auxiliary sensors and implementing transparent objects over the passthrough. The adjustable Quest head strap and connected battery replace the traditional helmet insert, and the device is removable from the helmet with plastic clips. 3D-printed components were made with PLA filament on accessible FDM printers. After measuring temperatures around an active weld, we found the PLA material to be suitable for the mounting locations on the helmet and gun. To protect the forward-facing sensors, the Quest is placed behind an auto-dimming display, such that the forward-facing sensors and the real-time passthrough are not impacted by the extreme light.
After testing an early version, we encountered issues with the headset's spatial sensing being lost due to the display auto-darkening and the cameras losing tracking.
To account for this, the auxiliary side cameras on the Quest Pro are exposed with cutouts to ensure accurate tracking with or without an active arc.Additional sensors such as the microphone and anemometer are fixed to the inside of the helmet, and are connected through the Quest's side port to interface with the Unity program in real-time.

Welding Gun: Design and Implementation
The device attachment must be securely mounted and aligned to maintain proper tracking of the welding gun. To do this, a 3D-printed enclosure mount interfaces a Quest Touch Pro controller with the welding gun, aligning the independent systems together while protecting the controller from weld spatter and heat. The existing buttons on the Touch Pro controller allow users to control settings such as point-of-view (POV) recordings, spatial calibration, and navigating the Quest UI. The backwards-facing orientation of the controller also protects the tracking cameras from the extreme light conditions during active welding. A key design goal for the welding gun attachment was the ability to transport our system to multiple welding setups in various locations. As such, the attachment had to be independent and removable so it could be added to different machine systems. Replaceable zip-ties allow our system to be added and removed easily.

SYSTEM WALKTHROUGH

1) Initialization
When the welder initially wears the helmet, they must adjust the Quest optics and headband fit settings, and ensure that proper welding safety equipment is worn.

2) Calibration
The welder uses the gun to place coordinate locations for the start and end of the weld, linking the real-world weld line to a graphic representation in the XR display.

3) Pre-weld feedback and meditation
Before welding, the welder can use the breath-controlled meditation program and gun-angle monitoring to focus and prepare for a successful weld.

4) Active weld feedback
During the weld, the display automatically dims, allowing the welder to see the molten weld bead along with the XR display elements, which provide responsive feedback on the welder's performance.

5) Reflection and evaluation
The automatically generated 3D weld line can be reviewed post-weld from inside the headset, allowing the welder to monitor their variance from an ideal path and reflect on their overall performance and behavior.

6) Real-time instructor view
At any time, an instructor or the welder can review real-time or recorded point-of-view (POV) footage of the experience from the welding helmet to monitor behaviors or analyze difficult scenarios.
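The calibration and post-weld review steps both reduce to simple line geometry: registering the two gun-captured endpoints as a line, then measuring each recorded torch position's perpendicular distance from it. The sketch below is our own simplified illustration in Python with NumPy (function names are hypothetical, not the system's actual implementation):

```python
import numpy as np

def calibrate_weld_line(start, end):
    """Register the weld line from two controller-captured endpoints.
    Returns the start point, unit direction, and line length."""
    start, end = np.asarray(start, float), np.asarray(end, float)
    direction = end - start
    length = float(np.linalg.norm(direction))
    return start, direction / length, length

def path_deviation(samples, start, unit_dir):
    """Perpendicular distance of each recorded torch position from the
    calibrated line, for post-weld review of variance from the ideal path."""
    rel = np.asarray(samples, float) - start
    along = rel @ unit_dir                   # progress along the line
    perp = rel - np.outer(along, unit_dir)   # lateral (off-line) component
    return np.linalg.norm(perp, axis=1)

# Example: a weld from (0,0,0) to (2,0,0) with one sample drifting off-line
start, unit_dir, length = calibrate_weld_line((0, 0, 0), (2, 0, 0))
dev = path_deviation([(0, 0, 0), (1, 1, 0), (2, 0, 0)], start, unit_dir)
```

Summary statistics of these deviations (mean, maximum) could then be surfaced in the XR review display.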

EVALUATION AND FEEDBACK
Interviews and hands-on demonstrations were held with experts in mixed reality, applied ML, interaction design, and human-machine collaboration to assess the present system. Feedback resulted in improvements to the visual user interface, onboarding experience, and meditation guides. Adjustments to the helmet and torch design were made to improve comfort and minimize fatigue, including improved counterbalancing and in-hand ergonomics.
Future comparative user studies will assess the efficacy of embodied learning and the impact of mindfulness practices with our XR system, in comparison to traditional welding education or VR-simulated training. Lab studies with novice welders will assess the impact of visual XR guidance and guided breathing exercises on simulated welding, evaluating key measures such as travel speed, weld-angle and work-angle consistency; aim accuracy; and improvement across welds. In-situ qualitative studies will evaluate the system with IAW-trained students during live welding tasks. Debriefing interviews will be conducted across both studies to assess self-reported qualitative measures, such as perceived usefulness, desirability, and confidence during the task.
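The motion-derived measures named above can be computed directly from timestamped torch poses. A minimal sketch, assuming positions in millimetres, timestamps in seconds, and angles in degrees (the study instrumentation may differ):

```python
import numpy as np

def travel_speed(positions, timestamps):
    """Mean travel speed of the torch tip over a weld pass:
    total path length divided by elapsed time."""
    positions = np.asarray(positions, float)
    path = np.linalg.norm(np.diff(positions, axis=0), axis=1).sum()
    return path / (timestamps[-1] - timestamps[0])

def angle_consistency(angles_deg):
    """Consistency of a sampled work or travel angle, reported as the
    standard deviation in degrees (lower is steadier)."""
    return float(np.std(np.asarray(angles_deg, float)))
```

Aim accuracy could similarly be derived from the deviation of the torch tip relative to the calibrated weld line.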

Embodied learning of welding with XR
A key aspect of our work is the ability to enable in-situ welding experiences using a lightly modified, off-the-shelf XR and welding setup. Prior work presents an abstracted training simulation that separates feedback and practice from real-world welding [8]. Our approach prioritizes the in-situ value of welding over an abstracted VR system, enabling students to engage in a direct, real-time learning experience that could make acquired skills directly transferable to practice. Furthermore, this approach can inform ways of incorporating the socio-material aspects of craft and skill training in XR systems beyond welding: it integrates phenomenological aspects of the environment, such as sonic cues, interactions with the instructor, and the embodied interactions of the student welder.

Co-design workshops for XR system development
When designing for a multi-sensory and immersive craft, we chose co-design workshops over traditional user studies for their holistic and experiential feedback. In designing technological interventions for pedagogical areas of learning and behavioral growth, we found it important to practice ground-up co-design of the XR system to ensure successful augmentation of the nuanced and sensitive interactions that occur when welding. We also recognized the value of sensory awareness when recruiting our participants, as self-reflective feedback about sensory stimuli and behavior became significant in developing the XR system. Because welding practice is evaluated in specific embodied ways, learning the sensorial cues and their implications from both instructors and students allowed us to directly sense, monitor, and map these attributes to measures of performance, such as sound, welding angles, and speed.

Evaluating weld performance with sound analysis
We present a proof-of-concept prototype that uses tinyML to detect welding mistakes and provide real-time, in-situ feedback to trainees. Past work has used tinyML to classify in-situ noise [9], but to our knowledge, no prior work provides real-time feedback for in-situ welding from sound. Welding sounds are often used as a companion to simulate the working environment in AR/VR settings; for example, White et al. used generated sound to improve the fidelity of their training simulation [33]. Our prototype instead treats sound as a source of information, rather than as a byproduct of welding reproduced for simulation. Our ML model distinguishes sounds that untrained researchers find hard to tell apart by ear, such as the sound produced when the wire speed is set too fast or the amperage too low.
Our model results show the potential of using sound to raise novices' awareness of machine settings and to diagnose adjustments during welds. As our collaborators noted, some errors can be inferred from the weld texture after a weld is complete, but they are almost impossible to identify while welding, when vision is limited and darkened. Instructors highlighted that they normally listen to the sounds coming from student working areas, since they cannot be with every student at all times; the sound helps them monitor students in case they need to step in and help.
Additionally, some weld issues may be hard to detect visually even for experienced welders (for example, an incorrectly sized nozzle may block the gas channel and cause poor weld texture), but sound can aid their detection.
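To illustrate the kind of lightweight pipeline a tinyML sound classifier implies, the sketch below extracts coarse log-band energies from one audio window and classifies them against per-class feature centroids. This is our own simplified stand-in, not the paper's trained model; the class labels are illustrative only.

```python
import numpy as np

def band_energies(window, n_bands=8):
    """Coarse spectral features: log energy in n_bands equal-width
    frequency bands of one Hann-windowed audio frame."""
    mag = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    bands = np.array_split(mag ** 2, n_bands)
    return np.log1p(np.array([b.sum() for b in bands]))

def classify(features, centroids, labels):
    """Nearest-centroid decision over labeled training centroids,
    e.g. 'ok', 'wire_too_fast', 'amperage_too_low' (hypothetical labels)."""
    dists = np.linalg.norm(np.asarray(centroids) - features, axis=1)
    return labels[int(np.argmin(dists))]
```

A deployed tinyML model would typically use richer features (e.g. MFCCs) and a small neural network quantized for the microcontroller, but the feature-extract-then-classify structure is the same.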

Mindfulness in embodied welding practice
We investigated how to improve mindfulness by monitoring students' breathing while welding and providing real-time feedback. Practicing mindfulness helps connect the body and mind [4], which is crucial for improving the learning process in any physical skill. By monitoring breath, the system provides real-time prompts to reconnect with one's body and calm down. This approach could also be used in teaching other particularly demanding embodied skills, such as machining and building construction.
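One simple way to derive real-time feedback from a breath signal is to estimate breathing rate over a recent window; the sketch below counts upward zero crossings of a mean-removed, lightly smoothed respiration trace. This is one plausible approach of our own, not the system's documented algorithm:

```python
import numpy as np

def breaths_per_minute(signal, sample_rate):
    """Estimate breathing rate by counting upward zero crossings of a
    mean-removed, smoothed respiration trace (chest strap or airflow)."""
    x = np.asarray(signal, float) - np.mean(signal)
    k = max(1, sample_rate // 4)                   # ~0.25 s moving average
    x = np.convolve(x, np.ones(k) / k, mode="same")
    crossings = np.sum((x[:-1] < 0) & (x[1:] >= 0))
    return 60.0 * crossings / (len(signal) / sample_rate)
```

A rate well above a welder's resting baseline could then trigger a calming prompt or the guided breathing exercise.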
Prototyping XR experiences with welding
Our investigation demonstrates that relatively off-the-shelf hardware (Quest) and accessible software pipelines (Unity) can be used to create effective, immersive experiences for augmenting embodied learning without significant modification or custom development. Integrating the XR-enabled system into existing equipment and craft tools makes it minimally disruptive to established rituals and processes. Our work explores democratized access to embodied skill acquisition through open-access systems and software, in contrast to other available training platforms. Although our work focuses on the specific sensorial and experiential attributes of welding, we believe our insights can be directly applied to building systems for similar embodied practices, such as machining, woodworking, sculpting, and soldering.
Despite welding-specific implementation and design decisions, our system serves as an example of the accessibility of modern XR systems and sensor prototyping with craft-specific tools and wearables.
We envision that craft-based embodied skills such as woodworking and machining would benefit from real-time tracking of hands and tools, XR visualization, and in-situ analysis of the workpiece and practice with intelligent aids. Our work explores the possibility that this level of sophisticated, real-time, embodied tracking and feedback can be applied to real craft practice, instead of abstracted exercises.

LIMITATIONS
We acknowledge several potential limitations of our current approach.
XR Headset: Welding is already sensorially demanding, and the protective helmet creates a limited field of view. Our system further restricts this by adding visual interfaces that could obstruct a user's vision. An additional drawback of the Quest Pro is its screen-based display, which prevents direct line-of-sight with the weld in the event of a system failure. Future systems that incorporate projection-based XR displays could maintain direct vision of the work area, which would be an important safety feature should this approach be used widely. Nevertheless, our exploration found the Quest Pro easy to work with in terms of mechanical integration, sensor connection, and development options. Participants, however, found the headset too heavy for prolonged use and lacking the fit customizability that standard welding helmets provide. Additionally, participants found the visual passthrough of the Quest Pro to be dark, distorted, and of less than optimal quality. As XR platforms continue to improve in sensor quality, weight, and ergonomics, we expect a modified device such as ours to alleviate some of these concerns.
Sound Sensing: Our tinyML sound module has some limitations. First, novice welders can make compound errors: for example, the wire speed might be too slow and the amperage too high at the same time. Our current system identifies only one error at a time; identifying and providing feedback on compound faults would require a significant amount of data covering complex variations, exceeding what our sound model can support. Second, our ML models were trained on relatively small datasets, and we have yet to test our sound classification model in-situ with live data; training on a small dataset also risks overfitting.

FUTURE WORK
There are many opportunities to improve both the technical dimensions and the embodied experience within this work.For example, the welding torch can be iterated to rely less on the Quest Pro controller to track the torch's speed and weld angle.This change would result in a smaller form factor for the torch and would additionally help improve portability and accessibility by removing the need for a bulky welding torch attachment.
Improvements could be further supported in two main ways: increased use of sonic analysis and the addition of computer vision tracking. This work demonstrates a proof of concept for sonic cues informing welding feedback. Additional data collection to build a larger dataset of welding tasks, along with more in-depth analysis of weld sounds and their qualities, would help to overcome current limitations. For example, we had difficulty discerning some weld qualities from the sound features in our limited recordings. We expect a larger dataset and a more sophisticated ML system could discern more complex behaviors, allowing sound and ML to become a more prominent training tool for welding. We also envision that ML could identify individual respiratory patterns in the data and provide personalized feedback tailored to one's unique breathing. In addition, the integration of computer vision techniques could enable tracking of the weld gun and hand movement during welding tasks, as well as other features of the workspace and workpiece.
Finally, it is important to acknowledge that we have not yet assessed our XR welding training with end-users. Work is needed to understand how our platform affects the speed and quality of welding skill acquisition for novices. To achieve this, we plan to deploy the platform in A/B lab studies and over multiple weeks at IAW. This will assess how extended use of the device contributes to the formation of skills, habits, and the novice experience, and could demonstrate the individual and cumulative benefits of our proposed systems for embodied learning.

CONCLUSIONS
In this pictorial, we present the development of a multimodal XR system for in-situ training of the welding process. Through a co-design process with a youth welding training program, we uncovered design opportunities, including the value of incorporating mindfulness practices within skill building, building exposure to and scaffolding for multisensory awareness during welding, and explicit skills-based practice that augmented reality could help enhance. This informed the development of an embodied, multimodal learning system for welding training that addresses these needs.
The system consists of two main components: an augmented welding torch for hand-tracking, and a welding helmet with an integrated XR headset, breath sensor, microphone, and tinyML-enabled microcontrollers. The system detects welding quality through sound, provides real-time visual feedback on welding torch movements, and enables the learner to practice mindfulness with breathing exercises to improve socio-material attention. This distinguishes our system from prior work, which has predominantly focused on VR training, visual feedback, and enhancing psychomotor skills exclusively.
Our system's emphasis on welding sounds is a key innovation that addresses a notable gap in current welding training: scaffolding novice attention and understanding of implicit sonic cues that offer important feedback.We demonstrate the potential for this approach by assembling a preliminary dataset and training tinyML models to detect active welding and six common welding mistakes.Finally, our approach represents a low-cost, open, and accessible platform to democratize access to safe welding training and practice for community organizations and educational contexts.

Table 1: Each of the settings we sampled data from

Table 2: Confusion matrix of the error detection model