Wizard of Props: Mixed Reality Prototyping with Physical Props to Design Responsive Environments

Driven by the vision of future responsive environments, where everyday surroundings can perceive human behaviors and respond through intelligent robotic actuation, we propose Wizard of Props (WoP): a human-centered design workflow for creating expressive, implicit, and meaningful interactions. This collaborative experience prototyping approach integrates full-scale physical props with Mixed Reality (MR) to support ideation, prototyping, and rapid testing of responsive environments. We present two design explorations that showcase our investigations of diverse design solutions based on varying technology resources, contextual considerations, and target audiences. Design Exploration One focuses on mixed environment building, where we observe fluid prototyping methods. In Design Exploration Two, we explore how novice designers approach WoP, and illustrate their design ideas and behaviors. Our findings reveal that WoP complements conventional design methods, enabling intuitive bodystorming, supporting flexible prototyping fidelity, and fostering expressive environment-human interactions through in-situ improvisational performance.

MOTIVATION AND INTRODUCTION

With the rapid advancements in ubiquitous technology, robotics, and shape-changing interfaces, and their increasing integration into environmental design, there is a vision that future everyday surroundings will become responsive [4,5,30]. Existing instances include responsive architecture [1,33], transformable furniture [2], and shape-changing walls [1]. The speculative scenarios in Figure 1 indicate how interactive surroundings could enrich our daily lives. Such developments in transformative physicality hold the potential to facilitate expressive embodied interactions, aligning with our body's dexterous abilities and nuanced perception of touch. Our project was motivated by the research question: What design workflow may support prototyping these large-scale interactions in a human-centered way? Software design often begins with hand sketches and digital mockups [53]. Handheld interfaces can be ideated with clay and CAD tools, and tested with 3D printing [53]. However, when the design scope expands to full-body or room-scale interactive interfaces, the prototyping process becomes complex. Building and controlling interactive systems at such scales pose technical challenges, and traditional low-fi methods are not always ideal. Paper sketches [6] lack the tactile affordances of physical objects. Cardboard prototypes, typical in architectural design, tend to be static and lack visual richness. While virtual reality (VR) simulates in-situ interaction for spatial testing, it misses embodied haptic feedback.

IDEA BEHIND THE WORKFLOW DESIGN
Drawing inspiration from Wizard of Oz (WoZ) prototyping [7] in HCI, we propose Wizard of Props (WoP), a collaborative experience prototyping approach [9,10] that integrates full-scale physical props with mixed reality (MR) to design responsive environments. Embodied interaction can be achieved through human manipulation of the props and synchronization in MR. This pictorial presents our design of WoP prototyping tools through two design explorations that demonstrate diverse design solutions for varying technology resources, contextual considerations, and target audiences.
While focusing on mixed environment building in Design Exploration One, we analyze the fluidity of prototyping the environment with different methods and workflows. In Design Exploration Two, we concentrate on learning how novice designers approach WoP for interaction design through a design session, and we illustrate participants' design ideas and behaviors in the process. Based on the exploration findings, we discuss how WoP complements conventional design methods: enabling more intuitive bodystorming, easing the sense-making of human scale, supporting flexible prototyping fidelity, and provoking personal ideation with nuanced interactions. Lastly, we view WoP as a stage that enables embodied ideation through role-playing, reflecting on how expressive environment-human interactions emerge through such improvisational performance.

BACKGROUND AND RELATED WORK

Responsive environments [5] envision everyday surroundings that dynamically change their configuration, motion, or shape to adapt to different needs or to respond to human behaviors. While early explorations were mainly art installations [11,12], the HCI community has been actively working towards this vision using shape-changing interfaces [3,13,14], swarm robots [14,15], and interactive materials [16]. Given the complexity of building large-scale interactive systems, our work specifically emphasizes the design perspective and leverages emerging MR technology, along with the power of WoZ prototyping [17,32], to support embodied design ideation [18]. Here we summarize prior research that inspired our exploration.
Wizard-of-Oz Prototyping

Inspired by the original Mechanical Turk [20], a fraudulent chess-playing machine operated by a hidden human chess master, WoZ prototyping [22] and testing methods [21] have become widely integrated into design and HCI research. This rapid prototyping approach enables the demonstration of mock-up interfaces through a human Wizard who simulates system interactions in response to user input [23]. While extensively used for testing conversational agent interactions [23,24,25], WoZ has also been incorporated into human-robot interaction [19,26,27,29] and tangible interfaces [28,31]. Notably related to our theme, Sirkin et al. explored human reactions and behaviors while interacting with a robotic footstool by puppeteering it with a fishing line [29]. Spadafora et al. conducted improvisation sessions with human Wizards to ideate the behaviors of interactive objects [31]. With the development of Mixed Reality, hiding Wizards behind the virtual scene to provide versatile multimodal interactions has become convenient, inspiring new exploration directions, as discussed in the following sections.

Design Embodied Mixed Reality Interaction with WoZ
Incorporating embodied interaction into the VR environment is a popular research motivation in HCI [41]. While haptic researchers focus on wearable interfaces [34] to provide realistic touch experiences in VR, others investigate room-scale haptics delivered by mobile robots [35,36]. Low-tech methods that enable rapid re-adaptation to simulate different scenarios have also been explored, such as hand-held props repurposed from everyday objects [38] and Substitutional Reality, where furniture in the physical room is paired with a counterpart in VR [37]. Drawing on the WoZ concept, HapticTurk [39] introduces "human actuators" who lift and tilt a player to add haptic feedback to a video game. Similarly, TurkDeck [40] uses human volunteers to align physical props as the player explores a virtual room.

Mixed Reality Prototyping
Beyond supporting the creation of new interactive experiences, MR has also been leveraged by the design community to envision possible futures. The use of MR in the design process is emerging as a promising solution, as it combines the advantages of virtual content for quick iteration and physical prototyping for embodied and tactile experiences. Early MR prototyping applications explored using see-through HMDs to augment physical car mock-ups with virtual surface textures [49]. Barbieri et al. [48] present a method for conducting usability tests in an MR environment to evaluate human interaction with household appliances. An et al. [10] propose a workflow where designers author the form of virtual content through real-time sketching and experience the design in VR together with quick-and-dirty physical proxies. While our work largely resonates with these ideas, our primary focus lies in ideating and prototyping large-scale responsive environments, where human-body scale and behavior are brought in.
Bodystorming for Embodied Design Ideation

Bodystorming [42] is a generative design practice that enables prototyping in context, supporting collaborative embodied cognition [18,47]. It can be achieved by role-playing an interaction with actors and props [44], or by "acting it out" in an in-situ environment [43], which may or may not involve improvisation [19,45]. MR's capability to provide immersive visual content within situated environments holds promise for supporting embodied ideation. Weijdom et al. [46] have previously combined WoZ methods with bodystorming to enhance collaborative design for performative MR experiences. Our work proposes a new mixed prototyping workflow while also investigating how embodied ideation enhances empathetic design and provokes creativity.

WoP Design Exploration One

The first WoP exploration started as an ideation among experts in our research group working on tangible interaction, architecture, and industrial design. Following the conventional design process, which starts with hand sketches, we posed the following questions: Can we sketch a real-scale environment in VR for an intuitive spatial sense while animating it to test different interactions? How can physical props support the design process when sketching by tracing them, and also enrich the design experience through haptic feedback?
This method focused on integrating physical props and leveraging spatial sketching to support room-scale interaction design. Inspired by Tilt Brush¹, we developed a custom VR sketching tool, simplified and tailored for this application. Since perceiving depth and angles makes precise spatial representation complex in 3D sketching [8], we took advantage of having these props and proposed a "sketching by tracing" approach, enabling a hybrid environment-building process. After drawing, the sketched environment in VR could be made interactive by having a "human actuator" manipulate the props on demand, while the virtual object synchronizes its behavior with its physical counterpart.
For room-scale motion tracking prior to inside-out tracking solutions, we utilized the HTC VIVE² with lighthouses on the ceiling of a studio space and mounted VIVE trackers to each prop. Our demonstration scenario explored future home life, featuring transformative furniture and an automatic robot assistant, as these were promising and popular elements in speculative responsive surroundings. Here, we present the designed workflow, system, and takeaways from the design process.

Create Physical Props
We started by creating full-scale physical props. For instance, we repurposed real furniture with wheels to enable flexible movement. We also built a robot assistant out of cardboard, a common low-fidelity method to mock up physical objects and environments.

PROTOTYPING WORKFLOW

Sketch Objects and Environments
After setting up the VR system, we sketched the shapes of the objects in the virtual scene by tracing the props. We also drew walls in VR to create a "room" space, which doubled as a "curtain" that let virtual props appear and disappear magically.

Wizard of Oz User Testing
While an experiencer interacted with the environment through voice commands, touch, or body gestures, the designer assumed the role of the Wizard of Oz and invoked the associated actions by actuating the props. In the photo, the "autonomous" behavior of the robot is controlled by the designer pushing it. To improve tracing accuracy and enhance the sketching experience, we designed custom add-ons for the VIVE controller in the shapes of a pen and a brush. They were 3D printed and attached to the controller by snapping into the hole on its top. The length and shape of the add-ons were carefully designed so that once the tip of the pen or brush touched the prop surface, its other end would press the trigger button, precisely detecting the tracing behavior. Another button on the side was also enabled to activate sketching, so designers could still draw virtual elements in mid-air.
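To make the contact-triggered sketching concrete, here is a minimal sketch of the stroke-capture logic the add-on design implies: the trigger is mechanically pressed while the tip touches a surface, so points are recorded only during contact (or while the side button is held for mid-air drawing). The `controller` handle and its fields are hypothetical stand-ins for the VR runtime, not the authors' implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Stroke:
    points: list = field(default_factory=list)  # world-space tip positions

def capture_strokes(controller, frames):
    """Record polyline strokes while the trigger is held.

    `controller` is a hypothetical handle exposing `trigger_pressed` and
    `tip_position()` (the controller pose offset by the add-on's tip length).
    """
    strokes, current = [], None
    for _ in frames:                       # one iteration per rendered frame
        if controller.trigger_pressed:     # tip contact, or side button held
            if current is None:            # trigger edge: begin a new stroke
                current = Stroke()
            current.points.append(controller.tip_position())
        elif current is not None:          # trigger released: finish the stroke
            strokes.append(current)
            current = None
    return strokes
```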

Sketch by Tracing
The VR headset did not have adequate video see-through functionality for this prototype. Therefore, we switched between setup steps without a headset and immersive design steps with a headset.
Step 1 (no headset): use the controller to select the tracker mounted on the prop to pair them.
Step 2 (no headset): trace the outline of the props.
Step 3 (with headset): draw the VR objects according to the outline while aligning the "brush" controller onto the prop surface.
Step 4 (no headset): sketch the wall to create a virtual space.
To help the designers when tracing without a headset, we set up a large monitor (Fig. 8) connected to the PC that showed the virtual scene view. This view also helped human actuators better control the behavior and timing of the props' interactions in later WoP testing.

Prop Movement Synchronization
The physical props are synchronized with the virtual graphics through the VIVE trackers physically attached to each prop. In the sketching step, each virtual model is attached either to a mobile tracker or to the environment. The VIVE Pro headset, controllers, and trackers detect their position and orientation optically in relation to the VIVE 2.0 Lighthouses and transmit them wirelessly to the PC. The experiencer sees any movement of the tracked props mirrored in their headset, while the designers observe the experiencer's view on a large screen.
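As an illustration of this parenting, the following numpy sketch stores sketched stroke points in the tracker's local frame at pairing time and re-places them at the tracker's current pose each frame. The helper names and the 4x4-matrix representation are our assumptions; in practice the poses would come from the tracking runtime rather than these stubs.

```python
import numpy as np

def make_pose(position, rotation_3x3):
    """Build a 4x4 rigid transform from a position vector and rotation matrix."""
    T = np.eye(4)
    T[:3, :3] = rotation_3x3
    T[:3, 3] = position
    return T

def attach_to_tracker(points_world, tracker_pose):
    """At pairing time: express world-space stroke points in the tracker frame."""
    pts = np.c_[points_world, np.ones(len(points_world))]   # homogeneous coords
    return (np.linalg.inv(tracker_pose) @ pts.T).T[:, :3]

def update_virtual_model(points_local, tracker_pose):
    """Each frame: re-place the sketched geometry at the tracker's current pose."""
    pts = np.c_[points_local, np.ones(len(points_local))]
    return (tracker_pose @ pts.T).T[:, :3]
```

Geometry attached "to the environment" simply skips the per-frame update and keeps its world-space points.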

FLUIDITY IN PROTOTYPING THE MIXED ENVIRONMENT
To gain insights into the WoP approach, our team, comprising experts in tangible interaction, industrial, game, and architecture design, alongside a guest expert in CAD tool design, conducted first-person experiments to explore the extensive possibilities of prototyping with the proposed workflow. While the proposed theme was to ideate different forms, functionalities, and behaviors for future home robots and furniture, we primarily focused on observing and discussing the design process rather than the outcomes. We also demonstrated our tool to 50 novices during community-based open house events, where they designed with it. In these hands-on sessions, we observed that different groups of designers incorporated diverse entry points and entangled workflows, adapting to the context and their personal design needs, habits, and expertise. The WoP method inherently values and fosters this flexible process, which we define as adaptive fluidity.

In creating the physical prototype, designers would either reappropriate objects they found in their environment or build objects from scratch out of cardboard. For the virtual model, they would trace physical objects like furniture with our tool, draw in mid-air, find existing 3D models online, or create them in CAD. The figure below shows the fluidity of the different mixed environment building processes. The most intriguing takeaway was how designers combined, remixed, and reinvented all of these approaches to support their prototyping and iteration at every stage, while also accommodating different skill levels.
While this round focused more on mixed environment building, our next exploration concentrates on understanding how novice designers approach WoP for large-scale interaction design.

WoP Design Exploration Two
NEW SOLUTIONS AND RATIONALES

Thanks to the embedded tracking system of the Oculus Quest³, our second exploration proposes a more lightweight, low-cost, mobile, and easy-to-implement prototyping method. Instead of hanging lighthouses from the ceiling and installing a tracker on each prop, we only needed to attach printed QR markers. We utilized an Android phone camera to detect the movement of the marked objects and communicated it to the virtual scene from the phone through an online server. Figure 14 depicts the system design.
To demonstrate the idea, we built a testing environment around the scene of an interactive sliding subway door, inspired by Ju's use of door opening as an entryway to understanding implicit interaction [50]. Doors afford full-body interaction and can be easily prototyped with a one degree-of-freedom (DoF) sliding motion, which helps constrain the exploration while still leaving a rich design space. Instead of hand-sketching the virtual scene, we used high-fidelity 3D models designed with a CAD tool in this round to test alternative possibilities. Additionally, we believed a more specific scene would be helpful for novice designers. Detailed design decisions and rationales are explained in the following sections.

High Fidelity Virtual Environment

To create a more convincing in-situ experience to test with novice designers, we developed a high-fi virtual scene. It includes a subway station with a boarding train⁴. The virtual door panels move in sync with the physical props. In the exploration, we introduced an additional element into the scenario: the passenger approaches the subway door carrying a bulky box. This addition was intended to create some awkwardness and complexity, enhancing the tension and entanglement of the interaction between the human and the door behavior, which might provoke more nuanced perceptions.
Full-scale Prop with Simple Mechanism

Also made with cardboard, this prop consists of a frame and two door panels attached to casters, enabling the panels to slide with one DoF. During the exploration, participants act as human actuators, pulling or pushing each door panel with varying range and speed.

The hand-held prop is constructed from frames and joints. Its hollow design leaves the hands visible to the Quest 2 headset's inside-out hand tracking. A virtual representation of the box is rendered at its estimated location and rotation. Moreover, the building-block fashion of the prop design makes it easy to reconfigure for other scenarios, such as carrying a suitcase or a bicycle.
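The pictorial does not spell out how the box pose is estimated from hand tracking, but a plausible heuristic, sketched below under that assumption, places the box center at the midpoint between the tracked palms and aligns its long axis with the left-to-right hand vector. All names here are illustrative.

```python
import numpy as np

def estimate_box_pose(left_palm, right_palm, up=np.array([0.0, 1.0, 0.0])):
    """Guess the carried box's pose from two tracked palm positions.

    Assumes the hands grip opposite sides of the box and stay roughly level;
    near-vertical hand configurations would degenerate the cross product.
    """
    center = (left_palm + right_palm) / 2.0          # box center at the midpoint
    x_axis = right_palm - left_palm                  # long axis between the hands
    x_axis /= np.linalg.norm(x_axis)
    z_axis = np.cross(x_axis, up)                    # roughly horizontal forward axis
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)                # re-orthogonalized up axis
    rotation = np.column_stack([x_axis, y_axis, z_axis])  # 3x3 rotation matrix
    return center, rotation
```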

Easy Calibration with Gesture Commands
The spatial and movement synchronization between the physical prop and the virtual model is crucial to support an immersive design experience [51] and to ensure users' safety. In our implementation, we introduced an algorithm that spatially aligns the virtual scene with the physical prop through a series of gestural commands (Figure 14). In the VR application with hand-tracking enabled, the researcher first uses the right index finger to point at the central origin of the physical door, and then traces the orientation of the door panels with the same finger (Figure 13). The virtual scene then relocates and reorients itself to match the position and rotation of the physical prop.
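A minimal sketch of that alignment computation, assuming the virtual door is modeled at the world origin with its panels sliding along +X and that only rotation about the vertical axis matters; the function name and modeling convention are our assumptions, not the authors' implementation.

```python
import numpy as np

def align_scene(pointed_origin, trace_start, trace_end):
    """Compute a yaw + translation that moves the virtual door onto the prop.

    `pointed_origin` is the fingertip sample at the door's center;
    `trace_start`/`trace_end` are two samples from the finger trace
    along the physical door panels.
    """
    direction = trace_end - trace_start           # panel direction in world space
    yaw = np.arctan2(direction[2], direction[0])  # heading of the physical door
    c, s = np.cos(-yaw), np.sin(-yaw)
    rotation = np.array([[c, 0, s],               # rotation about the Y (up) axis
                         [0, 1, 0],
                         [-s, 0, c]])
    T = np.eye(4)                                 # 4x4 transform for the scene root
    T[:3, :3] = rotation
    T[:3, 3] = pointed_origin                     # door center from the point gesture
    return T
```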

Prop Movement Synchronization
The movement synchronization of the sliding door is done through an optical tracker on a tripod behind the door prop (Figures 10, 11). This tracker consists of an Android phone running a tracking application based on OpenCV⁵. Markers with generated QR codes are printed and taped onto the door panels. To ensure that participants do not occlude tracking during the design process, the QR markers and the tripod with the phone are positioned on the back side of the cardboard door. This low-cost and spatially flexible solution enhances the accessibility of the prototyping method.
The tracking application tracks the spatial position of the markers, converts the movement data into a string message, and sends it to a central server using the MQTT⁶ network messaging protocol. The central MQTT broker posts these messages to the client VR application, which updates the position of the door panels in the virtual scene accordingly.
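A minimal sketch of this phone-side pipeline, assuming OpenCV's built-in QR detector and the paho-mqtt client; the broker address, topic, and message format are invented for illustration, and the authors' app runs on Android rather than desktop Python.

```python
import json
import cv2
import paho.mqtt.client as mqtt

# Illustrative names: broker address, topic, and payload schema are assumptions.
BROKER, TOPIC = "broker.example.org", "wop/door"

client = mqtt.Client()           # paho-mqtt 1.x style constructor
client.connect(BROKER, 1883)
client.loop_start()              # network loop in a background thread

detector = cv2.QRCodeDetector()
capture = cv2.VideoCapture(0)    # camera feed

while True:
    ok, frame = capture.read()
    if not ok:
        break
    data, points, _ = detector.detectAndDecode(frame)
    if points is not None and data:                  # marker found and decoded
        center = points.reshape(-1, 2).mean(axis=0)  # marker center in pixels
        client.publish(TOPIC, json.dumps(
            {"marker": data, "x": float(center[0]), "y": float(center[1])}))
```

On the receiving end, the VR client would subscribe to the same topic and map the marker's pixel displacement onto the door panel's one-DoF slide.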

OBSERVING DESIGN SESSION
In this exploration, we intended to learn how people would approach our system to design the interaction and how WoP could be incorporated with established design methods. After researchers on our team built the props and set up the MR system, we organized a design session. Eight novice designers (3 female, 5 male, age: M=20), students in HCI-related areas from local universities, were invited and randomly paired into groups of two.

Working as collaborative design groups, they were given a speculative scenario: "Envision a future expressive subway door. How would you imagine it interacting with an approaching passenger? What if the passenger also carries a bulky box?" We asked them to engage in three design activities: hand-sketching the door behavior, designing with WoP, and sharing their experiences with researchers. Inspired by the Personality method proposed by [31], we encouraged our participants to design a proactive automatic system while focusing on the aesthetics of interactions.
When starting with paper-and-pen prototyping, participants were more likely to ideate the door behavior from a third-person perspective. While sketching the interactions with different door panel locations and arrows, they also mimicked the panels' movement with their hands when discussing and presenting their ideas. Notably, the door panels always moved symmetrically. Some participants also raised specific technology considerations, such as adding light indicators and weight sensors. The carried box was seldom mentioned in the design process.
Groups designing in WoP first incorporated bodystorming. They started by having the "passenger" in MR approach the door with the box while the other participant responded with door interactions based on their intuition, guiding the researcher-actuator. As the door actuators had to coordinate manually, the groups considered more possibilities, such as asymmetric opening. Though voice was the most common channel for design communication, they fine-tuned the interaction by actuating the doors while the passengers showed their ideas with body gestures.
In the discussion, participants reflected on how these two design sessions complement each other for iteration. Conventional paper-and-pen ideation with face-to-face collaboration enabled a more comprehensive view of the system design and more efficient communication, but it was challenging for novice designers to take human scale into consideration. WoP greatly improved bodily and spatial awareness, which supported interaction design that was in-situ and human-centered. And as WoP provides an informal design stage, improvisational behaviors could emerge.

EXPRESSIVE DOOR INTERACTIONS

Here we illustrate the expressive subway doors ideated by our four design groups. Each group was prompted to converge on one final design at the end of the session.

The "WARNING" Door

This door jiggles to "talk" to the passenger about its impending closure. By acting out the design with WoP, participants realized that the mechanical sound and motion of the jiggling door could effectively communicate the alert that the door is about to close, so they removed their initial light-indicator idea. They commented, "The shaky cardboard emulates the (disrepaired) condition of the actual subway system."
The "ANNOYING" Door The "ANNOYING" Door always contradicts the passenger's expectations, inspired by interpersonal interaction in the WoP process.As the passenger approached, the other participant "played a joke" by blocking the way with the door repeatedly.This led to the passenger's frustration, and only when they gave up and threw the box on the ground, "that is when the door returns to normal." The "OMNISCIENT" Door The "OMNISCIENT" Door can read passengers' intentions with an adaptive response.For instance, aligning the center line of the opening to the passenger's position when it notices their inconvenience with the box or opening the door quickly when they are in a rush.These ideas emerged from the WoP process, highlighting how full-scale in-situ prototyping promoted greater empathy.
The "SHY" Door The "SHY" Door opens when the passenger is not facing it.In the WoP system, it is natural for the passenger to turn their body and walk around to experience the space.When the actuator participant randomly opened the door while the passenger walked backwards, they realized it actually conveyed a sense of shyness.They later tuned the interaction with more hesitated opening behavior while faster closing.

TAKEAWAYS FROM THE TWO DESIGN EXPLORATIONS
Human Scale Matters

Designing environments at human scale is an important consideration [52]: ensuring that the everyday objects we interact with "are of a size and shape that is reasonable for an average person to use." However, when transitioning to an interactive environment, the scale needs to encompass an "animated" layer, where the object's motion range, speed, timing, and so on must all be reasonable for a human, presenting a challenge for novice designers during ideation. Encouragingly, we observed that WoP eased the sense-making of this scale.
In Exploration Two, the integration of physical and virtual elements enhanced participants' spatial awareness of the door, which aided decision-making. One group commented, "Initially, we planned to open the door the same width as the passenger when ideating with paper and pen. But in VR, we realized the actual subway door is narrower, and we ended up opening the entire door." Regarding timing, a group modified the door's opening duration from 30 s, an arbitrarily chosen number, to 5 s after acting it out. They explained, "The door I see in VR is different from how I assume it by looking at the cardboard door, so setting the open time to 5 seconds is more reasonable." Meanwhile, for the speed of the door, before using WoP, a group used the term "fast" to describe the motion. During the WoP session, however, the participant in VR was able to communicate precise instructions to the actuator, requesting them to adjust the speed to be "slightly faster" or "a bit slower." This first-person encounter in an immersive setting brought the notion of speed to life, enabling a more accurate and refined design iteration.
Flexible Fidelity for Prototyping

Choosing the ideal fidelity at which to build a prototype is an art in itself [32]. Higher-fidelity representations result in more accurate interpretations by target audiences, whereas lower fidelity encourages rapid prototyping and leaves more space for creativity [53]. Striking the right balance to create meaningful prototypes requires the designer's tacit knowledge and deep experience.
As an experimental design method, WoP aims to ease designers out of this trade-off. The physical props, which encourage embodied prototyping, are crafted with low-fi materials: using utility knives and glue, the geometry and scale of the props can be easily altered and tailored. Meanwhile, the virtual scene, which can be easily modified digitally, offers flexibility in the fidelity setting.
Another dimension of flexible fidelity is motion quality and behavior: the speed, range, and accuracy of the motion of simulated objects in our explorations are constrained by the physical capabilities of the human who performs it. For instance, human actuators may not be able to mimic the speed of an approaching train, the range of a flying drone, or the mechanical precision of an assembly-line robot.
Analogous to how WoP accommodates pre-defined CAD models alongside hand-sketched objects, a future system may also accommodate programmed animation in addition to human-actuated props. In these situations, coordinating the animations with the prop actuation at the experiencer's interaction points is crucial. For instance, an animation may depict the arriving train, after which the actuator receives a cue to move into place to simulate the door interactions.
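One lightweight way to deliver such cues would be to reuse the MQTT channel from Exploration Two. The sketch below is purely illustrative (topic, payload, and broker are invented, and the paper does not describe such a component): an actuator-side listener that waits for the end of a scripted animation.

```python
import paho.mqtt.client as mqtt

# Hypothetical cue channel: once the scripted train-arrival animation ends,
# the VR app publishes a cue so the human actuator knows to take over the door.
CUE_TOPIC = "wop/cues"

def on_message(client, userdata, message):
    # paho-mqtt 1.x callback signature
    if message.payload.decode() == "train_arrived":
        print("Cue: move into place and begin actuating the door prop")

actuator = mqtt.Client()
actuator.on_message = on_message
actuator.connect("broker.example.org", 1883)
actuator.subscribe(CUE_TOPIC)
actuator.loop_forever()

# VR-side counterpart (run elsewhere), once the animation completes:
#   publisher.publish(CUE_TOPIC, "train_arrived")
```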
Collaborative Design in MR

Collaborative prototyping is a key element of WoP. While the presented systems merge multiple communication modalities, as a first-step experiment to test the concept, we only had one person in VR at a time. In future explorations, we plan to investigate incorporating see-through VR, placing human actuators in AR, or using projection mapping of the VR scene to ensure engagement among all team members, as observed in prior applications [10,39]. Despite these limitations, the current design constraints sparked improvisational ideation.
In the current WoP setup, we observed that conversational communication was the most common form of interaction. However, participants also shared design ideas and details through gestures and by physically manipulating the props. For instance, an experiencer in Exploration Two (E2) would use hand gestures to indicate the desired direction and speed of the door's movement. In more narrative-driven moments, some E2 experiencers creatively used the carried box to jam the door, while others used their foot to block the closing door. These spontaneous improvisations prompted group discussions on how the door should react.
Moreover, the visual separation introduced by the HMD prompted a more physical and improvisational approach than typical face-to-face brainstorming. The experiencer approached the door from a first-person view, while the actuator-participant reacted intuitively based on observing the experiencer's behavior. This separation in grounding due to the HMD led to some misunderstandings between the participants, but these in turn triggered "surprising and accidental" interactions that inspired new possibilities.

A STAGE FOR EMBODIED IDEATION WITH ROLE-PLAYING
We view the WoP environment as a stage where all participants become actors in a hybrid scene, portraying either a human character or an object prop to complete the improvisational performance. Being "masked" through the in-situ hybrid environment and props helps participants quickly engage in role-playing with reduced social awkwardness, which can otherwise hinder the bodystorming process with novices.
When prototyping with WoP, the boundaries between the roles of designer, tester, and facilitator become blurred, and the exchange of roles is fluid. While all participants can be regarded as designers, the person leading the decision-making changes dynamically throughout the process. The human actuator (facilitator) takes more agency in performing interactions by handling the physical props but often experiences only a partial perspective of the interaction, typically not that of the user. In contrast, the experiencer can provide valuable suggestions based on their first-person interpretation in the "preview mode," yet they rely on the facilitator to animate the props in the "edit mode." Such role-switching could become even more seamless with the development of see-through and multi-user MR technology.
When playing a character role as an experiencer, participants can easily explore different behaviors and angles by simply moving and turning their own bodies. This immersive testing experience, with real-scale visual and haptic feedback, prompts a shift in language from "I think the door should" to "I feel I would like." The physicality of the props gives objects greater immediacy and agency: a virtual model has less "weight" in interacting with a person, but a physical door can nudge a person, and a physical object has to be prevented from falling. The physicality of the objects gives the interactions a logic that virtual objects lack.

The role-playing method facilitates a more personal and expressive object-human interaction stemming from interpersonal dynamics. For instance, the design elements of play and humor emerged during the ideation for the "Annoying" door (Figure 16). On the other hand, WoP requires the prop-actuator to act in a way that makes sense for the object's affordances [47], for instance, actuating the door within a "doorway" while pushing the home robot like a "walking Roomba." While future interactions could include more versatile manipulations, such as lifting, bending, or grabbing, the prop-actor's behavior still needs to refer to physical actions allowed and invited by an object or environment. Though defined, operated, and acted out by humans, it is intriguing to see how these social interactions are translated into a new type of "body language" [54] that adapts to an object's affordances in a responsive environment context.

Fig. 1. MR Prototyping to Ideate Responsive Environments

Fig. 9. Fluidity in the Workflow of Prototyping the Mixed Environment

3. Oculus (now Meta) Quest: https://www.oculus.com/experiences/quest/
4. Train Asset: https://www.cgtrader.com/3d-models/interior/other/subway-metro-station