Gamified Virtual Reality for Computational Thinking

In Computer Science education, coding activities are essential for teaching younger students the basics of programming and computational thinking. To provide an immersive experience, in this paper we propose VRCoding, a Virtual Reality (VR)-based block coding system. VRCoding teaches computational thinking in an immersive VR environment, exploiting passive haptics to improve interaction and give tactile feedback to users. Passive haptics is obtained using simple physical placeholders, i.e., textured parallelepipeds, that are tracked in real time and aligned with the coding blocks in VR. We tested the system with a group of secondary school students, who performed simple coding exercises both with a standard monitor-based block coding environment and with the proposed VRCoding block language. Results show positive feedback concerning the sense of presence and the user experience.


INTRODUCTION
Gamification is a relatively new trend that leverages elements of games to engage, motivate, and influence people's behavior in non-game contexts. By integrating game-like elements such as competition, rewards, challenges, and progression into areas such as education, business, health, and marketing, gamification seeks to enhance user participation, drive desired outcomes, and create immersive experiences.
Gamification has been applied in many different domains in the last few years. One of those domains is healthcare [22], where researchers reported positive outcomes in applying gamification and serious gaming in health and wellness contexts related to chronic disease rehabilitation, physical activity, and mental health. In education and training [7], game elements increase students' motivation, engagement, and performance. Moreover, the benefits of gaming techniques have been shown in the Software Engineering field: their application here deserves special attention, given the human-intensive nature of software processes [19]. Finally, gamification has also been applied in corporate environments to improve employees' results in their daily tasks and work [15].
Recently, Virtual Reality (VR) devices have made it possible to devise and implement immersive experiences, thus increasing the level of presence and engagement, which is effective in applications such as learning and training [20,25]. VR technologies are becoming more and more accessible in terms of application development frameworks, usability, and visualization, e.g., head-mounted display (HMD) costs. In particular, VR could revolutionize learning experiences by offering extensive simulation scenarios and engaging hands-on activities.
In Computer Science education, there is a growing interest in leveraging VR to improve how computational thinking [28] and coding concepts are taught and learned [3]. Coding activities play an essential role in developing computational thinking skills, and block coding visual languages have demonstrated effectiveness in introducing basic computational concepts such as sequencing, selection, and iteration [18,30]. Tangible coding (see, e.g., [5]), based on the combination of block languages and physical objects, develops computational thinking through playful coding experiences.
The immersive nature of VR could further enhance the effectiveness of block coding languages by providing a more engaging and interactive environment for learners. However, existing VR systems mainly rely on controller-based interactions, limiting the immersive experience. Although recent advancements have made hand-tracking capabilities more accessible in VR, interactions remain constrained due to the absence of tactile feedback.
This paper presents a gamified immersive coding system based on VR and passive haptic technologies, inspired by standard block coding approaches typically used to teach students the basics of computational thinking. Our proposal builds upon the iCoding game app [6], a 3D/VR game whose scenario consists of a virtual arcade room with custom cabinets. In iCoding, players can move in a virtual game room, select a cabinet by simply approaching it, and solve coding exercises implemented in a 2D arcade game projected on the cabinet's screen. Each virtual cabinet in [6], besides providing a different arcade game, is equipped with a 2D virtual block editor through which players can code solutions to computational thinking challenges. The iCoding system transfers traditional coding activities (based on visual languages and arcade games) into an immersive VR experience. This combination requires additional skills, such as rapidly adapting to the passage from 3D to 2D scenarios during a game. In [6], the virtual coding blocks are manipulated via the Meta Quest controllers. This kind of interaction has several drawbacks, due to the difficulty of moving and combining 2D virtual blocks within a 3D scenario with the sole use of the controllers.
In this paper, we present an extension of the iCoding system based on passive haptics to facilitate the interaction between players and the block coding language defined in [6]. The novelty of our work is the use of passive haptics to improve engagement when interacting with the block coding language needed to solve computational thinking challenges in the virtual game room. Simple physical cardboard blocks, i.e., textured parallelepipeds, are tracked and augmented in the virtual environment to represent a variety of fundamental code blocks, effectively bridging the visual elements of block coding with tangible objects. These blocks serve as interactive elements that users can manipulate and combine intuitively to construct simple programs. Preliminary results show that our system enhances the user experience of block coding compared to a traditional visual editor, offering learners a more intuitive and immersive experience.

RELATED WORK

Block Coding Systems
Logo [1] is probably the father of every modern coding system. Logo is a body-syntonic programming language that controls a turtle used as a pen to draw geometric shapes. Visual Programming Languages (VPLs) are widespread in Computer Science education and educational robotics nowadays. The Google Blockly library provides a customizable block editor that can be integrated into a web app. Sandboxed client-side interpreters for JavaScript and Python can be incorporated and adapted to specific applications (e.g., games, puzzles, coding exercises) in a quite flexible way. Blockly has been used in several coding systems, such as MIT App Inventor and Microsoft MakeCode. Scratch is another popular block coding system designed to stimulate creativity. In all these VPLs, the use of multi-language blocks is aimed at providing a user-friendly User Interface (UI), avoiding the syntax errors that often represent a major obstacle for beginners. VPLs are also used in educational robotics systems such as Lego Mindstorms and Nao. From a design point of view, coding VPLs typically provide a graphical counterpart of basic control structures that may require building non-trivial geometrical shapes (e.g., nested selection and iteration blocks). For this reason, they can also be viewed as tools for improving Human-Computer Interaction (HCI) skills. Scratch Jr, the pre-school version of Scratch, provides a block language specifically designed for storytelling, based on a collection of stripes that simplify the geometry of the resulting program at the cost of a restricted set of available commands. Concerning block coding tools for 3D scenarios, we mention Beetle Blocks [14], a 3D extension of Logo [1] used to create scripts that draw 3D shapes. In [8], PyWeCode, an arcade game implemented in Python, has been proposed to provide a multiplayer experience for traditional block coding systems.
In this context, it is also worth mentioning tangible coding projects, such as Project Bloks or CodeJumper, that combine coding with the physical experience of STEM games such as Lego and Meccano. Pocketcode is a block coding system for smartphones and tablets that also provides blocks for controlling virtual objects using accelerometer and gyroscope data [5]. A pre-school tangible coding system for storytelling, based on simple geometrical shapes and image processing, has been proposed in [5].

Interaction in VR and Passive Haptic Approaches
Most VR devices rely on handheld controllers for basic interaction tasks such as object grasping and manipulation. As VR environments strive for greater realism, there is a growing interest in using the hands as a more natural and intuitive method for interacting with 3D objects [13,23]. Nevertheless, bare-hand interaction suffers from instability problems due to noise in tracking and the lack of tactile feedback [27]. This has led to the emergence of passive haptics as a promising approach. Passive haptics involves incorporating physical objects into the virtual environment, allowing users to touch them while perceiving their virtual counterparts. By integrating physical objects into the virtual world, passive haptics enables a more immersive and natural interaction.
While passive haptics holds great potential for enhancing VR experiences, its practical implementation and effectiveness in various domains are still being explored. The authors in [11] developed a system based on passive haptics that allows remote participants to play cards together in a virtual environment by individually tracking an entire deck of cards represented in the virtual environment. In [10], the authors analyze the smoothness of trajectories during a pick-and-place task performed in immersive VR without tactile feedback and in a Mixed Reality scenario combining VR and physical objects. They found that passive haptics allows smoother movements. Yu et al. [29] presented a prototype that uses origami carried by a drone to provide haptic feedback in VR, delivering the origami to the user's hand just as they are about to touch a virtual object. Ban et al. [2] propose a hand displacement technique called "wormholes", which discontinuously teleports the virtual hand as users insert their hand into a designated hole to interact with a virtual object while simultaneously touching a real prop. This approach offers an alternative to hand redirection methods, where the virtual hand is gradually displaced from its physical position as it approaches a target object.

VR for Learning Environments
Several approaches exist for enriching learning environments and applications via VR technologies, e.g., [17,21].
Mozilla Hubs (https://hubs.mozilla.com/) is a virtual collaboration platform in which it is possible to create 3D scenarios that run directly in a web browser. Hubs allows the creation of customized rooms for holding lectures, classes, and much more, which can be joined via computer, laptop, tablet, or smartphone, but also with a virtual reality headset. Hubs is based on WebXR, a web API that provides access to input and output features associated with virtual and augmented reality devices. Mozilla Hubs does not provide programmable virtual objects beyond basic functionalities. Mozilla Hubs has recently been used for remote teaching, e.g., see [4,9].
3D animation techniques such as Augmented Reality (AR) and VR, embedded in design platforms based on block coding systems such as Scratch, Blockly, StoryBlock, and CoSpaces Edu, are also available. In particular, CoSpaces Edu [12] is a block coding system for creating real-time 3D content. It provides a 3D scene editor integrated with a visual editor for the CoBlocks language, a block language inspired by Google Blockly and enriched with blocks for 3D animation. Users can incorporate different objects and characters in the 3D scene provided by the system. CoBlocks is used to create scripts that manipulate the objects and characters in the scene. The resulting 3D game can be played in VR/AR via smartphones and head-mounted devices. Unity and Unreal Engine are cross-platform game engines developed to create real-time 3D content. Unreal provides a visual language called Blueprints to simplify game development. Both tools offer a set of libraries to create and test virtual and augmented reality applications for the most common head-mounted devices. A collaborative use of Unity for mixing coding activities and VR experiences is described in [5].
It is important to remark that CoSpaces Edu, Unity, and Unreal are 3D development tools not designed for an immersive coding experience. At least in the interaction modes proposed by the three tools, players no longer have access to the code/block editor when immersed in the game. Therefore, they cannot modify the behavior of virtual objects except according to the rules defined a priori by the designer.
Our proposal extends the players' experience by providing a virtual block editor for a fully immersive coding experience.

SYSTEM DESIGN AND IMPLEMENTATION
The system was developed using the Unity game engine, with a Meta Quest 2 serving as the head-mounted display. Here, we describe the two main components of our prototype:
• The immersive VR environment where coding exercises take place, exploiting gamification, i.e., a Virtual Arcade Room where the user can approach different gaming cabinets showing different games. At the beginning of the experience, the games are not working, so the user must solve some coding exercises to enable the missing gaming features.
• The technique that allows stable and reliable interaction with the coding blocks in VR, exploiting passive haptics, thus combining interaction with simple real placeholders superimposed on the coding blocks in VR.

Gamification: the Virtual Arcade Game Room
The VR environment consists of a virtual arcade game room with four different cabinets, each one presenting a different 2D arcade game, namely Albion, Legend of Carl, Dystopian Future, and Pathfinder (see Fig. 2). Each arcade game consists of a graphical game board and a set of coding problems to be solved via the visual programming language UEBlockly, specifically devised in Unity (see [6]). The four arcade games propose a series of coding problems (e.g., moving a sprite in a given direction) that the player can solve by interacting with a virtual block editor for the UEBlockly language.
When the user approaches one of the arcade games and starts playing with it, a message is shown explaining that a coding action is required to make the game work correctly. The UEBlockly editor is displayed on a canvas next to the cabinet, together with the description of the coding exercise to be solved. The exercise must be solved using the proper blocks, creating and assigning the correct variable values, and implementing the correct algorithm.
In Fig. 3, we show a possible solution for the following problem proposed in the Albion game: "Our character must jump to avoid falling into the hole. To make this possible, we need to change the speed value to 1 and, only then, enable the jump through the enableJump variable."
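In conventional textual form, this exercise reduces to a two-step script in which the ordering matters. A hypothetical Python equivalent is sketched below; the `game_state` dictionary is our own illustration, not the actual game's data model (only the `speed` and `enableJump` names come from the problem statement):

```python
# Hypothetical textual equivalent of the UEBlockly solution for the
# Albion exercise; the game_state dictionary is invented for illustration.
def solve_albion(game_state):
    # Order matters: the speed must be changed to 1 first, and only
    # then can the jump be enabled through the enableJump variable.
    game_state["speed"] = 1
    game_state["enableJump"] = True
    return game_state

state = solve_albion({"speed": 0, "enableJump": False})
```

The block editor enforces this ordering visually: the Var/Value assignment for `speed` must be placed above the one for `enableJump`.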

Interaction: Virtual Coding Blocks
To enhance the manipulation of coding blocks, we exploit passive haptics: users manipulate real cardboard blocks while seeing the corresponding coding blocks in the VR environment. The virtual environment is designed as a simple room with a table aligned with the position of its real-world counterpart, which serves as the designated working space. The size of the virtual table is set to match that of the real table, and the two are aligned during an offline stage. This ensures that users know the workspace they are operating in at runtime, helping them avoid pushing objects off the table.
Figure 4 shows the working space in the virtual environment. The Start block symbolizes the initial point of a user's program. To begin constructing their program, users can drag and position a code block of their choice into the white transparent region below the Start block. The green button executes the code, and the red one removes the positioned blocks, thus resetting the scene. Our system provides the following set of commands (the corresponding virtual blocks are shown in Fig. 5):
• Var block, used to create a new variable or to refer to an existing variable (e.g., to modify its value); see the light blue block in Fig. 5.
• Value block, used together with a Var block to assign a value to a variable; see the light green block in Fig. 5.
• FloatOp block, used to compute mathematical operations between variables and/or values.
• BoolOp block, used to compare variables and values; see the light purple block in Fig. 5.
• If block and While block, used to introduce conditional logic and to implement loops; see the orange block in Fig. 5.
The FloatOp block can assign the operation's result to a variable using the Var block, while the BoolOp block is indispensable for the If and While blocks, serving as the required boolean condition. The Var, Value, BoolOp, and FloatOp blocks offer input fields for naming variables, referencing variables, and inserting values. When a user touches an input field, it is selected, and a virtual keyboard appears on the table, enabling the user to specify their inputs. The operational blocks also feature two buttons that allow the user to change the operation type.
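To make the semantics of this command set concrete, it can be modeled as a tiny tree-walking interpreter. The sketch below is our own simplification in Python, not the actual Unity implementation; the class names mirror the block names above, and the evaluation rules are the natural ones:

```python
# Minimal sketch of the six block types as a tiny tree-walking
# interpreter (our own simplification, not the actual Unity code).
class Var:                      # refer to an existing variable
    def __init__(self, name): self.name = name
    def eval(self, env): return env[self.name]

class Value:                    # literal value
    def __init__(self, v): self.v = v
    def eval(self, env): return self.v

class Assign:                   # Var + Value combination: bind a value to a name
    def __init__(self, name, expr): self.name, self.expr = name, expr
    def run(self, env): env[self.name] = self.expr.eval(env)

class FloatOp:                  # arithmetic between variables and/or values
    def __init__(self, op, a, b): self.op, self.a, self.b = op, a, b
    def eval(self, env):
        x, y = self.a.eval(env), self.b.eval(env)
        return {"+": x + y, "-": x - y, "*": x * y, "/": x / y}[self.op]

class BoolOp:                   # comparison, required as the If/While condition
    def __init__(self, op, a, b): self.op, self.a, self.b = op, a, b
    def eval(self, env):
        x, y = self.a.eval(env), self.b.eval(env)
        return {"<": x < y, ">": x > y, "==": x == y}[self.op]

class If:                       # conditional execution of a body of blocks
    def __init__(self, cond, body): self.cond, self.body = cond, body
    def run(self, env):
        if self.cond.eval(env):
            for stmt in self.body: stmt.run(env)

class While:                    # loop over a body of blocks
    def __init__(self, cond, body): self.cond, self.body = cond, body
    def run(self, env):
        while self.cond.eval(env):
            for stmt in self.body: stmt.run(env)

def run_program(blocks, env=None):
    """Execute a sequence of blocks attached below the Start block."""
    env = {} if env is None else env
    for b in blocks: b.run(env)
    return env

# The sample program of Fig. 5: initialize a counter to 10 and
# decrement it in a while loop until it reaches 0.
env = run_program([
    Assign("counter", Value(10)),
    While(BoolOp(">", Var("counter"), Value(0)),
          [Assign("counter", FloatOp("-", Var("counter"), Value(1)))]),
])
```

The nesting of `BoolOp` inside `While` reflects the constraint stated above: a boolean condition block is indispensable for the If and While blocks.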

Real Targets and Virtual Blocks Mapping
To interact with real objects in virtual environments, we must track their 6DoF pose in real time. Following the approach presented in [10], we developed a vision-based approach using the Vuforia SDK.
Given the six distinct types of coding blocks described in the previous section, six physical targets would be necessary to represent each of them in the virtual environment. To ensure easy detection, tracking, and comfortable grasping, we opted for targets measuring 15x5x5 cm. However, considering the restricted field of view of the camera (80°), the resulting working space is also limited. Therefore, using all six targets simultaneously would result in suboptimal tracking performance and an unpleasant user experience. As a result, we decided to use three physical targets instead, identified as the Blue Target, the Purple Target, and the Orange Target. The virtual coding blocks have the same dimensions as the physical ones.
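The limited working space follows directly from pinhole camera geometry. As a back-of-the-envelope sketch, only the 80° field of view comes from our setup; the 50 cm camera-to-table distance below is an assumed figure for illustration:

```python
import math

def workspace_width(fov_deg, distance_m):
    """Horizontal extent covered by a camera with the given field of
    view at the given distance (simple pinhole geometry)."""
    return 2 * distance_m * math.tan(math.radians(fov_deg) / 2)

# 80 degree FOV (from our setup); a ~50 cm camera-to-table distance
# is an assumption for illustration.
w = workspace_width(80, 0.5)   # about 0.84 m of trackable width
```

With six 15 cm targets plus room for the user's hands, a strip of well under a meter fills up quickly, which motivates reducing the number of simultaneous targets to three.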
Since we have six distinct types of coding blocks and three physical targets, a dynamic mapping mechanism is required to establish the association between the virtual coding blocks and their corresponding real targets. Considering a specific exercise that the user needs to solve, at startup we establish the association between the real targets and virtual coding blocks as follows:
• The Blue Target is mapped to the first block required to initiate the program.
• The Purple Target is mapped to a randomly selected block from the set of blocks necessary to complete the exercise.
• The Orange Target is mapped to a randomly selected block from all the available blocks.
At runtime, when the user moves a block and positions it below the Start block to incorporate it into the program, the virtual coding block's position becomes fixed, interrupting the mapping between the real target and the virtual code block. The real target is then represented in the virtual environment as a gray-colored block, identified as the Ghost block (see Figure 6).
Users can then move the Ghost block to the New Block area at the table's edge (shown in Figure 4) to establish a new mapping between the associated real target and a virtual coding block. If the next block required to continue the exercise has not been mapped to a real target yet, that particular virtual coding block is selected. If the next required block is already mapped, for the Blue and Purple targets the new coding block is chosen among the blocks needed to complete the exercise, while for the Orange Target it is chosen from the entire pool of available blocks. In both cases, the selection excludes the blocks already mapped to a real target. This approach ensures that users always have the necessary blocks to continue the exercise and guarantees a diverse pool of blocks for selection.
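The remapping rule above can be sketched as a single selection function. The data structures and function below are our own illustration, not the actual implementation, and for simplicity we assume the solution contains each block type at most once:

```python
import random

# Sketch of the dynamic target-to-block remapping described above
# (block names and data structures are ours, not the actual Unity code).
ALL_BLOCKS = {"Var", "Value", "FloatOp", "BoolOp", "If", "While"}

def remap(target, exercise_seq, placed, mapped):
    """Choose the next virtual block for a physical target moved to the
    New Block area. `exercise_seq` is the ordered solution sequence,
    `placed` the blocks already fixed in the program, and `mapped` the
    blocks currently bound to the other physical targets."""
    remaining = [b for b in exercise_seq if b not in placed]
    # If the next required block is not mapped yet, pick it directly.
    if remaining and remaining[0] not in mapped:
        return remaining[0]
    if target in ("blue", "purple"):    # blocks still needed for the exercise
        pool = [b for b in remaining if b not in mapped]
    else:                               # orange: any available block
        pool = [b for b in ALL_BLOCKS if b not in mapped]
    return random.choice(pool) if pool else None

# The Blue Target always picks up the next required block first.
nxt = remap("blue", ["Var", "Value"], placed=set(), mapped=set())
```

The exclusion of already-mapped blocks in both pools is what guarantees the three physical targets never represent the same virtual block twice.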
Users can press either the red or green button (shown in Figure 4) to remove coding blocks with fixed positions and start building their program anew, or to execute the resulting code again. Improving block deletion will be part of future work.

EVALUATION

Virtual Arcade Room
We conducted a preliminary evaluation of the immersive Virtual Arcade Room with 15 secondary school students enrolled in an internship at the University of Genoa.
Users were immersed in the VR environment using a Meta Quest 2 and interacted using the associated controllers.
Participants were administered two questionnaires, the User Experience Questionnaire (UEQ) [16] and the Slater-Usoh-Steed (SUS) questionnaire [24,26], to gather subjective feedback on user experience and sense of presence, respectively. To analyze the results of the UEQ, we used the data analysis tool provided on the official website. The Presence score for each subject is obtained by averaging the subject's answers to the SUS (range 1-7). The resulting scores were averaged across all subjects for each condition to obtain a global Presence score.
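The presence scoring reduces to two averaging steps, which can be sketched as follows; the answer values below are invented purely for illustration:

```python
def presence_scores(answers_per_subject):
    """Per-subject SUS presence score: mean of that subject's answers
    (each on a 1-7 scale); global score: mean across subjects."""
    per_subject = [sum(a) / len(a) for a in answers_per_subject]
    return per_subject, sum(per_subject) / len(per_subject)

# Invented answers for three subjects on a six-item SUS questionnaire.
subjects = [[5, 6, 5, 6, 5, 6], [4, 4, 5, 5, 4, 5], [6, 6, 6, 5, 6, 6]]
per_subject, global_score = presence_scores(subjects)
```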
The analysis of the free comments reports a generally positive feeling about the idea of teaching coding in VR ("Wonderful idea to mix coding and virtual reality") and about the sense of really being in the game room ("Very interesting, it is like being in another world"). However, many users felt uncomfortable with the interaction ("Commands must be improved and now they limit immersion"). Thus, we propose the VRCoding system to address these limitations.

Virtual Coding Blocks
We conducted a within-subject experimental session to evaluate our system, comparing it with Google Blockly (https://developers.google.com/blockly), a conventional block coding system. We collected data from 8 subjects (age 20.87 ± 4.22), all of whom had normal or corrected-to-normal vision. All participants had general coding experience, but only 5 had specific block coding experience. Among the participants, 6 were high school students enrolled in a Computer Science internship at the University of Genoa.
The participants were asked to solve a simple coding exercise, which involved initializing a variable to 10 and decrementing its value until it reaches 0 using a while loop (see Figure 5).To ensure that all participants started at the same level and to focus on evaluating the user experience rather than learning outcomes, the exercise and its solution were discussed with each participant before the experiment.
The exercise was performed in two experimental conditions (Figure 7):
• VRCoding, where participants used the system described here;
• Blockly, where they used the Blockly interface on a PC.
Each subject tested both experimental conditions in a pseudo-random order to eliminate possible learning effects. During the VRCoding condition, participants were seated in front of the table used as the working space.

Measurements
During the task, data on completion time, the number of errors, and exercise completion status were recorded for both experimental conditions. Errors were determined by counting the number of incorrectly positioned blocks.

Results
Users performance. All 8 participants completed the requested exercise in both conditions. However, one participant made one error in the VRCoding condition, and another participant made one error in the Blockly condition.
Table 1 presents the average completion time across subjects for the VRCoding and Blockly conditions. The t-test analysis indicated a non-significant difference between the two conditions (p = 0.073), although a larger sample size could potentially lead to a statistically significant result.
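For completeness, the paired t-statistic underlying this within-subject comparison can be computed as below. The completion times in the example are invented for illustration; only the analysis method (a paired t-test on per-subject times) is from our protocol:

```python
import math

def paired_t(x, y):
    """Paired-samples t statistic; the p-value is then read off a
    t-distribution with n - 1 degrees of freedom."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)

# Invented per-subject completion times (s), for illustration only.
vr = [120, 118, 121, 119]
pc = [119, 115, 120, 116]
t = paired_t(vr, pc)
```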

CONCLUSION AND DISCUSSION
The preliminary test conducted with the Virtual Arcade Room described in [6] highlighted a potential improvement in the teaching of computational thinking concepts by mixing immersive VR, gamification, and block coding languages. Nevertheless, several interaction issues hampered a fully immersive experience.
For these reasons, we addressed the issue of interacting with virtual blocks in an effective way, exploiting passive haptics to improve interaction with the users' hands and give tactile feedback.
We performed an experiment comparing our novel VRCoding with a traditional PC-based block coding environment (Google Blockly). The participants had to solve a known programming task in the two environments. In this way, we focused our analysis on usability and the sense of presence.
We did not focus on analyzing movement trajectories in the VRCoding condition, since we assumed that this type of system, which exploits passive haptics, leads to more realistic movements compared to those in pure VR [10].
In both conditions, participants successfully completed the given exercise without encountering significant difficulties. Only two participants made errors, with one error occurring in the VRCoding condition and the other in the Blockly condition. In our experimental session, the average time to complete the task was higher for the VRCoding condition. This could be attributed to three main reasons: i) before actually starting to assemble the blocks to solve the exercise, 5 participants spent some time manipulating the blocks in their hands, experiencing passive haptic feedback; ii) 2 participants encountered significant difficulties while interacting with the virtual keyboard, resulting in a substantial amount of time spent providing input for the blocks; iii) on average, 4.8 ± 2.02 instances of object tracking failure were observed during the experiments. These failures led to interruptions in participants' actions and increased completion times for the exercise.
In the SUS questionnaire used to assess the sense of presence, we obtained a higher mean score of 5.5 in the VRCoding condition, compared to a mean score of 4 in the Blockly condition. One of the participants reported difficulty in maintaining focus on the exercise in the Blockly condition due to ambient noise; this issue was not perceived in the VRCoding condition, which provided an immersive virtual environment.
As shown in Figure 8, the user experience evaluation indicated that the VRCoding condition achieved significantly higher scores in terms of Attractiveness, Stimulation, and Novelty. Similar results were observed for Perspicuity, while the Blockly condition received higher scores in terms of Efficiency and Dependability. These findings suggest that the combination of an immersive virtual environment and passive haptics has the potential to enhance learner engagement and overall experience. This may contribute to a more immersive and enjoyable learning environment, ultimately improving the effectiveness of the learning process.
The lower scores observed in Efficiency and Dependability for the VRCoding condition can be primarily attributed to the object tracking failures, which we plan to address in future work.
Preliminary results are promising in terms of usability, perceived novelty and thus engagement, and sense of presence. Future work will consist in improving the stability of the tracking and in embedding our novel VRCoding system, which exploits passive haptics, into the arcade game room of [6], to exploit its positive aspects in terms of immersivity and amusement. Furthermore, our system could be used to implement coding activities based on different block shapes and spatial layouts, e.g., stripes or stacks (see Scratch Jr [30]). Indeed, we could plan different teaching paths with several challenges, each with different geometries, adding further gamification elements.

Figure 3 :
Figure 3: UEBlockly script: to allow the character to jump to avoid falling into the hole (top), the user must write the UEBlockly script shown at the bottom.

Figure 4 :
Figure 4: The virtual table aligned with the real one and used as the working space. The Start block serves as a starting point for the program. The green button executes the code, while the red button removes fixed blocks.

Figure 5 :
Figure 5: Block arrangement to initialize a variable to 10 and decrement its value in a while loop.

Figure 5 illustrates a sample program, which initializes a variable with a value of 10 and decrements its value until it reaches 0.

Figure 6 :
Figure 6: User manipulating a Ghost block after placing a block.

Figure 8 presents the mean scores obtained from the UEQ responses for each scale considered by the questionnaire: Attractiveness, Perspicuity, Efficiency, Dependability, Stimulation, and Novelty.

Figure 7 :
Figure 7: Experimental setup for the VRCoding condition (left) and the Blockly condition (right).

Figure 8 :
Figure 8: User experience evaluation with the UEQ questionnaire for the two experimental conditions. Mean values, averaged across all the participants, are plotted on the visualization scales provided by the official data analysis tool.

Table 1:
Average time (s) to complete the task for the two experimental conditions, with corresponding standard deviations.

Sense of Presence and User Experience. Table 2 shows the results (mean values and standard deviations) obtained from the SUS questionnaire to assess the sense of presence. The t-test analysis confirmed a statistically significant difference between the two conditions (p < 0.01).

Table 2 :
Average Presence score for the two experimental conditions, with corresponding standard deviations.