Let's Make this Fun!: Activities to Motivate Children and Teens to Complete Questionnaires

Research in human-robot interaction and HCI frequently uses questionnaires, but these are boring, and participants, particularly children and teenagers, lack motivation to complete them. To improve participants' enjoyment, engagement, and motivation during this process, we propose adding a fun activity afterwards, where a completed questionnaire is needed for participation. We trialed several ideas during a slow game jam with 11--16-year-olds and a participatory design workshop for zoomorphic robots with 8--11-year-olds. Initial observations suggest these ideas were engaging for participants, helped with quick questionnaire completion, and helped build rapport between the researchers and participants. Therefore, we present guidelines for researchers wanting to use a fun activity or a series of fun activities like this. Further investigation is needed to establish if this approach has an impact on response quality.


INTRODUCTION
Human-robot interaction (HRI) research often relies on participants completing questionnaires to collect subjective data, e.g., Godspeed, NASA TLX, NARS, RoSAS [27]. Participants may also have to complete the same questionnaire multiple times so researchers can compare experimental conditions. Researchers want high response rates and good-quality responses, so the question arises of how to facilitate this. This is very important in experiments involving children and teenagers, who tend to have a shorter attention span [22]. Questionnaire administration is particularly difficult to manage when a group of participants is completing the questionnaires simultaneously, which is a situation we have come across during slow game jams (SGJs) and participatory design (PD) workshops.
Our aim was to make the completion of questionnaires fun, to encourage children and teens to respond to all questions in a timely manner without undue time pressure. Unlike other methods which have gamified questionnaires by digitizing and modifying the questionnaires themselves [4,7,19], we do this by gamifying the process. Our approach incorporates fun activities that use the completed paper questionnaires themselves, meaning this method works with any paper questionnaire, standard or otherwise.
We trialed different activities with 11-16-year-olds during a week-long SGJ on designing serious games for cybersecurity, and with 8-11-year-olds at the end of a half-day PD workshop on designing zoomorphic robots for animal welfare education. Our findings suggest these activities are enjoyable, help build rapport between participants and researchers, are quick, and do not have a negative effect on response quality. We present a set of guidelines for researchers who want to use similar activities to improve participant enjoyment, engagement, and motivation to complete questionnaires.

BACKGROUND

2.1 The Problem
Completing questionnaires is boring, and survey fatigue is a commonly recognized problem [12,15]. In some cases data can be collected via more engaging methods, but questionnaires are often the most appropriate tool [15]. When faced with tasks that are not inherently interesting, enjoyable, or satisfying, like questionnaires, an individual's intrinsic motivation to complete the task tends to be low [20], and researchers may need to rely on participants being sufficiently motivated by the social responsibility to perform their role in the study [12]. Müller et al. [15] suggest increasing participant motivation by explaining the questionnaire's importance or by providing an incentive (usually monetary). However, this approach is not always the most effective with children [26].
Low motivation when doing questionnaires can cause problems. Participants are more likely to engage in 'satisficing' - using suboptimal cognitive effort to answer questions, instead choosing the first 'acceptable' response - and may not even complete the questionnaire in its entirety [13]. This is particularly likely if the questionnaires are long or repeated, and survey fatigue sets in. Satisficing and other problems like non-completion risk researchers being unable to properly address their research questions, wasting valuable time and resources.

2.2 Towards A Solution
2.2.1 Engaging Qualitative Methods. Many studies in HRI and human-computer interaction (HCI) rely on sedentary approaches like interviews and questionnaires to gather subjective data. These approaches require time and effort from the participant, but they are not necessarily designed to be fun or engaging, and time is required to build rapport [23]. Therefore, researchers have explored making qualitative research methods more engaging and enjoyable, particularly in HCI [5,18,23] and research with children [2,18].
For example, cultural probes have been presented as an alternative to questionnaires and aim to make participation more fun and engaging [5]. They also circumvent the issue of building rapport by removing the researcher from the environment under investigation. Cultural probes are toolkits with fun artifacts to interact with, like disposable cameras, paper maps and stickers, and postcards, which participants use to independently carry out tasks in the specific place or environment of interest. Both the artifacts in the toolkit and the activities are designed to be fun and engaging [5]. However, cultural probes require effort and commitment from the participant over a longer period of time, which can still result in incomplete responses and low response rates [18,25].
Walking interviews, such as the Walking & Talking method [23], have been designed to make participation in interviews more engaging. The method involves repeating the same questions and eliciting emotions using standardized tools, such as the Plutchik emotion wheel [17], for a series of places in the urban environment under investigation, which can get repetitive. Several aspects of the Walking & Talking methodology have therefore been specifically designed to make participation more engaging and relaxing, and to build rapport quickly. These include the physical act of walking, which also made it easier for participants to open up, particularly male participants [23]; the personalization of the methodology through participants selecting their own meaningful places and walking routes; and moving towards a distribution of roles in which the researcher and participant are equal conversation partners [23,24]. The in-between time while walking from place to place was also designed to offer physical and mental breaks throughout the interview. These aspects have been shown to result in a lower threshold for participating, improve the quality of responses, and lead to higher intrinsic motivation of participants to complete the study [23,24].

2.2.2 Gamified Questionnaires. Researchers have also tried gamifying questionnaires to improve respondent engagement during web surveys, where response rates are low and dropoff rates are high [10]. Approaches like this tend to incorporate one or more motivational affordances, e.g., points, scoreboards, challenges [6]. Gamified surveys appear to have a positive impact on respondents' psychological factors, like how enjoyable [11,14] and fun [7] they find the questionnaires, but the impact on respondents' behavior, like the prevalence of satisficing, is unclear [10].
Bailey et al. [3] distinguish between hard gamification (i.e., a game with survey questions) and soft gamification (i.e., a survey with game elements) [10]. In hard gamification approaches, where the questionnaire is fully gamified, like Rogoda et al. [19], the results are not always equivalent to the original constructs. In contrast, soft gamification approaches, such as Harms et al. [7], can increase users' perceived fun and willingness to use the survey, but may also result in a slightly lower response rate than a traditional web survey. Likewise, Dorcec et al. [4] incorporated playing a game in between questions of a digital survey, and displayed relevant information on a digital version of a car's dashboard while presenting the survey question as text, to explore participants' willingness to pay for electric vehicle charging. Results indicated that respondents found this game-based survey more stimulating and novel than the traditional text-based survey.
However, the gamification of surveys requires them to be digitized, which takes considerable time and resources. They may also be time- and resource-intensive to administer when a large group of participants is completing questionnaires in person simultaneously, as each participant needs to log on using a separate device. Furthermore, some gamification approaches may prevent researchers from using standardized questionnaires, as rewording the questions into a game may affect construct validity [10,19].

OUR APPROACH
Rather than gamifying the questionnaire itself [7,19] or playing a game in between questions [4], we include a game activity at the end, in which participants need to use their completed paper questionnaire to participate. We also incorporate other game elements, like points and a scoreboard, to tie these game activities together in cases where the same questionnaire needs to be filled out repeatedly throughout the study. Similar to the design of the Walking & Talking method [23], we split the participant's experience into coherent parts punctuated by transitions (i.e., the game activities). The activities incorporate physical movement to promote physical and mental relaxation. In addition, we personalize some of the game activities towards the researcher as a way to more quickly establish rapport between participant and researcher [23].
We trialed our approach on two occasions with two different age groups. In both studies, ethical approval was obtained from the university ethics board, and informed consent was obtained from participants and their parents or guardians using information sheets and signed consent forms. We give examples of activities used in the studies and present some reflections.
3.1 Trial 1: Slow Game Jam

3.1.1 Context. We used this approach during a SGJ that involved 27 participants aged between 11 and 16 years old (M = 12.8) over five consecutive days. A SGJ is a multidisciplinary, collaborative framework for serious game design that puts participants and experts at the center of the design process [1]. We ran a SGJ to co-design serious games to improve teenagers' understanding of cybersecurity. Participants filled out a NASA-TLX questionnaire [8,9] after each research activity to measure their perceived workload. The questionnaire had six items, each scored on a 21-point scale, of which one item was reverse-scored. Over the course of the SGJ, each participant needed to fill out 27 questionnaires, 18 of which were the NASA-TLX. These had to be completed on paper, as digital versions of the scale have been shown to alter results [16].
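To make the scoring concrete, below is a minimal, purely illustrative Python sketch of unweighted ("raw") NASA-TLX scoring. It is not the authors' analysis code; it assumes the common conventions that the 21-point scale is scored 0-20 and that the single reverse-anchored item (Performance in the standard scale) is recoded by subtracting it from the scale maximum before averaging:

```python
SCALE_MAX = 20  # 21-point scale, scored 0-20 (assumed convention)

def score_raw_tlx(responses, reverse_items=("performance",)):
    """Average the six subscales after recoding reverse-anchored items."""
    adjusted = {
        item: (SCALE_MAX - value) if item in reverse_items else value
        for item, value in responses.items()
    }
    return sum(adjusted.values()) / len(adjusted)

# Hypothetical single response using the six standard NASA-TLX dimensions.
example = {
    "mental_demand": 14, "physical_demand": 3, "temporal_demand": 10,
    "performance": 4,  # low raw value = good performance on this anchoring
    "effort": 12, "frustration": 6,
}
print(score_raw_tlx(example))  # 61/6, roughly 10.17
```

Whether the Performance item is recoded, and in which direction, varies between studies; the sketch simply shows why a participant rushing past the reversed anchors would distort the averaged score.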
We employed our gamification approach, and each completed questionnaire was used in an activity. In each activity, participants could score a point, which was tracked on a scoreboard. The participant with the most points at the end of the SGJ received a gift voucher and a winner's certificate.

3.1.2 Activities.
Basketball & Paper Airplanes. In the basketball activity, participants turned their completed questionnaire into a paper ball and had one try to throw it through a basketball hoop to score a point. Different variations of this game with different difficulty levels were played, from a basketball-shaped bin placed on the floor to a basketball hoop hanging high on the wall. Another variation involved turning the questionnaire into a paper airplane and throwing it across a line drawn on the floor, or trying to get it to land in a certain area of the room. The most popular variation among participants (and researchers) was when one of the researchers placed the basketball bin on or above their head, and participants tried to throw their paper ball into the bin (Fig. 1a).
Quiz Questions. To add another type of game, we used a quiz in which three or four researchers each made a statement about themselves, one of which was false. Participants then had to put their filled-out questionnaire in front of the researcher whose statement they thought was false. If they guessed correctly, they were awarded a point. Quiz questions were about the researchers to help build rapport and to limit participants' ability to cheat by searching for answers online. Examples of quiz questions were: (i) [Researcher A] played for the national women's football team, (ii) [Researcher B] speaks Klingon, (iii) [Researcher C] wrote a Christmas show for children's television.

3.2 Trial 2: Participatory Design Workshop
3.2.1 Context. We also used this approach with 24 8-11-year-olds (M = 10.3) during the feedback session of a PD workshop focused on zoomorphic robot design. They had to complete a short, child-friendly intrinsic motivation inventory questionnaire [21] that measured their perceptions of enjoyment of, effort put into, and usefulness of the workshop. The questionnaire had 11 items across three measures, two of which had an item that was reverse-scored.

3.2.2 Activity.
Voting. Participants used their completed questionnaire to vote on their favorite of the two robots they had seen in the demonstration phase of the workshop. They placed their questionnaire in front of the robot they wanted to vote for, and there was a paper crown for the winning robot (Fig. 1b). Participants only needed to complete one questionnaire, so there were no points or scoreboard.

3.3 Reflections
The activities seemed to work well at motivating the participants and injecting fun into the studies. Here we discuss some reflections on the activities based on a review of videos, photos, and written observations from the SGJ and PD workshop, as well as analysis of the completed questionnaires.

Completion Rate.
For the PD workshop, all 24 questionnaires were filled in and handed in without issue, which is not surprising for an in-person study with a single questionnaire. More importantly, for the SGJ, out of 419 questionnaires administered, 388 were completed and returned, a completion rate of 92.6%.

Response Quality. If participants are rushing, they may make errors on attention checks and reverse-scored items. Therefore, for the PD workshop, we used Cronbach's alpha to compare the internal consistency of the enjoyment and effort measures, with and without the reverse-scored items. For the enjoyment subscale, α = 0.91 without the reverse-scored item and α = 0.89 with it, and for the effort subscale, α = 0.78 both without and with the reverse-scored item. The similarity of Cronbach's alpha with and without the reverse-scored items suggests that participants were not rushing to the extent that they failed to read the questions. Such an analysis is unsuitable for the NASA-TLX, as its subscales measure different dimensions of workload. However, a visual inspection of the questionnaires found no obvious satisficing (e.g., straightlining), and participants appeared to provide genuine responses. In our studies the activities seemed not to have a negative impact on response quality, but we caution against using this technique for qualitative responses without further investigation.
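Readers wanting to run the same consistency check can do so in a few lines. Below is a minimal sketch of Cronbach's alpha on an (n_respondents × n_items) score matrix, together with standard recoding of a reverse-worded item; the response data are invented for illustration and are not from either trial:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def reverse_score(item, scale_min=1, scale_max=5):
    """Recode a reverse-worded item so it points the same way as the rest."""
    return scale_min + scale_max - np.asarray(item)

# Hypothetical 5-point responses from six respondents on three items;
# the third item is reverse-worded, so an attentive respondent who
# answers 5, 5 on the first two items answers 1 on the third.
raw = np.array([
    [5, 5, 1],
    [4, 4, 2],
    [4, 5, 1],
    [2, 2, 4],
    [3, 2, 4],
    [1, 2, 5],
])
recoded = raw.copy()
recoded[:, 2] = reverse_score(raw[:, 2])
print(cronbach_alpha(recoded))  # high alpha once the item is recoded
```

Inattentive responders who miss the reversed wording answer the recoded item inconsistently with the rest of the subscale, which drags alpha down; that is why similar alpha values with and without the reversed item indicate careful reading.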

Speed.
In both trials the questionnaires were completed quickly, as participants were eager to take part in the activities. Timestamped photos from the PD workshop show it only took six minutes for the whole process, including the explanation, questionnaire completion, voting, vote count-up, and winner announcement.

Enjoyment.
The activities made typically dull tasks interesting and fun. The voting picked up the energy at the end of the PD workshop, with children eager to place their votes and cheering when the winning robot was announced. During the SGJ, participants were engaged with the game activities and the scores; even once they had taken their turn, they stayed to cheer on other participants and see how they performed. Participants who failed to earn a point wanted a second try. They also came to check their scoreboard ranking and make sure all their points had been counted.

Rapport Building. We found the activities used during the SGJ helped build rapport between the participants and researchers, particularly the 'fun fact' quiz questions, as participants expressed interest and asked the researchers follow-up questions. Developing rapport is particularly useful for longer-term studies, e.g., SGJs, summer schools, and prolonged co-design studies.

GUIDELINES
In this section we summarize guidelines for designing and using game activities like these, firstly for individual activities, as in the PD workshop, and secondly for a series of activities, as in the SGJ.

4.1 Individual Game Activities
For any single game activity, it ideally will:
(1) Require a completed questionnaire to participate
(2) Be flexible to different completion times
(3) Preserve questionnaires for digitization
(4) Check all questionnaires are completed and received
(5) Be matched to the age group
(6) Build in movement
The rationale for these criteria is explained below.
4.1.1 Questionnaire to Participate. The main point of the activity is to motivate questionnaire completion, so, for this to work, a completed questionnaire is necessary to access the activity. A questionnaire can be traded for the means to participate, e.g., to 'buy' a ball, or be a physical part of the activity itself, e.g., as the ball.
4.1.2 Time Flexibility. During both trials we observed that some children and teens may need more time to read, understand, and complete questionnaires. To avoid rushing, we suggest using an activity that does not require everyone to be ready at the same time.
4.1.3 Preserve Questionnaires. While the basketball and paper airplane games were fun, we then had to collect and unfold the questionnaires for processing and storage, which was inconvenient and time-consuming. Having the questionnaire folded or crumpled up also made it hard to check whether the participant had responded to all items and written their name; for four activities in the SGJ, at least one questionnaire had information missing.
4.1.4 Total Check. Particularly for larger groups, it is helpful to build into the activity an opportunity to count up the total questionnaires received. Voting activities work well for this, as the tallying process can highlight if any questionnaires are missing.

4.1.5 Age Appropriate. Activities that are fun for children might not be so fun for teenagers. Some activities are easy to adjust, e.g., selecting appropriate quiz questions. Conversely, activities that rely on fine motor skills might need to be omitted for younger children.

4.1.6 Movement. Many of the research tasks in the studies were sedentary, so we used the activities to bring in movement for physical and mental relaxation [23]. Movement can also be built into 'non-physical' activities, like quizzes and voting, by conducting them in a separate area so participants have to walk over to submit their answer.

4.2 Series of Game Activities
If participants need to complete multiple questionnaires during the course of a study, you may want to link the activities into a series. For a series of activities, we recommend the following:
(1) Tie a chance to earn points to each activity
(2) Display scores openly
(3) Include a prize to motivate participation in activities
(4) Use activities that incorporate a range of skills
The rationale for these criteria is explained below.

Earn Points. Points are a common motivational affordance [6]. In this context, the chance to earn a point further motivates participation in an activity, indirectly motivating questionnaire completion. Clear success criteria (e.g., a bin to land in, a taped line on the floor to reach) make it easy to see who scored a point.

Display Scores. A scoreboard is another of the motivational affordances listed by Hamari et al. [6]. By displaying scores openly, participants can see how many points separate them from the leader and stay motivated to participate in the activities.
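As a purely illustrative sketch of the points-and-scoreboard mechanics described above (the class and method names are our own, hypothetical choices; the studies themselves used a physical scoreboard):

```python
from collections import defaultdict

class Scoreboard:
    """Minimal tracker for points earned across a series of activities.

    Hypothetical sketch only, not tooling from the studies described
    in this paper.
    """

    def __init__(self):
        self.points = defaultdict(int)

    def award(self, participant, n=1):
        """Record that a participant scored in an activity."""
        self.points[participant] += n

    def ranking(self):
        """Return (participant, points) pairs, highest score first."""
        return sorted(self.points.items(), key=lambda kv: kv[1], reverse=True)

board = Scoreboard()
for winner in ["P01", "P02", "P01"]:  # one point per successful activity turn
    board.award(winner)
print(board.ranking())  # → [('P01', 2), ('P02', 1)]
```

Keeping the ranking cheap to recompute supports the behavior we observed, where participants repeatedly came back to check their standing between activities.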

Give Prizes. Providing a prize for the top scorer follows the general survey advice of incentives and motivational affordances [6,15]. We also suggest entering all other participants who complete all the questionnaires into a prize draw, as a way to keep participants completing questionnaires even if they are far off the lead.

Range. For participants to feel they can succeed in the series, and thereby stay motivated to complete the activities, they need a sense of self-efficacy. This becomes tricky if all activities rely on a particular skill that the participant feels they are not good at, e.g., if all are based on motor skills or all on general knowledge. A variety of activities may also help prevent one person getting very far ahead or behind.

FUTURE WORK
This work presents a preliminary exploration of these activities. For future work, we would want to compare the impact of the different activities on participants' psychological factors and behavioral outcomes during the questionnaires. In addition, we would want to assess the impact on response quality for open-ended questions (e.g., average response length). Furthermore, we would also like to test enjoyable research methods in studies with adults.

CONCLUSION
Our aim was to inject some fun into the usually boring task of questionnaire completion in HCI and HRI studies by using game elements to make it more enjoyable and motivating for participants. Results from the trials with children and teenagers indicated that engaging in the activities was fun, did not negatively impact response quality, and can help build rapport quickly between researchers and participants. We hope the proposed fun activities and guidelines will enable other researchers to adopt and further develop this approach to make questionnaires more fun for everyone!