Development of Meal Partner Robot and Applications Towards Enriching Mealtime Experience

Co-eating, that is, eating with others, has many positive effects and enriches our lives. However, many people, including elderly people living alone and children whose parents both work, tend to eat alone in the current social situation. We believe that a robot embodied in the real world and capable of real-time interaction can be a good meal partner. In this paper, we describe the development and applications of a meal partner robot. We present the robot's design concept and its hardware and software configuration, from the viewpoint that a robot serving as a meal partner needs the ability to behave as if it were eating a meal with humans. As an application example of the meal partner robot, we also developed a dialogue interaction system for food delivery ordering and eating behavior expression using the robot's chest and hand displays.


INTRODUCTION
In recent years, robots have been playing an active role in various areas of society. The number of households with robots is also increasing. Robots are expected to become an ever larger part of our lives in the future. Eating is an important element of our daily lives. It is an indispensable part of human life and something that essentially everyone does several times a day. In addition to nutritional intake, meals also have social and cultural meaning.
Co-eating, eating with others, makes mealtimes more enjoyable and improves the quality of life and life satisfaction compared to eating alone [2,4,9]. It has also been reported that co-eating has various effects, such as reducing the risk of depression and mortality in the elderly [22,24], as well as reducing obesity and underweight [12,23]. In recent years, opportunities to eat with others were dramatically reduced during the global COVID-19 pandemic. In addition, an increase in solitary eating has been noted in Japan, along with an increase in the number of single-person, single-parent, and dual-income households. According to "Shokuiku Promotion Policies: FY2017" from the Ministry of Agriculture, Forestry and Fisheries in Japan [16], more than 30 percent of those who frequently eat alone stated that they do not want to eat alone but have no choice, because the time and place of their meals do not suit other people or they have no one to eat with.
We believe that robots can be good dining partners. Much previous research has been carried out on robots involved in mealtime, such as those that assist eating and feeding [7,17] and those that encourage eating while checking the eating status [15]. However, there are not enough studies focusing on the behavior of robots as partners during mealtime. Considering situations in which humans eat together, it would feel strange if the person there with you were not eating. Therefore, a meal partner robot is required not only to engage in interaction such as conversation during the meal but also to look as if it is also eating and enjoying the mealtime. Previous research suggests that a robot's eating behavior can enrich the eating experience [5,6]. This paper describes the development of a meal partner robot that can perform eating behavior expressions, along with some meal-related applications.

RELATED RESEARCH
In order to address the problem of the increasing number of people who are forced to eat alone in the current social situation, co-eating using online video conferencing systems such as Zoom and Skype is becoming more and more popular. Asynchronous co-eating has also been proposed, in which people watch a prerecorded video message during mealtime and repeatedly send video responses to each other [18,19]. There is also research on a system in which people can interact during mealtime with a female avatar who is eating [21]. That system improved the eating experience compared to solitary eating.
In the field of robotics research, a system has been proposed that uses mixed reality (MR) to link a robot's movements with the movements of a food image rendered in an MR head-mounted display, making it appear as if the robot is eating the food [5]. With this system, the enjoyment and perceived deliciousness of the food improved. In other research, a robot expresses eating behavior using a display installed at the tip of its arm [6]. An image of the food is shown on the display, and the robot moves as if bringing the food to its mouth. In this paper, with the aim of performing richer expressions of eating and implementing various applications related to mealtime, we developed a meal partner robot with both an arm-tip display and a chest display.

DESIGN AND SYSTEM OF THE MEAL PARTNER ROBOT

Design Concept
Related research on meal partner robots shows that sharing eating behavior is important. In addition, recognition abilities such as object recognition, person recognition, and speech recognition are useful in mealtime interactions. Furthermore, we think it is important to have a simple design that does not create undue expectations leading to a negative adaptation gap [10,11] and that can make people feel relaxed. Therefore, we decided to base the development of our meal partner robot on the design of a robot for watching over elderly people [20]. This is a simple robot with two arms, about two heads tall. Its design concepts are also suitable for a meal situation: "just having it in a room makes people happy," "lovely," "friendly," "making people want to be kind," and "easy acceptance of being watched over." The design of the original robot's face is based on Lorenz's baby schema [13], which relates to the cuteness of babies. Because the developed meal partner robot is intended to share eating behavior with people, we added a mouth design with an eating expression.

Hardware Configuration
In this subsection, we describe the hardware overview of the developed meal partner robot called Mamoru'22 (Figure 1).The height of the robot is 445mm, the width is 495mm, and the depth is 275mm.

Motor.
Twenty-three KRS series motors are used: three axes for the neck, one for each eye, two for each shoulder, three axes per leg for the movement mechanism consisting of four wheels, and one for each arm's bending mechanism. The motors for the upper limbs were arranged based on the idea that the robot needs to rotate its shoulders on the pitch and roll axes and bend its arms to perform the eating motion. In addition, to expand the capability of interaction during a meal, the robot is designed to move on wheels.
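The per-joint breakdown above can be tallied to confirm the total motor count; the grouping below is our own reading of the text, not an official parts list.

```python
# Tally of the KRS series servo motors by body part, as described above.
motors = {
    "neck": 3,            # three axes
    "eyes": 2,            # one per eye
    "shoulders": 4,       # two per shoulder
    "wheel_legs": 3 * 4,  # three axes per leg, four wheeled legs
    "arm_bending": 2,     # one bending mechanism per arm
}
total = sum(motors.values())
print(total)  # 23
```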

Arm Structure.
We think a meal partner robot needs to present the food being eaten and perform eating motions with its arm. To present the food, there should be a display for showing the food at the tip of the arm; to perform eating motions, the arm should be bendable. To achieve both requirements, we applied the "Soft flexible wire-driven finger mechanism" [8] to the structure of the arm. It consists of two coil springs arranged in parallel, one continuous wire that passes inside each spring, a pulley that folds back the wire, skeletal structures that connect the two springs, and skeletal structures at the tip (pulley side) and root (opposite side to the pulley). When the wire passing through the springs is wound by a motor, the springs bend and the eating arm motion is achieved. A 1.44-inch liquid-crystal display is placed at the tip of the arm, the body part corresponding to the hand, to show images of the food being eaten. Wiring to the display is routed through the inside of the skeletal structure connecting the two springs, preventing the wires from being disconnected by arm movements. A silicone rubber (Dragon Skin FX-Pro from Smooth-On, Inc.) exterior covers the entire arm except the display. This exterior deforms flexibly according to the movement of the internal structure.
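As a rough illustration of the wire-driven bending principle, the sketch below relates the wire length wound by the motor to the resulting bend angle under a constant-curvature assumption. All dimensions here are hypothetical and purely illustrative; the paper does not give the actual geometry.

```python
import math

# Hypothetical geometry: the wire runs at offset R_OFFSET (m) from the
# arm's neutral bending axis; winding in delta_l (m) of wire bends the
# spring backbone by roughly delta_l / R_OFFSET radians. These values
# are illustrative assumptions, not measurements from the robot.
R_OFFSET = 0.015      # wire offset from neutral axis [m]
SPOOL_RADIUS = 0.005  # motor spool radius [m]

def bend_angle_deg(spool_turns: float) -> float:
    """Approximate arm bend angle for a given number of spool turns."""
    delta_l = 2 * math.pi * SPOOL_RADIUS * spool_turns  # wire wound in
    theta = delta_l / R_OFFSET                          # radians
    return math.degrees(theta)

print(round(bend_angle_deg(0.5), 1))  # 60.0 degrees for half a turn
```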

Camera module.
An IMX219-130 camera module for Jetson is installed in each eye. The camera has a resolution of 3280 x 2464 and a viewing angle of 130 degrees. In order to install the camera as far toward the front as possible, to take advantage of the viewing angle without spoiling the design, an eyeball exterior consisting of a black eye part and a white eye part was created.

Sound Module.
ReSpeaker, a microphone array device, is installed inside the exterior of the head. It has one microphone in each of four directions and is capable of localizing sound sources. A small speaker was connected to the audio jack of the microphone array device, which can reduce the influence of the sound emitted from the speaker.

Displays for Eating Behavior.
As mentioned above, a 1.44-inch display is placed at the right hand to show images of food being eaten. A 3.5-inch display is also placed on the chest. The chest display can be used to show images during the expression of eating behavior and can also be used for debugging the robot.

Frame and Exterior.
The frame that connects the motors and the backpack that holds the electrical components are mainly from Kondo Kagaku's KXR robot kit. The exterior of the parts, other than the silicone rubber of the arms, was created by 3D printing with ABS material.

Software Configuration
The overview of the system is shown in Figure 2. The KondoH7 board [1,14] is used to drive the motors. This board has six ICS ports for daisy-chaining servo motors and one COM port for serial communication with a computer, used to control the motors and store the robot's movements.
A Jetson Xavier NX is built into the robot itself as the computer that is connected to the KondoH7 board and sends commands to the motors. The Jetson Xavier NX is also used to process sensors and output images to the displays. The system uses the Robot Operating System (ROS), with the Jetson as the ROS master, and sensor data acquired from the cameras and microphones are converted into a form usable in ROS. Image output to the hand and chest displays is also communicated using ROS topics. The hand display is connected through SPI and the chest display through HDMI. We use different connection methods to avoid complications in display settings associated with operating multiple displays. The robot can also be accessed via SSH from an external PC.
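The topic-based routing of images to the two displays can be sketched as follows. The topic names and the minimal broker class are our own illustration; the actual robot uses ROS publishers and subscribers over equivalent channels.

```python
from collections import defaultdict

class TinyBroker:
    """Minimal stand-in for the ROS topic mechanism (illustrative only)."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        for cb in self._subs[topic]:
            cb(msg)

broker = TinyBroker()
shown = {}

# The hand display (SPI-connected) and chest display (HDMI) each listen
# on their own topic, so the two outputs never interfere with each other.
# Topic names here are hypothetical.
broker.subscribe("/hand_display/image", lambda m: shown.update(hand=m))
broker.subscribe("/chest_display/image", lambda m: shown.update(chest=m))

broker.publish("/hand_display/image", "food_bite.png")
broker.publish("/chest_display/image", "delivery_page.png")
print(shown)  # {'hand': 'food_bite.png', 'chest': 'delivery_page.png'}
```

Keeping each display on its own topic mirrors the design choice in the text: separate connection paths avoid the complications of driving multiple displays from one channel.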

EXAMPLES OF INTERACTION APPLICATION WITH THE DEVELOPED MEAL PARTNER ROBOT
This section describes two examples of applications utilizing the meal partner robot: a food delivery ordering system linked to a delivery website, and eating behavior expressions using chest and hand displays.

Application with Delivery Services Using a Chest Display
Though the main purpose of this research is making mealtime itself more enjoyable, it is also considered useful for the meal partner robot to be able to interact concerning meal preparation. While the developed robot cannot manipulate objects, it can present information using its displays. Utilizing this hardware design, we developed a system for ordering food delivery through dialogue interaction and information presentation on the chest display. Food delivery services have become popular in many countries in recent years, and they can also store past order history and suggest meals according to users' preferences. When using a food delivery service, customers decide on the food they want to eat, search for restaurants offering that food, decide on the restaurant to order from, and then choose from that restaurant's menu. Because of delivery fees, in most cases all food for one meal is ordered from the same restaurant. A flow diagram of the developed food delivery application is shown in Figure 3. Figure 4 shows scenes of ordering delivery using dialogue interaction and presentation of the web pages. The flow of conversation when ordering delivery is implemented using the Rasa dialogue framework [3]. The transition of the delivery service web pages shown on the chest display is controlled using Selenium, a framework for manipulating the browser. The linkage between the dialogue system and the display transition is implemented using ROS, and the program transitions the web pages shown on the chest display according to the intent and entities of the user's speech.
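The core of the linkage described above is a mapping from recognized intents and entities to the next page to display. The sketch below shows that dispatch logic in isolation; the intent names, entity keys, and URLs are hypothetical, and in the real system Rasa extracts the intent while Selenium drives the browser shown on the chest display.

```python
# Hypothetical base URL for a delivery service; not a real endpoint.
BASE = "https://delivery.example.com"

def next_page(current: str, intent: str, entities: dict) -> str:
    """Map a recognized user intent to the web page to display next."""
    if intent == "search_food":
        return f"{BASE}/search?q={entities['food']}"
    if intent == "choose_restaurant":
        return f"{BASE}/restaurants/{entities['restaurant']}"
    if intent == "add_menu_item":
        return f"{BASE}/cart"
    return current  # unrecognized intent: keep the current page

page = next_page(BASE, "search_food", {"food": "ramen"})
print(page)  # https://delivery.example.com/search?q=ramen
```

In the actual system this dispatch sits behind ROS: the dialogue node publishes the intent and entities, and the display node reacts by steering Selenium to the corresponding page.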

Eating Behavior Expressions Using Displays
The developed meal partner robot can make use of the display on its chest to express its eating behavior. We think the display can be used to represent either the outside of the robot (e.g., a plate) or the inside of the robot (e.g., its stomach). Each method of expression is illustrated in Figure 5.
We conducted an online survey using a crowdsourcing service to evaluate which of the two is more appropriate as an expression of eating. The participants were 300 people, consisting of 50 males and 50 females in their teens and 100 males and 100 females in their 20s or older. In the survey, we showed videos of the robot performing each expression and asked the participants to indicate which was the more appropriate expression of eating behavior by the robot. A two-tailed binomial test was used for statistical evaluation. The results are shown in Figure 6. Overall, 133 participants answered that the outside of the robot was the more appropriate expression, while 167 answered that the inside of the robot was more appropriate (p=0.057). Among females in their 20s or older, significantly more felt that the inside of the robot was more appropriate (p=0.021).
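The overall two-tailed binomial test can be reproduced directly from the reported counts (167 of 300 preferring the "inside" expression, under a null hypothesis that both expressions are equally preferred):

```python
from math import comb

# Exact two-tailed binomial test for the overall result: 167 of 300
# participants preferred the "inside of the robot" expression.
n, k = 300, 167
# Under H0 (p = 0.5) the distribution is symmetric around n/2, so the
# two-tailed p-value doubles the upper-tail probability P(X >= k).
upper_tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
p_value = 2 * upper_tail
print(f"{p_value:.3f}")  # close to the reported p=0.057
```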

Figure 1 :
Figure 1: Overview of the developed meal partner robot hardware components. (Top: without exterior; Bottom: with exterior)

Figure 2 :
Figure 2: Total system of the developed meal partner robot.

Figure 3 :
Figure 3: Food delivery ordering system using voice communication and display of the robot.

Figure 4 :
Figure 4: Scenes of ordering food delivery through dialogues and chest display presentations.

Figure 5 :
Figure 5: The flow of the videos that were used in the online survey about eating expressions by the robot.

Figure 6 :
Figure 6: The results of an online survey about the preferences of the eating expressions described in Figure 5.