ABSTRACT

We developed a robotic gadget equipped with a movable weight inside its body. By controlling the movement of this internal weight in concert with other robotic behaviors, such as hand gestures or spoken dialogue, the gadget is expected to enhance emotional and intentional messaging between users. To gain knowledge for designing effective weight shifts, we conducted an elicitation study investigating how users holding the gadget in their hand interpreted its 36 weight shift patterns, which were generated by varying four basic movement parameters: target position, trajectory, speed, and repetition. The results present mappings between these parameters and users' emotional perceptions, and reveal specific weight shift patterns that can express particular human emotions and intentions. These findings will be useful for designing weight shifts that enhance emotional and intentional messaging between users. This study attempts to open a new dimension in the expressive capability of robotic gadgets.
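The abstract states that the 36 weight shift patterns were generated by combining four movement parameters, which suggests a full factorial design. The sketch below illustrates such an enumeration; the concrete factor levels (`target_positions`, `trajectories`, `speeds`, `repetitions`) are purely hypothetical assumptions, chosen only so that their product equals the reported 36 patterns, and are not taken from the study itself.

```python
from itertools import product

# Hypothetical levels for the four weight shift parameters named in the
# abstract. The actual levels used in the study are not given here; these
# counts are illustrative, picked so the full factorial design yields the
# reported 36 patterns (3 x 2 x 3 x 2 = 36).
target_positions = ["left", "center", "right"]
trajectories = ["straight", "curved"]
speeds = ["slow", "medium", "fast"]
repetitions = [1, 2]

# Enumerate every combination of the four parameters.
patterns = list(product(target_positions, trajectories, speeds, repetitions))
print(len(patterns))  # 36
```

Under this assumption, each stimulus presented to a participant would correspond to one tuple, e.g. `("left", "curved", "slow", 2)`.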
OMOY: A Handheld Robotic Gadget that Shifts its Weight to Express Emotions and Intentions