Research Article

Cognitive Robotics on 5G Networks

Published: 16 July 2021

Abstract

Emotional cognitive ability is a key technical indicator of the friendliness of human–robot interaction, so this research explores robots with human-like emotional cognition. After discussing the prospects of 5G technology and cognitive robots, the study focuses on the latter. Because human-like analytical logic is difficult for an emotional cognitive robot to imitate directly, this study divides a robot's information processing into three levels, by analogy with human information processing: information collection, feature extraction, and the cognitive algorithm. A multi-scale rectangular histogram of oriented gradients is used for facial expression recognition, and a robust principal component analysis (RPCA) algorithm is used for micro-expression recognition. For pictures in which humans intuitively perceive a smile within a sad scene, the emotion proportions obtained by the proposed method are: calmness 0%, sadness 15.78%, fear 0%, happiness 76.53%, disgust 7.69%, anger 0%, and astonishment 0%. For micro-expressions in which humans intuitively perceive negative emotions such as surprise and fear, the proportions obtained are: calmness 32.34%, sadness 34.07%, fear 6.79%, happiness 0%, disgust 0%, anger 13.91%, and astonishment 15.89%. These results indicate that the proposed method can accurately recognize emotions: it intuitively reflects the proportions of human expressions, and the recognition methods based on facial expressions and micro-expressions achieve good recognition performance, consistent with human intuitive experience.
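The abstract does not give the exact descriptor used, but the "multi-scale rectangular direction gradient histogram" it names is a HOG-style feature computed over rectangular cells at several scales. The following is a minimal NumPy sketch of that idea, not the authors' implementation: gradient orientations are binned per cell, weighted by gradient magnitude, and the per-cell histograms from two hypothetical cell sizes are concatenated into one descriptor.

```python
import numpy as np

def orientation_histogram(img, cell, bins=9):
    """Magnitude-weighted histogram of gradient orientations over square cells."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)        # unsigned orientation in [0, pi)
    h, w = img.shape
    feats = []
    for y in range(0, h - cell + 1, cell):
        for x in range(0, w - cell + 1, cell):
            m = mag[y:y + cell, x:x + cell].ravel()
            a = ang[y:y + cell, x:x + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0.0, np.pi), weights=m)
            feats.append(hist / (np.linalg.norm(hist) + 1e-8))   # L2-normalize cell
    return np.concatenate(feats)

def multiscale_hog(img, cells=(8, 16)):
    """Concatenate histograms from several cell sizes (the 'multi-scale' part)."""
    return np.concatenate([orientation_histogram(img, c) for c in cells])

# Stand-in for a 32x32 grayscale face crop (random data, illustration only).
face = np.random.default_rng(0).random((32, 32))
desc = multiscale_hog(face)
print(desc.shape)   # (16 + 4) cells x 9 bins = (180,)
```

The resulting fixed-length vector would then feed the cognitive-algorithm level (a classifier over the seven emotion classes); the cell sizes, bin count, and normalization here are illustrative assumptions, as is the random input image.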

