ABSTRACT
Movies are one of the biggest sources of entertainment, in individual and social contexts. By combining diverse symbol systems, such as images, text, music and narration, to tell stories, they often engage viewers perceptually, cognitively and emotionally. Advances in digitalization and networking are enabling access to enormous collections of videos and movies over the Internet, in social media, and through video-on-demand services on iTV. The development of video content-based analysis and classification techniques is also allowing access to more information about, or contained in, the movies, demanding new ways to search, browse and view videos and movies in this scenario. In this paper, we present and evaluate MovieClouds, an interactive web application designed to access, explore and visualize movies based on the information conveyed in the different tracks or perspectives of their content, especially audio and subtitles, where most of the semantics is expressed, and with a special focus on the emotional dimensions expressed in the movies or felt by the viewers. For the overview, analysis, and exploratory browsing of the movie collection and of individual movies, it adopts a unifying tag-cloud paradigm, which gained popularity with Web 2.0, with the aim of extending to movies the power, flexibility, engagement and fun usually associated with clouds.
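As a rough illustration of the tag-cloud idea applied to a movie's subtitle track (a minimal sketch, not the MovieClouds implementation itself), the following Python snippet assumes subtitles are available as a plain .srt file and maps word frequencies to font sizes; the stop-word list, function names and scaling scheme are illustrative assumptions.

```python
import re
from collections import Counter

# A tiny illustrative stop-word list; a real system would use a fuller one.
STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "is", "it", "you", "i"}

def subtitle_words(srt_path):
    """Extract lowercase words from an .srt subtitle file,
    skipping cue numbers and timestamp lines."""
    words = []
    with open(srt_path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.isdigit() or "-->" in line:
                continue
            words += re.findall(r"[a-z']+", line.lower())
    return [w for w in words if w not in STOPWORDS]

def tag_cloud(words, top_n=50, min_size=10, max_size=48):
    """Map the top_n most frequent words to font sizes by linear scaling."""
    counts = Counter(words).most_common(top_n)
    if not counts:
        return []
    hi, lo = counts[0][1], counts[-1][1]
    span = max(hi - lo, 1)
    return [(word, min_size + (count - lo) * (max_size - min_size) // span)
            for word, count in counts]

# Usage (hypothetical file name): list tags and sizes for one movie.
# for word, size in tag_cloud(subtitle_words("movie.en.srt")):
#     print(f"{word}: {size}px")
```

The same frequency-to-size mapping can be reused for tags derived from other tracks (e.g. detected audio events or mood labels), which is what makes the tag cloud a convenient unifying view across content perspectives.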