Abstract
We describe a system that automatically notates a comparative visualization of multiple recorded performances of the same musical work. Written musical scores have transmitted basic performance information to musicians over the ages; however, these scores provide only skeletal instructions that must be fleshed out in performance, because musical notation describes phrasing, articulation, dynamics, accentuation, and other ornamentation in generalized and ambiguous forms. Consequently, performances derived from the same notation can vary as widely as a written text spoken with intense emotion or in flat monotone. Before the advent of recording technology, musical performances were ephemeral: each occurred once, never to be heard again in exactly the same rendition, so musical interpretations were informed only by live listening. Now, with more than a century of recorded performance practice available, musicians can delve deeper into the history of their aural art and gain inspiration and insight from sources that would otherwise have been inaccessible. Performers have become interested in giving performances inspired by recordings of the past, which often obey a musical common sense alien to the standards of modern practice, and the detailed study of recordings helps historically informed performers describe, analyze, emulate, and internalize the performance styles of earlier eras. Although much can be learned by listening, a visual interface may reveal details of a recording that are difficult or impossible to hear. Because performers interact daily with traditional musical notation (a sophisticated, if ambiguous, multidimensional visualization of musical information), one approach to the design of such an interface leverages performers' existing knowledge by narrowing the gap between data visualization and traditional musical notation as much as possible.
Using Abjad, a Python-based tool for musical composition, the symbols of conventional staff notation are augmented to illustrate the intensity and temporal proximity of performed musical events graphically, thus facilitating the comparison of individual performances and the study of changes in performance aesthetics over time.
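The kind of mapping described above (performance data, such as note intensities, rendered as graphical augmentations of conventional staff symbols) can be sketched in a few lines of plain Python that emit LilyPond overrides, the notation format on which Abjad is built. This is an illustrative sketch, not the system's actual implementation: the velocity range and the velocity-to-gray mapping below are assumptions chosen for the example.

```python
def velocity_to_gray(velocity, lo=20, hi=110):
    """Map a MIDI-style velocity to a gray level in [0, 1] (1.0 = white).

    The range [lo, hi] is an assumed practical velocity range; louder
    notes map to darker noteheads.
    """
    v = min(max(velocity, lo), hi)
    return round(1.0 - (v - lo) / (hi - lo), 2)

def tweaked_note(lilypond_note, velocity):
    """Prefix one LilyPond note with a \\tweak that colors its notehead."""
    g = velocity_to_gray(velocity)
    return f"\\tweak NoteHead.color #(rgb-color {g} {g} {g}) {lilypond_note}"

# Example: three performed notes of decreasing intensity become
# progressively lighter noteheads in the engraved output.
performed = [("c'4", 96), ("d'4", 64), ("e'4", 30)]
fragment = " ".join(tweaked_note(note, vel) for note, vel in performed)
```

Pasting `fragment` into a LilyPond staff expression would engrave the three notes with intensity-scaled shading; inter-onset intervals from a recording could be visualized analogously by overriding horizontal spacing.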
Automated Notation of Piano Recordings for Historic Performance Practice Study