Automated Notation of Piano Recordings for Historic Performance Practice Study

Published: 05 August 2014

Abstract

We describe a system that automatically notates a comparative visualization of multiple recorded performances of the same musical work. Written musical scores have transmitted basic performance information to musicians over the ages; however, these scores provide only skeletal instructions that must be fleshed out in performance, as musical notation describes phrasing, articulation, dynamics, accentuation, and other ornamentations in generalized and ambiguous forms. Consequently, musical performances derived from the same notation can vary widely from one another, in the same manner that a written text may be spoken with intense emotion or in flat monotone. Prior to the advent of recording technology, musical performances were ephemeral, occurring only once, never to be heard again in exactly the same rendition. As a result, musical interpretations were informed only by live listening.

Now, with more than a century of recorded performance practice, musicians can delve deeper into the history of their aural art to gain inspiration and insight from sources that would otherwise have been inaccessible. Performers have become interested in giving performances inspired by recordings of the past, which often obey a musical common sense alien to the standards of modern practice, and it is useful for historically informed performers to describe, analyze, emulate, and internalize the performance styles of the past through the detailed study of recordings.

Although much can be learned by listening, a visual interface may reveal potentially inaudible details of a recording. Because performers interact daily with traditional musical notation (a sophisticated, if ambiguous, multidimensional visualization of musical information), one approach to the design of such an interface leverages performers' existing knowledge by reducing the gap between data visualization and traditional musical notation as much as possible.
Using Abjad, a Python-based tool for musical composition, the symbols of conventional staff notation are augmented to illustrate the intensity and temporal proximity of performed musical events graphically, thus facilitating the comparison of individual performances and the study of changes in performance aesthetics over time.
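As a rough illustration of the kind of data such a system works with, the sketch below maps MIDI-style note velocities to grayscale shades (so that louder notes could be rendered darker) and computes per-event timing offsets between two performances of the same passage. This is a minimal sketch under invented assumptions: the function name, the data, and the grayscale mapping are all hypothetical and do not come from the paper, whose actual pipeline is built on Abjad and LilyPond engraving.

```python
# Hypothetical sketch (not the paper's code): derive intensity shading
# and timing-deviation data of the sort a comparative notation system
# might render onto a score.

def velocity_to_gray(velocity: int) -> float:
    """Map a MIDI-style velocity (0-127) to a grayscale value,
    where louder notes render darker (0.0 = black, 1.0 = white)."""
    if not 0 <= velocity <= 127:
        raise ValueError(f"velocity out of range: {velocity}")
    return round(1.0 - velocity / 127.0, 2)

# Two performances of the same passage, as (onset_seconds, velocity)
# pairs -- invented data for illustration.
performance_a = [(0.00, 64), (0.52, 80), (1.07, 96)]
performance_b = [(0.00, 70), (0.48, 60), (0.95, 110)]

# Temporal proximity: how early or late each event in performance B
# falls relative to the corresponding event in performance A.
offsets = [round(b[0] - a[0], 2)
           for a, b in zip(performance_a, performance_b)]

# Intensity shading for performance B's note heads.
shades = [velocity_to_gray(v) for _, v in performance_b]

print(offsets)  # per-event timing deviations, in seconds
print(shades)   # smaller values mark louder (darker) notes
```

In a full system, values like these would be attached to engraved symbols (for example, as note-head color overrides in LilyPond output) rather than printed; the point here is only the shape of the performance-to-notation mapping.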

