research-article

Catch me if you can: performance bug detection in the wild

Published: 22 October 2011

Abstract

Profilers help developers to find and fix performance problems. But do they find performance bugs -- performance problems that real users actually notice? In this paper we argue that -- especially in the case of interactive applications -- traditional profilers find irrelevant problems but fail to find relevant bugs.

We then introduce lag hunting, an approach that identifies perceptible performance bugs by monitoring the behavior of applications deployed in the wild. The approach transparently produces a list of performance issues, and for each issue provides the developer with information that helps in finding the cause of the problem.
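The abstract does not show how perceptible lag is detected, but the basic mechanism it describes -- intercepting event handling in a deployed application, timing each dispatch, and recording episodes long enough for a user to notice -- can be sketched as follows. Everything in this sketch (the `LagMonitor` class, the `Handler` interface, the 100 ms perceptibility threshold) is an illustrative assumption, not the authors' implementation.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of lag monitoring: wrap an event handler,
// time each dispatch, and record "lag episodes" that exceed a
// perceptibility threshold.
public class LagMonitor {
    // 100 ms is a commonly cited perceptibility threshold; the
    // threshold used by the actual system is an assumption here.
    static final long THRESHOLD_NANOS = 100_000_000L;

    public interface Handler { void handle(String event); }

    public static class Episode {
        final String event;
        final long nanos;
        Episode(String event, long nanos) { this.event = event; this.nanos = nanos; }
    }

    final List<Episode> episodes = new ArrayList<>();

    // Returns a wrapped handler that records slow dispatches.
    public Handler wrap(Handler inner) {
        return event -> {
            long start = System.nanoTime();
            inner.handle(event);
            long elapsed = System.nanoTime() - start;
            if (elapsed >= THRESHOLD_NANOS) {
                episodes.add(new Episode(event, elapsed));
            }
        };
    }

    public static void main(String[] args) {
        LagMonitor monitor = new LagMonitor();
        Handler slow = monitor.wrap(e -> {
            try { Thread.sleep(150); } catch (InterruptedException ignored) {}
        });
        Handler fast = monitor.wrap(e -> { /* returns immediately */ });
        slow.handle("open-file");
        fast.handle("key-press");
        // Only the slow dispatch crosses the threshold.
        System.out.println("episodes=" + monitor.episodes.size()); // prints: episodes=1
    }
}
```

In a real deployment the wrapping would be injected into the application's event dispatch machinery (e.g., via bytecode instrumentation) rather than applied by hand, and recorded episodes would be shipped back to the developers for aggregation into issues.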

We evaluate our approach with an experiment where we monitor an application used by 24 users for 1958 hours over the course of 3 months. We characterize the resulting 881 issues, and we find and fix the causes of a set of representative examples.



Published in

ACM SIGPLAN Notices, Volume 46, Issue 10 (OOPSLA '11)
October 2011, 1063 pages
ISSN: 0362-1340
EISSN: 1558-1160
DOI: 10.1145/2076021

OOPSLA '11: Proceedings of the 2011 ACM international conference on Object oriented programming systems languages and applications
October 2011, 1104 pages
ISBN: 9781450309400
DOI: 10.1145/2048066

Copyright © 2011 ACM

Publisher: Association for Computing Machinery, New York, NY, United States
