
Sampling-based program execution monitoring

Published: 13 April 2010

Abstract

Owing to its high overall cost during product development, program debugging is an important aspect of system development. Debugging is a hard and complex activity, especially in time-sensitive systems, which have limited resources and demanding timing constraints. System tracing is a frequently used technique for debugging embedded systems; a specific use of tracing is to monitor and debug control-flow problems in programs. However, tracing is difficult to implement because of the potentially high overhead it can introduce and the changes it can cause to system behavior. To address these problems, we present a sampling-based approach to execution monitoring that helps developers debug time-sensitive systems such as real-time applications. We build a system model and propose three theorems for determining the sampling period in different scenarios. We also design seven heuristics and an instrumentation framework that extend the sampling period, reducing the monitoring overhead and achieving an optimal trade-off between accuracy and the overhead introduced by instrumentation. Using this monitoring framework, the information extracted through sampling can be used to reconstruct the system state and execution paths and thus locate the deviation.
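The core idea the abstract describes can be illustrated with a small sketch. This is not the paper's implementation; it is a hypothetical toy in which an execution is a sequence of basic-block IDs, the monitor records only every `period`-th block, and a plausible path between consecutive samples is reconstructed by breadth-first search over a toy control-flow graph (`CFG`, `sample`, `shortest_path`, and `reconstruct` are all illustrative names):

```python
# Toy sketch of sampling-based execution monitoring (illustrative only):
# sample every `period`-th executed basic block, then reconstruct a
# plausible execution path between samples via BFS over the CFG.
from collections import deque

CFG = {  # basic block -> successor blocks (toy control-flow graph)
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D"],
    "D": ["E"],
    "E": [],
}

def sample(trace, period):
    """Record the first block and every `period`-th block thereafter."""
    return trace[::period]

def shortest_path(cfg, src, dst):
    """BFS over the CFG: one plausible path from src to dst."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in cfg.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

def reconstruct(cfg, samples):
    """Stitch BFS segments between consecutive samples."""
    path = [samples[0]]
    for a, b in zip(samples, samples[1:]):
        seg = shortest_path(cfg, a, b)
        path.extend(seg[1:])
    return path

trace = ["A", "B", "D", "E"]   # actual execution
samples = sample(trace, 2)     # -> ["A", "D"]
print(reconstruct(CFG, samples))  # prints ['A', 'B', 'D']
```

A longer sampling period lowers overhead but leaves more ambiguity between samples (here, the segment A→D could have gone through B or C), which is exactly the accuracy/overhead trade-off the paper's heuristics target.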

Published in:
ACM SIGPLAN Notices, Volume 45, Issue 4 (LCTES '10), April 2010, 170 pages.
ISSN: 0362-1340. EISSN: 1558-1160. DOI: 10.1145/1755951.

Also in:
LCTES '10: Proceedings of the ACM SIGPLAN/SIGBED 2010 conference on Languages, compilers, and tools for embedded systems, April 2010, 184 pages.
ISBN: 9781605589534. DOI: 10.1145/1755888.

Copyright © 2010 ACM.
Publisher: Association for Computing Machinery, New York, NY, United States.
Published: 13 April 2010.
Qualifier: research-article.
