
Directed incremental symbolic execution

Published: 04 June 2011

Abstract

The last few years have seen a resurgence of interest in symbolic execution -- a program analysis technique developed more than three decades ago to analyze program execution paths. Scaling symbolic execution and other path-sensitive analysis techniques to large systems remains challenging despite recent algorithmic and technological advances. An alternative to solving the scalability problem directly is to reduce the scope of the analysis. One approach that is widely studied in the context of regression analysis is to analyze the differences between two related program versions. While such an approach is intuitive in theory, finding efficient and precise ways to identify program differences and characterize their effects on program execution has proved challenging in practice.
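The core idea of symbolic execution mentioned above -- running a program on symbolic rather than concrete inputs, forking at each branch, and collecting a path condition per path -- can be sketched in a few lines. The toy program and the brute-force satisfiability check below are illustrative stand-ins of our own; a real engine would represent constraints symbolically and query an SMT solver such as Z3.

```python
from itertools import product

# Toy program under analysis, with two branches over inputs x, y:
#   z = x - y if x > y else y - x
#   return "big" if z > 5 else "small"

def symbolic_paths():
    """Fork at every branch and record each path's condition.

    Each path condition is a predicate over concrete inputs that holds
    exactly when execution follows the chosen branch outcomes.
    """
    paths = []
    for gt in (True, False):        # outcome of `x > y`
        for big in (True, False):   # outcome of `z > 5`
            def pc(x, y, gt=gt, big=big):
                z = x - y if gt else y - x
                return ((x > y) == gt) and ((z > 5) == big)
            label = (("x > y" if gt else "x <= y"),
                     ("z > 5" if big else "z <= 5"))
            paths.append((label, pc))
    return paths

def satisfiable(pc, domain=range(-10, 11)):
    # Stand-in for an SMT solver: brute-force search over a small domain.
    return any(pc(x, y) for x, y in product(domain, repeat=2))

feasible = [label for label, pc in symbolic_paths() if satisfiable(pc)]
```

Here all four branch combinations are feasible; in realistic programs many combinations are contradictory, and the solver's job is to discard those infeasible paths.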

In this paper, we present Directed Incremental Symbolic Execution (DiSE), a novel technique for detecting and characterizing the effects of program changes. The novelty of DiSE lies in combining the efficiency of static analysis techniques, which compute program difference information, with the precision of symbolic execution, which explores program execution paths and generates the path conditions affected by the differences. DiSE is complementary to other reduction and bounding techniques developed to improve symbolic execution. Furthermore, DiSE does not require analysis results to be carried forward as the software evolves -- only the source code for two related program versions is required. A case study of our implementation of DiSE illustrates its effectiveness at detecting and characterizing the effects of program changes.
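As a rough illustration of the kind of reduction described above (a hypothetical miniature, not the paper's actual algorithm), the sketch below models a program as a small tree of branch conditions, marks one condition as change-affected (standing in for the output of the static difference analysis), and prunes any subtree that neither contains nor has already evaluated an affected condition. The names `CFG` and `AFFECTED` are invented for the example.

```python
# A tiny branching structure as nested tuples:
#   ("cond", then_subtree, else_subtree), with strings as leaves.
CFG = ("c1",
       ("c2", "A", "B"),
       ("c3", "C", "D"))

AFFECTED = {"c3"}  # assumed result of the static change-impact phase

def explore(node, pc=()):
    """Exhaustive exploration: yield (path_condition, leaf) for every path."""
    if isinstance(node, str):
        yield pc, node
        return
    cond, then_b, else_b = node
    yield from explore(then_b, pc + ((cond, True),))
    yield from explore(else_b, pc + ((cond, False),))

def contains_affected(node):
    """Does this subtree contain any change-affected condition?"""
    if isinstance(node, str):
        return False
    cond, then_b, else_b = node
    return (cond in AFFECTED
            or contains_affected(then_b)
            or contains_affected(else_b))

def directed(node, pc=()):
    """Directed exploration: skip subtrees that cannot involve a change."""
    if isinstance(node, str):
        if any(c in AFFECTED for c, _ in pc):
            yield pc, node
        return
    cond, then_b, else_b = node
    for branch, taken in ((then_b, True), (else_b, False)):
        sub_pc = pc + ((cond, taken),)
        if contains_affected(branch) or any(c in AFFECTED for c, _ in sub_pc):
            yield from directed(branch, sub_pc)
```

With `c3` marked as affected, exhaustive exploration visits all four paths while the directed variant visits only the two that evaluate `c3`, mirroring how difference information can shrink the set of path conditions a symbolic executor must generate.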



Published in

ACM SIGPLAN Notices, Volume 46, Issue 6 (PLDI '11), June 2011, 652 pages
ISSN: 0362-1340
EISSN: 1558-1160
DOI: 10.1145/1993316

PLDI '11: Proceedings of the 32nd ACM SIGPLAN Conference on Programming Language Design and Implementation
June 2011, 668 pages
ISBN: 9781450306638
DOI: 10.1145/1993498
General Chair: Mary Hall
Program Chair: David Padua

Copyright © 2011 ACM

Publisher: Association for Computing Machinery, New York, NY, United States

