Research Article

MPSoC Software Debugging on Virtual Platforms via Execution Control with Event Graphs

Published: 13 October 2016

Abstract

Virtual platforms (VPs) are advantageous for developing and debugging complex software for multi- and many-processor systems-on-chip (MPSoCs). VPs provide unrivaled controllability and visibility of the target, which can be exploited to examine bugs that cannot easily be reproduced on real hardware (e.g., bugs originating from races or occurring while a processor is in a stand-by state). However, VPs as employed in practice for debugging are generally underutilized. The accompanying debug ecosystem is based mostly on traditional tools, such as step-based debuggers and traces, that fall short of addressing the enormous complexity of modern MPSoCs and their parallel software. Finding a bug is still largely left to the developer's experience and intuition, using manual means rather than automated or systematic solutions that exploit the controllability and visibility of VPs. How to fully profit from VPs for MPSoC software debugging thus remains an open question. To bridge this gap, this article presents a novel framework for debug visualization and execution control that, relying on the many benefits of VPs, helps to identify and test possible concurrency-related bug scenarios. The framework allows developers to examine and steer the target system by manipulating an abstract graph that highlights relevant inter-component interactions and dependencies. The proposed framework reduces the effort required to understand complex concurrency patterns and helps to expose bugs. Its efficacy is demonstrated on (i) a shared-memory symmetric multiprocessing platform executing Linux and parallel benchmarks, and (ii) a distributed automotive system for driver-assistance applications.
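To illustrate the kind of abstraction the abstract refers to, the following is a minimal sketch (not the paper's implementation; all names and structure are assumptions) of a happens-before event graph that a VP-based debugger could record from inter-component interactions such as lock hand-offs, and then query to find event pairs that are unordered and therefore candidates for steering into a suspected race:

```python
# Illustrative sketch only: a happens-before event graph over
# target events observed on a virtual platform. The class and
# method names are hypothetical, not from the article.
from collections import defaultdict

class EventGraph:
    def __init__(self):
        self.events = []               # (event_id, component, label)
        self.edges = defaultdict(set)  # event_id -> successor event ids

    def add_event(self, component, label):
        eid = len(self.events)
        self.events.append((eid, component, label))
        return eid

    def add_dependency(self, src, dst):
        # Record that src happens-before dst (e.g., program order on
        # one core, or a release -> acquire edge on a shared lock).
        self.edges[src].add(dst)

    def happens_before(self, a, b):
        # DFS over dependency edges: does a transitively precede b?
        stack, seen = [a], set()
        while stack:
            e = stack.pop()
            if e == b:
                return True
            if e in seen:
                continue
            seen.add(e)
            stack.extend(self.edges[e])
        return False

    def concurrent(self, a, b):
        # Unordered in both directions: a potential race window that
        # an execution-control tool could deliberately steer into.
        return not self.happens_before(a, b) and not self.happens_before(b, a)

# Example: two cores access shared data; the lock orders one pair
# of writes, while a third write stays unsynchronized.
g = EventGraph()
w0 = g.add_event("core0", "write x")
r0 = g.add_event("core0", "release lock")
a1 = g.add_event("core1", "acquire lock")
w1 = g.add_event("core1", "write x")
u1 = g.add_event("core1", "unsynchronized write y")
g.add_dependency(w0, r0)   # program order on core0
g.add_dependency(r0, a1)   # lock hand-off orders the two cores
g.add_dependency(a1, w1)   # program order on core1
```

Here `g.concurrent(w0, u1)` holds while `g.concurrent(w0, w1)` does not, which is the sort of distinction a dependency graph makes visible at a glance where a raw instruction trace would not.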

