Type stability in Julia: avoiding performance pathologies in JIT compilation

Published: 15 October 2021

Abstract

As a scientific programming language, Julia strives for performance but also provides high-level productivity features. To avoid performance pathologies, Julia users are expected to adhere to a coding discipline that enables so-called type stability. Informally, a function is type stable if the type of the output depends only on the types of the inputs, not their values. This paper provides a formal definition of type stability as well as a stronger property of type groundedness, shows that groundedness enables compiler optimizations, and proves the compiler correct. We also perform a corpus analysis to uncover how these type-related properties manifest in practice.
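The informal definition above can be illustrated with a small Julia sketch (our own illustration, not taken from the paper; the function names are ours). `pos` is type unstable: given an `Int`, it returns an `Int` or a `Float64` depending on the argument's *value*, so Julia's type inference can only conclude `Union{Int, Float64}` for the result. `pos_stable` fixes this in the idiomatic way, using `zero(x)` so the return type is determined by the input type alone:

```julia
# Type-unstable: the return type depends on the *value* of x.
# For an Int argument the result may be Int or Float64, which
# forces boxing and dynamic dispatch in downstream code.
pos(x) = x > 0 ? x : 0.0

# Type-stable rewrite: zero(x) has the same type as x, so the
# return type depends only on the input type and the compiler
# can emit specialized machine code.
pos_stable(x) = x > 0 ? x : zero(x)

typeof(pos(1))          # Int64 (on 64-bit systems)
typeof(pos(-1))         # Float64 -- same input type, different output type
typeof(pos_stable(1))   # Int64
typeof(pos_stable(-1))  # Int64
```

In practice, Julia users check for this kind of instability with the `@code_warntype` macro, which highlights inferred `Union` and `Any` result types.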


Supplemental Material

Auxiliary Presentation Video

Presentation video for the OOPSLA '21 research track talk: "Type Stability in Julia: Avoiding Performance Pathologies in JIT Compilation".

