
Multi-Objective Optimization of Deployment Topologies for Distributed Applications

Published: 20 January 2018

Abstract

Modern applications are typically implemented as distributed systems comprising several components. Deciding where to deploy which component is a difficult task that today is usually assisted only by logical topology recommendations. Choosing an inefficient topology allocates the wrong amount of resources, incurs unnecessary operating costs, or results in poor performance. Testing different topologies to find good solutions takes a lot of time and can delay productive operations. Therefore, this work introduces a software-based deployment topology optimization approach for distributed applications. We use an enhanced performance model generator that extracts models from operational monitoring data of running applications. The extracted model is used to simulate performance metrics (e.g., resource utilization, response times, throughput) and runtime costs of distributed applications. Subsequently, we introduce a deployment topology optimizer, which selects an optimized topology for a specified workload and considers on-premise, cloud, and hybrid topologies. Three optimization goals are presented in this work: (i) minimum response time for an optimized user experience, (ii) resource utilization that approximates specified target peaks, and (iii) minimum cost for running the application. To evaluate the approach, we use the SPECjEnterpriseNEXT industry benchmark as the distributed application in an on-premise and in a cloud/on-premise hybrid environment. The evaluation demonstrates the accuracy of the simulation by deploying an optimized topology and comparing measurements of the actual deployment with simulation results.
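The abstract describes selecting a deployment topology against three competing objectives (response time, target utilization, cost) using simulated metrics. As a minimal illustrative sketch only, and not the paper's actual optimizer, the selection step can be thought of as a Pareto filter over simulated candidates; the `Topology` fields, the `TARGET_UTIL` value, and the candidate numbers below are all hypothetical placeholders:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Topology:
    name: str
    response_time_ms: float  # simulated mean response time
    utilization: float       # simulated CPU utilization (0..1)
    cost_per_hour: float     # simulated runtime cost

TARGET_UTIL = 0.6  # assumed utilization peak the optimizer should approximate

def objectives(t: Topology):
    # Lower is better for all three: response time, cost,
    # and distance of utilization from the target peak.
    return (t.response_time_ms, t.cost_per_hour, abs(t.utilization - TARGET_UTIL))

def pareto_front(candidates):
    """Return topologies that no other candidate dominates in every objective."""
    front = []
    for t in candidates:
        kt = objectives(t)
        dominated = any(
            all(o <= s for o, s in zip(objectives(u), kt)) and objectives(u) != kt
            for u in candidates if u is not t
        )
        if not dominated:
            front.append(t)
    return front

# Hypothetical candidates with made-up simulated metrics:
candidates = [
    Topology("on-premise", 120.0, 0.55, 2.0),
    Topology("cloud",       90.0, 0.80, 3.5),
    Topology("hybrid",     130.0, 0.50, 2.5),
]
```

Here the "hybrid" candidate is dominated (worse on all three objectives than "on-premise"), so only the non-dominated topologies remain for the final goal-specific choice. The paper's actual approach derives these metrics from models extracted out of operational monitoring data rather than from hard-coded values.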

