Software Microbenchmarking in the Cloud. How Bad is it Really?
Journal article, 2019

Rigorous performance engineering traditionally assumes measuring on bare-metal environments to control for as many confounding factors as possible. Unfortunately, some researchers and practitioners might not have the access, knowledge, or funds to operate dedicated performance-testing hardware, making public clouds an attractive alternative. However, shared public cloud environments are inherently unpredictable in terms of the system performance they provide. In this study, we explore the effects of cloud environments on the variability of performance test results and to what extent slowdowns can still be reliably detected even in a public cloud. We focus on software microbenchmarks as an example of performance tests and execute extensive experiments on three different well-known public cloud services (AWS, GCE, and Azure) using three different cloud instance types per service. We also compare the results to a hosted bare-metal offering from IBM Bluemix. In total, we gathered more than 4.5 million unique microbenchmarking data points from benchmarks written in Java and Go. We find that the variability of results differs substantially between benchmarks and instance types (by a coefficient of variation from 0.03% to > 100%). However, executing test and control experiments on the same instances (in randomized order) allows us to detect slowdowns of 10% or less with high confidence, using state-of-the-art statistical tests (i.e., Wilcoxon rank-sum and overlapping bootstrapped confidence intervals). Finally, our results indicate that Wilcoxon rank-sum manages to detect smaller slowdowns than overlapping confidence intervals in cloud environments.
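
The slowdown-detection protocol described above (test and control runs interleaved in randomized order on the same instance, then compared with a Wilcoxon rank-sum test and with overlapping bootstrapped confidence intervals) can be illustrated with a minimal sketch. The following Python example is not the authors' analysis code: the timing data, sample sizes, and 95% confidence level are hypothetical, and it uses scipy's Mann-Whitney U routine (equivalent to the Wilcoxon rank-sum test) together with a simple percentile bootstrap for the confidence intervals.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical data: execution times (ms) from control and test runs
# interleaved in randomized order on the same cloud instance; the test
# sample simulates a 10% slowdown on top of measurement noise.
control = rng.normal(loc=100.0, scale=5.0, size=200)
test = rng.normal(loc=110.0, scale=5.0, size=200)

# Test 1: Wilcoxon rank-sum (Mann-Whitney U). A small p-value indicates
# that control runs are stochastically faster, i.e., a slowdown.
_, p_value = stats.mannwhitneyu(control, test, alternative="less")
print(f"Wilcoxon rank-sum p-value: {p_value:.4f}")

def bootstrap_ci(sample, n_boot=10_000, level=0.95):
    """Percentile-bootstrap confidence interval for the sample mean."""
    means = np.array([
        rng.choice(sample, size=len(sample), replace=True).mean()
        for _ in range(n_boot)
    ])
    return np.percentile(means, [(1 - level) / 2 * 100, (1 + level) / 2 * 100])

# Test 2: overlapping bootstrapped confidence intervals. A slowdown is
# flagged when the control interval lies entirely below the test interval.
lo_c, hi_c = bootstrap_ci(control)
lo_t, hi_t = bootstrap_ci(test)
print(f"control 95% CI: ({lo_c:.2f}, {hi_c:.2f})")
print(f"test    95% CI: ({lo_t:.2f}, {hi_t:.2f})")
print("slowdown detected:", hi_c < lo_t)

With the simulated 10% slowdown, both checks should fire: the rank-sum p-value falls well below common significance levels, and the two intervals do not overlap.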

performance regression detection

cloud

microbenchmarking

performance testing

Authors

Christoph Laaber

Joel Scheuner

Chalmers, Computer Science and Engineering (Chalmers), Software Engineering (Chalmers), Software Engineering for People, Architecture, Requirements and Traceability

Philipp Leitner

Chalmers, Computer Science and Engineering (Chalmers), Software Engineering (Chalmers), Software Engineering for People, Architecture, Requirements and Traceability

Empirical Software Engineering

1382-3256 (ISSN) 1573-7616 (eISSN)

Areas of Advance

Information and Communication Technology

Subject Categories

Software Engineering

Computer Science

DOI

10.1007/s10664-019-09681-1

More information

Latest update

7/8/2019