Software Microbenchmarking in the Cloud. How Bad is it Really?
Article in a scientific journal, 2019

Rigorous performance engineering traditionally assumes measuring on bare-metal environments to control for as many confounding factors as possible. Unfortunately, some researchers and practitioners might not have the access, knowledge, or funds to operate dedicated performance-testing hardware, making public clouds an attractive alternative. However, shared public cloud environments are inherently unpredictable in terms of the system performance they provide. In this study, we explore the effects of cloud environments on the variability of performance test results and to what extent slowdowns can still be reliably detected even in a public cloud. We focus on software microbenchmarks as an example of performance tests and execute extensive experiments on three well-known public cloud services (AWS, GCE, and Azure), using three different cloud instance types per service. We also compare the results to a hosted bare-metal offering from IBM Bluemix. In total, we gathered more than 4.5 million unique microbenchmarking data points from benchmarks written in Java and Go. We find that the variability of results differs substantially between benchmarks and instance types (with coefficients of variation ranging from 0.03% to over 100%). However, executing test and control experiments on the same instances (in randomized order) allows us to detect slowdowns of 10% or less with high confidence, using state-of-the-art statistical tests (i.e., Wilcoxon rank-sum and overlapping bootstrapped confidence intervals). Finally, our results indicate that the Wilcoxon rank-sum test manages to detect smaller slowdowns in cloud environments.

performance-regression detection

performance testing

cloud

microbenchmarking

Authors

Christoph Laaber

Universität Zürich

Joel Scheuner

Chalmers, Computer Science and Engineering, Software Engineering, Software Engineering for People, Architecture, Requirements and Traceability

Philipp Leitner

Chalmers, Computer Science and Engineering, Software Engineering, Software Engineering for People, Architecture, Requirements and Traceability

Empirical Software Engineering

1382-3256 (ISSN) 1573-7616 (eISSN)

Vol. 24, Issue 4, pp. 2469-2508

Areas of Advance

Information and Communication Technology

Subject Categories

Software Engineering

Computer Science

DOI

10.1007/s10664-019-09681-1

More information

Last updated

2019-09-02