Using mutation testing to measure behavioural test diversity
Paper in proceedings, 2020

Diversity has been proposed as a key criterion to improve testing effectiveness and efficiency. It can be used to optimise large test repositories, but also to visualise test maintenance issues and raise practitioners' awareness about waste in test artefacts and processes. Even though these diversity-based testing techniques aim to exercise diverse behaviour in the system under test (SUT), diversity has mainly been measured on and between artefacts (e.g., inputs, outputs or test scripts). Here, we introduce a family of measures to capture the behavioural diversity (b-div) of test cases by comparing their executions and failure outcomes. Using failure information to capture SUT behaviour has been shown to improve the effectiveness of history-based test prioritisation approaches. However, history-based techniques require reliable test execution logs, which are often not available or can be difficult to obtain due to flaky tests, scarcity of test executions, etc. To be generally applicable, we instead propose to use mutation testing to measure behavioural diversity by running the set of test cases on various mutated versions of the SUT. Concretely, we propose two specific b-div measures (based on accuracy and the Matthews correlation coefficient, respectively) and compare them with artefact-based diversity (a-div) for prioritising the test suites of 6 different open-source projects. Our results show that our b-div measures outperform a-div and random selection in all of the studied projects. The improvement is substantial, with an average increase in the average percentage of faults detected (APFD) of between 19% and 31%, depending on the size of the subset of prioritised tests.
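As a rough illustration of the b-div idea, the minimal Python sketch below (not the authors' implementation) compares two test cases by their pass/fail outcomes over a set of mutants: one distance is based on accuracy (the fraction of mutants on which the tests disagree) and one on the Matthews correlation coefficient, rescaled to a [0, 1] distance. Function names, the rescaling, and the handling of an undefined MCC are illustrative assumptions.

from math import sqrt
from typing import Sequence

def accuracy_distance(a: Sequence[bool], b: Sequence[bool]) -> float:
    # Fraction of mutants on which the two tests disagree (1 - accuracy).
    agree = sum(x == y for x, y in zip(a, b))
    return 1.0 - agree / len(a)

def mcc_distance(a: Sequence[bool], b: Sequence[bool]) -> float:
    # Treat one kill vector as "prediction" and the other as "truth",
    # compute the Matthews correlation coefficient from the confusion
    # matrix, and map the [-1, 1] correlation onto a [0, 1] distance.
    tp = sum(x and y for x, y in zip(a, b))
    tn = sum((not x) and (not y) for x, y in zip(a, b))
    fp = sum((not x) and y for x, y in zip(a, b))
    fn = sum(x and (not y) for x, y in zip(a, b))
    denom = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0  # assumption: MCC := 0 when undefined
    return (1.0 - mcc) / 2.0

# Hypothetical kill vectors of two tests over five mutants (True = mutant killed).
t1 = [True, False, True, True, False]
t2 = [True, True, False, True, False]
print(accuracy_distance(t1, t2))  # 0.4
print(mcc_distance(t1, t2))       # ~0.42

Given such pairwise distances, a test suite could, for example, be prioritised greedily by repeatedly picking the test that is most diverse from those already selected.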

test selection

diversity-based testing

test prioritisation

empirical study

Authors

Francisco Gomes

Göteborgs universitet

Felix Dobslaw

Chalmers, Data- och informationsteknik, Software Engineering

Robert Feldt

Chalmers, Data- och informationsteknik, Software Engineering

Proceedings - 2020 IEEE 13th International Conference on Software Testing, Verification and Validation Workshops, ICSTW 2020

254-263, article no. 9155915
9781728110752 (ISBN)

13th IEEE International Conference on Software Testing, Verification and Validation Workshops, ICSTW 2020
Porto, Portugal

Information theory for software testing (Informationsteori för programvarutestning)

Swedish Research Council (Vetenskapsrådet, VR) (2015-04913), 2016-01-01 -- 2019-12-31.

Subject categories

Geotechnical engineering

Software engineering

Probability theory and statistics

DOI

10.1109/ICSTW50294.2020.00051

More information

Last updated

2024-01-03