Learning How to Search: Generating Exception-Triggering Tests Through Adaptive Fitness Function Selection
Paper in proceedings, 2020

Search-based test generation is guided by feedback from one or more fitness functions, scoring functions that judge solution optimality. Choosing informative fitness functions is crucial to meeting the goals of a tester. Unfortunately, many goals, such as forcing the class-under-test to throw exceptions, do not have a known fitness function formulation. We propose that meeting such goals requires treating fitness function identification as a secondary optimization step. An adaptive algorithm that can vary the selection of fitness functions could adjust its selection throughout the generation process to maximize goal attainment, based on the current population of test suites. To test this hypothesis, we have implemented two reinforcement learning algorithms in the EvoSuite framework and used these algorithms to dynamically set the fitness functions used during generation. We have evaluated our framework, EvoSuiteFIT, on a set of 386 real faults. EvoSuiteFIT discovers and retains more exception-triggering inputs and produces suites that detect a variety of faults missed by the other techniques. The ability to adjust fitness functions allows EvoSuiteFIT to make strategic choices that efficiently produce more effective test suites.
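To illustrate the idea of adaptive fitness function selection described in the abstract, the following is a minimal sketch (in Java, since EvoSuite is Java-based) of a UCB1-style bandit that picks which fitness function to optimize next and is rewarded when a generation triggers new exceptions. The class and method names are illustrative assumptions, not part of EvoSuiteFIT, and the sketch does not reproduce either of the paper's two reinforcement learning algorithms.

import java.util.ArrayList;
import java.util.List;

/**
 * Illustrative sketch (not the paper's implementation): a UCB1-style bandit
 * that chooses which fitness function guides the next generation of search,
 * rewarding choices that increase the number of distinct exceptions triggered.
 * All names here are hypothetical.
 */
public class FitnessFunctionSelector {

    private final List<String> fitnessFunctions;   // e.g. "branch", "exception", "output", "method"
    private final double[] totalReward;
    private final int[] timesSelected;
    private int totalSelections = 0;

    public FitnessFunctionSelector(List<String> fitnessFunctions) {
        this.fitnessFunctions = new ArrayList<>(fitnessFunctions);
        this.totalReward = new double[fitnessFunctions.size()];
        this.timesSelected = new int[fitnessFunctions.size()];
    }

    /** Choose the index of the next fitness function using the UCB1 rule. */
    public int select() {
        // Try every fitness function once before applying the UCB formula.
        for (int i = 0; i < fitnessFunctions.size(); i++) {
            if (timesSelected[i] == 0) {
                return i;
            }
        }
        int best = 0;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (int i = 0; i < fitnessFunctions.size(); i++) {
            double mean = totalReward[i] / timesSelected[i];
            double bonus = Math.sqrt(2.0 * Math.log(totalSelections) / timesSelected[i]);
            if (mean + bonus > bestScore) {
                bestScore = mean + bonus;
                best = i;
            }
        }
        return best;
    }

    /**
     * Feed back the observed improvement, e.g. the increase in distinct
     * exceptions triggered after one generation guided by the chosen function.
     */
    public void update(int chosen, double reward) {
        totalReward[chosen] += reward;
        timesSelected[chosen]++;
        totalSelections++;
    }
}

In this sketch, a higher reward for a fitness function makes it more likely to be selected again, while the exploration bonus ensures that rarely tried functions are periodically revisited as the population of test suites evolves.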

Reinforcement Learning

Search-Based Software Engineering

Automated Test Generation

Authors

Hussein Almulla

University of South Carolina

Gregory Gay

University of Gothenburg

Proceedings - 2020 IEEE 13th International Conference on Software Testing, Verification and Validation, ICST 2020

Pages 63-73, Article no. 9159064
ISBN: 9781728157771

2020 IEEE 13th International Conference on Software Testing, Verification and Validation (ICST)
Porto, Portugal

Subject Categories

Software Engineering

Computer Science

DOI

10.1109/ICST46399.2020.00017
