Mastering Uncertainty in Performance Estimations of Configurable Software Systems

Bibliographic Details
Published in: 2020 35th IEEE/ACM International Conference on Automated Software Engineering (ASE), pp. 684-696
Main Authors: Dorn, Johannes; Apel, Sven; Siegmund, Norbert
Format: Conference Proceeding
Language: English
Published: ACM, 01.09.2020
ISSN: 2643-1572
DOI: 10.1145/3324884.3416620

Summary: Understanding the influence of configuration options on performance is key for finding optimal system configurations, system understanding, and performance debugging. In prior research, a number of performance-influence modeling approaches have been proposed, which model a configuration option's influence and a configuration's performance as a scalar value. However, these point estimates falsely imply certainty regarding an option's influence and neglect several sources of uncertainty within the assessment process, such as (1) measurement bias, (2) model representation and learning process, and (3) incomplete data. As a result, different approaches, and even different learning runs, assign different scalar performance values to options and to interactions among them, although the true influence remains uncertain. There is no way to quantify this uncertainty with state-of-the-art performance modeling approaches. We propose a novel approach, P4, based on probabilistic programming, that explicitly models uncertainty for option influences and consequently provides a confidence interval for each prediction of a configuration's performance alongside a scalar value. This way, we can explain, for the first time, why predictions may cause errors and which options' influences may be unreliable. An evaluation on 12 real-world subject systems shows that P4's accuracy is in line with the state of the art while additionally providing reliable confidence intervals.
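
The idea described in the summary can be illustrated with a small, self-contained sketch. The Python code below is not the P4 implementation; it uses a conjugate Bayesian linear regression over binary configuration options (the option names, measurements, prior scale, and noise level are hypothetical) to show how treating each option's influence as a distribution rather than a scalar yields a confidence interval alongside each scalar prediction.

# Minimal illustrative sketch (not the P4 implementation): a Bayesian linear
# performance-influence model over binary configuration options. Each option's
# influence is a coefficient with a posterior distribution instead of a single
# scalar, so every performance prediction comes with a confidence interval.
# Option names, measurements, prior scale, and noise level are assumptions
# made for illustration only.
import numpy as np

# Measured configurations: rows are configurations, columns are binary options
# (e.g. [compression, encryption, cache]); y holds measured performance (s).
X = np.array([[0, 0, 0],
              [1, 0, 0],
              [0, 1, 0],
              [1, 1, 0],
              [0, 0, 1],
              [1, 0, 1]], dtype=float)
X = np.hstack([np.ones((X.shape[0], 1)), X])        # intercept = base performance
y = np.array([10.1, 14.8, 11.9, 16.7, 9.2, 13.9])   # hypothetical measurements

sigma = 0.5   # assumed measurement-noise standard deviation
tau = 10.0    # standard deviation of the Gaussian prior on each influence

# Conjugate Bayesian linear regression: posterior over option influences.
precision = X.T @ X / sigma**2 + np.eye(X.shape[1]) / tau**2
cov_post = np.linalg.inv(precision)                  # posterior covariance
mean_post = cov_post @ X.T @ y / sigma**2            # posterior mean per option

# Predict an unmeasured configuration: scalar estimate plus a 95% interval.
x_new = np.array([1.0, 1.0, 1.0, 1.0])               # all options enabled
pred_mean = x_new @ mean_post
pred_std = np.sqrt(x_new @ cov_post @ x_new + sigma**2)
print(f"predicted performance: {pred_mean:.2f} "
      f"(95% CI: {pred_mean - 1.96*pred_std:.2f} .. {pred_mean + 1.96*pred_std:.2f})")

# Per-option uncertainty: a wide posterior std. dev. flags unreliable influences.
for name, m, s in zip(["base", "compression", "encryption", "cache"],
                      mean_post, np.sqrt(np.diag(cov_post))):
    print(f"{name:12s} influence {m:6.2f} +/- {s:.2f}")

In the same spirit as the summary, an option whose posterior standard deviation stays wide given the available measurements signals an influence that cannot be pinned down reliably, which is the kind of uncertainty the paper makes explicit.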