Software doping analysis for human oversight


Bibliographic Details
Published in: Formal Methods in System Design, Vol. 66, No. 1, pp. 49–98
Main Authors: Biewer, Sebastian; Baum, Kevin; Sterz, Sarah; Hermanns, Holger; Hetmank, Sven; Langer, Markus; Lauber-Rönsberg, Anne; Lehr, Franz
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.05.2025
ISSN: 0925-9856
EISSN: 1572-8102
DOI: 10.1007/s10703-024-00445-2

Summary: This article introduces a framework intended to assist in mitigating societal risks that software can pose. Concretely, this encompasses facets of software doping as well as unfairness and discrimination in high-risk decision-making systems. The term software doping refers to software that contains surreptitiously added functionality that is against the interest of the user. A prominent example of software doping is the tampered emission cleaning systems that were found in millions of cars around the world when the diesel emissions scandal surfaced. The first part of this article combines the formal foundations of software doping analysis with established probabilistic falsification techniques to arrive at a black-box analysis technique for identifying undesired effects of software. We apply this technique to emission cleaning systems in diesel cars, but also to high-risk systems that evaluate humans in a possibly unfair or discriminatory way. We demonstrate how our approach can assist humans-in-the-loop in making better informed and more responsible decisions. This is to promote effective human oversight, which will be a central requirement enforced by the European Union's upcoming AI Act. We complement our technical contribution with a juridically, philosophically, and psychologically informed perspective on the potential problems caused by such systems.