Stable and actionable explanations of black-box models through factual and counterfactual rules

Bibliographic Details
Published in: Data Mining and Knowledge Discovery, Vol. 38, no. 5, pp. 2825-2862
Main Authors: Guidotti, Riccardo; Monreale, Anna; Ruggieri, Salvatore; Naretto, Francesca; Turini, Franco; Pedreschi, Dino; Giannotti, Fosca
Format: Journal Article
Language: English
Published: New York: Springer US, 01.09.2024 (Springer Nature B.V.)
ISSN: 1384-5810
EISSN: 1573-756X
DOI: 10.1007/s10618-022-00878-5


More Information
Summary: Recent years have witnessed the rise of accurate but obscure classification models that hide the logic of their internal decision processes. Explaining the decision taken by a black-box classifier on a specific input instance is therefore of striking interest. We propose a local rule-based model-agnostic explanation method providing stable and actionable explanations. An explanation consists of a factual logic rule, stating the reasons for the black-box decision, and a set of actionable counterfactual logic rules, proactively suggesting the changes in the instance that lead to a different outcome. Explanations are computed from a decision tree that mimics the behavior of the black box locally to the instance to explain. The decision tree is obtained through a bagging-like approach that favors stability and fidelity: first, an ensemble of decision trees is learned from neighborhoods of the instance under investigation; then, the ensemble is merged into a single decision tree. Neighbor instances are synthetically generated through a genetic algorithm whose fitness function is driven by the black-box behavior. Experiments show that the proposed method advances the state of the art towards a comprehensive approach that successfully covers stability and actionability of factual and counterfactual explanations.
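To make the pipeline in the summary concrete, the following is a minimal, simplified sketch of the idea: generate a synthetic neighborhood around the instance, fit a surrogate decision tree on the black box's labels, read the factual rule off the instance's root-to-leaf path, and take the nearest differently-labeled neighbor as a counterfactual. It deliberately simplifies the paper's method: random Gaussian perturbations stand in for the genetic neighborhood generation, and a single tree stands in for the merged bagged ensemble; the `black_box` function is a hypothetical stand-in for any opaque model.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical black box: class 1 iff x0 + x1 > 1 (stands in for any opaque model).
def black_box(X):
    return (X[:, 0] + X[:, 1] > 1.0).astype(int)

rng = np.random.default_rng(0)
x = np.array([0.8, 0.4])  # instance to explain

# 1. Synthetic neighborhood: Gaussian perturbations around x, labeled by the
#    black box (the paper uses a genetic algorithm driven by the black-box behavior).
Z = x + rng.normal(scale=0.3, size=(500, 2))
y = black_box(Z)

# 2. Local surrogate: one shallow decision tree (the paper merges a bagged
#    ensemble of trees into a single tree for stability and fidelity).
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(Z, y)

# 3. Factual rule: the split conditions along x's root-to-leaf path.
def path_rule(tree, x):
    t = tree.tree_
    node, conds = 0, []
    while t.children_left[node] != -1:  # descend until a leaf is reached
        f, thr = t.feature[node], t.threshold[node]
        if x[f] <= thr:
            conds.append(f"x{f} <= {thr:.2f}")
            node = t.children_left[node]
        else:
            conds.append(f"x{f} > {thr:.2f}")
            node = t.children_right[node]
    return conds

factual = path_rule(tree, x)
pred = tree.predict(x.reshape(1, -1))[0]

# 4. Counterfactual: nearest synthetic neighbor the surrogate labels differently;
#    its feature deltas suggest actionable changes that flip the outcome.
other = Z[tree.predict(Z) != pred]
cf = other[np.argmin(np.linalg.norm(other - x, axis=1))]

print("factual:", " AND ".join(factual), "->", pred)
print("counterfactual change:", np.round(cf - x, 2))
```

The factual rule is the conjunction of tests satisfied by the instance in the surrogate tree; the counterfactual delta shows how little the instance would need to move to receive the other label according to the surrogate.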