Science and technology for the eXplanation of AI decision making


Black box AI systems for automated decision making, often based on machine learning over (big) data, map a user's features into a class or a score without exposing the reasons why. This is problematic not only for the lack of transparency, but also because the algorithms may inherit biases from human prejudices and collection artifacts hidden in the training data, leading to unfair or wrong decisions.
The project, funded by an ERC Advanced Grant awarded to Fosca Giannotti, addresses the urgent open challenge of constructing meaningful explanations of opaque AI/ML systems. It introduces the local-to-global framework for black box explanation, articulated along three lines: (i) a language for expressing explanations in terms of expressive logic rules, with statistical and causal interpretation; (ii) the inference of local explanations that reveal the decision rationale for a specific case, by auditing the black box in the vicinity of the target instance; (iii) the bottom-up generalization of many local explanations into simple global ones, with algorithms that optimize for quality and comprehensibility. An intertwined line of research investigates (i) causal explanations, i.e., models that capture the causal relationships among the (endogenous and exogenous) variables and the decision, and (ii) mechanistic/physical models that capture the detailed data-generation behavior behind specific deep learning models, using the tools of the statistical physics of complex systems. The project will also develop: (1) an explanation infrastructure for benchmarking the methods developed within and outside the project, equipped with platforms for users' assessment of explanations and for the crowdsensing of observational decision data; (2) an ethical-legal framework, addressing both the compliance of the developed methods with current legal standards and their impact on the "right to explanation" provisions of the GDPR; and (3) a repertoire of case studies in explanation-by-design, with priority given to health and fraud-detection applications.
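To make line (ii) concrete, the sketch below shows one common way such a local audit can work (in the spirit of local surrogate methods): sample synthetic neighbours of the target instance, label them with the black box, and fit a small interpretable model whose branches read as logic rules. The black box here, the feature names, and the function names (`black_box_predict`, `explain_locally`) are illustrative assumptions, not the project's actual algorithms; a depth-2 decision tree stands in for the rule-based surrogates the project envisions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

def black_box_predict(X):
    """Hypothetical opaque scorer (stand-in for any black box):
    approves (1) when income - 0.5 * debt exceeds 50."""
    return (X[:, 0] - 0.5 * X[:, 1] > 50).astype(int)

def explain_locally(instance, predict_fn, n_samples=1000, scale=10.0, seed=0):
    """Audit the black box in the vicinity of `instance`:
    sample neighbours, label them with the black box, fit a shallow
    interpretable surrogate, and report its fidelity (agreement with
    the black box on the sampled neighbourhood)."""
    rng = np.random.default_rng(seed)
    neighbours = instance + rng.normal(0.0, scale, size=(n_samples, instance.size))
    labels = predict_fn(neighbours)
    surrogate = DecisionTreeClassifier(max_depth=2, random_state=0)
    surrogate.fit(neighbours, labels)
    fidelity = float((surrogate.predict(neighbours) == labels).mean())
    return surrogate, fidelity

# Explain the decision for one individual: income = 60, debt = 30.
x = np.array([60.0, 30.0])
surrogate, fidelity = explain_locally(x, black_box_predict)
print(f"local fidelity: {fidelity:.2f}")
print(export_text(surrogate, feature_names=["income", "debt"]))
```

The printed tree is the local explanation: each root-to-leaf path is a readable rule about `income` and `debt`, and the fidelity score says how faithfully those rules mimic the black box near the audited case. The global step (iii) would then merge many such local rule sets into a compact global model.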

The ERC Advanced Grant XAI, "Science & technology for the eXplanation of AI decision making", led by Fosca Giannotti of the Italian CNR, in collaboration with the PhD programme in "Data Science" at the Scuola Normale Superiore in Pisa, invites applications.

Start Date: 30 September 2019
End Date: 30 September 2024
Department of Computer Science, University of Pisa (DI-UNIPI)
Istituto di Scienza e Tecnologie dell’Informazione, National Research Council of Italy (ISTI-CNR)