Explainable AI in the financial sector

The way AI is used in finance is becoming increasingly complex, often obscuring how a decision is reached. However, the parties with a stake in financial services, such as customers and regulators, either require or have a right to an explanation, for instance of how personal data is used in lending. This raises the question of what kind of explanation of an AI process each stakeholder requires, depending on the situation.

Objective

The goal of the project is to set up a series of studies in cooperation with organizations in the financial sector, focusing first on lending. These studies will map which kind of explanation each kind of stakeholder requires, and how that explanation can be formulated from the underlying algorithm. Potential cooperation partners are SMEs that provide loans or credit, and regulators in the financial sector.

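To make concrete what an explanation "formulated from the underlying algorithm" could look like for a single loan applicant, the sketch below derives per-feature contributions from a simple scoring model. It is only an illustration under assumed conditions: the feature names, the training data, and the choice of a scikit-learn logistic regression are hypothetical and not part of the project description.

```python
# Minimal sketch: deriving a per-applicant explanation from a simple,
# hypothetical credit-scoring model. All features and data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

feature_names = ["income", "debt_ratio", "years_employed", "late_payments"]

# Hypothetical past applicants and their outcomes (1 = repaid, 0 = defaulted).
X = np.array([
    [52_000, 0.20,  6, 0],
    [31_000, 0.55,  1, 3],
    [78_000, 0.10, 12, 0],
    [24_000, 0.65,  0, 4],
    [45_000, 0.35,  4, 1],
    [60_000, 0.25,  8, 0],
], dtype=float)
y = np.array([1, 0, 1, 0, 1, 1])

scaler = StandardScaler().fit(X)
model = LogisticRegression().fit(scaler.transform(X), y)

# New applicant whose decision should be explained.
applicant = np.array([[38_000, 0.45, 2, 2]], dtype=float)
applicant_scaled = scaler.transform(applicant)
decision = "approve" if model.predict(applicant_scaled)[0] == 1 else "decline"

# For a linear model, coefficient * standardized feature value is that
# feature's contribution to the log-odds relative to an average applicant;
# the largest magnitudes indicate which inputs drove the decision most.
contributions = model.coef_[0] * applicant_scaled[0]
ranking = sorted(zip(feature_names, contributions),
                 key=lambda pair: abs(pair[1]), reverse=True)

print(f"Decision: {decision}")
for name, value in ranking:
    print(f"{name:>15}: {value:+.2f}")
```

For more complex models, model-agnostic techniques such as SHAP or LIME could play the same role; which form of explanation is appropriate for which stakeholder is exactly the mapping the studies aim to chart.
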
Results

  • An accurate definition of explainable AI in the context of the financial sector.
  • A framework for explainable AI in the financial sector, organized by type of stakeholder and the requirements placed on the explanation.
  • Principles and guidelines for developing AI applications in the financial sector that take the requirements of explainability into account.

Duration

01 June 2020 - 30 June 2022

Approach

HU researchers involved in the research

Related research groups