This repository contains a modular, model-agnostic framework for explanation generation. This introduction walks you through the essential components of the library, presents examples of how to extend it, and finally guides you through generating a first explanation for your ML model's predictions with the help of this library.
Developers are encouraged to consult the Python documentation for more detailed information on the API.
Currently implemented explanation methods:
- CEM: “Explanations based on the Missing: Towards Contrastive Explanations with Pertinent Negatives”
- LIME: “"Why Should I Trust You?": Explaining the Predictions of Any Classifier”
- Model-Agnostic Explanation Library (MAX Lib)
- This open source project is funded by the German Federal Ministry for Economic Affairs and Climate Action as part of the EMPAIA project under Grant Agreement No. 01MK20002C.
Prerequisites
Installation requires the Poetry package manager and Python 3.8.
Package Installation
Source your desired Python environment and install MAXi via:
- Pip
pip install git+https://github.com/dailab/MAXi-XAI-lib.git
- Poetry
poetry add git+https://github.com/dailab/MAXi-XAI-lib.git
Alternatively, if you intend to modify MAXi-lib locally, the following steps are necessary:
- Navigate to the desired destination directory.
- Clone the repository:
git clone https://github.com/dailab/MAXi-XAI-lib.git
- Now, either install the package in an exclusive python environment:
cd MAXi-XAI-lib
poetry install
- Or install the package within another Poetry environment:
cd *LOCATION_OF_YOUR_PACKAGE*
poetry add --editable *PATH_TO_MAX_LIB*
If you use Poetry as the package manager within your project, it is recommended to constrain the Python version to:
python = ">=3.8,<3.12"
Otherwise, Poetry may run into dependency resolution issues.
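In a Poetry project, this constraint belongs in the `[tool.poetry.dependencies]` section of your `pyproject.toml`. A minimal sketch (package name and version are illustrative placeholders):

```toml
[tool.poetry.dependencies]
# Constrain Python as recommended above to avoid resolution conflicts
python = ">=3.8,<3.12"
maxi = { git = "https://github.com/dailab/MAXi-XAI-lib.git" }
```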
This section covers the main components of this library. Each component plays an integral part in explanation generation. An overview of the components is shown below:
For every entity, there exists a base class that must be inherited from. This simplifies the implementation of custom components, as the API is described in detail in those abstract classes. Furthermore, Python enforces the interface automatically.
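The pattern can be sketched with Python's `abc` module. The class and method names below are illustrative, not MAXi's actual API; consult the Python documentation for the real base classes.

```python
from abc import ABC, abstractmethod

import numpy as np

# Hypothetical base class illustrating the inheritance pattern; MAXi's
# abstract classes may define different names and additional methods.
class BaseLossFunction(ABC):
    @abstractmethod
    def __call__(self, x: np.ndarray) -> float:
        """Return the cost of a candidate explanation x."""

# Python enforces the interface: a subclass that does not implement
# __call__ cannot be instantiated (TypeError is raised).
class L2Loss(BaseLossFunction):
    def __init__(self, target: np.ndarray):
        self.target = target

    def __call__(self, x: np.ndarray) -> float:
        # Squared Euclidean distance to the target as a simple cost
        return float(np.sum((x - self.target) ** 2))
```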
- Modules in this subpackage implement the target function of the optimization, since we formulate explanation generation as an optimization problem
- Measurement of the cost
- Formalizes the model-agnostic explanation method
- Incorporates external inference model - is called for its calculations
Currently implemented: Click here
- Inference methods need to return an array of classification probabilities so that the loss function can incorporate the prediction into its calculations
- If that is not the case, consider implementing a custom Inference Quantizer
- This subpackage consists of modules that help you transform an arbitrary prediction result into the required format
- You can disregard this package when no pre-inference processing is required and the prediction is already in a method-conforming format
- The BaseQuantizer translates the produced prediction into an explanation-method-compatible format
- Here, the inference result must follow the typical classification format: each entry of the array indicates the likelihood of a class being present in the image
- Input could be any type of model prediction e.g. a segmentation mask
- A valid output would be e.g.
array([0.1, 2.3, -4.1])
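As an illustration, a custom quantizer for segmentation masks could score each class by its pixel share. This is only a sketch under the assumption that a quantizer is a callable mapping a raw prediction to a 1-D score array; the actual BaseQuantizer interface may differ.

```python
import numpy as np

# Hypothetical quantizer: turn a (H, W) mask of integer class IDs into a
# 1-D array of per-class scores, as expected by the loss function.
class SegmentationQuantizer:
    def __init__(self, num_classes: int):
        self.num_classes = num_classes

    def __call__(self, mask: np.ndarray) -> np.ndarray:
        # Score each class by the fraction of pixels assigned to it
        counts = np.bincount(mask.ravel(), minlength=self.num_classes)
        return counts / mask.size
```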
- This class lets you implement a preprocessing (pre-inference) procedure
- Furthermore, it stores the external inference method and can subsequently apply the Inference Quantizer
- Given an image, the procedures are called in the following order: preprocessing -> inference -> quantizer
- Each operation, again, is optional. When left out, the identity function is used.
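The order of operations can be sketched as a simple composition of callables, with the identity function standing in for any omitted stage. The class and parameter names here are illustrative, not MAXi's actual API.

```python
# Minimal sketch of the preprocessing -> inference -> quantizer pipeline,
# assuming each stage is a plain callable (names are hypothetical).
def identity(x):
    return x

class InferencePipeline:
    def __init__(self, inference, preprocess=None, quantizer=None):
        # Omitted stages default to the identity function
        self.preprocess = preprocess or identity
        self.inference = inference
        self.quantizer = quantizer or identity

    def __call__(self, image):
        # Stages applied in the documented order
        return self.quantizer(self.inference(self.preprocess(image)))
```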
This package contains the necessary engines to solve the optimization problems, i.e., the optimizer algorithms and gradient calculation methods. As with the loss functions, we provide base classes for both the optimizers and the gradient methods to make extending the library easy.
Currently implemented:
- Adaptive Exponentiated Gradient
- Adaptive Optimistic Exponentiated Gradient
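For intuition, the core exponentiated-gradient update keeps the iterate on the probability simplex via a multiplicative step followed by re-normalization. The library's adaptive (and optimistic) variants add step-size adaptation on top; this sketch shows only the basic update, not MAXi's implementation.

```python
import numpy as np

# One exponentiated-gradient step on the probability simplex:
# multiplicative update, then re-normalization.
def eg_step(w: np.ndarray, grad: np.ndarray, eta: float) -> np.ndarray:
    w_new = w * np.exp(-eta * grad)
    return w_new / w_new.sum()
```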
Currently implemented:
- Gradient Estimation
- TensorFlow-based Gradient Calculation
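Gradient estimation is useful when the model exposes no analytic gradients. One common zeroth-order scheme is central finite differences, sketched below; MAXi's estimator may use a different scheme, so treat this as illustration only.

```python
import numpy as np

# Zeroth-order gradient estimate of f at x via central finite differences.
def estimate_gradient(f, x: np.ndarray, h: float = 1e-5) -> np.ndarray:
    grad = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e.flat[i] = h
        # Central difference: (f(x + h*e_i) - f(x - h*e_i)) / (2h)
        grad.flat[i] = (f(x + e) - f(x - e)) / (2 * h)
    return grad
```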
The Explanation Generator interconnects all of the above components, enabling the optimization procedure with respect to the defined explanation model.