XPER (eXplainable PERformance) is a methodology designed to measure the specific contribution of the input features to the predictive performance of any econometric or machine learning model. XPER builds on Shapley values and interpretability tools developed in machine learning, but with the distinct objective of explaining model performance (e.g., AUC) rather than model predictions. XPER includes the standard explainability method in machine learning, SHAP, as a special case.
The library has been tested on Linux, macOS, and Windows. It relies on the following Python modules:
Pandas
Numpy
Scipy
Scikit-learn
XPER can be installed from TestPyPI:
pip install -i https://test.pypi.org/simple/ XPER==0.0.4
After a correct installation, you should be able to import the module without errors:
import XPER
import XPER from XPER.datasets.sample import sample_generation X_train, y_train, X_test, y_test, p, N, seed = sample_generation(N=500,p=6,seed=123456)
from XPER.datasets.load_data import boston

df = boston()
df.head(3)
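Before a serialized model can be loaded as in the next step, one has to be trained and saved. A minimal sketch of that workflow, using scikit-learn's GradientBoostingClassifier as a stand-in for the XGBoost model referenced below, with synthetic data so the snippet is self-contained (the file name is hypothetical):

```python
# Sketch: train and persist a model with joblib so it can be reloaded later.
# GradientBoostingClassifier is a stand-in for XGBoost; data is synthetic.
import joblib
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
joblib.dump(model, "model.joblib")          # hypothetical file name
print("Test accuracy:", model.score(X_test, y_test))
```

The same `joblib.load` call shown below then restores the fitted model from disk.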
import joblib

model = joblib.load('xgboost_model.joblib')
result = model.score(X_test, y_test)
print("Model performance: ", result)
from XPER.models.Performance import evaluate_model_performance

Eval_Metric = ["Precision"]
PM = evaluate_model_performance(Eval_Metric, X_train, y_train, X_test, y_test, model)
print("Performance Metrics: ", PM)
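As a sanity check, the "Precision" metric can also be computed directly with scikit-learn, which the library's result should match (illustrative toy labels, not the library's code):

```python
# Cross-check sketch: precision = TP / (TP + FP), computed with scikit-learn.
import numpy as np
from sklearn.metrics import precision_score

y_true = np.array([1, 0, 1, 1, 0, 1])
y_pred = np.array([1, 0, 0, 1, 1, 1])
# 4 positive predictions, of which 3 are correct -> precision 0.75
print("Precision:", precision_score(y_true, y_pred))  # → 0.75
```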
from XPER.models.Performance import calculate_XPER_values

CFP = None
CFN = None
result = calculate_XPER_values(X_test, y_test, model, Eval_Metric, CFP, CFN, PM)
print("Efficiency bench XPER: ", result[-1])
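To make the underlying idea concrete, the following is an illustrative brute-force Shapley decomposition of a performance metric (here AUC). This mirrors the principle behind XPER but is NOT the library's implementation: absent features are crudely neutralized by fixing them at their mean, and by the Shapley efficiency property the contributions sum to the AUC of the full model minus the benchmark AUC:

```python
# Illustrative brute-force Shapley decomposition of AUC (not XPER's own code).
from itertools import combinations
from math import factorial
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 3))
y = (X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=400) > 0).astype(int)
model = LogisticRegression().fit(X, y)

def perf(coalition):
    """AUC when features outside `coalition` are neutralized (fixed at mean)."""
    Xc = X.copy()
    for j in range(X.shape[1]):
        if j not in coalition:
            Xc[:, j] = X[:, j].mean()   # crude 'feature absent' baseline
    return roc_auc_score(y, model.predict_proba(Xc)[:, 1])

p = X.shape[1]
phi = np.zeros(p)
for j in range(p):
    others = [k for k in range(p) if k != j]
    for size in range(p):
        for S in combinations(others, size):
            # Shapley weight |S|! (p - |S| - 1)! / p!
            w = factorial(len(S)) * factorial(p - len(S) - 1) / factorial(p)
            phi[j] += w * (perf(S + (j,)) - perf(S))

print("benchmark AUC:", perf(()))
print("XPER-style contributions:", phi)
print("efficiency check:", perf(tuple(range(p))) - perf(()))
```

With no features, predictions are constant and the benchmark AUC is 0.5; the contributions `phi` sum exactly to the gap between the full-model AUC and that benchmark. XPER computes this kind of decomposition efficiently rather than by exhaustive enumeration.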
The contributors to this library are: