Metadata-Version: 2.4
Name: treeshap
Version: 0.1.1
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: MIT License
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Rust
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Scientific/Engineering
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Dist: numpy>=1.20
Summary: Exact TreeSHAP in Rust — fast Shapley values for XGBoost, LightGBM, and ONNX tree ensembles
License: MIT OR Apache-2.0
Requires-Python: >=3.8
Description-Content-Type: text/markdown; charset=UTF-8; variant=GFM
Project-URL: Changelog, https://github.com/saswatsusmoy/treeshap-rs/releases
Project-URL: Documentation, https://github.com/saswatsusmoy/treeshap-rs/blob/main/docs/architecture.md
Project-URL: Issues, https://github.com/saswatsusmoy/treeshap-rs/issues
Project-URL: Repository, https://github.com/saswatsusmoy/treeshap-rs

# treeshap

Exact TreeSHAP in Rust with Python bindings. Computes Shapley values for XGBoost, LightGBM, and ONNX tree ensemble models.

## Installation

```bash
pip install treeshap
```

Pre-built wheels are available for Linux (x86_64, aarch64), macOS (x86_64, Apple Silicon), and Windows (x86_64). Python 3.8+ required.

## Usage

```python
import numpy as np
from treeshap import TreeEnsemble, ShapExplainer

# Load a model
model = TreeEnsemble.from_file("model.json", "xgboost")

# Explain predictions
X = np.random.rand(100, 10)  # placeholder data; columns must match the model's feature count
explainer = ShapExplainer(model)
explanation = explainer.explain(X)  # X: float array, shape (n_samples, n_features)

# Access results
print(explanation.shap_values)   # numpy array (n_samples, n_features)
print(explanation.base_value)    # float or list[float]
print(explanation.predictions)   # numpy array (n_samples,)

# Verify local accuracy
report = explanation.verify()
assert report.is_pass

# Generate plots (returns SVG bytes)
svg = explanation.plot_waterfall(sample_index=0)
svg = explanation.plot_beeswarm()
svg = explanation.plot_importance()
```
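The `verify()` check above corresponds to SHAP's local accuracy property: for each sample, the feature attributions plus the base value reconstruct the model's prediction. A minimal numpy sketch of that identity, using made-up numbers rather than the library's output:

```python
import numpy as np

# Hypothetical explanation outputs for 3 samples and 2 features
# (illustration only; real values come from explainer.explain).
shap_values = np.array([[ 0.5, -0.2],
                        [ 0.1,  0.3],
                        [-0.4,  0.0]])
base_value = 1.0
predictions = np.array([1.3, 1.4, 0.6])

# Local accuracy: attributions + base value == prediction, per sample.
assert np.allclose(shap_values.sum(axis=1) + base_value, predictions)
```

This is the per-sample identity the verification report asserts; a failing check would indicate a model-parsing or traversal bug rather than a numerical tolerance issue.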

## Supported Formats

| Format | Loader | Example |
|:---|:---|:---|
| XGBoost JSON | `TreeEnsemble.from_file(path, "xgboost")` | `booster.save_model("model.json")` |
| LightGBM text | `TreeEnsemble.from_file(path, "lightgbm")` | `booster.save_model("model.txt")` |
| ONNX | `TreeEnsemble.from_file(path, "onnx")` | TreeEnsembleRegressor/Classifier |
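For orientation, XGBoost's JSON dump stores each tree as parallel arrays indexed by node id (child indices, split features, thresholds). An abridged fragment of what `booster.save_model("model.json")` produces — field names follow XGBoost's schema, heavily trimmed here; `-1` children mark leaves, whose values sit in the corresponding `split_conditions` slots:

```json
{
  "learner": {
    "gradient_booster": {
      "model": {
        "trees": [
          {
            "left_children":    [1, -1, -1],
            "right_children":   [2, -1, -1],
            "split_indices":    [0, 0, 0],
            "split_conditions": [0.5, 0.1, -0.3]
          }
        ]
      }
    }
  }
}
```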

## Performance

Single-sample latency is sub-millisecond. A 100-tree ensemble at depth 6 explains 10,000 samples in 2.8 seconds on an Apple M3. The GIL is released during computation, so multi-threaded callers get full multi-core utilization.
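Because the GIL is released inside the Rust core, plain Python threads can explain batches in parallel. A sketch of that call pattern — `explain_batch` is a stand-in for `explainer.explain(batch).shap_values`, not the library's API:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def explain_batch(batch):
    # Stand-in for explainer.explain(batch).shap_values; the real call
    # releases the GIL, so these threads genuinely run on multiple cores.
    return np.zeros_like(batch)

X = np.random.rand(10_000, 10)          # placeholder feature matrix
chunks = np.array_split(X, 8)           # one chunk per worker thread
with ThreadPoolExecutor(max_workers=8) as pool:
    parts = list(pool.map(explain_batch, chunks))
shap_values = np.vstack(parts)          # reassemble in original row order
assert shap_values.shape == X.shape
```

With a pure-Python workload this pattern would serialize on the GIL; it pays off only because the heavy loop runs in Rust with the GIL dropped.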

## License

MIT OR Apache-2.0

