Metadata-Version: 2.4
Name: spark-advisor-models
Version: 0.1.16
Summary: Shared Pydantic contracts for spark-advisor microservices
Project-URL: Homepage, https://github.com/pstysz/spark-advisor
Project-URL: Repository, https://github.com/pstysz/spark-advisor
Project-URL: Issues, https://github.com/pstysz/spark-advisor/issues
Author: Pawel Stysz
License-Expression: Apache-2.0
Keywords: apache-spark,metrics,models,pydantic,spark
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Typing :: Typed
Requires-Python: >=3.12
Requires-Dist: opentelemetry-api>=1.25.0
Requires-Dist: opentelemetry-exporter-otlp-proto-grpc>=1.25.0
Requires-Dist: opentelemetry-sdk>=1.25.0
Requires-Dist: orjson>=3.10
Requires-Dist: pydantic-settings[yaml]>=2.0
Requires-Dist: pydantic>=2.10
Requires-Dist: structlog>=24.0
Description-Content-Type: text/markdown

# spark-advisor-models

Shared Pydantic models and configuration for the [spark-advisor](https://github.com/pstysz/spark-advisor) ecosystem.

## Install

```bash
pip install spark-advisor-models
```

## What's inside

- **Spark job metrics** — stages, executors, tasks, quantile distributions (`model/metrics.py`)
- **Analysis output** — recommendations, severity levels, causal chains (`model/output.py`)
- **AI tool schemas** — input/output models for the Claude API, generated from Pydantic (`model/input.py`)
- **Spark config wrapper** — typed access to `spark.*` properties (`model/spark_config.py`)
- **Shared configuration** — rule thresholds, AI settings, NATS settings (`config.py`, `settings.py`)
- **Structured logging** — `configure_logging()`, `bind_nats_context()`, `nats_handler_context()` (`logging.py`)
- **Distributed tracing** — OpenTelemetry W3C Traceparent propagation via NATS headers (`tracing.py`)
- **Utilities** — byte formatting, statistics helpers (`util/`)

## Usage

Import the shared contracts directly from the package:

```python
from spark_advisor_models.model.metrics import JobAnalysis, StageMetrics
from spark_advisor_models.config import Thresholds, AiSettings
```
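Since the models are Pydantic v2 classes, they can be validated straight from JSON and serialized back. A minimal sketch of that round-trip is below; `StageSketch` and its fields are hypothetical stand-ins (the real `StageMetrics` schema is defined in `model/metrics.py` and may differ):

```python
from pydantic import BaseModel, ValidationError


class StageSketch(BaseModel):
    """Illustrative stand-in for a shared contract like StageMetrics."""
    stage_id: int
    num_tasks: int
    shuffle_read_bytes: int = 0


# Parse and validate a raw JSON payload (e.g. off a NATS message).
raw = '{"stage_id": 3, "num_tasks": 200, "shuffle_read_bytes": 1048576}'
stage = StageSketch.model_validate_json(raw)
print(stage.stage_id)  # 3

# Serialize back to JSON for the next service in the pipeline.
payload = stage.model_dump_json()

# Invalid payloads raise ValidationError instead of propagating bad data.
try:
    StageSketch.model_validate_json('{"stage_id": "not-an-int"}')
except ValidationError:
    pass
```

The same `model_validate_json()` / `model_dump_json()` pattern applies to the real contracts once imported.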

## Links

- [Main project](https://github.com/pstysz/spark-advisor)
- [Contributing](https://github.com/pstysz/spark-advisor/blob/main/CONTRIBUTING.md)

## License

Apache 2.0
