Metadata-Version: 2.4
Name: respan-instrumentation-pydantic-ai
Version: 0.1.1
Summary: Respan instrumentation plugin for PydanticAI
License: Apache 2.0
Author: Respan
Author-email: team@respan.ai
Requires-Python: >=3.11,<3.14
Classifier: License :: Other/Proprietary License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Requires-Dist: opentelemetry-semantic-conventions-ai (>=0.4.1)
Requires-Dist: pydantic-ai (>=0.2.0)
Requires-Dist: respan-sdk (>=2.5.0)
Requires-Dist: respan-tracing (>=2.16.1,<3.0.0)
Description-Content-Type: text/markdown

# respan-instrumentation-pydantic-ai

Respan instrumentation plugin for [PydanticAI](https://ai.pydantic.dev/).

This package enables PydanticAI's native OpenTelemetry emission and maps the
resulting PydanticAI attributes directly into the Respan/Traceloop conventions
used by the OTLP pipeline.

## Install

```bash
pip install respan-instrumentation-pydantic-ai
```

## Quickstart

```python
import os

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider
from respan import Respan
from respan_instrumentation_pydantic_ai import PydanticAIInstrumentor

respan = Respan(
    api_key=os.environ["RESPAN_API_KEY"],
    instrumentations=[PydanticAIInstrumentor()],
)

agent = Agent(
    OpenAIModel("gpt-4o-mini", provider=OpenAIProvider(api_key=os.environ["OPENAI_API_KEY"]))
)

result = agent.run_sync("Write a haiku about tracing.")
print(result.output)

respan.flush()
```

## Notes

- By default the instrumentor enables global PydanticAI instrumentation via `Agent.instrument_all(...)`.
- Pass `agent=...` to `PydanticAIInstrumentor(...)` if you only want to instrument one agent instance.
- The plugin uses explicit `InstrumentationSettings(version=4)` to keep emitted spans on the current GenAI semantic conventions used elsewhere in this repo.
- This package does not depend on OpenInference at runtime; it consumes native PydanticAI telemetry directly.
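As a sketch of the single-agent mode described above (assuming the same constructor names as the quickstart; exact keyword behavior may vary by version), you can pass the agent instance to the instrumentor so other agents in the process stay uninstrumented:

```python
import os

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider
from respan import Respan
from respan_instrumentation_pydantic_ai import PydanticAIInstrumentor

# This agent will be instrumented; agents created elsewhere will not be.
traced_agent = Agent(
    OpenAIModel("gpt-4o-mini", provider=OpenAIProvider(api_key=os.environ["OPENAI_API_KEY"]))
)

respan = Respan(
    api_key=os.environ["RESPAN_API_KEY"],
    # Scope instrumentation to one agent instead of Agent.instrument_all(...)
    instrumentations=[PydanticAIInstrumentor(agent=traced_agent)],
)
```

Scoping instrumentation this way can be useful in test suites or multi-tenant processes where only one agent's telemetry should reach the Respan pipeline.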

