Metadata-Version: 2.4
Name: agentscope-otel
Version: 0.2.1
Summary: Python SDK for AgentScope — AI Agent Observability with OpenTelemetry
Project-URL: Homepage, https://github.com/moklabs/agentscope
Project-URL: Repository, https://github.com/moklabs/agentscope
Project-URL: Issues, https://github.com/moklabs/agentscope/issues
Project-URL: Documentation, https://github.com/moklabs/agentscope/tree/main/packages/sdk-python
License: MIT
License-File: LICENSE
Keywords: agent,agentscope,ai,gen-ai,llm,observability,opentelemetry,otlp,sdk,tracing
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Software Development :: Libraries
Classifier: Topic :: System :: Monitoring
Requires-Python: >=3.10
Requires-Dist: httpx>=0.25.0
Requires-Dist: opentelemetry-api>=1.28.0
Requires-Dist: opentelemetry-exporter-otlp>=1.28.0
Requires-Dist: opentelemetry-sdk>=1.28.0
Provides-Extra: dev
Requires-Dist: pytest>=8.0; extra == 'dev'
Requires-Dist: ruff; extra == 'dev'
Description-Content-Type: text/markdown

# agentscope-otel

Python SDK for [AgentScope](https://github.com/moklabs/agentscope) -- AI agent observability with OpenTelemetry.

## Installation

```bash
pip install agentscope-otel
```

## Quick Start

```python
from agentscope_otel import AgentScope

# Initialize once at startup
scope = AgentScope.init(
    endpoint="http://localhost:3001",
    project_id="my-project",
)

# Use decorators for automatic span management
@scope.wrap_agent("my-agent", agent_name="researcher", agent_task="Answer questions")
async def run_agent(question: str):
    # Nested tool call -- auto-linked to the agent span via context propagation
    # (`search` is a placeholder for your own tool function)
    with scope.trace_tool("web-search") as span:
        results = await search(question)

    # LLM call with automatic token tracking -- the returned result should
    # expose input_tokens and output_tokens attributes
    @scope.wrap_llm_call("generate", model="claude-sonnet-4-5")
    async def call_llm(prompt):
        return await llm.complete(prompt)  # `llm` is a placeholder client

    return await call_llm(f"Summarize: {results}")

# Run, then flush any pending spans before exit
import asyncio
asyncio.run(run_agent("What is AgentScope?"))
scope.flush()
```

## API

### Initialization

```python
scope = AgentScope.init(
    endpoint="http://localhost:3001",
    project_id="my-project",
    tenant_id="default",         # optional
    api_key="sk-...",            # optional
    service_name="my-service",   # optional
    debug=True,                  # optional
)
```

### Decorators

```python
@scope.wrap_agent("name", agent_name="...", agent_task="...")
@scope.wrap_tool("name", agent_name="...")
@scope.wrap_step("name")
@scope.wrap_llm_call("name", model="claude-sonnet-4-5")
```

### Context Managers

```python
with scope.trace_agent("name") as span:
    ...
with scope.trace_tool("name") as span:
    ...
with scope.trace_step("name") as span:
    ...
```

### Manual Span Control

```python
span = scope.create_agent_span("name", agent_name="...")
# ... work ...
span.end()

llm_span = scope.create_llm_span("name", model="claude-sonnet-4-5")
# ... call LLM ...
scope.end_llm_span(llm_span, LLMResult(input_tokens=100, output_tokens=50))
```

### Cost Calculation

```python
from agentscope_otel import calculate_cost, MODEL_PRICING

cost = calculate_cost("claude-sonnet-4-5", input_tokens=1000, output_tokens=500)
```

## Supported Models

| Model | Input (cents/1M tokens) | Output (cents/1M tokens) |
|-------|------------------------|-------------------------|
| gpt-4o | 250 | 1000 |
| gpt-4o-mini | 15 | 60 |
| claude-sonnet-4-5 | 300 | 1500 |
| claude-haiku-4-5 | 80 | 400 |
| claude-opus-4-6 | 1500 | 7500 |
| gemini-2.0-flash | 10 | 40 |
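To make the units in the table concrete, here is a minimal sketch of the arithmetic `calculate_cost` presumably performs. The dict and function names are illustrative, not the SDK's internals; pricing values are copied from the table above (cents per 1M tokens).

```python
# Pricing from the table above: (input, output) in cents per 1M tokens.
PRICING_CENTS_PER_M = {
    "gpt-4o": (250, 1000),
    "gpt-4o-mini": (15, 60),
    "claude-sonnet-4-5": (300, 1500),
    "claude-haiku-4-5": (80, 400),
    "claude-opus-4-6": (1500, 7500),
    "gemini-2.0-flash": (10, 40),
}

def estimate_cost_cents(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimated cost in cents for a single call, per the table above."""
    in_rate, out_rate = PRICING_CENTS_PER_M[model]
    return (input_tokens * in_rate + output_tokens * out_rate) / 1_000_000

# 1,000 input + 500 output tokens on claude-sonnet-4-5:
# (1000 * 300 + 500 * 1500) / 1e6 = 1.05 cents
print(estimate_cost_cents("claude-sonnet-4-5", 1000, 500))
```

The actual return units of `calculate_cost` may differ (e.g. dollars vs. cents); check the SDK source before relying on this for billing.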

## Span Types

- `session` -- Top-level session grouping
- `agent_run` -- Agent execution
- `step` -- Generic step within an agent
- `tool_call` -- Tool invocation
- `llm_call` -- LLM API call
- `sub_agent` -- Nested agent invocation

## Example

A runnable quickstart is included in `examples/quickstart.py`:

```bash
python examples/quickstart.py
```

## Requirements

- Python 3.10+
- OpenTelemetry Python SDK

## License

MIT
