Metadata-Version: 2.4
Name: agent-obs-sdk
Version: 0.1.2
Summary: Lightweight Python SDK for Agent Observability Tracing — instruments multi-agent AI workflows
License: MIT
Keywords: agents,langchain,observability,opentelemetry,tracing
Requires-Python: >=3.9
Requires-Dist: httpx>=0.24.0
Requires-Dist: opentelemetry-api>=1.20.0
Requires-Dist: opentelemetry-sdk>=1.20.0
Provides-Extra: langchain
Requires-Dist: langchain>=0.1.0; extra == 'langchain'
Description-Content-Type: text/markdown

# Agent Trace SDK (agent-obs-sdk)

Welcome to the **Agent Trace SDK**, the official, lightweight Python library for connecting your AI agent swarms to the **Agent Observability Tracing Platform**.

## The Core Idea

As AI workflows become increasingly complex—incorporating multiple specialized agents, chained LLM calls, and asynchronous interactions—debugging and observability become critical challenges. It's often difficult to answer questions like:
- *Why did Agent A hand off control to Agent B instead of Agent C?*
- *What specific prompt sequence caused the LLM to hallucinate?*
- *How long did the retrieval step take compared to generation?*

**Agent Trace SDK** solves this. It provides drop-in observability for multi-agent workflows, letting you trace logic execution, tool calls, token usage, and latency seamlessly. By simply decorating your agent functions, you gain deep visibility via our interactive tracing dashboard.

## Key Features
- **Zero-Friction Integration**: Start tracing with a single decorator (`@trace_agent()`).
- **Framework Agnostic but LangChain Friendly**: Works out-of-the-box with any standard Python functions, with optional extensions specifically for LangChain workflows.
- **Deep Visibility**: Captures inputs, outputs, execution time, and any intermediate states.
- **OpenTelemetry Based**: Built on standard `opentelemetry` APIs for reliable trace delivery.
- **Hierarchical Traces**: Automatically links parent-child agent workflows for easy debugging of Swarm/Agent hierarchies.
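To illustrate what "hierarchical traces" means, here is a minimal, self-contained sketch of how parent-child span linking can work with a decorator and a context variable. This is an illustrative stand-in, **not** the SDK's actual implementation: the `trace_agent` decorator and the `collected` list below are toy versions defined locally, and the real SDK exports spans to the platform instead of a Python list.

```python
import contextvars
import functools
import itertools

# Toy stand-in for the SDK's decorator: each traced call records the id of
# the currently active parent span, so nested agent calls form a tree.
_current_span = contextvars.ContextVar("current_span", default=None)
_ids = itertools.count(1)
collected = []  # stands in for the exporter that ships spans to the platform

def trace_agent(agent_name):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            parent = _current_span.get()
            span = {"id": next(_ids), "agent": agent_name,
                    "parent": parent["id"] if parent else None}
            token = _current_span.set(span)  # children see this span as parent
            try:
                return fn(*args, **kwargs)
            finally:
                _current_span.reset(token)
                collected.append(span)
        return wrapper
    return decorator

@trace_agent("Planner")
def planner(task):
    return researcher(task)  # child call is linked to the Planner span

@trace_agent("Researcher")
def researcher(task):
    return f"notes on {task}"

planner("trip to Kyoto")
for span in collected:
    print(span["agent"], "parent:", span["parent"])
```

The key design point is the context variable: because it is scoped to the current execution context, the same mechanism also links spans correctly across `asyncio` tasks.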

## Installation

Getting started is as easy as installing via pip:

```bash
pip install agent-obs-sdk
```

*(Optional) If you use LangChain, install the `langchain` extra:*

```bash
pip install "agent-obs-sdk[langchain]"
```

## Quick Start & Usage

Below is a minimal setup showing how to start tracing an agent and capturing its outputs.

```python
import os
from agent_trace import trace_agent, configure

# 1. Configure the SDK with your platform credentials
configure(
    api_endpoint=os.getenv("AGENT_OBS_ENDPOINT", "http://localhost:8000"),
    api_key=os.getenv("AGENT_OBS_API_KEY", "YOUR_API_KEY"),
    project_id="YOUR_PROJECT_SLUG",
)

# 2. Decorate your agent functions
@trace_agent(agent_name="GreetingAgent")
def my_agent(query: str):
    """A conversational agent that processes a user's query."""
    print(f"Agent is reasoning about '{query}'...")
    # ... complex LLM calls or tool invocations here ...
    return f"Processed Response to: {query}"

if __name__ == "__main__":
    reply = my_agent("Hello, can you help me plan a trip?")
    print(reply)
```

## Configuration Options

You can configure the SDK using the `configure()` function, or via environment variables (which are automatically picked up):
- `AGENT_OBS_ENDPOINT` / `api_endpoint`: The base URL of the Agent Observability Platform.
- `AGENT_OBS_API_KEY` / `api_key`: Your secret token.
- `AGENT_OBS_PROJECT_ID` / `project_id`: Distinguishes traces across multiple projects.
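For example, in a deployment environment you can set the three variables once and rely on the automatic pickup instead of passing arguments to `configure()`. The endpoint URL and project slug below are placeholders:

```bash
export AGENT_OBS_ENDPOINT="https://obs.example.com"
export AGENT_OBS_API_KEY="sk-..."          # your secret token; keep it out of source control
export AGENT_OBS_PROJECT_ID="demo-project"
```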

## Contributing

We welcome issues, feature requests, and pull requests! Let's build the future of AI Agent Observability together.
