Metadata-Version: 2.4
Name: syntropy-ai
Version: 0.2.1
Summary: Universal AI Observability and Security Agent — trace, guard, and govern LLM calls across every provider.
Project-URL: Homepage, https://syntropy.dev
Project-URL: Documentation, https://docs.syntropy.dev/sdk/python
Project-URL: Repository, https://github.com/elhossam7/syntropy-platform-build
Project-URL: Changelog, https://github.com/elhossam7/syntropy-platform-build/blob/main/CHANGELOG.md
Project-URL: Issues, https://github.com/elhossam7/syntropy-platform-build/issues
Author-email: Syntropy <sdk@syntropy.dev>
License-Expression: MIT
Keywords: agents,ai,anthropic,bedrock,gemini,guardrails,langchain,llamaindex,llm,mistral,observability,openai,tracing
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development :: Libraries
Classifier: Typing :: Typed
Requires-Python: >=3.9
Requires-Dist: httpx>=0.24.0
Provides-Extra: all
Requires-Dist: anthropic>=0.18.0; extra == 'all'
Requires-Dist: boto3>=1.28.0; extra == 'all'
Requires-Dist: google-generativeai>=0.4.0; extra == 'all'
Requires-Dist: langchain-core>=0.1.0; extra == 'all'
Requires-Dist: llama-index-core>=0.10.0; extra == 'all'
Requires-Dist: mistralai>=0.1.0; extra == 'all'
Requires-Dist: openai>=1.0.0; extra == 'all'
Provides-Extra: anthropic
Requires-Dist: anthropic>=0.18.0; extra == 'anthropic'
Provides-Extra: bedrock
Requires-Dist: boto3>=1.28.0; extra == 'bedrock'
Provides-Extra: dev
Requires-Dist: mypy>=1.8; extra == 'dev'
Requires-Dist: pytest-asyncio>=0.21; extra == 'dev'
Requires-Dist: pytest-cov>=4.0; extra == 'dev'
Requires-Dist: pytest>=7.0; extra == 'dev'
Requires-Dist: ruff>=0.3.0; extra == 'dev'
Provides-Extra: google
Requires-Dist: google-generativeai>=0.4.0; extra == 'google'
Provides-Extra: langchain
Requires-Dist: langchain-core>=0.1.0; extra == 'langchain'
Provides-Extra: llamaindex
Requires-Dist: llama-index-core>=0.10.0; extra == 'llamaindex'
Provides-Extra: mistral
Requires-Dist: mistralai>=0.1.0; extra == 'mistral'
Provides-Extra: openai
Requires-Dist: openai>=1.0.0; extra == 'openai'
Description-Content-Type: text/markdown

# Syntropy Python SDK

Trace, guard, and govern your AI agents with zero-effort integration.

## Installation

```bash
pip install syntropy-ai
```
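Provider-specific integrations ship as optional extras (declared in this package's metadata); install only the ones you need, or `all` for everything:

```bash
# Single provider
pip install "syntropy-ai[openai]"

# Multiple providers
pip install "syntropy-ai[anthropic,bedrock]"

# Everything: openai, anthropic, bedrock, google, langchain, llamaindex, mistral
pip install "syntropy-ai[all]"
```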

## Quick Start

```python
from syntropy import Syntropy

client = Syntropy(
    api_key="syn_your_key_here",
    base_url="https://ingest.syntropy.dev",
    agent_id="my-agent",
)

# Trace an LLM call (buffered, non-blocking)
client.trace(
    prompt="What is the capital of France?",
    response="The capital of France is Paris.",
    model="gpt-4o",
    tokens_in=12,
    tokens_out=8,
    latency_ms=340,
)

# Synchronous trace (blocks until confirmed)
result = client.trace_sync(
    prompt="Hello",
    response="Hi there!",
    model="gpt-4o-mini",
)
print(result.success, result.trace_id)
```

## OpenAI Integration

Auto-trace all `chat.completions.create()` calls:

```python
import openai
from syntropy import Syntropy
from syntropy.integrations.openai import wrap_openai

syn = Syntropy(api_key="syn_...", agent_id="my-agent")
client = wrap_openai(openai.OpenAI(), syntropy=syn)

# This call is now auto-traced; existing call sites need no changes:
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
```

## Async Support

```python
import asyncio

from syntropy.client import AsyncSyntropy

async def main():
    client = AsyncSyntropy(
        api_key="syn_...",
        base_url="https://ingest.syntropy.dev",
        agent_id="my-agent",
    )

    result = await client.trace(
        prompt="Hello",
        response="Hi!",
        model="gpt-4o",
    )
    print(result.success)
    await client.close()

asyncio.run(main())
```

## Batch Tracing

```python
client.trace_batch([
    {"agent_id": "agent-1", "prompt": "Q1", "response": "A1", "model": "gpt-4o"},
    {"agent_id": "agent-2", "prompt": "Q2", "response": "A2", "model": "gpt-4o"},
])
```

## Configuration

| Option | Default | Description |
|--------|---------|-------------|
| `api_key` | required | Your Syntropy API key |
| `base_url` | `http://localhost:4000` | Ingestion service URL |
| `agent_id` | `""` | Default agent ID |
| `flush_interval` | `2.0` | Seconds between buffer flushes |
| `batch_size` | `50` | Max traces per batch |
| `max_queue_size` | `10000` | Max buffered traces |
| `enabled` | `True` | Set `False` to disable tracing |
