Metadata-Version: 2.4
Name: raindrop-azure-openai
Version: 0.0.3
Summary: Raindrop observability integration for Azure OpenAI
Author-email: Raindrop AI <sdk@raindrop.ai>
License-Expression: MIT
Project-URL: Homepage, https://raindrop.ai
Project-URL: Documentation, https://docs.raindrop.ai
Project-URL: Repository, https://github.com/invisible-tools/dawn
Classifier: Typing :: Typed
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: raindrop-ai>=0.0.42
Requires-Dist: openai>=1.0.0
Provides-Extra: dev
Requires-Dist: pytest>=8.0; extra == "dev"
Requires-Dist: requests>=2.28; extra == "dev"
Dynamic: license-file

# raindrop-azure-openai

Raindrop observability integration for [Azure OpenAI](https://learn.microsoft.com/en-us/azure/ai-services/openai/) (Python). Automatically captures `chat.completions.create()` calls by wrapping `AzureOpenAI` and `AsyncAzureOpenAI` clients.

## Installation

```bash
pip install raindrop-azure-openai openai
```

## Quick Start

```python
from raindrop_azure_openai import RaindropAzureOpenAI
from openai import AzureOpenAI

raindrop = RaindropAzureOpenAI(
    api_key="rk_...",
    user_id="user-123",
    debug=False,  # set True for verbose logging
)

client = AzureOpenAI(
    azure_endpoint="https://your-resource.openai.azure.com",
    api_key="...",
    api_version="2024-10-21",
)
wrapped = raindrop.wrap(client)

response = wrapped.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)

raindrop.shutdown()
```

## Debug Mode

```python
raindrop = RaindropAzureOpenAI(api_key="rk_...", debug=True)
```

When `debug=True`, verbose logs are emitted to the `raindrop_azure_openai` logger at `DEBUG` level.
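
If your application does not already configure logging, attaching a handler to that logger makes the debug output visible. This sketch uses only the standard-library `logging` module and assumes nothing about the SDK beyond the logger name stated above:

```python
import logging

# The root logger defaults to WARNING, so DEBUG records on the
# "raindrop_azure_openai" logger are dropped unless a handler is attached.
logger = logging.getLogger("raindrop_azure_openai")
logger.setLevel(logging.DEBUG)

handler = logging.StreamHandler()  # writes to stderr
handler.setFormatter(
    logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
)
logger.addHandler(handler)
```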

## Async Support

```python
import asyncio

from raindrop_azure_openai import RaindropAzureOpenAI
from openai import AsyncAzureOpenAI

raindrop = RaindropAzureOpenAI(api_key="rk_...", user_id="user-123")

client = AsyncAzureOpenAI(
    azure_endpoint="https://your-resource.openai.azure.com",
    api_key="...",
    api_version="2024-10-21",
)
wrapped = raindrop.wrap(client)

async def main() -> None:
    # `await` is only valid inside a coroutine, so the call lives in main().
    response = await wrapped.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Hello!"}],
    )

asyncio.run(main())
raindrop.shutdown()
```

## User Identification

```python
raindrop.identify("user-123", traits={"plan": "pro", "name": "Alice"})
```

## Tracking Signals

```python
raindrop.track_signal(
    event_id="evt-abc",
    name="thumbs_up",
    signal_type="feedback",
    sentiment="POSITIVE",
    comment="Great response!",
)
```

## Flushing and Shutdown

```python
raindrop.flush()     # flush pending data
raindrop.shutdown()  # flush + release resources
```

## Factory Function (Backwards-Compatible)

```python
from raindrop_azure_openai import create_raindrop_azure_openai

raindrop = create_raindrop_azure_openai(api_key="rk_...", user_id="user-123")
# raindrop is now a RaindropAzureOpenAI instance with .wrap(), .flush(), .shutdown()
```

## What Gets Captured

- **Chat completions** — input messages, output text, model, token usage
- **Finish reason** — `azure_openai.finish_reason` (`stop`, `length`, `content_filter`, `tool_calls`)
- **Extended tokens** — `ai.usage.cached_tokens` (prompt cache hits) and `ai.usage.thoughts_tokens` (reasoning tokens for o1/o3 models)
- **Errors** — error type and message captured as properties, then re-raised to the caller
- **Async support** — both sync (`AzureOpenAI`) and async (`AsyncAzureOpenAI`) clients are instrumented
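
As a rough illustration of where the extended-token values come from: the nested shape below mirrors the OpenAI SDK's usage object (`prompt_tokens_details` / `completion_tokens_details`), the numbers are made up, and the property names are the ones listed above:

```python
# Hypothetical usage payload shaped like the OpenAI SDK's CompletionUsage.
usage = {
    "prompt_tokens": 120,
    "completion_tokens": 45,
    "prompt_tokens_details": {"cached_tokens": 100},        # prompt cache hits
    "completion_tokens_details": {"reasoning_tokens": 30},  # o1/o3 reasoning tokens
}

# Mapping to the captured property names from the list above.
captured = {
    "ai.usage.cached_tokens": usage["prompt_tokens_details"]["cached_tokens"],
    "ai.usage.thoughts_tokens": usage["completion_tokens_details"]["reasoning_tokens"],
}
print(captured)  # {'ai.usage.cached_tokens': 100, 'ai.usage.thoughts_tokens': 30}
```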

## Configuration

| Option | Type | Default | Description |
|--------|------|---------|-------------|
| `api_key` | `str \| None` | `None` | Raindrop API key. `None` disables telemetry shipping |
| `user_id` | `str \| None` | `None` | Associate all events with a user (falls back to `"unknown"`) |
| `convo_id` | `str \| None` | `None` | Group events into a conversation |
| `tracing_enabled` | `bool` | `True` | Enable OTEL-based tracing |
| `bypass_otel_for_tools` | `bool` | `True` | Bypass OTEL for tool calls |
| `debug` | `bool` | `False` | Enable verbose debug logging |

## Double-Wrap Protection

Calling `wrap()` on an already-instrumented client is a safe no-op — the client is returned unchanged.
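
A minimal sketch of the guard pattern such no-ops typically use — the marker attribute name here is hypothetical, not the SDK's actual internal flag:

```python
_MARKER = "_raindrop_instrumented"  # hypothetical marker, not the SDK's real attribute

def wrap_once(client, instrument):
    """Instrument `client` at most once; repeated calls return it unchanged."""
    if getattr(client, _MARKER, False):
        return client  # already wrapped: safe no-op
    instrument(client)
    setattr(client, _MARKER, True)
    return client

class FakeClient:
    pass

calls = []
c = FakeClient()
wrap_once(c, calls.append)
wrap_once(c, calls.append)  # second call instruments nothing
print(len(calls))  # 1
```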

## Full Documentation

See the [Raindrop docs](https://docs.raindrop.ai/integrations/azure-openai) for the complete reference.

## Testing

```bash
cd packages/azure-openai-python
pip install -e ".[dev]"
python -m pytest tests/ -v
```

## License

MIT
