Metadata-Version: 2.4
Name: eager-tools-openai
Version: 0.3.0
Summary: OpenAI adapter for eager tool calling — wraps chat.completions streaming to emit SealEvents.
Project-URL: Homepage, https://github.com/cloudthinker-ai/eager-tools
Project-URL: Repository, https://github.com/cloudthinker-ai/eager-tools
Project-URL: Issues, https://github.com/cloudthinker-ai/eager-tools/discussions
Author-email: CloudThinker <hello@cloudthinker.io>
License: MIT
Keywords: agent,gpt,llm,openai,streaming,tool-calling
Classifier: Development Status :: 3 - Alpha
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Typing :: Typed
Requires-Python: >=3.11
Requires-Dist: eager-tools-core>=0.3.0
Provides-Extra: dev
Requires-Dist: openai>=1.50; extra == 'dev'
Requires-Dist: pyrefly>=0.1; extra == 'dev'
Requires-Dist: pytest-asyncio>=0.23; extra == 'dev'
Requires-Dist: pytest>=8; extra == 'dev'
Requires-Dist: ruff>=0.5; extra == 'dev'
Description-Content-Type: text/markdown

# eager-tools-openai

OpenAI adapter for **eager tool calling** — wraps `chat.completions.create(stream=True)` and dispatches each tool the instant its block seals, overlapping tool execution with ongoing LLM generation.

> For the mechanism and runtime contract, see the [top-level `METHOD.md`](../../METHOD.md).
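The core idea — starting each tool the moment its arguments are complete, while tokens keep streaming — can be illustrated independently of this package with plain `asyncio`. This is a minimal sketch with a simulated stream; the names and event shapes are illustrative only, not this package's API:

```python
import asyncio

# Simulated model stream: each item is either a text chunk or a
# "sealed" tool call whose arguments are fully parsed.
STREAM = [
    ("text", "Thinking..."),
    ("tool", ("lookup", "42")),   # sealed: dispatch immediately
    ("text", "...more tokens while lookup runs..."),
    ("tool", ("render", "ok")),
]

async def run_tool(name, arg):
    await asyncio.sleep(0.01)  # pretend the tool does real work
    return f"{name}({arg}) done"

async def consume():
    pending = []

    async def stream():
        for item in STREAM:
            await asyncio.sleep(0.005)  # tokens keep arriving
            yield item

    async for kind, payload in stream():
        if kind == "tool":
            # Eager dispatch: schedule the tool now instead of
            # waiting for the stream to finish.
            pending.append(asyncio.create_task(run_tool(*payload)))

    # By the time the stream ends, tools have been running in parallel.
    return await asyncio.gather(*pending)

results = asyncio.run(consume())
print(results)  # ['lookup(42) done', 'render(ok) done']
```

The adapter applies the same pattern to `chat.completions` deltas: tool execution overlaps with the remainder of the LLM's generation instead of starting after it.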

## Status

**Alpha / scaffold.** Public API shape is locked; implementation bodies arrive in Move 3 (CloudThinker port). See [`NEXT.md`](../../NEXT.md).

## Install (once published)

```bash
pip install eager-tools-openai
```

## Usage

```python
from openai import AsyncOpenAI
from eager_tools_openai import OpenAIEagerStream

client = AsyncOpenAI()

source = await client.chat.completions.create(
    model="gpt-4.1",
    tools=[...],
    messages=[...],
    stream=True,
)
runner = OpenAIEagerStream(source, tools=my_tools)

# Low-level view: each SealEvent as the stream is consumed.
async for event in runner.events():
    print(event.kind, event.tool_call)

# High-level view: (call, result) pairs as tools complete.
async for call, result in runner.results():
    print(call.name, "→", result)
```
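For reference, the `tools=[...]` argument elided above follows OpenAI's standard function-tool schema for `chat.completions`. A minimal example (the function name and parameters here are illustrative, not part of this package):

```python
# One function tool in the shape chat.completions.create expects.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]
```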

## Supported SDK surfaces

- `chat.completions.create(stream=True)` with `tools=[...]` — **v0.1 target**.
- Responses API (`responses.stream(...)`) — **deferred**; contributions welcome.

## Supported SDK versions

- `openai>=1.50`.

## License

MIT. See top-level [`LICENSE`](../../LICENSE).
