Metadata-Version: 2.4
Name: llmkit-python
Version: 0.1.0
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Rust
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Typing :: Typed
Summary: Unified LLM API client for Python - multi-provider support with a single interface
Keywords: llm,ai,openai,anthropic,claude,gpt,llama
License: MIT OR Apache-2.0
Requires-Python: >=3.9
Description-Content-Type: text/markdown; charset=UTF-8; variant=GFM
Project-URL: Documentation, https://github.com/yfedoseev/llmkit#readme
Project-URL: Homepage, https://github.com/yfedoseev/llmkit
Project-URL: Repository, https://github.com/yfedoseev/llmkit

# LLMKit Python

Python bindings for LLMKit, a unified LLM API client library with a Rust core.

## Installation

```bash
pip install llmkit-python
```

## Quick Start

```python
import asyncio

from llmkit import LLMKitClient, AsyncLLMKitClient, CompletionRequest, Message

# Sync client - model names use the "provider/model" format
client = LLMKitClient.from_env()
request = CompletionRequest(
    model="anthropic/claude-sonnet-4-20250514",
    messages=[Message.user("Hello!")],
)
response = client.complete(request)
print(response.text_content())

# Streaming - reuses the request defined above
for chunk in client.complete_stream(request):
    if chunk.text:
        print(chunk.text, end="", flush=True)

# Async client
async def main():
    client = AsyncLLMKitClient.from_env()
    response = await client.complete(CompletionRequest(
        model="anthropic/claude-sonnet-4-20250514",
        messages=[Message.user("Hello!")],
    ))
    print(response.text_content())

asyncio.run(main())
```
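Both clients are built with `from_env()`, which reads provider credentials from environment variables. The exact variable names are an assumption here (they follow the common per-provider convention, not a documented llmkit contract), so check your provider's docs; a typical setup might look like:

```shell
# Illustrative only: variable names follow common provider conventions
# and are not confirmed by the llmkit documentation above.
export ANTHROPIC_API_KEY="your-anthropic-key"
export OPENAI_API_KEY="your-openai-key"
```

Set only the variables for the providers you actually call; `from_env()` is expected to pick up whichever keys are present.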

## Features

- Unified API for 100+ LLM providers
- Sync and async clients
- Streaming support
- Tool/function calling
- Extended thinking (reasoning)
- Prompt caching
- Structured output (JSON schema)
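For structured output, the schema itself is plain JSON Schema, independent of llmkit. How the schema is attached to a request (parameter name, wrapper type) is not shown in this README, so the sketch below only illustrates the kind of schema you would define; the `extraction_schema` name and its fields are illustrative, not part of the llmkit API:

```python
import json

# A standard JSON Schema describing the shape of the desired model output.
# Nothing here is llmkit-specific; this is the portable part of a
# structured-output request.
extraction_schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "sentiment": {
            "type": "string",
            "enum": ["positive", "neutral", "negative"],
        },
        "confidence": {"type": "number", "minimum": 0, "maximum": 1},
    },
    "required": ["name", "sentiment"],
}

print(json.dumps(extraction_schema, indent=2))
```

Constraining fields with `enum` and numeric bounds keeps model output machine-checkable regardless of which provider serves the request.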

## License

MIT OR Apache-2.0

