Metadata-Version: 2.4
Name: llm-api-engine
Version: 0.1.1
Summary: Unified LLM API provider and engine library
Author: LLM Engine Developer
License: MIT
Project-URL: Homepage, https://github.com/yourusername/llm-engine
Project-URL: Documentation, https://github.com/yourusername/llm-engine#readme
Project-URL: Repository, https://github.com/yourusername/llm-engine
Project-URL: Issues, https://github.com/yourusername/llm-engine/issues
Keywords: llm,api,openai,deepseek,ollama,ai,language-model,chatgpt
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.8
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: openai>=1.0.0
Requires-Dist: litellm>=1.0.0
Requires-Dist: pydantic>=2.0.0
Requires-Dist: pyyaml>=6.0.0
Requires-Dist: loguru>=0.7.0
Provides-Extra: dev
Requires-Dist: pytest>=7.0.0; extra == "dev"
Requires-Dist: pytest-cov>=4.0.0; extra == "dev"
Requires-Dist: pytest-asyncio>=0.21.0; extra == "dev"
Requires-Dist: mypy>=1.5.0; extra == "dev"
Requires-Dist: ruff>=0.1.0; extra == "dev"
Dynamic: license-file

# LLM Engine

Unified LLM API provider and engine library for Python.

## Features

- Support for multiple LLM providers (OpenAI, DeepSeek, Ollama, Custom)
- Both synchronous and asynchronous API support
- Unified configuration via `providers.yml`
- Environment variable resolution
- Automatic retry logic
- Streaming support
- Token estimation and resource management

## Installation

Install from PyPI (the distribution name is `llm-api-engine`; the package imports as `llm_engine`):

```bash
pip install llm-api-engine
```

For local development:

```bash
cd llm-engine
pip install -e .
```

Or install with development dependencies:

```bash
pip install -e ".[dev]"
```

## Configuration

Create a `providers.yml` file:

```yaml
providers:
  deepseek:
    base_url: "https://api.deepseek.com/v1"
    api_key: ${DEEPSEEK_API_KEY}
    default_model: "deepseek-chat"
    models:
      - name: "deepseek-chat"
        context_length: 128000
        functions:
          json_output: true
```
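The `${DEEPSEEK_API_KEY}` placeholder is resolved from the environment when the configuration is loaded. A minimal sketch of how such `${VAR}` substitution can work (the `resolve_env` helper is illustrative, not the library's actual loader, and the demo dict stands in for the parsed YAML):

```python
import os
import re

_ENV_PATTERN = re.compile(r"\$\{(\w+)\}")

def resolve_env(value):
    """Recursively replace ${VAR} placeholders with values from os.environ."""
    if isinstance(value, str):
        return _ENV_PATTERN.sub(lambda m: os.environ.get(m.group(1), ""), value)
    if isinstance(value, dict):
        return {k: resolve_env(v) for k, v in value.items()}
    if isinstance(value, list):
        return [resolve_env(v) for v in value]
    return value

os.environ["DEEPSEEK_API_KEY"] = "sk-test"  # for demonstration only
raw = {"providers": {"deepseek": {"api_key": "${DEEPSEEK_API_KEY}"}}}
resolved = resolve_env(raw)
print(resolved["providers"]["deepseek"]["api_key"])  # sk-test
```

Unset variables are replaced with an empty string here; a real loader might instead raise an error so missing credentials fail fast.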

## Usage

### Async Usage

```python
from llm_engine import LLMConfig, LLMProvider, LLMEngine

config = LLMConfig(
    provider=LLMProvider.DEEPSEEK,
    model_name="deepseek-chat",
    api_key="your-api-key",
)

engine = LLMEngine(config)
response = await engine.generate("Hello, world!")
```
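Because `generate` is a coroutine, the snippet above must run inside an event loop. A minimal sketch of that pattern using `asyncio.run`, with a hypothetical stub coroutine standing in for the real `engine.generate` call shown above:

```python
import asyncio

# Hypothetical stub standing in for `engine.generate`; in real use,
# build the LLMEngine as shown above and await engine.generate(prompt).
async def generate(prompt: str) -> str:
    return f"echo: {prompt}"

async def main() -> str:
    # In real code: engine = LLMEngine(config); return await engine.generate(...)
    return await generate("Hello, world!")

result = asyncio.run(main())
print(result)
```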

### Sync Usage

```python
from llm_engine import LLMConfig, LLMProvider
from llm_engine.providers.openai_compatible import OpenAICompatibleProvider

config = LLMConfig(
    provider=LLMProvider.DEEPSEEK,
    model_name="deepseek-chat",
    api_key="your-api-key",
)

provider = OpenAICompatibleProvider(config)
response = provider.call("Hello, world!")
```

## License

MIT
