Metadata-Version: 2.4
Name: aevum-llm
Version: 0.3.0
Summary: Aevum — LiteLLM-backed LLM complication with episodic ledger recording.
Project-URL: Homepage, https://aevum.build
Project-URL: Repository, https://github.com/aevum-labs/aevum
License: Apache-2.0
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3.11
Classifier: Typing :: Typed
Requires-Python: >=3.11
Requires-Dist: aevum-core
Requires-Dist: aevum-sdk
Requires-Dist: aiohttp>=3.13.4
Requires-Dist: litellm>=1.40
Requires-Dist: python-dotenv>=1.2.2
Provides-Extra: dev
Requires-Dist: mypy>=1.10; extra == 'dev'
Requires-Dist: pytest-asyncio>=0.23; extra == 'dev'
Requires-Dist: pytest>=8.0; extra == 'dev'
Requires-Dist: ruff>=0.9; extra == 'dev'
Description-Content-Type: text/markdown

# aevum-llm

LiteLLM-backed LLM complication for Aevum. Every call is audited: model ID, prompt hash, and response hash are recorded in the episodic ledger. Raw prompts and responses are never stored.

```bash
pip install aevum-llm
```

```python
from aevum.llm import LlmComplication
from aevum.core import Engine

engine = Engine()
engine.install_complication(
    LlmComplication(model="claude-sonnet-4-6", fallback_models=["gpt-4.1"]),
    auto_approve=True,
)
```
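The auditing model above can be illustrated with a small sketch. This is not the aevum-llm API, just a hypothetical `ledger_entry` helper showing what hash-only recording looks like: the ledger keeps the model ID plus SHA-256 digests of the prompt and response, so the raw text never touches the ledger.

```python
import hashlib

def ledger_entry(model: str, prompt: str, response: str) -> dict:
    """Hypothetical example: an episodic ledger record that stores only
    the model ID and content digests, never the raw prompt or response."""
    return {
        "model": model,
        "prompt_sha256": hashlib.sha256(prompt.encode("utf-8")).hexdigest(),
        "response_sha256": hashlib.sha256(response.encode("utf-8")).hexdigest(),
    }

entry = ledger_entry("claude-sonnet-4-6", "What time is it?", "It is noon.")
print(entry)
```

Because only digests are stored, a record can later verify that a given prompt/response pair matches the ledger without the ledger ever containing the text itself.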

See the [main repository README](https://github.com/aevum-labs/aevum) for the complication installation guide.
