Metadata-Version: 2.4
Name: yamllm-core
Version: 0.1.5
Summary: YAML-based LLM configuration and execution
Project-URL: Homepage, https://github.com/codehalwell/yamllm
Project-URL: Documentation, https://codehalwell.github.io/yamllm/
Project-URL: Repository, https://github.com/codehalwell/yamllm.git
Author-email: Daniel Halwell <danielhalwell@gmail.com>
License: MIT
License-File: LICENSE
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3.10
Requires-Python: >=3.10
Requires-Dist: faiss-cpu>=1.10.0
Requires-Dist: matplotlib>=3.10.0
Requires-Dist: numpy>=2.2.3
Requires-Dist: openai>=1.63.0
Requires-Dist: pandas>=2.2.3
Requires-Dist: pydantic>=2.10.6
Requires-Dist: python-dotenv>=1.0.1
Requires-Dist: pyyaml>=6.0.2
Requires-Dist: requests>=2.32.3
Requires-Dist: ruff>=0.9.6
Requires-Dist: scikit-learn>=1.6.1
Requires-Dist: seaborn>=0.13.2
Requires-Dist: tabulate>=0.9.0
Provides-Extra: dev
Requires-Dist: black>=24.1.1; extra == 'dev'
Requires-Dist: isort>=5.13.2; extra == 'dev'
Requires-Dist: mypy>=1.8.0; extra == 'dev'
Requires-Dist: pytest>=8.0.0; extra == 'dev'
Description-Content-Type: text/markdown

# YAMLLM

A Python library for YAML-based LLM configuration and execution.

## Installation

```bash
pip install yamllm-core
```

## Quick Start

```python
from yamllm import LLM
import os

# Initialize LLM with config
llm = LLM(config_path="path/to/config.yaml")
llm.api_key = os.environ.get("OPENAI_API_KEY")

# Make a query
response = llm.query("What is the meaning of life?")
print(response)
```

## Configuration
YAMLLM uses YAML files for configuration. Create a config file (e.g. `config.yaml`) that defines the parameters for your LLM instance, such as the model name, temperature, maximum tokens, and system prompt.

Example configuration:

```yaml
model: gpt-4-turbo-preview
temperature: 0.7
max_tokens: 500
system_prompt: "You are a helpful AI assistant."
```

Place the config file in your project directory and pass its path to `LLM(config_path=...)` when initializing the instance, as shown in the Quick Start above.
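
Since malformed YAML will surface as an error at load time, it can be useful to sanity-check a config before handing it to `LLM`. This is an optional pre-flight sketch using PyYAML (already a yamllm-core dependency); the check itself is not part of the yamllm API:

```python
import yaml

# Inline stand-in for the contents of config.yaml
CONFIG_TEXT = """
model: gpt-4-turbo-preview
temperature: 0.7
max_tokens: 500
system_prompt: "You are a helpful AI assistant."
"""

config = yaml.safe_load(CONFIG_TEXT)

# Verify the keys the example configuration relies on are present
required = {"model", "temperature", "max_tokens", "system_prompt"}
missing = required - config.keys()
if missing:
    raise ValueError(f"config is missing keys: {missing}")
```

If the file parses and the expected keys are present, the same path can be passed to `LLM(config_path="config.yaml")`.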

## Features

- YAML-based configuration
- Simple API interface
- Customizable prompt templates
- Error handling and retry logic
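
The library advertises retry logic; one way to picture that behaviour is a generic wrapper that retries a callable with exponential backoff. The names below are illustrative only, not part of the yamllm API:

```python
import time

def query_with_retry(query_fn, prompt, retries=3, backoff=1.0):
    """Call query_fn(prompt), retrying on failure with exponential backoff.

    query_fn is any callable that raises on a transient failure,
    e.g. llm.query from the Quick Start example.
    """
    for attempt in range(retries):
        try:
            return query_fn(prompt)
        except Exception:
            if attempt == retries - 1:
                raise  # out of attempts: surface the last error
            time.sleep(backoff * 2 ** attempt)

# Hypothetical usage with the Quick Start object:
#   response = query_with_retry(llm.query, "What is the meaning of life?")
```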

## License

MIT License