Metadata-Version: 2.4
Name: hiway2llm
Version: 0.1.1
Summary: HiWay2LLM SDK (Python) — OpenAI-compatible client for the HiWay router. BYOK, 40-60% savings, 0% markup on inference.
Project-URL: Homepage, https://www.hiway2llm.com
Project-URL: Documentation, https://www.hiway2llm.com/docs
Project-URL: Repository, https://github.com/Hiway2llm/client
Project-URL: Issues, https://github.com/Hiway2llm/client/issues
Project-URL: Changelog, https://github.com/Hiway2llm/client/releases
Author-email: Mytm-Group <admin@mytm-group.com>
License: MIT
License-File: LICENSE
Keywords: ai,anthropic,cost-optimization,llm,llm-router,openai,openai-compatible,sdk
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Typing :: Typed
Requires-Python: >=3.9
Requires-Dist: httpx>=0.27.0
Description-Content-Type: text/markdown

# hiway2llm

Official Python SDK for [HiWay2LLM](https://www.hiway2llm.com): BYOK smart LLM routing with 40-60% savings and zero markup on inference.

## Install

```bash
pip install hiway2llm
```

Or with [uv](https://github.com/astral-sh/uv):

```bash
uv add hiway2llm
```

Works on Python 3.9+. The only runtime dependency is `httpx`.

## Quickstart

```python
import os
from hiway2llm import Hiway

h = Hiway(api_key=os.environ["HIWAY_API_KEY"])

res = h.chat.completions.create(
    model="auto",                                     # smart routing
    messages=[{"role": "user", "content": "Say hi"}],
)

print(res["choices"][0]["message"]["content"])
print(res["_hiway"]["routed_model"])                  # which model was picked
```
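The response is a plain dict mirroring the OpenAI schema, plus the `_hiway` extras shown above. A small helper can pull out the two fields most callers need. The sample dict below is a minimal sketch based only on the fields used in the quickstart, not the full response schema:

```python
def summarize_response(res: dict) -> tuple[str, str]:
    """Return (message content, routed model) from a HiWay chat response dict."""
    content = res["choices"][0]["message"]["content"]
    # Fall back gracefully if the _hiway metadata block is absent.
    routed = res.get("_hiway", {}).get("routed_model", "unknown")
    return content, routed

# A minimal response shaped like the quickstart output (illustrative values):
sample = {
    "choices": [{"message": {"role": "assistant", "content": "Hi!"}}],
    "_hiway": {"routed_model": "gpt-4o-mini"},
}
print(summarize_response(sample))  # ('Hi!', 'gpt-4o-mini')
```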

Already using the OpenAI SDK? You can keep it — just change `base_url`:

```python
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["HIWAY_API_KEY"],              # hw_live_...
    base_url="https://app.hiway2llm.com/v1",
)
```

Both approaches hit the same OpenAI-compatible endpoint.

## API surface

| Method | Description |
|---|---|
| `h.chat.completions.create(model=..., messages=..., **opts)` | OpenAI-compatible chat completion. |
| `h.models.list()` | Models your workspace can route to (BYOK-filtered). |
| `h.me()` | Current user + plan. |

## Errors

Every 4xx / 5xx response raises `HiwayError`, which carries the HTTP status in `.status` and the response body in `.body`:

```python
import os

from hiway2llm import Hiway, HiwayError

h = Hiway(api_key=os.environ["HIWAY_API_KEY"])
try:
    h.chat.completions.create(model="auto", messages=[{"role": "user", "content": "hi"}])
except HiwayError as e:
    if e.status == 402:
        # quota exhausted — prompt to upgrade
        ...
```
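Transient statuses (429 and 5xx) are usually worth retrying with exponential backoff. Below is a minimal sketch of that pattern; `with_retries` is a hypothetical helper, not part of the SDK, and it catches `Exception` so the snippet runs standalone. In real code you would catch `HiwayError` and read `e.status`:

```python
import time


def with_retries(call, *, attempts=3, base_delay=1.0):
    """Retry a zero-arg callable on 429/5xx-style errors with backoff.

    `call` wraps any SDK call, e.g.
    lambda: h.chat.completions.create(model="auto", messages=msgs).
    Errors are considered retryable when they expose a `.status` of
    429 or >= 500 (as HiwayError does).
    """
    for attempt in range(attempts):
        try:
            return call()
        except Exception as e:
            status = getattr(e, "status", None)
            retryable = status == 429 or (status is not None and status >= 500)
            if not retryable or attempt == attempts - 1:
                raise  # non-retryable error, or retries exhausted
            time.sleep(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...
```

With the SDK installed you would call it as `with_retries(lambda: h.chat.completions.create(model="auto", messages=msgs))` inside the same `try`/`except HiwayError` shown above.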

## Links

- [Homepage](https://www.hiway2llm.com)
- [Docs](https://www.hiway2llm.com/docs)
- [CLI](https://www.npmjs.com/package/@hiway2llm/cli)
- [GitHub](https://github.com/hiway2llm/client)

## License

MIT — © Mytm-Group SAS
