Metadata-Version: 2.4
Name: ollama-models
Version: 0.1.0
Summary: Python client for the ollama-models API — search and list Ollama model weights
Project-URL: Repository, https://github.com/devcomfort/ollama-models
Author-email: devcomfort <im@devcomfort.me>
License: MIT
Keywords: llm,models,ollama,registry,search
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.8
Requires-Dist: httpx>=0.27.0
Description-Content-Type: text/markdown

# ollama-models (Python)

Python client for searching and listing models from the [Ollama](https://ollama.com) registry.

## Installation

```bash
pip install ollama-models
```

## Usage

```python
from ollama_models import OllamaModelsClient

# No base URL needed — defaults to the official hosted instance
client = OllamaModelsClient()

# Pass a base URL only if you self-host the API
# client = OllamaModelsClient("https://your-own-instance.workers.dev")

# Search models (results are paginated)
result = client.search("qwen3", page=1)
for page in result.pages:
    print(page.http_url)

# Get all tags for a model
model = client.get_model("qwen3")
print(model.default_model_id)   # qwen3:latest
for w in model.model_list:
    print(w.id)                 # qwen3:latest, qwen3:4b, ...

# Async usage
import asyncio

async def main():
    result = await client.search_async("qwen3")
    model  = await client.get_model_async("qwen3")

asyncio.run(main())
```
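
The ids in `model_list` follow Ollama's `name:tag` convention (e.g. `qwen3:4b`), with `latest` as the implicit tag. A small standalone helper for splitting such ids — this is a hypothetical utility for illustration, not part of the `ollama-models` API:

```python
from typing import Tuple


def split_model_id(model_id: str) -> Tuple[str, str]:
    """Split an Ollama model id like 'qwen3:4b' into (name, tag).

    Ids without an explicit tag (e.g. 'qwen3') default to 'latest',
    matching the registry's convention.
    """
    name, sep, tag = model_id.partition(":")
    return name, tag if sep else "latest"


# Group tags by model name, e.g. when post-processing model.model_list
print(split_model_id("qwen3:4b"))   # ('qwen3', '4b')
print(split_model_id("qwen3"))      # ('qwen3', 'latest')
```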
