Metadata-Version: 2.4
Name: ollama-rich
Version: 0.1.0
Summary: A feature-rich Ollama client with enhanced terminal UI using the Rich library.
Home-page: https://github.com/jakbin/ollama-rich
Author: Jak BIn
License: MIT
Project-URL: Source, https://github.com/jakbin/ollama-rich
Project-URL: Tracker, https://github.com/jakbin/ollama-rich/issues
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Requires-Python: >=3.7
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: rich
Requires-Dist: ollama
Dynamic: author
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: license
Dynamic: license-file
Dynamic: project-url
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

# ollama-rich

A feature-rich Ollama client with enhanced terminal UI using the Rich library.

## Features
- List available Ollama models in a beautiful table
- Chat with models directly from the terminal
- Stream responses live with markdown rendering
- Easy-to-use CLI interface
- More coming soon

## Requirements
- Python 3.7+
- [Ollama](https://github.com/jmorganca/ollama) server running
- [rich](https://github.com/Textualize/rich) Python library

## Clone the Repository

To get the source code, clone this repository:

```bash
git clone https://github.com/jakbin/ollama-rich.git
cd ollama-rich
```

## Installation
```bash
pip install .
```

## Usage
### List Models
```bash
ollama-rich models
```
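Under the hood, tables like this are straightforward to build with Rich. The sketch below is illustrative, not ollama-rich's actual code; the column names and model entries are made-up placeholders:

```python
# Minimal sketch: render a model list as a Rich table.
# Column names and model data here are hypothetical examples.
from rich.console import Console
from rich.table import Table

def render_models(models):
    """Build a Rich table from (name, size) pairs."""
    table = Table(title="Ollama Models")
    table.add_column("Name", style="cyan")
    table.add_column("Size")
    for name, size in models:
        table.add_row(name, size)
    return table

# record=True lets us capture the rendered output as plain text.
console = Console(record=True)
console.print(render_models([("llama3", "4.7 GB"), ("mistral", "4.1 GB")]))
text = console.export_text()
```

In the real client the rows would come from the Ollama server's model list rather than a hard-coded list.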

### Chat with a Model
```bash
ollama-rich chat <model> "Your message here"
```

### Stream Chat Response
```bash
ollama-rich chat <model> "Your message here" --stream
```
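The live markdown rendering that `--stream` provides can be approximated with Rich's `Live` display: accumulate tokens into a buffer and re-render it as markdown on each update. This sketch substitutes a fake token generator for a real streamed Ollama response, so it runs without a server:

```python
# Sketch of live-updating markdown output, assuming a fake token
# stream in place of a real streamed Ollama chat response.
from rich.console import Console
from rich.live import Live
from rich.markdown import Markdown

def fake_stream():
    # Stands in for chunks arriving from a streaming chat API.
    for chunk in ["# Hello", "\n", "This is ", "**streamed** markdown."]:
        yield chunk

console = Console(record=True)
buffer = ""
with Live(console=console, refresh_per_second=8) as live:
    for token in fake_stream():
        buffer += token
        # Re-render the accumulated text as markdown on each chunk.
        live.update(Markdown(buffer))
text = console.export_text()
```

With a real server, the loop body would instead consume chunks from a streaming chat call and append each chunk's text to the buffer.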


