Metadata-Version: 2.3
Name: sai-chat
Version: 0.1.2
Summary: Simple AI interface to chat with your LLM models from the terminal
Requires-Dist: httpx>=0.28.1
Requires-Dist: inquirer>=3.4.1
Requires-Dist: peewee>=3.19.0
Requires-Dist: python-dotenv>=1.2.1
Requires-Dist: rich>=14.3.2
Requires-Python: >=3.12
Project-URL: Homepage, https://github.com/luisgdev/sai
Project-URL: Repository, https://github.com/luisgdev/sai
Project-URL: Issues, https://github.com/luisgdev/sai/issues
Project-URL: Documentation, https://github.com/luisgdev/sai#readme
Description-Content-Type: text/markdown

# sai
Simple AI interface to chat with your Ollama models from the terminal

# Features

- [x] Pretty-print real-time responses in Markdown, using the `rich` library.
- [x] Keep conversation context.
- [x] Autodetect available models, with the option to select one.
- [ ] Add support for custom prompts.
- [ ] Add conversation persistence (sessions).

# Requirements
An Ollama instance is required to access local models.
By default, the URL is set to `http://localhost:11434`.
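Before launching, you can verify that the Ollama instance is reachable at that default URL. The check below assumes a stock local install; a running instance answers the root endpoint with a short status message:

```shell
# A running Ollama instance replies "Ollama is running" on its root endpoint.
curl http://localhost:11434
```

If the request fails, start Ollama first (e.g. `ollama serve`) or point your setup at the host where it runs.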

# Install
The project is published on PyPI: https://pypi.org/project/sai-chat/

You can install it with any package manager you prefer, such as `pip`,
but the recommended way is `uv tool`.

## Recommended
Using `uv tool`:

```shell
uv tool install sai-chat
```
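Since the package is on PyPI, a plain `pip` install also works (ideally inside a virtual environment), and `uv`'s standard tool interface can upgrade it later:

```shell
# Alternative: install with pip (use a virtual environment if possible)
pip install sai-chat

# If installed via uv, upgrade to the latest published release with:
uv tool upgrade sai-chat
```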

# Usage

Start using it in your terminal just by running the `sai` command:
```shell
luis@laptop:~ $ sai
╭───────────────────────────────────────────────────────╮
│ Welcome to Sai. Chat with your local LLM models.      │
│                                                       │
│ Available commands:                                   │
│                                                       │
│  • /help : Show this help message                     │
│  • /quit : Exit the application                       │
│  • /model : Select a model                            │
╰───────────────────────────────────────────────────────╯
> Hey                               
╭────────────────────────────────────── LLM Response ✔ ─╮
│ What's up? How can I help you today?                  │
╰───────────────────── llama3.2:1b ─────────────────────╯
> 

```

# Status

This project is under development. Feel free to contribute or provide feedback!