Metadata-Version: 2.4
Name: llm-to-cli
Version: 0.2.0
Summary: Simple LLM-based tools accessible from the CLI
License-File: LICENSE
Author: tikendraw
Author-email: tikendraksahu1029@gmail.com
Requires-Python: >=3.11,<4.0
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Provides-Extra: agents
Requires-Dist: click (>=8.1.8,<9.0.0)
Requires-Dist: litellm (>=1.59.9,<2.0.0)
Requires-Dist: rich (>=13.9.4,<14.0.0)
Requires-Dist: smolagents (>=0.1.0) ; extra == "agents"
Description-Content-Type: text/markdown

# LLM-cli
A lightweight Command Line Interface (CLI) for interacting with Large Language Models (LLMs) using LiteLLM.

See [IMPLEMENTED_FEATURES.md](./IMPLEMENTED_FEATURES.md) for the current feature guide based on the code that is actually implemented in this repository.


## 💡 Why This Project?
Sometimes network constraints or data limitations make it difficult to access large language models via web interfaces. This CLI provides a lightweight, flexible solution for LLM interactions directly from the terminal.


## 🚀 Features

- **Simple CLI Interface**: Easily chat with different LLMs from your terminal
- **Flexible Input**: Pipe text in or redirect files as input
- **Multiple Chat Modes**:
  - Direct single-message chat
  - Interactive chat UI with markdown rendering
  - Image support for vision-capable models
- **Flexible Configuration**: Customize model, temperature, and system prompts
- **Easy Configuration Management**: Update settings with a simple command
- **Sessions**: Chat sessions are logged, and saved chats can be resumed later

## 🔧 Prerequisites

- API keys for the LLM providers you want to use, set as environment variables
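
Since the CLI talks to providers through LiteLLM, keys are read from each provider's standard environment variable. For example (variable names below are the common LiteLLM conventions; the key values are placeholders):

```bash
# Set the key for whichever provider(s) you plan to use
export OPENAI_API_KEY="sk-..."          # OpenAI models
export ANTHROPIC_API_KEY="sk-ant-..."   # Anthropic models
export GEMINI_API_KEY="..."             # Google Gemini models
```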

## 💾 Installation

1. Via Pip
```bash
pip install llm-to-cli
```
Or

2. From source
```bash
# Clone the repository
git clone https://github.com/tikendraw/llm-cli.git
cd llm-cli

# Install
pip install .
```

## 🖥️ Usage

### Basic Chat

* Send a single message to an LLM:

  ```bash
  llm-cli chat "Hello, how are you?"
  ```
* Pipe input
  ```bash
  echo "what is 34th prime number" | llm-cli chat
  ```
* File redirection
  ```bash
  llm-cli chat < some_file_with_question.txt
  ```
* Include the last terminal command/output blocks from the current tmux pane
  ```bash
  llm-cli chat --pane-history 1 "Why did this command fail?"
  ```
* Target a different tmux pane explicitly
  ```bash
  llm-cli chat --pane-history 3 --pane-target %12 "Summarize what just happened"
  ```

### Interactive Chat UI

* Start an interactive chat session:

  ```bash
  llm-cli chatui
  ```
* Start with recent tmux pane history as context:
  ```bash
  llm-cli chatui --pane-history 2
  ```
* During chat, add pane history on demand:
  ```text
  /pane 3
  /pane 2 %12
  ```

### Image Support

* Include an image (local path or URL) for vision-capable models
  ```bash
  llm-cli chat --image path/to/image/or/url
  ```

### Configuration

* View current configuration:
  ```bash
  llm-cli config
  ```

* Update configuration:
  ```bash
  llm-cli config model "anthropic/claude-3-haiku"
  llm-cli config temperature 0.7
  ```
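
Because the CLI is built on LiteLLM, model names follow LiteLLM's `provider/model` convention. A few illustrative examples (the exact model identifiers available depend on your providers and keys):

```bash
llm-cli config model "openai/gpt-4o-mini"
llm-cli config model "gemini/gemini-1.5-flash"
llm-cli config model "ollama/llama3"   # local models served by Ollama
```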

## 🛠️ Commands

- `chat`: Send a single message
- `chatui`: Start an interactive chat session
- `config`: View and update CLI configuration
- `history`: View and manage saved chat sessions
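
Since the CLI is built with Click, each command should also respond to the standard `--help` flag with its full list of options, e.g.:

```bash
llm-cli --help
llm-cli chat --help
```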

## 🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

