Metadata-Version: 2.4
Name: local-brain
Version: 0.3.5
Summary: Chat with local Ollama models that can explore your codebase
Project-URL: Homepage, https://github.com/IsmaelMartinez/local-brain
Project-URL: Repository, https://github.com/IsmaelMartinez/local-brain
Project-URL: Documentation, https://github.com/IsmaelMartinez/local-brain#readme
Author-email: Ismael Martinez <ismaelmartinez@gmail.com>
License: MIT
License-File: LICENSE
Keywords: ai,cli,llm,ollama,tool-calling
Classifier: Development Status :: 4 - Beta
Classifier: Environment :: Console
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Classifier: Topic :: Software Development :: Quality Assurance
Requires-Python: >=3.10
Requires-Dist: click>=8.0.0
Requires-Dist: ollama>=0.6.1
Provides-Extra: dev
Requires-Dist: mypy>=1.0.0; extra == 'dev'
Requires-Dist: pytest-cov>=4.0.0; extra == 'dev'
Requires-Dist: pytest>=7.0.0; extra == 'dev'
Requires-Dist: ruff>=0.1.0; extra == 'dev'
Description-Content-Type: text/markdown

# Local Brain

Chat with local Ollama models that can explore your codebase.

```bash
local-brain "What files changed recently?"
local-brain "Review the code in src/"
local-brain "Generate a commit message"
local-brain "Explain how auth works"
```

## Install

```bash
uv pip install local-brain
```

Or with pipx:
```bash
pipx install local-brain
```

**Requires:** [Ollama](https://ollama.ai) with at least one model pulled:
```bash
ollama pull qwen3
```

## Usage

```bash
local-brain "prompt"                    # Ask anything (auto-selects best model)
local-brain -v "prompt"                 # Verbose (show tool calls)
local-brain -m qwen2.5-coder:7b "prompt"  # Specific model
local-brain --list-models               # Show available models
local-brain --root /path/to/project "prompt"  # Set project root
```

The model is given read-only tools to explore your codebase: it reads files, checks git state, and lists directories on its own.
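Under the hood this follows the standard Ollama tool-calling pattern: the model emits a tool call, the CLI runs the matching local function, and the result is fed back into the conversation. A minimal sketch of the dispatch half of that loop (not the package's actual code; the `read_file` tool and dispatch table here are illustrative):

```python
from pathlib import Path

# Illustrative tool implementation; the real package ships its own set.
def read_file(path: str) -> str:
    """Return the contents of a file."""
    return Path(path).read_text()

# Map tool names (as the model emits them) to local functions.
TOOLS = {"read_file": read_file}

def dispatch(tool_call: dict) -> str:
    """Execute one tool call of the form {'name': ..., 'arguments': {...}}."""
    fn = TOOLS[tool_call["name"]]
    return fn(**tool_call["arguments"])

# In the real loop, tool_call comes from the model's response; the returned
# string is appended to the messages and the model is called again.
```

The dispatch step is deliberately dumb: the model only ever names a tool and its arguments, and anything outside `TOOLS` is simply not callable.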

## Model Discovery

Local Brain automatically detects installed Ollama models and picks the best one:

```bash
local-brain --list-models
```

**Recommended models:**
- `qwen3:latest` — General purpose (default)
- `qwen2.5-coder:7b` — Code-focused
- `llama3.2:3b` — Fast, lightweight
- `mistral:7b` — Balanced

## Examples

```bash
# Explore
local-brain "What's in this repo?"
local-brain "How does the auth system work?"

# Review
local-brain "Review the git changes"
local-brain "Review src/main.py for issues"

# Git
local-brain "Generate a commit message for staged changes"
local-brain "Summarize recent commits"

# Explain
local-brain "Explain how agent.py works"
```

## Security

All operations are **restricted to the project root** (path jailing):

- ✅ Read files within project directory
- ✅ Run safe, read-only shell commands
- ❌ Access files outside project root
- ❌ Read sensitive files (`.env`, keys)
- ❌ Network access
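Path jailing of this kind is typically a resolve-then-compare check: normalize the requested path (collapsing symlinks and `..`) and refuse anything that lands outside the root. A minimal sketch of that check, not the package's actual implementation:

```python
from pathlib import Path

def jailed_path(root: str, requested: str) -> Path:
    """Resolve `requested` against `root`, refusing anything that escapes it."""
    root_p = Path(root).resolve()
    # Joining with an absolute `requested` discards root_p, so the
    # containment check below still catches e.g. '/etc/passwd'.
    target = (root_p / requested).resolve()
    # resolve() collapses symlinks and '..' segments before we compare.
    if not target.is_relative_to(root_p):
        raise PermissionError(f"{requested!r} is outside the project root")
    return target
```

Resolving *before* the containment check is the important part: a naive string-prefix test on the raw path is defeated by `../` segments and symlinks.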

## Tools

The model has access to the following read-only tools:

| Tool | What it does |
|------|--------------|
| `read_file` | Read file contents |
| `list_directory` | List files (glob patterns) |
| `file_info` | Get file metadata |
| `git_diff` | See code changes |
| `git_status` | Check repo status |
| `run_command` | Run safe shell commands |
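Keeping `run_command` "safe" usually comes down to an allowlist over the command's program name, executed without a shell. A sketch under an assumed allowlist (the package's actual list and filtering may differ):

```python
import shlex
import subprocess

# Assumed read-only allowlist for illustration; the real tool's list may
# differ, and commands like `git` would need subcommand-level filtering
# too (`git push` is not read-only).
SAFE_COMMANDS = {"ls", "cat", "grep", "wc", "head", "tail"}

def run_command(command: str, cwd: str = ".") -> str:
    """Run a shell command only if its program is on the allowlist."""
    argv = shlex.split(command)
    if not argv or argv[0] not in SAFE_COMMANDS:
        raise PermissionError(f"command not allowed: {command!r}")
    # shell=False plus a pre-parsed argv avoids shell injection via the
    # model-supplied command string.
    result = subprocess.run(
        argv, cwd=cwd, capture_output=True, text=True, timeout=30
    )
    return result.stdout
```

Parsing with `shlex.split` and passing the argv list directly means constructs like `ls; rm -rf /` become literal arguments rather than a second command.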

## Development

```bash
git clone https://github.com/IsmaelMartinez/local-brain.git
cd local-brain
uv sync
uv run local-brain "Hello!"
```

## License

MIT
