Metadata-Version: 2.4
Name: context-pilot-mcp
Version: 0.1.0
Summary: Universal context manager for AI coding agents
Author-email: Samito <sanyam.ahuja.alwar@gmail.com>
License: MIT
Keywords: ai,claude,context,llm,mcp
Requires-Python: >=3.11
Requires-Dist: mcp>=1.0.0
Requires-Dist: rich>=13.0.0
Requires-Dist: typer>=0.9.0
Description-Content-Type: text/markdown

# context-pilot

Universal context manager for AI coding agents.

It tracks file relevance, compresses stale files, and evicts dead weight to keep your agent's context window optimized. Works with aider, Claude Code, Cursor, or anything that reads files.

## Features
- Per-file event tracking and state management
- Staleness scoring engine
- AST-based file compression
- Eviction decision engine
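
The scoring formula itself isn't documented here; as a rough sketch (the names and formula below are hypothetical, not the package's actual code), a staleness score could combine turns since a file was last referenced with an edit-stickiness grace period:

```python
from dataclasses import dataclass

@dataclass
class FileState:
    """Hypothetical per-file state tracked across agent turns."""
    path: str
    turns_since_ref: int = 0     # turns since the file was last read or mentioned
    turns_since_edit: int = 999  # turns since the file was last edited

def staleness(state: FileState, stale_after_turns: int = 5,
              edit_stickiness: int = 3) -> float:
    """Score in [0, 1]: 0.0 = fresh, 1.0 = fully stale.

    Recently edited files are pinned at 0.0 for `edit_stickiness` turns.
    """
    if state.turns_since_edit < edit_stickiness:
        return 0.0
    return min(1.0, state.turns_since_ref / stale_after_turns)

# A file unreferenced for 3 of 5 turns is 60% stale
print(staleness(FileState("src/api.py", turns_since_ref=3)))  # 0.6
```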

## Installation
```bash
# editable install from a source checkout
pip install -e .
```

## Usage

### CLI Adapter
The `context-pilot` CLI works from any terminal:
```bash
# Track files and events
context-pilot track src/api.py --event added
context-pilot advance

# Evaluate when context gets full
context-pilot evaluate-context
```

### MCP Server (Claude Code / Cursor)
Run the MCP server so AI agents can automatically read, compress, and evict files. Add this to your client's MCP configuration (for example, Claude Code's project-level `.mcp.json`):
```json
{
  "mcpServers": {
    "context-pilot": {
      "command": "context-pilot-mcp"
    }
  }
}
```

## Configuration
You can configure behavior by placing a `.context-pilot.toml` file in your project root:

```toml
[context-pilot]
mode = "suggest"            # "suggest", "auto", or "off"
pressure_threshold = 0.8    # Trigger eviction when window is 80% full
stale_after_turns = 5       # Turns without a reference before a file counts as stale
edit_stickiness = 3         # Edited files stay in context for N extra turns
compress_before_drop = true # Try AST compression before full eviction
max_context_tokens = 128000 # Your model's context window size
min_staleness_to_flag = 0.6 # Only flag files that are at least 60% stale
```
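
To illustrate how these settings could interact (a hypothetical sketch, not the package's actual decision engine): once token usage crosses `pressure_threshold`, files whose staleness meets `min_staleness_to_flag` are flagged, stalest first, for compression or eviction:

```python
def evaluate_context(files: dict[str, float], used_tokens: int,
                     max_context_tokens: int = 128_000,
                     pressure_threshold: float = 0.8,
                     min_staleness_to_flag: float = 0.6,
                     compress_before_drop: bool = True) -> list[tuple[str, str]]:
    """Return (path, action) suggestions once the window is under pressure.

    `files` maps path -> staleness score in [0, 1]. Names are illustrative.
    """
    if used_tokens / max_context_tokens < pressure_threshold:
        return []  # no pressure: leave the context alone
    action = "compress" if compress_before_drop else "evict"
    flagged = [(p, s) for p, s in files.items() if s >= min_staleness_to_flag]
    # Stalest files are flagged first
    return [(p, action) for p, s in sorted(flagged, key=lambda x: -x[1])]

suggestions = evaluate_context(
    {"src/api.py": 0.9, "src/db.py": 0.4, "README.md": 0.7},
    used_tokens=110_000,
)
# [('src/api.py', 'compress'), ('README.md', 'compress')]
```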
