Metadata-Version: 2.4
Name: ferret-mcp
Version: 0.1.0
Summary: MCP server that extracts complete knowledge from any codebase — architecture, patterns, dependencies, API surface
Project-URL: Homepage, https://github.com/fabdendev/ferret-mcp
Project-URL: Repository, https://github.com/fabdendev/ferret-mcp
Project-URL: Issues, https://github.com/fabdendev/ferret-mcp/issues
License: MIT
License-File: LICENSE
Keywords: architecture,code-analysis,knowledge-extraction,mcp,reverse-engineering
Requires-Python: >=3.12
Requires-Dist: anthropic>=0.42
Requires-Dist: fastmcp>=2.0
Requires-Dist: httpx>=0.27
Requires-Dist: openai>=1.0
Requires-Dist: pathspec>=0.12.0
Provides-Extra: dev
Requires-Dist: pytest>=9.0; extra == 'dev'
Requires-Dist: ruff>=0.14; extra == 'dev'
Description-Content-Type: text/markdown

# Ferret MCP

[![PyPI version](https://img.shields.io/pypi/v/ferret-mcp)](https://pypi.org/project/ferret-mcp/)
[![Downloads](https://static.pepy.tech/badge/ferret-mcp/month)](https://pepy.tech/project/ferret-mcp)
[![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://opensource.org/licenses/MIT)
[![Python 3.12+](https://img.shields.io/badge/python-3.12+-blue.svg)](https://www.python.org/downloads/)
[![Tests](https://github.com/fabdendev/ferret-mcp/actions/workflows/tests.yml/badge.svg)](https://github.com/fabdendev/ferret-mcp/actions/workflows/tests.yml)

An [MCP](https://modelcontextprotocol.io/) server that extracts complete knowledge from any codebase — architecture, patterns, dependencies, API surface. Combines static analysis with AI-powered deep interpretation.

Works with any MCP client: [Claude Code](https://docs.anthropic.com/en/docs/claude-code), [Claude Desktop](https://claude.ai), [Cursor](https://cursor.sh), and more.

**Give it a repo, get a senior engineer's analysis in 30 seconds for $0.02.**

## Quickstart

### Install & run with uvx (no clone needed)

```bash
uvx ferret-mcp
```

### Or install with pip

```bash
pip install ferret-mcp
```

## MCP Client Setup

### Claude Code

```bash
claude mcp add ferret -- uvx ferret-mcp
```

To enable AI-powered tools (`deep`, `ask`), set your API key:

```bash
claude mcp add ferret -e FERRET_LLM_API_KEY=sk-ant-... -- uvx ferret-mcp
```

### Claude Desktop

Add to `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "ferret": {
      "command": "uvx",
      "args": ["ferret-mcp"],
      "env": {
        "FERRET_LLM_API_KEY": "sk-ant-..."
      }
    }
  }
}
```

### Cursor / Windsurf / any MCP client

```json
{
  "mcpServers": {
    "ferret": {
      "command": "uvx",
      "args": ["ferret-mcp"],
      "env": {
        "FERRET_LLM_API_KEY": "sk-ant-..."
      }
    }
  }
}
```

### Local development

```bash
git clone https://github.com/fabdendev/ferret-mcp.git
cd ferret-mcp
cp .env.example .env   # then add your API key to .env
uv sync
uv run ferret-mcp
```

## Tools

### Static Analysis (free, no LLM required)

| Tool | Description |
|------|-------------|
| `scan` | Repository overview — languages, structure, entry points, config files |
| `dependencies` | External packages + internal import graph with core modules |
| `architecture` | Layers, architectural patterns, module breakdown |
| `patterns` | Design patterns, naming conventions, testing, error handling |
| `api_surface` | REST endpoints, MCP tools, CLI commands, GraphQL, gRPC, exports |
| `full_extraction` | All of the above in one comprehensive report |

### AI-Powered (~$0.02/report with Haiku)

| Tool | Description |
|------|-------------|
| `deep` | Comprehensive Knowledge Extraction Report — 10-section expert analysis covering architecture, data flow, strengths, risks, and learning takeaways |
| `ask` | Ask any question about a repo, answered with full codebase context |

All tools take a `path` argument — the absolute path to the repository root directory.
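
Under the hood, an MCP client invokes a tool with a JSON-RPC `tools/call` request. A `scan` invocation looks roughly like this (the repository path is illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "scan",
    "arguments": { "path": "/home/me/projects/my-repo" }
  }
}
```

You normally never write this by hand; the client sends it for you when you ask it to analyze a repo.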

## Configuration

AI-powered tools (`deep`, `ask`) require an LLM. Configure via environment variables:

| Env Var | Default | Description |
|---------|---------|-------------|
| `FERRET_LLM_PROVIDER` | `anthropic` | `anthropic` or `openai` (the latter also covers OpenAI-compatible servers such as Ollama, vLLM, LM Studio) |
| `FERRET_LLM_MODEL` | `claude-haiku-4-5-20251001` | Model name |
| `FERRET_LLM_API_KEY` | — | API key (required for Anthropic; local OpenAI-compatible servers accept a placeholder such as `ollama`) |
| `FERRET_LLM_BASE_URL` | `http://localhost:11434/v1` | Base URL for OpenAI-compatible providers |
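
For local development (the `cp .env.example .env` step above), the same variables can live in `.env`. A minimal Anthropic setup, using the defaults from the table (replace the key with your own):

```bash
FERRET_LLM_PROVIDER=anthropic
FERRET_LLM_MODEL=claude-haiku-4-5-20251001
FERRET_LLM_API_KEY=sk-ant-...
```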

### Use with a local LLM (Ollama)

```bash
claude mcp add ferret \
  -e FERRET_LLM_PROVIDER=openai \
  -e FERRET_LLM_BASE_URL=http://localhost:11434/v1 \
  -e FERRET_LLM_MODEL=qwen3:8b \
  -- uvx ferret-mcp
```
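
For clients configured via JSON, the same local setup mirrors the Claude Desktop block above (a sketch assuming env vars pass through identically in any MCP client config):

```json
{
  "mcpServers": {
    "ferret": {
      "command": "uvx",
      "args": ["ferret-mcp"],
      "env": {
        "FERRET_LLM_PROVIDER": "openai",
        "FERRET_LLM_BASE_URL": "http://localhost:11434/v1",
        "FERRET_LLM_MODEL": "qwen3:8b"
      }
    }
  }
}
```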

## Example Output

The `deep` tool produces a ~1000-line Knowledge Extraction Report covering:

1. **Executive Summary** — what it is, what stage, honest assessment
2. **Architecture Deep Dive** — patterns, modules, dependency direction, God Objects
3. **Technology Stack & Rationale** — why each choice was made
4. **Data & Control Flow** — ASCII diagrams, execution model
5. **Design Patterns & Conventions** — with file references
6. **API & Interface Contracts** — REST, CLI, MCP, auth model
7. **Key Files Reading Guide** — ordered reading path for new contributors
8. **Strengths** — what's genuinely well-designed
9. **Risks & Technical Debt** — brutal, specific, with fixes
10. **Learning Takeaways** — what to steal, what to avoid

## Limitations

- `.gitignore` parsing only reads the root-level file (nested `.gitignore` files are not honored)
- Maximum 15,000 files scanned per repository
- File content analysis limited to files under 512 KB
- AI analysis quality depends on the LLM model used (Haiku is fast/cheap, Sonnet/Opus for deeper analysis)
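
The scanning limits above can be sketched as a simple filtered walk. This is illustrative only: the function and constant names are not the package's internals, and `fnmatch` only approximates real `.gitignore` semantics (the project depends on `pathspec` for that).

```python
import fnmatch
from pathlib import Path

# Caps taken from the Limitations list above.
MAX_FILES = 15_000           # files scanned per repository
MAX_FILE_BYTES = 512 * 1024  # per-file content analysis cap

def iter_scannable_files(root: str):
    """Yield files under `root`, honoring only the root-level .gitignore
    and skipping files over the size cap."""
    root_path = Path(root)
    gitignore = root_path / ".gitignore"
    patterns = []
    if gitignore.exists():
        patterns = [
            line.strip()
            for line in gitignore.read_text().splitlines()
            if line.strip() and not line.startswith("#")
        ]
    yielded = 0
    for path in sorted(root_path.rglob("*")):
        if not path.is_file():
            continue
        rel = path.relative_to(root_path).as_posix()
        # Approximate gitignore matching against both the relative
        # path and the bare filename.
        if any(fnmatch.fnmatch(rel, p) or fnmatch.fnmatch(path.name, p)
               for p in patterns):
            continue
        if path.stat().st_size > MAX_FILE_BYTES:
            continue
        yield path
        yielded += 1
        if yielded >= MAX_FILES:
            return
```

Because only the root `.gitignore` is read, patterns in nested `.gitignore` files have no effect, which is exactly the limitation noted above.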

## License

MIT
