Metadata-Version: 2.4
Name: contextual-engine
Version: 0.1.0
Summary: Temporal-first local code memory for AI tools via MCP
Project-URL: Homepage, https://github.com/contextual-ai/contextual
Project-URL: Documentation, https://github.com/contextual-ai/contextual/tree/main/docs
Project-URL: Repository, https://github.com/contextual-ai/contextual
Project-URL: Issues, https://github.com/contextual-ai/contextual/issues
Author: Contextual Contributors
License: BUSINESS SOURCE LICENSE 1.1 (MODIFIED — PRODUCTION-GRADE TEMPLATE)
        
        ---
        
        PARAMETERS
        
        Licensor:             [AARJUN MAHULE]
        
        Licensed Work:        [CONTEXTUAL V0.1.0]
        The Licensed Work is (c) [2026] [AARJUN MAHULE]
        
        Additional Use Grant:
        Production Use is permitted solely for:
        
        * Individuals acting for personal, non-commercial purposes
        * Non-commercial projects
        * Organizations with fewer than [5] employees AND annual revenue below [500,000INR]
        
        All other Production Use requires a separate commercial license from the Licensor.
        
        Change Date:          [2030-06-01] (typically four years from first public release)
        
        Change License:       Apache License, Version 2.0
        
        Governing Law:        This License shall be governed by and construed in accordance
        with the laws of the Republic of India.
        
        Contact:              [aarjun.mahule23@gmail.com]
        
        ---
        
        TERMS
        
        1. Grant of Rights
        
        The Licensor hereby grants you the right to copy, modify, create derivative
        works, and redistribute the Licensed Work, in whole or in part, for non-production use.
        
        2. Definition of Production Use
        
        “Production Use” means any use of the Licensed Work in a live, operational,
        or business environment, including but not limited to:
        
        * Use in software-as-a-service (SaaS), platform-as-a-service (PaaS),
          or hosted environments
        * Internal business operations, tools, or workflows
        * Revenue-generating systems or services
        * Customer-facing applications or services
        * Any environment serving real users, customers, or organizational functions
        
        3. Restriction on Production Use
        
        You may not use the Licensed Work for Production Use except as expressly
        permitted under the Additional Use Grant or through a separate commercial
        license agreement with the Licensor.
        
        If your use exceeds the scope of the Additional Use Grant, you must obtain
        a commercial license from the Licensor or cease use immediately.
        
        4. Change License
        
        Effective on the Change Date, or the fourth anniversary of the first publicly
        available distribution of a specific version of the Licensed Work under this
        License, whichever comes first, the Licensor grants you rights under the
        terms of the Change License. On such date, the rights granted under this
        License terminate and are replaced by the Change License.
        
        5. Version Scope
        
        This License applies separately to each version of the Licensed Work.
        Each version may have its own Change Date as specified by the Licensor.
        
        6. Redistribution and Attribution
        
        You must conspicuously display this License and retain all copyright,
        license notices, and attribution in any original or modified copies
        of the Licensed Work.
        
        7. Termination
        
        Any use of the Licensed Work in violation of this License will automatically
        terminate your rights under this License for the current and all other
        versions of the Licensed Work.
        
        8. Trademarks
        
        This License does not grant you any rights to use the Licensor’s trademarks,
        service marks, or logos, except as required for reasonable and customary use
        in describing the Licensed Work.
        
        9. Disclaimer of Warranty
        
        TO THE EXTENT PERMITTED BY APPLICABLE LAW, THE LICENSED WORK IS PROVIDED
        ON AN "AS IS" BASIS. LICENSOR HEREBY DISCLAIMS ALL WARRANTIES AND CONDITIONS,
        EXPRESS OR IMPLIED, INCLUDING (WITHOUT LIMITATION) WARRANTIES OF
        MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, NON-INFRINGEMENT, AND TITLE.
        
        10. Limitation of Liability
        
        TO THE MAXIMUM EXTENT PERMITTED BY APPLICABLE LAW, IN NO EVENT SHALL THE
        LICENSOR BE LIABLE FOR ANY CLAIM, DAMAGES, OR OTHER LIABILITY, WHETHER IN
        AN ACTION OF CONTRACT, TORT, OR OTHERWISE, ARISING FROM, OUT OF, OR IN
        CONNECTION WITH THE LICENSED WORK OR THE USE OR OTHER DEALINGS IN THE
        LICENSED WORK.
        
        11. Governing Law
        
        This License shall be governed by and construed in accordance with the laws
        of the Republic of India, without regard to conflict of law principles.
        
        ---
License-File: LICENSE
Keywords: ai,code-search,developer-tools,embeddings,local-first,mcp,temporal
Classifier: Development Status :: 3 - Alpha
Classifier: Environment :: Console
Classifier: Intended Audience :: Developers
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Software Development :: Libraries
Classifier: Topic :: Text Processing :: Indexing
Classifier: Typing :: Typed
Requires-Python: >=3.12
Requires-Dist: fastembed>=0.5.1
Requires-Dist: fastmcp<4,>=3.2
Requires-Dist: lancedb<0.31,>=0.26
Requires-Dist: numpy>=1.26
Requires-Dist: pathspec>=0.10
Requires-Dist: platformdirs>=3.3
Requires-Dist: pyarrow>=14.0
Requires-Dist: pydantic>=2.0
Requires-Dist: pygit2>=1.15
Requires-Dist: sqlite-vec>=0.1.1
Requires-Dist: structlog>=25.4
Requires-Dist: tantivy>=0.20.1
Requires-Dist: tiktoken>=0.5
Requires-Dist: tree-sitter-language-pack>=0.7.2
Requires-Dist: typer<1.0,>=0.9
Requires-Dist: watchdog>=4.0
Provides-Extra: benchmark
Requires-Dist: coir-eval; extra == 'benchmark'
Requires-Dist: ranx>=0.3; extra == 'benchmark'
Provides-Extra: dev
Requires-Dist: docutils>=0.21; extra == 'dev'
Requires-Dist: markdown-it-py>=4.0; extra == 'dev'
Requires-Dist: mdit-py-plugins; extra == 'dev'
Requires-Dist: mypy>=1.10; extra == 'dev'
Requires-Dist: pytest-asyncio>=1.0; extra == 'dev'
Requires-Dist: pytest-cov>=4.0; extra == 'dev'
Requires-Dist: pytest-xdist>=3.0; extra == 'dev'
Requires-Dist: pytest>=7.0; extra == 'dev'
Requires-Dist: python-frontmatter; extra == 'dev'
Requires-Dist: ranx>=0.3; extra == 'dev'
Requires-Dist: rapidfuzz>=3; extra == 'dev'
Requires-Dist: ruff>=0.4; extra == 'dev'
Description-Content-Type: text/markdown

# Contextual

**Temporal-first local code memory for AI tools via MCP.**

Contextual is a local-first, bi-temporal, code-aware semantic context engine that gives AI coding assistants persistent, accurate memory of your codebase — across tools, across time, without cloud dependencies.

## The Problem

Every AI coding tool today suffers from the same fundamental flaw: **context amnesia**. Claude Code forgets what you discussed yesterday. Cursor re-indexes from scratch. Copilot has no memory of architectural decisions. Your AI assistant hallucinates function signatures that changed three commits ago because it has no concept of *when* things were true.

The result: wasted tokens, hallucinated code, repeated explanations, and lost architectural intent.

## The Solution

Contextual maintains a **bi-temporal knowledge graph** of your codebase — tracking not just *what* is true, but *when* it became true and *when you learned* it was true. It exposes this knowledge through the **Model Context Protocol (MCP)**, making it available to every AI tool in your stack simultaneously.

```
Developer writes code → Contextual indexes changes (tree-sitter + git blame)
                       → Builds temporal knowledge graph (SQLite bi-temporal)
                       → Embeds semantically (jina-v2-base-code + BM25)
                       → Serves context via MCP to any AI tool
```
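The "SQLite bi-temporal" step above hinges on every fact carrying two timelines: when it was true in the code, and when the index knew it. A minimal Python sketch of that model and an as-of query (field and class names here are illustrative, not Contextual's actual schema):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Fact:
    entity: str
    statement: str
    valid_at: datetime              # when this became true in the codebase
    invalid_at: Optional[datetime]  # when it stopped being true (None = still true)
    created_at: datetime            # when the index learned it
    expired_at: Optional[datetime]  # when the index superseded it (None = current)

def as_of(facts: list[Fact], entity: str, t: datetime) -> list[Fact]:
    """Return what was believed true about `entity` at time `t`:
    the fact must have been valid in the code AND known to the index."""
    return [
        f for f in facts
        if f.entity == entity
        and f.valid_at <= t and (f.invalid_at is None or t < f.invalid_at)
        and f.created_at <= t and (f.expired_at is None or t < f.expired_at)
    ]
```

A `recall --as-of` query reduces to this double filter: one predicate per timeline.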

## Key Features

- **Temporal-First Memory** — Every fact carries four timestamps (valid_at, invalid_at, created_at, expired_at). Ask "what was the API signature last Tuesday?" and get the right answer.
- **Tool-Agnostic via MCP** — Works with Claude Desktop, Claude Code, Cursor, VS Code Copilot, Gemini CLI, Windsurf, Zed, and any MCP-compatible client. One index, every tool.
- **Local-First, Private by Design** — All data stays on your machine. No cloud, no telemetry, no API keys required. Runs on 8GB RAM laptops.
- **Hybrid Code Search** — Combines dense semantic embeddings (jina-v2-base-code) with code-aware BM25 (camelCase/snake_case splitting) via Reciprocal Rank Fusion, reranked by a cross-encoder.
- **Git-Native Indexing** — Incremental indexing via post-commit hooks. Blame-based temporal attribution. Force-push and rebase aware.
- **7 Languages at Launch** — Python, TypeScript, JavaScript, Go, Java, Rust, C# plus config formats (JSON, YAML, TOML, Dockerfile, Markdown).
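Reciprocal Rank Fusion, named in the hybrid-search feature above, merges ranked result lists using only ranks, not raw scores. A minimal sketch (k=60 is the conventional default constant; the document IDs are made up, and this is not Contextual's implementation):

```python
from collections import defaultdict

def rrf_fuse(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Fuse ranked lists: score(doc) = sum over lists of 1 / (k + rank)."""
    scores: dict[str, float] = defaultdict(float)
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] += 1.0 / (k + rank)
    return sorted(scores, key=scores.__getitem__, reverse=True)

# Hypothetical dense and BM25 result lists for one query:
dense = ["auth.py:login", "mw.py:check", "db.py:get_user"]
bm25  = ["auth.py:login", "db.py:get_user", "jwt.py:sign"]
fused = rrf_fuse([dense, bm25])
# "auth.py:login" ranks first: it tops both lists.
```

Because RRF ignores score magnitudes, it needs no calibration between the dense and BM25 scorers; the cross-encoder rerank then runs over the fused list.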

## Quick Start

```bash
# Install (provides the `contextual` command)
uv tool install contextual-engine

# Index your codebase
contextual index .

# Search
contextual search "authentication middleware"

# Temporal recall
contextual recall "UserService" --as-of "2026-05-01"

# Start MCP server for your IDE
contextual serve --stdio
```

## IDE Integration

Run `contextual setup <ide>` to auto-configure any supported IDE, or configure manually:

### Claude Desktop / Cursor / Windsurf
```json
{
  "mcpServers": {
    "contextual": {
      "command": "uvx",
      "args": ["contextual", "serve", "--stdio"]
    }
  }
}
```

### VS Code Copilot
```json
{
  "mcp.servers": {
    "contextual": {
      "type": "stdio",
      "command": "uvx",
      "args": ["contextual", "serve", "--stdio"]
    }
  }
}
```

### Claude Code
```bash
claude mcp add contextual -- uvx contextual serve --stdio
```

### Gemini CLI
```json
{
  "mcpServers": {
    "contextual": {
      "command": "uvx",
      "args": ["contextual", "serve", "--stdio"]
    }
  }
}
```

## MCP Tools

| Tool | Description |
|------|-------------|
| `index` | Index a codebase at a given path |
| `search` | Hybrid semantic + keyword code search |
| `recall` | Temporal recall — what did we know about X at time T? |
| `capture_decision` | Record an architectural decision |
| `freshness` | Check how stale the current index is |
| `status` | System health and statistics |
| `forget` | Invalidate a fact (never deletes — marks as expired) |
| `timeline` | Full temporal history of any entity |
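On the wire, each of these tools is invoked through MCP's standard `tools/call` JSON-RPC request. For example, a `search` call could look like this (the argument names are illustrative, not Contextual's exact input schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": { "query": "authentication middleware" }
  }
}
```

Your MCP client builds these requests for you; the table above is what the client sees when it lists available tools.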

## Requirements

- Python 3.12+
- macOS, Linux, or Windows
- 8GB RAM minimum (16GB recommended)
- Git repository (full history recommended)

## Documentation

- [Architecture](docs/ARCHITECTURE.md) — System design and data flow
- [Technical Specification](docs/TECHNICAL_SPEC.md) — Complete implementation reference
- [Roadmap](docs/ROADMAP.md) — 30-day build plan
- [ADRs](docs/adr/) — Architectural Decision Records
- [Delegation Playbook](docs/DELEGATION_PLAYBOOK.md) — AI team task guide

## License

Business Source License 1.1. Each version converts to the Apache License 2.0 on its Change Date, at most four years after that version's first public release.

## Acknowledgements

Built with FastMCP, LanceDB, tree-sitter, pygit2, fastembed, and tantivy-py. Temporal model inspired by Graphiti/Zep and Snodgrass's bi-temporal formalism.
