Metadata-Version: 2.4
Name: lattice-memory
Version: 0.1.5
Summary: Local-first, bicameral memory system for AI agents — remember, evolve, search
Project-URL: Homepage, https://github.com/tefx/lattice
Project-URL: Documentation, https://github.com/tefx/lattice#readme
Project-URL: Repository, https://github.com/tefx/lattice
Project-URL: Issues, https://github.com/tefx/lattice/issues
Author-email: Tefx <zhaomeng.zhu@gmail.com>
License: AGPL-3.0-only
License-File: LICENSE
Keywords: agent,ai,llm,mcp,memory,persistent,rag
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: GNU Affero General Public License v3
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.11
Requires-Dist: deal>=4.24.0
Requires-Dist: invar-runtime>=0.1.0
Requires-Dist: litellm>=1.0.0
Requires-Dist: mcp>=1.0.0
Requires-Dist: opentelemetry-api>=1.20.0
Requires-Dist: opentelemetry-sdk>=1.20.0
Requires-Dist: returns>=0.22.0
Requires-Dist: sqlite-vec>=0.1.0
Requires-Dist: tiktoken>=0.5.0
Requires-Dist: tomli>=2.0.0; python_version < '3.11'
Requires-Dist: typer>=0.9.0
Provides-Extra: dev
Requires-Dist: pytest-dotenv>=0.5.2; extra == 'dev'
Requires-Dist: pytest>=8.0.0; extra == 'dev'
Description-Content-Type: text/markdown

# Lattice — Bicameral Memory for AI Agents

> **Solving the "Goldfish Memory" problem for AI Agents**

Lattice is a **bicameral memory system** that enables AI agents to:
- **Remember** your preferences, conventions, and decisions
- **Evolve** by automatically extracting rules from conversations
- **Search** historical dialogues to find relevant context

```
┌─────────────────────────────────────────────────────────┐
│  System 1 (Instinct)     │  System 2 (Memory)          │
│  ─────────────────────   │  ─────────────────────       │
│  Always-on Rules         │  Searchable Logs            │
│  0ms latency             │  ~100ms latency             │
│  Markdown files          │  SQLite + Vector Search     │
│  "Prefer TDD"            │  "Last week we fixed auth"  │
└─────────────────────────────────────────────────────────┘
                    ▲
                    │ Compiler (LLM)
                    │ Extracts patterns, generates rules
                    │
            ┌───────┴───────┐
            │   store.db    │
            │  (chat logs)  │
            └───────────────┘
```
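The split in the diagram can be sketched in a few lines of Python. This is an illustrative sketch only — the function names and the `logs` table schema are assumptions, not Lattice's actual API, and real Lattice uses sqlite-vec embeddings where this sketch uses a plain substring match:

```python
import sqlite3
from pathlib import Path

def load_instincts(rules_dir: str) -> str:
    """System 1: concatenate the always-on Markdown rules (no DB, no LLM)."""
    parts = [p.read_text() for p in sorted(Path(rules_dir).glob("*.md"))]
    return "\n\n".join(parts)

def search_memory(conn: sqlite3.Connection, query: str, limit: int = 5) -> list[str]:
    """System 2: on-demand lookup in the SQLite log store.

    Plain LIKE stands in for the real vector search here.
    """
    rows = conn.execute(
        "SELECT content FROM logs WHERE content LIKE ? LIMIT ?",
        (f"%{query}%", limit),
    ).fetchall()
    return [r[0] for r in rows]
```

System 1 is just file reads at prompt-assembly time (hence "0ms"), while System 2 pays a query cost only when the agent actually asks for history.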

## Installation

```bash
# Install from GitHub
pip install git+https://github.com/tefx/lattice.git

# Or clone and install locally
git clone https://github.com/tefx/lattice.git
cd lattice
pip install -e .
```

## Quick Start

### 1. Ingest Your First Data

```bash
lattice ingest
```

This auto-initializes the project:
- Creates `./.lattice/` — Project-level memory directory
- Creates `~/.config/lattice/` — Global configuration directory (if needed)

### 2. Configure LLM

Edit `~/.config/lattice/config.toml`:

```toml
[compiler]
# Recommended: GPT-5 Mini (best cost-performance ratio)
model = "openrouter/openai/gpt-5-mini"
api_key_env = "OPENROUTER_API_KEY"

# Or use opencode CLI (zero configuration)
# model = "cli:opencode:openrouter/openai/gpt-5-mini"

[thresholds]
warn_tokens = 3000
alert_tokens = 5000

[safety]
auto_apply = true
backup_keep = 10
```

Set environment variable:
```bash
export OPENROUTER_API_KEY="sk-or-..."
```
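The `[thresholds]` values above bound how large the always-loaded rules may grow. A minimal sketch of how such a check could work — using a rough ~4-characters-per-token heuristic in place of a real tokenizer like tiktoken (which Lattice depends on); the function name is illustrative:

```python
def check_rule_budget(rules_text: str, warn_tokens: int = 3000,
                      alert_tokens: int = 5000) -> str:
    # ~4 characters per token is a crude stand-in for tiktoken
    tokens = len(rules_text) // 4
    if tokens >= alert_tokens:
        return "alert"
    if tokens >= warn_tokens:
        return "warn"
    return "ok"
```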

### 3. Data Capture

**Option A: OpenCode Plugin (Recommended)**

```bash
# Install plugin (TypeScript, single file)
mkdir -p ~/.config/opencode/plugins
curl -o ~/.config/opencode/plugins/lattice-capture.ts \
  https://raw.githubusercontent.com/tefx/lattice/main/plugins/opencode-lattice/index.ts

# Or clone and copy:
git clone https://github.com/tefx/lattice.git
mkdir -p ~/.config/opencode/plugins
cp lattice/plugins/opencode-lattice/index.ts ~/.config/opencode/plugins/lattice-capture.ts

# Restart opencode
opencode
```

OpenCode uses Bun runtime with native TypeScript support — no build step required.

**Alternative: Install from npm (once published)**

```json
// ~/.config/opencode/opencode.json
{
  "plugin": ["opencode-lattice"]
}
```

**Option B: MCP Server**

```bash
# Start MCP server
lattice serve

# Configure MCP in Claude Code / opencode
```

**Option C: Python SDK**

```python
import lattice

client = lattice.Client()

# Log conversation
client.log_turn(
    user="Fix the auth bug",
    assistant="I updated the middleware...",
)

# Search history
results = client.search("auth bug", limit=5)

# Get rules
instincts = client.get_instincts()
```

### 4. Evolve Rules

```bash
# Incremental evolution (only new sessions)
lattice evolve

# Check status
lattice status

# Apply/reject proposals
lattice apply <proposal>
lattice revert
```

## CLI Commands

```bash
# Data Ingestion (auto-initializes on first use)
lattice ingest                  # Ingest message from stdin

# Evolution
lattice evolve                  # Project-level evolution
lattice evolve --global         # Global evolution (cross-project patterns)
lattice status                  # Check status

# Search
lattice search "query"          # Search project memory
lattice search --global "query" # Search global memory

# Rule Management
lattice apply <proposal>        # Apply proposal
lattice revert                  # Rollback

# MCP
lattice serve                   # Start MCP server

# Configuration
lattice config init --global    # Create global config.toml
lattice config show --global    # Show current config
```

## Configuration

Lattice creates a comprehensive config file with all options documented.

### Create Config

```bash
# Create global config with all options documented
lattice config init --global

# Or force overwrite existing config
lattice config init --global --force
```

The global config is auto-created on first use with sensible defaults.

### Config File Location

```
~/.config/lattice/config.toml   # Global config
```

### Quick Setup

```bash
# 1. Ingest your first data (auto-creates config if missing)
lattice ingest

# 2. Save your API key
lattice auth login openai
# API Key for openai: ********
# ✓ Saved API key for openai
#   Key will be used automatically (no config.toml changes needed)

# 3. Edit model in config (optional)
vim ~/.config/lattice/config.toml
# Change model = "openai/gpt-5-mini" to your preferred model

# 4. Start using
lattice evolve
```

## Shell Completion

Lattice supports shell completion for **bash**, **zsh**, **fish**, and **PowerShell**.

### Install Completion

```bash
# Auto-detect shell and install
lattice completion --install

# Or specify shell explicitly
lattice completion --shell bash --install
lattice completion --shell zsh --install
lattice completion --shell fish --install
lattice completion --shell powershell --install
```

After installation, restart your terminal or source the completion file:

```bash
# Bash
source ~/.bash_completions/lattice.sh

# Zsh (add to ~/.zshrc)
fpath+=~/.zfunc
autoload -U compinit && compinit
```

### Show Completion Script

To view or manually install the completion script:

```bash
lattice completion --shell zsh
lattice completion --shell bash
lattice completion --shell fish
```

## API Key Configuration

Lattice supports flexible API key configuration with multiple sources and priority resolution.

### Quick Setup (Recommended)

Just run `lattice auth login` and you're done:

```bash
lattice auth login openai
# API Key for openai: ********
# ✓ Saved API key for openai
#   Key will be used automatically (no config.toml changes needed)
```

That's it! No need to edit config.toml. The key is stored securely and used automatically.

### Priority Order

Keys are resolved in this order (highest to lowest priority):

1. **Config `api_key`/`api_key_env`** - Explicit configuration in config.toml
2. **Auth storage** - `~/.config/lattice/auth.json` (automatic fallback)
3. **Environment variable** - Standard env vars (e.g., `OPENAI_API_KEY`)
4. **LiteLLM defaults** - LiteLLM's built-in key detection

**Key insight**: If you use `lattice auth login`, you don't need to configure anything in config.toml.

### When to Use config.toml

Only add `api_key` or `api_key_env` to config.toml if you want to:

1. **Override** the auth storage key for a specific model
2. **Use different keys** for different providers
3. **Use environment variables** in CI/CD

```toml
[compiler]
model = "openai/gpt-4"
# Optional: Override auth storage
# api_key = "{env:MY_CUSTOM_KEY}"
# api_key_env = "MY_CUSTOM_KEY"
```

### Variable Syntax

Use these variable formats in your config (only needed for advanced use cases):

| Syntax | Description | Example |
|--------|-------------|---------|
| `{env:VAR}` | Read from environment variable | `{env:OPENAI_API_KEY}` |
| `{file:/path}` | Read from file | `{file:~/.secrets/openai_key}` |
| `{auth:provider}` | Read from auth storage | `{auth:openai}` |
| Direct key | Plain text (not recommended) | `sk-proj-...` |
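Expanding these forms is straightforward string handling. A hedged sketch, not Lattice's actual parser (`expand_key` and the `auth` dict are made-up names for illustration):

```python
import os
from pathlib import Path

def expand_key(value: str, auth: dict) -> str:
    """Expand the {env:...}/{file:...}/{auth:...} forms from the table above."""
    if value.startswith("{env:") and value.endswith("}"):
        return os.environ[value[5:-1]]
    if value.startswith("{file:") and value.endswith("}"):
        return Path(value[6:-1]).expanduser().read_text().strip()
    if value.startswith("{auth:") and value.endswith("}"):
        return auth[value[6:-1]]
    return value  # a direct key is passed through unchanged
```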

### Auth CLI Commands

Manage your API keys securely:

```bash
# Save an API key (prompts securely with masking)
lattice auth login openai

# Or provide via command line (shown in history, less secure)
lattice auth login openai --key sk-proj-...

# List saved providers (keys are redacted)
lattice auth list

# Test an API key
lattice auth test openai

# Remove an API key
lattice auth logout openai
```

### Security Best Practices

1. **Use auth storage** - Keys are stored with `chmod 0o600` permissions
2. **Avoid `--key` flag** - It shows in shell history; use interactive prompt instead
3. **Avoid direct keys in config** - Never hardcode keys in config files
4. **Use environment variables** - For CI/CD pipelines

### File Location

Auth keys are stored in:
```
~/.config/lattice/auth.json
```

The file is created with `0o600` permissions (owner read/write only).
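For reference, owner-only storage like this can be reproduced with the standard library. This is a sketch of the described behavior, not Lattice's actual implementation (`save_auth_key` is an illustrative name):

```python
import json
import os
from pathlib import Path

def save_auth_key(path: str, provider: str, key: str) -> None:
    """Write {provider: key} to a JSON file readable only by its owner."""
    p = Path(path).expanduser()
    p.parent.mkdir(parents=True, exist_ok=True)
    data = json.loads(p.read_text()) if p.exists() else {}
    data[provider] = key
    p.write_text(json.dumps(data))
    os.chmod(p, 0o600)  # owner read/write only
```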

## Complete Usage Workflow

This section walks through a typical usage cycle from initialization to observing memory effects.

### Week 1: Setup & Data Collection

```bash
# Day 1: Setup
cd your-project
lattice ingest                  # Auto-initializes on first use

# Configure LLM for Compiler
cat > ~/.config/lattice/config.toml << 'EOF'
[compiler]
model = "openrouter/openai/gpt-5-mini"
api_key_env = "OPENROUTER_API_KEY"

[thresholds]
warn_tokens = 3000
alert_tokens = 5000

[safety]
auto_apply = true
backup_keep = 10
EOF

export OPENROUTER_API_KEY="sk-or-..."

# Verify setup
lattice status
# Output:
# ── Sessions ─────────────────────
# Total: 0
# Pending evolution: 0
#
# ── Rules ───────────────────────
# rules/: 0 files, 0 tokens
```

```bash
# Day 1-7: Use your agent normally
# OpenCode Plugin captures conversations automatically
# Or use MCP: lattice serve

# Check data collection
lattice status
# Output:
# ── Sessions ─────────────────────
# Total: 23
# Pending evolution: 23
```

### Week 1-2: First Evolution

```bash
# Run Compiler to extract patterns
lattice evolve

# Output:
# Processing 23 sessions...
# LLM response received.
# Proposals written to:
#   - drift/proposals/20260219_143052_prefer_explicit_imports.md
#   - drift/proposals/20260219_143052_use_result_types.md
#
# Applied 2 proposals.
# Updated last_evolved_at.

# Check generated rules
lattice status
# Output:
# ── Sessions ─────────────────────
# Total: 23
# Pending evolution: 0
#
# ── Rules ───────────────────────
# rules/: 2 files, ~450 tokens
#
# ── Proposals ───────────────────
# Pending: 0

# View generated rules
cat .lattice/rules/*.md
```

### Week 2-4: Iteration & Review

```bash
# Sessions accumulate over time
lattice status
# ── Sessions ─────────────────────
# Total: 67
# Pending evolution: 15

# Run evolution again (incremental - only new sessions)
lattice evolve

# If auto_apply=false, review and apply manually
lattice status
# ── Proposals ───────────────────
# Pending: 3 proposals

# Review a proposal
cat .lattice/drift/proposals/20260225_091234_avoid_bare_except.md

# Apply it
lattice apply drift/proposals/20260225_091234_avoid_bare_except.md

# Or reject it (delete the file)
rm .lattice/drift/proposals/20260225_091234_avoid_bare_except.md

# Made a mistake? Revert
lattice revert
# Restored from backup: .lattice/backups/rules_20260219_143052.tar.gz
```

### Month 1+: Search & Cross-Project

```bash
# Search historical conversations
lattice search "authentication bug"
# Output:
# [1] ses_abc123 (user, 2026-02-15)
#     "I keep getting 401 errors on the API..."
# [2] ses_def456 (assistant, 2026-02-15)
#     "The JWT tokens were expiring. I added refresh logic..."

# After multiple projects, run global evolution
lattice evolve --global
# Scans all registered projects for cross-project patterns
# Promotes rules appearing in ≥3 projects to global rules

# Check global rules
lattice status --global
# ── Global Rules ─────────────────
# rules/: 3 files, ~600 tokens
```

### Measuring Effectiveness

| Signal | How to Verify |
| ------ | -------------- |
| **Data Collection** | `lattice status` shows increasing session count |
| **Pattern Extraction** | `lattice evolve` generates proposals after ~10+ sessions |
| **Rule Accumulation** | `rules/` directory grows with `.md` files |
| **Behavior Change** | Agent starts applying learned preferences automatically |
| **Search Utility** | `lattice search "keyword"` returns relevant history |

### Key Files to Monitor

```bash
# Session logs (System 2)
sqlite3 .lattice/store.db "SELECT COUNT(*) FROM logs;"

# Generated rules (System 1)
ls -la .lattice/rules/

# Pending proposals
ls -la .lattice/drift/proposals/

# Evolution traces (audit log)
ls -la .lattice/drift/traces/

# Backups
ls -la .lattice/backups/
```

## Directory Structure

```
~/.config/lattice/          # Global configuration (XDG compliant)
├── config.toml             # LLM settings
├── projects.toml           # Project registry
├── rules/                  # Global rules
├── store/
│   └── global.db           # Cross-project evidence
├── drift/
│   ├── proposals/          # Pending global proposals
│   └── traces/             # Global Compiler audit logs
└── backups/                # Global rule backups

./.lattice/                 # Project-level memory
├── rules/                  # Project rules (System 1)
├── store.db                # Chat logs (System 2)
├── drift/
│   ├── proposals/          # Pending proposals
│   └── traces/             # Compiler reasoning (audit)
└── backups/                # Rule backups for revert
```

## Core Concepts

| Concept      | Description                                              |
| ------------ | -------------------------------------------------------- |
| **Instinct** | Always-loaded rules (preferences, conventions, constraints) |
| **Compiler** | LLM process that extracts patterns from logs and generates rules |
| **Store**    | Searchable archive of conversation logs                  |
| **Promotion** | Project rules promoted to global rules (≥3 projects with same pattern) |
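The promotion threshold amounts to a cross-project count. A minimal sketch under assumed shapes (`promote_rules` and the `dict[str, set[str]]` input are illustrative, not Lattice's internals):

```python
from collections import Counter

def promote_rules(project_rules: dict[str, set[str]],
                  threshold: int = 3) -> set[str]:
    """Promote any rule pattern that appears in >= `threshold` projects."""
    counts = Counter(
        rule for rules in project_rules.values() for rule in rules
    )
    return {rule for rule, n in counts.items() if n >= threshold}
```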

## Safety

- **Local-First**: All data stored locally
- **Secret Sanitization**: Automatically filters API keys, passwords, and sensitive information
- **Backup**: Automatic backup on every apply, with revert capability
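Secret sanitization of this kind is typically regex-based. The patterns below are illustrative examples only — Lattice's actual sanitizer may use different or additional rules:

```python
import re

# Illustrative patterns only; not Lattice's actual rule set.
SECRET_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9_-]{16,}"),                      # OpenAI-style keys
    re.compile(r"(?i)\b(api[_-]?key|password)\s*[=:]\s*\S+"),  # key=value leaks
]

def sanitize(text: str) -> str:
    """Replace anything matching a secret pattern before it is stored."""
    for pattern in SECRET_PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text
```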

## Documentation

- [RFC-002: Bicameral Memory Architecture](docs/RFC-002-Lattice-Bicameral-Memory.md)
- [Architecture Guide](docs/ARCHITECTURE.md) — Core/Shell layers, data flow, configuration
- [API Reference](docs/api-reference.md) — Python SDK usage
- [Session Compression RFC](docs/RFC-002-R1-Session-Compression.md) — Layer 0/1 compression for token efficiency

## License

AGPL-3.0-only