Metadata-Version: 2.4
Name: engram-ms
Version: 1.3.0
Summary: Engram — AI agent memory system with SQLite+FTS5, MCP integration, and quality gates. Canonical distribution (formerly published as 'memorytrace').
Project-URL: Homepage, https://github.com/aop60003/default
Project-URL: Issues, https://github.com/aop60003/default/issues
License-Expression: MIT
License-File: LICENSE
Keywords: agent,ai,fts5,llm,mcp,memory,sqlite
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.9
Requires-Dist: tomlkit>=0.13.0
Provides-Extra: dev
Requires-Dist: mypy>=1.10; extra == 'dev'
Requires-Dist: pytest-cov>=5.0; extra == 'dev'
Requires-Dist: pytest>=8.0; extra == 'dev'
Requires-Dist: ruff>=0.5.0; extra == 'dev'
Provides-Extra: full
Requires-Dist: anthropic>=0.40.0; extra == 'full'
Requires-Dist: mcp>=1.0.0; extra == 'full'
Requires-Dist: numpy>=1.24.0; extra == 'full'
Requires-Dist: openai>=1.50.0; extra == 'full'
Provides-Extra: llm
Requires-Dist: anthropic>=0.40.0; extra == 'llm'
Requires-Dist: openai>=1.50.0; extra == 'llm'
Provides-Extra: mcp
Requires-Dist: mcp>=1.0.0; extra == 'mcp'
Provides-Extra: semantic
Requires-Dist: numpy>=1.24.0; extra == 'semantic'
Requires-Dist: openai>=1.50.0; extra == 'semantic'
Description-Content-Type: text/markdown

# Engram

Persistent memory for AI agents, backed by SQLite + FTS5.

## Install

```bash
pip install engram-ms
```

Optional extras:

```bash
pip install "engram-ms[llm]"
pip install "engram-ms[semantic]"
pip install "engram-ms[mcp]"
pip install "engram-ms[full]"
```
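
Each extra maps onto the third-party packages listed in the metadata above (`llm` → `anthropic`/`openai`, `semantic` → `numpy`/`openai`, `mcp` → `mcp`, `full` → all of them). A quick stdlib check of which optional dependencies are importable in your environment (module names assumed to match the distribution names, which holds for these four):

```python
import importlib.util

# Probe each optional dependency without importing it.
for mod in ("anthropic", "openai", "numpy", "mcp"):
    status = "installed" if importlib.util.find_spec(mod) else "missing"
    print(f"{mod}: {status}")
```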

> `memorytrace` is the legacy distribution name and is still published on PyPI:
> `pip install memorytrace` installs the same engine.

## Simple CLI

The default CLI is `engram`. `python -m engram` is equivalent.

```bash
engram init
engram save "Minseong Jeong is the CTO of Galaxy Corp"
engram find "CTO"
engram who "Minseong Jeong"
engram remember "Minseong Jeong" "Interested in AI and robotics"
engram status
```
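
Under the hood, commands like `engram find` query an FTS5 full-text index. A minimal stdlib sketch of the idea (illustrative single-column schema, not engram's actual one):

```python
import sqlite3

# In-memory database with an FTS5 virtual table (illustrative only).
con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE memories USING fts5(content)")
con.execute(
    "INSERT INTO memories(content) VALUES (?)",
    ("Minseong Jeong is the CTO of Galaxy Corp",),
)

# MATCH performs tokenized full-text search, unlike LIKE's substring scan.
rows = con.execute(
    "SELECT content FROM memories WHERE memories MATCH ?", ("CTO",)
).fetchall()
print(rows[0][0])
```

FTS5 gives prefix queries, boolean operators, and relevance ranking (`bm25`) for free, which is why it suits agent-memory recall better than plain `LIKE` filters.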

## Advanced CLI

Use `engram-advanced` when you need structured options, JSON output, or a custom DB path.

```bash
engram-advanced --db ~/.engram/memory.db search "Galaxy Corp" --max-results 5
engram-advanced --json health
```

## Python SDK

```python
from engram.integrations.sdk import EngramSDK

with EngramSDK() as sdk:
    sdk.start_session("assistant")
    sdk.store("Minseong Jeong is the CTO of Galaxy Corp")
    sdk.add_fact("Minseong Jeong", "Interested in AI and robotics")
    result = sdk.search("Galaxy")
    print(result.to_agent_context())
```

## Source Checkout Compatibility

A repo-local `python mem ...` wrapper is kept for backward compatibility in source checkouts.
It forwards to the same simple CLI as `engram` but is not installed as a packaged console script.

## Related packages

The `engram` namespace on PyPI is reserved for small, focused tools that share
this memory model. Sibling packages currently published from this workspace:

| Package | Status | Purpose |
|---|---|---|
| [`engram-ms`](https://pypi.org/project/engram-ms/) | canonical | The engram-named distribution; preferred install |
| [`memorytrace`](https://pypi.org/project/memorytrace/) | legacy alias | The original distribution name; same engine |

## Docs

- `docs/04-usage-guide/01-quickstart.md`
- `docs/01-project-analysis/05-cli-commands.md`
- `docs/01-project-analysis/09-setup-guide.md`

## Workspace Skill

A workspace-local skill package for generating a static project wiki tree lives at
`.agents/skills/engram-tree`.

Generate the site with:

```bash
python tools/run_engram_tree.py
python .agents/skills/engram-tree/scripts/build_engram_tree.py --project-root . --output-dir dist/engram-tree
```

Install the skill for Claude Code and Codex with:

```bash
python tools/install_engram_tree_skill.py --dry-run
python tools/install_engram_tree_skill.py --create-missing
```

## Bridge to Claude Code / Codex CLI (engram-ctx)

After `pip install engram-ms`, register `engram-ctx` as a memory bridge in your CLI of choice:

```bash
# Claude Code
engram-ctx install claude-code

# Codex CLI
engram-ctx install codex
```

Both installers are idempotent. They register:

- **MCP server** `engram` exposing `engram_ctx_index`, `engram_ctx_search`, `engram_ctx_stats`, `engram_ctx_doctor`, `engram_ctx_purge` (in addition to the existing `memory_*` tools).
- **Lifecycle hooks** `SessionStart`, `PreToolUse`, `PostToolUse`, `UserPromptSubmit`, `Stop`. Large tool outputs (>500 chars) are auto-indexed into `tool_observations` (PII-masked, head/tail truncated). Recall via `engram_ctx_search`.
- **Skill** `engram-context` with caveman-style output compression and routing rules.
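
The head/tail truncation applied to large tool outputs can be pictured with a small, purely illustrative helper (the function name, limits, and marker below are hypothetical, not engram's code):

```python
def head_tail(text: str, limit: int = 500, keep: int = 200) -> str:
    """Keep the first and last `keep` chars of outputs longer than `limit`.

    Illustrative sketch of head/tail truncation; not engram's implementation.
    """
    if len(text) <= limit:
        return text
    return text[:keep] + " ...[truncated]... " + text[-keep:]
```

The point of keeping both ends is that tool output usually front-loads context (command, arguments) and back-loads the result (exit status, final answer), so the middle is the cheapest part to drop before indexing.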

For non-hook MCP clients (Cursor, Copilot CLI, OpenCode), use `engram.integrations.mcp_middleware.EngramContextMiddleware` directly.
