Metadata-Version: 2.4
Name: elume
Version: 0.4.0
Summary: Agentic memory with mental-model architecture. Deterministic, replay-safe substrate for cognitive-architecture agents: LinOSS temporal encoding, Hopfield attractor memory, mental-model and metacognitive primitives, strategy evolution, pluggable MemEvolve benchmark adapter. Contribution: byte-identical replay, immutable lineage, provider abstraction, composable operators.
Project-URL: Homepage, https://github.com/bionicbutterfly13/elume
Project-URL: Repository, https://github.com/bionicbutterfly13/elume
Project-URL: Issues, https://github.com/bionicbutterfly13/elume/issues
Project-URL: Changelog, https://github.com/bionicbutterfly13/elume/blob/main/CHANGELOG.md
Author-email: Mani Saint-Victor <drmani215@gmail.com>
License: MIT
License-File: LICENSE
Keywords: active-inference,attractor-basins,cognitive-architecture,deterministic,linoss,memory,mental-model,metacognition,self-evolving
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.11
Requires-Dist: linoss-dynamics>=0.1.0
Requires-Dist: numpy>=1.26
Provides-Extra: dev
Requires-Dist: pytest-asyncio>=0.23; extra == 'dev'
Requires-Dist: pytest>=8.0; extra == 'dev'
Requires-Dist: ruff>=0.5; extra == 'dev'
Description-Content-Type: text/markdown

# Elume

[![PyPI version](https://img.shields.io/pypi/v/elume.svg)](https://pypi.org/project/elume/)
[![Python versions](https://img.shields.io/pypi/pyversions/elume.svg)](https://pypi.org/project/elume/)
[![CI](https://github.com/bionicbutterfly13/elume/actions/workflows/ci.yml/badge.svg)](https://github.com/bionicbutterfly13/elume/actions/workflows/ci.yml)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

**Agentic memory with mental-model architecture — a deterministic, replay-safe substrate for cognitive-architecture agents.**

Elume is the cognitive substrate underneath an agent: long-horizon temporal encoding, attractor-based associative memory, mental-model primitives, metacognitive control, and deterministic strategy evolution — all behind clean provider boundaries, all replay-safe by construction.

Memory is the entry point. Mental modeling is the architecture. Replay-safety is the engineering contribution.

It integrates LinOSS-style temporal encoding (Rusch & Rus, ICLR 2025), Hopfield-style associative memory, mental-model and metacognitive record types, and a deterministic evolution substrate into one open-source stack. The contribution of Elume is not the invention of the underlying methods in isolation, but the engineering work required to combine them, adapt their codepaths, and make them operate coherently inside a single deterministic kernel.

## What Elume is

Elume is a runtime cognitive substrate for agents that need to:

- encode long trajectories with oscillatory state-space dynamics,
- recover useful prior state through attractor-based associative recall,
- maintain explicit mental models with predictions and revisions,
- exercise metacognitive control over inference and action selection,
- and evolve memory strategies over time, deterministically.
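The first capability can be sketched with a generic oscillatory state-space update. This is an illustration only, not Elume's LinOSS solver: `oscillator_step`, the per-channel frequencies, and the time step are all hypothetical choices, assuming a symplectic-Euler discretization of `y'' = -a*y + b*u`.

```python
import numpy as np

def oscillator_step(y, v, u, a, b, dt):
    # Symplectic Euler on y'' = -a*y + b*u:
    # v is the velocity state, y the position state, u the input drive.
    v = v + dt * (-a * y + b * u)
    y = y + dt * v
    return y, v

a = np.array([1.0, 4.0, 9.0])    # per-channel frequencies (omega^2)
b = np.ones(3)
y, v = np.zeros(3), np.zeros(3)
for t in range(100):
    u = np.sin(0.1 * t)          # example input trajectory
    y, v = oscillator_step(y, v, u, a, b, dt=0.05)
# y now carries a frequency-decomposed summary of the input history
```

Each channel oscillates at its own frequency, so the state vector accumulates a multi-timescale summary of the trajectory rather than a single decaying average.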

The full primitive set:

| Layer | Modules | Role |
|---|---|---|
| **Temporal encoding** | `elume.linoss` | LinOSS solver, encoder, timing |
| **Memory substrate** | `elume.basins`, `elume.network` | Attractor field, Hopfield, self-modeling network |
| **Mental modeling** | `elume.models.mental_model`, `elume.cognition.mental_model` | `MentalModel`, `BasinRelationship`, `PredictionTemplate`, `ModelPrediction`, `ModelRevision`, mental-model subnetworks |
| **Metacognition** | `elume.models.metacognitive` | `CognitiveCore`, `MetacognitiveParticle`, `MentalAction`, `ParticleType` |
| **Belief & cognition** | `elume.models.belief`, `elume.models.cognitive`, `elume.cognition` | Belief states, cognitive events, deterministic thought competition, prior-gated cognition, curiosity homing |
| **Evolution** | `elume.evolution` | Strategy lifecycle, GA over immutable `Strategy` records |
| **Determinism** | `elume.envelope` | Canonical pre-image hashing, byte-identical replay |
| **Integration** | `elume.providers`, `elume.adapters`, `elume.embedders` | Provider contracts, MemEvolve cartridge, embedding protocols |

In practice, Elume packages and stabilizes multiple upstream ideas plus original cognitive-architecture engineering so they can be used together as one substrate.

## What Elume is not

Elume does not claim authorship of the original LinOSS, MemEvolve, or Hopfield-style memory ideas.

Instead, it is an open-source composition of these components, with the modifications, interfaces, and system-level fixes needed to make them work together in one usable framework.

Elume's evolution module is a deterministic, replay-safe genetic algorithm operating on immutable `Strategy` records through a provider boundary. The framing — agent memory as an evolvable population rather than policy weights — is adopted from MemEvolve (Zhang et al. 2025, arXiv:2512.18746). The implementation is original Elume work using standard GA primitives. What Elume contributes is the engineering substrate: byte-identical replay, immutable lineage, provider abstraction, and composable operator protocols.

## What Elume created

Things that did not exist anywhere before this project:

- **The deterministic envelope** (`elume.envelope`, v0.1) — a canonical
  pre-image (BLAKE2b-256) over operation inputs, RNG state, result, and
  provider snapshot, giving every cognitive op a byte-identical replay
  contract. Six reference operations registered today (belief embed,
  basin recall, thought competition, evolution step, self-model step,
  curiosity score).
- **The platform-tagged float-hash policy** — `platform_fingerprint()`
  folded into the canonical pre-image so cross-platform replay drift
  surfaces as a hash mismatch by construction, not silent agreement on
  incidentally-matching bytes.
- **`elume.adapters.memevolve.ElumeMemoryProvider`** (v0.2) — the first
  deterministic baseline in MemEvolve's `--memory_provider` list. Same
  seed, same input → byte-identical `MemoryResponse` per step.
- **`cognition.curiosity_score`** as a replayable envelope op (v0.2) —
  Shannon-entropy + information-gain scoring wrapped with the same
  hash-equal replay contract as every other op. The math is ported
  from dionysus3 (credited below); the envelope wrap, the integration
  with `run_gated_thought_competition`, and the `CuriosityPrior`
  derivation are original Elume work.
- **Hyperevolution coupling** (v0.2) — the wiring inside
  `ElumeMemoryProvider` that lets curiosity continuously re-acquire
  the search heading: `provide_memory` re-ranks basins by current
  information gain; `take_in_memory` updates a per-session
  `BeliefBuffer` from trajectory outcomes; the whole pattern toggles
  via one config key.
- **The kernel discipline** — frozen records, successor semantics
  (`.evolved()`, `.revised()`, `.with_status()`), injected RNG,
  provider-boundary persistence, no framework dependencies. This
  discipline applied uniformly across LinOSS, basins, evolution,
  cognition, embedders, and providers is what makes the whole stack
  composable inside one Python package.

What Elume adopted from upstream is named in the [Attribution](#attribution)
section. What Elume created is everything in the bullets above.
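The envelope idea above can be sketched in a few lines. This is a minimal illustration of the pre-image discipline, not the actual `elume.envelope` API: `envelope_hash` and `platform_tag` are hypothetical names, assuming a canonical JSON serialization over operation inputs, RNG state, result, and platform fingerprint, hashed with BLAKE2b-256.

```python
import hashlib
import json
import platform


def platform_tag() -> str:
    # Hypothetical stand-in for platform_fingerprint(): tags the host
    # so cross-platform drift surfaces as a hash mismatch.
    return f"{platform.system()}-{platform.machine()}"


def envelope_hash(op: str, inputs: dict, rng_state: int, result: object) -> str:
    # Canonical pre-image: sorted keys and fixed separators remove
    # serialization variance before hashing with BLAKE2b-256.
    pre_image = json.dumps(
        {
            "op": op,
            "inputs": inputs,
            "rng_state": rng_state,
            "result": result,
            "platform": platform_tag(),
        },
        sort_keys=True,
        separators=(",", ":"),
    )
    return hashlib.blake2b(pre_image.encode(), digest_size=32).hexdigest()


# Replaying the same op with the same inputs must reproduce the hash.
h1 = envelope_hash("basin_recall", {"query": [0.1, 0.2]}, rng_state=42, result=[1, 0])
h2 = envelope_hash("basin_recall", {"query": [0.1, 0.2]}, rng_state=42, result=[1, 0])
assert h1 == h2
```

Folding the platform tag into the pre-image is what makes drift visible: two hosts that disagree on float behavior produce different hashes by construction rather than coincidentally matching ones.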

## Core composition

Elume combines:

1. **LinOSS-based temporal encoding** for long-horizon trajectory representation.
2. **Attractor-based associative memory** for content-addressable recall.
3. **Deterministic adaptive memory logic** for improving memory behavior over time, with an optional curiosity homing signal.

These components are integrated into a shared memory pipeline for agentic learning.
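Component 2, content-addressable recall, follows the classical Hopfield recipe. The sketch below is the textbook version (Hebbian outer-product storage, sign-threshold recall), shown only to illustrate the substrate; Elume's basin subsystem has its own implementation.

```python
import numpy as np

# Two orthogonal bipolar patterns to store.
patterns = np.array([
    [1, -1, 1, -1, 1, -1, 1, -1],
    [1, 1, -1, -1, 1, 1, -1, -1],
])
n = patterns.shape[1]

# Hebbian weights: sum of outer products, zeroed self-connections.
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0)

# Recall from a corrupted cue (pattern 0 with its last bit flipped).
state = np.array([1, -1, 1, -1, 1, -1, 1, 1])
for _ in range(3):
    state = np.sign(W @ state).astype(int)

assert np.array_equal(state, patterns[0])
```

The corrupted cue falls inside pattern 0's attractor basin, so repeated updates converge to the stored pattern: recall keyed by content, not by address.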

## Why Elume

- **Determinism** — injected RNG, byte-identical replay within a platform fingerprint. Every retrieval decision can be audited.
- **Immutable records** — frozen trajectory snapshots, belief states, and basin activations. Strategies evolve via successors, not mutation.
- **Provider boundary** — storage is a protocol contract, not an implementation. Swap backends without touching cognition code.
- **No framework lock-in** — no FastAPI, Graphiti, or agent runtime in the core. Adapters live in consumers.
- **Cross-platform float-hash policy** — `platform_fingerprint()` is folded into the canonical hash pre-image. Cross-platform drift is a visible mismatch, not silent corruption.
- **Curiosity-driven hyperevolution** — the optional curiosity homing signal biases memory retrieval toward entropy-reducing directions, turning uniform-random search into goal-directed exploration.
- **Content-addressed replay artifacts** — provider snapshots can stay full dumps by default or emit a manifest/root-hash form for larger replay stores.
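The curiosity homing bullet can be made concrete with an entropy-based re-ranking sketch. The helper names and the toy basin distributions here are illustrative, not Elume's `cognition.curiosity_score` API, assuming the standard Shannon-entropy and information-gain definitions.

```python
import math

def shannon_entropy(probs):
    # H(p) = -sum p * log2(p) over nonzero entries
    return -sum(p * math.log2(p) for p in probs if p > 0)

def information_gain(prior, posterior):
    # Entropy reduction from prior to posterior belief:
    # positive gain means this retrieval direction reduces uncertainty.
    return shannon_entropy(prior) - shannon_entropy(posterior)

# Re-rank candidate basins by how much each would reduce uncertainty.
prior = [0.25, 0.25, 0.25, 0.25]          # maximally uncertain belief
candidates = {
    "basin_a": [0.7, 0.1, 0.1, 0.1],      # sharply peaked -> high gain
    "basin_b": [0.3, 0.3, 0.2, 0.2],      # still diffuse -> low gain
}
ranked = sorted(
    candidates,
    key=lambda b: information_gain(prior, candidates[b]),
    reverse=True,
)
assert ranked[0] == "basin_a"
```

Ranking by gain rather than raw similarity is what turns uniform-random search into the goal-directed exploration described above.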

## MemEvolve cartridge

Elume v0.2.0 ships a `BaseMemoryProvider`-conformant adapter so MemEvolve
([bingreeky/MemEvolve](https://github.com/bingreeky/MemEvolve)) can benchmark
Elume against its 11 existing baselines. Two-line registration, then:

```bash
python run_flash_searcher_mm_gaia.py --memory_provider elume --sample_num 5
```

See [docs/adapters/memevolve.md](./docs/adapters/memevolve.md) for the full
install guide, determinism guarantee, and hyperevolution mode.

## Why Elume exists

Many memory systems are strong in isolation but difficult to combine in practice.

Elume exists to make these components interoperable: to unify their interfaces, reconcile assumptions, patch incompatibilities, and provide a coherent open-source implementation that others can inspect, use, and build on.

## Attribution

Elume builds directly on upstream work and code associated with LinOSS, MemEvolve, Hopfield-style associative memory, and attractor / neural-field context-engineering ideas.

Specific upstream sources:

- **LinOSS — Oscillatory State-Space Models** — T. Konstantin Rusch and Daniela Rus, *International Conference on Learning Representations (ICLR)*, 2025. Temporal encoding substrate and oscillator dynamics inside the basin field.
- **MemEvolve — Meta-Evolution of Agent Memory Systems** — Guibin Zhang, Haotian Ren, Chong Zhan, Zhenhong Zhou, Junhao Wang, He Zhu, Wangchunshu Zhou, and Shuicheng Yan, arXiv preprint [2512.18746](https://arxiv.org/abs/2512.18746), 2025. Source of the evolvable-memory-population framing. The `BaseMemoryProvider` cartridge interface and shaping helpers in `src/elume/adapters/memevolve/shaping.py` are adapted from the [bingreeky/MemEvolve](https://github.com/bingreeky/MemEvolve) codebase (Apache-2.0), with HTTP/HMAC stripped.
- **Context Engineering: Beyond Prompt Engineering** — Context Engineering Contributors (maintained by David Kimai), [github.com/davidkimai/context-engineering](https://github.com/davidkimai/context-engineering) (MIT), 2025. Source of the attractor-based neural-field model at the core of Elume's memory layer — specifically `00_foundations/08_neural_fields_foundations.md`, `00_foundations/11_emergence_and_attractor_dynamics.md`, `40_reference/attractor_dynamics.md`, and the memory-attractor protocol shells in `60_protocols/shells/`.
- **Hopfield-style associative memory** — Hopfield (PNAS 1982); textbook synthesis from Anderson (2014, Ch. 13); capacity bound from Amit, Gutfreund & Sompolinsky (1985). Classical mathematical substrate for discrete pattern storage inside the basin subsystem.
- **Source codebase** — **dionysus3**, a research cognitive architecture. Every module in `elume/` was originally developed there. Elume relocates the kernel math with verbatim semantics and strips project-specific glue so the result is a pure library. The Shannon-entropy + information-gain mechanism in `src/elume/cognition/curiosity.py` is ported from dionysus3's `CuriosityDriveService` (`api/services/mosaeic_self_discovery.py` and `arousal_system_service.py`).

BibTeX entries for all upstream academic citations are in [`CITATIONS.bib`](./CITATIONS.bib). Please cite the upstream sources in any published work that uses Elume.

## Status

Elume is an open-source integration project under active development.

Twenty-six tracks landed: kernel bootstrap, core data models, LinOSS solver + timing, Hopfield network, basin field engine, attractor basin core, embedder protocol, provider contracts, the evolution engine, the self-modeling network engine, immutable cognitive record types, immutable mental-model domain records, immutable metacognitive control records, prior hierarchy records, mental-model subnetworks, the cognitive event protocol, cognitive-event embedders, immutable thought-level records, immutable neuronal-packet records, deterministic thought competition, prior-gated cognition, the MemEvolve cartridge, curiosity homing, hyperevolution wiring, and content-addressed provider snapshots. Track `007` was retired after source review showed it was framed against the wrong dionysus3 concept. **1194 tests passing, ruff clean.**

Phase 2 is complete through the prior gate: `Track 011` shipped `elume.network`; `Tracks 014`, `016`, `018`, `021`, and `022` landed the minimal cognition gate from `MentalModel` through `LinOSSEncoder`; `Tracks 012`, `013`, and `019` landed immutable thought and packet records plus deterministic EFE competition; and `Tracks 015`, `017`, and `020` landed metacognitive control, generic priors, and prior-gated cognition. See [`conductor/tracks.md`](./conductor/tracks.md).

Phase 3 is complete: the MemEvolve cartridge (`elume.adapters.memevolve`), curiosity homing (`elume.cognition.curiosity`), and hyperevolution wiring now connect Elume's deterministic substrate to MemEvolve's outer evolutionary loop. Phase 4 has started with content-addressed provider snapshots for larger replay artifacts.

Archon-style deterministic-harness adoption is complete for v0.1.0. The kernel has injected RNGs, frozen trajectory metadata, provider snapshots, content-addressed snapshot manifests, and an `elume.envelope` v0 operation registry covering belief embedding, evolution step, thought competition, self-model stepping, Hopfield recall, and (v0.2.0) curiosity scoring. Cross-platform float-hash policy is documented in `docs/archon-readiness/21-float-hash-policy.md`.

## Install

Requires Python `>=3.11`.

```bash
pip install elume
```

## Quickstart (development)

For local development, use [`uv`](https://github.com/astral-sh/uv) and an editable install:

```bash
# from the repo root
uv venv .venv
uv pip install -e ".[dev]"

# run the test suite
.venv/bin/pytest

# lint
.venv/bin/ruff check src tests reference_service/src

# optional: reference service demo
uv pip install -e ./reference_service
PYTHONPATH=src:reference_service/src python -m reference_service
```

## Layout

```
elume/
├── src/
│   └── elume/
│       ├── basins/      # Hopfield + basin field dynamics (neural fields model)
│       ├── cognition/   # mental-model subnetworks + typed cognitive events
│       ├── embedders/   # event -> trajectory projection protocols
│       ├── linoss/      # oscillatory state-space primitives (solver, timing, encoder)
│       ├── network/     # self-modeling network substrate for Phase 2 cognition
│       ├── evolution/   # successor-based strategy evolution
│       ├── providers/   # storage contracts + reference provider
│       ├── envelope/    # deterministic replay envelope + reference ops
│       ├── adapters/    # provider adapters (memevolve cartridge)
│       └── models/      # beliefs, strategies, trajectories, cognitive + thought records
├── reference_service/   # runnable CLI/FastAPI demo (separate package, optional)
├── tests/
│   ├── unit/            # unit tests for kernel modules
│   ├── contract/        # contract tests consumers re-run against their impls
│   └── integration/     # end-to-end composition tests across subsystems
└── conductor/           # spec-driven development docs and tracks
```

## Consuming Elume

Downstream projects pin a versioned PyPI release:

```bash
pip install elume==0.4.0
```

For co-development against an unreleased branch, an editable install also works:

```bash
# from the consumer repo (e.g. dionysus3)
pip install -e /path/to/elume
```

## Principles

- **Integration, not invention.** The underlying techniques are open source or openly published; Elume's work is bringing them together.
- **Kernel, not application.** Reusable mechanism only. Adapters and policies live in consumers.
- **No framework lock-in.** No FastAPI, no Graphiti, no agent runtime in the core.
- **Pluggable storage.** Providers are contracts, not implementations.
- **Reproducible.** Deterministic where possible; evolution randomness goes through an injectable RNG.
- **Contract tests as the regression net.** Consumers re-run `tests/contract/` against their provider implementations.
- **The past is frozen.** Trajectory records, belief snapshots, and basin activations are immutable. Strategies evolve by producing successors, not by mutating in place.
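The last principle is the successor discipline in one sketch. Field and method names here are illustrative stand-ins, assuming Python's frozen dataclasses and `dataclasses.replace`, not Elume's actual `Strategy` record.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Strategy:
    name: str
    generation: int = 0
    params: tuple = ()

    def evolved(self, **changes) -> "Strategy":
        # Produce a successor record; the original stays immutable.
        return replace(self, generation=self.generation + 1, **changes)

s0 = Strategy(name="recall-v1", params=(0.5,))
s1 = s0.evolved(params=(0.7,))
assert s0.generation == 0 and s1.generation == 1   # lineage preserved
assert s0.params == (0.5,)                         # the past is frozen
```

Because every change yields a new record, lineage is a chain of immutable snapshots, which is what makes byte-identical replay of an evolution run possible at all.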

## On the name

**Elume** is the brand form. **ELUME** works as an acronym mnemonic — Evolvable Long-horizon Unified Mental-model Engine.

For public descriptors:
- **Short:** *Cognitive Substrate for Agents*
- **Memory-first framing (for SaaS/category fit):** *Agentic Memory with Mental-Model Architecture*
- **Technical long form:** *Deterministic, Replay-Safe Substrate for Cognitive-Architecture Agents*
- **Tagline:** *Agentic memory with mental-model architecture — a deterministic, replay-safe substrate for cognitive-architecture agents.*

## License

MIT. Compatible with Context-Engineering's MIT license and all upstream components.

See [`ATTRIBUTION.md`](./ATTRIBUTION.md) and [`conductor/product.md`](./conductor/product.md) for the full attribution and product specifications.
