Metadata-Version: 2.4
Name: metatotai
Version: 0.1.0
Summary: Active-inference-driven meta tree-of-thought planning engine. Public protocol surface (LLMProvider, TraceSink, WorldModelProvider, MemoryProvider, ThoughtseedPromoter, DecisionLearner). Engine implementation arrives in 0.2.0. Distinct from kyegomez/Meta-Tree-Of-Thoughts.
Author-email: Mani Saint-Victor <drmani215@gmail.com>
Maintainer-email: Mani Saint-Victor <drmani215@gmail.com>
License-Expression: MIT
Project-URL: Homepage, https://github.com/bionicbutterfly13/metatotai
Keywords: active-inference,tree-of-thought,meta-tot,free-energy,planning,agent
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.11
Description-Content-Type: text/markdown
License-File: LICENSE
Dynamic: license-file

# metatotai

> **Status: 0.1.0 publishes the protocol surface only. The engine implementation arrives in 0.2.0.**

`metatotai` will be an active-inference-driven meta tree-of-thought planning engine — a deterministic, replay-safe decision layer that consumes [`elume`](https://pypi.org/project/elume/)'s cognitive substrate and [`linoss-dynamics`](https://pypi.org/project/linoss-dynamics/) physics primitives.

## What this will be

- **Active-inference engine** — variational free energy + expected free energy computation, belief updates, policy scoring.
- **Meta tree-of-thought planner** — search and expansion over reasoning trajectories, scored by free-energy minimization rather than LLM-graded floats.
- **Theory of mind** — `PartnerModel` for inferring other agents' beliefs from observed behavior, built on `elume.MentalModel`.
- **Provider-injected** — `LLMProvider`, `MemoryProvider`, `WorldModelProvider`, `TraceSink`, `ThoughtseedPromoter`, and `DecisionLearner` protocols. Bring your own backend.
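The protocol names above come from the package summary; their method signatures are not yet published. A minimal sketch of what provider injection could look like with `typing.Protocol`, using illustrative method names (`complete`, `record`) that are assumptions, not the real API:

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class LLMProvider(Protocol):
    """Hypothetical shape: any backend that can complete a prompt."""
    def complete(self, prompt: str) -> str: ...

@runtime_checkable
class TraceSink(Protocol):
    """Hypothetical shape: any backend that can record a trace event."""
    def record(self, event: dict) -> None: ...

class EchoLLM:
    """A trivial stand-in backend; it satisfies LLMProvider structurally,
    with no inheritance required."""
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"
```

Because these are structural (`Protocol`) types, any object with matching methods qualifies — that is what makes the layer replay-safe and backend-agnostic.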

## What this is not

- **Not** a fork of [`kyegomez/Meta-Tree-Of-Thoughts`](https://github.com/kyegomez/Meta-Tree-Of-Thoughts). That is a LangChain-based prompt-rewriting meta-agent over LLM scoring; this is a different system. Zero shared code. Distinct name to avoid confusion.
- **Not** the original Tree of Thoughts (Yao et al. 2023). Different mechanism.

## Layering

```
linoss-dynamics  ← physics primitive (NumPy)
       ↑
   elume         ← cognitive substrate (mental models, basins, evolution)
       ↑
   metatotai     ← active-inference + meta-ToT planning
```
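Free-energy-scored planning sits at the top of this stack. The scoring idea itself is standard discrete active inference, not `metatotai` code; a generic sketch of expected free energy (risk plus ambiguity) and the resulting policy posterior, with all function names being illustrative:

```python
import math

def _kl(p: list[float], q: list[float]) -> float:
    """KL divergence D[p || q] between discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def _entropy(p: list[float]) -> float:
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def expected_free_energy(q_states, preferred_obs, likelihood):
    """G = risk + ambiguity for one candidate policy.

    q_states      -- predicted hidden-state distribution under the policy
    preferred_obs -- prior preference over observations (the 'C' vector)
    likelihood    -- likelihood[o][s] = p(o | s) (the 'A' matrix)
    """
    n_obs, n_states = len(likelihood), len(q_states)
    # Predicted observation distribution: q(o) = sum_s p(o|s) q(s)
    q_obs = [sum(likelihood[o][s] * q_states[s] for s in range(n_states))
             for o in range(n_obs)]
    risk = _kl(q_obs, preferred_obs)
    # Ambiguity: expected outcome entropy under the predicted states
    ambiguity = sum(q_states[s] * _entropy([likelihood[o][s] for o in range(n_obs)])
                    for s in range(n_states))
    return risk + ambiguity

def softmax_policy(efes, gamma: float = 1.0):
    """Policy posterior q(pi) proportional to exp(-gamma * G(pi))."""
    w = [math.exp(-gamma * g) for g in efes]
    z = sum(w)
    return [wi / z for wi in w]
```

Policies whose predicted outcomes match the preferences (low risk) and whose states map unambiguously to observations (low ambiguity) get the most posterior mass — this is the deterministic alternative to LLM-graded floats mentioned above.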

## Install

0.1.0 ships the protocol surface; the engine lands in 0.2.0.

```bash
pip install metatotai
```

## Citations

`metatotai`'s active-inference implementation adopts the conceptual framing of Karl Friston's free-energy principle. The implementation is original Python code; only that conceptual framing is shared with the upstream sources.

**Foundational active inference:**
- Friston, K. J. (2010). The free-energy principle: a unified brain theory? *Nature Reviews Neuroscience*, 11(2), 127–138. https://doi.org/10.1038/nrn2787
- Friston, K., FitzGerald, T., Rigoli, F., Schwartenbeck, P., & Pezzulo, G. (2017). Active Inference: A Process Theory. *Neural Computation*, 29(1), 1–49. https://doi.org/10.1162/NECO_a_00912
- Parr, T., Pezzulo, G., & Friston, K. J. (2022). *Active Inference: The Free Energy Principle in Mind, Brain, and Behavior*. MIT Press. https://mitpress.mit.edu/9780262045353/

**Reference implementations consulted (no shared code):**
- **pymdp** — Heins, C., Millidge, B., Demekas, D., Klein, B., Friston, K., Couzin, I. D., & Tschantz, A. (2022). pymdp: A Python library for active inference in discrete state spaces. *Journal of Open Source Software*. https://github.com/infer-actively/pymdp (MIT)
- **ActiveInference.jl** — Nehrer, S. M. et al., Julia package for active inference and POMDP modeling. https://github.com/samuelnehrer02/ActiveInference.jl (MIT)

**Tree-of-thought lineage:**
- Yao, S., Yu, D., Zhao, J., Shafran, I., Griffiths, T. L., Cao, Y., & Narasimhan, K. (2023). Tree of Thoughts: Deliberate Problem Solving with Large Language Models. *NeurIPS 2023*. https://arxiv.org/abs/2305.10601

**Substrate dependencies (used at runtime):**
- [`elume`](https://pypi.org/project/elume/) — cognitive substrate (mental models, basins, evolution)
- [`linoss-dynamics`](https://pypi.org/project/linoss-dynamics/) — LinOSS physics primitive (Rusch & Rus, ICLR 2025)

**Distinct from (zero shared code):**
- [`kyegomez/Meta-Tree-Of-Thoughts`](https://github.com/kyegomez/Meta-Tree-Of-Thoughts) — LangChain-based prompt-rewriting meta-agent over LLM scoring. Different mechanism, different stack. Distinct name to avoid academic and licensing confusion.

## License

MIT.
