Metadata-Version: 2.3
Name: veritycog
Version: 0.1.1
Summary: A cognitive memory system for AI agents — neuroscience-inspired, zero dependencies by default
Project-URL: Homepage, https://github.com/bnyhil31-afk/verity
Project-URL: Repository, https://github.com/bnyhil31-afk/verity
Project-URL: Bug Tracker, https://github.com/bnyhil31-afk/verity/issues
Project-URL: Changelog, https://github.com/bnyhil31-afk/verity/blob/main/CHANGELOG.md
Author-email: Claude <noreply@anthropic.com>
License: Apache-2.0
Keywords: ai-agents,cognitive,embeddings,knowledge-graph,llm,memory,neuroscience,rag,sqlite
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.11
Requires-Dist: cryptography>=42.0.0
Requires-Dist: fastapi>=0.115.0
Requires-Dist: fastmcp>=2.0.0
Requires-Dist: httpx>=0.27.0
Requires-Dist: owlrl>=6.0.2
Requires-Dist: pydantic>=2.0.0
Requires-Dist: pyshacl>=0.26.0
Requires-Dist: rdflib>=7.0.0
Requires-Dist: uvicorn[standard]>=0.30.0
Requires-Dist: yake>=0.4.8
Provides-Extra: cognitive
Requires-Dist: hnswlib>=0.8.0; extra == 'cognitive'
Requires-Dist: model2vec>=0.3.0; extra == 'cognitive'
Provides-Extra: connectors
Requires-Dist: dlt>=1.4.0; extra == 'connectors'
Provides-Extra: dev
Requires-Dist: hatch>=1.12.0; extra == 'dev'
Requires-Dist: hypothesis>=6.100.0; extra == 'dev'
Requires-Dist: mypy>=1.10.0; extra == 'dev'
Requires-Dist: pytest-asyncio>=0.23.0; extra == 'dev'
Requires-Dist: pytest-benchmark>=4.0; extra == 'dev'
Requires-Dist: pytest-cov>=5.0.0; extra == 'dev'
Requires-Dist: pytest>=8.0.0; extra == 'dev'
Requires-Dist: ruff>=0.4.0; extra == 'dev'
Provides-Extra: fast
Requires-Dist: oxrdflib>=0.5.0; extra == 'fast'
Requires-Dist: pyoxigraph>=0.3.22; extra == 'fast'
Provides-Extra: full
Requires-Dist: dlt>=1.4.0; extra == 'full'
Requires-Dist: hnswlib>=0.8.0; extra == 'full'
Requires-Dist: model2vec>=0.3.0; extra == 'full'
Requires-Dist: oxrdflib>=0.5.0; extra == 'full'
Requires-Dist: pyoxigraph>=0.3.22; extra == 'full'
Provides-Extra: team
Requires-Dist: asyncpg>=0.29.0; extra == 'team'
Requires-Dist: pgvector>=0.3.0; extra == 'team'
Requires-Dist: sqlalchemy[asyncio]>=2.0.0; extra == 'team'
Description-Content-Type: text/markdown

# Verity

A cognitive memory system for AI agents and applications — inspired by how the brain actually stores, recalls, and forgets.

## What it does

Most AI memory systems treat memory as a database: store a string, retrieve a string. Verity treats memory as cognition. It uses a dual-speed store (fast episodic buffer + slow semantic store), runs a sleep consolidation cycle between sessions to decay, prune, and abstract memories, and applies reconsolidation rules that let memories update without drifting. The result is a memory system that behaves more like a mind than a key-value store.

Three things set Verity apart:

- **Reconsolidation stability:** memories update on access but cannot drift; a four-tier Bayesian system gates every modification.
- **Sleep consolidation:** an offline decay → prune → abstract cycle runs between sessions, mirroring the SO-spindle-ripple cascade.
- **Tiered temporal weighting:** each memory auto-graduates from exponential decay to Bayesian renewal to Hawkes processes as its event history grows, with no configuration.
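The first tier of temporal weighting is plain exponential recency decay. A minimal sketch of that tier (illustrative only; the function name and half-life default are invented here, not Verity's API):

```python
import math

def exponential_recency(age_seconds: float, half_life: float = 86400.0) -> float:
    """Weight a memory by recency: 1.0 when fresh, halving every half_life seconds."""
    return 0.5 ** (age_seconds / half_life)

fresh = exponential_recency(0.0)        # a just-stored memory gets full weight
day_old = exponential_recency(86400.0)  # one half-life later, half the weight
```

The renewal and Hawkes tiers refine this by modeling the gaps between accesses rather than only the time since the last one, which is why they need more event history before they pay off.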

The API is seven methods. Zero configuration. No GPU. No cloud. No API key required. Runs on a Raspberry Pi and in a Kubernetes pod identically. Works with nothing but Python's stdlib, and gets progressively smarter as you add optional packages.
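The "progressively smarter" behavior is a progressive-enhancement pattern: probe for an optional package at import time and fall back to the stdlib path when it is missing. A minimal sketch of that pattern (not Verity's actual detection code; `model2vec` is one of the optional packages listed under the `cognitive` extra):

```python
# Probe for an optional dependency; fall back to stdlib text search if absent.
try:
    import model2vec  # optional: dense embeddings for semantic search
    HAVE_EMBEDDINGS = True
except ImportError:
    HAVE_EMBEDDINGS = False

SEARCH_MODE = "embedding" if HAVE_EMBEDDINGS else "text"  # stdlib fallback
```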

## Install

```bash
pip install veritycog                    # stdlib only — text search
pip install "veritycog[cognitive]"       # + model2vec + hnswlib (recommended)
pip install "veritycog[connectors]"      # + dlt (60+ data sources)
pip install "veritycog[full]"            # everything
```

## Quickstart

```python
from verity import Memory

m = Memory()                                      # SQLite, zero config
m.add("I prefer dark mode")                       # store
m.add("Team standup every day at 9am")
m.add("The API uses JWT authentication")

results = m.search("daily schedule")              # retrieve
for r in results:
    print(r["content"], f"({r['confidence']:.0%} confidence)")

m.update(results[0]["id"], "standup moved to 10am")  # update
m.consolidate()                                   # sleep cycle
m.export()                                        # GDPR portability
m.delete(results[0]["id"])                        # GDPR erasure
```

## The cognitive layer

| Component | Neuroscience model | What it does | Maps to |
|---|---|---|---|
| DualSpeedStore | Complementary Learning Systems | Fast episodic buffer + slow semantic store | SQLite + numpy |
| ImportanceScorer | Predictive Processing | Prediction error as surprise signal | Embedding cosine distance |
| ReconsolidationEngine | Memory Reconsolidation | 4-tier stability prevents drift | Bayesian Beta-Bernoulli |
| ConsolidationCycle | Sleep Consolidation | Decay/prune/abstract between sessions | Scheduled background pass |
| TemporalWeighter | Temporal Point Processes | Auto-selects exponential/renewal/Hawkes | Tiered by event density |
| GlobalWorkspace | Global Workspace Theory | K=5 competitive selection + position-aware output | Mitigates lost-in-middle |
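The ReconsolidationEngine row is the least familiar idea here. The core of a Beta-Bernoulli stability gate can be sketched in a few lines (a hypothetical illustration, not Verity's implementation; the names `StabilityGate`, `observe`, and `allows_update` are invented for this example):

```python
from dataclasses import dataclass

@dataclass
class StabilityGate:
    """Beta-Bernoulli belief over how stable (drift-resistant) a memory is."""
    alpha: float = 1.0  # pseudo-count of confirmations
    beta: float = 1.0   # pseudo-count of contradictions

    @property
    def stability(self) -> float:
        # Posterior mean of the Beta(alpha, beta) distribution.
        return self.alpha / (self.alpha + self.beta)

    def observe(self, confirmed: bool) -> None:
        # Each retrieval outcome updates the belief.
        if confirmed:
            self.alpha += 1.0
        else:
            self.beta += 1.0

    def allows_update(self, threshold: float = 0.8) -> bool:
        # Highly stable memories resist modification; unstable ones accept it.
        return self.stability < threshold

gate = StabilityGate()
for _ in range(8):
    gate.observe(confirmed=True)  # repeated confirmations push stability to 0.9
```

After eight confirmations the gate refuses further rewrites, which is the drift-prevention behavior the table describes: a memory that has been repeatedly validated cannot be silently overwritten by one contradicting access.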

## Advanced: Engine API

`Memory` wraps the lower-level Engine API, which provides the full RELATE/NAVIGATE/GOVERN/REMEMBER loop with connectors, profiles, consent management, and a Merkle-chained audit trail. See [examples/01_personal_notes.py](examples/01_personal_notes.py) for a complete walkthrough.

## For AI agents

```python
from verity import Memory

memory = Memory()

def agent_response(user_input: str) -> str:
    # Recall relevant context
    context = memory.search(user_input, k=5)
    context_str = "\n".join(r["content"] for r in context)

    # Call your LLM with context_str; call_llm is a placeholder for your client
    response = call_llm(user_input, context_str)

    # Remember the interaction
    memory.add(f"User asked: {user_input}")
    return response
```

## License

Apache-2.0
