Metadata-Version: 2.4
Name: aurra-langchain
Version: 0.1.0
Summary: LangChain integration for Aurra — memory infrastructure with citations, audit trails, and bi-temporal versioning
Author-email: Aurra <support@aurra.us>
License: MIT
Project-URL: Homepage, https://aurra.us
Project-URL: Documentation, https://docs.aurra.us
Project-URL: Repository, https://github.com/trivediakshay2012/aurra
Keywords: langchain,aurra,memory,ai,agents,llm,retrieval
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.9
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: aurra>=0.5.0
Requires-Dist: langchain-core>=1.0.0
Provides-Extra: dev
Requires-Dist: pytest>=7.0; extra == "dev"
Requires-Dist: pytest-mock>=3.10; extra == "dev"
Dynamic: license-file

# aurra-langchain

LangChain integration for [Aurra](https://aurra.us) - memory infrastructure for AI agents.

Drop-in `BaseChatMessageHistory` that gives your LangChain agents:

- **Bi-temporal versioning** - memories know what was true, when, and who superseded them
- **Citation-grounded retrieval** - every retrieved fact carries its source
- **Multi-tenant isolation** - scope memories per-user without index sprawl
- **Auto-supersession** - Aurra's classifier detects when a new fact replaces an old one

Targets **LangChain 1.0+** (uses the `RunnableWithMessageHistory` pattern).

## Installation

```bash
pip install aurra-langchain
```

## Usage

```python
from aurra_langchain import AurraChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_anthropic import ChatAnthropic

# Wire up history backed by Aurra
def get_session_history(session_id: str):
    return AurraChatMessageHistory(
        api_key="aurra_...",
        session_id=session_id,
        tenant_id="acme-corp",  # optional
    )

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant with memory."),
    MessagesPlaceholder("history"),
    ("human", "{input}"),
])

chain = prompt | ChatAnthropic(model="claude-haiku-4-5-20251001")

with_history = RunnableWithMessageHistory(
    chain,
    get_session_history,
    input_messages_key="input",
    history_messages_key="history",
)

# Each turn writes to Aurra automatically
with_history.invoke(
    {"input": "My favorite coffee is Stumptown."},
    config={"configurable": {"session_id": "user-42-conv-7"}},
)
```

## Direct API

You can also use it standalone, without `RunnableWithMessageHistory`:

```python
history = AurraChatMessageHistory(
    api_key="aurra_...",
    session_id="user-42-conv-7",
)
history.add_user_message("My favorite coffee is Stumptown.")
history.add_ai_message("Got it.")
print(history.messages)  # list of messages, oldest first
```

## Configuration

| Param | Type | Default | Description |
|---|---|---|---|
| `api_key` | str | required | Your Aurra API key (from app.aurra.us) |
| `session_id` | str | required | Conversation identifier (groups extracted memories) |
| `tenant_id` | str | None | Multi-tenant scope (optional) |
| `base_url` | str | https://api.aurra.us | Override for self-hosted/staging |
| `auto_supersede` | bool | None | Override auto-supersession; when None, falls back to the default configured for your API key |
| `max_messages_returned` | int | 50 | Cap on the length of the `messages` list |

## What gets stored

Each `add_user_message` / `add_ai_message` call buffers the turn. When both
sides of an exchange are present, the pair is sent to Aurra's
`/agent/memories` endpoint in messages mode. Aurra's extractor LLM atomizes
the exchange into individual factual memories.
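The pairing behavior can be sketched as follows. This is a simplified stand-in, not the actual client: the class name, internals, and the list standing in for the `/agent/memories` POST are all illustrative.

```python
class PairBuffer:
    """Illustrative sketch: buffer turns, flush complete (user, ai) pairs."""

    def __init__(self):
        self._pending_user = None
        self.sent = []  # stands in for POSTs to /agent/memories

    def add_user_message(self, text):
        self._pending_user = text

    def add_ai_message(self, text):
        if self._pending_user is not None:
            # Both sides of the exchange are present: ship the pair
            # as one payload so the extractor sees the full context.
            self.sent.append({"messages": [
                {"role": "user", "content": self._pending_user},
                {"role": "assistant", "content": text},
            ]})
            self._pending_user = None


buf = PairBuffer()
buf.add_user_message("My favorite coffee is Stumptown.")
buf.add_ai_message("Got it.")
print(len(buf.sent))  # one paired exchange sent
```

A lone user message without a matching AI reply stays buffered, which is why nothing is persisted until the exchange completes.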

The `messages` getter pulls memories back as `HumanMessage` / `AIMessage`
objects in chronological order, reconstructed from each memory's
`original_message` field.
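The read path can be sketched like this. The record shape is hypothetical (only `original_message` is named above; `role` and `created_at` are assumed fields), and plain tuples stand in for `HumanMessage` / `AIMessage`.

```python
# Hypothetical memory records as they might come back from Aurra.
records = [
    {"role": "assistant", "created_at": 2, "original_message": "Got it."},
    {"role": "user", "created_at": 1,
     "original_message": "My favorite coffee is Stumptown."},
]

def to_messages(records):
    """Rebuild a chronological transcript from memory records."""
    ordered = sorted(records, key=lambda r: r["created_at"])
    # In the real integration each tuple would become a
    # HumanMessage or AIMessage from langchain_core.
    return [(r["role"], r["original_message"]) for r in ordered]

print(to_messages(records))
```

Sorting by a timestamp before converting is what guarantees the oldest-first order documented under Direct API.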

## Lossiness note

Aurra is a fact store, not a verbatim message store. Round-tripping messages
is approximate: `original_message` is preserved up to 500 chars per turn, and
extracted memories may have refined wording. If you need 100% faithful
conversation replay, use a different LangChain backend.
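To make the 500-char cap concrete, a minimal sketch, assuming simple head truncation (the actual truncation strategy is not specified here):

```python
LIMIT = 500  # per-turn cap on preserved original_message text

def preserve(text, limit=LIMIT):
    """Illustrative: keep only the first `limit` characters of a turn."""
    return text[:limit]

long_turn = "x" * 1200
print(len(preserve(long_turn)))   # capped at 500
print(preserve("Got it."))        # short turns pass through unchanged
```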

## Limitations (v0.1.0)

- Streaming not yet supported (PRs welcome)
- Tool integration deferred to v0.2.0
- Retriever class (separate from history) deferred to v0.2.0
- The legacy `BaseMemory` pattern (LangChain 0.x) is not supported - use
  `RunnableWithMessageHistory` with `AurraChatMessageHistory` instead

## License

MIT.
