Metadata-Version: 2.4
Name: agentgc
Version: 0.0.4
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Rust
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Operating System :: OS Independent
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Summary: Stateless memory-compression layer for autonomous AI agents
Keywords: llm,agent,memory,rag,consolidate,compress
Author-email: Carrick Cheah <carrick.cheah@green-methods.com>
License: Apache-2.0
Requires-Python: >=3.10
Description-Content-Type: text/markdown; charset=UTF-8; variant=GFM
Project-URL: Homepage, https://github.com/carrickcheah/agentgc
Project-URL: Issues, https://github.com/carrickcheah/agentgc/issues
Project-URL: Repository, https://github.com/carrickcheah/agentgc

# agentgc-bindings

FFI surface for AgentGC. Python bindings (via uniffi) are available today;
TypeScript bindings, via uniffi-bindgen-react-native or napi-rs, will follow.

## What's exposed

A concrete-type binding: `SqliteStorage` + `AzureOpenAIClient`. Other
storage/LLM combinations need their own constructor variants (or
future binding crates), because uniffi works best with concrete types
and AgentGC's native async-fn-in-trait (AFIT) traits aren't directly
object-safe.

The Python surface mirrors `crates::agentgc_core::AgentGC`'s methods:

```python
import asyncio
from agentgc import AgentGC

async def main():
    # Async alternate constructor — Python's __init__ can't be async,
    # so uniffi maps the async constructor to AgentGC.open(...).
    gc = await AgentGC.open(
        sqlite_url="sqlite:agent.db",
        azure_endpoint="https://your-resource.cognitiveservices.azure.com",
        azure_api_key="...",
        azure_deployment="gpt-5.5",
        azure_api_version="2025-04-01-preview",
        model="gpt-5.5",
    )

    memory = await gc.extract("user:alice", "I prefer dark mode")
    if memory is not None:
        print(memory.content)

    matches = await gc.retrieve("user:alice", "dark", 10)  # top-10 matches for the query
    state = await gc.consolidate("main", "noisy log here...")
    print(state.learned_rules)
    if state.active_task is not None:
        print(state.active_task.intent, state.active_task.status)

asyncio.run(main())
```

## Build the Python wheel

We use [maturin](https://www.maturin.rs/) (recommended for Rust+Python
projects). From this directory:

```bash
# uv environment with maturin
uv venv
uv pip install maturin

# Build a wheel for the current platform
uv run maturin build --release

# Or develop-install for iteration
uv run maturin develop --release
```

The wheel lands in `target/wheels/`. Install with:

```bash
uv pip install target/wheels/agentgc-*.whl
```
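To confirm the install worked, a quick sanity check (a minimal sketch; it only assumes the wheel installs a module named `agentgc`) verifies that Python can locate the package without actually loading the native library:

```python
import importlib.util

# Look up the installed module without importing it, so a broken
# native library doesn't crash the check itself.
spec = importlib.util.find_spec("agentgc")
print("agentgc found at:", spec.origin if spec else "NOT INSTALLED")
```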

## Generate UDL bindings manually

If you need raw uniffi bindings without maturin's wheel pipeline:

```bash
# Build the dylib first
cargo build --release

# Generate Python bindings from the UDL
cargo run --bin uniffi-bindgen -- \
    generate src/agentgc.udl \
    --language python \
    --out-dir bindings/python
```

This step produces `agentgc.py`; at runtime it loads the `.so` /
`.dylib` / `.dll` that `cargo build` produced, so that library must be
discoverable, for example via `LD_LIBRARY_PATH` (`DYLD_LIBRARY_PATH`
on macOS) or by placing it next to the generated module.
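When copying the compiled library next to the generated `agentgc.py`, the expected filename differs per platform (cargo prefixes `lib` on Unix-like systems). A small illustrative helper, not part of AgentGC itself, maps a crate name to that filename:

```python
import sys

def native_lib_name(base: str) -> str:
    """Return the platform-specific shared-library filename for a cdylib crate."""
    if sys.platform == "darwin":
        return f"lib{base}.dylib"   # macOS
    if sys.platform.startswith("win"):
        return f"{base}.dll"        # Windows (no lib prefix)
    return f"lib{base}.so"          # Linux and other Unix

print(native_lib_name("agentgc"))
```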

## Status

- ✅ Crate scaffolding, UDL definition, Rust wrappers compile.
- ⏳ Python wheel build verification (requires `maturin` + Python env).
- ⏳ TypeScript bindings (defer until Python is in users' hands).
- ⏳ Integration tests against a real Azure OpenAI key (manual smoke test).

