Metadata-Version: 2.4
Name: zigrlm
Version: 0.1.0
Summary: Python adapter for zigrlm, a Zig runtime for Recursive Language Model workflows.
Author: Hunter Bown
License-Expression: MIT
Project-URL: Homepage, https://github.com/Hmbown/zigrlm
Project-URL: Repository, https://github.com/Hmbown/zigrlm
Project-URL: Upstream RLM, https://github.com/alexzhang13/rlm
Project-URL: Issues, https://github.com/Hmbown/zigrlm/issues
Keywords: agents,dspy,language-models,recursive-language-models,rlm,zig
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Dynamic: license-file

# zigrlm

`zigrlm` is an experimental Zig runtime for Recursive Language Model (RLM)
workflows. It is directly inspired by
[`alexzhang13/rlm`](https://github.com/alexzhang13/rlm), the Python reference
implementation for Recursive Language Models.

Use `zigrlm` when you want one small CLI process to own an RLM run: the root
prompt, recursive child calls, bounded local compute, parallel fanout, trace
events, and usage accounting.

## Install

The Python adapter is published as `zigrlm`:

```sh
pip install zigrlm
```

The Zig CLI is built from source:

```sh
git clone https://github.com/Hmbown/zigrlm.git
cd zigrlm
zig build
zig-out/bin/zigrlm --help
```

The Python adapter shells out to the Zig binary. After a source build, it finds
`zig-out/bin/zigrlm` automatically when run from this checkout. In other
layouts, put the binary on `PATH` or set:

```sh
export ZIGRLM_BIN=/absolute/path/to/zigrlm
```

## Relationship to RLM

The upstream RLM project is the conceptual reference:

- Upstream: <https://github.com/alexzhang13/rlm>
- Paper, blog, and docs are linked from that repository.
- Upstream package: `rlms`, imported as `rlm`.

`zigrlm` is not a fork of that Python codebase. It reimplements the core
control loop in Zig and keeps the command shapes explicit:

- flat proxy commands for single model completions;
- recursive commands for RLM workflows;
- a `dszig` command for DSPy-style signature experiments;
- JSONL trace and metadata output for inspection.

If you want the full Python ecosystem, sandbox matrix, and upstream API, use
`alexzhang13/rlm`. If you want a compact Zig CLI for auditable local recursive
fanout, try `zigrlm`.

## Quick Start

Run no-network checks:

```sh
zig build test
zig build run -- demo
zig build run -- echo "hello"
zig build run -- echo --trace /tmp/zigrlm-echo-trace.jsonl "hello"
```

Run a plain OpenAI completion through the flat proxy:

```sh
printf 'Return exactly OK.' \
  | zig-out/bin/zigrlm openai-proxy --model gpt-5.4
```

Run a recursive Codex workflow:

````sh
zig-out/bin/zigrlm cli-codex --timeout-ms 600000 'Solve via a child:
```repl
rlm_query child = "Answer this briefly: " + context
FINAL_VAR(child)
```'
````

Run parallel child calls inside one RLM process:

````sh
zig-out/bin/zigrlm cli-codex --timeout-ms 600000 'Fan out:
```repl
rlm_query_batched answers = "Task A" | "Task B" | "Task C"
FINAL_VAR(answers)
```'
````

## Command Families

Keep flat completions and recursive workflows separate.

| Command | Shape | Use when |
| --- | --- | --- |
| `openai-proxy` | stdin to stdout | One plain OpenAI Responses completion. |
| `codex-proxy` | stdin to stdout | One plain Codex CLI completion. |
| `claude-proxy` | stdin to stdout | One plain Claude CLI completion. |
| `cli-openai` | recursive RLM | Root and child calls go through `openai-proxy`. |
| `cli-codex` | recursive RLM | Root and child calls go through `codex-proxy`. |
| `cli-claude` | recursive RLM | Root and child calls go through `claude-proxy`. |
| `zai` | recursive RLM | Root and child calls use Z.ai chat completions. |
| `cli` | recursive RLM | Root and child calls use custom stdin/stdout commands. |
| `dszig` | DSPy-style RLM | Signature inputs become Python variables and outputs use `SUBMIT(...)`. |
| `echo`, `demo` | diagnostic | No-network runtime checks. |

Rule of thumb:

```text
single completion?       -> *-proxy
workflow with subcalls?  -> cli-<backend>, zai, or cli
parallel children?       -> rlm_query_batched inside one recursive command
custom backend?          -> cli with ZIGRLM_MAIN_CMD / ZIGRLM_RLM_CMD
DSPy-style signature?    -> dszig
```
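
Because every `*-proxy` command reads the prompt on stdin and writes the completion on stdout, driving one from another program is a plain pipe. A hedged Python sketch of that shape (the `run_proxy` helper is hypothetical, not part of the adapter):

```python
import subprocess

def run_proxy(cmd: list[str], prompt: str, timeout: float = 600.0) -> str:
    """Pipe a prompt into a stdin-to-stdout command and return its
    stdout as text. Works for any command with the *-proxy shape."""
    result = subprocess.run(
        cmd,
        input=prompt,
        capture_output=True,
        text=True,
        timeout=timeout,
        check=True,
    )
    return result.stdout

# With a built binary and an API key configured, this mirrors the
# Quick Start example:
# answer = run_proxy(
#     ["zig-out/bin/zigrlm", "openai-proxy", "--model", "gpt-5.4"],
#     "Return exactly OK.",
# )
```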

## The `repl` DSL

Recursive commands ask the root model to return fenced `repl` blocks:

````text
```repl
let name = "text" + context
set name = expression
print(expression)
js name = "const n = context.length; FINAL(String(n));"
llm_query name = expression
rlm_query name = expression
llm_query_batched name = expr | expr | ...
rlm_query_batched name = expr | expr | ...
FINAL(expression)
FINAL_VAR(name)
```
````

Important details:

- `context` is the original user prompt for the current RLM frame.
- Only fences tagged exactly as `repl` are executed.
- `llm_query` is a direct child model call.
- `rlm_query` starts a child RLM loop, unless the depth limit has been reached.
- Batched query forms run independent calls concurrently and join responses as
  indexed blocks.
- A block should end with `FINAL(...)` or `FINAL_VAR(...)`.
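
Putting these forms together, a root model response might contain a block like the following (a hypothetical composition of the forms above; the child's actual answer depends on the model):

````text
```repl
let question = "Summarize in one sentence: " + context
llm_query summary = question
FINAL_VAR(summary)
```
````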

## DSzig Python Adapter

The PyPI package exposes a small Python adapter:

```python
from dszig import RLM

rlm = RLM(
    "question: str -> answer: str",
    main_cmd="python3 -c 'print(\"```python\"); print(\"SUBMIT(answer=question.upper())\"); print(\"```\")'",
    rlm_cmd="/bin/cat",
)

result = rlm(question="alpha")
assert result.answer == "ALPHA"
```

You can also import the same adapter through the package name:

```python
from zigrlm import RLM
```

If you already have real DSPy imported and want to patch it explicitly:

```python
import dspy
from dszig.dspy_compat import install

install(dspy)
```

The local `python/dspy` package in this repository is only a smoke-test shim.
It is not the full Stanford DSPy package and is not shipped as part of the PyPI
wheel.

## Runtime Flags

Recursive commands support:

```text
--file PATH
--stdin
--trace PATH
--metadata PATH
--timeout-ms N
--max-depth N
--max-iterations N
--max-concurrent-subcalls N
--max-runtime-ms N
--max-calls N
--max-input-bytes N
--max-output-bytes N
--max-consecutive-errors N
--environment script|python|docker-python
--python-bin PATH
--docker-bin PATH
--docker-image IMAGE
--docker-setup-timeout-ms N
--persistent
--compaction
--compaction-threshold-bytes N
```

Prefer trace paths under `/tmp`, for example:

```sh
--trace /tmp/zigrlm-my-run.jsonl
```

Traces can include prompt and model-output previews. Do not commit ad hoc
traces that contain private prompts, outputs, or keys.
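
Because traces are JSONL, one JSON object per line, inspecting a run needs no special tooling. A minimal reader sketch that assumes nothing about the event schema beyond valid JSON per line:

```python
import json

def load_trace(path: str) -> list[dict]:
    """Read a JSONL trace file into a list of event dicts,
    skipping blank lines."""
    events = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                events.append(json.loads(line))
    return events
```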

## Configuration

Configuration is read from process environment first, then from a local `.env`.
Do not commit `.env`; it normally contains live keys.

Relevant variables:

```text
OPENAI_API_KEY
OPENAI_BASE_URL
OPENAI_MAIN_MODEL
OPENAI_RLM_MODEL

CODEX_BIN
CODEX_MAIN_MODEL
CODEX_RLM_MODEL

CLAUDE_BIN
CLAUDE_MAIN_MODEL
CLAUDE_RLM_MODEL
CLAUDE_EFFORT
CLAUDE_MAIN_EFFORT
CLAUDE_RLM_EFFORT

ZAI_CODING_API_KEY
ZAI_CODING_BASE_URL
ZAI_MAIN_MODEL
ZAI_RLM_MODEL

ZIGRLM_BIN
ZIGRLM_MAIN_CMD
ZIGRLM_RLM_CMD
```

Generic `cli` backends should read the full prompt from stdin and write only the
completion text to stdout.
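
A minimal conforming backend is a few lines of Python. This sketch just upper-cases the prompt, which is illustrative rather than useful, but is enough to exercise the `cli` plumbing without network access:

```python
#!/usr/bin/env python3
"""Trivial zigrlm `cli` backend: read the full prompt from stdin,
write only the completion text to stdout."""
import sys

def complete(prompt: str) -> str:
    # Stand-in "model": echo the prompt back in upper case.
    return prompt.upper()

if __name__ == "__main__":
    sys.stdout.write(complete(sys.stdin.read()))
```

Pointing `ZIGRLM_MAIN_CMD` and `ZIGRLM_RLM_CMD` at a script like this is one way to smoke-test the `cli` command offline.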

## Project Layout

- `src/main.zig`: CLI commands, provider wrappers, runtime option parsing.
- `src/rlm.zig`: RLM loop, child calls, batching, usage, JSONL tracing.
- `src/dspy_rlm.zig`: DSPy-style signature runner with `SUBMIT(...)`.
- `src/env.zig`: `repl` DSL execution and Node `vm` bridge.
- `src/python_env.zig`: Python REPL subprocess with host callbacks.
- `src/docker_env.zig`: Docker-backed Python REPL with host callbacks.
- `python/dszig`: Python adapter for `zigrlm dszig`.
- `docs/RLM_PARITY.md`: current parity notes against upstream RLM/DSPy ideas.

## Development

Run the main checks:

```sh
zig build test
zig build
zig test src/dspy_rlm.zig
zig test src/python_env.zig
python3 -m py_compile scripts/compare_rlm_main.py scripts/run_oolong.py python/dszig/*.py python/dspy/*.py python/dspy/predict/*.py python/tests/*.py
PYTHONPATH=python python3 -m unittest discover -s python/tests -v
```

Build and validate the Python package:

```sh
python3 -m build
python3 -m twine check --strict dist/*
```

## Safety Notes

- The JavaScript sandbox uses Node `vm`. Treat it as trusted local compute, not
  a hard security boundary.
- `claude-proxy` disables Claude Code tools by default. Use `--allow-tools` only
  when tool access is intentional.
- `--environment docker-python` runs Python in a container with `--network none`
  by default, but the host process still controls model calls and mounted state.
- `--timeout-ms` bounds individual subprocess calls, not the whole recursive
  run.

## License

MIT. See [`LICENSE`](LICENSE).
