Metadata-Version: 2.4
Name: autoform
Version: 0.2.0
Summary: Composable function transformations for LLM programs
Requires-Python: >=3.12
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: litellm>=1.80.9
Requires-Dist: optree>=0.18.0
Dynamic: license-file

<div align="center">

# `autoform`

**Trace once. Transform freely.**

Composable function transformations for LM programs.

*Think [JAX](https://github.com/jax-ml/jax), but for LM programs.*

[![Python 3.12+](https://img.shields.io/badge/python-3.12+-blue.svg)](https://www.python.org/downloads/)
[![CI](https://github.com/ASEM000/autoform/actions/workflows/ci.yml/badge.svg)](https://github.com/ASEM000/autoform/actions/workflows/ci.yml)
[![codecov](https://codecov.io/gh/ASEM000/autoform/graph/badge.svg?token=Z0JBHSC3ZK)](https://codecov.io/gh/ASEM000/autoform)

[Quickstart](#quickstart) - [Transforms](#transforms) - [Concurrency](#concurrency) - [Debugging](#debugging) - [Docs](https://autoform.readthedocs.io)

</div>

```bash
pip install git+https://github.com/ASEM000/autoform.git
```

## Quickstart
```python
import autoform as af

def explain(topic: str) -> str:
    prompt = af.format("Explain {} in one paragraph.", topic)
    msg = dict(role="user", content=prompt)
    return af.lm_call([msg], model="gpt-5.2")

ir = af.trace(explain)("...")  # capture structure, no execution
```
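"Trace once" means the program's structure is recorded instead of executed. A toy sketch of the idea using a recorder object (illustrative only, with made-up names; not how `autoform` actually builds its IR):

```python
class Tracer:
    """Records each operation symbolically instead of running it."""

    def __init__(self):
        self.ops = []

    def record(self, name: str, *args):
        self.ops.append((name, args))
        # Return a symbolic placeholder, so later ops can reference this
        # result without any real LM call ever happening.
        return f"<out{len(self.ops)}>"

tracer = Tracer()
prompt = tracer.record("format", "Explain {} in one paragraph.", "<input>")
tracer.record("lm_call", prompt)
print(tracer.ops)
# The recorded op list is the "IR": structure captured, nothing executed.
```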

Now transform it:
```python
# execute
output = ir.call("quantum entanglement")

# batch: n inputs
outputs = af.batch(ir).call(["DNA", "gravity", "recursion"])

# pushforward: propagate input perturbations forward
output, tangent = af.pushforward(ir).call(("quantum entanglement", "add more examples"))

# pullback: propagate output feedback backward
output, grad = af.pullback(ir).call(("quantum entanglement", "too technical"))

# compose: batched differentiation
topics = ["DNA", "gravity", "recursion"]
critiques = ["too technical", "too brief", "too abstract"]
outputs, hints = af.batch(af.pullback(ir)).call((topics, critiques))
```

The last line is the point: with `batch(pullback(ir))`, transformations compose.
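Conceptually, each transform is a higher-order function: it takes a callable and returns a new callable, so transforms nest like ordinary function composition. A minimal sketch in plain Python (the `batch` and `with_feedback` here are illustrative stand-ins, not `autoform`'s implementation):

```python
from typing import Callable

def batch(f: Callable) -> Callable:
    # Lift f over a list of inputs: maps [x1, x2, ...] to [f(x1), f(x2), ...].
    def batched(inputs):
        return [f(x) for x in inputs]
    return batched

def with_feedback(f: Callable) -> Callable:
    # Stand-in for pullback: pair each output with a feedback-derived hint.
    def pulled(pair):
        x, feedback = pair
        return f(x), f"revise {x!r}: {feedback}"
    return pulled

def shout(topic: str) -> str:
    return topic.upper()

# Transforms compose because each returns an ordinary callable.
composed = batch(with_feedback(shout))
print(composed([("dna", "too brief"), ("gravity", "too abstract")]))
```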

## Transforms

| Transform | What it does |
|-----------|--------------|
| `batch` | Vectorize over inputs |
| `pushforward` | Forward-mode AD |
| `pullback` | Reverse-mode AD |
| `sched` | Auto-concurrent execution |

## Concurrency

`sched` analyzes the IR to find independent LM calls; `acall` then runs them concurrently.
```python
scheduled = af.sched(ir)
result = await scheduled.acall("input")  # acall for async
```
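The speedup comes from overlapping independent, network-bound calls. A plain-`asyncio` sketch of the idea (the `fake_lm_call` and its timing are made up for illustration, not `autoform` internals):

```python
import asyncio
import time

async def fake_lm_call(prompt: str) -> str:
    # Stand-in for a network-bound LM call.
    await asyncio.sleep(0.1)
    return prompt.upper()

async def main() -> float:
    start = time.perf_counter()
    # Three independent calls overlap, so total time is ~0.1s, not ~0.3s.
    results = await asyncio.gather(
        fake_lm_call("dna"), fake_lm_call("gravity"), fake_lm_call("recursion")
    )
    assert results == ["DNA", "GRAVITY", "RECURSION"]
    return time.perf_counter() - start

elapsed = asyncio.run(main())
print(f"{elapsed:.2f}s")  # well under the ~0.3s a sequential run would take
```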

## Debugging

Checkpoint intermediate values. Substitute on re-execution.
```python
def pipeline(x: str) -> str:
    msg1 = dict(role="user", content=x)
    step1 = af.lm_call([msg1], model="gpt-5.2")
    step1 = af.checkpoint(step1, key="step1", collection="debug")
    
    msg2 = dict(role="user", content=step1)
    step2 = af.lm_call([msg2], model="gpt-5.2")
    return step2

ir = af.trace(pipeline)("...")

# capture
with af.collect(collection="debug") as captured:
    result = ir.call("input")

# substitute step1 value
with af.inject(collection="debug", values=dict(step1=["modified"])):
    result = ir.call("input")
```
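The collect/inject pattern generalizes a simple idea: a context-managed store that records intermediate values on one run and substitutes them on the next. A minimal dict-backed sketch (plain Python, illustrative only; not `autoform`'s implementation):

```python
from contextlib import contextmanager

_store = {"mode": None, "values": {}}

@contextmanager
def collect(values: dict):
    # Record every checkpointed value into `values` during the run.
    _store["mode"], _store["values"] = "collect", values
    try:
        yield values
    finally:
        _store["mode"] = None

@contextmanager
def inject(values: dict):
    # Substitute checkpointed values from `values` during the run.
    _store["mode"], _store["values"] = "inject", values
    try:
        yield
    finally:
        _store["mode"] = None

def checkpoint(value, key: str):
    if _store["mode"] == "collect":
        _store["values"][key] = value  # capture the intermediate
    elif _store["mode"] == "inject" and key in _store["values"]:
        return _store["values"][key]   # override on re-execution
    return value

def pipeline(x: str) -> str:
    step1 = checkpoint(x.upper(), key="step1")
    return step1 + "!"

with collect({}) as captured:
    assert pipeline("hi") == "HI!" and captured == {"step1": "HI"}

with inject({"step1": "SWAPPED"}):
    assert pipeline("hi") == "SWAPPED!"
```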

---

> ⚠️ **Early development**: API may change.
