Metadata-Version: 2.4
Name: flowra
Version: 0.0.16
Summary: Flowra — flow infrastructure for building stateful LLM agents
Project-URL: Repository, https://github.com/anna-money/flowra
Project-URL: Changelog, https://github.com/anna-money/flowra/blob/master/CHANGELOG.md
Project-URL: Author, https://github.com/spaceorc
Author: Ivan Dashkevich
License-Expression: MIT
License-File: LICENSE
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Classifier: Topic :: Software Development :: Libraries
Requires-Python: >=3.12
Requires-Dist: httpx>=0.28
Requires-Dist: jsonschema>=4.26
Requires-Dist: marshmallow-recipe>=0.0.85
Provides-Extra: all
Requires-Dist: anthropic[vertex]; extra == 'all'
Requires-Dist: google-genai; extra == 'all'
Requires-Dist: openai; extra == 'all'
Provides-Extra: anthropic
Requires-Dist: anthropic[vertex]; extra == 'anthropic'
Provides-Extra: google
Requires-Dist: google-genai; extra == 'google'
Provides-Extra: mlflow
Requires-Dist: mlflow>=3.1; extra == 'mlflow'
Provides-Extra: openai
Requires-Dist: openai; extra == 'openai'
Description-Content-Type: text/markdown

# Flowra

[![PyPI](https://img.shields.io/pypi/v/flowra)](https://pypi.org/project/flowra/)
[![Python](https://img.shields.io/pypi/pyversions/flowra)](https://pypi.org/project/flowra/)
[![License](https://img.shields.io/pypi/l/flowra)](https://github.com/anna-money/flowra/blob/master/LICENSE)
[![CI](https://github.com/anna-money/flowra/actions/workflows/master.yml/badge.svg)](https://github.com/anna-money/flowra/actions/workflows/master.yml)

**Flow infrastructure** for building stateful, persistent LLM agents with tool use,
parallel execution, and crash recovery. Requires Python 3.12+.

## Features

- **State machine agents** — define agents as `Agent[Spec, Result]` classes with
  `@step` methods, a single entry point, and typed spec/result contracts
- **Persistent state** — `Scalar[T]` and `AppendOnlyList[T]` with incremental
  dirty-tracking and pluggable storage (in-memory, file-based, or custom)
- **Tool integration** — `@tool` decorator for local functions, MCP server support,
  DI into tool handlers, agents as tools for LLM-driven delegation
- **LLM abstraction** — provider-agnostic `LLMProvider` interface with immutable
  message types and real-time streaming (ships `AnthropicVertexProvider`, `GoogleVertexProvider`, `OpenAIProvider`)
- **Agents as tools** — `@agent_tool` decorator exposes an agent as a tool the
  LLM can call autonomously; the sub-agent runs its own system prompt and tool loop
- **Cooperative interrupts** — `InterruptToken` for graceful cancellation across
  the entire execution tree
- **Pre-built agents** — `ChatAgent` (multi-turn chat with session history) and
  `ToolLoopAgent` (single-turn LLM tool loop with hooks and caching)

## Installation

```bash
# Base package (no LLM providers)
pip install flowra

# With specific providers
pip install "flowra[anthropic]"
pip install "flowra[openai]"
pip install "flowra[google]"

# All providers
pip install "flowra[all]"
```

## Quick start

```python
import asyncio

from flowra.lib.chat import ChatAgent, ChatConfig, ChatResult, ChatSpec
from flowra.lib import LLMConfig
from flowra.llm import LLMProvider, SystemMessage, TextBlock
from flowra.llm.providers.anthropic_vertex import AnthropicVertexProvider
from flowra.agent import AgentRuntime
from flowra.tools import ToolRegistry


async def main() -> None:
    provider = AnthropicVertexProvider()

    async with await ToolRegistry.create([]) as registry:
        config = ChatConfig(
            llm_config=LLMConfig(model="claude-sonnet-4-5@20250929"),
            system_messages=[
                SystemMessage(blocks=[TextBlock(text="You are a helpful assistant.")])
            ],
        )

        runtime = AgentRuntime(
            agents={"chat": ChatAgent},
            services={
                LLMProvider: provider,
                ToolRegistry: registry,
                ChatConfig: config,
            },
        )

        while True:
            user_input = input("You: ")
            if not user_input:
                break

            result = await runtime.run(
                agent=ChatAgent,
                spec=ChatSpec(user_message=user_input),
            )

            if isinstance(result, ChatResult) and result.response:
                print(f"Assistant: {result.response}")


asyncio.run(main())
```

## Package structure

```
flowra/
├── llm/        # LLM abstraction (messages, blocks, provider interface)
├── tools/      # Tool definition, registration, execution
├── agent/      # Agent framework + execution engine + persistence
└── lib/        # Pre-built agents (ChatAgent, ToolLoopAgent, hooks, caching)
```

See [docs/architecture.md](docs/architecture.md) for the full dependency graph and
data flow. Each package has its own documentation in `docs/`.

## Development

```bash
make deps      # install dependencies (uv sync)
make check     # lint + test
make chat      # run interactive console chat example
```

## Documentation

- [Architecture](docs/architecture.md) — package structure and data flow
- [LLM](docs/llm.md) — message types, provider interface
- [Tools](docs/tools.md) — tool definition and execution
- [Agent](docs/agent.md) — agent framework, execution engine, persistence
- [Lib](docs/lib.md) — pre-built agents, hooks, caching
