Metadata-Version: 2.3
Name: agent-api-server
Version: 2.2.1a2
Summary: A LangGraph agent API server that exposes LangGraph agents over the web and can interact with a chatbot client
Keywords: fastapi,langgraph,agent,api-server
Requires-Python: >=3.11,<3.14
Classifier: Development Status :: 4 - Beta
Classifier: Framework :: FastAPI
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Requires-Dist: aiofiles (>=24.1.0,<25.0.0)
Requires-Dist: cryptography (>=45.0.4,<46.0.0)
Requires-Dist: fastapi (>=0.117.0,<0.121.2)
Requires-Dist: langchain (>=1.2.0,<2.0.0)
Requires-Dist: langchain-core (>=1.2.5,<2.0.0)
Requires-Dist: langgraph (>=1.0.6,<2.0.0)
Requires-Dist: langgraph-checkpoint (>=4.0.0,<5.0.0)
Requires-Dist: langgraph-checkpoint-postgres (>=3.0.3,<4.0.0)
Requires-Dist: llm-sdk (==1.0.1)
Requires-Dist: model-manage-client (>=0.0.1.8)
Requires-Dist: nats-py (>=2.11.0,<3.0.0)
Requires-Dist: openclaw-sdk (>=2.1.0,<3.0.0)
Requires-Dist: psycopg-binary (>=3.2.9,<4.0.0)
Requires-Dist: psycopg-pool (>=3.2.6,<4.0.0)
Requires-Dist: pydantic-settings (>=2.9.1,<3.0.0)
Requires-Dist: redis (>=6.2.0,<7.0.0)
Requires-Dist: starlette (>=0.49.3,<0.50.0)
Requires-Dist: tenacity (>=9.1.2,<10.0.0)
Description-Content-Type: text/markdown

# agent-api-server

`agent-api-server` is an agent runtime that now supports both:

- API Server mode for HTTP/SSE access
- SDK mode for direct in-process agent execution

It keeps compatibility with existing LangGraph-based agents and introduces a framework abstraction so additional runtimes such as OpenClaw can be added behind the same interface.

## Features

- FastAPI application factory for embedding or standalone deployment
- Direct SDK entry via root module `agent_api_server.py`
- Framework-neutral agent loading with LangGraph and OpenClaw adapters
- Backward-compatible LangGraph-oriented API routes under `/api/v1`
- Built-in static client assets bundled in both sdist and wheel artifacts
- Redis-backed thread storage and PostgreSQL checkpoint integration
- Optional integration with model management registration and listener services
- Layered package structure centered on `adapters/`, `core/`, `common/`, `api/`, `integration/`, and `sdk/`

## Requirements

- Python 3.11 to 3.13
- Redis
- PostgreSQL
- Access to all runtime dependencies declared in `pyproject.toml`

## Installation

Install from a package index that provides all required dependencies:

```bash
pip install agent-api-server
```

This project depends on `llm-sdk` and `model-manage-client`. If those packages are hosted on a private index in your environment, configure `pip` or Poetry to use that index before installation.

## Configuration

The runtime reads configuration from environment variables. Common settings include:

```env
REDIS_URL=redis://localhost:6379/0
POSTGRES_URL=postgresql://postgres:postgres@localhost:5432/postgres
MODEL_MANAGER_SERVICE_URL=http://127.0.0.1:10053
CLIENT_TOKEN=
SERVER_PORT=8080
SERVER_WORKER_AMOUNT=1
LOG_LEVEL=INFO
```

See `.env_example` for a more complete example.
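As a rough illustration of how these variables map to typed settings at startup, here is a stdlib-only sketch. The actual project declares `pydantic-settings` as a dependency; the `Settings` class, field names, and defaults below are hypothetical.

```python
import os
from dataclasses import dataclass, field


def _env(name: str, default: str) -> str:
    """Read an environment variable with a fallback default (sketch)."""
    return os.environ.get(name, default)


@dataclass
class Settings:
    # Hypothetical mirror of the variables above; the real project
    # uses pydantic-settings rather than a hand-rolled dataclass.
    redis_url: str = field(default_factory=lambda: _env("REDIS_URL", "redis://localhost:6379/0"))
    postgres_url: str = field(default_factory=lambda: _env("POSTGRES_URL", "postgresql://postgres:postgres@localhost:5432/postgres"))
    server_port: int = field(default_factory=lambda: int(_env("SERVER_PORT", "8080")))
    server_worker_amount: int = field(default_factory=lambda: int(_env("SERVER_WORKER_AMOUNT", "1")))
    log_level: str = field(default_factory=lambda: _env("LOG_LEVEL", "INFO"))
```

Unset variables fall back to their defaults, so a bare `Settings()` is usable for local development.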

## Agent Config

`agents.json` is now the primary agent registry. Existing deployments that still use `langgraph.json` remain compatible via fallback loading. LangGraph string entries still work:

```json
{
  "graphs": {
    "demo-agent": "./agents/demo.py:graph"
  }
}
```

You can also use the extended form to declare the framework explicitly and attach adapter-specific settings:

```json
{
  "graphs": {
    "demo-agent": {
      "framework": "langgraph",
      "entrypoint": "./agents/demo.py:graph"
    },
    "openclaw-agent": {
      "framework": "openclaw",
      "agent_id": "openclaw-agent",
      "create_agent": true,
      "agent_config": {
        "name": "OpenClaw Agent"
      },
      "client": {
        "gateway_ws_url": "ws://127.0.0.1:18789/gateway",
        "api_key": "your_gateway_token",
        "scopes": ["operator.read", "operator.write"],
        "device_identity_path": "~/.openclaw/identity/device.json",
        "timeout": 60
      },
      "input_schema": {
        "type": "object",
        "properties": {
          "query": {
            "type": "string"
          }
        },
        "required": ["query"]
      },
      "context_schema": {
        "type": "object",
        "properties": {
          "CHAT_PROVIDER": {
            "type": "string"
          }
        }
      }
    }
  }
}
```

For OpenClaw agents, API and SDK calls map the local `thread_id` to the OpenClaw SDK `session_name`, so each Chatbot conversation uses an isolated OpenClaw session by default. When `create_agent` is enabled, thread creation calls `list_agents()` and creates the remote OpenClaw agent if it is missing. If `workspace` is omitted, the adapter creates the agent under the OpenClaw client work directory using the OpenClaw workspace naming convention, for example `.openclaw/workspace-openclaw-agent`.
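The mapping described above can be pictured as two small helpers. These names are hypothetical; the real logic lives in the OpenClaw adapter under `adapters/`.

```python
def openclaw_session_name(thread_id: str) -> str:
    # Each local thread maps 1:1 to an OpenClaw session, so every
    # Chatbot conversation gets an isolated session by default (sketch).
    return thread_id


def default_workspace(agent_id: str) -> str:
    # Default workspace path when `workspace` is omitted, following the
    # naming convention described above (sketch).
    return f".openclaw/workspace-{agent_id}"
```

For example, an agent with id `openclaw-agent` and no explicit `workspace` would land in `.openclaw/workspace-openclaw-agent`.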

The Gateway protocol requires both `auth.token` and a signed `device` identity during `connect`. In practice that means `client.api_key` alone is not enough for a normal WS connection: you also need a paired device identity (default `~/.openclaw/identity/device.json`) or a gateway configured for insecure local auth.

When `client.api_key` is configured, this project now auto-generates a local Ed25519 device identity if `client.device_identity_path` does not exist yet. The first connection may still require device approval on the gateway before `hello-ok.auth.deviceToken` is issued.
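Conceptually, generating such a local device identity looks like the following sketch using the `cryptography` package (already a dependency of this project). The on-disk JSON schema shown here is illustrative only; the exact format OpenClaw expects may differ.

```python
import base64
import json
from pathlib import Path

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.serialization import (
    Encoding,
    NoEncryption,
    PrivateFormat,
    PublicFormat,
)


def ensure_device_identity(path: str) -> dict:
    """Create an Ed25519 device identity file if it does not exist.

    Sketch only: the real identity schema used by the OpenClaw gateway
    may contain additional fields.
    """
    identity_path = Path(path).expanduser()
    if identity_path.exists():
        # Reuse the existing identity rather than regenerating keys.
        return json.loads(identity_path.read_text())

    key = Ed25519PrivateKey.generate()
    identity = {
        "private_key": base64.b64encode(
            key.private_bytes(Encoding.Raw, PrivateFormat.Raw, NoEncryption())
        ).decode(),
        "public_key": base64.b64encode(
            key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
        ).decode(),
    }
    identity_path.parent.mkdir(parents=True, exist_ok=True)
    identity_path.write_text(json.dumps(identity))
    return identity
```

On first run this writes a fresh keypair; subsequent runs read the same identity back, which is what lets the gateway recognize the device once it has been approved.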

## Request Payloads

Thread `run` and `stream` endpoints accept the current payload shape:

```json
{
  "query": "hello",
  "attachments": []
}
```

They also accept the legacy Chatbot payload shape:

```json
{
  "inputs": {
    "user_input": "hello"
  },
  "attachments": []
}
```

Internally both forms are normalized to `{"query": "hello"}` plus an attachment list before they reach the adapter. OpenClaw SSE events include `conversation_id` and use `model` / `tools` nodes for `node_message`, `tools_message`, and `token_stream` events.
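The normalization step can be sketched as a single function (hypothetical helper name; the actual normalization happens inside the server before the adapter is invoked):

```python
def normalize_payload(payload: dict) -> dict:
    """Normalize both accepted payload shapes to the canonical form (sketch)."""
    if "query" in payload:
        query = payload["query"]
    else:
        # Legacy Chatbot shape: {"inputs": {"user_input": ...}}
        query = payload.get("inputs", {}).get("user_input", "")
    return {"query": query, "attachments": payload.get("attachments", [])}
```

Both example payloads above reduce to `{"query": "hello", "attachments": []}` under this scheme.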

## Running the server

Run the application with Uvicorn:

```bash
uvicorn service:create_fastapi_app --factory --host 0.0.0.0 --port 8080
```

After startup:

- API root: `http://127.0.0.1:8080/api/v1`
- OpenAPI docs: `http://127.0.0.1:8080/docs`
- Built-in client: `http://127.0.0.1:8080/site`

## SDK Usage

For OpenClaw agents, a package consumer can configure the gateway directly in the SDK constructor instead of shipping an `agents.json` file:

```python
import asyncio

from agent_api_server import AgentSDK


async def main():
    sdk = AgentSDK(
        agent_name="demo-openclaw-agent",
        agent_id="your-openclaw-agent-id",
        gateway_ws_url="ws://127.0.0.1:18789/gateway",
        api_key="your_gateway_token",
        scopes=["operator.read", "operator.write"],
        device_identity_path="~/.openclaw/identity/device.json",
        timeout=60,
    )
    result = await sdk.run(query="hello", thread_id="sdk-run-thread")
    print(result.content)


asyncio.run(main())
```

To print each streamed chunk to the console as it arrives:

```python
import asyncio

from agent_api_server import AgentSDK


async def main():
    sdk = AgentSDK(
        agent_name="demo-openclaw-agent",
        agent_id="your-openclaw-agent-id",
        gateway_ws_url="ws://127.0.0.1:18789/gateway",
        api_key="your_gateway_token",
        scopes=["operator.read", "operator.write"],
        device_identity_path="~/.openclaw/identity/device.json",
        timeout=60,
    )
    async for chunk in sdk.stream(query="hello", thread_id="sdk-stream-thread"):
        print(chunk, end="", flush=True)


asyncio.run(main())
```

The existing config-file style remains supported:

```python
import asyncio

from agent_api_server import AgentSDK


async def main():
    sdk = AgentSDK()
    result = await sdk.run(
        "demo-agent",
        {"query": "hello"},
        thread_id="sdk-thread",
    )
    print(result.content)


asyncio.run(main())
```

For synchronous usage with direct OpenClaw settings:

```python
from agent_api_server import AgentSDK

sdk = AgentSDK(
    agent_name="demo-openclaw-agent",
    agent_id="your-openclaw-agent-id",
    gateway_ws_url="ws://127.0.0.1:18789/gateway",
    api_key="your_gateway_token",
    scopes=["operator.read", "operator.write"],
    device_identity_path="~/.openclaw/identity/device.json",
    timeout=60,
)
result = sdk.run_sync(query="hello", thread_id="sdk-sync-thread")
print(result.content)
```

For synchronous usage with config-file-managed agents:

```python
from agent_api_server import AgentSDK

sdk = AgentSDK()
result = sdk.run_sync("demo-agent", {"query": "hello"}, thread_id="sync-thread")
print(result.content)
```

`attachments` is the public file/URL metadata channel. `ts_tenant`, `ei_token`, `runtime_config`, and `use_system_llm` are optional advanced parameters for deployments that need tenant-specific model credentials or LangGraph runtime configuration.

## Build

Build source and wheel distributions with Poetry:

```bash
poetry build
```

## Layout

The repository is now organized as layered packages:

- `adapters/` for framework-specific agent adapters
- `core/` for agent runtime, loading, and shared execution models
- `common/` for config, logging, crypto, Redis, Postgres, NATS, and formatting helpers
- `api/` for FastAPI route modules
- `integration/` for registration and model-update listener integrations
- `sdk/` for the direct SDK client

