Metadata-Version: 2.4
Name: langgraph-gaussdb-fastapi
Version: 0.3.0
Summary: A FastAPI LangGraph-compatible server backed by GaussDB.
Classifier: Development Status :: 3 - Alpha
Classifier: Framework :: FastAPI
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Requires-Python: >=3.10
Description-Content-Type: text/markdown
Requires-Dist: async-gaussdb>=0.1.0
Requires-Dist: croniter>=2.0.0
Requires-Dist: fastapi>=0.110.0
Requires-Dist: httpx>=0.27.0
Requires-Dist: langchain-core>=0.3.0
Requires-Dist: langchain-openai>=0.3.0
Requires-Dist: langgraph>=1.0.0
Requires-Dist: langgraph-checkpoint>=2.0.0
Requires-Dist: langgraph-sdk>=0.3.0
Requires-Dist: pydantic-settings>=2.0.0
Requires-Dist: uvicorn>=0.30.0
Provides-Extra: chainlit
Requires-Dist: chainlit>=2.0.0; extra == "chainlit"
Provides-Extra: test
Requires-Dist: pytest>=8.0.0; extra == "test"
Requires-Dist: pytest-asyncio>=0.23.0; extra == "test"

# LangGraph GaussDB FastAPI

A FastAPI server that implements key LangGraph Platform-compatible endpoints with async GaussDB persistence.

## Install

```bash
python -m pip install langgraph-gaussdb-fastapi
```

To also install the packaged Chainlit prototype:

```bash
python -m pip install "langgraph-gaussdb-fastapi[chainlit]"
```

## Run

Set the database connection in a mode settings file or with environment
variables, then start the server:

```bash
LG_MODE=dev \
LG_SETTINGS_FILE=settings/secrets.env \
langgraph-gaussdb-fastapi
```

Configuration is loaded from `settings/base.env`, then
`settings/{LG_MODE}.env`; `LG_MODE` defaults to `dev`. If `LG_SETTINGS_FILE`
is set, that file is loaded after the mode file. Real process environment
variables always win, so existing deployment env vars continue to work. When
no working-directory `settings/` directory exists, packaged defaults are used.
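The layering above can be sketched in plain Python. This is an illustration of the documented precedence only, not the package's actual loader; file contents are shown inline as dicts:

```python
def resolve_settings(base, mode_file, override_file, process_env):
    """Later layers override earlier ones; real process env always wins."""
    merged = {}
    for layer in (base, mode_file, override_file):
        merged.update(layer)
    merged.update(process_env)  # existing deployment env vars take precedence
    return merged

# base.env -> dev.env -> LG_SETTINGS_FILE -> process environment
settings = resolve_settings(
    base={"LG_CHAINLIT_PATH": "/chainlit", "DB_HOST": "localhost"},
    mode_file={"DB_HOST": "dev-db"},          # settings/dev.env
    override_file={"DB_PASSWORD": "s3cret"},  # LG_SETTINGS_FILE
    process_env={"DB_HOST": "10.0.0.5"},      # real env var wins
)
```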

The default graph config now comes from the packaged `server_config.json`. You can still override it with `LG_GRAPH_CONFIG`.

```json
{
  "dependencies": [],
  "graphs": {
    "echo": "graph.example_graph:graph",
    "qwen_tools": "graph.qwen_tools_graph:graph",
    "orchestrator_worker": "graph.workflow_graphs:orchestrator_worker_graph",
    "router_specialist": "graph.workflow_graphs:router_specialist_graph",
    "evaluator_optimizer": "graph.workflow_graphs:evaluator_optimizer_graph"
  }
}
```

Additional packaged workflow graphs:

- `orchestrator_worker`: planner -> worker fan-out -> synthesis
- `router_specialist`: request classification into research, implementation, review, or general specialist
- `evaluator_optimizer`: draft -> critique -> revise loop with a bounded retry budget
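The `evaluator_optimizer` shape (draft -> critique -> revise with a bounded retry budget) can be sketched in plain Python. The function and callback names here are illustrative, not the packaged graph's actual nodes:

```python
def evaluator_optimizer(task, draft, critique, revise, max_revisions=3):
    """Draft once, then loop critique -> revise until accepted or budget spent."""
    result = draft(task)
    for _ in range(max_revisions):
        feedback = critique(result)
        if feedback is None:  # evaluator accepted the current draft
            return result
        result = revise(result, feedback)
    return result  # retry budget exhausted: return the best effort

# Toy run: the evaluator accepts only drafts longer than 10 characters.
final = evaluator_optimizer(
    task="summarize",
    draft=lambda t: t,
    critique=lambda r: None if len(r) > 10 else "too short",
    revise=lambda r, fb: r + "!",
)
```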

Health checks are available at `/` and `/ok`.

## Chainlit

The distribution includes a packaged Chainlit UI that can be mounted onto
the same FastAPI app as a sub-application. This is the recommended way to
run the UI: a single process, no extra ports, no cross-origin hops.

Install the optional extra:

```bash
python -m pip install "langgraph-gaussdb-fastapi[chainlit]"
```

Start the server with `LG_ENABLE_CHAINLIT=1` in the selected settings file or
in the process environment:

```bash
LG_ENABLE_CHAINLIT=1 \
CHAINLIT_AUTH_SECRET=change-me \
langgraph-gaussdb-fastapi
```

The Chainlit UI is served at `/chainlit` (override with `LG_CHAINLIT_PATH`).
In mounted mode the Chainlit backend talks to the LangGraph API in-process
via `httpx.ASGITransport`, so `LANGGRAPH_SERVER_URL` does **not** need to
be set. Chainlit's FastAPI integration docs recommend header auth for
this setup: https://docs.chainlit.io/integrations/fastapi

If `LG_ENABLE_CHAINLIT` is unset or the `chainlit` extra is not installed,
the helper is a silent no-op and the API server continues normally.
If `LG_ENABLE_CHAINLIT=1` but `CHAINLIT_AUTH_SECRET` is missing, the API
server still starts but the UI is not mounted.
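The guard behavior described above can be summarized as a small decision function. The helper name is hypothetical and only mirrors the documented outcomes; it is not the package's API:

```python
def chainlit_mount_decision(env, chainlit_installed):
    """Mirror the documented guard for the mounted Chainlit UI."""
    if env.get("LG_ENABLE_CHAINLIT") != "1" or not chainlit_installed:
        return "skip"      # silent no-op; API server continues normally
    if not env.get("CHAINLIT_AUTH_SECRET"):
        return "api_only"  # API starts, but the UI is not mounted
    return "mount"         # UI mounted at LG_CHAINLIT_PATH
```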

### Settings files and environment variables

The repository includes these mode files:

| File | Purpose |
|---|---|
| `settings/base.env` | Shared defaults |
| `settings/dev.env` | Development overrides |
| `settings/test.env` | Test overrides |
| `settings/prod.env` | Production overrides |

Use `LG_SETTINGS_DIR=/path/to/settings` to read mode files from another
directory. Use `LG_SETTINGS_FILE=/path/to/secrets.env` for a local override
file; `settings/*.local.env` and `settings/secrets.env` are ignored by git.

Settings files support `KEY=value` lines, an optional `export` prefix,
shell-style quoting, blank lines, and comments.
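A minimal parser for this syntax can be sketched with the standard library's `shlex`; this is illustrative only, not the package's loader:

```python
import shlex

def parse_env_file(text):
    """Parse KEY=value lines with optional `export`, quotes, blanks, comments."""
    values = {}
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        if line.startswith("export "):
            line = line[len("export "):]
        key, _, value = line.partition("=")
        # shlex handles shell-style quoting, e.g. VALUE="a b"
        parts = shlex.split(value)
        values[key.strip()] = parts[0] if parts else ""
    return values

sample = """
# database
export DB_HOST=localhost
DB_NAME="lang graph"
EMPTY=
"""
config = parse_env_file(sample)
```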

| Variable | Default | Purpose |
|---|---|---|
| `LG_MODE` | `dev` | Selects `settings/{mode}.env` |
| `LG_SETTINGS_DIR` | `settings` | Directory containing `base.env` and mode files |
| `LG_SETTINGS_FILE` | _(unset)_ | Optional local override file loaded after the mode file |
| `LG_ENABLE_CHAINLIT` | `false` | Opt-in switch for the mounted UI |
| `LG_CHAINLIT_PATH` | `/chainlit` | Mount path under the FastAPI app |
| `CHAINLIT_AUTH_SECRET` | required when mounted | Signing key for Chainlit sessions |
| `LANGGRAPH_ASSISTANT_ID` | `echo` | Default assistant/graph selected in the UI |
| `LANGGRAPH_SERVER_URL` | _(unset)_ | Only used by the legacy standalone launcher or to point at a remote server |

### Legacy standalone launcher (deprecated)

The previous `langgraph-gaussdb-chainlit` launcher still ships for
backward compatibility but is deprecated in favor of the mounted flow:

```bash
LANGGRAPH_SERVER_URL=http://127.0.0.1:2026 \
langgraph-gaussdb-chainlit --headless --host 0.0.0.0 --port 8010
```

### Integration tests

Run the opt-in integration suite against a live GaussDB-backed environment.
The fixtures now boot a single FastAPI process with Chainlit mounted:

```bash
RUN_LANGGRAPH_CHAINLIT_INTEGRATION=1 pytest -m integration
```
