Metadata-Version: 2.4
Name: rusticai-api
Version: 1.2.3
Summary: API Server for Rustic AI
License-Expression: Apache-2.0
Author: Dragonscale Industries Inc.
Author-email: dev@dragonscale.ai
Requires-Python: >=3.13,<3.14
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.13
Provides-Extra: test
Requires-Dist: fastapi (>=0.136.1,<0.137.0)
Requires-Dist: griffe (>=1.14.0,<2.0.0)
Requires-Dist: opentelemetry-distro (>=0.61b0,<0.62)
Requires-Dist: psycopg (>=3.2.6,<4.0.0)
Requires-Dist: ptvsd (>=4.3.2,<5.0.0)
Requires-Dist: python-json-logger (>=3.3.0,<4.0.0)
Requires-Dist: python-multipart (>=0.0.20,<0.0.21)
Requires-Dist: rusticai-core (>=1.2.3,<1.3.0)
Requires-Dist: rusticai-ray (>=1.2.2,<1.3.0)
Requires-Dist: starlette (>=0.52.1,<0.53.0)
Requires-Dist: uvicorn (>=0.42.0,<0.43.0)
Requires-Dist: websockets (>=16.0,<17.0)
Project-URL: Homepage, https://www.rustic.ai/
Project-URL: Repository, https://github.com/rustic-ai/rustic-ai
Project-URL: Rustic AI Core, https://pypi.org/project/rusticai-core/
Description-Content-Type: text/markdown

# Rustic AI API
This module provides the backend server for the Rustic AI framework, exposing the interface for creating and interacting with guilds.
Interaction with a guild is supported through a WebSocket interface, allowing for real-time communication and updates.
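As an illustrative sketch, a client might talk to such a WebSocket interface like this. Note that the endpoint path (`/ws/guilds/<id>`) and the message shape are assumptions made up for illustration, not the actual API; consult the server's generated documentation at `/docs` for the real routes.

```python
# Illustrative WebSocket client sketch -- NOT the actual Rustic AI API.
# The endpoint path and message shape below are assumptions; check the
# server's /docs for the real routes and payloads.
import asyncio
import json


def guild_socket_url(guild_id: str, host: str = "localhost", port: int = 8880) -> str:
    """Build a WebSocket URL for a guild (the path is an assumed placeholder)."""
    return f"ws://{host}:{port}/ws/guilds/{guild_id}"


async def listen(guild_id: str) -> None:
    # websockets is a declared dependency of rusticai-api
    import websockets

    async with websockets.connect(guild_socket_url(guild_id)) as ws:
        await ws.send(json.dumps({"type": "ping"}))  # assumed message shape
        async for raw in ws:
            print(json.loads(raw))


if __name__ == "__main__":
    asyncio.run(listen("example-guild"))
```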

## Installing

```shell
pip install rusticai-api
```
**Note:** This package depends on [rusticai-core](https://pypi.org/project/rusticai-core/) and [rusticai-ray](https://pypi.org/project/rusticai-ray/).

## Running from source

1. Install required dependencies:
```shell
poetry install --with dev
poetry shell
```

2. Start the server:
```shell
# If using an external SQL database, set RUSTIC_METASTORE to the corresponding URL.
# For example, for PostgreSQL: export RUSTIC_METASTORE=postgresql+psycopg://user:pwd@localhost:5432
./scripts/dev_server.sh
```
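The metastore URL in the PostgreSQL example above follows the standard SQLAlchemy-style `dialect+driver://user:password@host:port` format. A small helper to assemble one, purely for illustration (the credentials and host are placeholders):

```python
# Illustrative helper for assembling a RUSTIC_METASTORE URL.
# The dialect+driver matches the PostgreSQL example in the README;
# user, password, and host are placeholders.
def metastore_url(
    user: str,
    password: str,
    host: str,
    port: int = 5432,
    driver: str = "postgresql+psycopg",
) -> str:
    return f"{driver}://{user}:{password}@{host}:{port}"


print(metastore_url("user", "pwd", "localhost"))
# postgresql+psycopg://user:pwd@localhost:5432
```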

The server will be available at `http://localhost:8880` by default, and the API documentation at `http://localhost:8880/docs`.

## Running from source with Telemetry

1. Install required dependencies:
```shell
poetry install --with dev
poetry shell
```

2. Start the Zipkin server (requires [Docker](https://www.docker.com/get-started/)):
```shell
sudo chmod 777 scripts/zipkin/data-tmp/
./scripts/zipkin/zipkin_up.sh
```

3. Set the OpenTelemetry environment variables:
```shell
export OTEL_SERVICE_NAME=GuildCommunicationService
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="http://localhost:4318/v1/traces"
export OTEL_EXPORTER_OTLP_PROTOCOL="http/protobuf"
```
Refer to the [OTLP exporter docs](https://opentelemetry.io/docs/languages/sdk-configuration/otlp-exporter/#endpoint-configuration)
for details.
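If you only want to verify that instrumentation is working without running Zipkin, the standard OpenTelemetry SDK environment variables also support a console exporter that prints spans to stdout. This is generic OTel SDK configuration, not something specific to Rustic AI:

```shell
# Print spans to stdout instead of exporting over OTLP (standard OTel setting)
export OTEL_TRACES_EXPORTER=console
```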

4. Start the server:
```shell
./scripts/dev_server_with_otel.sh
```
Traces will be visible in the Zipkin UI at http://localhost:9411/zipkin/.

Note: To stop the Zipkin server, use `./scripts/zipkin/zipkin_down.sh`

**To run with all the available `rusticai` packages, use the poetry environment from the root directory, and prefix commands with `api/` — for example, use `./api/scripts/dev_server_with_otel.sh` instead of `./scripts/dev_server_with_otel.sh`.**

