Metadata-Version: 2.4
Name: figmin-bridge-gpu
Version: 0.2.1
Summary: Figmin XR bridge daemon with CUDA acceleration (Whisper + Kokoro on GPU).
Project-URL: Homepage, http://www.figminxr.com
Author: Figmin XR
License-Expression: LicenseRef-Proprietary
Keywords: ai,claude,codex,figmin,mcp,spatial-computing,xr
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: Other/Proprietary License
Classifier: Programming Language :: Python :: 3
Classifier: Topic :: Games/Entertainment
Requires-Python: >=3.11
Requires-Dist: espeakng-loader>=0.2.4
Requires-Dist: faster-whisper
Requires-Dist: mcp[cli]
Requires-Dist: numpy
Requires-Dist: nvidia-cublas-cu12==12.9.1.4
Requires-Dist: nvidia-cudnn-cu12==9.20.0.48
Requires-Dist: onnxruntime-gpu[cuda,cudnn]==1.24.3
Requires-Dist: onnxruntime==1.24.3
Requires-Dist: phonemizer-fork>=3.3.2
Requires-Dist: soundfile
Requires-Dist: websockets>=12.0
Description-Content-Type: text/markdown

# figmin-bridge

Figmin XR bridge daemon. It owns the WebSocket ports: 39571/39572 for Figmin XR
clients and 39573 (loopback only) for MCP adapters. The Whisper (STT) and
Kokoro (TTS) models are loaded once at startup, so adapters can come and go
without losing Figmin state.
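As a rough sketch of the adapter side of this split, the snippet below opens a
connection to the daemon's loopback MCP port (39573, from above) using the
`websockets` library that the package already depends on. The JSON "ping"
message shape is purely hypothetical; the real adapter protocol is defined by
the daemon, not shown here.

```python
# Hypothetical adapter-side probe of the bridge daemon's loopback port.
# Port 39573 comes from the README; the message format is an assumption.
import asyncio
import json

ADAPTER_URL = "ws://127.0.0.1:39573"


async def probe(url: str = ADAPTER_URL) -> dict:
    """Connect, send a hypothetical ping, and return the decoded reply."""
    # Imported lazily so the sketch can be read (and the module imported)
    # without the websockets dependency installed.
    import websockets

    async with websockets.connect(url) as ws:
        await ws.send(json.dumps({"type": "ping"}))
        return json.loads(await ws.recv())


if __name__ == "__main__":
    # Requires a running figmin-bridge-gpu daemon on this machine.
    print(asyncio.run(probe()))
```

Because the daemon holds the models and the Figmin connection, an adapter like
this can crash or restart freely; reconnecting to 39573 is cheap.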

See the design doc and `C:\workspace\claude\mcp_refactor\old_mcp\` for
reference. This is a fresh package: the old combined `server.py` is being
split into this daemon plus the thin `figmin-mcp` adapter.

## Dev

```
uv run --project C:\workspace\claude\mcp_refactor\figmin-bridge figmin-bridge-gpu
```

## Users (GPU, recommended)

```
uvx figmin-bridge-gpu
```
