Metadata-Version: 2.4
Name: mhub_proxy
Version: 1.0.8
Summary: OpenAI-compatible local proxy for enterprise LLM endpoints
License-File: LICENSE
Requires-Python: >=3.10
Requires-Dist: fastapi>=0.133.1
Requires-Dist: httpx>=0.28.1
Requires-Dist: uvicorn>=0.41.0
Description-Content-Type: text/markdown

# mhub_proxy

`mhub_proxy` is a local OpenAI- and Anthropic-compatible proxy that lets CLI tools (for example, Opencode) send OpenAI- or Anthropic-style requests to SecureGPT.

## Requirements

`uv` is required to install and run the proxy. Install it with the standalone installer:

```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

Or, using `pip`:

```bash
pip install uv
```

## Install `mhub_proxy`

Install as a `uv` tool:

```bash
uv tool install mhub_proxy
```
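
You can confirm the install by listing the installed `uv` tools:

```bash
# Should list mhub_proxy among the installed tools
uv tool list
```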

## Required environment variables

Before starting the proxy, define the variables used by `mhub_proxy/config.py`:

- `MHUB_API_URL` : URL of the Model Hub API
- `MHUB_CLIENT_ID` : Client ID for the Model Hub API
- `MHUB_CLIENT_SECRET` : Client Secret for the Model Hub API
- `ONEACCOUNT_LOGIN_URL` : URL of the OneAccount login API

Optional:

- `MHUB_PROXY_HOST` (default: `127.0.0.1`)
- `MHUB_PROXY_PORT` (default: `8123`)
- `MHUB_PROXY_LOG_LEVEL` (default: `info`)
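
For example, in a POSIX shell (the values below are placeholders; substitute the credentials for your environment):

```bash
# Required settings; placeholder values, replace with your own
export MHUB_API_URL="https://modelhub.example.com"
export MHUB_CLIENT_ID="your-client-id"
export MHUB_CLIENT_SECRET="your-client-secret"
export ONEACCOUNT_LOGIN_URL="https://oneaccount.example.com/login"

# Optional overrides (defaults shown)
export MHUB_PROXY_HOST="127.0.0.1"
export MHUB_PROXY_PORT="8123"
export MHUB_PROXY_LOG_LEVEL="info"
```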

## Run the proxy

Start the proxy with:

```bash
uvx mhub_proxy
```

By default, the proxy runs on `http://127.0.0.1:8123`.
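
To verify it is serving requests, you can send a minimal chat-completions request. This is a sketch only: the `/openai/chat/completions` path is an assumption based on the `baseURL` used in the bundled `opencode.json`, the model ID reuses the example from the next section, and the API key can be any value because the proxy does not validate it.

```bash
# Minimal smoke test; adjust the model ID to one available in your Model Hub
curl -s http://127.0.0.1:8123/openai/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer dummy-key" \
  -d '{
        "model": "gpt-5.1-2025-11-13@2025-01-01-preview",
        "messages": [{"role": "user", "content": "ping"}]
      }'
```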

## Use with Opencode

1. Optionally, set up the Opencode configuration:

    - Install the bundled Opencode configuration in your project by running:
    
    ```bash
    uvx mhub_proxy --install-config
    ```

    - Or, add an `opencode.json` file to the root of your project and configure it to point to the proxy URL (see the [Opencode Documentation](https://opencode.ai/docs/config/), [this example](./opencode.json), or the sketch after these steps).

    **IMPORTANT: For OpenAI models, the API version is included in the model ID, e.g. `gpt-5.1-2025-11-13@2025-01-01-preview`.**
    This is required for the proxy to correctly route requests to the Model Hub API.

2. Start Opencode from a terminal **inside your project directory**.

Note: The included `opencode.json` points Opencode at the local proxy URLs. The OpenAI-compatible provider uses:

```json
"baseURL": "http://127.0.0.1:8123/openai"
```

and the Anthropic-compatible provider uses:

```json
"baseURL": "http://127.0.0.1:8123/claude"
```


3. Use the `/connect` command in Opencode and choose `SecureGPT OpenAI` or `SecureGPT Anthropic` as your provider. You can use any value for the API key, as the proxy does not validate it.

4. Choose a model from the Model Hub and start using Opencode as normal!
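
For reference, a manually created config might look like the sketch below. This is an assumption based on the [Opencode Documentation](https://opencode.ai/docs/config/) rather than the bundled file: the provider key, display names, and model entry are placeholders, and the bundled `opencode.json` installed by `--install-config` is authoritative.

```bash
# Hypothetical minimal opencode.json pointing the OpenAI-compatible provider
# at the local proxy; field layout follows the Opencode config docs and the
# provider key, display names, and model entry are placeholders.
cat > opencode.json <<'EOF'
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "securegpt-openai": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "SecureGPT OpenAI",
      "options": {
        "baseURL": "http://127.0.0.1:8123/openai"
      },
      "models": {
        "gpt-5.1-2025-11-13@2025-01-01-preview": {
          "name": "GPT 5.1 (2025-01-01-preview)"
        }
      }
    }
  }
}
EOF
```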

## Automatic startup (Windows)

To automatically start the proxy when your PC boots up, you can create a scheduled task:

1. Open Task Scheduler and create a new task.

![ts1](./assets/ts_1.png)

2. In the "General" tab, give the task a name (e.g., "Start mhub_proxy") and select "Run whether user is logged on or not".

![ts2](./assets/ts_2.png)

3. In the "Triggers" tab, create a new trigger that begins "At startup".

![ts3](./assets/ts_3.png)

4. In the "Actions" tab, create a new action that starts a program with the following settings:

    - Program/script: `powershell.exe`
    - Add arguments: 

```
-NoProfile -ExecutionPolicy Bypass -WindowStyle Hidden -Command "Start-Process -WindowStyle Hidden -FilePath 'powershell.exe' -ArgumentList '-NoProfile -Command uvx mhub_proxy'"
```

![ts4](./assets/ts_4.png)

5. Save the task and run it to verify that the proxy starts correctly.

## Notes

- The proxy expects OpenAI chat-completions style traffic from your CLI tool.
- If startup fails, first verify that all required environment variables are set.
- The installed Opencode plugin checks proxy health at session start and will launch `uvx mhub_proxy` in the background when needed.
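
To check whether the proxy is already listening on the default address, you can probe it from a shell. This only relies on the default host and port; any HTTP status code means the server is up, regardless of which route you hit.

```bash
# Prints an HTTP status code when the proxy answers on the default host/port,
# and 000 when nothing is listening there
curl -s -o /dev/null -w "%{http_code}\n" http://127.0.0.1:8123/
```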