Metadata-Version: 2.4
Name: greybeam-mcp
Version: 0.1.0
Summary: MCP server that routes SQL queries through the Greybeam proxy while delegating Cortex Search to upstream Snowflake MCP.
Project-URL: Homepage, https://github.com/greybeam/mcp
Project-URL: Repository, https://github.com/greybeam/mcp
Project-URL: Issues, https://github.com/greybeam/mcp/issues
Author: Greybeam
Keywords: greybeam,mcp,model-context-protocol,snowflake
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Database
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.11
Requires-Dist: httpx>=0.27.0
Requires-Dist: mcp>=1.0.0
Requires-Dist: pydantic>=2.6.0
Requires-Dist: pyyaml>=6.0
Requires-Dist: snowflake-connector-python>=3.7.0
Requires-Dist: snowflake-labs-mcp==1.4.1
Provides-Extra: dev
Requires-Dist: mypy>=1.10; extra == 'dev'
Requires-Dist: pytest-asyncio>=0.23; extra == 'dev'
Requires-Dist: pytest-cov>=4.1; extra == 'dev'
Requires-Dist: pytest>=8.0; extra == 'dev'
Requires-Dist: respx>=0.21.0; extra == 'dev'
Requires-Dist: ruff>=0.4; extra == 'dev'
Requires-Dist: types-pyyaml>=6.0; extra == 'dev'
Description-Content-Type: text/markdown

# Greybeam MCP

A single MCP server that lets agents query data through the Greybeam routing layer with Snowflake-compatible tooling.

- `run_snowflake_query` — executes SQL via the Greybeam proxy (Snowflake protocol).
- `cortex_analyst` — calls Snowflake Cortex Analyst; any returned SQL is executed via Greybeam.
- `cortex_search` — delegated to the pinned upstream Snowflake MCP (no SQL, REST only).

Out of scope for v1: Cortex Agent, semantic views, and the upstream `object_manager` / `query_manager` / `semantic_manager` tool families. These are disabled in the child config so a misconfigured deployment can't accidentally expose them.
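For orientation, a `run_snowflake_query` invocation travels over MCP as a standard JSON-RPC 2.0 `tools/call` request. The sketch below is illustrative only: the `statement` argument name is an assumption, so check the tool's advertised schema via `tools/list` for the real parameter names.

```python
import json

# Hypothetical tools/call request for run_snowflake_query.
# MCP uses JSON-RPC 2.0; the "statement" argument name is an assumption.
# Inspect the schema returned by tools/list for the actual parameter name.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "run_snowflake_query",
        "arguments": {"statement": "select 1"},
    },
}
payload = json.dumps(request)
print(payload)
```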

## Quick start

Install and run the setup wizard with [`uv`](https://docs.astral.sh/uv/):

    uvx greybeam-mcp init

The `init` wizard prompts for account, user, proxy host, and auth method;
writes a config file at `~/.config/greybeam-mcp.yaml` (mode 0600); and prints
the exact registration command for Claude Code and Claude Desktop.

For a packaged install, the printed command uses this form:

    claude mcp add greybeam -- uvx greybeam-mcp --config ~/.config/greybeam-mcp.yaml

## Manual install and run

If you'd rather author the YAML by hand, copy `examples/greybeam.yaml` to a
location of your choice, edit it, `chmod 600` it, then:

    uvx greybeam-mcp --config /absolute/path/to/greybeam.yaml

From a source checkout, use `uv run` instead:

    uv run greybeam-mcp --config /absolute/path/to/greybeam.yaml

## Configuration

The YAML is the single source of truth — it holds account, proxy host, **and**
auth credentials. Pick one auth method (in order of recommendation):

- `private_key_file` (path to a PEM key, plus optional `private_key_passphrase`) — **recommended**, since Snowflake is deprecating password auth
- `private_key` (inline PEM contents) — for environments without a writable disk
- `authenticator: externalbrowser` for SSO (requires a SAML2 integration on the account)
- `password` — deprecated by Snowflake, avoid for new setups

`chmod 600` the file since it contains credentials.
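A minimal keypair-auth config might look like the following. The field names are assumptions inferred from the auth options above; copy `examples/greybeam.yaml` for the authoritative schema.

```yaml
# Hypothetical minimal config. Confirm field names against examples/greybeam.yaml.
account: myorg-myaccount
user: AGENT_SVC
proxy_host: greybeam.example.com
private_key_file: /home/agent/.keys/snowflake_rsa.p8
# private_key_passphrase: "..."   # only needed if the key is encrypted
```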

### Environment variable fallback

Every field above can also come from an environment variable
(`SNOWFLAKE_USER`, `SNOWFLAKE_PRIVATE_KEY_FILE`, `SNOWFLAKE_PRIVATE_KEY_PASSPHRASE`,
`SNOWFLAKE_PRIVATE_KEY`, `SNOWFLAKE_AUTHENTICATOR`, `SNOWFLAKE_PASSWORD`). YAML
takes precedence; environment variables fill in fields the YAML leaves unset. This
is useful in container/Kubernetes deployments where secrets are injected as
environment variables.

### Cortex Analyst auth (v1 limitation)

The Cortex Analyst REST endpoint expects `Authorization: Bearer <oauth_or_jwt>`. The v1 client supports Bearer (`token`) directly; the password / Basic-auth branch is test scaffolding and will return 401 against real Snowflake. If you need Cortex Analyst in production today, configure an OAuth access token via the `token` field in code; broader keypair-JWT support is tracked for v1.1.
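For reference, a Bearer-auth Analyst request has roughly this shape. The endpoint path follows Snowflake's documented Cortex Analyst REST API and the message payload is simplified; the sketch only builds the request (it never sends it) and uses the standard library so the shape is easy to inspect, whereas the actual client uses httpx.

```python
import json
from urllib.request import Request

def build_analyst_request(account_url: str, token: str, question: str) -> Request:
    """Build (but do not send) a Cortex Analyst message request with Bearer auth."""
    body = json.dumps({
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": question}]}
        ]
    }).encode()
    return Request(
        f"{account_url}/api/v2/cortex/analyst/message",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_analyst_request(
    "https://myacct.snowflakecomputing.com", "<oauth-token>", "total sales?"
)
print(req.get_full_url())
```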

## Client integration

`uvx greybeam-mcp init` prints registration snippets pre-filled with your config
path. The general shapes:

**Claude Code (CLI):**

    claude mcp add greybeam -- uvx greybeam-mcp --config /absolute/path/to/greybeam.yaml

Start a new `claude` session to pick it up. Verify with `claude mcp list`.

**Claude Desktop:** paste the JSON printed by `greybeam-mcp init` into
`~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) or the
equivalent on your platform, then restart the app. The installed-package shape
is:

```json
{
  "mcpServers": {
    "greybeam": {
      "command": "uvx",
      "args": ["greybeam-mcp", "--config", "/absolute/path/to/greybeam.yaml"]
    }
  }
}
```

After a successful install, ask the agent something like *"run select 1 on
snowflake"* — it should pick the `run_snowflake_query` tool automatically.

## Statement-type policy

Greybeam MCP does **not** enforce statement-type restrictions at the MCP layer. CREATE / DROP / ALTER and other potentially destructive statements are subject to:

1. Greybeam backend routing and policy.
2. Your Snowflake role's grants on the target objects.

If you need hard MCP-layer restrictions, scope the Snowflake role tightly or contact Greybeam about backend policy options. This is an intentional v1 divergence from upstream Snowflake MCP, which blocks DDL by default — see the design doc for rationale.
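If you want a belt-and-suspenders guard in your own agent code before a query ever reaches the server, a coarse statement-type check is easy to bolt on. The example below is hypothetical (nothing like it ships with the package) and a keyword check is deliberately crude: it catches obvious DDL, not adversarial SQL.

```python
import re

# Hypothetical client-side guard; not part of greybeam-mcp.
BLOCKED = {"create", "drop", "alter", "truncate", "grant", "revoke"}

def first_keyword(sql: str) -> str:
    """Return the first SQL keyword, skipping leading whitespace and comments."""
    stripped = re.sub(r"^\s*(--[^\n]*\n|/\*.*?\*/\s*)*", "", sql, flags=re.S)
    match = re.match(r"\s*(\w+)", stripped)
    return match.group(1).lower() if match else ""

def assert_allowed(sql: str) -> None:
    keyword = first_keyword(sql)
    if keyword in BLOCKED:
        raise PermissionError(f"statement type {keyword!r} blocked client-side")

assert_allowed("select 1")          # passes silently
# assert_allowed("drop table t")    # would raise PermissionError
```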

## Cancellation

v1 bounds in-flight calls via two driver-level mechanisms inside `run_snowflake_query`:

1. `cursor.execute(timeout=greybeam.query_timeout)` — Snowflake's own query timeout.
2. Explicit `cursor.close()` on row-cap / byte-cap exceedance (acquires the
   driver's `_lock_canceling` and aborts the in-flight query).

Client-driven cancellation (`notifications/cancelled`) is **not** wired in v1. The `CancelToken` primitive, dispatcher in-flight table, and delegated-cancel forwarding are scaffolding retained and unit-tested so v1.1 can light them up by adding a `notifications/cancelled` handler.
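The row-cap mechanism reduces to a simple pattern: stream batches, count as you go, and close the cursor the moment the cap is exceeded. A self-contained sketch with a stub cursor (the real code path drives the Snowflake driver's cursor, whose `close()` aborts the in-flight query):

```python
class StubCursor:
    """Stand-in for a driver cursor: yields rows and records close()."""
    def __init__(self, rows):
        self._rows = iter(rows)
        self.closed = False

    def fetchmany(self, size):
        batch = []
        for _ in range(size):
            try:
                batch.append(next(self._rows))
            except StopIteration:
                break
        return batch

    def close(self):
        # In the real driver, closing aborts the in-flight query.
        self.closed = True

def fetch_capped(cursor, row_cap, batch_size=2):
    """Fetch up to row_cap rows; close the cursor early when the cap is hit."""
    rows = []
    while True:
        batch = cursor.fetchmany(batch_size)
        if not batch:
            break
        rows.extend(batch)
        if len(rows) >= row_cap:
            cursor.close()  # abort rather than drain the remaining result set
            return rows[:row_cap], True
    return rows, False

cur = StubCursor(range(10))
rows, capped = fetch_capped(cur, row_cap=5)
print(len(rows), capped, cur.closed)  # 5 True True
```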

## Development

    uv sync --extra dev
    uv run pytest

To set up and run the server from a local clone instead of the PyPI package:

    uv run greybeam-mcp init

The wizard detects the source checkout and prints registration commands using
`uv --directory /absolute/path/to/greybeam-mcp run greybeam-mcp ...`, so local
development can point clients at unpublished changes.

The default suite runs unit + always-on contract tests (no network, no DB). Two test layers are gated behind environment variables:

Contract tests against the real upstream child (these require real Snowflake credentials; placeholders are not enough, since the upstream child may validate credentials at startup):

    GREYBEAM_RUN_CHILD_CONTRACT=1 \
      SNOWFLAKE_ACCOUNT=... SNOWFLAKE_USER=... SNOWFLAKE_PASSWORD=... \
      uv run pytest tests/contract/

Integration tests against a real Greybeam dev endpoint:

    GREYBEAM_RUN_INTEGRATION=1 \
      SNOWFLAKE_ACCOUNT=... SNOWFLAKE_USER=... SNOWFLAKE_PASSWORD=... \
      GREYBEAM_PROXY_HOST=greybeam-dev.example.com \
      uv run pytest tests/integration/

The upstream Snowflake MCP package is pinned at `snowflake-labs-mcp==1.4.1` (import name `mcp_server_snowflake`). After bumping that pin, re-run the child contract snapshot test and re-approve `tests/contract/fixtures/child_tools_list.json` if the tool surface has drifted.

Design doc: `docs/superpowers/specs/2026-04-24-greybeam-mcp-design.md`.

## Release

Build and verify locally before publishing:

    uv build --no-sources
    uv run pytest
    uv run ruff check src/ tests/

Preferred publish path:

1. Create a GitHub environment named `pypi`.
2. Configure a pending PyPI Trusted Publisher for project `greybeam-mcp`, this
   repository, the `pypi` environment, and `.github/workflows/publish.yml`.
3. Create a GitHub release for a tag whose version matches `pyproject.toml`.
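A workflow matching that setup might look like the following sketch. The job and step names are illustrative; the essentials for Trusted Publishing are the `pypi` environment, `id-token: write` permissions, and the official PyPA publish action.

```yaml
# .github/workflows/publish.yml (hypothetical sketch for Trusted Publishing)
name: publish
on:
  release:
    types: [published]
jobs:
  pypi:
    runs-on: ubuntu-latest
    environment: pypi          # must match the GitHub environment name
    permissions:
      id-token: write          # OIDC token for PyPI Trusted Publishing
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v5
      - run: uv build --no-sources
      - uses: pypa/gh-action-pypi-publish@release/v1
```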

Manual fallback, if Trusted Publishing is not configured yet:

    uv publish --token pypi-...
