Metadata-Version: 2.4
Name: iflow-mcp_kyryl-opens-ml_mcp-server-dagster
Version: 0.1.2
Summary: An MCP server that enables AI agents to interact with Dagster instances
License-File: LICENSE
Requires-Python: >=3.12
Requires-Dist: dagster-webserver>=1.10.9
Requires-Dist: dagster>=1.10.9
Requires-Dist: mcp[cli]>=1.6.0
Requires-Dist: openai-agents>=0.0.9
Requires-Dist: pandas>=2.2.3
Requires-Dist: pytest-asyncio>=0.26.0
Requires-Dist: pytest>=8.3.5
Requires-Dist: ruff>=0.11.4
Requires-Dist: scikit-learn>=1.6.1
Description-Content-Type: text/markdown

# mcp-dagster: A Dagster MCP Server

> The [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is an open protocol that enables seamless integration between LLM applications and external data sources and tools. This repository provides an MCP server for interacting with [Dagster](https://dagster.io/), the data orchestration platform.

## Overview

A Model Context Protocol server that enables AI agents to interact with Dagster instances, explore data pipelines, monitor runs, and manage assets. It serves as a bridge between LLMs and your data engineering workflows.

Read our [launch post](https://kyrylai.com/2025/04/09/dagster-llm-orchestration-mcp-server/) to learn more.

[![PyPI version](https://badge.fury.io/py/mcp-server-dagster.svg)](https://pypi.org/project/mcp-server-dagster/)
[![Tests](https://github.com/kyryl-opens-ml/mcp-server-dagster/actions/workflows/pypi-publish.yaml/badge.svg)](https://github.com/kyryl-opens-ml/mcp-server-dagster/actions/workflows/pypi-publish.yaml)


## Components

### Tools

The server implements the following tools for interacting with Dagster:

- `list_repositories`: Lists all available Dagster repositories
- `list_jobs`: Lists all jobs in a specific repository
- `list_assets`: Lists all assets in a specific repository
- `recent_runs`: Gets recent Dagster runs (default limit: 10)
- `get_run_info`: Gets detailed information about a specific run
- `launch_run`: Launches a Dagster job run
- `materialize_asset`: Materializes a specific Dagster asset
- `terminate_run`: Terminates an in-progress Dagster run
- `get_asset_info`: Gets detailed information about a specific asset

## Configuration

The server connects to Dagster using these defaults:
- GraphQL endpoint: `http://localhost:3000/graphql`
- Transport: SSE (Server-Sent Events)
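If your Dagster webserver runs on a different host or port, the GraphQL endpoint follows the pattern `http://<host>:<port>/graphql`. A tiny hypothetical helper (not part of this server's API) for deriving it:

```python
# Hypothetical helper, not part of mcp-server-dagster's API: derives the
# GraphQL endpoint URL from the Dagster webserver's host and port.
def graphql_endpoint(host: str = "localhost", port: int = 3000) -> str:
    return f"http://{host}:{port}/graphql"

# The defaults reproduce the endpoint the server uses out of the box.
print(graphql_endpoint())  # http://localhost:3000/graphql
```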

## Quickstart

### Running the Example

1. Start the Dagster instance with your pipeline:
```bash
uv run dagster dev -f ./examples/open-ai-agent/pipeline.py
```

2. Run the MCP server with SSE transport:
```bash
uv run examples/open-ai-agent/run_sse_mcp.py
```

3. Start the agent loop to interact with Dagster:
```bash
uv run ./examples/open-ai-agent/agent.py
```

### Example Interactions

Once the agent is running, you can ask questions like:

- "What assets are available in my Dagster instance and what do they do?"
- "Can you materialize the continent_stats asset and show me the result?"
- "Check the status of recent runs and provide a summary of any failures"
- "Create a new monthly aggregation asset that depends on continent_stats"

The agent will use the MCP server to interact with your Dagster instance and provide answers based on your data pipelines.
