Metadata-Version: 2.4
Name: pydantic-ai-resilient-mcp
Version: 0.1.0
Summary: Crash-resilient wrapper for Pydantic AI MCP servers
Author-email: Your Name <your.email@example.com>
License: MIT
Project-URL: Homepage, https://github.com/yourusername/pydantic-ai-resilient-mcp
Project-URL: Repository, https://github.com/yourusername/pydantic-ai-resilient-mcp
Project-URL: Issues, https://github.com/yourusername/pydantic-ai-resilient-mcp/issues
Keywords: mcp,pydantic-ai,crash-handling,sse
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: pydantic-ai-slim[mcp]>=0.1.0
Dynamic: license-file

# pydantic-ai-resilient-mcp

Crash-resilient wrapper for Pydantic AI MCP servers. Handles server crashes gracefully without breaking your agent.

## Installation

```bash
pip install pydantic-ai-resilient-mcp
```

## Quick Start

```python
from pydantic_ai import Agent
from pydantic_ai_resilient_mcp import ResilientMCPSSE

# Wrap your SSE MCP server
server = ResilientMCPSSE('http://localhost:8000/sse')

async with server:
    agent = Agent(model, toolsets=[server])  # `model` is any Pydantic AI model

    # Use normally - crashes are handled automatically
    result = await agent.run('Your prompt here')
    print(result.output)
```

## What It Does

When your MCP server crashes:
- ✅ Catches the crash gracefully
- ✅ Returns clean error message: `{"error": "Host server crashed. Cannot use tools anymore"}`
- ✅ Prevents infinite retry loops
- ✅ Agent continues running instead of crashing
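The library's internals aren't shown here, but the behavior listed above can be sketched in plain Python. This is an illustrative pattern only, not the package's actual implementation; `ResilientCall` and `flaky_tool` are hypothetical names:

```python
import asyncio
import json


class ResilientCall:
    """Sketch of the crash-handling pattern: wrap an async tool call,
    convert transport failures into a clean error payload, and
    short-circuit once the server is known to be down."""

    CRASH_PAYLOAD = {"error": "Host server crashed. Cannot use tools anymore"}

    def __init__(self, call):
        self._call = call       # the underlying async tool call
        self._crashed = False   # latched after the first crash

    async def __call__(self, *args, **kwargs):
        if self._crashed:
            # No retry loop: once crashed, every call returns immediately.
            return self.CRASH_PAYLOAD
        try:
            return await self._call(*args, **kwargs)
        except (ConnectionError, OSError):
            self._crashed = True
            return self.CRASH_PAYLOAD


async def flaky_tool(x):
    # Stand-in for an MCP tool whose host server has died mid-call.
    raise ConnectionError("server went away")


async def main():
    wrapped = ResilientCall(flaky_tool)
    print(json.dumps(await wrapped(1)))  # crash caught, clean payload returned
    print(json.dumps(await wrapped(2)))  # short-circuited, no retry attempted


asyncio.run(main())
```

The key design point is the latched `_crashed` flag: once the transport has failed, further calls return the error payload without touching the network, which is what prevents infinite retry loops.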

## Example

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.openai import OpenAIProvider
from pydantic_ai_resilient_mcp import ResilientMCPSSE

model = OpenAIChatModel('gpt-4o', provider=OpenAIProvider(api_key='...'))
server = ResilientMCPSSE('http://your-server:8000/sse')

async with server:
    agent = Agent(model, toolsets=[server])

    # Works before crash
    result1 = await agent.run('Use a tool')
    print(result1.output)

    # Server crashes during this call - handled gracefully
    result2 = await agent.run('Use another tool')
    print(result2.output)  # {"error": "Host server crashed..."}

    # No infinite retries - agent continues cleanly
```

## License

MIT
