Metadata-Version: 2.4
Name: agentswarm-bedrock-agentcore
Version: 0.3.1
Summary: AWS Bedrock AgentCore implementation for ai-agentswarm.
Author-email: Luca Roverelli <luca.roverelli@gmail.com>
Project-URL: Homepage, https://github.com/ai-agentswarm/agentswarm-bedrock-agentcore
Project-URL: Bug Tracker, https://github.com/ai-agentswarm/agentswarm-bedrock-agentcore/issues
Keywords: ai,agents,aws,bedrock,agentcore
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.10
Description-Content-Type: text/markdown
Requires-Dist: ai-agentswarm>=0.5.1
Requires-Dist: bedrock-agentcore
Requires-Dist: pydantic>=2.0.0
Provides-Extra: dev
Requires-Dist: pytest; extra == "dev"
Requires-Dist: pytest-asyncio; extra == "dev"
Requires-Dist: twine; extra == "dev"
Requires-Dist: build; extra == "dev"
Requires-Dist: black; extra == "dev"
Requires-Dist: isort; extra == "dev"
Provides-Extra: examples
Requires-Dist: python-dotenv; extra == "examples"

# agentswarm-bedrock-agentcore

An implementation of the [AWS Bedrock AgentCore SDK](https://github.com/aws/bedrock-agentcore-sdk-python) for the `ai-agentswarm` framework.

This library eliminates the boilerplate required to create and execute agents on the Bedrock infrastructure.

## Installation

```bash
pip install agentswarm-bedrock-agentcore
```

## Usage

### 1. Defining and Hosting your Agent
Extend `BedrockAgent` to create your agent, then call its `.serve()` method to start a Bedrock AgentCore-compatible WebSocket server.

```python
from agentswarm.bedrock_agentcore import BedrockAgent
from agentswarm.datamodels import Context
from agentswarm.llms import GoogleGenAI

# Initialize your LLM
llm = GoogleGenAI(model_name="gemini-1.5-pro")

class MyBedrockAgent(BedrockAgent):
    def id(self) -> str:
        return "my-awesome-agent"

    async def execute(self, user_id: str, context: Context, input: str | None = None):
        # Your custom agent logic here
        # The context will have the default_llm if passed to .serve()
        return f"Bedrock Agent says: I processed '{input}'"

if __name__ == "__main__":
    # Start the server and pass the default_llm
    MyBedrockAgent().serve(port=8000, default_llm=llm)
```

### 2. Invoking your Agent Remotely
Use `BedrockRemoteAgent` to call an agent that is already running. It supports both local WebSocket endpoints and AWS Bedrock ARNs.

```python
from agentswarm.bedrock_agentcore import BedrockRemoteAgent
from agentswarm.datamodels import Context

# Use an ARN for cloud invocation (after deployment)
# or "http://localhost:8000" for local testing
AGENT_ENDPOINT = "arn:aws:bedrock-agentcore:us-west-2:123456789012:runtime/my-agent-abc"

# Create a proxy for the remote agent
remote_agent = BedrockRemoteAgent(
    endpoint_url=AGENT_ENDPOINT,
    remote_agent_id="my-awesome-agent"
)

# Use it like any other AgentSwarm agent.
# Note: `await` needs an async context, so wrap the call in a coroutine.
import asyncio

async def main():
    # Provide the required context arguments for initialization
    result = await remote_agent.execute(
        user_id="user-123",
        context=Context(trace_id="trace-1", messages=[], store=None, tracing=None),
        input="Hello Bedrock!"
    )
    print(result)

asyncio.run(main())
```
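Since the same `endpoint_url` parameter carries either a local URL or an AWS ARN, a client wrapper may want to tell the two apart. A minimal sketch of that check — the helper name `is_bedrock_arn` is illustrative and not part of the library's API:

```python
def is_bedrock_arn(endpoint: str) -> bool:
    """Return True when the endpoint looks like a Bedrock AgentCore ARN
    rather than a local URL such as http://localhost:8000."""
    return endpoint.startswith("arn:aws:bedrock-agentcore:")

# Example: branch on the endpoint form before deciding how to connect.
print(is_bedrock_arn("arn:aws:bedrock-agentcore:us-west-2:123456789012:runtime/my-agent-abc"))  # True
print(is_bedrock_arn("http://localhost:8000"))  # False
```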

## Deployment

To deploy your agent natively to the **Amazon Bedrock AgentCore Runtime**, use the official **Bedrock AgentCore Starter Toolkit**.

### 1. Install the Toolkit
```bash
pip install bedrock-agentcore-starter-toolkit
```

### 2. Configure your Agent
Run the interactive configuration tool to set up your deployment (entrypoint, region, runtime, etc.).

```bash
agentcore configure
```

This will create or update a `.bedrock_agentcore.yaml` file with your settings.

### 3. Launch to AWS Bedrock
The `launch` command packages your code, installs dependencies from `requirements.txt`, and deploys it to the managed Bedrock runtime.

```bash
# Recommended: Cloud-based deployment (Direct Code Deploy)
agentcore launch
```

This command will return an **Agent ARN** which you can then use to invoke your agent via `BedrockRemoteAgent`.

## Configuration

### requirements.txt
Ensure your `requirements.txt` includes the necessary libraries for the remote runtime:
```text
ai-agentswarm>=0.5.1
agentswarm-bedrock-agentcore>=0.1.0
```

### default_llm Support
You can configure the model via environment variables in your `.bedrock_agentcore.yaml`:

```yaml
env:
  DEFAULT_LLM_MODEL: gemini-2.5-flash
```
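One way your entrypoint can honor that variable is to resolve the model name from the environment before constructing the LLM. A sketch, assuming your serve script builds the LLM itself; `resolve_model_name` is a hypothetical helper, not part of the library:

```python
import os

def resolve_model_name(default: str = "gemini-2.5-flash") -> str:
    """Return DEFAULT_LLM_MODEL from the environment, or the given fallback."""
    return os.environ.get("DEFAULT_LLM_MODEL", default)

# e.g. llm = GoogleGenAI(model_name=resolve_model_name())
#      MyBedrockAgent().serve(port=8000, default_llm=llm)
```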

## Quick Start: Testing and Deployment

### 1. Install Dependencies
```bash
# Core logic and Bedrock implementation
pip install ai-agentswarm agentswarm-bedrock-agentcore

# Deployment Toolkit
pip install bedrock-agentcore-starter-toolkit
```

### 2. Local Testing
Verify the server/client round trip locally before deploying.

**Server:**
```bash
export RUN_SERVER=true
python examples/bedrock_demo.py
```

**Client (Proxy):**
```bash
# In another terminal
python examples/bedrock_demo.py
```
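The demo toggles its role via the `RUN_SERVER` environment variable. A minimal sketch of that branching logic — the function name `demo_mode` is illustrative; see `examples/bedrock_demo.py` for the actual script:

```python
import os

def demo_mode() -> str:
    """Return "server" when RUN_SERVER is set to "true", otherwise "client"."""
    return "server" if os.environ.get("RUN_SERVER", "").lower() == "true" else "client"

# if demo_mode() == "server":
#     MyBedrockAgent().serve(port=8000, default_llm=llm)
# else:
#     ... invoke via BedrockRemoteAgent(endpoint_url="http://localhost:8000", ...)
```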

### 3. Native Bedrock Deployment
1. `export UV_CACHE_DIR=./.uv_cache` (optional; works around cache-permission issues)
2. `agentcore configure`
3. `agentcore launch`


## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
