Metadata-Version: 2.4
Name: my-agents-adapter
Version: 0.0.8
Summary: Agents hosting adapter for Azure AI
Author: Microsoft Corporation
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Requires-Python: >=3.10
Description-Content-Type: text/markdown
Requires-Dist: azure-monitor-opentelemetry
Requires-Dist: azure-ai-projects
Requires-Dist: azure-ai-agents>=1.2.0b5
Requires-Dist: azure-identity
Requires-Dist: opentelemetry-api
Requires-Dist: starlette
Requires-Dist: uvicorn
Provides-Extra: agentframework
Requires-Dist: agent_framework_azure_ai==1.0.0b251007; extra == "agentframework"
Requires-Dist: opentelemetry-exporter-otlp-proto-grpc; extra == "agentframework"
Provides-Extra: langgraph
Requires-Dist: langchain; extra == "langgraph"
Requires-Dist: langchain-openai; extra == "langgraph"
Requires-Dist: langchain-azure-ai[opentelemetry]; extra == "langgraph"
Requires-Dist: langgraph; extra == "langgraph"
Requires-Dist: opentelemetry-exporter-otlp-proto-http; extra == "langgraph"

## Supported frameworks

- [WIP] langgraph
- [WIP] Microsoft Agent Framework

## Install

In the current folder, run:
```bash
pip install -e .
```
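
To pull in the optional framework dependencies, install with an extra. The extra names below come from this package's `Provides-Extra` metadata entries:

```bash
# Install with the langgraph extra (langchain, langchain-openai, langgraph, OTLP HTTP exporter)
pip install -e ".[langgraph]"

# Or with the Microsoft Agent Framework extra
pip install -e ".[agentframework]"
```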

## Usage

### langgraph

```python
# your existing agent
from my_langgraph_agent import my_awesome_agent

# langgraph utils
from azure.ai.agentshosting import from_langgraph

if __name__ == "__main__":
    # this single line hosts your agent at http://localhost:8088
    from_langgraph(my_awesome_agent).run()

```

**Note**
If your langgraph agent does not use langgraph's built-in [`MessagesState`](https://langchain-ai.github.io/langgraph/concepts/low_level/?h=messagesstate#messagesstate), implement your own `LanggraphStateConverter` and pass it to `from_langgraph`.

See this [example](../../../samples/python/langgraph/custom_state/main.py) for details.


### Microsoft Agent Framework

```python
# your existing agent
from my_framework_agent import my_awesome_agent

# agent framework utils
from azure.ai.agentshosting import from_agent_framework

if __name__ == "__main__":
    # this single line hosts your agent at http://localhost:8088
    from_agent_framework(my_awesome_agent).run()

```

### Custom Code
If your agent is not built with a supported framework, you can still make it compatible with Microsoft AI Foundry by implementing the predefined interface manually.

```python
import datetime

from azure.ai.agentshosting import FoundryCBAgent
from azure.ai.agentshosting.models.azureaiagents.models import CreateResponse
from azure.ai.agentshosting.models.openai.models import (
    ItemContentOutputText,
    Response as OpenAIResponse,
    ResponsesAssistantMessageItemResource,
    ResponseTextDeltaEvent,
    ResponseTextDoneEvent,
)


def stream_events(text: str):
    assembled = ""
    tokens = text.split(" ")
    for i, token in enumerate(tokens):
        # Re-add the separator for every token except the last, so the
        # concatenated deltas reproduce the original text exactly
        piece = token if i == len(tokens) - 1 else token + " "
        assembled += piece
        yield ResponseTextDeltaEvent(delta=piece)
    # Signal the end of the stream with the fully assembled text
    yield ResponseTextDoneEvent(text=assembled)


async def agent_run(request_body: CreateResponse):
    agent = request_body.agent
    print(f"agent: {agent}")

    if request_body.stream:
        return stream_events("I am mock agent with no intelligence in stream mode.")

    # Build assistant output content
    output_content = [
        ItemContentOutputText(
            text="I am mock agent with no intelligence.",
            annotations=[],
        )
    ]

    response = OpenAIResponse(
        metadata={},
        temperature=0.0,
        top_p=0.0,
        user="me",
        id="id",
        created_at=datetime.datetime.now(),
        output=[
            ResponsesAssistantMessageItemResource(
                status="completed",
                content=output_content,
            )
        ],
    )
    return response


my_agent = FoundryCBAgent()
my_agent.agent_run = agent_run

if __name__ == "__main__":
    my_agent.run()

```
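
The delta-then-done pattern used in `stream_events` can be exercised without the hosting SDK. This standalone sketch (plain Python, no `azure.ai.agentshosting` imports) mirrors the same whitespace-preserving split, so you can verify that joining the deltas reproduces the original text:

```python
def split_deltas(text: str):
    """Yield whitespace-preserving pieces, mirroring stream_events' split logic."""
    tokens = text.split(" ")
    for i, token in enumerate(tokens):
        # Every token except the last keeps its trailing space, so
        # concatenating the deltas reconstructs the original text.
        yield token if i == len(tokens) - 1 else token + " "


deltas = list(split_deltas("hello streaming world"))
print(deltas)           # ['hello ', 'streaming ', 'world']
print("".join(deltas))  # hello streaming world
```

In the real handler each piece becomes a `ResponseTextDeltaEvent`, and the joined string is the payload of the final `ResponseTextDoneEvent`.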
