Metadata-Version: 2.4
Name: turborg
Version: 0.1.0
Summary: An open-source modular AI agent framework. Plug in IRC, Discord, Telegram, web — power it with Claude or any LLM.
Project-URL: Homepage, https://github.com/turborg/turborg
Project-URL: Repository, https://github.com/turborg/turborg
Project-URL: Issues, https://github.com/turborg/turborg/issues
Project-URL: Changelog, https://github.com/turborg/turborg/blob/main/CHANGELOG.md
Author-email: The turborg Authors <hello@xshellz.com>
Maintainer-email: xshellz <hello@xshellz.com>
License: Apache-2.0
License-File: LICENSE
License-File: NOTICE
Keywords: ai-agent,anthropic,chatbot,claude,discord,framework,irc,llm,telegram
Classifier: Development Status :: 3 - Alpha
Classifier: Framework :: AsyncIO
Classifier: Framework :: Pytest
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Topic :: Communications :: Chat :: Internet Relay Chat
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Typing :: Typed
Requires-Python: >=3.12
Requires-Dist: anthropic>=0.40
Requires-Dist: pydantic-settings>=2.2
Requires-Dist: pydantic>=2.6
Requires-Dist: python-dotenv>=1.0
Requires-Dist: typer>=0.12
Provides-Extra: all
Requires-Dist: fastapi>=0.115; extra == 'all'
Requires-Dist: mkdocs-material>=9.5; extra == 'all'
Requires-Dist: openai>=1.50; extra == 'all'
Requires-Dist: uvicorn[standard]>=0.30; extra == 'all'
Requires-Dist: websockets>=13; extra == 'all'
Provides-Extra: dev
Requires-Dist: httpx>=0.27; extra == 'dev'
Requires-Dist: mypy>=1.11; extra == 'dev'
Requires-Dist: pre-commit>=3.8; extra == 'dev'
Requires-Dist: pytest-asyncio>=0.24; extra == 'dev'
Requires-Dist: pytest-cov>=5.0; extra == 'dev'
Requires-Dist: pytest-randomly>=3.15; extra == 'dev'
Requires-Dist: pytest>=8.3; extra == 'dev'
Requires-Dist: ruff>=0.6; extra == 'dev'
Provides-Extra: docs
Requires-Dist: mkdocs-material>=9.5; extra == 'docs'
Provides-Extra: openai
Requires-Dist: openai>=1.50; extra == 'openai'
Provides-Extra: web
Requires-Dist: fastapi>=0.115; extra == 'web'
Requires-Dist: uvicorn[standard]>=0.30; extra == 'web'
Requires-Dist: websockets>=13; extra == 'web'
Description-Content-Type: text/markdown

# turborg

**An open-source modular AI agent framework. Plug in IRC, Discord, Telegram, web — power it with Claude or any LLM.**

[![License: Apache 2.0](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](LICENSE)
[![Python 3.12+](https://img.shields.io/badge/python-3.12%2B-blue.svg)](https://www.python.org/downloads/)
[![CI](https://github.com/turborg/turborg/actions/workflows/ci.yml/badge.svg)](https://github.com/turborg/turborg/actions/workflows/ci.yml)
[![Coverage](https://img.shields.io/badge/coverage-%E2%89%A590%25-brightgreen.svg)](#testing)

```python
from turborg.core import Agent, OutboundEnvelope
from turborg.connectors.irc import IRCConnector, IRCSettings
from turborg.llm.anthropic import AnthropicProvider

agent = Agent(llm=AnthropicProvider(api_key="sk-ant-..."))
agent.add_connector(IRCConnector(IRCSettings(hostname="irc.libera.chat", nick="myturborg")))

@agent.on_command("ask")
async def ask(envelope):
    answer = await agent.llm.ask(" ".join(envelope.args))
    return OutboundEnvelope.reply(envelope, answer)

agent.run()
```

That's a working IRC chatbot powered by Claude. The same code shape ports to any future connector — Discord, Telegram, web, or your own.

---

## What is turborg?

turborg is a Python framework for writing AI agents that connect to chat networks. It separates **how the bot talks to a network** (the connector) from **how it thinks** (the LLM provider) from **what it does** (your handlers). Add a Discord connector and the same handlers work on Discord. Swap from Claude to OpenAI and the same connectors keep working.
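To make the seam concrete, here is a minimal sketch of a swappable provider, assuming a single async `ask(prompt) -> str` method (the names `LLMProvider` and `EchoProvider` are illustrative; the real ABC lives in `turborg.llm` and may differ):

```python
import asyncio
from typing import Protocol


class LLMProvider(Protocol):
    """Anything with an async ask() can power the agent (illustrative)."""

    async def ask(self, prompt: str) -> str: ...


class EchoProvider:
    """A stand-in provider used here instead of a real LLM client."""

    async def ask(self, prompt: str) -> str:
        return f"echo: {prompt}"


async def main() -> None:
    # Handlers depend only on the protocol, so providers are swappable.
    llm: LLMProvider = EchoProvider()
    print(await llm.ask("hello"))


asyncio.run(main())  # prints "echo: hello"
```

Swapping Anthropic for OpenAI then means constructing a different provider object; the handlers and connectors never change.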

The architecture lives at three layers:

```
  Your handlers (commands, event hooks)
              │
              ▼
        ┌──────────┐         ┌──────────────┐
        │  Agent   │  ←────  │ LLM provider │   Anthropic / OpenAI / custom
        └──────────┘         └──────────────┘
              │
              ▼
        ┌────────────────┐
        │  Connectors    │   IRC / Discord / web / Telegram / ...
        └────────────────┘
```

A normalized `Envelope` is the lingua franca: the agent never sees IRC-versus-Discord specifics; only the connectors translate to and from each protocol.

For the long-term vision — **hive.xshellz.com**, a shared-intelligence cloud any turborg instance can attach to — see [docs/hive.md](docs/hive.md).

## Status

**v0.1.0** — alpha. The IRC connector and Anthropic provider are production-ready and well-tested (≥90% coverage gate enforced in CI). Other connectors are roadmap.

| Connector  | Status     | Install                                |
|------------|------------|----------------------------------------|
| IRC        | Stable     | `pip install turborg`                  |
| Discord    | Roadmap    | `pip install turborg[discord]` (TBD)   |
| Telegram   | Roadmap    | `pip install turborg[telegram]` (TBD)  |
| WhatsApp   | Roadmap    | `pip install turborg[whatsapp]` (TBD)  |
| Web        | v0.2 hook  | `pip install turborg[web]`             |

| LLM provider | Status     | Install                       |
|--------------|------------|-------------------------------|
| Anthropic    | Default    | `pip install turborg`         |
| OpenAI       | Available  | `pip install turborg[openai]` |

## Install

```bash
pip install turborg
```

Or with `uv`:

```bash
uv add turborg
```

For a working bot you also need an LLM API key:

```bash
export ANTHROPIC_API_KEY=sk-ant-...
```

See [docs/configuration.md](docs/configuration.md) for the full list of environment variables.
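The `TURBORG_*` variables follow the usual env-prefix convention from pydantic-settings; list-valued settings such as `TURBORG_IRC_CHANNELS` are passed as JSON arrays. A stdlib-only sketch of that parsing idea (the variable names come from the quickstart below; the loader logic here is an assumption, not turborg's actual config code):

```python
import json
import os


def irc_settings_from_env(prefix: str = "TURBORG_IRC_") -> dict[str, object]:
    """Read IRC settings from the environment (illustrative, not turborg's loader)."""
    hostname = os.environ[prefix + "HOSTNAME"]
    nick = os.environ[prefix + "NICK"]
    # A JSON array lets a single env var carry a list of channels.
    channels = json.loads(os.environ.get(prefix + "CHANNELS", "[]"))
    return {"hostname": hostname, "nick": nick, "channels": channels}


os.environ["TURBORG_IRC_HOSTNAME"] = "irc.libera.chat"
os.environ["TURBORG_IRC_NICK"] = "myturborg"
os.environ["TURBORG_IRC_CHANNELS"] = '["#turborg-test"]'
print(irc_settings_from_env())
```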

## Quickstart

The full 5-minute tutorial is at [docs/quickstart.md](docs/quickstart.md). The 30-second version:

```bash
pip install turborg
export ANTHROPIC_API_KEY=sk-ant-...
export TURBORG_IRC_HOSTNAME=irc.libera.chat
export TURBORG_IRC_NICK=myturborg
export TURBORG_IRC_CHANNELS='["#turborg-test"]'
python examples/claude_powered_irc.py
```

Then in IRC:

```
<you> !ask what is a quine?
<myturborg> A quine is a program that prints its own source code...
```

## Run with Docker

```bash
cp .env.example .env       # fill in ANTHROPIC_API_KEY + TURBORG_IRC_*
docker compose up
```

Or without compose:

```bash
docker run --env-file .env turborg/turborg:latest
```

To run a different example or your own bot, override `command:` in
`docker-compose.yml` or pass `python /app/examples/minimal_irc_bot.py` on
`docker run`. Mount your own script with `-v "$PWD/mybot.py:/app/mybot.py"`.
The image is multi-stage, ~150 MB on disk, and runs as a non-root user.

## Documentation

- [Quickstart](docs/quickstart.md) — get a bot online in 5 minutes
- [Architecture](docs/architecture.md) — how the agent, connectors, and LLM fit together
- [Configuration](docs/configuration.md) — every setting and environment variable
- [LLM providers](docs/llm-providers.md) — default Anthropic, swapping providers
- [Writing a connector](docs/writing-a-connector.md) — add Discord, Telegram, your own
- [Hive](docs/hive.md) — the future shared-intelligence cloud

## Examples

- [`examples/minimal_irc_bot.py`](examples/minimal_irc_bot.py) — the smallest possible bot (`!ping` → `pong`)
- [`examples/claude_powered_irc.py`](examples/claude_powered_irc.py) — `!ask <question>` proxies to Claude

## Project layout

```
turborg/
├── src/turborg/
│   ├── core/          Agent, envelope, event bus, command registry
│   ├── connectors/    Connector ABC + per-protocol implementations
│   │   └── irc/       IRC connector (handshake, parser, bouncer)
│   ├── llm/           LLM provider ABC + Anthropic implementation
│   ├── hive/          Hive client extension hook (noop default)
│   ├── api/           v0.2+ control-plane HTTP API (placeholder)
│   ├── config/        pydantic-settings config
│   └── cli.py         turborg CLI entry point
├── tests/             unit + integration suites
├── examples/          runnable example bots
└── docs/              full documentation
```

## Contributing

Contributions are welcome. See [CONTRIBUTING.md](CONTRIBUTING.md) for the dev setup, branching strategy, and PR rules. By submitting a contribution you agree to the [Contributor License Agreement](CLA.md) — the cla-assistant bot will guide you on first PR.

The maintainers run a strict CI gate: every PR must pass `ruff`, `mypy --strict`, and tests with ≥90% coverage. See the [Style](#style) section below.

## License

[Apache License 2.0](LICENSE) — see [TRADEMARKS.md](TRADEMARKS.md) for the trademark policy on the names "turborg" and "xshellz".

## Security

Found a vulnerability? Please **do not** open a public issue. See [SECURITY.md](SECURITY.md) for the responsible-disclosure process.

## Style

- Conventional Commits for PR titles (`feat:`, `fix:`, `docs:`, `refactor:`, `chore:`)
- Squash-merge to `main`; linear history
- 100-char line limit, ruff-formatted, mypy-strict
- pytest with branch coverage; fail under 90%
- No `Co-Authored-By: AI` trailers in commit messages

## Powered by

- [Anthropic Claude](https://www.anthropic.com/) — default LLM
- [pydantic](https://pydantic.dev/) — settings and envelope validation
- [typer](https://typer.tiangolo.com/) — CLI
- [hatchling](https://hatch.pypa.io/) — build backend
- [uv](https://docs.astral.sh/uv/) — env management

---

*Part of the [**xshellz**](https://www.xshellz.com) ecosystem. The future hosted hive lives at [hive.xshellz.com](https://hive.xshellz.com).*
