Metadata-Version: 2.4
Name: otto-agent
Version: 0.12.2
Summary: Otto — AI agent platform
Project-URL: Homepage, https://github.com/1broseidon/otto
Project-URL: Documentation, https://otto-agent.dev
Project-URL: Repository, https://github.com/1broseidon/otto
Project-URL: Issues, https://github.com/1broseidon/otto/issues
Author-email: George <george@example.com>
License-Expression: MIT
License-File: LICENSE
Keywords: agent,automation,cli,tools
Classifier: Development Status :: 3 - Alpha
Classifier: Environment :: Console
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Utilities
Requires-Python: >=3.12
Requires-Dist: agent-client-protocol>=0.8.1
Requires-Dist: aiohttp>=3.9.0
Requires-Dist: anthropic>=0.52.0
Requires-Dist: beautifulsoup4>=4.12.0
Requires-Dist: brainfile>=0.4.0
Requires-Dist: clypi>=0.2.0
Requires-Dist: croniter>=2.0.0
Requires-Dist: discord-py>=2.0
Requires-Dist: filelock>=3.16.0
Requires-Dist: jinja2>=3.1.0
Requires-Dist: litellm<1.82.7,>=1.81.8
Requires-Dist: markdownify>=0.11.0
Requires-Dist: matplotlib>=3.10.8
Requires-Dist: mcp>=1.0.0
Requires-Dist: mistune>=3.2.0
Requires-Dist: pdfplumber>=0.10.0
Requires-Dist: pillow>=10.0.0
Requires-Dist: playwright>=1.40.0
Requires-Dist: prompt-toolkit>=3.0.52
Requires-Dist: python-telegram-bot[job-queue]>=20.0
Requires-Dist: pyyaml>=6.0
Requires-Dist: structlog>=25.0.0
Requires-Dist: tomlkit>=0.13.0
Requires-Dist: webauthn<3,>=2.0.0
Provides-Extra: all
Requires-Dist: faster-whisper>=1.0.0; extra == 'all'
Provides-Extra: voice
Requires-Dist: faster-whisper>=1.0.0; extra == 'voice'
Description-Content-Type: text/markdown

<p align="center">
  <img src="docs/otto-logo.png" alt="Otto" width="180" />
</p>

<h1 align="center">Otto</h1>

<p align="center">
  <strong>Self-hosted AI agent platform. Telegram bot, MCP tools, scheduled jobs, persistent memory.</strong>
</p>

<p align="center">
  <a href="https://pypi.org/project/otto-agent/"><img src="https://img.shields.io/pypi/v/otto-agent" alt="PyPI" /></a>
  <a href="LICENSE"><img src="https://img.shields.io/badge/license-MIT-green" alt="License" /></a>
  <a href="#"><img src="https://img.shields.io/badge/python-3.12+-yellow" alt="Python" /></a>
</p>

---

Otto is a personal AI agent that runs on your machine and talks to you through Telegram. It connects to any LLM via LiteLLM, exposes tools through MCP, runs scheduled jobs, and remembers context across conversations.

No cloud platform. No vendor lock-in. Just `pip install otto-agent` and go.

## Install

```bash
pip install otto-agent
```

Or with uv:

```bash
uv tool install otto-agent
```

## Setup

```bash
otto setup
```

The wizard walks you through:
- Choosing a model (`provider/model-name` format — Anthropic, OpenAI, Google, Ollama, OpenRouter, etc.)
- Connecting your Telegram bot token (from [@BotFather](https://t.me/botfather))
- Setting an owner ID so only you can use it

Config lives in `~/.otto/`.
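For reference, a minimal `~/.otto/config.yaml` might look like the sketch below. The key names here are assumptions for illustration — the setup wizard generates the real file, and the actual schema may differ.

```yaml
# Illustrative sketch only — run `otto setup` to generate the real file.
model: anthropic/claude-sonnet-4   # provider/model-name format
telegram:
  token: "123456:ABC-your-bot-token"   # from @BotFather
  owner_id: 123456789                  # only this Telegram user may talk to the bot
web:
  port: 7070                           # built-in dashboard (see Features below)
```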

## Usage

Start the Telegram bot:

```bash
otto start        # daemonized
otto run          # foreground (for debugging)
```

Manage the process:

```bash
otto status       # check if running
otto stop         # stop the daemon
otto logs         # tail recent logs
```

Configure the model:

```bash
otto config model get
otto config model set openai/gpt-4o
otto config model list
```

## Telegram Commands

| Command | What it does |
|---------|-------------|
| `/model` | Switch LLM model |
| `/tools` | List available tools |
| `/memory` | Search stored memories |
| `/stop` | Cancel a running response |
| `/session` | Start a fresh conversation |

### `/session pin` + Telegram Topics (important)

`/session pin <Topic Name>` binds context to a Telegram Topic thread.

For **group chats**: Topics must be enabled for the supergroup and the bot must have topic-management rights.

For **1:1 private chats**: Telegram requires enabling **Topics in private chats** for your bot in **@BotFather** (eligible bots only). If this is off, `createForumTopic` fails in DM.

BotFather path:
- `@BotFather` → `/mybots` → select bot → settings/features
- Enable **Topics in private chats**
- (Optional) Enable user topic create/delete controls

If you don't see this toggle, the bot may be ineligible, or Telegram may not have rolled the feature out to your account or client yet (desktop/web BotFather usually exposes new settings first).

Fee note (Telegram Terms): enabling topics in private chats may apply a **15% non-refundable Telegram Stars fee** for purchases made in that bot while the feature is enabled.

## Discord Channel (Operator Rollout)

Discord setup and rollout runbook:
- `docs/specs/discord-operator-rollout.md`
- `docs/specs/discord-channel-architecture.md`

Channel config snippet:

```toml
[[bots.channels]]
type = "discord"
token = "${DISCORD_BOT_TOKEN}"
enabled = true
```

Rollout rule: keep Discord in dev/canary until the Discord test suite is green, then promote phase by phase behind observability gates, with a documented rollback path to Telegram.

## Features

**Multi-backend LLM** — Any model supported by LiteLLM: Anthropic, OpenAI, Google, Ollama, OpenRouter, and more. Switch models mid-conversation with `/model`. Supports **OAuth-authenticated models** (Claude Code, Google Code, OpenAI Codex).

**MCP Tool Gateway** — Tools are MCP servers defined in `~/.otto/tools.yaml`. Otto connects to them at startup and exposes them to the agent. Includes **Workspace Policies** for sandboxed file operations (default vs. strict modes).
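As a sketch, a `tools.yaml` entry for a stdio MCP server might look like this. The field names follow common MCP client conventions and are assumptions, not Otto's documented schema — consult the Otto docs for the exact format.

```yaml
# Illustrative only — field names may differ from Otto's actual schema.
servers:
  filesystem:
    command: npx
    args: ["-y", "@modelcontextprotocol/server-filesystem", "~/projects"]
  fetch:
    command: uvx
    args: ["mcp-server-fetch"]
```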

**Persistent Memory** — Stores and retrieves context across sessions. Memory is searchable and can persist identity/personality rules.

**Agent Orchestration** — Otto can delegate tasks to async sub-agents running in parallel. Delegation is fire-and-forget: Otto returns immediately and notifies you via Telegram when the job is done. Sub-agents run with the same tools and model access as the main session. Delegation is contract-based — specify deliverables, constraints, and optional validation commands so results are verified before delivery.

Built-in delegation tools:
- `delegate_task` — spawn a background sub-agent with a structured contract
- `list_jobs` — inspect status of all delegated jobs
- `cancel_job` — cancel a running sub-agent
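Based on the description above, a `delegate_task` contract could be sketched roughly like this. The argument names are hypothetical — they illustrate the deliverables/constraints/validation shape described above, not Otto's actual tool signature.

```yaml
# Hypothetical delegate_task arguments — field names are illustrative.
task: "Convert docs/report.md to PDF and send it to me"
deliverables:
  - "report.pdf delivered as a Telegram file"
constraints:
  - "Do not modify files outside the workspace"
validation:
  - "test -f report.pdf"   # run before the result is delivered
```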

**Scheduled Jobs** — Cron-style scheduling built in, with background prompt execution.
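A scheduled job might be declared along these lines. This is a hypothetical shape — Otto's actual scheduling config may differ — but the cron expression itself follows standard croniter syntax, which Otto depends on.

```yaml
# Hypothetical job definition — field names are illustrative.
jobs:
  - name: morning-briefing
    cron: "0 8 * * 1-5"   # weekdays at 08:00
    prompt: "Summarize overnight GitHub notifications and send highlights."
```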

**Web UI** — Built-in dashboard for monitoring status, viewing logs, and managing configuration (default: http://localhost:7070).

**Telegram UX** — Interactive commands, inline controls, status cards, and chunked delivery for long responses.

**File Sending** — Send files (PDFs, images, documents) directly to Telegram.

## Architecture

```
You (Telegram / Web)
      │
      ▼
┌──────────────┐
│ Telegram Bot │──── Commands (/model, /tools, /stop, ...)
└──────┬───────┘
       │
┌──────▼───────┐
│    Web UI    │──── Dashboard, monitoring, config
└──────┬───────┘
       ▼
┌──────────────┐
│  Chat Layer  │──── Sessions, memory, system prompt
└──────┬───────┘
       ▼
┌──────────────┐
│    Agent     │──── Tool-calling loop (LiteLLM → any LLM)
└──────┬───────┘
       ▼
┌──────────────┐
│ MCP Gateway  │──── Connects to tool servers defined in tools.yaml
└──────────────┘
```

## Configuration

All config is in `~/.otto/`:

| File | Purpose |
|------|---------|
| `config.yaml` | Model, Telegram token, owner ID, web/workspace settings |
| `tools.yaml` | MCP tool server definitions |
| `skills/` | Custom skill modules |
| `memory.db` | Persistent memory store |
| `sessions/` | Conversation history |
| `logs/` | Structured logs |
| `credentials/` | OAuth provider tokens |

## Development

```bash
git clone https://github.com/1broseidon/otto.git
cd otto
uv sync --dev
make check         # lint + test
otto web           # start web UI for development
```

## License

MIT
