Metadata-Version: 2.4
Name: vela-cli
Version: 0.1.0
Summary: Runtime observability CLI for AI agents
Author-email: Vela <hello@vela.wtf>
License: MIT
Project-URL: Homepage, https://vela.wtf
Project-URL: Repository, https://github.com/vela-ai/vela
Project-URL: Bug Tracker, https://github.com/vela-ai/vela/issues
Keywords: ai,agents,observability,llm,tracing,cli
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Requires-Python: >=3.9
Description-Content-Type: text/markdown
Requires-Dist: click>=8.0
Requires-Dist: requests>=2.28
Requires-Dist: mitmproxy>=10.0
Requires-Dist: rich>=13.0

# Vela CLI

Runtime observability for AI agents — no code changes required.

## Install

```bash
# Homebrew (macOS / Linux)
brew tap vela-ai/vela
brew install vela

# pip
pip install vela-cli
```

## Quick start

```bash
vela login
vela wrap python my_agent.py
```

That's it. Every LLM call is captured and visible at [vela.wtf/dashboard](https://vela.wtf/dashboard).

---

## Commands

### `vela login`

Opens `vela.wtf/cli-auth` in your browser. After authenticating, your API key
is saved to `~/.vela/config.json`. Takes about 10 seconds.

```
$ vela login
Opening browser...
Waiting for authentication...
✓ Logged in as you@example.com. API key saved.
```
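Other commands presumably read the key back from that file; a minimal sketch of doing so (the JSON schema, including the `api_key` field name, is an assumption, not documented behaviour):

```python
import json
from pathlib import Path
from typing import Optional

def load_api_key(config_path: Optional[Path] = None) -> Optional[str]:
    """Return the saved API key, or None if not logged in.
    NOTE: the `api_key` field name is assumed, not a documented schema."""
    path = config_path or Path.home() / ".vela" / "config.json"
    try:
        return json.loads(path.read_text()).get("api_key")
    except FileNotFoundError:
        return None
```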

### `vela logout`

Removes `~/.vela/config.json`.

```
$ vela logout
✓ Logged out.
```

### `vela wrap <command>`

Runs any command with transparent LLM interception. Every request to OpenAI,
Anthropic, Gemini, Groq, or OpenRouter is captured — model, latency, tokens,
cost — and streamed to your Vela dashboard in real time.

```
$ vela wrap python my_agent.py
[vela] Session started · a3f2bc91
[vela] gpt-4o · 1.2s · $0.0043
[vela] claude-3-5-sonnet-20241022 · 0.8s · $0.0031
[vela] Session complete · $0.0074 · view: https://vela.wtf/session/a3f2bc91...
```

Works with any language or runtime:

```bash
vela wrap python my_agent.py
vela wrap node agent.js
vela wrap python -m my_module --flag value
```

**How it works:** `vela wrap` starts a local MITM proxy (port 9877) and sets
`HTTP_PROXY` / `HTTPS_PROXY` for the subprocess. The proxy intercepts LLM API
responses, extracts trace data, and POSTs it to the ingest API; the wrapped
process's own requests and responses pass through unchanged.

**SSL:** The mitmproxy CA certificate (`~/.mitmproxy/mitmproxy-ca-cert.pem`)
is set via `SSL_CERT_FILE` / `REQUESTS_CA_BUNDLE` / `NODE_EXTRA_CA_CERTS` so
wrapped processes trust the proxy without system-wide cert changes.
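The environment handoff described above can be sketched as follows. The port, variable names, and certificate path are taken from this README; `proxy_env` itself is a hypothetical helper, not Vela's actual code:

```python
from pathlib import Path

def proxy_env(base: dict) -> dict:
    """Copy `base` and point proxy + CA-bundle variables at the local
    mitmproxy instance, as described in "How it works" above."""
    env = dict(base)
    env["HTTP_PROXY"] = env["HTTPS_PROXY"] = "http://127.0.0.1:9877"
    ca = str(Path.home() / ".mitmproxy" / "mitmproxy-ca-cert.pem")
    env["SSL_CERT_FILE"] = env["REQUESTS_CA_BUNDLE"] = env["NODE_EXTRA_CA_CERTS"] = ca
    return env
```

The wrapped command would then run with that environment, e.g. `subprocess.call(cmd, env=proxy_env(dict(os.environ)))`.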

### `vela status`

Shows your account and the last 5 sessions.

```
$ vela status
Logged in as you@example.com

  a3f2bc91  complete  1.2s  $0.017
  b7e1da02  complete  1.4s  $0.019
  c8f3ea14  error     0.8s  $0.012
  d4a9fb27  complete  1.1s  $0.016
  e2b5fc38  running   —     —
```

### `vela init`

Detects your project's entry file and inserts the Vela SDK initialisation
block at the top. Asks for confirmation before writing.

```
$ vela init
Detected: my_agent.py (OpenAI, LangChain)

Will add to top of my_agent.py:

  import vela
  vela.init(api_key="vela_xxx")

Add Vela to my_agent.py? [Y/n]: y
✓ Added Vela to my_agent.py
```

### `vela update`

Upgrades to the latest release.

```
$ vela update
✓ vela-cli updated.
```

---

## Supported LLM providers

| Provider | Host intercepted |
|---|---|
| OpenAI | `api.openai.com` |
| Anthropic | `api.anthropic.com` |
| Google Gemini | `generativelanguage.googleapis.com` |
| Groq | `api.groq.com` |
| OpenRouter | `openrouter.ai` |
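Interception is presumably keyed off the request host; a minimal sketch of that check against the table above (`should_capture` is a hypothetical name, not Vela's API):

```python
from urllib.parse import urlsplit

# Hosts from the "Supported LLM providers" table above.
INTERCEPTED_HOSTS = {
    "api.openai.com",
    "api.anthropic.com",
    "generativelanguage.googleapis.com",
    "api.groq.com",
    "openrouter.ai",
}

def should_capture(url: str) -> bool:
    """True if the request targets a supported LLM provider."""
    return urlsplit(url).hostname in INTERCEPTED_HOSTS
```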

---

## Offline mode

If `vela.wtf` is unreachable, `vela wrap` still runs the wrapped process
normally and prints:

```
[vela] Offline mode — traces will not be saved.
```

The process is never interrupted for network reasons.
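That guarantee amounts to swallowing network errors when uploading traces; a hedged sketch of the pattern (the ingest URL and function name are placeholders, not the documented endpoint):

```python
import json
import urllib.request

def post_trace(trace: dict, url: str = "https://vela.wtf/api/ingest") -> bool:
    """Try to upload one trace; return False instead of raising on any
    network failure, so the wrapped process is never interrupted.
    NOTE: the ingest URL is a placeholder, not a documented endpoint."""
    req = urllib.request.Request(
        url,
        data=json.dumps(trace).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        urllib.request.urlopen(req, timeout=5)
        return True
    except OSError:  # URLError, timeouts, and connection errors all subclass OSError
        return False
```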

---

## Homebrew tap

The tap lives at [github.com/vela-ai/vela](https://github.com/vela-ai/vela);
the formula at `Formula/vela.rb` in that repository is the canonical source.

```bash
brew tap vela-ai/vela
brew install vela
```

---

## Requirements

- Python 3.9+
- `mitmproxy` (installed automatically as a dependency)
