Metadata-Version: 2.4
Name: hey-nabu-climate-concierge
Version: 0.4.0
Summary: DIY Tesla in-car voice concierge: FastMCP server + HA configs + browser PWA. Climate, media, windows, seats.
Project-URL: Homepage, https://github.com/Raymondriter/hey-nabu-climate-concierge
Project-URL: Repository, https://github.com/Raymondriter/hey-nabu-climate-concierge
Project-URL: Issues, https://github.com/Raymondriter/hey-nabu-climate-concierge/issues
Project-URL: Changelog, https://github.com/Raymondriter/hey-nabu-climate-concierge/blob/main/CHANGELOG.md
Author-email: Raymondriter <tenzinshare@gmail.com>
License: MIT
License-File: LICENSE
Keywords: climate,concierge,home-assistant,home-automation,mcp,ollama,pwa,tesla,voice,whisper
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: End Users/Desktop
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Classifier: Topic :: Home Automation
Classifier: Topic :: Multimedia :: Sound/Audio :: Speech
Requires-Python: >=3.12
Requires-Dist: httpx>=0.27
Requires-Dist: mcp>=1.4
Requires-Dist: pydantic>=2.7
Requires-Dist: rich>=13.7
Requires-Dist: typer>=0.12
Requires-Dist: websockets>=14
Provides-Extra: dev
Requires-Dist: pytest-asyncio>=0.24; extra == 'dev'
Requires-Dist: pytest>=8; extra == 'dev'
Requires-Dist: ruff>=0.7; extra == 'dev'
Provides-Extra: live
Requires-Dist: tesla-fleet-api>=1.4; extra == 'live'
Description-Content-Type: text/markdown

# hey-nabu-climate-concierge

[![PyPI](https://img.shields.io/pypi/v/hey-nabu-climate-concierge.svg)](https://pypi.org/project/hey-nabu-climate-concierge/)
[![CI](https://github.com/Raymondriter/hey-nabu-climate-concierge/actions/workflows/ci.yml/badge.svg)](https://github.com/Raymondriter/hey-nabu-climate-concierge/actions/workflows/ci.yml)

A DIY in-car voice concierge that **closes Tesla Grok's biggest gap** — climate, media, window, and seat control. Single utterance, multi-target. Built on Home Assistant Voice + microWakeWord + an MCP server wrapping `tesla-fleet-api`.

## Install

```bash
pip install hey-nabu-climate-concierge              # MCP server only
pip install 'hey-nabu-climate-concierge[live]'      # +tesla-fleet-api for real commands
pip install 'hey-nabu-climate-concierge[dev]'       # +pytest, ruff for contributing
```

Or with `uv`:

```bash
uv add hey-nabu-climate-concierge
uv add 'hey-nabu-climate-concierge[live]'
```

The PyPI wheel ships **only the MCP server** (`hey_nabu_mcp`). The browser PWA in `pwa/` is a separate static-file artifact you host yourself and load in the Tesla browser — see [`pwa/README.md`](pwa/README.md). The Home Assistant YAML in `ha-config/` and the system prompt in `prompts/system.md` are copy/paste artifacts, not part of the wheel.

After install:

```bash
hey-nabu --help              # CLI surface
hey-nabu health              # smoke-test imports + print version
hey-nabu tools               # list registered MCP tools
```


> "Hey Nabu, make it warm and play the focus playlist."
>
> → microWakeWord on phone fires → HA Assist pipeline → Ollama (Qwen3-32B) sees the intent → calls 3 MCP tools concurrently (`set_climate`, `play_spotify`, optional `vent_windows`) → MCP server dispatches signed Tesla commands via the HTTP Proxy → **waits for Fleet API confirmation events before claiming success**.

The full architecture spec is in [`../ideas/v2/wave3/hey_nabu_climate_concierge.md`](../ideas/v2/wave3/hey_nabu_climate_concierge.md). This repo is the buildable scaffold of that spec.

## Why this exists (vs Grok)

Tesla Grok in firmware 2026.14 cannot:

- Adjust climate, media, windows, frunk, or seats
- Operate offline (requires Premium Connectivity)
- Control your smart home
- See your calendar, email, or RAG over your documents
- Survive a cellular drop without restarting the session
- Run multi-step / agentic loops

Hey Nabu does all of that. The wave-3 spec writes up the competitive matrix in detail.

## Status — v0.4 alpha

What works today (testable without a Tesla):

- **Pure-Python `logic` module** with one function per voice intent (`set_climate`, `play_spotify`, `unlock_doors`, `vent_windows`, `get_climate_status`). Each function takes a `TeslaClient` and a `ConfirmationWaiter` as arguments — fully DI'd, mock-friendly.
- **`MockTeslaClient`** records all calls and returns canned responses.
- **`FakeConfirmationWaiter`** simulates the HA event-bus confirmation pattern with controllable timeout/success behavior.
- **FastMCP `server.py`** wraps the logic functions as MCP tools (`@mcp.tool()`) so Ollama/Claude/any MCP client can call them. Exposes 5 tools.
- **`hey-nabu serve`** Typer CLI starts the MCP server (defaults to SSE transport on `0.0.0.0:8765`).
- **`HAWebSocketConfirmationWaiter`** real implementation of the wait-for-`tesla_fleet_command_confirmed` event listener via the HA WebSocket API.
- **`LiveTeslaClient`** wraps `tesla-fleet-api` (only loaded when you `pip install '...[live]'`).
- **Full HA YAML configs** in `ha-config/` ready to paste into your HA install — `configuration.yaml`, `custom_sentences/en/climate.yaml`, `intent_script.yaml`, `scripts.yaml`, `automations.yaml`.
- **System prompt** at `prompts/system.md` includes the **hallucination-defense clause** ("If a tool returns `pending=true`, say 'queued' not 'done'") that turns the confirmation pattern into a structural property.
- **Demo script** at `demo/script.md` with three timed utterances for filming.
- **Sharp-edges doc** at `docs/sharp-edges.md` with the 10 known footguns from the wave-3 spec.
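The DI pattern above can be sketched in a few lines. This is a minimal illustration, not the real `logic` module: method names like `send`/`wait` and the return shapes are assumptions here, but the shape of the test seam is the point.

```python
from dataclasses import dataclass, field

@dataclass
class MockTeslaClient:
    """Records every dispatched command instead of talking to a car."""
    calls: list = field(default_factory=list)

    def send(self, command: str, **params) -> dict:
        self.calls.append((command, params))
        return {"queued": True}

@dataclass
class FakeConfirmationWaiter:
    """Simulates the HA event-bus confirmation with a canned outcome."""
    will_confirm: bool = True

    def wait(self, command: str, timeout: float = 10.0) -> bool:
        return self.will_confirm

def set_climate(client, waiter, temp_f: float) -> dict:
    """One voice intent: dispatch, then wait for Fleet API confirmation."""
    client.send("set_temps", driver_temp=temp_f)
    confirmed = waiter.wait("set_temps")
    # Until the confirmation event fires, the result is pending, not done.
    return {"ok": confirmed, "pending": not confirmed}

client, waiter = MockTeslaClient(), FakeConfirmationWaiter()
result = set_climate(client, waiter, 72)
```

Because both collaborators arrive as plain arguments, the same intent body runs against `LiveTeslaClient` + `HAWebSocketConfirmationWaiter` in production and against the fakes in tests, with no patching.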

**Browser side** (`pwa/`, 44/44 Node tests pass — added in v0.2, expanded in v0.4):

- `AgentRouter` state machine (cloud → local → error) with injectable transports — pure logic, tested in Node with fakes + a manual clock
- `HAWebSocketTransport` wraps the HA `conversation/process` command via long-lived access token
- `WebLlmTransport` lazy-loads `@mlc-ai/web-llm` (Phi-3.5-mini-instruct, ~2.2 GB to IndexedDB) on first cellular-drop fallback
- PWA shell: `index.html` + `styles.css` + `manifest.webmanifest` + `service-worker.js` + `app.js`. Tesla-touchscreen-friendly dark UI with mode pill + status cards + conversation log + settings dialog
- Service worker caches the shell offline-first; never caches cross-origin requests
- Reconnect loop: when in local mode, pings HA every 10 s and swaps back to cloud automatically
- **v0.4: Conversation history persistence.** Every user + assistant turn (including
  scripted demo-mode turns, flagged with `"demo": true`) is appended to a local
  IndexedDB store (`hey-nabu-history`, schema v1, single object store `turns` keyed
  by autoincrement id, indexed on `timestamp`). The header gains a **History** button
  that opens a dialog with two tabs:
  - **View** — the 200 most-recent turns, newest-first.
  - **Export** — *Download as JSONL* (mime `application/x-ndjson`, filename
    `hey-nabu-history-YYYY-MM-DD.jsonl`) and *Clear history* (with confirm).

  One JSONL line looks like:
  ```json
  {"role":"user","text":"set the cabin to 72","timestamp":1763510400000,"id":1,"demo":true}
  ```

  All store functions in `pwa/history-store.js` accept an injectable IDB factory so
  the unit tests run against a tiny in-memory shim — no `fake-indexeddb` dependency.

- See [`pwa/README.md`](pwa/README.md) for dev workflow and Tesla-deployment instructions
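The exported JSONL is easy to post-process. A sketch in Python — field names match the sample line above; the assistant turn in `sample` is made up for illustration:

```python
import json

def load_history(lines):
    """Parse exported hey-nabu history lines (one JSON object per line,
    schema v1) and return the turns sorted oldest-first."""
    turns = [json.loads(line) for line in lines if line.strip()]
    turns.sort(key=lambda t: t["timestamp"])
    return turns

# Two turns as the export might contain them, newest-first:
sample = [
    '{"role":"assistant","text":"Cabin set to 72.","timestamp":1763510405000,"id":2,"demo":true}',
    '{"role":"user","text":"set the cabin to 72","timestamp":1763510400000,"id":1,"demo":true}',
]
turns = load_history(sample)
```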

What's not yet in v0.4 (deferred):

- The Hetzner $5/mo Docker Compose HA mirror (an operational pattern documented in `docs/sharp-edges.md`, not yet shipped as code)
- The cabin-radar passenger inference (HA already exposes the data; it just needs a separate intent)
- Real HA event-bus state subscription for the PWA status cards (currently best-effort polled)

## Quick start (without a Tesla, just the dev loop)

```bash
git clone https://github.com/Raymondriter/hey-nabu-climate-concierge.git
cd hey-nabu-climate-concierge
uv sync --extra dev
uv run pytest                           # ~15 tests, no creds, no Tesla
uv run hey-nabu --help                  # CLI surface
```

## Quick start (full deployment)

1. **Install HA Voice + Ollama** per [`docs/architecture.md`](docs/architecture.md). HA OS or HA Container; `ollama pull qwen3:32b-q5_k_m` on a home box.
2. **Wire the YAML** — paste the snippets from `ha-config/` into your HA `/config/`. Restart HA.
3. **Set up Android Companion app** with microWakeWord and "Hey Nabu" wake phrase per `ha-config/companion_app_setup.md`.
4. **Set the env vars** in `.env` (Tesla developer client_id/secret, your VIN, Tesla refresh token, Spotify client_id/secret, HA long-lived token).
5. **Start the MCP server**:
   ```bash
   uv sync --extra live --extra dev
   uv run hey-nabu serve --host 0.0.0.0 --port 8765
   ```
6. **Wire HA's MCP client** to `http://localhost:8765/sse`. Settings → Devices & services → MCP.
7. **Test**: in your car (or anywhere with the Companion app paired to its Bluetooth speaker), say "Hey Nabu, make it warm and play the focus playlist."
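Step 4's `.env` might look like the fragment below. The variable names here are illustrative placeholders only — use whatever keys the server's configuration actually reads:

```bash
# Illustrative .env — placeholder names, not the exact keys the code reads.
TESLA_CLIENT_ID=...        # from your Tesla developer application
TESLA_CLIENT_SECRET=...
TESLA_VIN=...
TESLA_REFRESH_TOKEN=...
SPOTIFY_CLIENT_ID=...
SPOTIFY_CLIENT_SECRET=...
HA_TOKEN=...               # HA long-lived access token
```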

## Architecture

```
┌─────────────────────────┐         ┌──────────────────────────┐
│  Tesla browser PWA      │         │  Home Assistant          │
│  (visual surface only)  │         │   ├ Wyoming + Moonshine  │
└──────────┬──────────────┘         │   ├ Wyoming + Kokoro     │
           │                        │   ├ Ollama (Qwen3-32B)   │
           │ status / cards         │   ├ MCP client → us      │
           ▼                        │   ├ tesla_fleet integ.   │
       Touchscreen                  │   └ event bus            │
                                    └────────┬─────────────────┘
                                             │
   ┌────────────────────────┐                │
   │ Phone (Android         │                │
   │ Companion app)         │                │
   │  ├ microWakeWord       │  Wyoming TCP   │
   │  ├ Bluetooth A2DP →    │ ──────────────▶│
   │  │   car speakers      │                │
   │  └ mic capture         │                │
   └────────────────────────┘                │
                                             │
                                  MCP/SSE    ▼
                               ┌─────────────────────────┐
                               │  hey-nabu-mcp           │
                               │  (this repo)            │
                               │   ├ FastMCP @mcp.tool() │
                               │   ├ logic.* funcs (DI)  │
                               │   ├ ConfirmationWaiter  │ ← waits for
                               │   └ TeslaClient         │   tesla_fleet_
                               └─────────┬───────────────┘   command_
                                         │                   confirmed
                                         │ signed cmd via HTTP Proxy
                                         ▼
                               ┌─────────────────────────┐
                               │  Tesla HTTP Proxy v0.4.1│
                               └─────────┬───────────────┘
                                         │ BLE / Fleet API
                                         ▼
                                       Tesla
```

## Screenshots

The PWA rendered in a 1280x900 viewport (captured headlessly against the bundled `python -m http.server` dev server). The Conversation history dialog is visible in all three because headless Chrome promotes a closed `<dialog>` styled with `display: flex` to the top layer; in a real Tesla browser session it stays hidden until you tap the **History** button in the header.

![PWA idle state — header, mode pill, vehicle status cards, empty conversation log, settings prompt](docs/screenshots/01-pwa-idle.png)

![Demo mode at ~4 s — the scripted exchange has fired the climate set_temperature turn](docs/screenshots/02-pwa-demo.png)

![Demo mode at ~12 s — climate, vent_windows, and the play-Spotify turn have all played through](docs/screenshots/03-pwa-history.png)

## Changelog

See [CHANGELOG.md](CHANGELOG.md). Versions follow [Keep a Changelog](https://keepachangelog.com/) and the project uses [SemVer](https://semver.org/). For the release runbook, see [RELEASE.md](RELEASE.md).

## License

MIT. See [LICENSE](LICENSE).

## CI

[![ci](https://github.com/Raymondriter/hey-nabu-climate-concierge/actions/workflows/ci.yml/badge.svg)](https://github.com/Raymondriter/hey-nabu-climate-concierge/actions/workflows/ci.yml)

GitHub Actions runs two parallel jobs on every push and PR:

- `python` &mdash; ruff + pytest on Python 3.12 and 3.13 for the MCP server.
- `pwa` &mdash; `node --check` + the Node 24 built-in test runner against `pwa/tests/**/*.test.js`.

See [`.github/workflows/ci.yml`](.github/workflows/ci.yml).
