Metadata-Version: 2.4
Name: finchvox
Version: 0.0.2
Summary: Voice AI observability dev tool for Pipecat
License: Finchvox Source Available License
        
        Copyright (c) 2026 Finchvox
        
        Permission is granted, free of charge, to any person obtaining a copy of this
        software and associated documentation files (the "Software"), to use, copy,
        modify, and distribute the Software for internal or private use.
        
        You may:
        - Run the Software locally or within your organization
        - Modify the Software for your own use
        - Deploy the Software within your own infrastructure
        - Deploy the Software on behalf of a customer, provided the customer operates
          the Software themselves
        
        You may NOT:
        - Offer the Software as a hosted service, managed service, or SaaS
        - Provide access to the Software to third parties as a service
        - Resell, rebrand, or commercially exploit the Software as a service
        
        This license does not grant you the right to use the Finchvox name, logo, or
        trademarks.
        
        THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND.
License-File: LICENSE
Requires-Python: >=3.10
Requires-Dist: aiofiles>=24.1.0
Requires-Dist: aiohttp>=3.9.0
Requires-Dist: aiortc>=1.14.0
Requires-Dist: fastapi>=0.115.0
Requires-Dist: grpcio>=1.60.0
Requires-Dist: loguru>=0.7.0
Requires-Dist: opentelemetry-exporter-otlp-proto-grpc>=1.27.0
Requires-Dist: opentelemetry-proto>=1.27.0
Requires-Dist: pipecat-ai[cartesia,daily,deepgram,silero]>=0.0.98
Requires-Dist: protobuf<6.0.0,>=4.25.0
Requires-Dist: python-multipart>=0.0.9
Requires-Dist: uvicorn[standard]>=0.32.0
Provides-Extra: dev
Requires-Dist: pytest-cov>=4.0.0; extra == 'dev'
Requires-Dist: pytest>=7.0.0; extra == 'dev'
Requires-Dist: ruff>=0.14.10; extra == 'dev'
Requires-Dist: twine>=6.2.0; extra == 'dev'
Description-Content-Type: text/markdown

# <img src="ui/images/finchvox-logo.png" height=24 /> Finchvox - elevated debuggability for Voice AI apps

Do your eyes bleed like a Vecna victim watching Pipecat logs fly by? Do OpenTelemetry traces look impressive … yet explain nothing? If so, meet Finchvox, a local debuggability tool purpose-built for Voice AI apps. 

Finchvox unifies conversation audio and traces in a single UI, highlighting voice-specific problems like interruptions and high user ↔ bot latency. Good luck convincing Datadog to add that!

_👇 Click the image for a short video:_
<a href="https://raw.githubusercontent.com/itsderek23/finchvox/refs/heads/main/docs/demo.gif" target="_blank"><img src="./docs/screenshot.png" /></a>

## Table of Contents

- [Prerequisites](#prerequisites)
- [Installation](#installation)
- [Setup](#setup)
- [Usage](#usage---finchvox-server)
- [Troubleshooting](#troubleshooting)

## Prerequisites

- Python 3.10 or higher
- A Pipecat Voice AI application

## Installation

```bash
# uv
uv add finchvox "pipecat-ai[tracing]"

# Or with pip
pip install finchvox "pipecat-ai[tracing]"
```

## Setup

1. Add the following to the top of your bot (e.g., `bot.py`):

```python
import finchvox
from finchvox import FinchvoxProcessor

finchvox.init(service_name="my-voice-app")
```

2. Add `FinchvoxProcessor` to your pipeline, ensuring it comes after `transport.output()`:

```python
pipeline = Pipeline([
    # STT, LLM, TTS, etc. processors
    transport.output(),
    FinchvoxProcessor(), # Must come after transport.output()
    context_aggregator.assistant(),
])
```

3. Initialize your `PipelineTask` with metrics, tracing, and turn tracking enabled:

```python
task = PipelineTask(
    pipeline,
    params=PipelineParams(enable_metrics=True),
    enable_tracing=True,
    enable_turn_tracking=True,
)
```

## Usage - Finchvox server

```bash
uv run finchvox start
```

For the list of available options, run:

```bash
uv run finchvox --help
```

## Troubleshooting

### Port already in use

If port 4317 (the OTLP collector port) is already occupied:

```bash
# Find process using port
lsof -i :4317

# Kill the process
kill -9 <PID>
```
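If `lsof` isn't available (e.g., on Windows), a quick cross-platform check can be done with Python's standard library. This is just an illustrative helper, not part of Finchvox; `4317` is the collector port mentioned above:

```python
import socket


def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # connect_ex returns 0 on a successful connection, i.e. the port is taken
        return s.connect_ex((host, port)) == 0


if __name__ == "__main__":
    print(f"4317 in use: {port_in_use(4317)}")
```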

### No spans being written

1. Check that the collector is running: look for the "OTLP collector listening on port 4317" log message
2. Verify the client endpoint: ensure Pipecat is configured to send traces to `http://localhost:4317`
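One way to pin the endpoint is via the standard OpenTelemetry environment variables, which the OTLP gRPC exporter reads at startup. These variable names come from the OpenTelemetry specification, not Finchvox, and the exact behavior depends on how your app configures its exporter:

```shell
# Point the OTLP exporter at the local Finchvox collector
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
export OTEL_EXPORTER_OTLP_PROTOCOL=grpc
```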
