Metadata-Version: 2.4
Name: ai-api-failover
Version: 1.0.0
Summary: Automatic failover from FAL.ai / Replicate to NexaAPI on 429/500 errors. Same models, 1/5 the price.
Home-page: https://nexa-api.com
Author: NexaAPI Team
Author-email: frequency404@villaastro.com
License: MIT
Project-URL: Homepage, https://nexa-api.com
Project-URL: API Keys, https://rapidapi.com/nexaquency
Project-URL: Bug Reports, https://nexa-api.com/contact
Project-URL: Source, https://github.com/nexaapi/ai-api-failover
Keywords: ai,api,failover,fal,replicate,nexaapi,flux,stable-diffusion,image-generation,video-generation,rate-limit,429,retry
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Internet :: WWW/HTTP
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Operating System :: OS Independent
Requires-Python: >=3.8
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: requests>=2.28.0
Provides-Extra: httpx
Requires-Dist: httpx>=0.24.0; extra == "httpx"
Provides-Extra: dev
Requires-Dist: pytest>=7.0; extra == "dev"
Requires-Dist: pytest-asyncio>=0.21; extra == "dev"
Requires-Dist: httpx>=0.24.0; extra == "dev"
Requires-Dist: responses>=0.23; extra == "dev"
Dynamic: author
Dynamic: author-email
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: keywords
Dynamic: license
Dynamic: license-file
Dynamic: project-url
Dynamic: provides-extra
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

# ai-api-failover

> **Automatic failover from FAL.ai / Replicate to NexaAPI on 429/500 errors.**
> Same AI models. 1/5 the price. Zero code changes.

[![PyPI version](https://img.shields.io/pypi/v/ai-api-failover.svg)](https://pypi.org/project/ai-api-failover/)
[![Python 3.8+](https://img.shields.io/badge/python-3.8+-blue.svg)](https://www.python.org/downloads/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

---

## Why?

FAL.ai and Replicate are great — until you hit a **429 rate limit** or a **500 server error** at 3am. Your pipeline stalls, your users wait, and you lose money.

**ai-api-failover** intercepts those errors and automatically retries your request against **[NexaAPI](https://nexa-api.com)** — the same models (Flux Pro, SDXL, Kling, Whisper, and more) at **1/5 the official price**.

```
FAL returns 429 → ai-api-failover → NexaAPI ✅
Replicate 500   → ai-api-failover → NexaAPI ✅
```

No code changes to your existing FAL/Replicate calls. Just install and configure.

---

## Installation

```bash
pip install ai-api-failover
```

For `httpx` support:

```bash
pip install "ai-api-failover[httpx]"
```

---

## Quick Start

### With `requests` (FAL.ai)

```python
import requests
from ai_api_failover import install_requests_failover

# One-time setup — patch all requests globally
install_requests_failover(nexa_api_key="your-nexa-api-key")

# Your existing FAL code — unchanged!
response = requests.post(
    "https://fal.run/fal-ai/flux-pro",
    headers={"Authorization": "Key your-fal-key"},
    json={
        "prompt": "A futuristic cityscape at sunset",
        "image_size": "landscape_16_9",
        "num_inference_steps": 28,
    },
)
# If FAL returns 429 or a 5xx error, the request is retried against NexaAPI
print(response.json())
```

### With `requests` (Replicate)

```python
import requests
from ai_api_failover import install_requests_failover

install_requests_failover(nexa_api_key="your-nexa-api-key")

response = requests.post(
    "https://api.replicate.com/v1/models/black-forest-labs/flux-pro/predictions",
    headers={
        "Authorization": "Bearer your-replicate-token",
        "Prefer": "wait",
    },
    json={
        "input": {
            "prompt": "A dragon over a medieval castle",
            "width": 1024,
            "height": 1024,
        }
    },
)
# Failover to NexaAPI on 429 or 5xx errors
print(response.json())
```

### With `httpx`

```python
import httpx
from ai_api_failover import wrap_httpx_client

client = httpx.Client()
wrap_httpx_client(client, nexa_api_key="your-nexa-api-key")

response = client.post(
    "https://fal.run/fal-ai/flux/schnell",
    headers={"Authorization": "Key your-fal-key"},
    json={
        "prompt": "A cute robot playing chess",
        "image_size": "square_hd",
    },
)
```

### Async `httpx`

```python
import asyncio
import httpx
from ai_api_failover import wrap_httpx_client

async def generate():
    async with httpx.AsyncClient() as client:
        wrap_httpx_client(client, nexa_api_key="your-nexa-api-key")
        response = await client.post(
            "https://fal.run/fal-ai/flux-pro",
            headers={"Authorization": "Key your-fal-key"},
            json={"prompt": "A neon city at night"},
        )
        return response.json()

asyncio.run(generate())
```

---

## Configuration

### Global configure (recommended)

```python
from ai_api_failover import configure, install_requests_failover

configure(
    nexa_api_key="your-nexa-api-key",
    max_retries=3,           # retry attempts on NexaAPI (default: 2)
    retry_delay=1.5,         # seconds between retries (default: 1.0)
    on_failover=lambda url, code, nexa_url: print(f"Failover ({code}): {url} → {nexa_url}"),
)

# Now install without repeating the key
install_requests_failover()
```

### Per-session patch

```python
import requests
from ai_api_failover import install_requests_failover

session = requests.Session()
install_requests_failover(nexa_api_key="your-key", session=session)
# Only this session gets failover
```

---

## Supported Models

### FAL.ai → NexaAPI

| FAL Model | NexaAPI Model |
|-----------|---------------|
| `fal-ai/flux-pro` | `flux-2-pro` |
| `fal-ai/flux-pro/v1.1` | `flux-pro-1.1` |
| `fal-ai/flux-pro/v1.1-ultra` | `flux-pro-1.1-ultra` |
| `fal-ai/flux/schnell` | `flux-schnell` |
| `fal-ai/flux/dev` | `flux-dev` |
| `fal-ai/stable-diffusion-v3-medium` | `stable-diffusion-3` |
| `fal-ai/stable-diffusion-3.5-large` | `stable-diffusion-3.5-large` |
| `fal-ai/stable-diffusion-xl` | `sdxl-turbo` |
| `fal-ai/playground-v25` | `playground-v2.5` |
| `fal-ai/dreamshaper-xl-turbo` | `dreamshaper-xl` |
| `fal-ai/ideogram/v2` | `ideogram-v2` |
| `fal-ai/recraft-v3` | `recraft-v3` |
| `fal-ai/kling-video/v2/master/text-to-video` | `kling-v2-master` |
| `fal-ai/minimax/video-01` | `minimax-video-01` |
| `fal-ai/whisper` | `whisper-large-v3` |

### Replicate → NexaAPI

| Replicate Model | NexaAPI Model |
|-----------------|---------------|
| `black-forest-labs/flux-pro` | `flux-2-pro` |
| `black-forest-labs/flux-1.1-pro` | `flux-pro-1.1` |
| `black-forest-labs/flux-schnell` | `flux-schnell` |
| `stability-ai/stable-diffusion-3` | `stable-diffusion-3` |
| `stability-ai/sdxl` | `sdxl-turbo` |
| `ideogram-ai/ideogram-v2` | `ideogram-v2` |
| `recraft-ai/recraft-v3` | `recraft-v3` |
| `openai/whisper` | `whisper-large-v3` |

> Full model list: [nexa-api.com](https://nexa-api.com)

---

## Failover Triggers

The following HTTP status codes trigger a failover to NexaAPI:

| Status | Meaning |
|--------|---------|
| `429` | Rate limit exceeded |
| `500` | Internal server error |
| `502` | Bad gateway |
| `503` | Service unavailable |
| `504` | Gateway timeout |
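
Conceptually, the trigger is a membership check against this set of status codes. A minimal sketch (the function name is illustrative; the library's internals may differ):

```python
# Hypothetical sketch of the failover trigger check, based on the
# status-code table above. Not the library's actual implementation.
FAILOVER_STATUS_CODES = {429, 500, 502, 503, 504}

def should_failover(status_code: int) -> bool:
    """Return True if the response status should trigger a NexaAPI retry."""
    return status_code in FAILOVER_STATUS_CODES
```

Note that other client errors such as `401` or `422` do not trigger a failover, since they would fail identically against the fallback.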

---

## Parameter Translation

Parameters are automatically translated between FAL/Replicate and NexaAPI formats:

| FAL Parameter | NexaAPI Parameter |
|---------------|-------------------|
| `image_size: "square_hd"` | `width: 1024, height: 1024` |
| `image_size: "landscape_16_9"` | `width: 1024, height: 576` |
| `image_size: {"width": 1280, "height": 720}` | `width: 1280, height: 720` |
| `num_inference_steps` | `steps` |
| `guidance_scale` | `guidance_scale` |
| `num_images` | `n` |

Replicate's `input` wrapper is automatically unwrapped.
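
The FAL-side translation can be sketched from the table above. A minimal, hypothetical version (the helper name and exact behavior are illustrative, not the library's real translator):

```python
# Hypothetical sketch of FAL → NexaAPI parameter translation, following
# the mapping table above. The library's actual translator may differ.
FAL_IMAGE_SIZES = {
    "square_hd": (1024, 1024),
    "landscape_16_9": (1024, 576),
}

def translate_fal_params(fal: dict) -> dict:
    nexa = {}
    size = fal.get("image_size")
    if isinstance(size, str) and size in FAL_IMAGE_SIZES:
        # Named preset → explicit pixel dimensions
        nexa["width"], nexa["height"] = FAL_IMAGE_SIZES[size]
    elif isinstance(size, dict):
        # Explicit {"width": ..., "height": ...} passes through
        nexa["width"], nexa["height"] = size["width"], size["height"]
    if "num_inference_steps" in fal:
        nexa["steps"] = fal["num_inference_steps"]
    if "guidance_scale" in fal:
        nexa["guidance_scale"] = fal["guidance_scale"]
    if "num_images" in fal:
        nexa["n"] = fal["num_images"]
    if "prompt" in fal:
        nexa["prompt"] = fal["prompt"]
    return nexa
```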

---

## Advanced Usage

### Custom failover callback

```python
from ai_api_failover import configure

def my_callback(original_url: str, status_code: int, nexa_url: str):
    # Log to your monitoring system (`metrics` here is a placeholder client)
    source = "fal" if "fal.run" in original_url else "replicate"
    metrics.increment("api.failover", tags={"source": source, "code": status_code})
    print(f"Failover: {original_url} → {nexa_url} (saved ~80% on this call)")

configure(nexa_api_key="your-key", on_failover=my_callback)
```

### Check model support

```python
from ai_api_failover.mapping import resolve_fal_model

mapping = resolve_fal_model("https://fal.run/fal-ai/flux-pro")
if mapping:
    nexa_model, endpoint = mapping
    print(f"Supported! NexaAPI model: {nexa_model}")
else:
    print("No NexaAPI mapping — failover will not trigger for this model")
```

### Direct engine usage

```python
from ai_api_failover import FailoverEngine
import requests

engine = FailoverEngine(nexa_api_key="your-key", max_retries=3)

# Build a NexaAPI request manually
result = engine.build_nexa_request(
    "https://fal.run/fal-ai/flux-pro",
    {"prompt": "A sunset", "image_size": "square_hd"},
)
if result:
    nexa_url, nexa_headers, nexa_body = result
    response = requests.post(nexa_url, json=nexa_body, headers=nexa_headers)
```

---

## Get Your NexaAPI Key

1. Visit [https://rapidapi.com/nexaquency](https://rapidapi.com/nexaquency)
2. Subscribe to any NexaAPI plan
3. Copy your API key

**Pricing:** one fifth of the official model prices. Prepaid; no subscription required.

**Models available:** Gemini, Claude, Flux Pro, Stable Diffusion 3.5, Kling Video, Veo 3.1, Whisper, and 30+ more.

📧 **Contact:** frequency404@villaastro.com
🌐 **Website:** [https://nexa-api.com](https://nexa-api.com)

---

## License

MIT License — see [LICENSE](LICENSE) for details.

---

*NexaAPI is the cheapest AI API provider. Same models, 1/5 the price, prepaid, no subscription.*
*Contact: frequency404@villaastro.com | https://nexa-api.com*
