Metadata-Version: 2.2
Name: indoxrouter
Version: 0.1.0
Summary: Client library for IndoxRouter - A unified API for multiple LLM providers
Home-page: https://github.com/yourusername/indoxRouter
Author: Ashkan Eskandari
Author-email: ashkan.eskandari.dev@gmail.com
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Requires-Python: >=3.8
Description-Content-Type: text/markdown
Requires-Dist: requests>=2.25.0
Requires-Dist: pyjwt>=2.0.0
Dynamic: author
Dynamic: author-email
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

# IndoxRouter

A unified API for multiple LLM providers, letting you switch between providers and models without changing your client code.

## Features

- Support for multiple providers (OpenAI, Anthropic, Mistral, Cohere, Google, Meta, AI21, Llama, NVIDIA, Deepseek, Databricks)
- Unified API for all providers
- User authentication and API key management
- Request logging and monitoring
- Docker support for easy deployment
- Ready for Railway deployment

## Quick Start

### Local Installation

1. Clone the repository:

```bash
git clone https://github.com/yourusername/indoxRouter.git
cd indoxRouter
```

2. Create a virtual environment and install dependencies:

```bash
python -m venv venv
# On Windows
venv\Scripts\activate
# On Unix or macOS
source venv/bin/activate
pip install -r requirements.txt
```

3. Initialize the database:

```bash
python -m indoxRouter.init_db
```

4. Run the application:

```bash
python run.py
```

5. Access the application:
   - API: http://localhost:8000
   - API Documentation: http://localhost:8000/docs

### Docker Installation

1. Clone the repository and configure:

```bash
git clone https://github.com/yourusername/indoxRouter.git
cd indoxRouter
cp .env.example .env
# Edit .env with your configuration
```

2. Start with Docker Compose:

```bash
docker-compose up -d
```

3. Access the application:
   - API: http://localhost:8000
   - API Documentation: http://localhost:8000/docs
   - pgAdmin: http://localhost:5050

## Railway Deployment

IndoxRouter is ready to be deployed on Railway. Follow these steps:

1. Fork the repository on GitHub.

2. Create a new project on Railway and connect it to your GitHub repository.

3. Add the following environment variables in Railway:

   - `DATABASE_URL`: Your PostgreSQL connection string
   - `JWT_SECRET`: A secure secret key for JWT tokens
   - Provider API keys (`OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, etc.)

4. Railway will automatically detect the Dockerfile and deploy your application.

5. Once deployed, you can access your API at the URL provided by Railway.
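The same variables work for local `.env` files. A minimal sketch covering the variables listed above (all values are placeholders; the exact set of provider keys depends on which providers you enable):

```bash
# PostgreSQL connection string used by the application
DATABASE_URL=postgresql://user:password@localhost:5432/indoxrouter

# Secret used to sign JWT tokens -- generate a long random value
JWT_SECRET=change-me-to-a-long-random-string

# API keys for the providers you plan to route to
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=...
```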

## Detailed Documentation

For detailed deployment instructions, configuration options, and troubleshooting, please refer to the [Deployment Guide](DEPLOYMENT_GUIDE.md).

## API Usage

### Authentication

1. Register a new user through the API.
2. Generate an API key for the user.
3. Use the API key in your requests.
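Every authenticated request carries the API key as a bearer token, as shown in the request example in this README. A small helper for building those headers (the header scheme is taken from the example; the helper name is our own):

```python
def auth_headers(api_key: str) -> dict:
    """Build the headers IndoxRouter expects: bearer auth plus JSON content type."""
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }

# Usage with requests:
#   requests.post(url, json=payload, headers=auth_headers("your_api_key"))
```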

### Making Requests

```python
import requests

api_key = "your_api_key"
url = "http://localhost:8000/v1/completions"

payload = {
    "provider": "openai",
    "model": "gpt-3.5-turbo",
    "prompt": "Hello, world!",
    "temperature": 0.7,
    "max_tokens": 100
}

headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json"
}

response = requests.post(url, json=payload, headers=headers)
print(response.json())
```

## Client Library

IndoxRouter includes a Python client library for easy integration:

```python
from indoxRouter.client import Client

client = Client(api_key="your_api_key")

response = client.completions(
    provider="openai",
    model="gpt-3.5-turbo",
    prompt="Hello, world!",
    temperature=0.7,
    max_tokens=100
)

print(response)
```
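Because the interface is unified, switching providers only means changing the `provider` and `model` arguments. A sketch of building the call keyword arguments per provider (the model names below are illustrative assumptions, not an exhaustive list):

```python
# Per-provider defaults; only provider/model differ between calls.
# Model names are illustrative -- substitute the models you have access to.
PROVIDER_DEFAULTS = {
    "openai": {"provider": "openai", "model": "gpt-3.5-turbo"},
    "anthropic": {"provider": "anthropic", "model": "claude-3-haiku"},
}

def completion_kwargs(provider: str, prompt: str, **overrides) -> dict:
    """Build keyword arguments for client.completions() for a given provider."""
    kwargs = dict(PROVIDER_DEFAULTS[provider])
    kwargs.update(prompt=prompt, temperature=0.7, max_tokens=100)
    kwargs.update(overrides)  # caller-supplied values win
    return kwargs

# Usage:
#   client.completions(**completion_kwargs("anthropic", "Hello, world!"))
```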

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## License

This project is licensed under the MIT License - see the LICENSE file for details.
