Metadata-Version: 2.4
Name: llm-api-bridge
Version: 1.0.0
Summary: A unified interface for querying multiple LLM providers via REST APIs
Home-page: https://github.com/Hunzala-Rasheed1/llmconnect
Author: Hunzala Rasheed
Author-email: hunzalarasheed14@gmail.com
Project-URL: Bug Reports, https://github.com/Hunzala-Rasheed1/llmconnect/issues
Project-URL: Source, https://github.com/Hunzala-Rasheed1/llmconnect
Project-URL: Documentation, https://github.com/Hunzala-Rasheed1/llmconnect#readme
Keywords: llm,ai,openai,claude,gemini,api,chatbot,nlp
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.7
Description-Content-Type: text/markdown
Requires-Dist: requests>=2.25.0
Provides-Extra: dev
Requires-Dist: pytest>=6.0; extra == "dev"
Requires-Dist: black>=21.0; extra == "dev"
Requires-Dist: flake8>=3.8; extra == "dev"
Requires-Dist: python-dotenv>=0.19.0; extra == "dev"
Dynamic: author
Dynamic: author-email
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: keywords
Dynamic: project-url
Dynamic: provides-extra
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

# LLMConnect

A unified Python library for querying multiple Large Language Models (LLMs) using direct REST API calls.

## Features

- **Multi-Provider Support**: Connect to OpenAI, Claude, Gemini, Perplexity, Mistral, DeepSeek, and LLaMA
- **Auto-Detection**: Automatically detects the provider based on model name prefix
- **Simple Interface**: Just provide a model name and an API key
- **Customizable**: Optional `temperature` and `max_tokens` parameters
- **No SDKs Required**: Uses direct REST API calls via the `requests` library
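
The prefix-based auto-detection can be sketched roughly as follows. Note that this mapping is an illustrative assumption inferred from the model names listed in this README, not the library's actual internal table:

```python
# Illustrative prefix-to-provider mapping (an assumption, not the
# library's real implementation).
PREFIX_TO_PROVIDER = {
    "gpt": "openai",
    "claude": "claude",
    "gemini": "gemini",
    "sonar": "perplexity",
    "mistral": "mistral",
    "deepseek": "deepseek",
    "llama": "llama",
}

def detect_provider(model_name: str) -> str:
    """Return the provider whose prefix matches the given model name."""
    name = model_name.lower()
    for prefix, provider in PREFIX_TO_PROVIDER.items():
        if name.startswith(prefix):
            return provider
    raise ValueError(f"Could not detect a provider for model: {model_name}")
```

For example, `detect_provider("claude-3-sonnet")` would resolve to the Claude provider, so you never have to name the provider explicitly.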

## Installation

```bash
pip install llm-api-bridge
```

## Quick Start

```python
from llmconnect import LLMConnect

# Initialize with your preferred model
client = LLMConnect("gpt-4", "your-openai-api-key")

# Send a message
response = client.chat("Hello, how are you?")
print(response)
```
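
Under the hood, "no SDKs required" means each call is a plain HTTP request. As a rough illustration, here is what a one-off call to OpenAI's public chat completions endpoint looks like; this sketch uses only the standard library so it runs without dependencies (the library itself uses `requests`), and it is not the package's actual code:

```python
import json
import urllib.request

def build_chat_payload(model, message, temperature=0.7, max_tokens=1024):
    """Build the JSON body for an OpenAI-style chat completion request.

    The default temperature/max_tokens values here are illustrative
    assumptions, not the library's documented defaults.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": message}],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

def openai_chat(api_key, model, message, **params):
    """Send one chat message to OpenAI's REST endpoint and return the reply."""
    req = urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(build_chat_payload(model, message, **params)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Each provider exposes a different endpoint and request shape; abstracting over those differences behind a single `chat()` call is the point of the library.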

## Supported Models

- **OpenAI**: gpt-4, gpt-3.5-turbo, etc.
- **Claude**: claude-3-sonnet, claude-3-opus, etc.
- **Gemini**: gemini-pro, gemini-ultra, etc.
- **Perplexity**: sonar-pro, sonar-medium, etc.
- **Mistral**: mistral-large-latest, mistral-medium, etc.
- **DeepSeek**: deepseek-chat, deepseek-coder, etc.
- **LLaMA**: llama3-70b, llama2-13b, etc.

## Advanced Usage

```python
# Custom parameters
client = LLMConnect(
    model_name="claude-3-sonnet",
    api_key="your-claude-api-key",
    temperature=0.9,
    max_tokens=2048
)

response = client.chat("Write a creative story")
```
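
Rather than hard-coding keys in source, you can read them from environment variables before constructing the client. A minimal helper (the `PROVIDER_API_KEY` naming convention here is an assumption for illustration, not something the library requires):

```python
import os

def get_api_key(provider: str) -> str:
    """Read an API key such as OPENAI_API_KEY from the environment."""
    var = f"{provider.upper()}_API_KEY"
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"Set the {var} environment variable first")
    return key
```

You could then write `LLMConnect("gpt-4", get_api_key("openai"))`. The `dev` extra includes `python-dotenv`, which can load such variables from a local `.env` file during development.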

## API Keys

You'll need API keys from the respective providers:
- OpenAI: https://platform.openai.com/
- Anthropic (Claude): https://console.anthropic.com/
- Google (Gemini): https://makersuite.google.com/
- Perplexity: https://www.perplexity.ai/
- Mistral: https://console.mistral.ai/
- DeepSeek: https://platform.deepseek.com/
- LLaMA: https://www.llama-api.com/

## License

MIT License
