Metadata-Version: 2.4
Name: cycloneai
Version: 0.2.2
Summary: Gemini AI automation client with persistent memory, async support, and provider routing.
Author: User
License: MIT
Keywords: llm,ai,gemini,selenium,automation,client
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Intended Audience :: Developers
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.10
Description-Content-Type: text/markdown
Requires-Dist: selenium>=4.15
Requires-Dist: webdriver-manager>=4.0
Provides-Extra: dev
Requires-Dist: pytest>=7; extra == "dev"
Requires-Dist: pytest-asyncio>=0.23; extra == "dev"

# CycloneAI

CycloneAI is a small Python package with a deliberately simple public API:

```python
from cycloneai import Client

ai = Client(provider="auto")
print(ai.ask("Hello"))
```

## What is included

- `Client` and `AsyncClient`
- `ask()`, `stream()`, `compare()`, `save()`, `load()`
- conversation memory
- provider routing with `provider="auto"` and fallback
- provider registration for custom backends
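
The `provider="auto"` routing with fallback can be pictured roughly like this. This is a self-contained conceptual sketch, not the library's actual implementation; `ProviderError` and the toy providers are illustrative names:

```python
# Conceptual sketch of "auto" routing with fallback (illustrative only).
class ProviderError(Exception):
    pass


class BrokenProvider:
    name = "broken"

    def ask(self, prompt):
        raise ProviderError("unavailable")


class EchoProvider:
    name = "echo"

    def ask(self, prompt):
        return f"echo: {prompt}"


def ask_auto(providers, prompt):
    # Try each registered provider in order; fall back to the next on failure.
    errors = []
    for p in providers:
        try:
            return p.ask(prompt)
        except ProviderError as exc:
            errors.append((p.name, exc))
    raise ProviderError(f"all providers failed: {errors}")


print(ask_auto([BrokenProvider(), EchoProvider()], "Hello"))
```

The first provider raises, so the call falls through to the second and returns `echo: Hello`.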

## What is not included

This package **does not automate guest access to third-party hosted LLM websites**. The built-in `gemini`, `gpt`, and `claude` providers are explicit placeholders so the public API and provider architecture are ready without shipping code that attempts to bypass official access controls or service terms.

If you want real responses, register your own compliant provider implementation.

## Usage

### Basic

```python
from cycloneai import Client

ai = Client(provider="auto")
reply = ai.ask("Python study order")
print(reply)
```

### Memory

```python
chat = Client(memory=True)
chat.register_provider(MyProvider())  # MyProvider: see "Custom provider example" below

chat.ask("My name is Hyunho")
chat.ask("What is my name?")
```
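
Mechanically, memory presumably means the client threads earlier turns into each call via the `history` argument shown in the provider interface below. A minimal self-contained sketch of that idea (the `(role, text)` history format is an assumption, not the package's actual storage):

```python
# Sketch: conversation memory as an accumulated history list (illustrative).
class RecallProvider:
    # Toy provider that answers by scanning the accumulated history.
    def ask(self, prompt, *, history=None):
        for role, text in (history or []):
            if role == "user" and "name is" in text:
                return "Your name is " + text.split("name is ")[1]
        return "noted"


class MemoryChat:
    def __init__(self, provider):
        self.provider = provider
        self.history = []  # list of (role, text) pairs

    def ask(self, prompt):
        reply = self.provider.ask(prompt, history=self.history)
        self.history.append(("user", prompt))
        self.history.append(("assistant", reply))
        return reply


chat = MemoryChat(RecallProvider())
chat.ask("My name is Hyunho")
print(chat.ask("What is my name?"))  # Your name is Hyunho
```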

### System prompt

```python
ai = Client(system="You are the best coding mentor.")
ai.register_provider(MyProvider())
```

### Streaming

```python
for chunk in ai.stream("Write a short story"):
    print(chunk, end="")
```
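
Behind a streaming interface like this, a provider typically yields text chunks from a generator. A tiny self-contained sketch of the idea (`stream_words` is illustrative, not part of the package):

```python
# Sketch: streaming as a generator of text chunks (illustrative).
def stream_words(text):
    # Yield the reply one word at a time, the way a stream() consumer
    # receives partial chunks instead of one final string.
    for word in text.split():
        yield word + " "


story = "".join(stream_words("Once upon a time"))
print(story.strip())  # Once upon a time
```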

### Async

```python
import asyncio

from cycloneai import AsyncClient


async def main():
    ai = AsyncClient()
    ai.register_provider(MyProvider())
    print(await ai.ask("What is AI?"))


asyncio.run(main())
```
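
Since the built-in providers are placeholders, the async flow is easiest to picture with a self-contained stand-in (`AsyncEchoProvider` is illustrative, not part of the package):

```python
import asyncio


class AsyncEchoProvider:
    # Toy async provider; the sleep stands in for real network I/O.
    async def ask(self, prompt):
        await asyncio.sleep(0)
        return f"echo: {prompt}"


async def main():
    provider = AsyncEchoProvider()
    print(await provider.ask("What is AI?"))  # echo: What is AI?


asyncio.run(main())
```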

### Top-level helper

```python
import cycloneai as cy

provider = MyProvider()
print(cy.ask("Hi", provider=provider))
```

## Custom provider example

```python
from cycloneai import BaseProvider, Client, Response


class MyProvider(BaseProvider):
    name = "demo"

    def ask(self, prompt, *, system=None, history=None, timeout=30):
        return Response(
            text=f"demo reply: {prompt}",
            provider=self.name,
            time=0.01,
            tokens=None,
        )


ai = Client(provider=MyProvider())
print(ai.ask("Hello"))
```
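
`compare()` is listed above but not demonstrated. Assuming it runs one prompt against several providers and collects the replies side by side (the real signature may differ), the idea looks like:

```python
# Sketch: comparing one prompt across providers (illustrative only).
class UpperProvider:
    name = "upper"

    def ask(self, prompt):
        return prompt.upper()


class ReverseProvider:
    name = "reverse"

    def ask(self, prompt):
        return prompt[::-1]


def compare(providers, prompt):
    # Map each provider name to its reply for side-by-side inspection.
    return {p.name: p.ask(prompt) for p in providers}


print(compare([UpperProvider(), ReverseProvider()], "abc"))
# {'upper': 'ABC', 'reverse': 'cba'}
```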

