Metadata-Version: 2.4
Name: thinkai-framework
Version: 0.1.0
Summary: Enterprise-grade AI framework for seamless LLM integration
Author-email: ThinkAi Team <thinkai@example.com>
License-Expression: MIT
Project-URL: Homepage, https://github.com/thinkai/thinkai
Project-URL: Documentation, https://thinkai.readthedocs.io
Project-URL: Repository, https://github.com/thinkai/thinkai
Project-URL: Issues, https://github.com/thinkai/thinkai/issues
Keywords: ai,llm,fastapi,openai,ollama,rag,agent
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Requires-Python: >=3.9
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: fastapi>=0.100.0
Requires-Dist: pydantic>=2.0
Requires-Dist: pydantic-settings>=2.0
Requires-Dist: httpx>=0.24.0
Requires-Dist: pyyaml>=6.0
Requires-Dist: python-dotenv>=1.0.0
Requires-Dist: tenacity>=8.0.0
Provides-Extra: ollama
Requires-Dist: ollama>=0.1.0; extra == "ollama"
Provides-Extra: openai
Requires-Dist: openai>=1.0.0; extra == "openai"
Provides-Extra: qwen
Requires-Dist: dashscope>=1.0.0; extra == "qwen"
Provides-Extra: deepseek
Requires-Dist: openai>=1.0.0; extra == "deepseek"
Provides-Extra: claude
Requires-Dist: anthropic>=0.7.0; extra == "claude"
Provides-Extra: gemini
Requires-Dist: google-generativeai>=0.3.0; extra == "gemini"
Provides-Extra: rag
Requires-Dist: chromadb>=0.4.0; extra == "rag"
Requires-Dist: tiktoken>=0.5.0; extra == "rag"
Requires-Dist: pypdf>=3.0; extra == "rag"
Requires-Dist: python-docx>=0.8.0; extra == "rag"
Provides-Extra: agent
Requires-Dist: openai>=1.0.0; extra == "agent"
Provides-Extra: dev
Requires-Dist: pytest>=7.0; extra == "dev"
Requires-Dist: pytest-asyncio>=0.21.0; extra == "dev"
Requires-Dist: black>=23.0; extra == "dev"
Requires-Dist: ruff>=0.1.0; extra == "dev"
Requires-Dist: mypy>=1.0; extra == "dev"
Provides-Extra: all
Requires-Dist: thinkai[agent,claude,deepseek,gemini,ollama,openai,qwen,rag]; extra == "all"
Dynamic: license-file

# ThinkAi - Enterprise AI Framework

An enterprise-grade LLM integration framework built on FastAPI - **works out of the box, easy to use, and feature-complete**

## Features

- **Multi-model support** - works with Ollama, OpenAI, Qwen (Tongyi Qianwen), DeepSeek, Claude, Gemini, and other mainstream LLMs
- **Unified interface** - configure once, switch freely between models
- **Works out of the box** - usable with minimal setup, no complex configuration needed
- **OpenAI-compatible** - follows the OpenAI-compatible API format
- **Streaming responses** - SSE streaming output
- **Session management** - built-in multi-turn conversation context
- **RAG support** - retrieval-augmented generation in just a few lines of code
- **Agent system** - built-in ReAct agent with tool calling
- **Middleware pipeline** - logging, retries, caching, rate limiting
- **Enterprise-grade performance** - async architecture, connection pooling, automatic retries

## Installation

```bash
# Base install
pip install thinkai

# Install a specific provider
pip install "thinkai[ollama]"
pip install "thinkai[openai]"
pip install "thinkai[qwen]"

# Install everything
pip install "thinkai[all]"
```

## Quick Start

### 1. Minimal usage (3 lines of code)

```python
from thinkai import ThinkAI

ai = ThinkAI(provider="ollama", model="llama3")
response = await ai.chat("Hello")
print(response.content)
```
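
Outside of an async context (for example in a plain script rather than an async REPL), wrap the call with `asyncio.run`:

```python
import asyncio
from thinkai import ThinkAI

async def main():
    ai = ThinkAI(provider="ollama", model="llama3")
    response = await ai.chat("Hello")
    print(response.content)

asyncio.run(main())
```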

### 2. FastAPI integration

```python
from fastapi import FastAPI
from thinkai import ThinkAI

app = FastAPI()
ai = ThinkAI(provider="ollama", model="llama3")

@app.post("/chat")
async def chat(message: str):
    response = await ai.chat(message)
    return {"content": response.content}

# Run with: uvicorn main:app --reload
```

### 3. Configuring and switching between multiple models

```python
from thinkai import ThinkAI

ai = ThinkAI(provider="ollama", model="llama3")

# Register additional models
ai.register_model("qwen", provider="qwen", model="qwen-turbo")
ai.register_model("deepseek", provider="deepseek", model="deepseek-chat")
ai.register_model("gpt4", provider="openai", model="gpt-4")

# Switch freely per request
response1 = await ai.chat("Hello", model="llama3")
response2 = await ai.chat("Hello", model="qwen")
response3 = await ai.chat("Hello", model="deepseek")
response4 = await ai.chat("Hello", model="gpt4")
```

### 4. Multi-turn conversations (session management)

```python
ai = ThinkAI()

async with ai.session() as session:
    response1 = await session.chat("Hi, I want to learn Python")
    response2 = await session.chat("What's a good learning path?")
    response3 = await session.chat("Can you recommend some resources?")
```

### 5. Streaming responses

```python
ai = ThinkAI()

async for chunk in ai.chat_stream("Tell me a story"):
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```
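
To push the same stream to a browser as server-sent events, the chunks can be fed into FastAPI's `StreamingResponse`. A minimal sketch, reusing the FastAPI setup from above and assuming the chunk schema shown in this example (the endpoint path and `[DONE]` sentinel are illustrative):

```python
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from thinkai import ThinkAI

app = FastAPI()
ai = ThinkAI(provider="ollama", model="llama3")

@app.post("/chat/stream")
async def stream_chat(message: str):
    async def event_source():
        # Forward each delta as an SSE "data:" line
        async for chunk in ai.chat_stream(message):
            if chunk.choices and chunk.choices[0].delta.content:
                yield f"data: {chunk.choices[0].delta.content}\n\n"
        yield "data: [DONE]\n\n"

    return StreamingResponse(event_source(), media_type="text/event-stream")
```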

### 6. RAG (retrieval-augmented generation)

```python
from thinkai import ThinkAI
from thinkai.rag import RAGPipeline

ai = ThinkAI()

# Build a RAG pipeline over local document folders in a few lines
rag = RAGPipeline(
    documents=["./docs", "./knowledge"],
    ai_client=ai,
    chunk_size=500,
)

# Query the indexed documents
answer = await rag.query("Which AI models does the ThinkAi framework support?")
print(answer)
```

### 7. Agents

```python
from thinkai import ThinkAI
from thinkai.agent import ReActAgent, Tool

# Define tools
@Tool(name="calculator", description="Evaluate a math expression")
def calculator(expression: str) -> str:
    # eval is used for demonstration only; don't run it on untrusted input
    return str(eval(expression))

@Tool(name="search", description="Search for information")
async def search(query: str) -> str:
    # Implement the actual search logic here
    return "search results"

ai = ThinkAI()

# Create the agent
agent = ReActAgent(
    tools=[calculator, search],
    ai_client=ai,
    verbose=True,
)

# Run a task
result = await agent.run("Calculate 25*48, then search for information about Python")
print(result)
```

## Supported AI Models

| Provider | Models | Type | Configuration |
|----------|--------|------|---------------|
| **Ollama** | llama3, mistral, qwen, ... | Local | `provider="ollama"` |
| **OpenAI** | gpt-4, gpt-3.5-turbo, gpt-4o | Cloud | `provider="openai"` |
| **Qwen (Tongyi Qianwen)** | qwen-turbo, qwen-plus, qwen-max | Cloud | `provider="qwen"` |
| **DeepSeek** | deepseek-chat, deepseek-coder | Cloud | `provider="deepseek"` |
| **Anthropic** | claude-3-opus/sonnet/haiku | Cloud | `provider="claude"` |
| **Google** | gemini-pro, gemini-ultra | Cloud | `provider="gemini"` |
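
Each row maps directly onto the constructor shown in the quick start; cloud providers additionally need an API key supplied via environment variables or configuration (see the Configuration section below). A brief sketch:

```python
from thinkai import ThinkAI

local = ThinkAI(provider="ollama", model="llama3")              # local Ollama server
qwen = ThinkAI(provider="qwen", model="qwen-turbo")             # cloud, API key required
deepseek = ThinkAI(provider="deepseek", model="deepseek-chat")  # cloud, API key required
```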

## Project Structure

```
thinkai/
├── thinkai/
│   ├── __init__.py
│   ├── core/           # Core modules
│   │   ├── client.py   # Unified client
│   │   ├── config.py   # Configuration management
│   │   └── models.py   # Data models
│   ├── providers/      # Provider implementations
│   │   ├── base.py     # Provider base class
│   │   ├── registry.py # Provider registry
│   │   ├── ollama.py   # Ollama
│   │   ├── openai.py   # OpenAI
│   │   ├── qwen.py     # Qwen (Tongyi Qianwen)
│   │   └── deepseek.py # DeepSeek
│   ├── session/        # Session management
│   ├── prompt/         # Prompt templates
│   ├── middleware/     # Middleware
│   ├── rag/            # RAG module
│   ├── agent/          # Agent module
│   ├── streaming.py    # Streaming helpers
│   └── exceptions.py   # Exception definitions
├── examples/           # Example code
├── config.example.yaml # Example configuration
├── .env.example        # Example environment variables
└── pyproject.toml      # Project configuration
```

## Configuration

### Option 1: In code

```python
ai = ThinkAI(
    provider="ollama",
    model="llama3",
    temperature=0.7,
    max_tokens=2048,
    timeout=60,
)
```

### Option 2: Environment variables

```bash
export THINKAI_DEFAULT_PROVIDER=ollama
export THINKAI_DEFAULT_MODEL=llama3
export OPENAI_API_KEY=your_key_here
```
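
With these variables set, the client can be constructed without arguments and falls back to these defaults (as in the session and streaming examples above):

```python
from thinkai import ThinkAI

# Reads THINKAI_DEFAULT_PROVIDER / THINKAI_DEFAULT_MODEL and provider API keys
# from the environment, so no constructor arguments are needed here.
ai = ThinkAI()
```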

### Option 3: YAML configuration file

```yaml
default_provider: "ollama"
default_model: "llama3"

providers:
  openai:
    api_key: "${OPENAI_API_KEY}"
    api_base: "https://api.openai.com/v1"

models:
  llama3:
    provider: "ollama"
    model: "llama3"
    temperature: 0.7
```

```python
from thinkai.core.config import Settings

config = Settings.from_file("config.yaml")
ai = ThinkAI(config=config)
```

## Advanced Features

### Middleware

```python
from thinkai.middleware import LoggingMiddleware, RetryMiddleware

ai = ThinkAI()
ai.add_middleware(LoggingMiddleware())
ai.add_middleware(RetryMiddleware(max_retries=3))
```

### Prompt templates

```python
from thinkai.prompt.template import PromptTemplate, prompt_manager

# Use a built-in template
template = prompt_manager.get("system_code")
prompt = template.format()

# Define a custom template
custom = PromptTemplate("Convert the following code to $type: $code")
result = custom.format(type="Python", code="...")
```
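
The rendered prompt is a plain string, so it can be passed straight to `ai.chat` from the quick start (usage sketch, inside an async context):

```python
from thinkai import ThinkAI
from thinkai.prompt.template import PromptTemplate

ai = ThinkAI(provider="ollama", model="llama3")

custom = PromptTemplate("Convert the following code to $type: $code")
prompt = custom.format(type="Python", code="console.log('hi')")

# Inside an async function or async REPL:
response = await ai.chat(prompt)
print(response.content)
```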

### Custom providers

```python
from thinkai.providers.base import BaseProvider
from thinkai.providers.registry import register_provider

@register_provider("custom")
class CustomProvider(BaseProvider):
    name = "custom"
    default_model = "custom-model"
    
    async def chat(self, request):
        # Implement the chat logic here
        pass
    
    async def chat_stream(self, request):
        # Implement the streaming chat logic here
        pass
```
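
Once registered, the custom provider is selected like any built-in one (usage sketch, assuming the registry exposes the name to the client constructor):

```python
from thinkai import ThinkAI

ai = ThinkAI(provider="custom", model="custom-model")

# Inside an async function or async REPL:
response = await ai.chat("Hello from the custom provider")
```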

## Enterprise Features

- **Async architecture** - async/await throughout for high throughput (see the sketch after this list)
- **Connection pooling** - HTTP connections are reused
- **Automatic retries** - failed requests are retried with exponential backoff
- **Error handling** - a well-defined exception hierarchy
- **Type safety** - complete type hints
- **Logging** - structured logging support
- **Metrics** - Prometheus integration (planned)
- **Load balancing** - multi-model routing (planned)
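
Because every call is a coroutine, requests can be batched with standard asyncio tooling; a minimal sketch using only `ai.chat` from the quick start:

```python
import asyncio
from thinkai import ThinkAI

ai = ThinkAI(provider="ollama", model="llama3")

async def main():
    # The requests run concurrently and share the client's HTTP connection pool
    questions = ["What is RAG?", "What is an agent?", "What is SSE?"]
    responses = await asyncio.gather(*(ai.chat(q) for q in questions))
    for question, response in zip(questions, responses):
        print(question, "->", response.content)

asyncio.run(main())
```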

## Documentation

Full documentation: [https://thinkai.readthedocs.io](https://thinkai.readthedocs.io)

## Examples

Run the example scripts:

```bash
# Basic usage
python examples/basic_usage.py

# FastAPI integration
python examples/fastapi_demo.py

# RAG example
python examples/rag_example.py

# Agent example
python examples/agent_example.py
```

## Contributing

Issues and pull requests are welcome!

## License

MIT License
