Metadata-Version: 2.4
Name: docklee
Version: 1.0.0
Summary: Docklee AI context infrastructure SDK
Project-URL: Homepage, https://docklee.com
Project-URL: Documentation, https://docs.docklee.com
Project-URL: Repository, https://github.com/docklee-ai/docklee-python
License: MIT
Keywords: ai,context,knowledge,llm,memory,rag
Requires-Python: >=3.9
Requires-Dist: httpx>=0.27.0
Provides-Extra: langchain
Requires-Dist: langchain-core>=0.2.0; extra == 'langchain'
Requires-Dist: langchain>=0.2.0; extra == 'langchain'
Provides-Extra: langgraph
Requires-Dist: langgraph>=0.1.0; extra == 'langgraph'
Provides-Extra: voice
Requires-Dist: pipecat-ai>=0.0.30; extra == 'voice'
Description-Content-Type: text/markdown

# docklee

AI context infrastructure SDK for Python. Company knowledge + persistent memory for any AI agent.

## Install

```bash
pip install docklee
```

## Quick Start

```python
import asyncio
from docklee import Docklee

async def main():
    async with Docklee(api_key="dk_live_xxxx") as client:

        # Query a knowledge engine — grounded answers with citations
        answer = await client.knowledge.query("eng_xxxx", "What is our refund policy?")
        print(answer.answer)
        print(answer.confidence)

        # Retrieve chunks for your own LLM
        chunks = await client.knowledge.retrieve("eng_xxxx", "pricing tiers")
        for chunk in chunks:
            print(chunk.content)

        # Write to memory
        await client.memory.write("space_xxxx", "User prefers dark mode")

        # Search memory
        results = await client.memory.search("space_xxxx", "user preferences")
        for r in results:
            print(r.content)

        # Unified context: knowledge engine + memory in one call
        context = await client.context.assemble(
            "eng_xxxx",
            "What is the pricing for 50 seats?",
            memory_space_id="space_xxxx",
        )
        print(context.answer)
        print(f"Memory used: {len(context.memory_context)} records")

asyncio.run(main())
```

## OpenAI Wrapper: add Docklee to any existing app in two lines

```python
from openai import AsyncOpenAI
from docklee.providers import withDocklee

client = withDocklee(
    AsyncOpenAI(api_key="sk-xxxx"),
    docklee_key="dk_live_xxxx",
    engine_id="eng_xxxx",         # company knowledge
    memory_space_id="space_xxxx", # user memory
)

# All existing OpenAI code works unchanged (call from inside an async function)
response = await client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What is our refund policy?"}],
)
# The answer is grounded in your company knowledge, with user memory injected automatically
```
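Under the hood, a wrapper like this typically intercepts each chat call and prepends the retrieved context as a system message before forwarding the request. A minimal, self-contained sketch of that injection pattern (the function name and message format here are illustrative, not the actual docklee internals):

```python
from typing import Any


def inject_context(
    messages: list[dict[str, Any]],
    knowledge: str,
    memory: str,
) -> list[dict[str, Any]]:
    """Prepend retrieved knowledge and user memory as a system message.

    Illustrative only: the real wrapper's prompt layout may differ.
    """
    system = {
        "role": "system",
        "content": (
            "Answer using the context below.\n\n"
            f"Company knowledge:\n{knowledge}\n\n"
            f"User memory:\n{memory}"
        ),
    }
    # The caller's messages are untouched; context rides along as a prefix.
    return [system, *messages]


enriched = inject_context(
    [{"role": "user", "content": "What is our refund policy?"}],
    knowledge="Refunds within 30 days of purchase.",
    memory="User prefers dark mode.",
)
```

Because the injection happens per call, the application's own message history stays clean and the context is always fresh.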

## Universal Tool Definition

```python
from docklee.providers import DockleeTools

tools = DockleeTools(
    docklee_key="dk_live_xxxx",
    engine_id="eng_xxxx",
    memory_space_id="space_xxxx",
)

# Works with any LLM
tools.for_openai()     # OpenAI function calling format
tools.for_anthropic()  # Anthropic tool use format
tools.for_gemini()     # Gemini function calling format
tools.for_any()        # Generic format

# Handle tool calls
result = await tools.handle_tool_call("docklee_search_knowledge", {"query": "refund policy"})
```
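For reference, `for_openai()` presumably emits tool schemas in OpenAI's function-calling format. The dict below illustrates that format for the `docklee_search_knowledge` tool shown above; the exact description and parameter names are assumptions, not the SDK's actual output:

```python
# Illustrative shape of an OpenAI-format tool definition.
search_knowledge_tool = {
    "type": "function",
    "function": {
        "name": "docklee_search_knowledge",
        "description": "Search the company knowledge engine for grounded answers.",
        "parameters": {
            # JSON Schema describing the tool's arguments
            "type": "object",
            "properties": {
                "query": {
                    "type": "string",
                    "description": "Natural-language search query.",
                },
            },
            "required": ["query"],
        },
    },
}
```

When the model emits a call to this tool, the arguments arrive as a JSON object matching the schema, which is what `handle_tool_call` receives.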

## LangChain Integration

```bash
pip install "docklee[langchain]"
```

```python
from docklee.integrations.langchain import DockleeRetriever, DockleeMemory

retriever = DockleeRetriever(api_key="dk_live_xxxx", engine_id="eng_xxxx")
memory = DockleeMemory(api_key="dk_live_xxxx", space_id="space_xxxx")

# inside an async function:
docs = await retriever.ainvoke("What is our pricing?")
history = await memory.aload_memory_variables({"input": "pricing"})
```
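A common next step is stuffing the retrieved documents into your prompt. A dependency-free sketch of that formatting step (the `page_content`/`metadata` attributes mirror LangChain's `Document`, but the class here is a stand-in):

```python
from dataclasses import dataclass, field


@dataclass
class Doc:
    """Stand-in for langchain_core.documents.Document."""
    page_content: str
    metadata: dict = field(default_factory=dict)


def format_context(docs: list[Doc]) -> str:
    """Number each chunk and cite its source so the LLM can reference it."""
    lines = []
    for i, doc in enumerate(docs, start=1):
        source = doc.metadata.get("source", "unknown")
        lines.append(f"[{i}] ({source}) {doc.page_content}")
    return "\n".join(lines)


context = format_context([
    Doc("Pro plan is $49/seat/month.", {"source": "pricing.md"}),
    Doc("Annual billing saves 20%.", {"source": "pricing.md"}),
])
```

Numbered chunks with sources let the model cite `[1]`, `[2]` in its answer, which keeps responses auditable.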

## LangGraph Integration

```bash
pip install "docklee[langgraph]"
```

```python
from langgraph.graph import StateGraph

from docklee.integrations.langgraph import docklee_knowledge_node, docklee_memory_node

graph = StateGraph(dict)  # substitute your graph's state schema for dict

graph.add_node("knowledge", docklee_knowledge_node(
    api_key="dk_live_xxxx",
    engine_id="eng_xxxx",
))
graph.add_node("memory", docklee_memory_node(
    api_key="dk_live_xxxx",
    space_id="space_xxxx",
))
```

## Voice Agent (Pipecat)

```bash
pip install "docklee[voice]"
```

```python
from docklee.integrations.pipecat import DockleeContextProcessor

processor = DockleeContextProcessor(
    api_key="dk_live_xxxx",
    engine_id="eng_xxxx",
    memory_space_id="space_xxxx",
)

transcript = "What plans do you offer?"  # latest user utterance from your pipeline
context = await processor.get_context(transcript)  # inside an async function
```

## Links

- Website: https://docklee.com
- Docs: https://docs.docklee.com
- API Reference: https://docs.docklee.com/api
