Metadata-Version: 2.4
Name: langchain_deepseek_v4
Version: 0.1.0
Summary: LangChain integration package for DeepSeek chat models
Project-URL: Homepage, https://api-docs.deepseek.com
Project-URL: Repository, https://github.com/hlcm/langchain_deepseek_v4
License: MIT
License-File: LICENSE
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: <4.0,>=3.10
Requires-Dist: langchain-core<2.0.0,>=1.0.0
Requires-Dist: openai<3.0.0,>=1.99.0
Requires-Dist: pydantic<3.0.0,>=2.7.4
Provides-Extra: build
Requires-Dist: build<2.0.0,>=1.2.0; extra == 'build'
Provides-Extra: test
Requires-Dist: pytest-asyncio<2.0.0,>=0.23.0; extra == 'test'
Requires-Dist: pytest<10.0.0,>=8.0.0; extra == 'test'
Description-Content-Type: text/markdown

# langchain_deepseek_v4

LangChain chat model integration for the DeepSeek Chat Completions API.

Install the distribution package and import the integration module:

```bash
pip install langchain_deepseek_v4
```

This implementation targets the DeepSeek API documented at
<https://api-docs.deepseek.com>, including:

- Tool calling
- JSON structured output through DeepSeek JSON Output
- Thinking mode and `reasoning_content`
- Streaming chunks that preserve reasoning deltas
- Multi-turn conversations that pass tool-call reasoning back to DeepSeek
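The tool-calling and multi-turn items above reduce to the standard Chat Completions message shapes: the assistant's tool-call turn (with its `reasoning_content`, when present) is appended to the history, followed by a matching `tool` result, before the next request is sent. A minimal sketch using plain dicts and a hypothetical `get_weather` tool:

```python
# Illustrative message history for a tool-call round trip.
# `get_weather` and the call id are hypothetical, not part of this package.
messages = [
    {"role": "user", "content": "What's the weather in Paris?"},
    # Assistant turn as returned by the API: a tool call plus its reasoning.
    {
        "role": "assistant",
        "content": None,
        "reasoning_content": "Live data is needed; call the weather tool.",
        "tool_calls": [
            {
                "id": "call_1",
                "type": "function",
                "function": {"name": "get_weather", "arguments": '{"city": "Paris"}'},
            }
        ],
    },
    # Tool result, keyed back to the assistant's call by tool_call_id.
    {"role": "tool", "tool_call_id": "call_1", "content": '{"temp_c": 18}'},
]
```

The next user turn is simply appended to this list and the whole history is resent.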

```python
from langchain_deepseek_v4 import ChatDeepSeek

llm = ChatDeepSeek(
    model="deepseek-v4-pro",
    api_key="...",
    thinking={"type": "enabled"},
    reasoning_effort="high",
)

msg = llm.invoke("Say hello in JSON: {\"message\": \"...\"}")
```
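For JSON Output, DeepSeek's Chat Completions API accepts `response_format={"type": "json_object"}`, and the model reply is then a JSON string that parses with the standard library. A minimal sketch of the request payload and an illustrative (not real) model reply; the model name here is a placeholder:

```python
import json

# Hypothetical request payload for DeepSeek JSON Output mode.
payload = {
    "model": "deepseek-chat",  # placeholder model name
    "messages": [
        {"role": "system", "content": "Reply in JSON."},
        {"role": "user", "content": 'Say hello in JSON: {"message": "..."}'},
    ],
    "response_format": {"type": "json_object"},
}

# In JSON mode the reply content is a plain JSON string.
reply = '{"message": "hello"}'  # illustrative model output, not a real response
parsed = json.loads(reply)
```

Prompting the model to emit JSON (as in the example above) remains important: the `response_format` flag constrains the output shape but does not describe the schema you want.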

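When streaming, reasoning deltas and answer deltas arrive interleaved, so an accumulator only needs to keep the two streams separate. A minimal sketch, assuming chunk deltas shaped like Chat Completions stream deltas with an extra `reasoning_content` field (the sample stream below is illustrative):

```python
def accumulate(deltas):
    """Merge streamed deltas, keeping reasoning_content apart from content."""
    content, reasoning = [], []
    for delta in deltas:
        if delta.get("reasoning_content"):
            reasoning.append(delta["reasoning_content"])
        if delta.get("content"):
            content.append(delta["content"])
    return {"content": "".join(content), "reasoning_content": "".join(reasoning)}


# Illustrative stream: reasoning deltas arrive before the final answer.
stream = [
    {"reasoning_content": "The user wants "},
    {"reasoning_content": "a greeting."},
    {"content": '{"message": '},
    {"content": '"hello"}'},
]
msg = accumulate(stream)
```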