Metadata-Version: 2.4
Name: cajal-p2pclaw
Version: 1.0.0
Summary: CAJAL-4B — Native integration for the P2PCLAW scientific intelligence model. Easy inference, chat, and server.
Project-URL: Homepage, https://github.com/Agnuxo1/CAJAL
Project-URL: Documentation, https://huggingface.co/Agnuxo/CAJAL-4B-P2PCLAW
Project-URL: Repository, https://github.com/Agnuxo1/CAJAL
Project-URL: Issues, https://github.com/Agnuxo1/CAJAL/issues
Author-email: Francisco Angulo de Lafuente <lareliquia.angulo@gmail.com>
Keywords: agents,ai,cajal,llm,ollama,p2pclaw,qwen,research,scientific
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Scientific/Engineering :: Information Analysis
Requires-Python: >=3.9
Requires-Dist: accelerate>=0.29.0
Requires-Dist: fastapi>=0.110.0
Requires-Dist: huggingface-hub>=0.23.0
Requires-Dist: peft>=0.10.0
Requires-Dist: pydantic>=2.0.0
Requires-Dist: torch>=2.2.0
Requires-Dist: transformers>=4.40.0
Requires-Dist: uvicorn>=0.29.0
Provides-Extra: all
Requires-Dist: bitsandbytes>=0.43.0; extra == 'all'
Requires-Dist: fastapi>=0.110.0; extra == 'all'
Requires-Dist: optimum>=1.18.0; extra == 'all'
Requires-Dist: uvicorn>=0.29.0; extra == 'all'
Provides-Extra: quantized
Requires-Dist: bitsandbytes>=0.43.0; extra == 'quantized'
Requires-Dist: optimum>=1.18.0; extra == 'quantized'
Provides-Extra: server
Requires-Dist: fastapi>=0.110.0; extra == 'server'
Requires-Dist: uvicorn>=0.29.0; extra == 'server'
Description-Content-Type: text/markdown

# CAJAL-4B-P2PCLAW — Native Python Integration

🧠 **One-line install**: `pip install cajal-p2pclaw`

CAJAL-4B is a scientific intelligence model fine-tuned for decentralized research networks, peer-to-peer architectures, cryptographic protocols, and formal verification. This package provides native Python bindings, an OpenAI-compatible FastAPI server, a CLI tool, and ready-made configuration files for the platforms listed below.

---

## Quick Start

### 1. Install

```bash
pip install cajal-p2pclaw
```

### 2. Chat in Python

```python
from cajal import CAJALChat

chat = CAJALChat()
response = chat.send("Explain Byzantine consensus in P2P networks.")
print(response)
```

### 3. Run Server (OpenAI-compatible)

```bash
cajal-server --port 8000
# Endpoint: POST /v1/chat/completions
```
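Once the server is up, any OpenAI-style client can talk to it. A minimal stdlib sketch (the model name `cajal` and the local URL are assumptions; match them to your deployment):

```python
import json
import urllib.request

def build_payload(prompt, model="cajal", temperature=0.7):
    """Standard OpenAI chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def chat(prompt, base_url="http://localhost:8000"):
    """POST to the server's /v1/chat/completions endpoint and
    return the assistant's reply text."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Because the endpoint follows the OpenAI wire format, the official `openai` client also works by pointing its `base_url` at the server.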

### 4. CLI Chat

```bash
cajal "Explain zero-knowledge proofs"
# Or interactive mode:
cajal -i
```

---

## Ollama Integration (Recommended)

```bash
# Pull the model
ollama pull Agnuxo/CAJAL-4B-P2PCLAW

# Create custom model with system prompt
ollama create cajal -f Modelfile

# Run
ollama run cajal
```
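`ollama create` expects a Modelfile in the working directory. A minimal sketch of what one might contain (the system prompt and parameter values here are illustrative; the shipped `integrations/ollama/Modelfile` is the authoritative version):

```
FROM Agnuxo/CAJAL-4B-P2PCLAW
SYSTEM "You are CAJAL, a scientific research assistant specialized in decentralized networks, cryptographic protocols, and formal verification."
PARAMETER temperature 0.7
```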

---

## Platform Integrations

| Platform | Config | Status |
|----------|--------|--------|
| **Ollama** | `integrations/ollama/Modelfile` | ✅ Ready |
| **VS Code** | `integrations/vscode/cajal.json` | ✅ Ready |
| **Cursor** | `integrations/cursor/cajal.json` | ✅ Ready |
| **Continue.dev** | `integrations/continue_dev/config.yaml` | ✅ Ready |
| **Open WebUI** | `integrations/openwebui/README.md` | ✅ Ready |
| **Jan** | `integrations/jan/model.json` | ✅ Ready |
| **LM Studio** | `integrations/lmstudio/README.md` | ✅ Ready |
| **Pinokio** | `integrations/pinokio/install.json` | ✅ Ready |

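As an example of the pattern these configs follow, a Continue.dev entry pointing at the Ollama model might look like the sketch below (based on Continue's `config.yaml` schema; the shipped `integrations/continue_dev/config.yaml` is the authoritative version):

```yaml
models:
  - name: CAJAL-4B
    provider: ollama
    model: cajal
    roles:
      - chat
```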
---

## Model Info

- **Base**: Qwen3.5-4B (Apache 2.0)
- **Parameters**: 4.21B (fine-tuned)
- **Context**: 262K tokens
- **Format**: Safetensors BF16
- **License**: MIT
- **HF**: [Agnuxo/CAJAL-4B-P2PCLAW](https://huggingface.co/Agnuxo/CAJAL-4B-P2PCLAW)

---

## Requirements

- Python 3.9+
- PyTorch 2.2+
- Transformers 4.40+
- GPU recommended (6.5GB VRAM minimum)
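
The 6.5 GB minimum likely assumes a quantized build (see the `quantized` extra): BF16 weights alone exceed it. A rough back-of-envelope for the weights, ignoring activations and KV-cache overhead:

```python
# Approximate in-VRAM size of the model weights at different precisions.
def weight_gb(n_params, bytes_per_param):
    """Weight footprint in GB (decimal) for n_params parameters."""
    return n_params * bytes_per_param / 1e9

bf16_gb = weight_gb(4.21e9, 2.0)   # BF16 (2 bytes/param): ~8.4 GB
nf4_gb  = weight_gb(4.21e9, 0.5)   # 4-bit (e.g. bitsandbytes NF4): ~2.1 GB
```

This is why the full BF16 checkpoint needs a larger GPU (or CPU offload via `accelerate`), while a 4-bit build fits comfortably under the stated minimum.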

---

## License

MIT — Francisco Angulo de Lafuente (Agnuxo1)

Copyright 2026 P2PCLAW Laboratory
