Metadata-Version: 2.4
Name: torchblocks-vp
Version: 1.1.0
Summary: Pluggable Transformer building blocks: GQA, RoPE, SwiGLU, RMSNorm, Conformer conv, adapters with registry system
Author: F000NK, Voluntas Progressus
License-Expression: MIT
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.14
Classifier: Operating System :: OS Independent
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.14
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: torch>=2.0.0
Dynamic: license-file

# torchblocks-vp

Pluggable Transformer building blocks with a plugin registry system.

Part of the [MorphoFormer](https://pypi.org/project/morphoformer/) project by Voluntas Progressus.

## Installation

```bash
pip install torchblocks-vp
```

Requires Python >= 3.14 and PyTorch >= 2.0.

## Features

- **Registry system** — `@register(category, name)` decorator + `get(category, name)` factory lookup
- **Attention** — Grouped Query Attention (GQA), Multi-Head Attention (MHA), Cross-Attention with KV cache support
- **Feed-forward** — SwiGLU, GeLU variants
- **Normalization** — RMSNorm, LayerNorm
- **Positional encoding** — Rotary Position Embeddings (RoPE)
- **Convolution** — Conformer-style depthwise separable conv1d
- **Adapters** — Language-conditioned, bottleneck, and no-op adapters

## Quick Start

```python
import torch
import torchblocks

# List all registered modules
print(torchblocks.list_modules())
# {'attention': ['gqa', 'mha', 'cross'], 'feedforward': ['swiglu', 'gelu'], ...}

# Get a specific module class by category and name
GQA = torchblocks.get("attention", "gqa")
attention = GQA(d_model=512, num_heads=8, num_kv_heads=2)

# Register your own module into the registry
@torchblocks.register("feedforward", "my_custom_ff")
class MyFeedForward(torch.nn.Module):
    ...
```

## Registry Categories

| Category | Registered modules |
|---|---|
| `attention` | `gqa`, `mha`, `cross` |
| `feedforward` | `swiglu`, `gelu` |
| `norm` | `rmsnorm`, `layernorm` |
| `conv` | `local`, `none` |
| `adapter` | `language_conditioned`, `bottleneck`, `none` |
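
The main draw of `gqa` over `mha` is KV-cache memory: with `num_kv_heads < num_heads`, query heads share key/value heads, shrinking the cache by the ratio `num_heads / num_kv_heads`. A back-of-envelope sketch, using the illustrative shapes from the Quick Start (`d_model=512`, 8 query heads, 2 KV heads; sequence length and fp16 element size are assumptions, not library defaults):

```python
# Back-of-envelope KV-cache sizing: why GQA (fewer KV heads) saves memory
# versus plain MHA. All shapes here are illustrative assumptions.
def kv_cache_bytes(seq_len: int, d_model: int, num_heads: int,
                   num_kv_heads: int, bytes_per_elem: int = 2) -> int:
    head_dim = d_model // num_heads
    # The cache holds K and V, each of shape (seq_len, num_kv_heads, head_dim).
    return 2 * seq_len * num_kv_heads * head_dim * bytes_per_elem

mha = kv_cache_bytes(4096, 512, num_heads=8, num_kv_heads=8)
gqa = kv_cache_bytes(4096, 512, num_heads=8, num_kv_heads=2)
print(mha // gqa)  # -> 4: the MHA cache is 4x larger at the same d_model
```

The same arithmetic explains why GQA-style blocks are popular for long-context decoding: cache size scales with `num_kv_heads`, not `num_heads`.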

## License

MIT
