Metadata-Version: 2.4
Name: omgformer
Version: 2.0.2
Summary: Parallel Diffusion Language Model — GQA + AdaLN-Zero + Self-Conditioning + LoRA
Author: omgformer contributors
License:                                  Apache License
                                   Version 2.0, January 2004
                                http://www.apache.org/licenses/
        
           TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
        
           1. Definitions.
        
              "License" shall mean the terms and conditions for use, reproduction,
              and distribution as defined by Sections 1 through 9 of this document.
        
              "Licensor" shall mean the copyright owner or entity authorized by
              the copyright owner that is granting the License.
        
              "Legal Entity" shall mean the union of the acting entity and all
              other entities that control, are controlled by, or are under common
              control with that entity. For the purposes of this definition,
              "control" means (i) the power, direct or indirect, to cause the
              direction or management of such entity, whether by contract or
              otherwise, or (ii) ownership of fifty percent (50%) or more of the
              outstanding shares, or (iii) beneficial ownership of such entity.
        
              "You" (or "Your") shall mean an individual or Legal Entity
              exercising permissions granted by this License.
        
              "Source" form shall mean the preferred form for making modifications,
              including but not limited to software source code, documentation
              source, and configuration files.
        
              "Object" form shall mean any form resulting from mechanical
              transformation or translation of a Source form, including but
              not limited to compiled object code, generated documentation,
              and conversions to other media types.
        
              "Work" shall mean the work of authorship made available under
              the License, as indicated by a copyright notice that is included in
              or attached to the work (an example is provided in the Appendix below).
        
              "Derivative Works" shall mean any work, whether in Source or Object
              form, that is based on (or derived from) the Work and for which the
              editorial revisions, annotations, elaborations, or other modifications
              represent, as a whole, an original work of authorship. For the purposes
              of this License, Derivative Works shall not include works that remain
              separable from, or merely link (or bind by name) to the interfaces of,
              the Work and Derivative Works thereof.
        
              "Contribution" shall mean, as submitted to the Licensor for inclusion
              in the Work by the copyright owner or by an individual or Legal Entity
              authorized to submit on behalf of the copyright owner. For the purposes
              of this definition, "submitted" means any form of electronic, verbal,
              or written communication sent to the Licensor.
        
              "Contributor" shall mean Licensor and any Legal Entity on behalf of
              whom a Contribution has been received by the Licensor and included
              within the Work.
        
           2. Grant of Copyright License. Subject to the terms and conditions of
              this License, each Contributor hereby grants to You a perpetual,
              worldwide, non-exclusive, no-charge, royalty-free, irrevocable
              copyright license to reproduce, prepare Derivative Works of,
              publicly display, publicly perform, sublicense, and distribute the
              Work and such Derivative Works in Source or Object form.
        
           3. Grant of Patent License. Subject to the terms and conditions of
              this License, each Contributor hereby grants to You a perpetual,
              worldwide, non-exclusive, no-charge, royalty-free, irrevocable
              patent license to make, use, sell, offer for sale, import, and
              otherwise transfer the Work.
        
           4. Redistribution. You may reproduce and distribute copies of the
              Work or Derivative Works thereof in any medium, with or without
              modifications, and in Source or Object form, provided that You
              meet the following conditions:
        
              (a) You must give any other recipients of the Work or Derivative
                  Works a copy of this License; and
        
              (b) You must cause any modified files to carry prominent notices
                  stating that You changed the files; and
        
              (c) You must retain, in the Source form of any Derivative Works
                  that You distribute, all copyright, patent, trademark, and
                  attribution notices from the Source form of the Work; and
        
              (d) If the Work includes a "NOTICE" text file, You must include a
                  readable copy of the attribution notices contained within such
                  NOTICE file.
        
           5. Submission of Contributions. Unless You explicitly state otherwise,
              any Contribution submitted for inclusion in the Work shall be under
              the terms and conditions of this License, without any additional terms.
        
           6. Trademarks. This License does not grant permission to use the trade
              names, trademarks, service marks, or product names of the Licensor.
        
           7. Disclaimer of Warranty. Unless required by applicable law or agreed
              to in writing, Licensor provides the Work on an "AS IS" BASIS,
              WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND.
        
           8. Limitation of Liability. In no event and under no legal theory,
              whether in tort (including negligence), contract, or otherwise,
              shall any Contributor be liable to You for damages.
        
           9. Accepting Warranty or Additional Liability. While redistributing
              the Work or Derivative Works thereof, You may offer, and charge a
              fee for, acceptance of support, warranty, indemnity, or other
              liability obligations consistent with this License.
        
           END OF TERMS AND CONDITIONS
        
           Copyright 2024-2026 omgformer contributors
        
           Licensed under the Apache License, Version 2.0 (the "License");
           you may not use this file except in compliance with the License.
           You may obtain a copy of the License at
        
               http://www.apache.org/licenses/LICENSE-2.0
        
           Unless required by applicable law or agreed to in writing, software
           distributed under the License is distributed on an "AS IS" BASIS,
           WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
           See the License for the specific language governing permissions and
           limitations under the License.
        
Project-URL: Homepage, https://pypi.org/project/omgformer/
Project-URL: Documentation, https://pypi.org/project/omgformer/
Project-URL: Bug Tracker, https://pypi.org/project/omgformer/
Keywords: diffusion,language-model,transformer,nlp,masked-diffusion,parallel-decoding,deep-learning,lora
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Science/Research
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Typing :: Typed
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: torch>=2.2.0
Requires-Dist: transformers>=4.38.0
Requires-Dist: safetensors>=0.4.0
Provides-Extra: train
Requires-Dist: wandb>=0.16.0; extra == "train"
Requires-Dist: datasets>=2.18.0; extra == "train"
Requires-Dist: accelerate>=0.28.0; extra == "train"
Requires-Dist: tqdm>=4.66.0; extra == "train"
Provides-Extra: hub
Requires-Dist: huggingface-hub>=0.22.0; extra == "hub"
Provides-Extra: dev
Requires-Dist: pytest>=8.0.0; extra == "dev"
Requires-Dist: pytest-cov>=5.0.0; extra == "dev"
Requires-Dist: mypy>=1.9.0; extra == "dev"
Requires-Dist: ruff>=0.4.0; extra == "dev"
Dynamic: license-file

# omgformer v2.0.2

**Parallel Diffusion Language Model**: generates all tokens at once.

A classic GPT needs 256 forward passes to generate 256 tokens.  
omgformer produces the same output in **8–10 forward passes**.

```bash
pip install omgformer
```

---

## Core Idea

```
Step 0: "Hello [MASK] [MASK] [MASK] [MASK] [MASK]"
Step 1: "Hello world  [MASK] [MASK] [MASK] [MASK]"
Step 2: "Hello world  how    are   [MASK] [MASK]"
Step 3: "Hello world  how    are    you   today"
```

Each step runs the entire sequence through a single forward pass.  
The model attends both left and right.  
The highest-confidence tokens are unmasked.  
No Python for loop: fully vectorized.
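
A minimal sketch of one such step, assuming a simple greedy confidence rule (the real `ParallelDecoder` uses its own scheduling and sampling logic):

```python
import torch

# Illustrative decode step, not omgformer internals: unmask the k most
# confident MASK positions in one vectorized operation.
def decode_step(logits, tokens, mask_id, k):
    probs = logits.softmax(dim=-1)                    # (B, T, V)
    conf, pred = probs.max(dim=-1)                    # per-position confidence
    conf = conf.masked_fill(tokens != mask_id, -1.0)  # only masked slots compete
    top = conf.topk(k, dim=-1).indices                # k most confident positions
    return tokens.scatter(1, top, pred.gather(1, top))
```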

---

## Quick Start

```python
from omgformer import OMGConfig, OMGModel, MaskScheduler, ParallelDecoder
from transformers import AutoTokenizer  # transformers is a core dependency

cfg   = OMGConfig.from_preset("omgformer-base")
model = OMGModel(cfg).eval()
tokenizer = AutoTokenizer.from_pretrained("gpt2")  # example; any tokenizer matching cfg.vocab_size

# Generate directly from the model (new in v2.0.1)
prompt_ids = tokenizer.encode("The history of Istanbul")
out = model.generate(prompt_ids, new_tokens=128, steps=10)
print(tokenizer.decode(out[0]))
```

### Manual decoder

```python
sched   = MaskScheduler(steps=64, mask_token_id=cfg.mask_token_id, vocab_size=cfg.vocab_size)
decoder = ParallelDecoder(model, sched)

out = decoder.generate(
    prompt_ids,
    new_tokens=128,
    steps=10,           # forward passes for the whole generation
    temperature=0.9,    # softmax temperature
    top_p=0.95,         # nucleus sampling threshold
    remask_prob=0.05,   # probability of re-masking revealed tokens
)
```

### Pipeline

```python
from omgformer import pipeline

gen    = pipeline("text-generation", model="omgformer-base")
result = gen("Artificial intelligence", new_tokens=256, steps=10)
print(result["generated_text"])
```

### Training

```python
from omgformer.training import Trainer, TrainingConfig

train_cfg = TrainingConfig(
    steps=100_000,
    batch_size=32,
    lr=3e-4,
    use_amp=True,
    use_compile=True,
    grad_accum_steps=4,
    self_cond_prob=0.5,
    use_wandb=True,
)

trainer = Trainer(model, sched, train_cfg, get_batch=my_data_loader)
trainer.fit()
```

### Multi-GPU (FSDP)

```python
from omgformer.training import wrap_fsdp

model = wrap_fsdp(model, device_id=local_rank)
# torchrun --nproc_per_node=8 train.py
```

---

## v2.0.1: Bug Fixes

| File | Bug | Fix |
|------|-----|-----|
| `modeling.py` | `_forward_blocks` typed `t_emb` as `Tensor` (not `Optional`) | Changed to `Optional[torch.Tensor]` |
| `training.py` | `torch.cuda.amp.GradScaler` deprecated (PyTorch ≥ 2.3) | Switched to `torch.amp.GradScaler("cuda", …)` (sketch below) |
| `training.py` | Gradient checkpointing turned `t_emb=None` into a dummy tensor, taking the wrong branch | `None` values now passed through correctly |
| `tokenization.py` | Infinite recursion in `__getattr__` | Guarded with `object.__getattribute__` |
| `diffusion.py` | `generate_stream` ignored `temperature`/`top_k`/`top_p`/`remask_prob` | Full parameter support added |
| `diffusion.py` | `generate_stream` could leave MASK tokens after the final step | Forced unmasking added on the final step |
| `configuration.py` | `assert` checks can be disabled with `-O` | Replaced with `ValueError` |
| `pipeline.py` | `assert task == …` | Replaced with `ValueError` |
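
For reference, the `GradScaler` migration from the table looks like this in a generic AMP training step (a sketch, not omgformer's actual `training.py`):

```python
import torch

model = torch.nn.Linear(8, 1).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
scaler = torch.amp.GradScaler("cuda")    # new API, PyTorch >= 2.3
# scaler = torch.cuda.amp.GradScaler()   # old API, now deprecated

batch = torch.randn(4, 8, device="cuda")
with torch.amp.autocast("cuda"):
    loss = model(batch).pow(2).mean()    # placeholder loss
scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()
optimizer.zero_grad(set_to_none=True)
```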

### New Additions

- **`utils.py`**: `set_seed`, `count_parameters`, `get_model_size_mb`, `get_device`, `tokens_per_second`, `format_number` (usage sketch after this list)
- **`OMGModel.generate()`**: generate directly, without building a `MaskScheduler` + `ParallelDecoder` yourself
- **`OMGConfig.from_json_string()`**: build a configuration from a JSON string
- **`OMGConfig.validate()`**: early value validation
- **`TrainingConfig.validate()`**: catches inconsistencies before training starts
- **`AutoModel.from_pretrained(device=…)`**: added a `device` parameter
- **`py.typed`**: PEP 561 typing marker
- **PyPI metadata**: `classifiers`, `keywords`, `authors`, `[project.urls]`
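
A possible usage of the new utilities (the exact signatures are assumptions based on the names above):

```python
from omgformer.utils import set_seed, count_parameters, get_device

set_seed(42)                    # reproducible runs
device = get_device()           # assumed to pick CUDA/MPS/CPU automatically
print(count_parameters(model))  # assumed to return the total parameter count
```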

---

## v2.0.0 Architecture

### GQA (Grouped Query Attention)

```python
cfg = OMGConfig(num_heads=16, num_kv_heads=4)  # 4× memory savings (KV cache)
```
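
The memory saving comes from sharing each KV head across a group of query heads. A minimal illustration (not omgformer's attention code):

```python
import torch
import torch.nn.functional as F

B, T, n_q, n_kv, d = 2, 128, 16, 4, 64
q = torch.randn(B, n_q, T, d)
k = torch.randn(B, n_kv, T, d)   # 4× fewer KV heads -> 4× smaller KV cache
v = torch.randn(B, n_kv, T, d)

# Each group of n_q // n_kv query heads shares one KV head
k = k.repeat_interleave(n_q // n_kv, dim=1)  # (B, n_q, T, d)
v = v.repeat_interleave(n_q // n_kv, dim=1)
out = F.scaled_dot_product_attention(q, k, v)
```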

### AdaLN-Zero

```
x_attn = norm(x) * (1 + scale) + shift
x = x + gate.tanh() * attention(x_attn)
```

Gates are zero-initialized → at the start of training each block is a pure pass-through.
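
A minimal module matching the pseudocode above, assuming a conditioning vector `c` broadcastable to the hidden states (a sketch, not omgformer's block implementation):

```python
import torch
import torch.nn as nn

class AdaLNZero(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.norm = nn.LayerNorm(dim, elementwise_affine=False)
        self.mod = nn.Linear(dim, 3 * dim)
        nn.init.zeros_(self.mod.weight)  # zero init: tanh(gate) = 0 at start,
        nn.init.zeros_(self.mod.bias)    # so the residual branch contributes nothing

    def forward(self, x, c, sublayer):
        # c: conditioning, e.g. a (B, 1, dim) timestep embedding
        shift, scale, gate = self.mod(c).chunk(3, dim=-1)
        h = self.norm(x) * (1 + scale) + shift
        return x + gate.tanh() * sublayer(h)
```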

### Self-Conditioning

```
Step N:   logits_n  = model(noisy, self_cond=None)
Step N+1: logits_n1 = model(noisy, self_cond=soft_embed(logits_n))
```
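
One concrete reading of `soft_embed` is a probability-weighted average over the token embedding table (a sketch; `embed_matrix` is a hypothetical name for that table):

```python
import torch

def soft_embed(logits: torch.Tensor, embed_matrix: torch.Tensor) -> torch.Tensor:
    # (B, T, V) logits -> (B, T, D): expected embedding under the softmax
    return logits.softmax(dim=-1) @ embed_matrix  # embed_matrix: (V, D)
```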

### Absorbing Diffusion

```
80% → MASK token
10% → a random other token
10% → the original token is kept
```
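
A minimal sketch of this 80/10/10 corruption (an assumed helper, not omgformer's `diffusion.py`):

```python
import torch

def corrupt(tokens, mask_prob, mask_id, vocab_size):
    hit = torch.rand(tokens.shape) < mask_prob   # positions chosen for corruption
    r = torch.rand(tokens.shape)
    out = tokens.clone()
    out[hit & (r < 0.8)] = mask_id               # 80% -> MASK
    rand = hit & (r >= 0.8) & (r < 0.9)          # 10% -> random token
    out[rand] = torch.randint(0, vocab_size, tokens.shape)[rand]
    return out                                   # remaining 10% left unchanged
```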

---

## Model Sizes

| Preset         | Layers | Hidden | Heads (Q/KV) | ~Params    |
|----------------|--------|--------|--------------|------------|
| omgformer-tiny | 4      | 256    | 4/4          | ~14M       |
| omgformer-small| 8      | 512    | 8/4          | ~87M       |
| omgformer-base | 12     | 768    | 12/4         | ~180M      |
| omgformer-large| 24     | 1024   | 16/4         | ~600M      |
| omgformer-xl   | 24     | 2048   | 16/2         | ~2.1B      |
| omgformer-3b   | 32     | 2560   | 32/8         | ~3.2B      |

---

## Speed Comparison

| Model                      | 256-token generation | Method              |
|----------------------------|----------------------|---------------------|
| GPT-2                      | 256 forward passes   | Autoregressive      |
| omgformer (steps=10)       | 10 forward passes    | Parallel diffusion  |
| omgformer (steps=6, SC)    | 6 forward passes     | + Self-conditioning |

---

## Installation

```bash
# Minimal install
pip install omgformer

# With training extras
pip install "omgformer[train]"

# With Hugging Face Hub support
pip install "omgformer[hub]"

# With developer tools
pip install "omgformer[dev]"
```

---

## Tests

```bash
pytest tests/ -v
```

---

## Referanslar

- [Simple and Effective Masked Diffusion Language Models (MDLM)](https://arxiv.org/abs/2406.07524)
- [Scalable Diffusion Models with Transformers (DiT)](https://arxiv.org/abs/2212.09748)
- [Analog Bits: Generating Discrete Data using Diffusion Models with Self-Conditioning](https://arxiv.org/abs/2208.04202)
- [GQA: Training Generalized Multi-Query Transformer Models from Multi-Head Checkpoints](https://arxiv.org/abs/2305.13245)
- [Mercury: Ultra-Fast Language Models (Inception Labs)](https://www.inceptionlabs.ai/mercury)

---

## License

Apache-2.0
