Metadata-Version: 2.4
Name: mldd-core
Version: 1.0.1
Summary: Visual Drag-and-drop Machine Learning Trainer
Author: Shivam Sharma
License: MIT License
        
        Copyright (c) 2026 zaina-ml
        
        Permission is hereby granted, free of charge, to any person obtaining a copy
        of this software and associated documentation files (the "Software"), to deal
        in the Software without restriction, including without limitation the rights
        to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
        copies of the Software, and to permit persons to whom the Software is
        furnished to do so, subject to the following conditions:
        
        The above copyright notice and this permission notice shall be included in all
        copies or substantial portions of the Software.
        
        THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
        IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
        FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
        AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
        LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
        OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
        SOFTWARE.
        
Keywords: machine learning,pytorch,visual,no-code
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.11
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: dearpygui>=1.11.0
Requires-Dist: Pillow>=9.0.0
Provides-Extra: training
Requires-Dist: torch>=2.0.0; extra == "training"
Requires-Dist: torchvision>=0.15.0; extra == "training"
Dynamic: license-file

# ML Forge

A visual PyTorch pipeline editor. Build, train, and run image classification models without writing code.

![ML D&D screenshot](ml_D_D/assets/showcase.gif)

---

## What it does

- **Visual pipeline** - drag nodes onto a canvas, connect them with wires, and ML Forge generates and runs the training code for you
- **Three-tab workflow** - Data Prep -> Model -> Training
- **Live training** - watch loss curves update in real time, save checkpoints, run inference on your trained model
- **Export** - export projects as clean, standalone PyTorch code


## Requirements
**IMPORTANT**: PyTorch must be preinstalled for training; it is not installed as a hard dependency. (It is also available through the optional `training` extra: `pip install "mldd-core[training]"`.)

- Python 3.11 or newer
- PyTorch 2.0 or newer
- torchvision

```
pip install torch torchvision
```

GPU training is automatic if CUDA is available. CPU and Apple MPS are also supported.
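The automatic device choice follows the usual CUDA → MPS → CPU fallback. A minimal sketch of that selection logic (in a real PyTorch program the two flags would come from `torch.cuda.is_available()` and `torch.backends.mps.is_available()`; this is an illustration, not the tool's actual code):

```python
def pick_device(cuda_available: bool, mps_available: bool) -> str:
    """Prefer CUDA, then Apple MPS, then fall back to CPU."""
    if cuda_available:
        return "cuda"
    if mps_available:
        return "mps"
    return "cpu"

# On a CUDA machine: pick_device(True, False) -> "cuda"
```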


## Building your first model

### 1. Data Prep tab

- Add a **Dataset** node (MNIST, CIFAR10, CIFAR100, FashionMNIST, or ImageFolder)
- Chain **transforms**: ToTensor is required; add Normalize for best results
- End with a **DataLoader (train)** node
- For proper validation, add a second chain (same dataset with `train=False`) ending with **DataLoader (val)**
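Normalize simply shifts and scales each channel: after ToTensor maps raw pixel values into [0, 1], it computes `(x - mean) / std`. A pure-Python sketch of the per-pixel arithmetic (the 0.1307/0.3081 defaults are the commonly quoted MNIST mean/std, used here only as an example):

```python
def normalize(pixel: float, mean: float = 0.1307, std: float = 0.3081) -> float:
    # ToTensor has already scaled the raw byte value into [0, 1]
    return (pixel - mean) / std

# A mid-gray pixel (0.5) lands just above one standard deviation:
# normalize(0.5) = (0.5 - 0.1307) / 0.3081 ≈ 1.199
```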

### 2. Model tab

- Start with an **Input** node - shape is auto-filled from your dataset
- Add layers: Linear, Conv2D, ReLU, BatchNorm2D, Flatten, Dropout, etc.
- End with an **Output** node - num classes is auto-filled from your dataset
- Connect nodes by dragging from an output pin to an input pin
- `in_features` and `in_channels` auto-fill when you connect layers
- After a Flatten node, the next Linear's `in_features` is calculated automatically
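The auto-filled `in_features` after a Flatten is just channels × height × width of the preceding feature map, where each Conv2D shrinks the spatial size by the standard convolution output formula. A sketch of that calculation (parameter names mirror the usual PyTorch Conv2D arguments):

```python
def conv2d_out(size: int, kernel: int, stride: int = 1, padding: int = 0) -> int:
    # Standard output-size formula for a square convolution
    return (size + 2 * padding - kernel) // stride + 1

def flatten_features(channels: int, height: int, width: int) -> int:
    return channels * height * width

# MNIST (1x28x28) through Conv2D(1 -> 16, kernel=3): 28 -> 26,
# so the next Linear gets in_features = 16 * 26 * 26 = 10816
```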

### 3. Training tab

Add these four nodes from the palette and wire them up:

```
DataLoaderBlock.images  ->  ModelBlock.images
ModelBlock.predictions  ->  Loss.pred
DataLoaderBlock.labels  ->  Loss.target
Loss.loss               ->  Optimizer.params
```

Configure epochs, device, checkpointing and early stopping in the right panel, then press **RUN**.
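Early stopping, as configured in the right panel, typically tracks the best validation loss and halts after `patience` epochs without improvement. A minimal sketch of that logic (the `patience` default here is an assumption for illustration, not necessarily the tool's actual default):

```python
class EarlyStopping:
    def __init__(self, patience: int = 3):
        self.patience = patience
        self.best = float("inf")
        self.stale_epochs = 0

    def should_stop(self, val_loss: float) -> bool:
        if val_loss < self.best:
            # Improvement: remember it and reset the counter
            self.best = val_loss
            self.stale_epochs = 0
        else:
            self.stale_epochs += 1
        return self.stale_epochs >= self.patience
```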

---

## Keyboard shortcuts

| Key | Action |
|-----|--------|
| `Del` | Delete selected nodes |
| `Ctrl+S` | Save project |
| `Ctrl+Z` | Undo |
| `Ctrl+Y` | Redo |
| Middle-drag | Pan the canvas |

---

## Supported datasets

| Dataset | Classes | Input shape |
|---------|---------|-------------|
| MNIST | 10 | 1 × 28 × 28 |
| FashionMNIST | 10 | 1 × 28 × 28 |
| CIFAR-10 | 10 | 3 × 32 × 32 |
| CIFAR-100 | 100 | 3 × 32 × 32 |
| ImageFolder | custom | 3 × 224 × 224 |
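These input shapes determine the first Linear layer's `in_features` if you flatten the image directly: channels × height × width. A quick lookup based on the table above:

```python
DATASET_SHAPES = {
    "MNIST": (1, 28, 28),
    "FashionMNIST": (1, 28, 28),
    "CIFAR-10": (3, 32, 32),
    "CIFAR-100": (3, 32, 32),
    "ImageFolder": (3, 224, 224),  # default shape from the table above
}

def flat_features(dataset: str) -> int:
    channels, height, width = DATASET_SHAPES[dataset]
    return channels * height * width

# MNIST -> 784, CIFAR-10 -> 3072, ImageFolder -> 150528
```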

---

## Inference

After training, open **Run -> Inference**, browse to your checkpoint (`.pth`), and click **Run Inference** to sample from the test set and see top-k predictions.
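Top-k predictions come from the model's raw logits: softmax turns them into probabilities, then the k largest are reported. A pure-Python sketch of that post-processing step (real checkpoint inference would of course go through `torch`):

```python
import math

def softmax(logits):
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def topk(probs, k):
    # (class_index, probability) pairs, highest probability first
    ranked = sorted(enumerate(probs), key=lambda p: p[1], reverse=True)
    return ranked[:k]
```

For example, `topk(softmax([2.0, 0.5, 1.0]), 2)` ranks class 0 first, then class 2.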

---

## Metrics

Click the **METRICS** button to see a summary of your training run: final loss, best validation accuracy, fit diagnosis, and loss/accuracy curves. The curves also appear in the right-hand training panel.

---

## Saving and loading

Projects are saved as `.mlf` files (JSON). Use **File -> Save / Save As** or `Ctrl+S`.
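Because `.mlf` files are plain JSON, they can be inspected or generated with any JSON tool. A round-trip sketch using a hypothetical minimal schema (the `nodes`/`edges` keys here are illustrative assumptions, not the documented format):

```python
import json

# Hypothetical minimal project: two nodes and one wire between them
project = {
    "nodes": [
        {"id": 1, "type": "Dataset", "params": {"name": "MNIST"}},
        {"id": 2, "type": "DataLoader", "params": {"batch_size": 64}},
    ],
    "edges": [{"from": 1, "to": 2}],
}

with open("demo.mlf", "w") as f:
    json.dump(project, f, indent=2)

with open("demo.mlf") as f:
    loaded = json.load(f)

assert loaded == project  # JSON round-trips the project losslessly
```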

---

## Exporting code

**File -> Export -> Python -> PyTorch** generates a standalone `train.py` that reproduces your pipeline. No ML Forge required to run it.

---

## Running
```
python -m ml_D_D
```
