Metadata-Version: 2.4
Name: mloopforml
Version: 1.0.0
Summary: Hyperparameter optimization with a neural-network surrogate, via a simple decorator
Project-URL: Homepage, https://github.com/emv-dev/mloopforml
Project-URL: Repository, https://github.com/emv-dev/mloopforml
Project-URL: Issues, https://github.com/emv-dev/mloopforml/issues
License-Expression: MIT
License-File: LICENSE
Keywords: bayesian,hyperparameter,machine learning,neural network,optimization
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.11
Requires-Dist: m-loop>=3.0
Requires-Dist: numpy>=2.0
Requires-Dist: rich>=13.0
Requires-Dist: scipy>=1.15
Provides-Extra: dev
Requires-Dist: build>=1.2; extra == 'dev'
Requires-Dist: pytest-cov>=7.0; extra == 'dev'
Requires-Dist: pytest>=8.0; extra == 'dev'
Requires-Dist: ruff>=0.15; extra == 'dev'
Description-Content-Type: text/markdown

# mloopforml

Hyperparameter optimization with a neural-network surrogate, via a simple decorator.

[![PyPI version](https://img.shields.io/pypi/v/mloopforml)](https://pypi.org/project/mloopforml/)
[![Python](https://img.shields.io/pypi/pyversions/mloopforml)](https://pypi.org/project/mloopforml/)

## Install

```bash
pip install mloopforml
```
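
The package metadata also declares a `dev` extra (pytest, pytest-cov, ruff, build); if you want those tools alongside the library:

```bash
pip install "mloopforml[dev]"
```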

## Quickstart

```python
import mloopforml as mloop

@mloop.optimize(
    params={"lr": (1e-4, 1e-1), "dropout": (0.0, 0.5)},
    max_iterations=50,
    direction="minimize",   # or "maximize"
    seed=42,
)
def train(lr, dropout):
    # Replace this synthetic objective with your actual training loop.
    # Return a single float (lower is better when direction="minimize").
    loss = (lr - 0.01) ** 2 + (dropout - 0.1) ** 2
    return loss

result = train()

print(result.best_params)   # e.g. {'lr': 0.0101, 'dropout': 0.0984}
print(result.best_score)    # e.g. 2.6e-06
```

mloopforml uses M-LOOP's neural network surrogate to propose new parameter
combinations. The decorator handles the optimization loop — your function just
returns a number.

### Parameters

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `params` | `dict[str, tuple[float, float]]` | required | Hyperparameter name → `(min, max)` bounds |
| `max_iterations` | `int` | `50` | Total number of trials |
| `direction` | `str` | `"minimize"` | `"minimize"` or `"maximize"` |
| `num_training_runs` | `int` | `1` | Bootstrap trials before the NN surrogate activates |
| `seed` | `int \| None` | `None` | Random seed for reproducible runs |
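
For example, a maximization run with a larger bootstrap phase might look like the sketch below; the objective is a synthetic stand-in for your own validation-accuracy function, and the parameter names are illustrative:

```python
import mloopforml as mloop

@mloop.optimize(
    params={"lr": (1e-5, 1e-2), "weight_decay": (0.0, 0.1)},
    max_iterations=100,
    direction="maximize",    # higher is better
    num_training_runs=10,    # 10 bootstrap trials before the surrogate takes over
    seed=0,
)
def evaluate(lr, weight_decay):
    # Synthetic stand-in: replace with your model's validation accuracy.
    return 1.0 - ((lr - 1e-3) ** 2 + weight_decay ** 2)

result = evaluate()
print(result.best_params, result.best_score)
```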

### Result

Calling the decorated function returns an `OptimizeResult`:

```python
result.best_params   # dict[str, float] — best parameter combination found
result.best_score    # float — best objective value
result.all_params    # list[dict] — all trial params, in order
result.all_costs     # list[float] — all trial scores (nan for failed trials)
```
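
Failed trials are recorded as `nan` in `all_costs`, so a post-run inspection usually filters them out first. A minimal sketch using only the fields above:

```python
import math

result = train()  # the decorated function from the quickstart above

# Keep only trials that returned a finite cost.
history = [
    (params, cost)
    for params, cost in zip(result.all_params, result.all_costs)
    if not math.isnan(cost)
]
print(f"{len(history)} successful trials; best score: {result.best_score}")
```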

## Requirements

- Python >=3.11
- m-loop >=3.0
- numpy >=2.0
- rich >=13.0
- scipy >=1.15

## License

MIT
