Metadata-Version: 2.4
Name: mloopforml
Version: 0.1.0
Summary: Neural network surrogate hyperparameter optimization via a simple decorator
Project-URL: Homepage, https://github.com/emv-dev/mloopforml
Project-URL: Repository, https://github.com/emv-dev/mloopforml
Project-URL: Issues, https://github.com/emv-dev/mloopforml/issues
License-Expression: MIT
License-File: LICENSE
Keywords: bayesian,hyperparameter,machine learning,neural network,optimization
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.11
Requires-Dist: numpy>=2.0
Requires-Dist: scipy>=1.15
Provides-Extra: dev
Requires-Dist: build>=1.2; extra == 'dev'
Requires-Dist: pytest-cov>=7.0; extra == 'dev'
Requires-Dist: pytest>=8.0; extra == 'dev'
Requires-Dist: ruff>=0.15; extra == 'dev'
Description-Content-Type: text/markdown

# mloopforml

Neural network surrogate hyperparameter optimization via a simple decorator.

```python
import mloopforml as mloop

@mloop.optimize(params={"lr": (0.001, 0.1)}, max_iterations=30, direction="maximize")
def train(lr):
    # your model training here; return the score to optimize
    accuracy = 1.0 - (lr - 0.01) ** 2  # placeholder for a real training run
    return accuracy

result = train()
print(result.best_params, result.best_score)
```
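Conceptually, a surrogate-based optimizer alternates between fitting a cheap model to the observed (hyperparameter, score) pairs and evaluating the objective where that model predicts the best score. The sketch below illustrates the idea with a quadratic polynomial surrogate over a single parameter; it is a toy illustration only, not mloopforml's actual neural-network surrogate or API.

```python
import numpy as np

def surrogate_search(objective, bounds, n_init=5, n_iter=10, seed=0):
    """Toy surrogate loop: fit a quadratic to observed (x, y) pairs,
    then evaluate the objective at the surrogate's predicted optimum."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    xs = list(rng.uniform(lo, hi, n_init))   # random initial design
    ys = [objective(x) for x in xs]
    for _ in range(n_iter):
        coeffs = np.polyfit(xs, ys, deg=2)   # fit the quadratic surrogate
        grid = np.linspace(lo, hi, 200)
        candidate = grid[np.argmax(np.polyval(coeffs, grid))]
        xs.append(candidate)                 # evaluate where the surrogate
        ys.append(objective(candidate))      # predicts the maximum
    best = int(np.argmax(ys))
    return xs[best], ys[best]

best_lr, best_score = surrogate_search(
    lambda lr: 1.0 - (lr - 0.01) ** 2,       # stand-in "training" objective
    bounds=(0.001, 0.1),
)
```

A real surrogate optimizer replaces the quadratic with a learned model and balances exploring uncertain regions against exploiting the predicted optimum; the loop structure, however, is the same.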

## Install

```bash
pip install mloopforml
```

## Requirements

- Python >=3.11
- numpy >=2.0
- scipy >=1.15
