Metadata-Version: 2.4
Name: anyschedule
Version: 0.1.0
Summary: Hyperparameter scheduler composer for PyTorch Optimizers
Author-email: "Shih-Ying Yeh(KohakuBlueLeaf)" <apolloyeh0123@gmail.com>
License: Apache License 2.0
Project-URL: Homepage, https://github.com/KohakuBlueleaf/AnySchedule
Requires-Python: >=3.10
Description-Content-Type: text/markdown
Requires-Dist: numpy
Requires-Dist: torch
Requires-Dist: toml

# [WIP] AnySchedule

***Still a work in progress; no documentation or usage instructions will be provided until the first fully workable API is finished***

A composer tool for custom, complex schedulers (such as the WSD LR scheduler and more).

It is designed to schedule any hyperparameter in the optimizer, so scheduling weight decay or other hyperparameters is also possible.
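AnySchedule's config format and API are not documented yet, so as a rough illustration of what a WSD (warmup-stable-decay) schedule computes, here is a minimal plain-Python sketch. The function name and parameters below are hypothetical, not part of AnySchedule's API:

```python
def wsd_schedule(warmup, stable, decay, min_scale=0.1):
    """Step -> LR multiplier for a warmup-stable-decay (WSD) schedule.

    Linear warmup to 1.0, a constant plateau, then linear decay to min_scale.
    """
    def scale(step):
        if step < warmup:
            # Linear warmup from 0 to 1.
            return step / max(1, warmup)
        if step < warmup + stable:
            # Stable plateau at the full learning rate.
            return 1.0
        # Linear decay from 1.0 down to min_scale.
        t = min(1.0, (step - warmup - stable) / max(1, decay))
        return 1.0 - (1.0 - min_scale) * t
    return scale

# Sample the schedule at a few points to see its shape.
s = wsd_schedule(warmup=10, stable=80, decay=10)
print([round(s(t), 2) for t in (0, 5, 10, 50, 95, 100)])
# → [0.0, 0.5, 1.0, 1.0, 0.55, 0.1]
```

A multiplier function like this can be passed to `torch.optim.lr_scheduler.LambdaLR` to drive an optimizer's learning rate; scheduling other hyperparameters (e.g. weight decay) is what AnySchedule itself aims to compose.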

## Example

A simple example of a "Multi-WSD" LR scheduler:

| ![1737822139812](image/README/1737822139812.png) | ![1737822145021](image/README/1737822145021.png) |
| ---------------------------------------------- | ---------------------------------------------- |
