Metadata-Version: 2.1
Name: pyperch
Version: 0.1.4
Summary: Randomized opt networks with PyTorch
Home-page: https://github.com/jlm429/pyperch
Author: John Mansfield
Author-email: jlm429@gmail.com
License: New BSD
Platform: Any
Description-Content-Type: text/markdown


# pyperch

## Getting Started

### About
Pyperch is a neural network weight optimization package developed to support students taking Georgia Tech's graduate machine learning course, CS 7641.
Three randomized optimization algorithms (randomized hill climbing, simulated annealing, and a genetic algorithm) can be used as drop-in replacements for traditional gradient-based optimizers in PyTorch.
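To illustrate the core idea behind the first of these algorithms, here is a minimal, self-contained sketch of randomized hill climbing applied to a model's weight vector. This is an illustrative example in plain NumPy, not pyperch's actual API; the toy linear model, step size, and iteration count are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: targets come from y = 2*x1 - x2.
X = rng.normal(size=(64, 2))
y = X @ np.array([2.0, -1.0])

def loss(w):
    # Mean squared error of a linear model with weight vector w.
    return np.mean((X @ w - y) ** 2)

w = rng.normal(size=2)   # random initial weights
best = loss(w)

for _ in range(2000):
    # Perturb one randomly chosen weight by a small random step.
    candidate = w.copy()
    i = rng.integers(len(w))
    candidate[i] += rng.normal(scale=0.1)
    c_loss = loss(candidate)
    # Keep the neighbor only if it improves the loss (hill climbing).
    if c_loss < best:
        w, best = candidate, c_loss

print(best)
```

Instead of following a gradient, the loop proposes a random neighbor of the current weights and accepts it only when the loss improves. Simulated annealing extends this by sometimes accepting worse neighbors, and a genetic algorithm maintains a population of weight vectors rather than a single one.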
### Install

```
pip install pyperch
```
### Examples

For starter code and examples, see the following notebooks:

* [Backprop Neural Net](notebooks/backprop_network.ipynb)
* [RHC Optimized Neural Net](notebooks/rhc_opt_network.ipynb)
* [SA Optimized Neural Net](notebooks/sa_opt_network.ipynb)
* [GA Optimized Neural Net](notebooks/ga_opt_network.ipynb)
* [Regression Examples](notebooks/regression_examples.ipynb)


## Contributing

Pull requests are welcome. To contribute:

* Fork pyperch
* Create a branch (`git checkout -b branch_name`)
* Commit changes (`git commit -m "Comments"`)
* Push to the branch (`git push origin branch_name`)
* Open a pull request
