Metadata-Version: 2.4
Name: neural-monkey
Version: 0.2.0
Summary: A neural network library for beginners.
Home-page: https://github.com/19919rohit/Neural-Monkey
Author: Neunix Studios
Author-email: neunixstudios@gmail.com
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Requires-Python: >=3.7
Description-Content-Type: text/markdown

# Monkey – Simple Neural Networks for Beginners

Monkey is a lightweight Python library for building and training simple neural networks.  
It is designed to help beginners learn concepts such as layers, activations, forward propagation, backpropagation, and simple attention mechanisms in a fun, interactive way.

---

## Features

- Create fully connected neural networks (Dense layers)  
- Choose activation functions: **ReLU**, **Sigmoid**, **Tanh**  
- Train networks using gradient descent with adjustable learning rate and epochs  
- Make predictions on new inputs  
- Optional lightweight **AttentionBlock** for sequence inputs  
- Beginner-friendly API with minimal setup  
- Fully Python-based, no external dependencies  

---

## Installation

```bash
pip install neural-monkey
```

---

## Quick Start Examples

### 1. Predict the sum of two numbers

```python
from monkey.nn import NeuralNet

# Training data: pairs of numbers and their sums as targets
x_train = [[2, 8], [9, 3], [7, 4], [1, 1]]
y_train = [[sum(pair)] for pair in x_train]

# Create network
nn = NeuralNet(input_size=2)
nn.add_layer(neurons=5, activation='relu')
nn.add_layer(neurons=1, activation='relu', layer='output')

# Train the network
nn.train(x_train, y_train, epochs=500, lr=0.1)

# Predict
print("Prediction for [3,5]:", nn.predict([3,5])[0])
```

### 2. Predict using Sigmoid activation

```python
# Reuses x_train / y_train from the first example
nn = NeuralNet(input_size=2)
nn.add_layer(neurons=4, activation='sigmoid')
nn.add_layer(neurons=1, activation='sigmoid', layer='output')

# Note: sigmoid outputs lie in (0, 1), so a sigmoid output layer cannot
# reach raw sums like 4 -- expect saturated predictions on this data.
nn.train(x_train, y_train, epochs=1000, lr=0.05)
print("Prediction for [2,2]:", nn.predict([2,2])[0])
```

### 3. Using AttentionBlock for sequences

```python
from monkey.attention import AttentionBlock

seq_input = [[0.8, 0.2, 0.1], [0.5, 0.1, 0.3], [0.2, 0.7, 0.6]]
attn = AttentionBlock(input_size=3, output_size=3)
seq_output = attn.forward(seq_input)

print("Attention output:", seq_output)
```
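The exact math inside `AttentionBlock` is not shown here, but the general idea of dot-product attention can be sketched in plain Python. This is a conceptual illustration of the mechanism, not the library's actual implementation; the helper names `softmax`, `dot`, and `attention` are ours:

```python
import math

def softmax(xs):
    # Subtract the max before exponentiating for numerical stability
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention(seq):
    # For each position, score every position by dot product, turn the
    # scores into weights with softmax, and mix the rows accordingly.
    out = []
    for query in seq:
        weights = softmax([dot(query, key) for key in seq])
        mixed = [sum(w * row[i] for w, row in zip(weights, seq))
                 for i in range(len(query))]
        out.append(mixed)
    return out

seq_input = [[0.8, 0.2, 0.1], [0.5, 0.1, 0.3], [0.2, 0.7, 0.6]]
print(attention(seq_input))
```

Each output row is a weighted average of all input rows, with weights determined by how similar the rows are, which is the core idea a sequence attention block builds on.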

---

## API Reference

| Class / Function      | Description                                                                 |
|-----------------------|-----------------------------------------------------------------------------|
| `NeuralNet`           | Core class for creating and training fully connected networks              |
| `NeuralNet.add_layer` | Add a layer to the network; choose neurons, activation, layer type          |
| `NeuralNet.predict`   | Make predictions for a given input                                         |
| `NeuralNet.train`     | Train the network on input/output data                                      |
| `AttentionBlock`      | Simple attention mechanism for sequences                                    |
| `relu(x)`             | Rectified Linear Unit activation function                                    |
| `sigmoid(x)`          | Sigmoid activation function                                                 |
| `tanh(x)`             | Tanh activation function                                                    |

**Layer Parameters in `add_layer`:**

- `neurons`: Number of neurons in the layer  
- `activation`: One of `'relu'`, `'sigmoid'`, `'tanh'`  
- `layer`: `'hidden'` (default) or `'output'`  
- `input_size`: Optional, only needed for first layer  

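The activation helpers in the table compute standard formulas. As a reference, here is what each one evaluates, written in plain Python with the standard library only (a sketch of the math, independent of the library's own `relu`/`sigmoid`/`tanh`):

```python
import math

def relu(x):
    # Zero for negative inputs, identity for positive ones
    return max(0.0, x)

def sigmoid(x):
    # Squashes any real number into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Squashes inputs into (-1, 1); zero-centered, unlike sigmoid
    return math.tanh(x)

print(relu(-2.0), sigmoid(0.0), tanh(0.0))  # → 0.0 0.5 0.0
```

The bounded ranges explain the tip above about output layers: a sigmoid output can never exceed 1, while ReLU passes positive values through unchanged.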
---

## Testing Monkey

You can run the provided test script:

```bash
python tests/test.py
```

Sample test scenarios include:

- Predicting sums of two numbers  
- Predicting multiplication or XOR patterns  
- Testing different activation functions  
- Testing AttentionBlock on sequence data  

---

## Learning Tips for Beginners

- Start with a single hidden layer and few neurons  
- Use small datasets (like sum of two numbers) for testing  
- Adjust `lr` (the learning rate) and `epochs` to see their effect on convergence  
- Observe how activation functions change the output and training speed  
- Experiment with the AttentionBlock for sequence-based learning  
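The interplay of `lr` and `epochs` is easiest to see on the smallest possible model: a single weight fitted by gradient descent. This standalone demo does not use the library; it just shows the update rule `nn.train` is built around:

```python
# Fit y = w * x to the rule y = 2x with plain gradient descent.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0    # start from an uninformed weight
lr = 0.05  # learning rate: size of each correction step
for epoch in range(200):
    # Gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 3))  # converges toward 2.0
```

Try changing `lr`: tiny values need many more epochs, and in this particular setup values much above 0.2 make the loop diverge instead of converge.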

---

## Contributing

Monkey is beginner-friendly and open to contributions!  

- Experiment with new activation functions  
- Add utilities or visualization helpers  
- Improve training efficiency  

GitHub repository: https://github.com/19919rohit/Neural-Monkey

---

## License

MIT License
