Metadata-Version: 2.4
Name: denograd
Version: 1.0.1
Summary: Instance noise reduction framework based on Deep Learning gradients, agnostic to the network architecture.
Author: JJavier98
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: GNU Affero General Public License v3
Classifier: Operating System :: OS Independent
Requires-Python: >=3.6
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: numpy
Requires-Dist: matplotlib
Requires-Dist: torch
Requires-Dist: ipython
Requires-Dist: tqdm
Dynamic: author
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: license-file
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

# DenoGrad: Deep Gradient Denoising Framework

[![Python](https://img.shields.io/badge/Python-3.6%2B-blue)](https://www.python.org/)
[![License](https://img.shields.io/badge/License-AGPL%20v3-green)](LICENSE)

## 📄 Description

**DenoGrad** is a novel gradient-based denoising framework designed to enhance the robustness and performance of Artificial Intelligence models, with a specific focus on interpretable (white-box) models.

Unlike conventional techniques that simply remove noisy instances or significantly alter the data distribution, DenoGrad leverages the gradients of a reference Deep Learning (DL) model—trained on the target data—to dynamically detect and correct noisy samples.

### 🚀 Key Features

* **Gradient-Based Correction:** Utilizes gradient information from deep models to guide the noise reduction process effectively.
* **Distribution Preservation:** Corrects instances while maintaining the original data distribution, avoiding oversimplification of the problem space.
* **Task Agnostic:** Validated on both tabular and time-series datasets.
* **Interpretable AI Enhancement:** Specifically engineered to boost the performance of interpretable models in noisy environments without sacrificing transparency.
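
The gradient-based correction above can be illustrated on a toy problem. The following is a minimal sketch of the principle in plain NumPy (it is *not* the DenoGrad API): fit a reference model to noisy data, then move each instance down the gradient of its own loss so that corrected samples agree better with the fitted model.

```python
import numpy as np

# Toy sketch of the core idea (NOT the DenoGrad API): nudge each noisy
# instance along the negative gradient of a reference model's loss.
rng = np.random.default_rng(0)

# Ground truth y = 2x; the first 10 instances receive heavy input noise.
x = rng.uniform(-1.0, 1.0, size=(100, 1))
y = 2.0 * x
x_noisy = x.copy()
x_noisy[:10] += rng.normal(0.0, 0.5, size=(10, 1))

# "Reference model": a least-squares fit on the noisy data.
w = np.linalg.lstsq(x_noisy, y, rcond=None)[0]

def input_gradients(x, y, w):
    """Gradient of the per-instance squared error w.r.t. the inputs:
    L_i = (x_i w - y_i)^2  =>  dL_i/dx_i = 2 (x_i w - y_i) w^T."""
    return 2.0 * (x @ w - y) @ w.T

nrr = 0.05                 # noise reduction rate (step size of the correction)
x_clean = x_noisy.copy()
for _ in range(200):       # max_epochs
    x_clean -= nrr * input_gradients(x_clean, y, w)

err_before = np.mean((x_noisy @ w - y) ** 2)
err_after = np.mean((x_clean @ w - y) ** 2)
```

DenoGrad applies the same principle using the gradients of a deep reference model rather than a linear fit.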

## 🛠️ Installation

```bash
pip install denograd
```

## 📖 Basic Usage

```python
import torch
import torch.nn as nn
from denograd import DenoGrad

# Initialize the denoiser with your reference model
denoiser = DenoGrad(
    model=my_deep_model, # DL model fitted to noisy data
    criterion=nn.MSELoss(),
)

# 1. Fit the noisy data
# For Tabular Data
denoiser.fit(
    X=x_noisy, 
    y=y_noisy, 
    is_ts=False
)

# For Time Series (requires window_size)
# denoiser.fit(
#    X=x_noisy,
#    y=['y'], # target column name(s); passing y_noisy directly is also accepted
#    is_ts=True,
#    window_size=24,
#    future=1,
#    stride=1
# )

# 2. Denoise the dataset
x_clean, y_clean, x_gradients, y_gradients = denoiser.transform(
    nrr = 0.05,            # Noise Reduction Rate: acts like a learning rate for the denoising step
    nr_threshold = 0.01,   # Level of noise allowed
    max_epochs = 200,      # Maximum number of denoising epochs
    denoise_y = True,      # Also denoise the target variable (recommended for tabular data)
    batch_size = 1024,
    save_gradients = True  # Keep every gradient computed during the denoising process
)
```
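
With `save_gradients=True`, the returned gradient arrays can be inspected to see which instances the denoiser moved the most. A hedged sketch, assuming `x_gradients` stacks the per-epoch input gradients as `(n_epochs, n_samples, n_features)` (the actual shape returned by `transform` may differ); mock data stands in for the real output here:

```python
import numpy as np

# Assumed shape (n_epochs, n_samples, n_features); mocked with random
# data in place of the array returned by denoiser.transform(...).
rng = np.random.default_rng(0)
x_gradients = rng.normal(size=(200, 1000, 8))

# Total correction magnitude per instance, summed over epochs and features.
magnitude = np.abs(x_gradients).sum(axis=(0, 2))  # shape: (n_samples,)

# The 10 most-corrected instances: likely the noisiest samples.
most_corrected = np.argsort(magnitude)[-10:][::-1]
```

Ranking instances this way can help audit which samples the framework considered noisy before handing the cleaned data to an interpretable model.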

## 📝 Citation

If you use DenoGrad in your research, please cite our paper:

The paper is currently under revision; citation details will be added once it is published.
