Metadata-Version: 2.4
Name: aceflow
Version: 1.6.2
Summary: A Python library for building and training Seq2Seq models
Author-email: Maaz Waheed <wwork4287@gmail.com>
License: MIT License
Project-URL: Bug Tracker, https://github.com/42Wor/aceflow/issues
Project-URL: Source Code, https://github.com/42Wor/aceflow
Keywords: seq2seq,machine-learning,rust,numpy
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.7
Description-Content-Type: text/markdown
Requires-Dist: numpy>=1.19.0
Requires-Dist: torch>=1.9.0
Requires-Dist: tqdm>=4.60.0
Requires-Dist: h5py>=3.0.0
Requires-Dist: pyyaml>=5.4.0
Requires-Dist: contractions>=0.1.73
Requires-Dist: termcolor>=2.0.0
Requires-Dist: matplotlib>=3.3.0

# AceFlow - Seq2Seq Model Library

<div align="center">

![AceFlow Logo](https://img.shields.io/badge/AceFlow-Seq2Seq-blue)
![Python](https://img.shields.io/badge/Python-3.7%2B-green)
![PyTorch](https://img.shields.io/badge/PyTorch-1.9%2B-red)
![License](https://img.shields.io/badge/License-MIT-yellow)

A powerful Python library for building and training Sequence-to-Sequence models with attention mechanisms.

</div>

## 🚀 Features

- **Multiple RNN Types**: LSTM, GRU, RNN, and bidirectional variants
- **Attention Mechanisms**: Bahdanau and Luong-style attention
- **Custom Model Format**: Save/load models in `.ace` format
- **Advanced Tokenization**: Flexible preprocessing and vocabulary management
- **Production Ready**: Comprehensive training utilities and inference tools
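The attention mechanisms listed above follow standard designs. As a rough illustration of what Bahdanau-style (additive) attention computes, here is a generic PyTorch sketch; this is not AceFlow's internal implementation, just the textbook formulation:

```python
import torch
import torch.nn as nn

class BahdanauAttention(nn.Module):
    """Additive (Bahdanau-style) attention, for illustration only."""
    def __init__(self, hidden_size: int):
        super().__init__()
        self.W_query = nn.Linear(hidden_size, hidden_size, bias=False)
        self.W_key = nn.Linear(hidden_size, hidden_size, bias=False)
        self.v = nn.Linear(hidden_size, 1, bias=False)

    def forward(self, query: torch.Tensor, keys: torch.Tensor):
        # query: decoder state (batch, hidden)
        # keys:  encoder outputs (batch, src_len, hidden)
        scores = self.v(torch.tanh(
            self.W_query(query).unsqueeze(1) + self.W_key(keys)
        ))                                      # (batch, src_len, 1)
        weights = torch.softmax(scores, dim=1)  # normalize over source positions
        context = (weights * keys).sum(dim=1)   # weighted sum: (batch, hidden)
        return context, weights.squeeze(-1)

attn = BahdanauAttention(hidden_size=256)
query = torch.randn(4, 256)      # e.g. a decoder hidden state
keys = torch.randn(4, 10, 256)   # e.g. encoder outputs for 10 source tokens
context, weights = attn(query, keys)
print(context.shape, weights.shape)  # torch.Size([4, 256]) torch.Size([4, 10])
```

The context vector is then typically concatenated with the decoder input or state at each step; Luong-style attention differs mainly in using a multiplicative score instead of the additive `tanh` form.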

## 📖 Documentation

- [Installation Guide](docs/installation.md)
- [Quick Start](docs/quickstart.md)
- [API Reference](docs/api/)
- [User Guides](docs/guides/)
- [Examples](docs/examples/)

## 🎯 Quick Example

```python
from aceflow import Seq2SeqModel
from aceflow.utils import Tokenizer

# Initialize model
model = Seq2SeqModel(
    src_vocab_size=1000,
    tgt_vocab_size=1000,
    hidden_size=256,
    rnn_type='lstm',
    use_attention=True
)

# After training, persist the model in AceFlow's .ace format
model.save("model.ace")

# Load model
loaded_model = Seq2SeqModel.load("model.ace")
```

## 📦 Installation
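AceFlow is distributed as a standard Python package; assuming it is published on PyPI under the name `aceflow`, the usual pip workflow should apply:

```shell
# Install the latest release from PyPI (requires Python 3.7+)
pip install aceflow

# Or install from source for the development version
git clone https://github.com/42Wor/aceflow.git
cd aceflow
pip install -e .
```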
For detailed installation instructions, see [Installation Guide](docs/installation.md).


## 📄 License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.


---

<div align="center">
Made with ❤️ by Maaz Waheed
</div>
