Metadata-Version: 2.4
Name: open-titans
Version: 0.0.5
Summary: Memory-Augmented Sequence Models in Pytorch
Author-email: Neeze <nampvh4436@gmail.com>
License: MIT License
        
        Copyright (c) 2026 Neeze
        
        Permission is hereby granted, free of charge, to any person obtaining a copy
        of this software and associated documentation files (the "Software"), to deal
        in the Software without restriction, including without limitation the rights
        to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
        copies of the Software, and to permit persons to whom the Software is
        furnished to do so, subject to the following conditions:
        
        The above copyright notice and this permission notice shall be included in all
        copies or substantial portions of the Software.
        
        THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
        IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
        FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
        AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
        LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
        OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
        SOFTWARE.
Project-URL: Homepage, https://github.com/Neeze/OpenTitans
Project-URL: Bug Tracker, https://github.com/Neeze/OpenTitans/issues
Project-URL: Source Code, https://github.com/Neeze/OpenTitans
Keywords: artificial intelligence,deep learning,test time training,linear attention,memory-augmented models,neural memory,titans,transformers
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Science/Research
Classifier: Intended Audience :: Developers
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Operating System :: OS Independent
Requires-Python: >=3.12
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: assoc-scan>=0.0.4
Requires-Dist: axial_positional_embedding>=0.3.10
Requires-Dist: einops>=0.8.0
Requires-Dist: einx>=0.3.0
Requires-Dist: hyper-connections>=0.3.11
Requires-Dist: Ninja
Requires-Dist: rotary-embedding-torch
Requires-Dist: tensordict
Requires-Dist: torch>=2.8
Requires-Dist: tqdm
Requires-Dist: transformers>=4.30.0
Requires-Dist: x-transformers
Dynamic: license-file

<div align="center">

# 🌌 OpenTitans
**The Open-Source Framework for Memory-Augmented Sequence Models**

[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Python 3.12+](https://img.shields.io/badge/python-3.12+-blue.svg)](https://www.python.org/downloads/)
[![PyTorch 2.8+](https://img.shields.io/badge/pytorch-2.8+-ee4c2c.svg)](https://pytorch.org/)
[![Maintenance](https://img.shields.io/badge/Maintained%3F-yes-green.svg)](https://github.com/Neeze/OpenTitans/graphs/commit-activity)
[![PyPI version](https://img.shields.io/pypi/v/open-titans.svg)](https://pypi.org/project/open-titans/)

---

**Democratizing Test-Time Memorization and Neural Memory Architectures.**

[Introduction](#-introduction) • [Documentation](./docs/README.md) • [Features](#-key-features) • [Quick Start](#-quick-start) • [Citations](#-citations--acknowledgements)

</div>

## 🌟 Introduction

**OpenTitans** is a modular, high-performance framework for implementing and exploring the next generation of sequence models. While Transformers revolutionized AI, their attention cost grows quadratically with sequence length, sharply limiting usable context.

Inspired by groundbreaking research from Google and other top labs, OpenTitans focuses on **Memory-Augmented Models** that learn to memorize, optimize, and cache their internal states at test time. Our goal is to provide a "HuggingFace-like" experience for researchers and engineers building the future of infinite-context modeling.
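To make the core idea concrete, here is a minimal, dependency-free sketch of a fast-weight memory update (a delta rule): a memory matrix `M` is adjusted token by token based on its own prediction error, which is the essence of "memorizing at test time." This is a conceptual illustration only; the function names here are hypothetical and do not reflect the OpenTitans API.

```python
# Conceptual delta-rule memory update: M <- M + lr * (value - M @ key) (x) key.
# The memory "learns" each (key, value) pair at inference time, with no
# backprop through the rest of the network.

def outer(u, v):
    """Outer product of two vectors as a nested list."""
    return [[ui * vj for vj in v] for ui in u]

def matvec(M, x):
    """Matrix-vector product: read the memory at a given key."""
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

def memory_update(M, key, value, lr=0.5):
    """One test-time write: move M toward mapping key -> value."""
    pred = matvec(M, key)                      # current recall for this key
    err = [v - p for v, p in zip(value, pred)] # prediction error
    delta = outer(err, key)                    # gradient-like outer product
    return [[m + lr * d for m, d in zip(mr, dr)]
            for mr, dr in zip(M, delta)]
```

With `lr=1.0` and an empty memory, a single update stores the pair exactly, so a subsequent read at the same key returns the stored value; real neural memory modules (as in Titans) replace the linear map with a small MLP and add gating and decay.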

---


## 📖 Documentation

For detailed information on how to use OpenTitans, please refer to our **[Documentation Index](./docs/README.md)**.

*   **[Installation Guide](./docs/Installation.md)**: Setup and dependency management.
*   **[Quickstart](./docs/Quickstart.md)**: Your first model in 5 minutes.
*   **[Titans Variants](./docs/Titans_Variants.md)**: Understanding MAC, MAG, and MAL.
*   **[ATLAS Framework](./docs/Atlas.md)**: Learning to Optimally Memorize Context at Test Time.
*   **[Neural Memory](./docs/Neural_Memory.md)**: Customizing memory models and fast-weight updates.

---

## 📦 Quick Start

### Installation

> [!TIP]
> We recommend using a virtual environment (venv or conda) for the best experience.

```bash
pip install open-titans
```

#### Development Installation

```bash
# Clone the repository
git clone https://github.com/Neeze/OpenTitans.git
cd OpenTitans

# Install in editable mode with dependencies
pip install -e .
```

---

## 🤝 Contributing

We are looking for "Titans" to help us build! 🚀

Whether you want to implement a new paper, optimize a CUDA kernel, or just fix a typo, your contributions are welcome. Check out our [CONTRIBUTING.md](CONTRIBUTING.md) to get started.

---

## 📚 Citations & Acknowledgements

OpenTitans stands on the shoulders of giants. We acknowledge the authors of the following papers for their foundational work:

```bibtex
@misc{behrouz2024titanslearningmemorizetest,
      title={Titans: Learning to Memorize at Test Time}, 
      author={Ali Behrouz and Peilin Zhong and Vahab Mirrokni},
      year={2024},
      url={https://arxiv.org/abs/2501.00663}
}

@misc{behrouz2025itsconnectedjourneytesttime,
      title={It's All Connected: A Journey Through Test-Time Memorization, Attentional Bias, Retention, and Online Optimization}, 
      author={Ali Behrouz and Meisam Razaviyayn and Peilin Zhong and Vahab Mirrokni},
      year={2025},
      url={https://arxiv.org/abs/2504.13173}
}

@misc{behrouz2025atlaslearningoptimallymemorize,
      title={ATLAS: Learning to Optimally Memorize the Context at Test Time}, 
      author={Ali Behrouz and Zeman Li and Praneeth Kacham and Majid Daliri and Yuan Deng and Peilin Zhong and Meisam Razaviyayn and Vahab Mirrokni},
      year={2025},
      url={https://arxiv.org/abs/2505.23735}
}

@misc{behrouz2025nestedlearningillusiondeep,
      title={Nested Learning: The Illusion of Deep Learning Architectures}, 
      author={Ali Behrouz and Meisam Razaviyayn and Peilin Zhong and Vahab Mirrokni},
      year={2025},
      eprint={2512.24695},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2512.24695}, 
}

@misc{behrouz2026memorycachingrnnsgrowing,
      title={Memory Caching: RNNs with Growing Memory}, 
      author={Ali Behrouz and Zeman Li and Yuan Deng and Peilin Zhong and Meisam Razaviyayn and Vahab Mirrokni},
      year={2026},
      eprint={2602.24281},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2602.24281}, 
}
```

---

## 📄 License

OpenTitans is released under the **MIT License**. See [LICENSE](LICENSE) for more details.
