Metadata-Version: 2.4
Name: torchref
Version: 0.4.3
Summary: PyTorch-based crystallographic refinement
Author: HansPeterSeidel
License-Expression: MIT
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: numpy<2.4.0,>=2.0.0
Requires-Dist: pandas<2.4.0,>=2.0.0
Requires-Dist: torch<2.10.0,>=2.4.0
Requires-Dist: tqdm<4.68.0,>=4.61.0
Requires-Dist: numba<0.64.0,>=0.59.0
Requires-Dist: gemmi<0.8.0,>=0.5.0
Requires-Dist: scipy<1.18.0,>=1.10.0
Requires-Dist: matplotlib<3.11.0,>=3.7.0
Requires-Dist: reciprocalspaceship<1.1.0,>=0.9.18
Requires-Dist: pyarrow<23.0.0,>=12.0.0
Provides-Extra: dev
Requires-Dist: pytest>=6.0.0; extra == "dev"
Requires-Dist: pytest-cov>=2.12.0; extra == "dev"
Requires-Dist: black>=21.5b2; extra == "dev"
Requires-Dist: isort>=5.9.0; extra == "dev"
Requires-Dist: flake8>=3.9.0; extra == "dev"
Provides-Extra: alignment
Requires-Dist: jax>=0.4.0; extra == "alignment"
Requires-Dist: s2fft>=1.0.0; extra == "alignment"
Requires-Dist: s2ball>=0.0.2; extra == "alignment"
Requires-Dist: spherical>=1.0.0; extra == "alignment"
Requires-Dist: quaternionic>=1.0.0; extra == "alignment"
Provides-Extra: forcefield
Requires-Dist: torchmd-net>=2.0.0; extra == "forcefield"
Provides-Extra: amber
Requires-Dist: openmm>=8.0.0; extra == "amber"
Requires-Dist: pdbfixer; extra == "amber"
Provides-Extra: docs
Requires-Dist: sphinx>=4.0.0; extra == "docs"
Requires-Dist: sphinx-rtd-theme>=1.0.0; extra == "docs"
Requires-Dist: numpydoc>=1.1.0; extra == "docs"
Dynamic: license-file

# TorchRef

**A PyTorch-based crystallographic refinement library**

[![Python 3.10+](https://img.shields.io/badge/python-3.10+-blue.svg)](https://www.python.org/downloads/)
[![PyTorch](https://img.shields.io/badge/PyTorch-2.4+-ee4c2c.svg)](https://pytorch.org/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Documentation](https://readthedocs.org/projects/torchref/badge/?version=latest)](https://torchref.readthedocs.io/)

TorchRef is a crystallographic refinement package built entirely on PyTorch. By leveraging PyTorch's automatic differentiation and GPU acceleration, TorchRef enables seamless integration with machine learning workflows and provides a flexible, extensible framework for crystallographic structure refinement.

# Key Features

- **Native PyTorch Integration**: Built on PyTorch's `nn.Module` architecture, TorchRef integrates naturally with the PyTorch ecosystem, including machine learning models, optimizers, and GPU acceleration.

- **Automatic Differentiation**: Dynamic computational graphs eliminate the need for manually implemented gradient calculations. Define new refinement targets directly—PyTorch handles the derivatives automatically.

- **Modular Architecture**: Following PyTorch's module pattern, components are easily composable and extensible. Add custom targets, restraints, or optimizers without modifying core code.

- **GPU Acceleration**: Leverage CUDA for structure factor calculations, scaling, and optimization—achieving significant speedups for large structures.

- **FFT-based Structure Factors**: Efficient structure factor calculation using Fast Fourier Transform (FFT) methods, enabling rapid F_calc computation even for large unit cells.

- **State Management**: Full `state_dict` support enables saving and loading complete refinement states, including model parameters, scaler settings, and restraints.

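The last few points can be illustrated with a minimal plain-PyTorch sketch (not TorchRef's actual API; the toy density grid and least-squares amplitude target below are illustrative assumptions): structure factors come from an FFT of a real-space density, and because the density carries `requires_grad`, autograd supplies the gradient of any refinement target with no hand-written derivatives.

```python
import torch

# Toy "electron density" on a 16^3 grid, parameterized so gradients flow.
# (Illustrative only -- TorchRef's real models wrap this in nn.Modules.)
density = torch.randn(16, 16, 16, requires_grad=True)

# FFT-based structure factors: F_calc is the Fourier transform of the density.
f_calc = torch.fft.fftn(density)

# Synthetic "observed" amplitudes, standing in for measured data.
f_obs = torch.abs(torch.fft.fftn(torch.randn(16, 16, 16)))

# A simple least-squares amplitude target; autograd handles the derivatives.
loss = torch.mean((torch.abs(f_calc) - f_obs) ** 2)
loss.backward()  # populates density.grad with d(loss)/d(density)

print(density.grad.shape)  # gradient has the same shape as the density grid
```

Running the same code on a CUDA tensor (`density.cuda()`) is all that GPU acceleration requires, which is why the FFT route scales well to large unit cells.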
# Getting Started

| Notebook | Description |
|----------|-------------|
| [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/HatPdotS/TorchRef/blob/main/example_notebooks/basic_usage.ipynb) | Basic Usage - Getting started tutorial |
| [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/HatPdotS/TorchRef/blob/main/example_notebooks/code_examples.ipynb) | Code Examples - Common patterns and recipes |
| [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/HatPdotS/TorchRef/blob/main/example_notebooks/target_exploration.ipynb) | Target Exploration - Exploring refinement targets |
| [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/HatPdotS/TorchRef/blob/main/example_notebooks/structure_factor_calculation.ipynb) | Structure Factor Calculation - FFT-based F_calc |

## Installation

```bash
pip install torchref
```

### Local installation for development

```bash
# Clone the repository
git clone https://github.com/HatPdotS/TorchRef.git
cd TorchRef

# Install with pip
pip install -e .

# Or install with development dependencies
pip install -e ".[dev]"
```

## Dependencies

- Python ≥ 3.10
- PyTorch ≥ 2.4
- NumPy ≥ 2.0
- Gemmi ≥ 0.5
- reciprocalspaceship ≥ 0.9.18
- SciPy ≥ 1.10

## Testing

```bash
# Run all tests
pytest tests/

# Run with coverage
pytest tests/ --cov=torchref

# Run specific test categories
pytest tests/unit/           # Fast unit tests
pytest tests/integration/    # Integration tests
pytest tests/functional/     # Full workflow tests
```

## Contributing

Contributions are welcome! Please follow these guidelines:

1. Follow the [NumPy docstring style](https://numpydoc.readthedocs.io/en/latest/format.html)
2. Add tests for new functionality
3. Ensure all tests pass before submitting

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
