Metadata-Version: 2.4
Name: torchref
Version: 0.3.1
Summary: Tools for multicopy refinement of crystallographic models
Author: HansPeterSeidel
License-Expression: MIT
Requires-Python: >=3.9
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: numpy<2.4.0,>=1.24.0
Requires-Dist: pandas<2.4.0,>=2.0.0
Requires-Dist: torch<2.10.0,>=2.0.0
Requires-Dist: tqdm<4.68.0,>=4.61.0
Requires-Dist: numba<0.64.0,>=0.59.0
Requires-Dist: gemmi<0.8.0,>=0.5.0
Requires-Dist: scipy<1.18.0,>=1.10.0
Requires-Dist: matplotlib<3.11.0,>=3.7.0
Requires-Dist: reciprocalspaceship<1.1.0,>=0.9.18
Requires-Dist: pyarrow<23.0.0,>=12.0.0
Provides-Extra: dev
Requires-Dist: pytest>=6.0.0; extra == "dev"
Requires-Dist: pytest-cov>=2.12.0; extra == "dev"
Requires-Dist: black>=21.5b2; extra == "dev"
Requires-Dist: isort>=5.9.0; extra == "dev"
Requires-Dist: flake8>=3.9.0; extra == "dev"
Provides-Extra: docs
Requires-Dist: sphinx>=4.0.0; extra == "docs"
Requires-Dist: sphinx-rtd-theme>=1.0.0; extra == "docs"
Requires-Dist: numpydoc>=1.1.0; extra == "docs"
Dynamic: license-file

# TorchRef

**A PyTorch-based crystallographic refinement library**

[![Python 3.9+](https://img.shields.io/badge/python-3.9+-blue.svg)](https://www.python.org/downloads/)
[![PyTorch](https://img.shields.io/badge/PyTorch-2.0+-ee4c2c.svg)](https://pytorch.org/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

TorchRef is a crystallographic refinement package built entirely on PyTorch. By leveraging PyTorch's automatic differentiation and GPU acceleration, TorchRef enables seamless integration with machine learning workflows and provides a flexible, extensible framework for crystallographic structure refinement.

## Key Features

- **Native PyTorch Integration**: Built on PyTorch's `nn.Module` architecture, TorchRef integrates naturally with the PyTorch ecosystem, including machine learning models, optimizers, and GPU acceleration.

- **Automatic Differentiation**: Dynamic computational graphs eliminate the need for manually implemented gradient calculations. Define new refinement targets directly—PyTorch handles the derivatives automatically.

- **Modular Architecture**: Following PyTorch's module pattern, components are easily composable and extensible. Add custom targets, restraints, or optimizers without modifying core code.

- **GPU Acceleration**: Leverage CUDA for structure factor calculations, scaling, and optimization—achieving significant speedups for large structures.

- **FFT-based Structure Factors**: Efficient structure factor calculation using Fast Fourier Transform (FFT) methods, enabling rapid F_calc computation even for large unit cells.

- **State Management**: Full `state_dict` support enables saving and loading complete refinement states, including model parameters, scaler settings, and restraints.

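To illustrate the autograd and `state_dict` points above, here is a minimal sketch in plain PyTorch. The class, parameter, and variable names are illustrative only and are not part of the torchref API; the point is that a custom refinement target defined as an `nn.Module` gets its gradients and state management for free.

```python
import torch
import torch.nn as nn

class ToyRefinementTarget(nn.Module):
    """Toy least-squares target |F_obs - k * F_calc|^2 with a refinable scale.

    Illustrative only -- not the torchref API.
    """

    def __init__(self):
        super().__init__()
        # Refinable overall scale factor; autograd tracks it automatically.
        self.scale = nn.Parameter(torch.tensor(1.0))

    def forward(self, f_calc, f_obs):
        return ((f_obs - self.scale * f_calc) ** 2).mean()

target = ToyRefinementTarget()
f_calc = torch.rand(100)
f_obs = 2.0 * f_calc  # synthetic "observed" amplitudes with true scale 2.0

opt = torch.optim.Adam(target.parameters(), lr=0.1)
for _ in range(200):
    opt.zero_grad()
    loss = target(f_calc, f_obs)
    loss.backward()  # gradients come from autograd, not hand-coded formulas
    opt.step()

# target.scale.item() is now close to 2.0

# Full state_dict support: the refinement state can be saved and restored.
state = target.state_dict()
restored = ToyRefinementTarget()
restored.load_state_dict(state)
```

No derivative of the target with respect to the scale factor was written anywhere; swapping in a different residual or adding restraint terms changes nothing about how gradients are obtained.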
## Installation

```bash
# Clone the repository
git clone https://github.com/HatPdotS/TorchRef.git
cd TorchRef

# Install with pip
pip install -e .

# Or install with development dependencies
pip install -e ".[dev]"
```

### Dependencies

- Python ≥ 3.9
- PyTorch ≥ 2.0
- NumPy ≥ 1.24
- Gemmi ≥ 0.5
- reciprocalspaceship ≥ 0.9.18
- SciPy ≥ 1.10

## Getting Started

For demonstrations and usage examples, see the example notebooks in [`example_notebooks/`](example_notebooks/):

- [`basic_usage.ipynb`](example_notebooks/basic_usage.ipynb) - Getting started tutorial
- [`code_examples.ipynb`](example_notebooks/code_examples.ipynb) - Code examples and patterns
- [`target_exploration.ipynb`](example_notebooks/target_exploration.ipynb) - Exploring refinement targets
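The FFT-based structure-factor idea from the feature list can be sketched generically in plain PyTorch (again, not the torchref API): build a real-space electron-density grid from atom positions, then a single FFT yields complex structure factors for the entire reciprocal grid at once. The grid size, Gaussian width, and coordinates below are arbitrary illustrative values.

```python
import torch

n = 32  # grid points per cell edge (cubic cell assumed for simplicity)
coords = torch.tensor([[0.25, 0.25, 0.25],
                       [0.75, 0.75, 0.75]])  # fractional coordinates
sigma = 0.05  # Gaussian atom width in fractional units (illustrative)

# Fractional coordinates of every grid point.
ax = torch.arange(n) / n
x, y, z = torch.meshgrid(ax, ax, ax, indexing="ij")
grid = torch.stack([x, y, z], dim=-1)  # shape (n, n, n, 3)

# Sum a spherical Gaussian at each atom site, with periodic wrapping.
density = torch.zeros(n, n, n)
for atom in coords:
    d = grid - atom
    d = d - d.round()  # minimum-image convention for the periodic cell
    r2 = (d ** 2).sum(-1)
    density += torch.exp(-r2 / (2 * sigma ** 2))

# One FFT gives F_calc on the full reciprocal grid; F(000) is the total
# density, and since the input is real, F(000) has no imaginary part.
f_calc = torch.fft.fftn(density)
```

Because the whole pipeline is differentiable torch ops, gradients of any loss on `f_calc` flow back to the atomic coordinates, which is what makes FFT-based F_calc attractive inside an autograd framework.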

## Testing

```bash
# Run all tests
pytest tests/

# Run with coverage
pytest tests/ --cov=torchref

# Run specific test categories
pytest tests/unit/           # Fast unit tests
pytest tests/integration/    # Integration tests
pytest tests/functional/     # Full workflow tests
```

## Contributing

Contributions are welcome! Please follow these guidelines:

1. Follow the [NumPy docstring style](https://numpydoc.readthedocs.io/en/latest/format.html)
2. Add tests for new functionality
3. Ensure all tests pass before submitting
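For guideline 1, a minimal numpydoc-style docstring looks like this (the function itself is a made-up example, not part of torchref):

```python
import torch

def scale_amplitudes(f_obs, k):
    """Scale observed structure-factor amplitudes.

    Parameters
    ----------
    f_obs : torch.Tensor
        Observed amplitudes.
    k : float
        Overall scale factor.

    Returns
    -------
    torch.Tensor
        The scaled amplitudes ``k * f_obs``.
    """
    return k * f_obs
```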

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
