Metadata-Version: 2.4
Name: reynoldsflow
Version: 0.0.2
Summary: Efficient finite-difference solver for the Reynolds equation in thin fluid films
Author-email: "Vladislav A. Yastrebov" <vladislav.yastrebov@minesparis.psl.eu>
License-Expression: BSD-3-Clause
Project-URL: Homepage, https://github.com/vyastreb/reynoldsflow
Project-URL: Repository, https://github.com/vyastreb/reynoldsflow
Project-URL: Issues, https://github.com/vyastreb/reynoldsflow/issues
Keywords: Reynolds equation,thin film,fluid mechanics,lubrication,finite difference,percolation,sparse solver,scientific computing
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Science/Research
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Scientific/Engineering
Classifier: Topic :: Scientific/Engineering :: Mathematics
Classifier: Topic :: Scientific/Engineering :: Physics
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: numpy>=1.20.0
Requires-Dist: scipy>=1.7.0
Requires-Dist: numba>=0.56.0
Requires-Dist: scikit-image>=0.19.0
Requires-Dist: pyamg>=4.0.0
Requires-Dist: matplotlib>=3.5.0
Requires-Dist: rfgen>=0.1.0
Provides-Extra: solvers
Requires-Dist: pypardiso; extra == "solvers"
Requires-Dist: petsc4py; extra == "solvers"
Requires-Dist: scikit-sparse; extra == "solvers"
Provides-Extra: dev
Requires-Dist: pytest>=7.0.0; extra == "dev"
Provides-Extra: all
Requires-Dist: reynoldsflow[dev,solvers]; extra == "all"
Dynamic: license-file

![Reynolds Fluid Solver](./docs/img/header.jpg)

# Finite-Difference Reynolds Fluid Solver

## Description

An efficient Python code to solve the Reynolds (diffusion-type) equation on Cartesian and polar grids:

+ Reynolds equation:  
  ```math
  \nabla\cdot\left(g^3\nabla p\right)=0
  ```
+ Inlet/outlet pressure:  
  ```math
  p(x=0)=1,\quad p(x=1)=0
  ```
+ Periodic boundary conditions:  
  ```math
  p(y=0)=p(y=1),\quad \frac{\partial p}{\partial y}(x,y=0)=\frac{\partial p}{\partial y}(x,y=1)
  ```
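
The discretized problem can be sketched with a standard five-point finite-difference stencil. Below is a minimal, hedged illustration (not the package's actual implementation), assuming a strictly positive gap field and harmonic averaging of the conductivity $g^3$ on cell faces; the function name `solve_reynolds_sketch` is ours:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def solve_reynolds_sketch(g):
    """Solve div(g^3 grad p) = 0 on a uniform grid.

    Rows (y) are periodic; columns (x) carry Dirichlet data:
    p = 1 at the first column (inlet), p = 0 at the last (outlet).
    Assumes g > 0 everywhere (no percolation check here).
    """
    n, m = g.shape
    k = g**3                                   # local conductivity
    nun = m - 2                                # unknown columns 1..m-2
    idx = lambda i, j: i * nun + (j - 1)
    A = sp.lil_matrix((n * nun, n * nun))
    b = np.zeros(n * nun)
    for i in range(n):
        for j in range(1, m - 1):
            row = idx(i, j)
            for di, dj in ((0, -1), (0, 1), (-1, 0), (1, 0)):
                ii, jj = (i + di) % n, j + dj  # periodic wrap in y only
                # harmonic mean of the conductivities across the face
                kf = 2 * k[i, j] * k[ii, jj] / (k[i, j] + k[ii, jj])
                A[row, row] -= kf
                if jj == 0:
                    b[row] -= kf * 1.0         # inlet Dirichlet p = 1
                elif jj == m - 1:
                    pass                       # outlet Dirichlet p = 0
                else:
                    A[row, idx(ii, jj)] += kf
    p = spla.spsolve(A.tocsr(), b)
    full = np.empty((n, m))
    full[:, 0], full[:, -1] = 1.0, 0.0
    full[:, 1:-1] = p.reshape(n, nun)
    return full
```

For a uniform gap ($g \equiv 1$) this reduces to the Laplace equation, and the recovered pressure is linear between the inlet and outlet values.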


<!-- ![equation to be solved](./docs/img/eq.png) -->

## What can it do?

+ It takes as input a gap field $g$.
+ It analyzes the connectivity of the gap field, removes isolated islands, and checks for percolation (i.e., whether the flow problem can be solved at all).
+ It dilates the non-zero gap field to properly handle the impenetrability of channel walls, which avoids having to erode the domain for the flux calculation.
+ It applies an inlet pressure $p_i=1$ on one side $x=0$ and an outlet pressure $p_0=0$ on the opposite side $x=1$, with periodic boundary conditions on the lateral sides $y=\{0,1\}$.
+ It constructs a sparse matrix with conductivity proportional to $g^3$.
+ It selects and tunes different solvers (direct, and iterative with appropriate preconditioners) to solve the resulting linear system of equations efficiently.
+ It computes the total flux properly.
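
The percolation check described above can be approximated with standard connected-component labeling. A hedged sketch follows (the package's own analysis may differ; `scipy.ndimage.label` stands in for whatever labeling it actually uses, and `percolates` is our name):

```python
import numpy as np
from scipy import ndimage

def percolates(gap):
    """Return True if the open set (gap > 0) connects the inlet column
    (x = 0) to the outlet column (x = 1).

    Note: plain 4-connected labeling; periodicity in y is ignored here,
    so this sketch is slightly conservative.
    """
    labels, _ = ndimage.label(gap > 0)
    inlet = set(labels[:, 0]) - {0}      # component ids touching x = 0
    outlet = set(labels[:, -1]) - {0}    # component ids touching x = 1
    return bool(inlet & outlet)
```

Isolated islands are then the components whose label appears in neither boundary set; they can be zeroed out before assembling the matrix.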

## Usage

1. Install the package
```bash
pip install reynoldsflow
```
The default solver is `scipy.amg-rs` (SciPy CG + Ruge-Stüben AMG via `pyamg`), included in the base install.
For optional high-performance solvers (PARDISO, PETSc, CHOLMOD):
```bash
pip install reynoldsflow[solvers]
```
With `[solvers]`, `solver="auto"` selects `petsc-cg.hypre` as the fastest option.

2. Run a minimal example (flow around a circular inclusion)
```python
import numpy as np
import matplotlib.pyplot as plt
from reynoldsflow import transport as FS

n = 100
X, Y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
gaps = (np.sqrt((X - 0.5)**2 + (Y - 0.5)**2) > 0.2).astype(float)

_, _, flux = FS.solve_fluid_problem(gaps, solver="scipy.amg-rs")
if flux is not None:
    plt.imshow(np.sqrt(flux[:, :, 0]**2 + flux[:, :, 1]**2),
               origin='lower', cmap='jet')
    plt.show()
```

3. Run the test suite
```bash
python -m pytest -q
```

4. Or run individual tests manually:
+ Flux evolution problem: `/tests/test_evolution.py`
+ Flux problem on a Cartesian grid: `/tests/test_solve.py`
+ Flux problem on a polar grid: `/tests/polar_flow.py`
+ All solvers: `/tests/test_solvers.py`

## Available Solvers and Preconditioners

The fluid flow solver supports several linear system solvers and preconditioners for efficient and robust solution of large sparse systems:

| Solver String | Solver Type | Preconditioner | Backend | Description |
|---------------|-------------|----------------|---------|-------------|
| `pardiso` | Direct | - | Intel MKL | 🥇 PARDISO direct solver. The fastest for larger problems, but consumes a lot of memory. |
| `petsc-cg.hypre` | Iterative (CG) | HYPRE | PETSc | 🥇 CG with HYPRE BoomerAMG. The fastest for moderate problems. |
| `scipy.amg-rs` | Iterative (CG) | AMG (Ruge-Stüben) | SciPy/PyAMG | CG with Ruge-Stüben AMG. Only about twice as slow as the fastest. |
| `scipy.amg-smooth_aggregation` | Iterative (CG) | AMG (Smoothed Aggregation) | SciPy/PyAMG | CG with Smoothed Aggregation AMG. Memory efficient, but relatively slow.|
| `cholesky` | Direct | - | scikit-sparse | CHOLMOD Cholesky decomposition. Slightly lower memory consumption for huge problems, but it is slow. |
| `petsc-cg.gamg` | Iterative (CG) | GAMG | PETSc | CG with Geometric Algebraic Multigrid. Not very reliable in performance, 2-3 times slower than the fastest solver. |
| `petsc-mumps` | Direct | - | PETSc/MUMPS | MUMPS direct solver via PETSc. For moderate problems, five times slower than the fastest solver. |
| `petsc-cg.ilu` | Iterative (CG) | ILU | PETSc | CG with Incomplete LU factorization. The slowest. |

Representative CPU times for a relatively small problem with an $N\times N = 2000\times 2000$ grid (relative tolerance for iterative solvers set to 1e-8).

| **Solver**                    | **CPU time (s)** |
|-------------------------------|-----------------:|
| petsc-cg.hypre                | 4.46 |
| pardiso                       | 8.53 |
| scipy.amg-rs                  | 8.96 |
| petsc-cg.gamg                 | 11.96 |
| scipy.amg-smooth_aggregation  | 15.48 |
| cholesky                      | 20.61 |
| petsc-mumps                   | 26.14 |
| petsc-cg.ilu                  | 134.98 |

**Rules of thumb:** 
- For fastest computation: use `pardiso` (consumes a lot of memory) or `petsc-cg.hypre` (the only difficulty is installing PETSc);
- For best memory efficiency: use `scipy.amg-rs`;
- For small-scale problems (for $N<2000$): use `pardiso`;
- For large-scale problems (for $N>2000$): use `petsc-cg.hypre`;
- Avoid `petsc-cg.ilu`.
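
These rules of thumb can be folded into a tiny selection helper. This is only a sketch (the package's own `solver="auto"` logic may differ), and the availability flags are the caller's responsibility; the function name `pick_solver` is ours:

```python
def pick_solver(n, memory_limited=False,
                petsc_available=False, pardiso_available=False):
    """Heuristic solver-string choice following the rules of thumb above."""
    if memory_limited:
        return "scipy.amg-rs"        # best memory efficiency
    if n < 2000 and pardiso_available:
        return "pardiso"             # fastest for small-scale problems
    if petsc_available:
        return "petsc-cg.hypre"      # fastest for large-scale problems
    if pardiso_available:
        return "pardiso"
    return "scipy.amg-rs"            # always available in the base install
```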

The most reliable solvers for big problems are `petsc-cg.hypre` and `pardiso`. Below are timings obtained on rough "contact" problems on an Intel(R) Xeon(R) Platinum 8488C. Only the solver's time is shown (relative tolerance for PETSc-CG.Hypre set to 1e-8).

<table>
  <thead>
    <tr><th rowspan="2">N</th><th colspan="2">CPU time (s)</th></tr>
    <tr><th>PETSc-CG.Hypre</th><th>Intel MKL Pardiso</th></tr>
  </thead>
  <tbody>
    <tr><td>20 000</td><td>1059.22</td><td>∅</td></tr>
    <tr><td>10 000</td><td>278.18</td><td>112.38</td></tr>
    <tr><td>5 000</td><td>70.62</td><td>28.42</td></tr>
    <tr><td>2 500</td><td>17.72</td><td>6.34</td></tr>
    <tr><td>1 250</td><td>4.47</td><td>1.93</td></tr>
  </tbody>
</table>

∅ $-$ `pardiso` could not run as it required more than 256 GB of memory.

**CPU/RAM Performance**

Performance of the code on a truncated rough surface is shown below: the peak memory consumption and the CPU time required for the connectivity analysis, matrix construction, and linear solve. The actual number of DOFs is reported, which corresponds to approximately 84% of the square grid $N\times N$ for $N\in\{500, 1\,000, 2\,000, 4\,000, 6\,000, 8\,000, 16\,000\}$ (relative tolerance for iterative solvers set to 1e-8).

![CPU and RAM performance of the solver](./docs/img/CPU_RAM_real_dof_performance.png)


## Illustration

An example of a fluid flow simulation on an $N\times N = 8\,000 \times 8\,000$ grid featuring a truncated self-affine rough surface with a rich spectrum. The solution time on my laptop with a PETSc solver is only 97 seconds, with a peak memory consumption of 25.8 GB.

![Solution for 64 million grid points](./docs/img/illustration.jpg)

Another example on an $N\times N = 20\,000 \times 20\,000$ grid. Sequential simulation time is $\approx 17$ minutes on an Intel(R) Xeon(R) Platinum 8488C, with peak memory below 230 GB using the `petsc-cg.gamg` solver.

![Solution for 400 million grid points](./docs/img/illustration_2.jpg)

## Info

+ Author: Vladislav A. Yastrebov (CNRS, Mines Paris - PSL)
+ AI usage: Cursor & Copilot (different models), ChatGPT 4o, 5, Claude Sonnet 3.7, 4, 4.5
+ License: BSD 3-clause
+ Date: Sept-Nov 2025



