Metadata-Version: 2.4
Name: p2o
Version: 0.1.1
Classifier: Programming Language :: Rust
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: Environment :: Console
Summary: A PaddlePaddle New IR (PIR) to ONNX model converter.
Keywords: paddlepaddle,onnx,converter,deep-learning
Home-Page: https://github.com/greatv/p2o
Author-email: Wang Xin <xinwang614@gmail.com>
License-Expression: Apache-2.0
Requires-Python: >=3.10
Description-Content-Type: text/markdown; charset=UTF-8; variant=GFM
Project-URL: Documentation, https://github.com/greatv/p2o#readme
Project-URL: Issues, https://github.com/greatv/p2o/issues
Project-URL: Source, https://github.com/greatv/p2o

# p2o

A PaddlePaddle New IR (PIR) to ONNX model converter, written in Rust.

Converts PaddlePaddle inference models (`inference.json` + `inference.pdiparams`) to the ONNX format.

## Installation

### Pre-built binaries

Download from [GitHub Releases](https://github.com/greatv/p2o/releases).

### From PyPI

```bash
pip install p2o
```

### From crates.io

```bash
cargo install p2o
```

### Build from source

```bash
git clone https://github.com/greatv/p2o.git
cd p2o
cargo build --release
# The resulting binary is at target/release/p2o
```

## Usage

```bash
p2o <model.json> <model.pdiparams> <output.onnx> [--opset 17] [--strict]
```

### Arguments

| Argument | Description |
|---|---|
| `model.json` | Path to the PaddlePaddle `.json` model file |
| `model.pdiparams` | Path to the PaddlePaddle `.pdiparams` weight file |
| `output.onnx` | Path to the output `.onnx` model file |
| `--opset <N>` | Target ONNX opset version (≥ 10, default: 17) |
| `--strict` | Reject lossy conversions (e.g. multinomial → ArgMax) |

### Example

```bash
p2o inference_models/PP-OCRv5_server_det_infer/inference.json \
    inference_models/PP-OCRv5_server_det_infer/inference.pdiparams \
    output.onnx --opset 17
```
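When p2o is part of a Python export pipeline, the CLI shown above can be driven via `subprocess`. The sketch below is a minimal, hypothetical helper (the `build_p2o_cmd` name and the file paths are illustrative, not part of p2o itself); it only assembles the arguments documented in the Usage section:

```python
import subprocess

def build_p2o_cmd(model_json, pdiparams, output_onnx, opset=17, strict=False):
    """Assemble a p2o command line from the documented CLI arguments."""
    cmd = ["p2o", model_json, pdiparams, output_onnx, "--opset", str(opset)]
    if strict:
        cmd.append("--strict")  # reject lossy conversions
    return cmd

cmd = build_p2o_cmd("inference.json", "inference.pdiparams", "out.onnx", strict=True)
print(cmd)
# With p2o on PATH and real model files, run it with:
# subprocess.run(cmd, check=True)
```

After conversion, the output can be sanity-checked with standard ONNX tooling, e.g. `onnx.checker.check_model` or by loading it in ONNX Runtime.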

