Metadata-Version: 2.3
Name: encoder-converter
Version: 0.2.1
Summary: Convert huggingface encoder to onnx format.
Author: aveitsme
Author-email: aveitsme@gmail.com
Requires-Python: >=3.10,<3.13
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Requires-Dist: onnx (>=1.0.0,<2.0.0)
Requires-Dist: onnxruntime (>=1.0.0,<2.0.0)
Requires-Dist: openvino (>=2025.1.0,<2026.0.0)
Requires-Dist: tokenizers (>=0.2.0,<1.0.0)
Requires-Dist: torch (>=2.0.0,<3.0.0)
Requires-Dist: transformers (>=4.0.0,<5.0.0)
Project-URL: Homepage, https://github.com/AVEitsme/encoder-converter
Project-URL: Repository, https://github.com/AVEitsme/encoder-converter
Description-Content-Type: text/markdown

# Encoder converter
:rocket: An easy way to convert Hugging Face encoder models to other formats.
## Description
Encoder converter is a package that converts a Hugging Face encoder model to other formats (e.g. ONNX).
## Features
Unfinished features will be implemented in future versions.
- [x] Convert an encoder model to ONNX.
- [x] Convert an encoder model to OpenVINO.
- [ ] Convert a model with a custom wrapper.
## Installation
```bash
pip install encoder-converter
```
## Usage
### Run
```bash
convertencoder --model-name project/huggingface_repo --format onnx --output-dir /my/output/dir --cache-dir /cache/dir --output-model-name t5_encoder
```
### Parameters
| Parameter             | Description                                                                                    | Default |
|-----------------------|------------------------------------------------------------------------------------------------|---------|
| `--model-name`        | Hugging Face model name.                                                                       |         |
| `--format`            | Compiled model format. Available: `onnx`, `openvino`.                                          |         |
| `--output-dir`        | Path where the compiled model and tokenizer artifacts are saved.                               |         |
| `--cache-dir`         | Directory in which the downloaded pretrained model configuration is cached during compilation. | `/tmp`  |
| `--output-model-name` | Output model file name. If not specified, it is derived from the `--model-name` parameter.     |         |
If `--output-model-name` is not specified, you can find the compiled model at `output_dir`/`huggingface_repo`.`extension`:
1. `output_dir`: the `--output-dir` parameter.
2. `huggingface_repo`: extracted from the `--model-name` parameter.
3. `extension`: depends on the selected `--format` parameter:
    * `onnx` - `onnx`
    * `openvino` - `xml`

