Metadata-Version: 2.4
Name: embedl-hub
Version: 2026.4.3
Summary: The official Embedl Hub Python client library.
Author-email: Embedl AB <support@embedl.com>
Project-URL: Homepage, https://hub.embedl.com
Project-URL: Documentation, https://hub.embedl.com/docs
Classifier: Development Status :: 1 - Planning
Classifier: Intended Audience :: Developers
Classifier: License :: Other/Proprietary License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development :: Embedded Systems
Classifier: Topic :: Software Development :: Libraries
Requires-Python: >=3.11
Description-Content-Type: text/markdown
License-File: LICENSE
License-File: NOTICE
Requires-Dist: ai-edge-quantizer==0.4.0
Requires-Dist: ai-edge-litert>=2.0.3
Requires-Dist: asyncssh>=2.22.0
Requires-Dist: onnx~=1.19.0
Requires-Dist: onnx2tf==1.28.3
Requires-Dist: onnx_graphsurgeon==0.5.8
Requires-Dist: onnxsim>=0.4.33
Requires-Dist: platformdirs>=4.5
Requires-Dist: pydantic>=2.8.2
Requires-Dist: PyYAML~=6.0.2
Requires-Dist: qai-hub>=0.40.0
Requires-Dist: requests>=2.32.3
Requires-Dist: rich>=13.7
Requires-Dist: sng4onnx==1.0.4
Requires-Dist: tabulate>=0.9.0
Requires-Dist: typer>=0.12.0
Provides-Extra: dev
Requires-Dist: pytest; extra == "dev"
Requires-Dist: ruff; extra == "dev"
Provides-Extra: docs
Requires-Dist: myst-parser; extra == "docs"
Requires-Dist: pydata-sphinx-theme; extra == "docs"
Requires-Dist: sphinx; extra == "docs"
Requires-Dist: sphinx-click; extra == "docs"
Dynamic: license-file

# Embedl Hub Python library

Optimize and deploy your model on any edge device with the Embedl Hub Python library:

- **Compile** your model for execution on CPU, GPU, NPU, or other AI accelerators
  using the ONNX Runtime, TensorRT, or TFLite backends.
- **Profile** your model's latency and memory usage on real edge devices in the cloud.
- **Invoke** your compiled model to run inference with real input data.

The library logs your metrics, parameters, and results on the
[Embedl Hub](https://hub.embedl.com) website, allowing you to inspect, compare,
and reproduce your results.

For comprehensive getting started guides and API reference, visit the
[Embedl Hub documentation](https://hub.embedl.com/docs).

[Create a free Embedl Hub account](https://hub.embedl.com/docs/setup)
to get started.

## Installation

Install `embedl-hub` with `pip`:

```shell
pip install embedl-hub
```

## Usage

The `embedl-hub` library can be used in two ways: from the terminal via its CLI,
or programmatically via its Python API.

### CLI

The `embedl-hub` (or `ehub`) command provides an end-to-end workflow for
compiling, profiling, and invoking models from the terminal:

```text
Usage: embedl-hub [OPTIONS] COMMAND [ARGS]...

 embedl-hub end-to-end Edge-AI workflow CLI

╭─ Options ────────────────────────────────────────────────────────────────╮
│ --version      -V               Print embedl-hub version and exit.       │
│ --verbose      -v      INTEGER  Increase verbosity (-v, -vv, -vvv).      │
│ --help                          Show this message and exit.              │
╰──────────────────────────────────────────────────────────────────────────╯
╭─ Commands ───────────────────────────────────────────────────────────────╮
│ auth           Store the API key for embedl-hub CLI.                     │
│ init           Configure persistent CLI context.                         │
│ show           Print the active project name and artifact directory.     │
│ compile        Compile a model for on-device deployment.                 │
│ profile        Profile a compiled model on a target device.              │
│ invoke         Run inference on a compiled model.                        │
│ log            Show past runs from the artifact directory.               │
│ list-devices   List available devices.                                   │
╰──────────────────────────────────────────────────────────────────────────╯
```
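A typical session chains the commands above in order. The sketch below is
illustrative only: the command names come from the help text above, but the
positional arguments (`model.onnx`, the output path) and the order of
operations are assumptions, not documented CLI signatures; consult
`embedl-hub COMMAND --help` for the real options.

```shell
# Hypothetical end-to-end workflow; argument names are illustrative assumptions.
embedl-hub auth                     # store your Embedl Hub API key
embedl-hub init                     # configure persistent CLI context
embedl-hub list-devices             # see which target devices are available
embedl-hub compile model.onnx       # compile the model for on-device deployment
embedl-hub profile compiled_model   # measure latency and memory on a device
embedl-hub invoke compiled_model    # run inference with real input data
embedl-hub log                      # review past runs in the artifact directory
```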

### Python API

For programmatic use, import from the `embedl_hub` package. The API provides
compiler, profiler, and invoker components for each supported backend
(ONNX Runtime, TensorRT, TFLite):

```python
from embedl_hub.compile import OnnxRuntimeCompiler
from embedl_hub.profile import OnnxRuntimeProfiler
from embedl_hub.invoke import OnnxRuntimeInvoker
```
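The snippet below sketches how these components might fit together in a script.
Only the import paths above are taken from this README; the constructor calls
and the `compile`/`profile` method names are illustrative assumptions, not the
documented API, so check the official docs for the actual interface.

```python
# Hypothetical sketch -- method names and arguments are assumptions,
# not the documented embedl_hub API.
from embedl_hub.compile import OnnxRuntimeCompiler
from embedl_hub.profile import OnnxRuntimeProfiler

compiler = OnnxRuntimeCompiler()           # assumed no-argument constructor
compiled = compiler.compile("model.onnx")  # assumed method name and signature

profiler = OnnxRuntimeProfiler()
report = profiler.profile(compiled)        # assumed to return latency/memory metrics
```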

See the [Embedl Hub documentation](https://hub.embedl.com/docs) for detailed
guides and examples.

## License

Copyright (C) 2025, 2026 Embedl AB

This software is subject to the [Embedl Hub Software License Agreement](https://hub.embedl.com/embedl-hub-sla.txt).

