Metadata-Version: 2.4
Name: aidge_model_explorer
Version: 0.1.0
Summary: Aidge module for model explorer: https://github.com/google-ai-edge/model-explorer.
Project-URL: Homepage, https://www.deepgreen.ai/en/platform
Project-URL: Documentation, https://eclipse.dev/aidge/
Project-URL: Repository, https://gitlab.eclipse.org/eclipse/aidge/aidge_model_explorer
Project-URL: Issues, https://gitlab.eclipse.org/eclipse/aidge/aidge_model_explorer/-/issues/
Project-URL: Changelog, https://gitlab.eclipse.org/eclipse/aidge/aidge_model_explorer/-/releases
Classifier: Development Status :: 2 - Pre-Alpha
Classifier: Programming Language :: Python :: 3
Requires-Python: >=3.10
Description-Content-Type: text/markdown
Requires-Dist: ai-edge-model-explorer
Provides-Extra: test
Requires-Dist: pytest; extra == "test"

# Aidge Model Adapter

![Pipeline status](https://gitlab.eclipse.org/eclipse/aidge/aidge_model_explorer/badges/main/pipeline.svg?ignore_skipped=true) ![Python coverage](https://gitlab.eclipse.org/eclipse/aidge/aidge_model_explorer/badges/main/coverage.svg?job=test:ubuntu_python&key_text=Python+coverage&key_width=100)
[![Latest Release](https://gitlab.eclipse.org/eclipse/aidge/aidge_model_explorer/-/badges/release.svg)](https://gitlab.eclipse.org/eclipse/aidge/aidge_model_explorer/-/releases)


A plugin for the Google Model Explorer framework to visualize Aidge graphs.
For more information on how to use Model Explorer, refer to the framework's [official documentation](https://github.com/google-ai-edge/model-explorer/wiki/2.-User-Guide).

## Installation

To install this plugin, use the PyPI repository:

```bash
pip install aidge_core aidge_backend_cpu aidge_onnx aidge_model_explorer
```

Of course, this package is only a plugin: the other Aidge packages (``aidge_core``, ``aidge_backend_cpu``, ``aidge_onnx``) provide the graphs to visualize.

### Installation for developers

If you want to install this module locally, run this command at the root of the git project:

```bash
pip install -e .
```

**Note:** `-e` enables an editable install, meaning you do not have to reinstall each time you change the Python code.

## 🚀 Features

### Visualize ONNX file

You can use the ``model-explorer`` CLI directly with the ``aidge_model_explorer`` extension to visualize an ONNX graph like so:

```bash
model-explorer aidge_mobilenetV2.onnx --extensions=aidge_model_explorer
```

This lets you see how Aidge imports your ONNX graph, including any MetaOperator nodes created during import. GenericOperator nodes also have a specific background, making it easy to spot operators that are not natively supported by Aidge.

### Visualize wrapper method

The ``aidge_model_explorer.visualize`` function wraps every call to the Model Explorer API, so a single call is enough to visualize your graph.

**Note:** This function automatically embeds the visualization in a Jupyter Notebook cell if run in a notebook; otherwise it opens a page in your browser.

```python
import aidge_core
import aidge_model_explorer

lstm = aidge_core.LSTM(in_channels=4, hidden_channels=8, seq_length=5)
model = aidge_core.get_connected_graph_view(lstm)

aidge_model_explorer.visualize(model, "LSTM")
```

:warning: When using model-explorer from a remote server, port forwarding is not always automatic. Make sure to open the right port (you can force which port to use with the ``port`` argument).

### Visualize from config

You can also use the model_explorer API through ``aidge_model_explorer.config()``, an overridden configuration class that is fully compatible with the model_explorer API.

```python
import aidge_core
import aidge_model_explorer
import model_explorer


lstm = aidge_core.LSTM(in_channels=4, hidden_channels=8, seq_length=5)
lstm_model = aidge_core.get_connected_graph_view(lstm)

config = aidge_model_explorer.config()
config.add_graphview(lstm_model, "lstm")
model_explorer.visualize_from_config(config)
```

### Synchronize graphs with node ids

Model Explorer supports an option to synchronize two graphs using node ids.
You can use this feature to visualize a graph before and after modifications:

```python
import aidge_core
import aidge_model_explorer
import model_explorer


lstm = aidge_core.LSTM(in_channels=4, hidden_channels=8, seq_length=5)
lstm_model = aidge_core.get_connected_graph_view(lstm)


config = aidge_model_explorer.config()
# Note: the graphview must be added before any manipulation to keep the node ids
config.add_graphview(lstm_model, "lstm")

aidge_core.expand_metaops(lstm_model)
config.add_graphview(lstm_model, "lstm_expanded")

model_explorer.visualize_from_config(config)
```

Once the GUI is opened, you can explore the two models like this:

![Node styling](./static/sync.gif)

**Note:** Screen recording taken using ``v0.1.5`` of model explorer.

### Add attributes & metadata

Using an ``aidge_model_explorer.ConverterConfig``, you can attach custom attributes and output metadata to nodes before visualization:
```python
import aidge_core
import aidge_onnx
import aidge_model_explorer
import model_explorer


model = aidge_onnx.load_onnx("aidge_mobilenetV2.onnx")

conv_config = aidge_model_explorer.ConverterConfig()
conv_config.add_attribute(
    "isConv",
    lambda node: "" if node.type() in ("Conv2D", "ConvDepthWise2D") else None)

conv_config.add_output_metadata(
    "mean",
    lambda _, tensor: str(tensor.mean()))

config = aidge_model_explorer.config()
config.add_graphview(model, "model1", conv_config)

model_explorer.visualize_from_config(config)
```

Once the GUI is opened, you can then play with the node styler to easily see the nodes that are convolutions.

![Node styling](./static/coloring.gif)

**Note:** Screen recording taken using ``v0.1.5`` of model explorer.
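Judging from the example above, the callback contract can be sketched standalone: an attribute callback receives a node and returns the string to attach (returning ``None`` appears to skip the node), while an output-metadata callback receives the node and one of its output tensors. A minimal sketch with hypothetical stub classes (``StubNode`` and ``StubTensor`` are illustrations, not part of the Aidge API):

```python
# Hypothetical stubs standing in for Aidge nodes/tensors, for illustration only.
class StubNode:
    def __init__(self, node_type):
        self._type = node_type
    def type(self):
        return self._type

class StubTensor:
    def __init__(self, values):
        self._values = values
    def mean(self):
        return sum(self._values) / len(self._values)

# The same callbacks as in the example above.
is_conv = lambda node: "" if node.type() in ("Conv2D", "ConvDepthWise2D") else None
mean_metadata = lambda _node, tensor: str(tensor.mean())

print(repr(is_conv(StubNode("Conv2D"))))                  # -> ''   (attribute attached)
print(repr(is_conv(StubNode("ReLU"))))                    # -> None (node skipped)
print(mean_metadata(None, StubTensor([1.0, 2.0, 3.0])))   # -> 2.0
```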

A more practical example is to add an attribute to every node for which no implementation can be found (for example because the input/output datatypes do not match):

```python
def test_best_match(node):
    res = None
    # Skip nodes with a default impl
    if not node.get_operator().backend():
        return None
    try:
        op_impl = node.get_operator().get_impl()
        required_specs = op_impl.get_required_spec()
        best_match = op_impl.get_best_match(required_specs)
        if best_match == required_specs:
            res = f"Required specs:\n{required_specs}\nAvailable:\n{op_impl.get_available_impl_specs()}"
    except Exception as e:
        res = f"Error:\n{e}"
    return res

converter_config.add_attribute("fail_best_match", test_best_match)
```
