Metadata-Version: 2.1
Name: jumper-kernel
Version: 1.1.1
Summary: A Jupyter Python kernel for performance engineering. Besides system metrics such as CPU, GPU, and memory utilization, it supports performance data recording with Score-P.
Author-email: Elias Werner <elias.werner@tu-dresden.de>
License: Copyright (c) 2022, Technische Universitaet Dresden, Germany, all rights reserved.
        Author: Elias Werner
        
        Copyright 2017-2020, Technische Universitaet Dresden, Germany, all rights reserved.
        Author: Andreas Gocht
        
        portions copyright 2001, Autonomous Zones Industries, Inc., all rights...
        err...  reserved and offered to the public under the terms of the
        Python 2.2 license.
        Author: Zooko O'Whielacronx
        http://zooko.com/
        mailto:zooko@zooko.com
        
        Copyright 2000, Mojam Media, Inc., all rights reserved.
        Author: Skip Montanaro
        
        Copyright 1999, Bioreason, Inc., all rights reserved.
        Author: Andrew Dalke
        
        Copyright 1995-1997, Automatrix, Inc., all rights reserved.
        Author: Skip Montanaro
        
        Copyright 1991-1995, Stichting Mathematisch Centrum, all rights reserved.
        
        Permission to use, copy, modify, and distribute this Python software and
        its associated documentation for any purpose without fee is hereby
        granted, provided that the above copyright notice appears in all copies,
        and that both that copyright notice and this permission notice appear in
        supporting documentation, and that the name of neither Automatrix,
        Bioreason, Mojam Media or TU Dresden be used in advertising or publicity
        pertaining to distribution of the software without specific, written
        prior permission.
        
        Copyright (c) 2014, Simon Percivall
        All rights reserved.
        
        Copyright (c) 2008-Present, IPython Development Team
        Copyright (c) 2001-2007, Fernando Perez <fernando.perez@colorado.edu>
        Copyright (c) 2001, Janko Hauser <jhauser@zscout.de>
        Copyright (c) 2001, Nathaniel Gray <n8gray@caltech.edu>
        
        Copyright (c) 2001-2015, IPython Development Team
        Copyright (c) 2015-, Jupyter Development Team 
        All rights reserved.
        
        Copyright (c) 2004-2016 California Institute of Technology.
        Copyright (c) 2016-2022 The Uncertainty Quantification Foundation.
        All rights reserved.
        
        This software is available subject to the conditions and terms laid
        out below. By downloading and using this software you are agreeing
        to the following conditions.
        
        Redistribution and use in source and binary forms, with or without
        modification, are permitted provided that the following conditions
        are met::
        
            - Redistribution of source code must retain the above copyright
              notice, this list of conditions and the following disclaimer.
        
            - Redistribution in binary form must reproduce the above copyright
              notice, this list of conditions and the following disclaimer in the
              documentations and/or other materials provided with the distribution.
        
            - Neither the names of the copyright holders nor the names of any of
              the contributors may be used to endorse or promote products derived
              from this software without specific prior written permission.
        
        THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
        "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED
        TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
        PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
        CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
        EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
        PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS;
        OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
        WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR
        OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF
        ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
        
Project-URL: homepage, https://github.com/score-p/scorep_jupyter_kernel_python
Project-URL: repository, https://github.com/score-p/scorep_jupyter_kernel_python
Classifier: Programming Language :: Python :: 3
Requires-Python: >=3.8
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: ipykernel
Requires-Dist: ipywidgets
Requires-Dist: ipympl
Requires-Dist: jupyter-client
Requires-Dist: astunparse
Requires-Dist: dill
Requires-Dist: itables
Requires-Dist: matplotlib
Requires-Dist: pandas
Requires-Dist: pynvml

[![Unit Tests](https://github.com/score-p/scorep_jupyter_kernel_python/actions/workflows/unit_test.yml/badge.svg)](https://github.com/score-p/scorep_jupyter_kernel_python/actions/workflows/unit_test.yml)
[![Formatting](https://github.com/score-p/scorep_jupyter_kernel_python/actions/workflows/formatter.yml/badge.svg)](https://github.com/score-p/scorep_jupyter_kernel_python/actions/workflows/formatter.yml)
[![Static Analysis](https://github.com/score-p/scorep_jupyter_kernel_python/actions/workflows/linter.yml/badge.svg)](https://github.com/score-p/scorep_jupyter_kernel_python/actions/workflows/linter.yml)

<p align="center">
<img width="450" src="doc/JUmPER01.png"/>
</p>

# A Jupyter Kernel for Performance Engineering

This is the JUmPER kernel, which enables you to:

1. Monitor Jupyter cells and measure system metrics such as CPU, GPU, I/O, or memory utilization.

2. Instrument and trace or profile Jupyter cells with [Score-P](https://score-p.org/).

For binding to Score-P, the kernel uses the [Score-P Python bindings](https://github.com/score-p/scorep_binding_python). Monitoring does not rely on Score-P, and you can use it without a Score-P installation.



# Table of Contents

- [A Jupyter Kernel for Performance Engineering](#a-jupyter-kernel-for-performance-engineering)
- [Table of Contents](#table-of-contents)
- [Installation](#installation)
- [Usage](#usage)
  - [Monitoring](#monitoring)
  - [Score-P Instrumentation](#score-p-instrumentation)
    - [Configuring Score-P in Jupyter](#configuring-score-p-in-jupyter)
  - [Multi-Cell Mode](#multi-cell-mode)
  - [Write Mode](#write-mode)
- [Presentation of Performance Data](#presentation-of-performance-data)
- [Limitations](#limitations)
  - [Serialization Type Support](#serialization-type-support)
  - [Overhead](#overhead)
- [Future Work](#future-work)
- [Citing](#citing)
- [Contact](#contact)
- [Acknowledgments](#acknowledgments)

# Installation

To install the kernel and the dependencies required for the monitoring features:

```
pip install jumper-kernel
python -m jumper.install
```

You can also build the kernel from source via:

```
pip install .
```

The kernel will then be installed in your active Python environment.
You can select the kernel in Jupyter as `jumper`.

**To use the Score-P features of the kernel, you need a proper Score-P installation.**
Note: this is not required for monitoring system metrics.

```
pip install scorep
```

From the Score-P Python bindings:

> You need at least Score-P 5.0, built with `--enable-shared` and the gcc compiler plugin.
> Please make sure that `scorep-config` is in your `PATH` variable.
> For Ubuntu LTS systems there is a non-official ppa of Score-P available: https://launchpad.net/~andreasgocht/+archive/ubuntu/scorep .


# Usage

## Monitoring

Every executed cell is monitored by a process running in parallel that collects system metrics for CPU, memory, I/O, and, if available, GPU. In addition, JUmPER forwards the cell's code to the default Python kernel for execution.

The frequency for performance monitoring can be set via the `JUMPER_REPORT_FREQUENCY` environment variable.

```
%env JUMPER_REPORT_FREQUENCY=2
```

Additionally, the minimum number of reports required to store performance data can be defined via the `JUMPER_REPORTS_MIN` environment variable.

```
%env JUMPER_REPORTS_MIN=2
```
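The interplay of the two variables can be illustrated with a minimal sketch. Note this is an assumption-laden simplification of the semantics (samples per second, minimum sample count), not the kernel's actual monitor process:

```python
import os

# Hypothetical illustration: samples are taken at JUMPER_REPORT_FREQUENCY Hz,
# and a cell's data is only kept if at least JUMPER_REPORTS_MIN samples exist.
os.environ["JUMPER_REPORT_FREQUENCY"] = "2"
os.environ["JUMPER_REPORTS_MIN"] = "2"

frequency = int(os.environ["JUMPER_REPORT_FREQUENCY"])  # samples per second
reports_min = int(os.environ["JUMPER_REPORTS_MIN"])

def collect(duration_s, sample):
    """Collect one sample every 1/frequency seconds over duration_s seconds."""
    samples = [sample() for _ in range(int(duration_s * frequency))]
    # Data for very short cells (fewer than reports_min samples) is dropped.
    return samples if len(samples) >= reports_min else None

print(collect(0.5, lambda: 0.0))  # 1 sample  -> dropped (None)
print(collect(2.0, lambda: 0.0))  # 4 samples -> kept
```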

The performance data is recorded in memory, and the kernel provides several magic commands to display and interact with the data:


![](doc/code_history.png)

`%%display_code_history`

Shows the code history of the monitored cells, with index and timestamp.

`%%display_code_for_index`

Shows the code for the cell with the selected index.


![](doc/monitoring.gif)

`%%display_graph_for_last`

Shows the performance display for the last monitored cell.

`%%display_graph_for_index [index]`

Shows the performance display for the cell with the selected index.

`%%display_graph_for_all`

Shows the accumulated performance display for all monitored cells.

`%%perfdata_to_variable [varname]`

Exports the performance data to a variable.

`%%perfdata_to_json [filename]`

Exports the performance data and the code to JSON files.
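Once exported, the data can be post-processed outside the kernel. The sketch below assumes a simplified, hypothetical JSON layout (a list of CPU utilization samples per cell); the actual structure of the exported files may differ:

```python
import json
import os
import tempfile

# Hypothetical layout: one entry per monitored cell with CPU samples.
# The real export produced by %%perfdata_to_json may be structured differently.
perfdata = {"cell_1": {"cpu_util": [12.5, 80.0, 75.5]}}

path = os.path.join(tempfile.mkdtemp(), "perfdata.json")
with open(path, "w") as f:
    json.dump(perfdata, f)

# Reload and compute a simple summary, e.g. mean CPU utilization per cell.
with open(path) as f:
    data = json.load(f)

means = {cell: sum(m["cpu_util"]) / len(m["cpu_util"]) for cell, m in data.items()}
print(means)  # {'cell_1': 56.0}
```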

## Score-P Instrumentation

### Configuring Score-P in Jupyter

Set up your Score-P environment with `%env` line magic, e.g.:
```
%env SCOREP_ENABLE_TRACING=1
%env SCOREP_TOTAL_MEMORY=3g
```
For documentation of the Score-P environment variables, see: [Score-P Measurement Configuration](https://perftools.pages.jsc.fz-juelich.de/cicd/scorep/tags/latest/html/scorepmeasurementconfig.html).


`%%scorep_python_binding_arguments`

Set the Score-P Python bindings arguments. For documentation of the arguments, see the [Score-P Python bindings](https://github.com/score-p/scorep_binding_python).

![](doc/pythonBindings_setup.png)

`%%marshalling_settings`

Set marshaller/serializer used for persistence and mode of communicating persistence between notebook and subprocess. Currently tested marshallers: `dill`, `cloudpickle`, `parallel_marshall`; modes of communication: `disk`, `memory`. If no arguments were provided, will print current configuration. Use:
```
%%marshalling_settings
MARSHALLER=[dill,cloudpickle]
MODE=[disk,memory]
```

When using persistence in `disk` mode, you can also define the directory to which the serializer output is saved via the `SCOREP_KERNEL_PERSISTENCE_DIR` environment variable.
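As an illustration of the `disk` mode, the sketch below persists a notebook-like state dictionary with the standard `pickle` module. The file name and layout are made up for this example; the kernel itself uses the configured marshaller (e.g. `dill`) and its own conventions:

```python
import os
import pickle
import tempfile

# In the kernel this directory would come from SCOREP_KERNEL_PERSISTENCE_DIR;
# here we just use a temporary directory.
persistence_dir = tempfile.mkdtemp()

state = {"x": 42, "data": [1, 2, 3]}  # variables defined in earlier cells

# Notebook side: dump the state before spawning the Score-P subprocess.
state_file = os.path.join(persistence_dir, "state.pkl")
with open(state_file, "wb") as f:
    pickle.dump(state, f)

# Subprocess side: restore the state before executing the cell's code.
with open(state_file, "rb") as f:
    restored = pickle.load(f)

assert restored == state
```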

`%%execute_with_scorep`

Executes a cell with Score-P, i.e., it calls `python -m scorep <cell code>`.

![](doc/instrumentation.gif)



## Multi-Cell Mode
You can also treat multiple cells as one single cell by using the multi-cell mode. To do so, mark the cells in the order in which you want to execute them.


`%%enable_multicellmode`


Enables the multi-cell mode and starts the marking process. Subsequently, "running" a cell will not execute it but mark it for execution after `%%finalize_multicellmode`.

`%%finalize_multicellmode`

Stops the marking process and executes all the marked cells.
All marked cells will be executed with Score-P.

`%%abort_multicellmode`

Stops the marking process, without executing the cells.

**Hints**:
- The `%%execute_with_scorep` command has no effect in the multi-cell mode.

- There is no "unmark" command, but you can abort the multi-cell mode with the `%%abort_multicellmode` command. Start the marking process again if you have marked your cells in the wrong order.

- The `%%enable_multicellmode`, `%%finalize_multicellmode`, and `%%abort_multicellmode` commands should each be run in a cell of their own. Additional code in such a cell will be ignored.

![](doc/mcm.gif)



## Write Mode

Analogous to the [%%writefile](https://ipython.readthedocs.io/en/stable/interactive/magics.html#cellmagic-writefile) command in IPykernel, you can convert a set of cells to a Python script that is to be executed with the Score-P Python bindings (with the settings and environment described in an auxiliary bash script).

`%%start_writefile [scriptname]`

Enables the write mode and starts the marking process. Subsequently, "running" a cell will not execute it but mark it for writing into a Python file after `%%end_writefile`.
`scriptname` is `jupyter_to_script.py` by default.

`%%end_writefile`

Stops the marking process and writes the marked cells to a Python script. Additionally, a bash script will be created that sets the Score-P environment variables and Python bindings arguments and executes the Python script.

**Hints**:
- Recording a cell containing `%%scorep_python_binding_arguments` will add the Score-P Python bindings arguments to the bash script.

- Code of a cell that is not to be executed with Score-P (not inside the multi-cell mode and without `%%execute_with_scorep`) will be wrapped in `with scorep.instrumenter.disable()` in the Python script to prevent instrumentation.

- Other cells will be recorded without any changes, except for dropping all magic commands.

- `%%abort_multicellmode` will be ignored in the write mode and will not unmark previous cells from instrumentation.

![](doc/writemode.gif)


# Presentation of Performance Data

For the monitoring data, use the built-in magic commands, or build your own visualizations after exporting the data to a variable or to JSON via the magic commands introduced above.

To inspect the performance data collected by Score-P, use tools such as Vampir (trace) or Cube (profile).

# Limitations 

## Serialization Type Support
For the execution of a cell, the kernel uses the default IPython kernel. For a cell with Score-P, it starts a new Python process. Before starting this process, the state of the previously executed cells is persisted using `dill` (https://github.com/uqfoundation/dill) or `cloudpickle` (https://github.com/cloudpipe/cloudpickle/releases). However:

> `dill` cannot yet pickle these standard types:
> frame, generator, traceback

The same applies to `cloudpickle`. Use the `%%marshalling_settings` magic command to switch between the two, depending on your needs.
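The limitation can be reproduced with the standard `pickle` module, which shares it (a minimal standalone example, not code from the kernel itself):

```python
import pickle

gen = (i * i for i in range(3))  # a generator object

try:
    pickle.dumps(gen)
    serialized = True
except TypeError as e:
    # CPython refuses to pickle generators, frames, and tracebacks.
    serialized = False
    print(f"cannot serialize: {e}")
```

Objects of these types are therefore lost when the state is persisted for a Score-P cell, so avoid carrying them across cells.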

## Overhead

When dealing with large data structures, there might be a significant runtime overhead at the beginning and the end of a Score-P cell. This is due to the additional saving and loading of data for persistence in the background. However, this does not affect the actual user code or the Score-P measurements.

# Future Work

The kernel is still under development. The following is on the agenda:
 
 - Provide perfmonitors for multi-node setups
 - A config option for the default perfmonitor to define the collected metrics
 
PRs are welcome.

# Citing

If you publish work using the kernel, we would appreciate it if you cited the following paper:

```
Werner, E., Manjunath, L., Frenzel, J., & Torge, S. (2021, October).
Bridging between Data Science and Performance Analysis: Tracing of Jupyter Notebooks.
In The First International Conference on AI-ML-Systems (pp. 1-7).
https://dl.acm.org/doi/abs/10.1145/3486001.3486249
```

Additionally, please refer to the Score-P Python bindings, published here:

```
Gocht A., Schöne R., Frenzel J. (2021)
Advanced Python Performance Monitoring with Score-P.
In: Mix H., Niethammer C., Zhou H., Nagel W.E., Resch M.M. (eds) Tools for High Performance Computing 2018 / 2019. Springer, Cham.
https://doi.org/10.1007/978-3-030-66057-4_14 
```

or

```
Gocht-Zech A., Grund A. and Schöne R. (2021)
Controlling the Runtime Overhead of Python Monitoring with Selective Instrumentation
In: 2021 IEEE/ACM International Workshop on Programming and Performance Visualization Tools (ProTools)
https://doi.org/10.1109/ProTools54808.2021.00008
``` 

# Contact

elias.werner@tu-dresden.de

# Acknowledgments

This work was supported by the German Federal Ministry of Education and Research (BMBF, SCADS22B) and the Saxon State Ministry for Science, Culture and Tourism (SMWK) by funding the competence center for Big Data and AI "ScaDS.AI Dresden/Leipzig".


