Metadata-Version: 2.1
Name: blender-tissue-cartography
Version: 0.0.2
Summary: Pipeline for extraction and analysis of tissue surfaces from volumetric microscopy data using blender
Home-page: https://github.com/nikolas-claussen/blender-tissue-cartography
Author: Nikolas
Author-email: nclaussen@ucsb.edu
License: Apache Software License 2.0
Keywords: nbdev jupyter notebook python
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: Natural Language :: English
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: License :: OSI Approved :: Apache Software License
Requires-Python: >=3.7
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: matplotlib
Requires-Dist: numpy
Requires-Dist: scipy
Requires-Dist: scikit-image
Requires-Dist: tifffile
Requires-Dist: h5py
Requires-Dist: jupyter
Requires-Dist: tqdm
Requires-Dist: libigl
Provides-Extra: dev
Requires-Dist: nbdev; extra == "dev"
Requires-Dist: pymeshlab; extra == "dev"
Requires-Dist: trimesh; extra == "dev"
Requires-Dist: quaternionic; extra == "dev"
Requires-Dist: spherical; extra == "dev"

# blender-tissue-cartography


<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->

## What this tool does

Tissue cartography extracts and cartographically projects surfaces from
volumetric image data. This turns your 3d data into 2d data, which is
much easier to visualize, analyze, and process computationally. Tissue
cartography is particularly useful for analyzing 3d microscopy data in
developmental biology, where it takes advantage of the laminar,
sheet-like organization of many biological tissues. For more detail,
see [Heemskerk & Streichan
2015](https://doi.org/10.1038/nmeth.3648) and [Mitchell & Cislo
2023](https://doi.org/10.1038/s41592-023-02081-w).

`blender_tissue_cartography` is a set of Python tools, template analysis
pipelines, and tutorials to do tissue cartography using the popular 3d
creation software [blender](https://www.blender.org/). The goal is to
make tissue cartography as user-friendly as possible using simple,
modular Python code and blender’s graphical user interface.

### Work in progress!

This project is a work in progress and will change rapidly. If you want
to use it, I recommend updating regularly via `git pull`.

- Tools for individual recordings are in a reasonably complete state
- Tools for dynamic recordings/movies are complete, but not fully tested
- Tutorials to be written: analysis in 3d

## Installation

1.  Install required non-python programs: [Fiji](https://fiji.sc/)
    (optional), [Ilastik](https://www.ilastik.org/),
    [Meshlab](https://www.meshlab.net/) (optional), and
    [Blender](https://www.blender.org/).

2.  Install Python via
    [anaconda/miniconda](https://docs.anaconda.com/miniconda/miniconda-install/),
    if you haven’t already.

    - If `conda` is unbearably slow for you, install
      [mamba](https://mamba.readthedocs.io/en/latest/index.html), a
      `conda` replacement which is much faster.

3.  Install `blender_tissue_cartography`:

    - run `pip install blender-tissue-cartography` in a command window.

4.  (Optional) Install the extra Python library `pymeshlab`, required
    for some advanced functionality (remeshing and surface
    reconstruction from within Python). Note that this package is not
    available on newer ARM-based Apple computers.

    - run `pip install pymeshlab` in a command window

5.  (Optional) Install the Blender plugin
    [MicroscopyNodes](https://github.com/oanegros/MicroscopyNodes) for
    rendering volumetric `.tif` files in blender

This project is hosted on PyPI here:
https://pypi.org/project/blender-tissue-cartography/

### Developer installation

1.  [Clone this github
    repository](https://docs.github.com/en/repositories/creating-and-managing-repositories/cloning-a-repository),
    or simply download the code as a .zip file and unpack it (green
    button “Code”).

2.  Create a `conda` environment with all Python dependencies and
    install the `blender_tissue_cartography` module. Open a command
    window in the `blender-tissue-cartography` directory and type:

    - `conda env create -n blender_tissue_cartography -f environment.yml`
    - `conda activate blender_tissue_cartography`
    - `pip install -e .`

3.  (Optional) Install the extra Python library `pymeshlab`, required
    for some advanced functionality (remeshing and surface
    reconstruction from within Python). Note that this package is not
    available on newer ARM-based Apple computers.

    - run `pip install pymeshlab` in a command window

4.  Install [nbdev](https://nbdev.fast.ai/)

## Documentation

Full documentation (including jupyter tutorials) is available here:
https://nikolas-claussen.github.io/blender-tissue-cartography/

## Usage

Some fully worked out examples are provided in the `nbs/Tutorials/`
folder. You can look at the jupyter notebooks on
https://nikolas-claussen.github.io/blender-tissue-cartography/ without
downloading anything.

To run a tutorial on your computer, follow the installation
instructions, then [launch
jupyter](https://docs.jupyter.org/en/latest/running.html) and work
through the notebooks in the `nbs/Tutorials` directory in order. If you
are impatient, jump directly to `nbs/Tutorials/03_basics_example.ipynb`.
I recommend being comfortable with running simple Python code (you
don’t have to do any coding yourself). The basic user interface of
blender is explained in `nbs/Tutorials/02_blender_tutorial.ipynb`.

In general, for each tissue cartography project, first create a folder
to hold your data and results. You run the `blender_tissue_cartography`
pipeline from a jupyter computational notebook, which can also serve as
your lab notebook (notes, comments on the data). Use one of the tutorial
jupyter notebooks as a template with instructions. As you work through
the notebook, you will:

1.  create a segmentation of your 3d data

2.  convert the segmentation into a mesh of your surface of interest

3.  load the mesh into blender and unwrap it into the plane

4.  make a cartographic projection of your 3d data using the unwrapped
    mesh

5.  visualize the results in 3d using blender.
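
Steps 2 and 4 can be sketched in a few lines of Python using
scikit-image, which this package already lists as a dependency. The
snippet below is an illustration with a synthetic segmentation, not the
package’s actual API: it extracts a triangle mesh from a binary
segmentation via marching cubes, which could then be exported (e.g. as
`.obj`) and loaded into blender.

```python
# Illustrative sketch of step 2: binary segmentation -> triangle mesh.
# Synthetic data only; this is not the blender_tissue_cartography API.
import numpy as np
from skimage.measure import marching_cubes

# Synthetic "segmentation": a solid ball inside a 64^3 volume.
zz, yy, xx = np.mgrid[:64, :64, :64]
segmentation = ((zz - 32) ** 2 + (yy - 32) ** 2 + (xx - 32) ** 2) < 20 ** 2

# Extract the bounding surface at the 0.5 iso-level.
verts, faces, normals, _ = marching_cubes(segmentation.astype(np.float32),
                                          level=0.5)

print(verts.shape[1], faces.shape[1])  # 3 3 -- vertex coords, triangle indices
```

In the real pipeline, the segmentation comes from Ilastik rather than a
formula, and the mesh is typically smoothed and remeshed before
unwrapping.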

Below is a screenshot to give you an idea of the workflow for the
example *Drosophila* dataset: Volumetric data in ImageJ (center),
jupyter computational notebook to run the `blender_tissue_cartography`
module (left), and blender project with extracted mesh and texture
(right).

![image.png](index_files/figure-commonmark/cell-7-1-image.png)

In this pipeline, you can edit meshes and cartographic projections
interactively - you can create a preliminary projection of your data
automatically, and use it as guidance when editing your cartographic map
in blender. Here, we edit the “seam” of our cartographic map based on
the region occupied by cells during zebrafish epiboly (tutorial 6).

![image-2.png](index_files/figure-commonmark/cell-8-1-image-2.png)
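
Conceptually, the cartographic pullback (step 4 above) samples the
volumetric data at the 3d position corresponding to each pixel of the
2d map. A minimal numpy/scipy sketch, where `uv_to_3d` is a
hypothetical stand-in for the lookup table derived from the unwrapped
mesh:

```python
# Sketch of step 4: pull back 3d image intensities onto a 2d (UV) grid.
# `uv_to_3d` is a hypothetical stand-in for the mesh-derived lookup:
# for each map pixel (u, v), it stores the (z, y, x) position of the
# corresponding surface point in the volumetric data.
import numpy as np
from scipy.ndimage import map_coordinates

volume = np.random.default_rng(0).random((32, 64, 64))  # fake 3d image

# Toy lookup: a flat 16x16 patch at constant depth z = 10.
uu, vv = np.mgrid[:16, :16].astype(float)
uv_to_3d = np.stack([np.full_like(uu, 10.0), uu + 20, vv + 20])  # (3, 16, 16)

# Interpolate the volume at those positions -> the 2d projection.
projection = map_coordinates(volume, uv_to_3d, order=1)
print(projection.shape)  # (16, 16)
```

The actual pipeline builds `uv_to_3d` from the UV map created in
blender and typically samples at several offsets along the surface
normal to produce multilayer projections.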

#### Notes for Python beginners

- You will need a working Python installation (see [installing
  anaconda/miniconda](https://docs.anaconda.com/miniconda/miniconda-install/))
  and to know how to [launch jupyter
  notebooks](https://docs.jupyter.org/en/latest/running.html). You will
  run the computational notebooks in your browser. Here is a [video
  tutorial](https://www.youtube.com/watch?v=HW29067qVWk).

- Create a new folder for each tissue cartography project. Do not place
  them into the folder into which you unpacked
  `blender_tissue_cartography` - otherwise, your files may be
  overwritten when you update the software.

- The repository contains two sets of notebooks: in the `nbs` folder and
  in the `nbs/Tutorials` folder. The `nbs`-notebooks are for developing
  the code. If you don’t want to develop/adapt the code to your needs,
  you don’t need to look at them. Copy a notebook from the
  `nbs/Tutorials` folder - e.g. `03_basics_example.ipynb` - into your
  project folder to use it as a template.

- You do not need to copy functions into your notebooks manually. If you
  follow the installation instructions, the code will be installed as a
  Python package and can be “imported” by Python. See tutorials!

## Software stack

Note: the Python libraries will be installed automatically if you follow
the installation instructions above.

### Required

- Python, with the following libraries
  - [jupyter](https://jupyter.org/)
  - [numpy](https://numpy.org/) / [Matplotlib](https://matplotlib.org/)
    / [Scipy](https://scipy.org/)
  - [skimage](https://scikit-image.org) various image processing tools.
  - [h5py](https://www.h5py.org/) for reading/writing of `.h5` files.
  - [tifffile](https://github.com/cgohlke/tifffile/) for reading/writing
    of `.tif` files, including metadata.
  - [libigl](https://libigl.github.io/libigl-python-bindings) Geometry
    processing.
- [Ilastik](https://www.ilastik.org/) Image classification and
  segmentation.
- [Blender](https://www.blender.org/) Mesh editing and UV mapping.
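
As a small example of the i/o layer, `tifffile` round-trips a
volumetric stack together with axis metadata. A minimal sketch (real
microscopy datasets carry much richer metadata):

```python
# Write a tiny z-stack to a temporary .tif with axis metadata, read it back.
import os
import tempfile

import numpy as np
import tifffile

stack = np.zeros((8, 32, 32), dtype=np.uint16)  # a tiny z-stack
path = os.path.join(tempfile.mkdtemp(), "example.tif")

tifffile.imwrite(path, stack, metadata={"axes": "ZYX"})
data = tifffile.imread(path)
print(data.shape)  # (8, 32, 32)
```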

### Optional

- [Meshlab](https://www.meshlab.net/) GUI and Python library with
  advanced surface reconstruction tools (required for some workflows).

- Python libraries:

  - [PyMeshLab](https://pymeshlab.readthedocs.io/en/latest/index.html)
    Python interface to MeshLab.
  - [nbdev](https://nbdev.fast.ai/tutorials/tutorial.html) for
    notebook-based development, if you want to add your own code

### Other useful software

- [MicroscopyNodes](https://github.com/oanegros/MicroscopyNodes) plug-in
  for rendering volumetric `.tif` files in blender
- [Boundary First
  Flattening](https://github.com/GeometryCollective/boundary-first-flattening)
  advanced tool for creating UV maps with graphical and command line
  interface
- [pyFM](https://github.com/RobinMagnet/pyFM) python library for
  mesh-to-mesh registration (for dynamic data) which may complement the
  algorithms that ship with `blender_tissue_cartography`

## Acknowledgements

This software is being developed by Nikolas Claussen in the [Streichan
lab at UCSB](https://streichanlab.physics.ucsb.edu/). Noah Mitchell,
Susan Wopat, and Matthew Lefebvre contributed example data. Sean Komura
and Gary Han tested the software. Dillon Cislo provided advice on
surface-surface registration.
