DIY Interactive Segmentation with napari#
napari is a very flexible and “hackable” tool. In this tutorial we will build a custom interactive segmentation tool from scratch and use it on data hosted by Zebrahub.
Setup#
# this cell is required to run these notebooks on Binder. Make sure that you also have a desktop tab open.
import os
if 'BINDER_SERVICE_HOST' in os.environ:
os.environ['DISPLAY'] = ':1.0'
!pip install scikit-learn scikit-image ome-zarr
Successfully installed aiobotocore-2.5.4 aiohttp-3.8.6 aioitertools-0.11.0 aiosignal-1.3.1 asciitree-0.3.3 async-timeout-4.0.3 botocore-1.31.17 distributed-2023.10.0 fasteners-0.19 frozenlist-1.4.0 jmespath-1.0.1 joblib-1.3.2 msgpack-1.0.7 multidict-6.0.4 numcodecs-0.12.1 ome-zarr-0.8.2 s3fs-2023.9.2 scikit-learn-1.3.1 sortedcontainers-2.4.0 tblib-2.0.0 threadpoolctl-3.2.0 urllib3-1.26.18 yarl-1.9.2 zarr-2.16.1 zict-3.0.0
from appdirs import user_data_dir
import os
import zarr
import dask.array as da
import toolz as tz
from sklearn.ensemble import RandomForestClassifier
from skimage import data, segmentation, feature, future
from skimage.feature import multiscale_basic_features
from skimage.io import imread, imshow
import numpy as np
from functools import partial
import napari
import threading
from ome_zarr.io import parse_url
from ome_zarr.reader import Reader
from psygnal import debounced
from superqt import ensure_main_thread
import logging
import sys
LOGGER = logging.getLogger("halfway_to_i2k_2023_america")
LOGGER.setLevel(logging.DEBUG)
streamHandler = logging.StreamHandler(sys.stdout)
formatter = logging.Formatter(
"%(asctime)s - %(name)s - %(levelname)s - %(message)s"
)
streamHandler.setFormatter(formatter)
LOGGER.addHandler(streamHandler)
Reading in data#
Get the data from Zebrahub (an image dataset hosted by CZ Biohub).
def open_zebrahub():
    url = "https://public.czbiohub.org/royerlab/zebrahub/imaging/single-objective/ZSNS002.ome.zarr/"
    # open the remote OME-Zarr store for reading
    reader = Reader(parse_url(url, mode="r"))
    # nodes may include images, labels etc
    nodes = list(reader())
    # first node will be the image pixel data (a multiscale pyramid of dask arrays)
    image_node = nodes[0]
    dask_data = image_node.data
    return dask_data
zebrahub_data = open_zebrahub()
zebrahub_data[3].shape
(1100, 1, 42, 304, 657)
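The reader returns a multiscale pyramid: zebrahub_data is a list of dask arrays, with level 0 at full resolution and each following level downsampled. A quick way to see what is available (a small sketch, assuming the list-of-dask-arrays layout returned above):
# Print the shape and chunking of every resolution level in the pyramid
for level, arr in enumerate(zebrahub_data):
    print(f"level {level}: shape={arr.shape}, chunks={arr.chunksize}")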
# Let's choose a crop to work on
crop_3D = zebrahub_data[3][800, 0, :, :, :]
crop_3D.shape
(42, 304, 657)
Visualize in napari#
viewer = napari.Viewer()
scale = (9.92, 3.512, 3.512)  # physical voxel size along (z, y, x); napari uses this to display the anisotropic volume correctly
contrast_limits = (0, 372)
data_layer = viewer.add_image(crop_3D, scale=scale, contrast_limits=contrast_limits)
data_layer.bounding_box.visible = True
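Optionally, switch napari into 3D rendering to inspect the crop volumetrically; the voxel scale set above keeps the proportions correct. You can toggle back with the 2D/3D button in the lower-left of the viewer.
# Optional: render the volume in 3D (set back to 2 to return to slice view)
viewer.dims.ndisplay = 3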
Extracting features#
def extract_features(image, feature_params):
features_func = partial(
multiscale_basic_features,
intensity=feature_params["intensity"],
edges=feature_params["edges"],
texture=feature_params["texture"],
sigma_min=feature_params["sigma_min"],
sigma_max=feature_params["sigma_max"],
channel_axis=None,
)
# print(f"image shape {image.shape} feature params {feature_params}")
features = features_func(np.squeeze(image))
return features
example_feature_params = {
"sigma_min": 1,
"sigma_max": 5,
"intensity": True,
"edges": True,
"texture": True,
}
features = extract_features(crop_3D, example_feature_params)
features.shape
(42, 304, 657, 15)
Visualize Features#
What do these features look like? With sigma_min=1, sigma_max=5, and intensity, edges, and texture all enabled, multiscale_basic_features computes, for each of three sigmas, the smoothed intensity (1 channel), the gradient magnitude (1 channel), and the Hessian eigenvalues (3 channels in 3D), which is where the 15 feature channels above come from.
def show_features():
for feature_idx in range(features.shape[-1]):
viewer.add_image(features[:, :, :, feature_idx])
# show_features()
Making the Interactive Segmentation Tool!#
Ok, now we’ve seen:
our data
some features we can compute for our data
Our goal is to create an image where we have labels that correspond to the zebrafish sample.
The approach: whenever we annotate/draw in the painting layer, the segmentation should update automatically.
We will do this using 3 different image layers:
Our input image
A layer for painting
A layer for storing the machine learning generated predictions
By popular demand we will use Zarr to store these layers, because it helps this approach scale to very large datasets; plain numpy arrays would work just as well for a crop of this size.
Create our painting and prediction layers#
zarr_path = os.path.join(user_data_dir("halfway_to_i2k_2023_america", "napari"), "diy_segmentation.zarr")
print(f"Saving outputs to zarr path: {zarr_path}")
# Create a prediction layer
prediction_data = zarr.open(
f"{zarr_path}/prediction",
mode='a',
shape=crop_3D.shape,
dtype='i4',
dimension_separator="/",
)
prediction_layer = viewer.add_labels(prediction_data, name="Prediction", scale=data_layer.scale)
# Create a painting layer
painting_data = zarr.open(
f"{zarr_path}/painting",
mode='a',
shape=crop_3D.shape,
dtype='i4',
dimension_separator="/",
)
painting_layer = viewer.add_labels(painting_data, name="Painting", scale=data_layer.scale)
Saving outputs to zarr path: /home/runner/.local/share/halfway_to_i2k_2023_america/diy_segmentation.zarr
painting_data.shape
(42, 304, 657)
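Because both layers are backed by on-disk Zarr arrays, any painting you do persists between sessions. A quick check of which label values have already been painted (this reads the whole array, which is fine at this crop size):
# Label values present in the painting layer so far (0 means unpainted)
np.unique(painting_data[:])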
Let’s make a UI as well#
from qtpy.QtWidgets import (
QVBoxLayout,
QHBoxLayout,
QComboBox,
QLabel,
QCheckBox,
QDoubleSpinBox,
QGroupBox,
QWidget,
)
class NapariMLWidget(QWidget):
def __init__(self, parent=None):
super(NapariMLWidget, self).__init__(parent)
self.initUI()
def initUI(self):
layout = QVBoxLayout()
# Dropdown for selecting the model
model_label = QLabel("Select Model")
self.model_dropdown = QComboBox()
self.model_dropdown.addItems(["Random Forest"])
model_layout = QHBoxLayout()
model_layout.addWidget(model_label)
model_layout.addWidget(self.model_dropdown)
layout.addLayout(model_layout)
# Select the range of sigma sizes
self.sigma_start_spinbox = QDoubleSpinBox()
self.sigma_start_spinbox.setRange(0, 256)
self.sigma_start_spinbox.setValue(1)
self.sigma_end_spinbox = QDoubleSpinBox()
self.sigma_end_spinbox.setRange(0, 256)
self.sigma_end_spinbox.setValue(5)
sigma_layout = QHBoxLayout()
sigma_layout.addWidget(QLabel("Sigma Range: From"))
sigma_layout.addWidget(self.sigma_start_spinbox)
sigma_layout.addWidget(QLabel("To"))
sigma_layout.addWidget(self.sigma_end_spinbox)
layout.addLayout(sigma_layout)
# Boolean options for features
self.intensity_checkbox = QCheckBox("Intensity")
self.intensity_checkbox.setChecked(True)
self.edges_checkbox = QCheckBox("Edges")
self.texture_checkbox = QCheckBox("Texture")
self.texture_checkbox.setChecked(True)
features_group = QGroupBox("Features")
features_layout = QVBoxLayout()
features_layout.addWidget(self.intensity_checkbox)
features_layout.addWidget(self.edges_checkbox)
features_layout.addWidget(self.texture_checkbox)
features_group.setLayout(features_layout)
layout.addWidget(features_group)
# Dropdown for data selection
data_label = QLabel("Select Data for Model Fitting")
self.data_dropdown = QComboBox()
self.data_dropdown.addItems(
["Current Displayed Region", "Whole Image"]
)
self.data_dropdown.setCurrentText("Current Displayed Region")
data_layout = QHBoxLayout()
data_layout.addWidget(data_label)
data_layout.addWidget(self.data_dropdown)
layout.addLayout(data_layout)
# Checkbox for live model fitting
self.live_fit_checkbox = QCheckBox("Live Model Fitting")
self.live_fit_checkbox.setChecked(True)
layout.addWidget(self.live_fit_checkbox)
# Checkbox for live prediction
self.live_pred_checkbox = QCheckBox("Live Prediction")
self.live_pred_checkbox.setChecked(True)
layout.addWidget(self.live_pred_checkbox)
self.setLayout(layout)
# Let's add this widget to napari
widget = NapariMLWidget()
viewer.window.add_dock_widget(widget, name="halfway to I2K 2023 America")
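The event handlers below will read these settings straight from the widget. As a sanity check, here is a small hypothetical helper that collects the current UI state into the same feature_params dictionary that extract_features expects:
# Hypothetical helper: gather the widget's current settings into a feature_params dict
def current_feature_params(widget):
    return {
        "sigma_min": widget.sigma_start_spinbox.value(),
        "sigma_max": widget.sigma_end_spinbox.value(),
        "intensity": widget.intensity_checkbox.isChecked(),
        "edges": widget.edges_checkbox.isChecked(),
        "texture": widget.texture_checkbox.isChecked(),
    }
current_feature_params(widget)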
We have a widget, we have our painting and prediction layers, now what?#
We need to start connecting things together. How should we do that? napari emits “events” when something happens in the viewer, and we can connect callbacks to them. We want to respond to a few different event types (a minimal connection example follows these lists):
changes in camera (e.g. camera position and rotation)
changes in “dims” (e.g. moving a dimension slider)
painting events (e.g. a user clicked, painted, and released their mouse)
When one of these events happens, we want to:
update our machine learning model with the new painted data
update our prediction with the updated ML model
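Before wiring up the full pipeline, here is a minimal sketch of what connecting a callback to a napari event looks like; the real connections, with debouncing and main-thread handling, come at the end of this section.
# Minimal example: log whenever the dims slider moves
def log_current_step(event):
    LOGGER.debug(f"current step is now {viewer.dims.current_step}")
viewer.dims.events.current_step.connect(log_current_step)
# disconnect it again, since the real listener below handles dims events itself
viewer.dims.events.current_step.disconnect(log_current_step)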
# Let's start with our event listener
# We use "curry" because this allows us to "store" our viewer and widget for later use
@tz.curry
def on_data_change(event, viewer=None, widget=None):
corner_pixels = data_layer.corner_pixels
# Ensure the painting layer visual is updated
painting_layer.refresh()
# Training the ML model and generating predictions can take time
# we will use a "thread" to perform these calculations
    # otherwise napari will freeze until these calculations are done
thread = threading.Thread(
target=threaded_on_data_change,
args=(
event,
corner_pixels,
viewer.dims,
widget.model_dropdown.currentText(),
{
"sigma_min": widget.sigma_start_spinbox.value(),
"sigma_max": widget.sigma_end_spinbox.value(),
"intensity": widget.intensity_checkbox.isChecked(),
"edges": widget.edges_checkbox.isChecked(),
"texture": widget.texture_checkbox.isChecked(),
},
widget.live_fit_checkbox.isChecked(),
widget.live_pred_checkbox.isChecked(),
widget.data_dropdown.currentText(),
),
)
thread.start()
thread.join()
# Ensure the prediction layer visual is updated
prediction_layer.refresh()
# Now for the hard part of the listener: the function that actually fits the model and predicts
model = None
def threaded_on_data_change(
event,
corner_pixels,
dims,
model_type,
feature_params,
live_fit,
live_prediction,
data_choice,
):
global model
LOGGER.info(f"Labels data has changed! {event}")
current_step = dims.current_step
# Find a mask of indices we will use for fetching our data
    mask_idx = (
        slice(current_step[0], current_step[0] + 1),
        slice(corner_pixels[0, 1], corner_pixels[1, 1]),
        slice(corner_pixels[0, 2], corner_pixels[1, 2]),
    )
if data_choice == "Whole Image":
mask_idx = tuple([slice(0, sz) for sz in data_layer.data.shape])
LOGGER.info(f"mask idx {mask_idx}, image {data_layer.data.shape}")
active_image = data_layer.data[mask_idx]
LOGGER.info(
f"active image shape {active_image.shape} data choice {data_choice} painting_data {painting_data.shape} mask_idx {mask_idx}"
)
active_labels = painting_data[mask_idx]
def compute_features(image, feature_params):
"""Compute features for each channel and concatenate them."""
features = extract_features(
image, feature_params
)
return features
training_labels = None
if data_choice == "Current Displayed Region":
# Use only the currently displayed region.
training_features = compute_features(
active_image, feature_params
)
training_labels = np.squeeze(active_labels)
else:
raise ValueError(f"Invalid data choice: {data_choice}")
    if (training_labels is None) or (training_labels.size == 0):
LOGGER.info("No training data yet. Skipping model update")
elif live_fit:
# Retrain model
LOGGER.info(
f"training model with labels {training_labels.shape} features {training_features.shape} unique labels {np.unique(training_labels[:])}"
)
model = update_model(training_labels, training_features, model_type)
# Don't do live prediction on whole image, that happens earlier slicewise
if live_prediction:
# Update prediction_data
prediction_features = compute_features(
active_image, feature_params
)
        # predict() shifts labels up by 1, undoing the background label adjustment used when fitting
prediction = predict(model, prediction_features, model_type)
LOGGER.info(
f"prediction {prediction.shape} prediction layer {prediction_layer.data.shape} prediction {np.transpose(prediction).shape} features {prediction_features.shape}"
)
if data_choice == "Whole Image":
prediction_layer.data[mask_idx] = np.transpose(prediction)
else:
prediction_layer.data[mask_idx] = np.transpose(prediction)[
np.newaxis, :
]
# Model training function that respects widget's model choice
def update_model(labels, features, model_type):
features = features[labels > 0, :]
    # We shift labels down by 1: in napari label 0 is background/unlabeled, while the classifier's classes start at 0
labels = labels[labels > 0] - 1
if model_type == "Random Forest":
clf = RandomForestClassifier(
n_estimators=50, n_jobs=-1, max_depth=10, max_samples=0.05
)
print(
f"updating model with label shape {labels.shape} feature shape {features.shape} unique labels {np.unique(labels)}"
)
clf.fit(features, labels)
return clf
def predict(model, features, model_type):
    # Shift predictions up by 1 to undo the label shift from update_model (0 stays reserved for background)
prediction = future.predict_segmenter(features.reshape(-1, features.shape[-1]), model).reshape(features.shape[:-1]) + 1
return np.transpose(prediction)
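Before hooking these functions up to napari events, you can exercise them once by hand. The following sketch fits on everything painted so far and writes a whole-crop prediction; it assumes you have already painted at least two different label values, and recomputing features for the full crop takes a little while:
# One-shot sketch: fit on all painted voxels and predict the whole crop
feats = extract_features(np.asarray(crop_3D), example_feature_params)
labels = np.asarray(painting_data[:])
if np.any(labels > 0):
    clf = update_model(labels, feats, "Random Forest")
    # predict() returns a transposed volume, so transpose back before writing
    prediction_layer.data[:] = np.transpose(predict(clf, feats, "Random Forest"))
    prediction_layer.refresh()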
# Now connect everything together
for listener in [
viewer.camera.events,
viewer.dims.events,
painting_layer.events.paint,
]:
listener.connect(
debounced(
ensure_main_thread(
on_data_change(
viewer=viewer,
widget=widget, # pass the widget instance for easy access to settings
)
),
timeout=1000,
)
)
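To try it out: select the Painting layer, pick a label (for example 1 for background and 2 for the zebrafish sample), paint a few strokes, and release the mouse. After roughly a second (the debounce timeout above is 1000 ms) the model refits and the Prediction layer updates; moving a dims slider or the camera triggers the same update.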