Commit d5a4975

feat: ✨ Add COSEM models and update README.

rhoadesScholar committed Mar 7, 2024
1 parent 6311999 commit d5a4975
Showing 29 changed files with 1,061 additions and 1 deletion.
27 changes: 27 additions & 0 deletions README.md
@@ -4,3 +4,30 @@
[![black](https://github.com/janelia-cellmap/cellmap.models/actions/workflows/black.yaml/badge.svg)](https://github.com/janelia-cellmap/cellmap.models/actions/workflows/black.yaml)
[![mypy](https://github.com/janelia-cellmap/cellmap.models/actions/workflows/mypy.yaml/badge.svg)](https://github.com/janelia-cellmap/cellmap.models/actions/workflows/mypy.yaml)
[![codecov](https://codecov.io/gh/janelia-cellmap/cellmap.models/branch/main/graph/badge.svg)](https://codecov.io/gh/janelia-cellmap/cellmap.models)

This package contains the models used for segmentation by the CellMap project team at HHMI Janelia.

## Installation

```bash
git clone https://github.com/janelia-cellmap/cellmap.models
cd cellmap.models
conda create -n cellmap python=3.10
conda activate cellmap
pip install .
```

## Usage

```python
import cellmap.models
```

Different models are available in the `cellmap.models` module. For example, to use the models produced by the `COSEM` pilot project team and published as part of [Whole-cell organelle segmentation in volume electron microscopy](https://doi.org/10.1038/s41586-021-03977-3):

```python
import cellmap.models.cosem as cosem_models
model = cosem_models.load_model('setup04/1820500')
```

More information on each set of models and how to use them is available in the `README.md` file in the corresponding subdirectory.
6 changes: 5 additions & 1 deletion pyproject.toml
@@ -16,7 +16,11 @@ authors = [
{ email = "[email protected]", name = "CellMap" },
]
dynamic = ["version"]
dependencies = []
dependencies = [
'torch',
'torchvision',
'numpy'
]

[project.optional-dependencies]
dev = [
55 changes: 55 additions & 0 deletions src/cellmap.models.egg-info/PKG-INFO
@@ -0,0 +1,55 @@
Metadata-Version: 2.1
Name: cellmap.models
Version: 0.0.0
Summary: Repository of model architectures and network weights used for CellMap segmentations.
Author-email: CellMap <[email protected]>
License: BSD 3-Clause License
Project-URL: homepage, https://github.com/janelia-cellmap/cellmap.models
Project-URL: repository, https://github.com/janelia-cellmap/cellmap.models
Classifier: Programming Language :: Python :: 3
Requires-Python: >=3.7
Description-Content-Type: text/markdown
Requires-Dist: torch
Requires-Dist: torchvision
Requires-Dist: numpy
Provides-Extra: dev
Requires-Dist: pytest; extra == "dev"
Requires-Dist: pytest-cov; extra == "dev"
Requires-Dist: black; extra == "dev"
Requires-Dist: mypy; extra == "dev"
Requires-Dist: pdoc; extra == "dev"
Requires-Dist: pre-commit; extra == "dev"

# cellmap.models

[![tests](https://github.com/janelia-cellmap/cellmap.models/actions/workflows/tests.yaml/badge.svg)](https://github.com/janelia-cellmap/cellmap.models/actions/workflows/tests.yaml)
[![black](https://github.com/janelia-cellmap/cellmap.models/actions/workflows/black.yaml/badge.svg)](https://github.com/janelia-cellmap/cellmap.models/actions/workflows/black.yaml)
[![mypy](https://github.com/janelia-cellmap/cellmap.models/actions/workflows/mypy.yaml/badge.svg)](https://github.com/janelia-cellmap/cellmap.models/actions/workflows/mypy.yaml)
[![codecov](https://codecov.io/gh/janelia-cellmap/cellmap.models/branch/main/graph/badge.svg)](https://codecov.io/gh/janelia-cellmap/cellmap.models)

This package contains the models used for segmentation by the CellMap project team at HHMI Janelia.

## Installation

```bash
git clone https://github.com/janelia-cellmap/cellmap.models
cd cellmap.models
conda create -n cellmap python=3.10
conda activate cellmap
pip install .
```

## Usage

```python
import cellmap.models
```

Different models are available in the `cellmap.models` module. For example, to use the models produced by the `COSEM` pilot project team and published as part of [Whole-cell organelle segmentation in volume electron microscopy](https://doi.org/10.1038/s41586-021-03977-3):

```python
import cellmap.models.cosem as cosem_models
model = cosem_models.load_model('setup04/1820500')
```

More information on each set of models and how to use them is available in the `README.md` file in the corresponding subdirectory.
8 changes: 8 additions & 0 deletions src/cellmap.models.egg-info/SOURCES.txt
@@ -0,0 +1,8 @@
README.md
pyproject.toml
src/cellmap.models.egg-info/PKG-INFO
src/cellmap.models.egg-info/SOURCES.txt
src/cellmap.models.egg-info/dependency_links.txt
src/cellmap.models.egg-info/requires.txt
src/cellmap.models.egg-info/top_level.txt
tests/test_assert.py
1 change: 1 addition & 0 deletions src/cellmap.models.egg-info/dependency_links.txt
@@ -0,0 +1 @@

11 changes: 11 additions & 0 deletions src/cellmap.models.egg-info/requires.txt
@@ -0,0 +1,11 @@
torch
torchvision
numpy

[dev]
pytest
pytest-cov
black
mypy
pdoc
pre-commit
1 change: 1 addition & 0 deletions src/cellmap.models.egg-info/top_level.txt
@@ -0,0 +1 @@

2 changes: 2 additions & 0 deletions src/cellmap.models/__init__.py
@@ -1,3 +1,5 @@
"""
.. include:: ../../README.md
"""

from .utils import download_url_to_file
Empty file.
Binary file added src/cellmap.models/pytorch/cosem/.DS_Store
Binary file not shown.
44 changes: 44 additions & 0 deletions src/cellmap.models/pytorch/cosem/README.md
@@ -0,0 +1,44 @@
# COSEM Trained PyTorch Networks

This repository contains the COSEM trained networks, converted to PyTorch. The original COSEM repository can be found [here](https://open.quiltdata.com/b/janelia-cosem-networks/tree/v0003.2/) and the original COSEM paper can be found [here](https://doi.org/10.1038/s41586-021-03977-3).

The networks have been converted to PyTorch from their original Tensorflow versions using the scripts available [here](https://github.com/pattonw/cnnectome.conversion). All models are trained on 3D data and expect input of shape `(batch_size, 1, z, y, x)`.
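As a concrete illustration of this layout, a dummy input tensor can be constructed as below. The spatial size `216` is an arbitrary placeholder for this sketch, not a value taken from the models:

```python
import torch

# Shape convention from the COSEM README: (batch_size, 1, z, y, x),
# i.e. a batch of single-channel 3D volumes.
batch_size, z, y, x = 1, 216, 216, 216
dummy_input = torch.zeros(batch_size, 1, z, y, x)
print(dummy_input.shape)  # torch.Size([1, 1, 216, 216, 216])
```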

This repository is pip-installable. Simply follow the steps below in an appropriate Python environment (Python >= 3.7), replacing `/path/to/cosem_models` with the path to this repository on your local machine:

```bash
cd /path/to/cosem_models
pip install .
```

Then you can load a model using the following code:

```python
import cosem_models
model = cosem_models.load_model('setup04/1820500')

# The model is now ready to use
```

Each model has a separate UNet backbone and a single-layer prediction head. The `unet` and `head` objects are both PyTorch modules and can be used as such. You can access the separate components of the model using the following code:

```python
import cosem_models
model = cosem_models.load_model('setup04/1820500')
unet = model.unet
head = model.prediction_head
```
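Because both components are ordinary `nn.Module`s, they can be composed, fine-tuned, or swapped independently. The sketch below uses stand-in `Conv3d` layers (hypothetical placeholders, not the real COSEM architecture) purely to illustrate the call pattern of a backbone feeding a prediction head:

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for model.unet and model.prediction_head.
unet = nn.Conv3d(in_channels=1, out_channels=8, kernel_size=3, padding=1)
head = nn.Conv3d(in_channels=8, out_channels=14, kernel_size=1)  # 14 channels, as in setup04

x = torch.zeros(1, 1, 16, 16, 16)  # (batch, channel, z, y, x)
features = unet(x)                 # backbone features
prediction = head(features)        # per-class output maps
print(prediction.shape)            # torch.Size([1, 14, 16, 16, 16])
```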

The models' prediction heads have the following numbers of output channels:
- setup04 - 14
- setup26.1 - 3
- setup28 - 2
- setup36 - 2
- setup46 - 2

Once a model is loaded, this information is available via the `model.classes_out` attribute. In addition, the minimum input size, the step size for increasing the input size, and the minimum output size are available via the `model.min_input_size`, `model.input_size_step`, and `model.min_output_size` attributes, respectively.
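Using hypothetical values for these attributes (the real values depend on the specific setup and come from the loaded model), valid input sizes along one axis are the minimum size plus an integer multiple of the step:

```python
# Hypothetical example values; in practice read them from
# model.min_input_size and model.input_size_step after loading.
min_input_size = 216
input_size_step = 18

def valid_input_sizes(n):
    """Return the first n valid input sizes along one axis."""
    return [min_input_size + k * input_size_step for k in range(n)]

print(valid_input_sizes(4))  # [216, 234, 252, 270]
```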

The model weights we most frequently use are `setup04/1820500` and `setup04/975000`.
Empty file.