{
"cells": [
{
"cell_type": "markdown",
"id": "4c0f45d7",
"metadata": {},
"source": [
"# The siibra tool suite – interfaces to a multilevel brain atlas spanning scales and modalities\n",
"\n",
"Studying the human brain requires capturing its structural and functional organization in a common spatial framework. siibra is a tool suite that implements a multilevel atlas of the human brain by providing access to reference templates at different spatial scales, complementary parcellation maps, and multimodal data features.\n",
"\n",
"For interactive and explorative use, the tool suite includes a web-based 3D viewer hosted [on EBRAINS](https://atlases.ebrains.eu/viewer). [siibra-python](https://siibra-python.readthedocs.io) is a software library for using the framework in computational workflows, and provides good compatibility with established (neuro)data science tools such as [nibabel](https://nipy.org/nibabel/) and [pandas](https://pandas.pydata.org).\n",
"\n",
"This notebook walks you through some examples prepared for presenting siibra at the OHBM 2023 conference. It uses a recent development snapshot of siibra."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "78209d92",
"metadata": {},
"outputs": [],
"source": [
"!pip install -U --user siibra"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "64562619",
"metadata": {},
"outputs": [],
"source": [
"import siibra as sb\n", | ||
"from packaging.version import Version\n", | ||
"assert Version(sb.__version__) >= Version('1.0a14')\n", | ||
"from nilearn import plotting\n", | ||
"import matplotlib.pyplot as plt\n", | ||
"%matplotlib notebook" | ||
] | ||
}, | ||
{
"cell_type": "markdown",
"id": "d38c80d2",
"metadata": {},
"source": [
"# Load example image with regions of interest"
]
},
{
"cell_type": "markdown",
"id": "ef10c44c",
"metadata": {},
"source": [
"The example starts with a NIfTI image containing some regions of interest in MNI space, which we will assign to cytoarchitectonic brain regions and use for multimodal data queries. Here we just load a predefined NIfTI file, but in general you might use a thresholded functional activation map or a similar signal."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "2101caa9",
"metadata": {},
"outputs": [],
"source": [
"img = sb.volumes.from_file(\n",
"    \"https://github.com/FZJ-INM1-BDA/siibra-tutorials/raw/main/ohbm-2023-example-input.nii.gz\",\n",
"    name=\"Example input\",\n",
"    space=\"mni152\",\n",
")\n",
"\n",
"# display the image\n",
"plotting.plot_stat_map(img.fetch(), cmap='viridis', colorbar=False, draw_cross=False)"
]
},
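{
"cell_type": "markdown",
"id": "3a9d51b0",
"metadata": {},
"source": [
"As mentioned above, siibra plays well with nibabel: `fetch()` returns a standard `Nifti1Image` (the same object we pass to nilearn below). A minimal sketch inspecting it with plain nibabel accessors:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "8f2c6e1d",
"metadata": {},
"outputs": [],
"source": [
"# fetch() returns a nibabel Nifti1Image, so the usual accessors apply\n",
"nii = img.fetch()\n",
"print(nii.shape)\n",
"print(nii.affine)"
]
},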
{
"cell_type": "markdown",
"id": "89a7ccca",
"metadata": {},
"source": [
"# Assign activations to cytoarchitectonic brain regions\n",
"\n",
"We request the Julich-Brain probabilistic cytoarchitectonic maps and assign the image signal to the cytoarchitectonic structures. siibra automatically identifies the separate structures in the input image. The result is a pandas DataFrame with regions assigned to each identified structure, together with several scores for each assignment."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "979701eb",
"metadata": {},
"outputs": [],
"source": [
"# retrieve Julich-Brain probabilistic cytoarchitectonic maps\n",
"julich_pmaps = sb.get_map(\n",
"    sb.parcellations.get('julich 2.9'),\n",
"    sb.spaces.get('mni 152'),\n",
"    sb.MapType.STATISTICAL\n",
")\n",
"\n",
"# assign the input image from above to the cytoarchitectonic maps.\n",
"# This results in a pandas DataFrame, which we sort by correlation.\n",
"matches = julich_pmaps.assign(img).sort_values('correlation', ascending=False)\n",
"\n",
"# For this example, we work only with a few of the assigned structures.\n",
"# We filter by strong correlation and containedness scores.\n",
"matches = matches[(matches.correlation > 0.3) & (matches['map containedness'] < 0.25)]\n",
"\n",
"# display some columns of the filtered table\n",
"matches[['input structure', 'region', 'correlation']].round(2)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ee50f9da",
"metadata": {},
"outputs": [],
"source": [
"matches" | ||
] | ||
}, | ||
{
"cell_type": "markdown",
"id": "f5bc1b46",
"metadata": {},
"source": [
"Show the probability maps of these filtered brain region matches."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "c25bf33e",
"metadata": {},
"outputs": [],
"source": [
"fig, axs = plt.subplots(1, len(matches), figsize=(8, 4))\n",
"for i, match in enumerate(matches.itertuples()):\n",
"    plotting.plot_stat_map(\n",
"        julich_pmaps.fetch(region=match.region),\n",
"        axes=axs[i], colorbar=False, display_mode='z',\n",
"        cut_coords=[int(match.region.compute_centroids('mni152')[0][2])],\n",
"        title=match.region.name\n",
"    )"
]
},
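{
"cell_type": "markdown",
"id": "5b7e2c90",
"metadata": {},
"source": [
"For reference, we can also list the MNI centroid of each matched region, reusing the same `compute_centroids()` call that determined the cut coordinates above (a minimal sketch):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9d0a4f3e",
"metadata": {},
"outputs": [],
"source": [
"# print the centroid of each matched region in MNI152 space\n",
"# (compute_centroids is the same call used for the cut coordinates above)\n",
"for match in matches.itertuples():\n",
"    print(match.region.name, match.region.compute_centroids('mni152')[0])"
]
},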
{
"cell_type": "markdown",
"id": "2250cbd7",
"metadata": {},
"source": [
"# Query multimodal regional features\n",
"\n",
"Retrieving regional data features is as simple as specifying an atlas concept - such as a region, parcellation or space - and a feature modality. For example, to find receptor density fingerprints in the left primary visual cortex:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5231d9e8",
"metadata": {},
"outputs": [],
"source": [
"# retrieve receptor density fingerprints for v1 left\n",
"features = sb.features.get(\n",
"    sb.get_region('julich 2.9', 'v1 left'),\n",
"    sb.features.molecular.ReceptorDensityFingerprint,\n",
")\n",
"\n",
"# display the data array of the first of the returned features\n",
"features[0].data.T"
]
},
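{
"cell_type": "markdown",
"id": "7c3b8a12",
"metadata": {},
"source": [
"Each returned feature can also be visualized directly; the same `plot()` method used in the combined figure below works standalone. A minimal sketch:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "1e6f9b04",
"metadata": {},
"outputs": [],
"source": [
"# plot the receptor density fingerprint of the first returned feature\n",
"fig, ax = plt.subplots(figsize=(6, 3))\n",
"features[0].plot(ax=ax)\n",
"plt.tight_layout()"
]
},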
{
"cell_type": "markdown",
"id": "25a7d64c",
"metadata": {},
"source": [
"We now use the same mechanism to build a table of three different feature modalities measured in the selected brain regions: receptor densities, cell densities, and gene expressions. The gene expression query is slightly different, since it additionally requires specifying the genes of interest."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "34261ea7",
"metadata": {},
"outputs": [],
"source": [
"# retrieve gene expressions for v1 left\n",
"features = sb.features.get(\n",
"    sb.get_region('julich 2.9', 'v1 left'),\n",
"    sb.features.molecular.GeneExpressions,\n",
"    gene=['GABBR1', 'GABBR2']\n",
")\n",
"\n",
"# display the data array of the first of the returned features\n",
"features[0].data"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3ef089e9",
"metadata": {},
"outputs": [],
"source": [
"# show the sample locations of the gene expression measurements\n",
"plotting.plot_markers(\n",
"    node_coords=list(features[0].data.mni_xyz),\n",
"    node_values=features[0].data['sample']\n",
")"
]
},
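{
"cell_type": "markdown",
"id": "4f8d2e66",
"metadata": {},
"source": [
"The `data` attribute of the gene expression feature is a plain pandas DataFrame (its 'gene' and 'zscore' columns also feed the boxplots below), so standard pandas operations apply. A minimal sketch summarizing expression per gene:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "6a1c7d58",
"metadata": {},
"outputs": [],
"source": [
"# average expression z-score per queried gene\n",
"# ('gene' and 'zscore' are the columns used by the boxplots below)\n",
"features[0].data.groupby('gene')['zscore'].mean()"
]
},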
{
"cell_type": "code",
"execution_count": null,
"id": "0c39911b",
"metadata": {},
"outputs": [],
"source": [
"# create the plot\n",
"fig, axs = plt.subplots(\n",
"    3,\n",
"    len(matches),\n",
"    sharey='row', figsize=(9, 9)\n",
")\n",
"\n",
"# row 1 - receptor densities for different receptor types\n",
"for i, m in enumerate(matches.itertuples()):\n",
"    features = sb.features.get(\n",
"        m.region,\n",
"        sb.features.molecular.ReceptorDensityFingerprint\n",
"    )\n",
"    features[0].plot(ax=axs[0, i])\n",
"\n",
"# row 2 - cell densities per cortical layer\n",
"for i, m in enumerate(matches.itertuples()):\n",
"    features = sb.features.get(\n",
"        m.region,\n",
"        sb.features.cellular.LayerwiseCellDensity\n",
"    )\n",
"    features[0].plot(ax=axs[1, i])\n",
"\n",
"# row 3 - gene expression\n",
"for i, m in enumerate(matches.itertuples()):\n",
"    features = sb.features.get(\n",
"        m.region,\n",
"        sb.features.molecular.GeneExpressions,\n",
"        gene=['MAOA', 'TAC1', 'GABBR1', 'GABBR2']\n",
"    )\n",
"    features[0].data.boxplot(column='zscore', by='gene', ax=axs[2, i])\n",
"\n",
"# optimize plot layout\n",
"plt.tight_layout()"
]
},
{
"cell_type": "markdown",
"id": "90cd24de",
"metadata": {},
"source": [
"# Extract BigBrain image data\n",
"\n",
"Finally, we sample 3D chunks of the BigBrain volume at the locations of the identified brain regions."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ffa4d0e6",
"metadata": {},
"outputs": [],
"source": [
"# access the BigBrain reference template\n",
"bigbrain = sb.get_template('bigbrain')\n",
"\n",
"# fetch the whole-brain volume at reduced resolution\n",
"bigbrain_volume = bigbrain.fetch(resolution_mm=.64)"
]
},
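{
"cell_type": "markdown",
"id": "2d9e5a71",
"metadata": {},
"source": [
"As a quick sanity check, we can display the downsampled whole-brain volume with nilearn (a minimal sketch; `bg_img=None` because BigBrain serves as its own background, matching the plot settings below):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "0b4c8f29",
"metadata": {},
"outputs": [],
"source": [
"# show the reduced-resolution BigBrain volume\n",
"plotting.plot_img(bigbrain_volume, bg_img=None, cmap='gray', colorbar=False, draw_cross=False)"
]
},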
{
"cell_type": "code",
"execution_count": null,
"id": "21292efa",
"metadata": {},
"outputs": [],
"source": [
"# prepare the plot\n",
"f, axs = plt.subplots(2, len(matches), figsize=(9, 7))\n",
"plot_kwargs = {\n",
"    \"bg_img\": None,\n",
"    \"cmap\": 'gray',\n",
"    \"colorbar\": False,\n",
"    \"draw_cross\": False,\n",
"    \"annotate\": False,\n",
"    \"vmin\": 0,\n",
"    \"vmax\": 255\n",
"}\n",
"\n",
"# for each matched brain region, sample a random 3D location in MNI space,\n",
"# warp it to BigBrain space, and fetch a 3D chunk of 3 mm side length\n",
"# from the full-resolution BigBrain (20 micron) at this position.\n",
"for i, match in enumerate(matches.itertuples()):\n",
"\n",
"    point = julich_pmaps.sample_locations(match.region, 1)[0].warp('bigbrain')\n",
"    view = plotting.plot_img(bigbrain_volume, axes=axs[0, i], cut_coords=tuple(point), **plot_kwargs)\n",
"    view.add_markers([tuple(point)])\n",
"\n",
"    voi = point.get_enclosing_cube(width_mm=3)\n",
"    chunk = bigbrain.fetch(voi=voi, resolution_mm=0.02)\n",
"    plotting.plot_img(chunk, axes=axs[1, i], **plot_kwargs)\n",
"\n",
"plt.tight_layout()"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.13"
},
"vscode": {
"interpreter": {
"hash": "2cbf8a3ba20297bfc165c143f092fb45c4fe7d3297069e929ec9cf329479c54c"
}
}
},
"nbformat": 4,
"nbformat_minor": 5
} |