We'll be using conda or mamba (faster) as a package manager here, depending on what is installed on the teaching machines. This allows you to set up the entire environment with a single command.
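If you are unsure which of the two is available, you can check from a terminal; either one responding is sufficient, and in general `mamba` can be substituted for `conda` in the environment-creation commands below:

```bash
# check which package manager is installed; either one is fine
conda --version
mamba --version
```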
- create the conda environment from the `environment.yaml` file

  ```bash
  # it's recommended to use mamba for faster installation, or to set libmamba as the default solver:
  # conda config --set solver libmamba
  conda env create -f environment.yaml -y

  # alternatively, if you already have a conda environment you'd like to use, you can update it like this
  conda env update --name myenv --file environment.yaml --prune
  ```
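  To confirm that the environment was created, you can list your conda environments; the environment defined in `environment.yaml` (activated as `spatialdata-workshop` in the next step) should appear in the output:

  ```bash
  # the spatialdata-workshop environment should appear in the list
  conda env list
  ```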
- activate the environment

  ```bash
  conda activate spatialdata-workshop
  ```
- register the conda environment in Jupyter

  ```bash
  python -m ipykernel install --user --name spatialdata-workshop --display-name "Python (SpatialData Workshop)"
  ```
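  To check that the kernel was registered, you can list the available Jupyter kernelspecs; an entry named `spatialdata-workshop` should be present:

  ```bash
  # the spatialdata-workshop kernel should appear in the list
  jupyter kernelspec list
  ```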
- Optionally: Set up auto-completion inside of Jupyter Notebooks

  ```bash
  pip install jupyter_tabnine
  jupyter contrib nbextension install --user
  jupyter nbextension install --py jupyter_tabnine --user
  jupyter nbextension enable --py jupyter_tabnine --user
  jupyter serverextension enable --py jupyter_tabnine --user
  ```
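  If you want to confirm that the extension was installed and enabled, Jupyter can list its notebook and server extensions:

  ```bash
  # jupyter_tabnine should show up as enabled in both lists
  jupyter nbextension list
  jupyter serverextension list
  ```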
- If at any point you modify the `environment.yaml` and want to update the environment, you can do this with

  ```bash
  conda env update --name spatialdata-workshop --file environment.yaml --prune
  ```
- Activate the environment as explained above

  ```bash
  conda activate spatialdata-workshop
  ```
- Run the following commands to download the data

  ```bash
  # download the raw data
  python download.py --data_dir data raw visium
  python download.py --data_dir data raw visium_hd
  python download.py --data_dir data raw xenium

  # download some already processed data
  python download.py --data_dir data zarr merfish
  ```
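  A quick way to confirm that the download finished is to list the contents of the target directory; the exact subfolder layout depends on `download.py`, so treat this check as a rough sanity check rather than an exact expected output:

  ```bash
  # inspect what was downloaded into the data directory
  ls -lh data
  ```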
Notes on the data:
- The datasets have been manually processed to reduce the file size by removing certain data and metadata or by downscaling some of the images. Due to the reduced size, the datasets are representative neither of the original data nor of the technologies used to profile them. Therefore, they should not be used for any scientific example or for comparisons across technologies.
- The MERFISH dataset is from the Allen Institute prototype MERFISH pipeline and is not representative of the new commercial MERFISH technology. We chose the former because it is lightweight and already analyzed.
- Activate the environment as explained above

  ```bash
  conda activate spatialdata-workshop
  ```
- Start JupyterLab

  ```bash
  jupyter-lab
  ```
Here you can find a list of our past workshops, including the respective notebooks and slides.
- 2024/09/09: (Tim Treis) BioTrac Spatial Biology Symposium notebooks slides
- 2024/09/11: (Luca Marconato) scverse conference workshop notebooks slides
- 2024/09/25: (Wouter-Michiel Vierdag, Luca Marconato) EBI 'Advances in spatial omics' webinar series; (Luca Marconato) FoG Live notebooks slides
- 2024/10/22: (Luca Marconato) BIOINFO 2024, Gyeongju (South Korea) notebooks slides
In addition to the material we provide, you may also consider the following workshops:
- 2024/12/09: (VIB Spatial Catalyst team) Targeted spatial transcriptomics data analysis notebooks. A hands-on introduction to the analysis of targeted spatial transcriptomics data using the SPArrOW pipeline developed by the Yvan Saeys group (VIB).