
Feature documentation #80

Merged: 70 commits, Nov 7, 2023

Commits (70):
41aecbf
Added Sphinx documentation
jeipollack Oct 13, 2023
8d457ce
Added workflow_dispatch for testing
jeipollack Oct 13, 2023
266284e
Added CD workflow
jeipollack Oct 13, 2023
ef6f305
Added workflow_dispatch key for manual run
jeipollack Oct 13, 2023
3536c01
Added feature_documentation branch
jeipollack Oct 13, 2023
c68308c
Corrected CD error wrong project name
jeipollack Oct 13, 2023
9815241
Provided module path for documentation
jeipollack Oct 13, 2023
8d2d56b
Files to generate API doc
jeipollack Oct 16, 2023
3471111
Added _static to source/
jeipollack Oct 16, 2023
8e2ef0d
Removed unused code
jeipollack Oct 16, 2023
9de0b86
Added some documentation
jeipollack Oct 18, 2023
c026cf2
Applied some style changes
jeipollack Oct 18, 2023
7ad9137
Added to test build
jeipollack Oct 18, 2023
6b40b3d
Added missing dependency
jeipollack Oct 18, 2023
b68835b
Updated descriptions and param ordering
jeipollack Oct 22, 2023
6d2858a
Updated extensions and options
jeipollack Oct 22, 2023
f1c2c2a
Changed formatting of refs
jeipollack Oct 22, 2023
8235ebd
Modified text
jeipollack Oct 22, 2023
a7c43ab
Changes to text
jeipollack Oct 22, 2023
f21a345
Added new section Running WaveDiff
jeipollack Oct 22, 2023
5c2ebfa
Added space
jeipollack Oct 22, 2023
7bbea14
Modified docstring
jeipollack Oct 22, 2023
0bf9daf
Added new sections
jeipollack Oct 22, 2023
760450b
Added dependency
jeipollack Oct 22, 2023
947f8ca
Changed naming of image to solve GH-pages prob
jeipollack Oct 22, 2023
5d5708e
Update to gitignore
jeipollack Oct 22, 2023
8cdb54c
Added images for documentation
jeipollack Oct 22, 2023
e06861e
Normalised text
jeipollack Oct 23, 2023
369687f
Added a cross-reference
jeipollack Oct 23, 2023
d09211e
Normalised text
jeipollack Oct 23, 2023
77fe79e
Removed requirements.txt that was re-added by accident
jeipollack Oct 23, 2023
5ef9c72
Clarified text and added ref
jeipollack Oct 23, 2023
7b6c8d9
Normalised text
jeipollack Oct 23, 2023
7e747d7
Added pydocstyle
jeipollack Oct 23, 2023
8125ea2
Removed unused dir
jeipollack Oct 23, 2023
6135763
Removed pydocstyle tbd next release
jeipollack Oct 23, 2023
2722900
Added cross-ref for master_config file
jeipollack Oct 23, 2023
d6da784
Update README.md
jeipollack Oct 23, 2023
65b98fe
Update README.md correcting hyperlinks
jeipollack Oct 23, 2023
3263f8d
Added missing zernike code ref
jeipollack Oct 23, 2023
dd75e22
Edits to text
jeipollack Oct 23, 2023
b34b9c2
Updated .gitignore
jeipollack Oct 24, 2023
83b8e7f
Disable pydocstyle check
jeipollack Oct 31, 2023
5d34527
Merge branch 'feature_documentation' of https://github.com/CosmoStat/…
jeipollack Oct 31, 2023
36547c2
Updated gitignore
jeipollack Oct 31, 2023
f94afc0
Removed duplicated n_epoch parameters
jeipollack Oct 31, 2023
11c690a
Removed dependencies from yml
jeipollack Oct 16, 2023
672889d
Added pytest to dependency list
jeipollack Oct 16, 2023
308df36
Corrected error in dependencies installation cmd
jeipollack Oct 16, 2023
f02bc10
Removed pytest from dependency list for testing
jeipollack Oct 16, 2023
1756445
Removed files not used in build arch PR comment
jeipollack Oct 16, 2023
87e78c3
Merge branch 'dummy_main' into feature_documentation
jeipollack Oct 31, 2023
a33d9a7
Update docs/source/installation.md first paragraph
jeipollack Oct 31, 2023
3690526
Update docs/source/installation.md note section
jeipollack Oct 31, 2023
78cd2ae
Update docs/source/configuration.md fix bad word choice
jeipollack Oct 31, 2023
f184ccd
Update docs/source/configuration.md with better word choice
jeipollack Oct 31, 2023
0656488
Update docs/source/configuration.md
jeipollack Oct 31, 2023
9e2ac01
Update docs/source/configuration.md
jeipollack Oct 31, 2023
e376e78
Update docs/source/configuration.md
jeipollack Oct 31, 2023
012eb52
Update docs/source/configuration.md with minor corrections
jeipollack Oct 31, 2023
6d521ad
Removed sentence
jeipollack Oct 31, 2023
a73fbd0
Improvements to text docs/source/configuration.md
jeipollack Oct 31, 2023
39445dd
Modified text in training configuration section
jeipollack Oct 31, 2023
0216831
New text edits basic_exec & config
jeipollack Oct 31, 2023
3ae382d
Removed info.py
jeipollack Nov 6, 2023
feb172d
Removed a metrics config param from training config
jeipollack Nov 6, 2023
46af5b3
Edits to the docs
jeipollack Nov 6, 2023
e9b3bdd
Corrections to configuration.md
jeipollack Nov 6, 2023
920177f
Removed workflow dispatch trigger
jeipollack Nov 6, 2023
a58fd20
Removed old unused plotting module
jeipollack Nov 7, 2023
44 changes: 44 additions & 0 deletions .github/workflows/cd.yml
@@ -0,0 +1,44 @@

```yaml
# This workflow will install Python dependencies, run tests and lint with a single version of Python
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python

name: CD

on:
  workflow_dispatch:
  push:
    branches:
      - feature_documentation
      - dummy_main

jobs:
  docs:
    name: Deploy API documentation
    runs-on: [ubuntu-latest]

    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Set up Python 3.10.5
        uses: actions/setup-python@v3
        with:
          python-version: "3.10.5"

      - name: Check Python Version
        run: python --version

      - name: Install dependencies
        run: |
          python -m pip install ".[docs]"

      - name: Build API documentation
        run: |
          sphinx-apidoc -Mfeo docs/source src/wf_psf
          sphinx-build docs/source docs/build

      - name: Deploy API documentation
        uses: peaceiris/[email protected]
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: docs/build
```
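The two build steps of this workflow can be reproduced locally once the docs extras are installed (`python -m pip install ".[docs]"`). The sketch below is illustrative only: it spells out the bundled `-Mfeo` flags of `sphinx-apidoc` (`-M` module docs first, `-f` force overwrite, `-e` one page per module, `-o` output directory) and leaves the actual invocation commented out so nothing runs by accident.

```python
# Hypothetical local reproduction of the workflow's "Build API documentation"
# step; assumes a checked-out repo with `pip install ".[docs]"` already done.
import subprocess  # only needed if the commented-out calls are enabled

docs_build_steps = [
    # -Mfeo == -M (module-first) -f (force) -e (separate pages) -o <outdir>
    ["sphinx-apidoc", "-M", "-f", "-e", "-o", "docs/source", "src/wf_psf"],
    ["sphinx-build", "docs/source", "docs/build"],
]

for cmd in docs_build_steps:
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment to actually build the docs
```

The deploy step itself (pushing `docs/build` to GitHub Pages) is handled by the `peaceiris` action in CI and has no local equivalent here.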
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -1,7 +1,7 @@
# This workflow will install Python dependencies, run tests and lint with a single version of Python
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python

name: CI-test
name: CI

on:
pull_request:
10 changes: 8 additions & 2 deletions .gitignore
@@ -5,6 +5,7 @@ __pycache__/

# Remove method comparison data
method-comparison/compatible-datasets/*
tf_notebooks

# Log files from slurm
*.err
@@ -66,6 +67,7 @@ coverage.xml
*.py,cover
.hypothesis/
.pytest_cache/
src/wf_psf/pytest.xml

# Translations
*.mo
@@ -86,6 +88,11 @@ instance/

# Sphinx documentation
docs/_build/
docs/source/wf_psf*.rst
docs/source/_static/file.png
docs/source/_static/images/logo_colab.png
docs/source/_static/minus.png
docs/source/_static/plus.png

# PyBuilder
target/
@@ -144,5 +151,4 @@ dmypy.json
# Pyre type checker
.pyre/

# UML plots
*.png

108 changes: 5 additions & 103 deletions README.md
@@ -3,109 +3,11 @@
<h1 align='center'>WaveDiff</h1>
<h2 align='center'>A differentiable data-driven wavefront-based PSF modelling framework.</h2>

WaveDiff is a differentiable PSF modelling pipeline constructed with [Tensorflow](https://github.com/tensorflow/tensorflow). It was developed at the [CosmoStat lab](https://www.cosmostat.org) at CEA Paris-Saclay.

See the [documentation](https://cosmostat.github.io/wf-psf/) for details on how to install and run WaveDiff.

This repository includes:
- A differentiable PSF model entirely built in [Tensorflow](https://github.com/tensorflow/tensorflow).
- A numpy-based PSF simulator [here](https://github.com/tobias-liaudat/wf-psf/blob/main/wf_psf/SimPSFToolkit.py).
- A [numpy-based PSF simulator](https://github.com/CosmoStat/wf-psf/tree/dummy_main/src/wf_psf/sims).
- All the scripts, jobs and notebooks required to reproduce the results in [arXiv:2203.04908](http://arxiv.org/abs/2203.04908) and [arXiv:2111.12541](https://arxiv.org/abs/2111.12541).

~~For more information on how to use the WaveDiff model through configurable scripts see the `long-runs` directory's [README](https://github.com/tobias-liaudat/wf-psf/blob/main/long-runs/README.md).~~ (Scripts will become obsolete with the next release.)

## Proposed framework

A schematic of the proposed framework can be seen below. The PSF model is estimated (trained) using star observations in the field-of-view.

<img height=300 src="assets/PSF_model_diagram_v6.png" >

<!-- Visual reconstruction example of the WaveDiff-original PSF model trained on a simplified Euclid-like setting.

<img height=800 src="assets/PSF_reconstruction_example.png" > -->


## Requirements
- [numpy](https://github.com/numpy/numpy) [>=1.19.2]
- [scipy](https://github.com/scipy/scipy) [>=1.5.2]
- [TensorFlow](https://www.tensorflow.org/) [==2.4.1]
- [TensorFlow Addons](https://github.com/tensorflow/addons) [==0.12.1]
- [Astropy](https://github.com/astropy/astropy) [==4.2]
- [zernike](https://github.com/jacopoantonello/zernike) [==0.0.31]
- [opencv-python](https://github.com/opencv/opencv-python) [>=4.5.1.48]
- [pillow](https://github.com/python-pillow/Pillow) [>=8.1.0]
- [galsim](https://github.com/GalSim-developers/GalSim) [>=2.3.1]

Optional packages:
- [matplotlib](https://github.com/matplotlib/matplotlib) [=3.3.2]
- [seaborn](https://github.com/mwaskom/seaborn) [>=0.11]

## Install

`wf-psf` is pure python and can be easily installed with `pip`. After cloning the repository, run the following commands:

```bash
$ cd wf-psf
$ git checkout dummy_main
$ pip install .
```

The package can then be imported in Python as `import wf_psf as wf`. ~~We recommend using the release `1.2.0` for stability as the current main branch is under development.~~

## Running `WaveDiff`

To run `WaveDiff`, we prepared a step-by-step [instruction guide](https://github.com/CosmoStat/wf-psf/wiki/Getting-started-tutorial).

[Read the tutorial to get started!](https://github.com/CosmoStat/wf-psf/wiki/Getting-started-tutorial)

## Reproducible research

#### [arXiv:2203.04908](http://arxiv.org/abs/2203.04908) Rethinking data-driven point spread function modeling with a differentiable optical model (2022)
_Submitted._

- Use the release 1.2.0.
- All the scripts, jobs and notebooks to reproduce the figures from the article can be found [here](https://github.com/tobias-liaudat/wf-psf/tree/main/papers/article_IOP).
- The trained PSF models are found [here](https://github.com/tobias-liaudat/wf-psf/tree/main/papers/article_IOP/data/models).
- The input PSF field can be found [here](https://github.com/tobias-liaudat/wf-psf/tree/main/data).
- The script used to generate the input PSF field is [this one](https://github.com/tobias-liaudat/wf-psf/blob/main/long-runs/LR-PSF-field-gen-coherentFields.py).
- The code required to run the comparison against pixel-based PSF models is in [this directory](https://github.com/tobias-liaudat/wf-psf/tree/main/method-comparison).
- The training of the models was done using [this script](https://github.com/tobias-liaudat/wf-psf/blob/main/long-runs/train_eval_plot_script_click.py). In order to match the script's option for the different models with the article you should follow:
- `poly->WaveDiff-original`
- `graph->WaveDiff-graph`
- `mccd->WaveDiff-Polygraph`

_Note: To run the comparison to other PSF models you need to install them first. See [RCA](https://github.com/CosmoStat/rca), [PSFEx](https://github.com/astromatic/psfex) and [MCCD](https://github.com/CosmoStat/mccd)._


#### [arXiv:2111.12541](https://arxiv.org/abs/2111.12541) Rethinking the modeling of the instrumental response of telescopes with a differentiable optical model (2021)
_NeurIPS 2021 Workshop on Machine Learning and the Physical Sciences._

- Use the release 1.2.0.
- All the scripts, jobs and notebooks to reproduce the figures from the article can be found [here](https://github.com/tobias-liaudat/wf-psf/tree/main/papers/Neurips2021_ML4Physics_workshop).



## Citation

If you use `wf-psf` in a scientific publication, we would appreciate citations to the following paper:

*Rethinking data-driven point spread function modeling with a differentiable optical model*, T. Liaudat, J.-L. Starck, M. Kilbinger, P.-A. Frugier, [arXiv:2203.04908](http://arxiv.org/abs/2203.04908), 2022.


The BibTeX citation is the following:
```
@misc{https://doi.org/10.48550/arxiv.2203.04908,
doi = {10.48550/ARXIV.2203.04908},

url = {https://arxiv.org/abs/2203.04908},

author = {Liaudat, Tobias and Starck, Jean-Luc and Kilbinger, Martin and Frugier, Pierre-Antoine},

keywords = {Instrumentation and Methods for Astrophysics (astro-ph.IM), Computer Vision and Pattern Recognition (cs.CV), FOS: Physical sciences, FOS: Physical sciences, FOS: Computer and information sciences, FOS: Computer and information sciences},

title = {Rethinking data-driven point spread function modeling with a differentiable optical model},

publisher = {arXiv},

year = {2022},

copyright = {arXiv.org perpetual, non-exclusive license}
}
```

6 changes: 3 additions & 3 deletions config/data_config.yaml
@@ -1,10 +1,10 @@
# Training and test datasets for training and/or metrics evaluation
# Training and test data sets for training and/or metrics evaluation
data:
training:
# Specify directory path to data; Default setting is /path/to/repo/data
data_dir: data/coherent_euclid_dataset/
file: train_Euclid_res_200_TrainStars_id_001.npy
# if training dataset file does not exist, generate a new one by setting values below
# if training data set file does not exist, generate a new one by setting values below
stars: null
positions: null
SEDS: null
@@ -28,7 +28,7 @@ data:
test:
data_dir: data/coherent_euclid_dataset/
file: test_Euclid_res_id_001.npy
# If test dataset file not provided produce a new one
# If test data set file not provided produce a new one
stars: null
noisy_stars: null
positions: null
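As a quick illustration of how a `data_config.yaml` like the one above is typically consumed, the sketch below loads a trimmed-down copy of the file and assembles the training data path. The keys mirror the file shown, but the loading code is a hedged sketch (assuming PyYAML), not WaveDiff's actual implementation.

```python
# Illustrative only: parse a minimal data config and resolve the training file.
import yaml
from pathlib import Path

config_text = """
data:
  training:
    data_dir: data/coherent_euclid_dataset/
    file: train_Euclid_res_200_TrainStars_id_001.npy
    stars: null
  test:
    data_dir: data/coherent_euclid_dataset/
    file: test_Euclid_res_id_001.npy
"""

cfg = yaml.safe_load(config_text)["data"]
train_path = Path(cfg["training"]["data_dir"]) / cfg["training"]["file"]
print(train_path)

# `stars: null` in YAML parses to Python None — the sentinel the config uses
# to mean "generate a new data set instead of loading one".
assert cfg["training"]["stars"] is None
```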
18 changes: 9 additions & 9 deletions config/metrics_config.yaml
@@ -4,21 +4,21 @@ metrics:
# Choose the training cycle for which to evaluate the psf_model. Can be: 1, 2, ...
saved_training_cycle: 2
# Metrics-only run: Specify model_params for a pre-trained model else leave blank if running training + metrics
# Specify path to Trained Model
trained_model_path: </path/to/trained/model>
# Specify path to Parent Directory of Trained Model
trained_model_path: </path/to/parent/directory/of/trained/model>
# Path to Trained Model Config file inside /trained_model_path/ parent directory
trained_model_config: </path/to/trained/model>
# Name of Plotting Config file - Enter name of yaml file to run plot metrics else if empty run metrics evaluation only
plotting_config: <enter name of plotting_config .yaml file or leave empty>
trained_model_config: </path/to/trained/model/config/file>
#Evaluate the monochromatic RMSE metric.
eval_mono_metric_rmse: True
#Evaluate the OPD RMSE metric.
eval_opd_metric_rmse: True
#Evaluate the super-resolution and the shape RMSE metrics for the train dataset.
eval_train_shape_sr_metric_rmse: True
# Name of Plotting Config file - Enter name of yaml file to run plot metrics else if empty run metrics evaluation only
plotting_config: <enter name of plotting_config .yaml file or leave empty>
ground_truth_model:
model_params:
#Model used as ground truth for the evaluation. Options are: 'poly', 'physical'.
#Model used as ground truth for the evaluation. Options are: 'poly' for polychromatic and 'physical' [not available].
model_name: poly

# Evaluation parameters
@@ -72,15 +72,15 @@
#Zernike polynomial modes to use on the parametric part.
n_zernikes: 45

#Max polynomial degree of the parametric part. chg to max_deg_param
#Max polynomial degree of the parametric part.
d_max: 2

#Flag to save optimisation history for parametric model
save_optim_history_param: true

# Hyperparameters for non-parametric model
nonparam_hparams:
#Max polynomial degree of the non-parametric part. chg to max_deg_nonparam
#Max polynomial degree of the non-parametric part.
d_max_nonparam: 5

# Number of graph features
@@ -96,7 +96,7 @@ metrics:
reset_dd_features: False

#Flag to save optimisation history for non-parametric model
save_optim_history_nonparam: true
save_optim_history_nonparam: True

metrics_hparams:
# Batch size to use for the evaluation.
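The corrected comments in this diff distinguish the *parent directory* of a trained model (`trained_model_path`) from the config file stored beneath it (`trained_model_config`). A small, purely illustrative sketch of how those two settings combine — all path names here are hypothetical:

```python
# Illustrative only: resolve a trained-model config file relative to the
# trained model's parent directory, as the metrics config comments describe.
from pathlib import Path

trained_model_path = Path("/work/wf-outputs/wf-outputs-202310311226")  # parent directory
trained_model_config = "config/training_config.yaml"                   # relative to parent

resolved = trained_model_path / trained_model_config
print(resolved)
```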
5 changes: 3 additions & 2 deletions config/plotting_config.yaml
@@ -1,12 +1,13 @@
plotting_params:
# Specify path to parent folder containing wf-psf metrics outputs for all runs, ex: $WORK/wf-outputs/
# Specify path to parent folder containing wf-outputs-xxxxxxxxxxx for all runs, ex: $WORK/wf-outputs/
metrics_output_path: <PATH>
# List directory(s) for metrics of trained PSF models to include in plot,
# List all of the parent output directories (i.e. wf-outputs-xxxxxxxxxxx) that contain metrics results to be included in the plot
metrics_dir:
# - wf-outputs-xxxxxxxxxxx1
# - wf-outputs-xxxxxxxxxxx2
# List of name of metric config file to add to plot (would like to change such that code goes and finds them in the metrics_dir)
metrics_config:
# - metrics_config_1.yaml
# - metrics_config_2.yaml
# Show Plots Flag
plot_show: False
29 changes: 13 additions & 16 deletions config/training_config.yaml
@@ -1,10 +1,11 @@
training:
# ID name
# Run ID name
id_name: -coherent_euclid_200stars
# Name of Data Config file
data_config: data_config.yaml
# Metrics Config file - Enter file to run metrics evaluation else if empty run train only
metrics_config: metrics_config.yaml
# PSF model parameters
model_params:
# Model type. Options are: 'mccd', 'graph', 'poly', 'param', 'poly_physical'.
model_name: poly
@@ -50,16 +51,16 @@ training:

# Hyperparameters for Parametric model
param_hparams:
# Random seed for Tensor Flow Initialization
# Set the random seed for Tensor Flow Initialization
random_seed: 3877572

# Parameter for the l2 loss function for the Optical path differences (OPD)/WFE
# Set the parameter for the l2 loss function for the Optical path differences (OPD)/WFE
l2_param: 0.

#Zernike polynomial modes to use on the parametric part.
n_zernikes: 15

#Max polynomial degree of the parametric part. chg to max_deg_param
#Max polynomial degree of the parametric part.
d_max: 2

#Flag to save optimisation history for parametric model
@@ -87,35 +88,31 @@ training:

# Training hyperparameters
training_hparams:
n_epochs_params: [2, 2]

n_epochs_non_params: [2, 2]

# Batch Size
batch_size: 32

# Multi-cyclic Parameters
multi_cycle_params:

# Total amount of cycles to perform.
# Total number of cycles to perform for training.
total_cycles: 1

# Train cycle definition. It can be: 'parametric', 'non-parametric', 'complete', 'only-non-parametric' and 'only-parametric'."
cycle_def: complete

# Make checkpoint at every cycle or just save the checkpoint at the end of the training."
# Flag to save all cycles. If "True", create a checkpoint at every cycle, else if "False" only save the checkpoint at the end of the training."
save_all_cycles: False

#"Saved cycle to use for the evaluation. Can be 'cycle1', 'cycle2', ..."
saved_cycle: cycle1

# Learning rates for the parametric parts. It should be a str where numeric values are separated by spaces.
# Learning rates for training the parametric model parameters per cycle.
learning_rate_params: [1.0e-2, 1.0e-2]

# Learning rates for the non-parametric parts. It should be a str where numeric values are separated by spaces."
# Learning rates for training the non-parametric model parameters per cycle.
learning_rate_non_params: [1.0e-1, 1.0e-1]

# Number of training epochs of the parametric parts. It should be a strign where numeric values are separated by spaces."
# Number of training epochs for training the parametric model parameters per cycle.
n_epochs_params: [20, 20]

# Number of training epochs of the non-parametric parts. It should be a str where numeric values are separated by spaces."
# Number of training epochs for training the non-parametric model parameters per cycle.
n_epochs_non_params: [100, 120]
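The multi-cycle parameters above pair a `total_cycles` count with per-cycle lists (learning rates and epoch counts for the parametric and non-parametric parts). A hedged sketch of a consistency check one might apply to such values — this check and the sample values are illustrative, not part of WaveDiff itself:

```python
# Illustrative only: verify each per-cycle hyperparameter list provides at
# least one entry per training cycle.
multi_cycle = {
    "total_cycles": 2,
    "learning_rate_params": [1.0e-2, 1.0e-2],
    "learning_rate_non_params": [1.0e-1, 1.0e-1],
    "n_epochs_params": [20, 20],
    "n_epochs_non_params": [100, 120],
}

per_cycle_keys = [k for k in multi_cycle if k != "total_cycles"]
ok = all(
    len(multi_cycle[k]) >= multi_cycle["total_cycles"] for k in per_cycle_keys
)
print("per-cycle lists cover all cycles:", ok)
```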

20 changes: 20 additions & 0 deletions docs/Makefile
@@ -0,0 +1,20 @@

```make
# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS    ?=
SPHINXBUILD   ?= sphinx-build
SOURCEDIR     = source
BUILDDIR      = build

# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
```