documentation review

IvoVellekoop committed Oct 11, 2024
1 parent e6f19ff commit b7d8c70
Showing 5 changed files with 30 additions and 30 deletions.
2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -78,7 +78,7 @@
in a minimal amount of code, with OpenWFS taking care of low-level hardware control, synchronization,
and troubleshooting. Algorithms can be used on different hardware or in a completely
simulated environment without changing the code. Moreover, we provide full integration with
the Micro-Manager microscope control software, enabling wavefront shaping experiments to be
executed from a user-friendly graphical user interface.
}
}
14 changes: 7 additions & 7 deletions docs/source/readme.rst
@@ -15,7 +15,7 @@ What is wavefront shaping?

Wavefront shaping (WFS) is a technique for controlling the propagation of light in arbitrarily complex structures, including strongly scattering materials :cite:`kubby2019`. In WFS, a spatial light modulator (SLM) is used to shape the phase and/or amplitude of the incident light. With a properly constructed wavefront, light can be made to focus through :cite:`Vellekoop2007`, or inside :cite:`vellekoop2008demixing` scattering materials; or light can be shaped to have other desired properties, such as optimal sensitivity for specific measurements :cite:`bouchet2021maximum`, specialized point-spread functions :cite:`boniface2017transmission`, spectral filtering :cite:`Park2012`, or for functions like optical trapping :cite:`vcivzmar2010situ`.
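
The focusing principle can be illustrated numerically. The sketch below is plain NumPy (not the OpenWFS API): it models one output speckle as the inner product of a random transmission vector with the incident field, and shows that conjugating the phase of each transmission element makes all contributions interfere constructively, giving the well-known enhancement of roughly π/4 · N for phase-only shaping with N segments.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256  # number of SLM segments

# one row of a random transmission matrix (circular Gaussian statistics)
t = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)

# average focus intensity for unshaped (random-phase) illumination
random_phases = rng.uniform(0, 2 * np.pi, size=(1000, N))
I_random = np.mean(np.abs(np.sum(t * np.exp(1j * random_phases), axis=1)) ** 2)

# phase-only conjugation: every segment's contribution arrives in phase
I_shaped = np.abs(np.sum(t * np.exp(-1j * np.angle(t)))) ** 2

enhancement = I_shaped / I_random
print(f"enhancement: {enhancement:.0f} (theory ~ pi/4 * N = {np.pi / 4 * N:.0f})")
```

The same reasoning applies per output mode for focusing inside or through a scattering sample; the simulation only replaces the physical medium by a random vector.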

An important driving force in WFS is the development of new algorithms, for example, to account for sample movement :cite:`valzania2023online` or experimental conditions :cite:`Anderson2016`, to be optimally resilient to noise :cite:`mastiani2021noise`, or to use digital twin models to compute the required correction patterns :cite:`salter2014exploring,ploschner2015seeing,Thendiyammal2020,cox2023model`. Much progress has been made towards developing fast and noise-resilient algorithms, as well as algorithms designed specifically for the methodology of wavefront shaping, such as algorithms based on Hadamard patterns :cite:`popoff2010measuring` or Fourier-based approaches :cite:`Mastiani2022`. Fast techniques that enable wavefront shaping in dynamic samples :cite:`Liu2017,Tzang2019` have also been developed, and many potential applications have been prototyped, including endoscopy :cite:`ploschner2015seeing`, optical trapping :cite:`Cizmar2010`, Raman scattering :cite:`Thompson2016`, and deep-tissue imaging :cite:`Streich2021`. Applications extend beyond microscope imaging, for instance to optimizing photoelectrochemical absorption :cite:`Liew2016` and tuning random lasers :cite:`Bachelard2014`.

With the development of these advanced algorithms, however, the complexity of WFS software is steadily increasing as the field matures, which hinders cooperation as well as end-user adoption. Code for controlling wavefront shaping tends to be complex and setup-specific, and developing this code typically requires detailed technical knowledge and low-level programming. Moreover, since many labs use their own in-house programs to control the experiments, sharing and re-using code between different research groups is troublesome.

@@ -28,18 +28,18 @@ OpenWFS is a Python package that is primarily designed for performing and for si

* **Hardware control**. Modular code for controlling spatial light modulators, cameras, and other hardware typically encountered in wavefront shaping experiments. Highlights include:

* **Spatial light modulator**. The :class:`~.slm.SLM` object provides a versatile way to control spatial light modulators, allowing for software lookup tables, synchronization, texture mapping, and texture blending functionality accelerated by OpenGL.
* **Scanning microscope**. The :class:`~.devices.ScanningMicroscope` object uses a National Instruments data acquisition card to control a laser-scanning microscope.
* **GenICam cameras**. The :class:`~.devices.Camera` object uses the ``harvesters`` backend :cite:`harvesters` to access any camera supporting the GenICam standard :cite:`genicam`.
* **Automatic synchronization**. OpenWFS provides tools for automatic synchronization of actuators (e.g. an SLM) and detectors (e.g. a camera). The automatic synchronization makes it trivial to perform pipelined measurements :cite:`ThesisVellekoop` that avoid the delay normally caused by the latency of the video card and SLM.

* **Simulation**. The ability to simulate optical experiments is essential for the rapid development and debugging of wavefront shaping algorithms. OpenWFS provides an extensive framework for testing and simulating wavefront shaping algorithms, including the effect of measurement noise, stage drift, and user-defined aberrations. This allows for rapid prototyping and testing of new algorithms without the need for physical hardware.

* **Wavefront shaping algorithms**. A growing collection of wavefront shaping algorithms. OpenWFS abstracts the hardware control, synchronization, and signal processing so that the user can focus on the algorithm itself. As a result, even advanced algorithms can be implemented in a few dozen lines of code, and automatically work with any combination of hardware and simulation tools that OpenWFS supports.

* **Platform for exchange and joint collaboration**. OpenWFS can be used as a platform for sharing and exchanging wavefront shaping algorithms. The package is designed to be modular and easy to expand, and it is our hope that the community will contribute to the package by adding new algorithms, hardware control modules, and simulation tools. Python was specifically chosen for this purpose because of its active community, high level of abstraction, and ease of sharing tools.

* **Micro-Manager compatibility**. Micro-Manager :cite:`MMoverview` is a widely used open-source microscopy control platform. The devices in OpenWFS, such as GenICam cameras and the scanning microscope, as well as all algorithms, can be controlled from Micro-Manager using the recently developed PyDevice adapter :cite:`PyDevice`, which imports Python scripts into Micro-Manager.

* **Automated troubleshooting**. OpenWFS provides tools for automated troubleshooting of wavefront shaping experiments. This includes tools for measuring the performance of wavefront shaping algorithms, and for identifying common problems such as incorrect SLM calibration, drift, measurement noise, and other experimental imperfections.
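
To give a flavor of what a wavefront shaping algorithm looks like once hardware control and synchronization are abstracted away, the sketch below implements a basic stepwise sequential algorithm against a simulated transmission matrix. It is plain NumPy rather than the OpenWFS API; `measure` is a hypothetical stand-in for whatever detector provides the feedback signal.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 64   # SLM segments
P = 4    # phase steps per segment

# simulated medium: one row of a random transmission matrix
t = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)

def measure(phases):
    """Stand-in for a detector: intensity in the target focus."""
    return np.abs(np.sum(t * np.exp(1j * phases))) ** 2

# stepwise sequential algorithm: step each segment's phase while the
# others are held fixed, and fit the optimal phase from the response
steps = 2 * np.pi * np.arange(P) / P
optimal = np.zeros(N)
for k in range(N):
    trial = np.zeros(N)
    response = []
    for s in steps:
        trial[k] = s
        response.append(measure(trial))
    # the first Fourier coefficient of the P-point response encodes the
    # phase that makes segment k interfere constructively with the rest
    optimal[k] = -np.angle(np.sum(response * np.exp(-1j * steps)))

before = measure(np.zeros(N))
after = measure(optimal)
print(f"focus intensity improved from {before:.1f} to {after:.1f}")
```

In OpenWFS, the loop over segments and phase steps, as well as the pipelining of SLM updates and detector reads, is handled by the framework, so user code reduces to selecting an algorithm and a feedback source.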

@@ -65,9 +65,9 @@ This will also install the optional dependencies for OpenWFS:

*genicam* For the GenICam camera support, the ``harvesters`` package is installed, which, in turn, needs the ``genicam`` package. At the time of writing, this package is only available for Python versions up to 3.11. To use the GenICam camera support, you also need a compatible camera with its driver installed.

*nidaq* For the scanning microscope, the ``nidaqmx`` package is installed, which requires a National Instruments data acquisition card with corresponding drivers to be installed on your system.

If these dependencies cannot be installed on your system, the installation will fail. In this case, you can instead install OpenWFS without dependencies by omitting ``[all]`` in the installation command, and manually install only the required dependencies, e.g. ``pip install openwfs[opengl,nidaq]``.

At the time of writing, OpenWFS is at version 0.1.0, and it is tested up to Python version 3.11 on Windows 11 and on the Manjaro and Ubuntu Linux distributions. Note that the latest versions of the package will be available on the PyPI repository, and the latest documentation and example code can be found on the `Read the Docs <https://openwfs.readthedocs.io/en/latest/>`_ website :cite:`openwfsdocumentation`. The source code can be found on GitHub :cite:`openwfsgithub`.

30 changes: 16 additions & 14 deletions docs/source/references.bib
@@ -25,21 +25,23 @@ @article{Piestun2012
}


@article{Mastiani2024PracticalConsiderations,
title = {Practical considerations for high-fidelity wavefront shaping experiments},
author = {Mastiani, Bahareh and Cox, Dani{\"e}l W. S. and Vellekoop, Ivo M.},
journal = {Journal of Physics: Photonics},
year = {2024},
month = mar,
doi = {10.48550/arXiv.2403.15265}
}

@article{popoff2010measuring,
title = {Measuring the Transmission Matrix in Optics: An Approach to the Study and Control of Light Propagation in Disordered Media},
author = {Popoff, S{\'e}bastien M and Lerosey, Geoffroy and Carminati, R{\'e}mi and Fink, Mathias and Boccara, Albert Claude and Gigan, Sylvain},
journal = {Physical Review Letters},
volume = {104},
number = {10},
pages = {100601},
year = {2010},
publisher = {APS}
}

@book{neider1993opengl,
6 changes: 3 additions & 3 deletions docs/source/slms.rst
@@ -30,7 +30,7 @@ On top of the basic functionality, the :class:`hardware.SLM` object provides adv

Texture mapping involves two components: a texture and a geometry, which are stored together in a :class:`~.hardware.SLM.Patch` object. The *texture* is a 2-D array holding phase values in radians. Values in the texture are referenced by texture coordinates ranging from 0 to 1. The *geometry* describes a set of triangles that is drawn to the screen, with each triangle holding a 2-D screen coordinate and a 2-D texture coordinate. The screen coordinate determines where the vertex is drawn on the screen, and the texture coordinate determines which pixel in the texture is used to color the vertex. When drawing the triangles, OpenGL automatically interpolates the texture coordinates between the vertices, and looks up the nearest value in the phase texture.

In the simplest form, a square texture is mapped to a square region on the screen. This region consists of two triangles, with the screen coordinates corresponding to the vertices of the square. The vertices hold texture coordinates ranging from (0,0) to (1,1). This way, the graphics hardware automatically scales the texture to fit the region, regardless of how many elements the phase map has.
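
The nearest-value lookup can be mimicked in a few lines of NumPy. This is an illustrative sketch of the sampling behavior described above, not how OpenWFS itself renders textures; on the GPU this happens inside the OpenGL rasterizer.

```python
import numpy as np

def sample_nearest(texture, height, width):
    """Scale a phase texture to a height x width screen region using
    nearest-neighbour lookup, as the rasterizer does per screen pixel."""
    rows, cols = texture.shape
    # texture coordinate of each pixel centre, mapped to texel indices
    iy = ((np.arange(height) + 0.5) / height * rows).astype(int)
    ix = ((np.arange(width) + 0.5) / width * cols).astype(int)
    return texture[np.ix_(iy, ix)]

phase = np.array([[0.0, 1.0],
                  [2.0, 3.0]])
screen = sample_nearest(phase, 100, 100)  # each texel fills a 50x50 block
```

Regardless of whether the texture is 2 × 2 or 1920 × 1080, the same region of the screen is filled, which is exactly the scaling behavior exploited by the SLM object.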

A more advanced example is shown in :numref:`slmdemo`, where the texture is mapped to a disk. The disk is drawn as a set of triangles, with the screen coordinates corresponding to points on the concentric rings that form the disk. In this example, the texture was a 1 × 18 element array with random values. The texture coordinates were defined such that the elements of this array are mapped to three concentric rings, consisting of 4, 6, and 8 segments, respectively (see :numref:`slmcode`). Such an approach can be useful for equalizing the contribution of different segments on the SLM :cite:`mastiani2021noise`.
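
A geometry like this can be generated with a short helper. The sketch below is hypothetical (``ring_vertices`` is not an OpenWFS function): it produces screen coordinates for quad segments on three concentric rings, each carrying the texture u-coordinate of one element of the 1 × 18 phase array, mirroring the construction described above.

```python
import numpy as np

def ring_vertices(segments=(4, 6, 8), radii=(0.0, 1 / 3, 2 / 3, 1.0)):
    """Screen coordinates and texture u-coordinates for a segmented disk."""
    verts, tex = [], []
    total = sum(segments)  # 18 elements in the phase array
    idx = 0
    for ring, n in enumerate(segments):
        r0, r1 = radii[ring], radii[ring + 1]
        for s in range(n):
            a0 = 2 * np.pi * s / n
            a1 = 2 * np.pi * (s + 1) / n
            u = (idx + 0.5) / total  # centre of element idx in the texture
            # four corners of the quad segment (drawn as two triangles)
            for r, a in [(r0, a0), (r1, a0), (r1, a1), (r0, a1)]:
                verts.append((r * np.cos(a), r * np.sin(a)))
                tex.append(u)
            idx += 1
    return np.array(verts), np.array(tex)

verts, tex = ring_vertices()  # 18 segments, 4 corners each
```

Because all four corners of a segment share the same u-coordinate, every pixel inside the segment samples the same element of the phase array, producing the piecewise-constant rings shown in the figure.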

@@ -46,7 +46,7 @@ The combination of texture mapping and blending allows for a wide range of use c

- Aligning the size and position of a square phase map with the illuminating beam.
- Correcting phase maps for distortions in the optical system, such as barrel distortion.
- Using two parts of the same SLM independently. This feature is possible because each Patch object can independently be used as a :class:`~.PhaseSLM` object.
- Blocking part of a wavefront by drawing a different patch on top of it, with :attr:`~.Patch.additive_blend` ``= False``.
- Modifying an existing wavefront by adding a gradient or defocus pattern.
- Compensating for curvature of the SLM and other system aberrations by adding an offset layer with :attr:`~.Patch.additive_blend` ``= True``.
@@ -55,7 +55,7 @@ All of these corrections can be done in real time using OpenGL acceleration, mak

A final aspect of the SLM that is demonstrated in the example is the use of the :attr:`~.slm.SLM.pixels` attribute. This attribute holds a virtual camera that reads the gray values of the pixels currently displayed on the SLM. This virtual camera implements the :class:`~.Detector` interface, meaning that it can be used just like an actual camera. This feature is useful, e.g., for storing or checking the images displayed on the SLM.

For debugging or demonstration purposes, it is often useful to receive feedback on the image displayed on the SLM. In Windows, this image can be seen by hovering over the program icon in the task bar. Alternatively, the key combination Ctrl + PrtScn can be used to grab the image on all active monitors. For demonstrations, the :func:`~.slm.SLM.clone` function can be used to create a second SLM window (typically placed in a corner of the primary screen), which shows the same image as the original SLM. This technique is demonstrated in the ``wfs_demonstration_experimental.py`` code available in the online example gallery :cite:`readthedocsOpenWFS`.


Lookup table