From 03ccf5af5153ea3dc23073a2673d81701e1dcc7b Mon Sep 17 00:00:00 2001 From: Ivo Vellekoop Date: Mon, 7 Oct 2024 14:12:43 +0200 Subject: [PATCH] reordered sections, editing documentation --- docs/source/conclusion.rst | 2 +- docs/source/conf.py | 27 +-- docs/source/core.rst | 166 +++++++++--------- docs/source/index.rst | 3 +- docs/source/index_latex.rst | 5 +- .../source/{pydevice.rst => micromanager.rst} | 4 +- docs/source/readme.rst | 49 ++---- docs/source/references.bib | 50 ++++-- docs/source/troubleshooting.rst | 22 +++ 9 files changed, 182 insertions(+), 146 deletions(-) rename docs/source/{pydevice.rst => micromanager.rst} (95%) create mode 100644 docs/source/troubleshooting.rst diff --git a/docs/source/conclusion.rst b/docs/source/conclusion.rst index 73d5843..855b48a 100644 --- a/docs/source/conclusion.rst +++ b/docs/source/conclusion.rst @@ -5,7 +5,7 @@ In this work we presented an open-source Python package for conducting and simul OpenWFS incorporates features to reduce the chances of errors in the design of wavefront shaping code. Notably, the use of units of measure prevents the accidental mixing of units, and the automatic synchronization mechanism ensures that hardware is properly synchronized without the need to write any synchronization code. Finally, the ability to simulate full experiments, and to mock individual components, allows the user to test wavefront shaping algorithms without the need for physical hardware. We find this feature particularly useful since there is a lot that can go wrong in an experiment, (also see :cite:`Mastiani2024PracticalConsiderations`), and experimental issues and software issues are not always easy to distinguish. With OpenWFS, it is now possible to fully test the algorithms before entering the lab, which can save a lot of time and frustration. -We envision that OpenWFS will hold a growing collection of components for hardware control, advanced simulations, and wavefront shaping. 
The standardised interfaces for detectors, actuators and SLMs, enables the cooperative development of complex functionality. Additionally, standardized components and algorithms will greatly simplify developing reusable code that can be used across different setups and experiments. The simulation tools may additionally be used for research and education, ushering in a new phase of applications in wavefront shaping. We therefore encourage the reader to join us in developing new algorithms and components for this framework. +We envision that OpenWFS will hold a growing collection of components for hardware control, advanced simulations, and wavefront shaping. Further expansion of the supported hardware is of high priority, especially wrapping C-based libraries and adding support for Micro-Manager device adapters. The standardized interfaces for detectors, actuators and SLMs will greatly simplify developing reusable code that can be used across different setups and experiments. The simulation tools may additionally be used for research and education, ushering in a new phase of applications in wavefront shaping. We therefore encourage the reader to join us in developing new algorithms and components for this framework. Code availability ------------------------------------------------ diff --git a/docs/source/conf.py b/docs/source/conf.py index 2022b77..6a05b01 100644 --- a/docs/source/conf.py +++ b/docs/source/conf.py @@ -43,6 +43,8 @@ latex_elements = { "preamble": r""" \usepackage{authblk} + \usepackage{etoolbox} % Reduce font size for all tables + \AtBeginEnvironment{tabular}{\small} """, "maketitle": r""" \author[1]{Daniël~W.~S.~Cox} @@ -61,16 +63,21 @@ this research field is expanding rapidly. As the field advances, it stands out that many breakthroughs are driven by the development of better software that incorporates increasingly advanced physical models and algorithms.
- Typical control software involves fragmented implementations for scanning microscopy, image processing, - optimization algorithms, low-level hardware control, calibration and troubleshooting, - and simulations for testing new algorithms. - - The complexity of the many different aspects of wavefront shaping software, however, - is becoming a limiting factor for further developments in the field, as well as for end-user adoption. - OpenWFS addresses these challenges by providing a modular and extensible Python library that - incorporates elements for hardware control, software simulation, and automated troubleshooting. - Using these elements, the actual wavefront shaping algorithm and its automated tests can be written - in just a few lines of code. + + Typical WFS software involves a complex combination of low-level hardware control, signal processing, + calibration, troubleshooting, simulation, and the wavefront shaping algorithm itself. + This complexity makes it hard to compare different algorithms and to extend existing software with new + hardware or algorithms. Moreover, the complexity of the software can be a significant barrier for end + users of microscopes to adopt wavefront shaping. + + OpenWFS addresses these challenges by providing a modular Python library that + separates hardware control from the wavefront shaping algorithm itself. + Using these elements, a wavefront shaping algorithm can be written + in just a few lines of code, with OpenWFS taking care of low-level hardware control, synchronization, + and troubleshooting. Algorithms can be used on different hardware or in a completely + simulated environment without changing the code. Moreover, we provide full integration with + the \textmu Manager microscope control software, enabling wavefront shaping experiments to be + executed from a user-friendly graphical user interface.
} } \maketitle diff --git a/docs/source/core.rst index ede5a49..293a397 100644 --- a/docs/source/core.rst +++ b/docs/source/core.rst @@ -10,7 +10,27 @@ In addition, OpenWFS maintains metadata and units for all data arrays and proper Detectors ------------ -Detectors in OpenWFS are objects that capture, generate, or process data. All detectors derive from the :class:`~.Detector` base class. A Detector object may correspond to a physical device such as a camera, or it may be a software component that generates synthetic data (see :numref:`section-simulations`). Detectors have the following properties and methods: +Detectors in OpenWFS are objects that capture, generate, or process data. A Detector object may correspond to a physical device such as a camera, or it may be a software component that generates synthetic data (see :numref:`section-simulations`). Currently, the following detectors are supported: + +.. list-table:: + :header-rows: 1 + + * - Detector + - Description + * - Camera + - Supports all GenICam/GenTL cameras. + * - ScanningMicroscope + - Laser scanning microscope using galvo mirrors and National Instruments data acquisition card. + * - SimulatedWFS + - Simulated detector for testing wavefront shaping algorithms. + * - Microscope + - Fully simulated microscope, including aberrations, diffraction limit, and translation stage. + * - StaticSource + - Returns pre-set data, simulating a static source. + * - NoiseSource + - Generates uniform or Gaussian noise as a source. + +All detectors derive from the :class:`~.Detector` base class and have the following properties and methods: .. code-block:: python
Data is always returned as `numpy` array :cite:`numpy`. Subclasses of :class:`~.Detector` typically add properties specific to that detector (e.g. shutter time, gain, etc.). In the simplest case, setting these properties and calling :meth:`.~Detector.read()` is all that is needed to capture data. The :meth:`~.Detector.trigger()` method is used for asynchronous measurements as described below. All other properties and methods are used for metadata and units, as described in :numref:`Units and metadata`. +The :meth:`~.Detector.read()` method of a detector starts a measurement and returns the captured data. It triggers the detector and blocks until the data is available. Data is always returned as a ``numpy`` array :cite:`numpy`. Subclasses of :class:`~.Detector` typically add properties specific to that detector (e.g. shutter time, gain, etc.). In the simplest case, setting these properties and calling :meth:`~.Detector.read()` is all that is needed to capture data. The :meth:`~.Detector.trigger()` method is used for asynchronous measurements as described below. All other properties and methods are used for metadata and units, as described in :numref:`Units and metadata`. The detector object inherits some properties and methods from the base class :class:`~.Device`. These are used by the synchronization mechanism to determine when it is safe to start a measurement, as described in :numref:`device-synchronization`. @@ -36,7 +56,7 @@ Asynchronous measurements +++++++++++++++++++++++++++ :meth:`.~Detector.read()` blocks the program until the captured data is available. This behavior is not ideal when multiple detectors are used simultaneously, or when transferring or processing the data takes a long time. In these cases, it is preferable to use :meth:`.~Detector.trigger()`, which initiates the process of capturing or generating data and returns directly. The program can continue operation while the data is being captured/transferred/generated in a worker thread.
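The blocking-versus-asynchronous pattern described here can be sketched as follows. This is a minimal illustration of the *pattern* only, using a hypothetical ``ToyDetector`` class and Python's standard ``concurrent.futures`` machinery, not the actual OpenWFS API:

```python
# Minimal sketch (NOT the OpenWFS API) of the acquisition pattern described
# above: read() blocks until the data is available, while trigger() returns
# immediately and the data is fetched later from a worker thread.
import time
from concurrent.futures import Future, ThreadPoolExecutor

import numpy as np


class ToyDetector:
    """Hypothetical detector that 'captures' a frame in a worker thread."""

    def __init__(self):
        self._executor = ThreadPoolExecutor(max_workers=1)

    def _acquire(self) -> np.ndarray:
        time.sleep(0.01)  # stand-in for exposure / transfer time
        return np.zeros((4, 4))

    def trigger(self) -> Future:
        # start the measurement and return immediately
        return self._executor.submit(self._acquire)

    def read(self) -> np.ndarray:
        # blocking convenience wrapper: trigger, then wait for the data
        return self.trigger().result()


detector = ToyDetector()
frame = detector.read()        # blocks until the data is available
pending = detector.trigger()   # returns immediately
# ... the program can prepare the next measurement here ...
late_frame = pending.result()  # analogue of wait(): fetch the finished data
print(frame.shape, late_frame.shape)
```

In OpenWFS itself the fetching, storage into a pre-allocated ``out`` array, and property locking are handled by the framework; this sketch only shows why the non-blocking variant frees the main thread for other work.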
While fetching and processing data is underway, any attempt to modify a property of the detector will block until the fetching and processing is complete. This way, all properties (such as the region of interest) are guaranteed to be constant between the calls to :meth:`.~Detector.trigger` and the moment the data is actually fetched and processed in the worker thread. -The asynchronous measurement mechanism can be seen in action in the `StepwiseSequential` algorithm used in :numref:`hello-wfs`. The `execute()` function of this algorithm is implemented as +The asynchronous measurement mechanism can be seen in action in the :class:`~.StepwiseSequential` algorithm used in :numref:`hello-wfs`. The :meth:`~.StepwiseSequential.execute` function of this algorithm is implemented as .. code-block:: python
``self.feedback`` holds a :class:`~.Detector` object that is triggered, and stores the measurement in a pre-allocated ``measurements`` array when it becomes available. It is possible to find the optimized wavefront for multiple targets simultaneously by using a detector that returns an array of size ``feedback.data_shape``, which contains a feedback value for each of the targets. The program does not wait for the data to become available and can directly proceed with preparing the next pattern to send to the SLM (also see :numref:`device-synchronization`). After running the algorithm, `wait` is called to wait until all measurement data is stored in the `measurements` array, and the utility function `analyze_phase_stepping` is used to extract the transmission matrix from the measurements, as well as a series of troubleshooting statistics (see :numref:`Analysis and troubleshooting`). @@ -69,15 +89,68 @@ Note that, except for this asynchronous mechanism for fetching and processing da Processors ------------ -A `Processor` is a `Detector` that takes input from one or more other detectors, and combines/processes this data. We already encountered an example in :numref:`Getting started`, where the `SingleRoiProcessor` was used to average the data from a camera over a region of interest. A block diagram of the data flow of this code is shown in :numref:`hellowfsdiagram`. Since a processor, itself, is a `Detector`, multiple processors can be chained together to combine their functionality. The OpenWFS further includes various processors, such as a `CropProcessor` to crop data to a rectangular region of interest, and a `TransformProcessor` to perform affine image transformations to image produced by a source. +A :class:`~.Processor` is an object that takes input from one or more other detectors, and combines/processes this data. By itself, a processor is a :class:`~.Detector`, enabling multiple processors to be chained together to combine their functionality.
We already encountered an example in :numref:`Getting started`, where the :class:`~.SingleRoiProcessor` was used to average the data from a camera over a region of interest. A block diagram of the data flow of this code is shown in :numref:`hellowfsdiagram`. OpenWFS currently includes the following processors: + +.. list-table:: + :header-rows: 1 + + * - Processor + - Description + * - SingleRoi + - Averages signal over a single ROI. + * - MultipleRoi + - Averages signals over multiple regions of interest (ROIs). + * - CropProcessor + - Crops data from the source to a region of interest. + * - TransformProcessor + - Performs affine transformations on the source data. + * - GaussianNoise + - Adds Gaussian noise to the source data. + * - ADCProcessor + - Simulates an analog-digital converter, including optional shot-noise and readout noise. Actuators --------- -Actuators are devices that *move* things in the setup. This can be literal, such as moving a translation stage, or a virtual movement, like an SLM that takes time to switch to a different phase pattern. All actuators are derived from the common :class:`.Actuator` base class. Actuators have no additional methods or properties other than those in the :class:`.Device` base class. + +Actuators are devices that *move* things in the setup. This can be literal, such as moving a translation stage, or a virtual movement, like an SLM that takes time to switch to a different phase pattern. All actuators are derived from the common :class:`.Actuator` base class. Actuators have no additional methods or properties other than those in the :class:`.Device` base class. A list of actuators currently supported by OpenWFS can be found in the table below. + +.. list-table:: + :header-rows: 1 + :name: supported-actuators + + * - Actuator + - Description + * - SLM + - Controls and renders patterns on a Spatial Light Modulator (SLM) using OpenGL. + * - simulation.SLM + - Simulates a phase-only spatial light modulator, including timing and non-linear phase response.
+ * - simulation.XYStage + - Simulates a translation stage, used in :class:`~.Microscope`. + + +Algorithms +------------ +OpenWFS comes with a number of wavefront shaping algorithms already implemented, as listed in the table below. Although these algorithms could be implemented as functions, we chose to implement them as objects, so that the parameters of the algorithm can be stored as attributes of the object. This simplifies keeping the parameters together in one place in the code, and also allows the algorithm parameters to be accessible in the Micro-Manager graphical user interface (GUI), see :ref:`micromanager`. + +All algorithms are designed to be completely hardware-agnostic, so that they can be used with any type of feedback signal and either use real hardware or simulated hardware without the need to change a single line of code in the algorithm implementation. The :class:`~.FourierDualReference`, :class:`~.DualReference` and :class:`~.StepwiseSequential` algorithms provide support for optimizing multiple targets simultaneously in a single run of the algorithm. + +.. list-table:: + :header-rows: 1 + + * - Algorithm + - Description + * - FourierDualReference + - A dual reference algorithm that uses plane waves from a disk in k-space for wavefront shaping :cite:`Mastiani2022`. + * - DualReference + - A generic dual reference algorithm with a configurable basis set :cite:`Cox2024`. + * - SimpleGenetic + - A simple genetic algorithm for optimizing wavefronts :cite:`Piestun2012`. + * - StepwiseSequential + - A simplified version of the original wavefront shaping algorithm :cite:`Vellekoop2007`, with pre-optimization omitted. + Units and metadata ---------------------------------- -OpenWFS consistently uses `astropy.units` :cite:`astropy` for quantities with physical dimensions, which allows for calculations to be performed with correct units, and for automatic unit conversion where necessary.
Importantly, it prevents errors caused by passing a quantity in incorrect units, such as passing a wavelength in micrometers when the function expects a wavelength in nanometers. By using `astropy.units`, the quantities are converted automatically, so one may for example specify a time in milliseconds, minutes or days. The use of units is illustrated in the following snippet: +OpenWFS consistently uses ``astropy.units`` :cite:`astropy` for quantities with physical dimensions, which allows for calculations to be performed with correct units, and for automatic unit conversion where necessary. Importantly, it prevents errors caused by passing a quantity in incorrect units, such as passing a wavelength in micrometers when the function expects a wavelength in nanometers. By using ``astropy.units``, the quantities are converted automatically, so one may for example specify a time in milliseconds, minutes or days. The use of units is illustrated in the following snippet: .. code-block:: python @@ -89,7 +162,7 @@ OpenWFS consistently uses `astropy.units` :cite:`astropy` for quantities with ph In addition, OpenWFS allows attaching pixel-size metadata to data arrays using the functions :func:`~.set_pixel_size()`. Pixel sizes can represent a physical length (e.g. as in the size pixels on an image sensor), or other units such as time (e.g. as the sampling period in a time series). OpenWFS fully supports anisotropic pixels, where the pixel sizes in the x and y directions are different. -The data arrays returned by the :meth:`~.Detector.read()` function of a detector have `pixel_size` metadata attached whenever appropriate. The pixel size can be retrieved from the array using :func:`~.get_pixel_size()`, or obtained from the :attr:`~.Detector.pixel_size` attribute directly. 
As an alternative to accessing the pixel size directly, :func:`~get_extent()` and :class:`~.Detector.extent` provide access to the extent of the array, which is always equal to the pixel size times the shape of the array. Finally, the convenience function :meth:`~.Detector.coordinates` returns a vector of coordinates with appropriate units along a specified dimension of the array. +The data arrays returned by the :meth:`~.Detector.read()` function of a detector have ``pixel_size`` metadata attached whenever appropriate. The pixel size can be retrieved from the array using :func:`~.get_pixel_size()`, or obtained from the :attr:`~.Detector.pixel_size` attribute directly. As an alternative to accessing the pixel size directly, :func:`~.get_extent()` and :attr:`~.Detector.extent` provide access to the extent of the array, which is always equal to the pixel size times the shape of the array. Finally, the convenience function :meth:`~.Detector.coordinates` returns a vector of coordinates with appropriate units along a specified dimension of the array. .. _device-synchronization: @@ -115,9 +188,9 @@ Each device can either be *busy* or *ready*, and this state can be polled by cal - before starting a measurement, wait until all motion is (almost) completed - before starting any movement, wait until all measurements are (almost) completed
When a latency is specified, detectors or actuators can be started slightly before the devices of the other type (actuators or detectors, respectively) have finished their operation. For example, this mechanism allows sending a new frame to the SLM *before* the measurements of the current frame are finished, since it is known that the SLM will not respond for `latency` milliseconds anyway. This way, measurements and SLM updates can be pipelined to maximize the number of measurements that can be done in a certain amount of time. To enable these pipelined measurements, the `Device` class also provides a `duration` attribute, which is the maximum time interval between the start and end of a measurement or actuator action. +Here, 'almost' refers to the fact that devices may have a *latency*. Latency is the time between sending a command to a device, and the moment the device starts responding. An important example is the SLM, which typically takes one or two frame periods to transfer the image data to the liquid crystal chip. Such devices can specify a non-zero ``latency`` attribute. When specified, the device 'promises' not to do anything until ``latency`` milliseconds after the start of the measurement or movement. When a latency is specified, detectors or actuators can be started slightly before the devices of the other type (actuators or detectors, respectively) have finished their operation. For example, this mechanism allows sending a new frame to the SLM *before* the measurements of the current frame are finished, since it is known that the SLM will not respond for ``latency`` milliseconds anyway. This way, measurements and SLM updates can be pipelined to maximize the number of measurements that can be done in a certain amount of time. To enable these pipelined measurements, the ``Device`` class also provides a ``duration`` attribute, which is the maximum time interval between the start and end of a measurement or actuator action.
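The scheduling rule that the latency mechanism implies can be made concrete with a small numerical sketch. The helper function below is purely illustrative (it is not part of OpenWFS, which applies this logic internally): an actuator command may be issued *before* a running measurement ends, as long as the actuator's latency guarantees it will not actually respond until the measurement is over.

```python
# Illustrative sketch (not OpenWFS code) of the pipelining rule described
# above: a command sent at time t to a device with latency L has no physical
# effect before t + L, so it may safely overlap a measurement that ends
# before t + L.

def earliest_safe_start(measurement_end_ms: float, actuator_latency_ms: float) -> float:
    """Earliest time (in ms) at which the actuator command may be *sent*
    so that the device does not respond before the measurement ends."""
    return max(0.0, measurement_end_ms - actuator_latency_ms)


# A measurement running from t=0 ms to t=10 ms, and an SLM with 20 ms latency:
# the next frame can be sent immediately (t=0 ms), because the SLM will not
# update the displayed pattern until t=20 ms, after the measurement is done.
print(earliest_safe_start(10.0, 20.0))  # 0.0
# With only 4 ms latency, the command must be held back until t=6 ms.
print(earliest_safe_start(10.0, 4.0))   # 6.0
```

The ``duration`` attribute plays the complementary role: it bounds ``measurement_end_ms``, so the framework can compute these overlap windows without polling the hardware.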
-This synchronization is performed automatically. If desired, it is possible to explicitly wait for the device to become ready by calling :meth:`~.Device.wait()`. To accommodate taking into account the latency, this function takes an optional parameter `up_to`, which indicates that the function may return the specified time *before* the device hardware is ready. In user code, it is only necessary to call `wait` when using the `out` parameter to store measurements in a pre-defined location (see :numref:`Asynchronous measurements` above). A typical usage pattern is illustrated in the following snippet: +This synchronization is performed automatically. If desired, it is possible to explicitly wait for the device to become ready by calling :meth:`~.Device.wait()`. To take the latency into account, this function takes an optional parameter ``up_to``, which indicates that the function may return the specified time *before* the device hardware is ready. In user code, it is only necessary to call ``wait`` when using the ``out`` parameter to store measurements in a pre-defined location (see :numref:`Asynchronous measurements` above). A typical usage pattern is illustrated in the following snippet: .. code-block:: python @@ -137,76 +210,5 @@ This synchronization is performed automatically. If desired, it is possible to e cam1.wait() # wait until camera 1 is done grabbing frames cam2.wait() # wait until camera 2 is done grabbing frames -Finally, devices have a `timeout` attribute, which is the maximum time to wait for a device to become ready. This timeout is used in the state-switching mechanism, and when explicitly waiting for results using :meth:`~.Device.wait()` or :meth:`~.Device.read()`. - -Currently available devices ---------------------------- - -The following devices are currently implemented in OpenWFS: - -.. list-table:: - :header-rows: 1 +Finally, devices have a ``timeout`` attribute, which is the maximum time to wait for a device to become ready.
This timeout is used in the state-switching mechanism, and when explicitly waiting for results using :meth:`~.Device.wait()` or :meth:`~.Device.read()`. - * - Device Name - - Device Type - - Description - * - Camera - - Detector - - Adapter for GenICam/GenTL cameras - * - ScanningMicroscope - - Detector - - Laser scanning microscope using galvo mirrors and NI DAQ - * - StaticSource - - Detector - - Returns pre-set data, simulating a static source - * - NoiseSource - - Detector - - Generates uniform or Gaussian noise as a source - * - SingleRoi - - Processor (Detector) - - Averages signal over a single ROI - * - MultipleRoi - - Processor (Detector) - - Averages signals over multiple regions of interest (ROIs) - * - CropProcessor - - Processor (Detector) - - Crops data from the source to a region of interest - * - TransformProcessor - - Processor (Detector) - - Performs affine transformations on the source data - * - ADCProcessor - - Processor (Detector) - - Simulates an analog-digital converter - * - SimulatedWFS - - Processor - - Simulates wavefront shaping experiment using Fourier transform-based intensity computation at the focal plane - * - Gain - - Actuator - - Controls PMT gain voltage using NI data acquisition card - * - PhaseSLM - - Actuator - - Simulates a phase-only spatial light modulator - * - SLM - - Actuator - - Controls and renders patterns on a Spatial Light Modulator (SLM) using OpenGL - -Available Algorithms ---------------------- - -The following algorithms are available in OpenWFS for wavefront shaping: - -.. list-table:: - :header-rows: 1 - - * - Algorithm Name - - Description - * - FourierDualReference - - A Fourier dual reference algorithm that uses plane waves from a disk in k-space for wavefront shaping :cite:`Mastiani2022`. - * - IterativeDualReference - - A generic iterative dual reference algorithm with the ability to use custom basis functions for non-linear feedback applications. 
- * - DualReference - - A generic dual reference algorithm with the option for optimized reference, suitable for multi-target optimization and iterative feedback. - * - SimpleGenetic - - A simple genetic algorithm that optimizes wavefronts by selecting elite individuals and introducing mutations for focusing through scattering media :cite:`Piestun2012`. - * - StepwiseSequential - - A stepwise sequential algorithm which systematically modifies the phase pattern of each SLM element :cite:`Vellekoop2007`. \ No newline at end of file diff --git a/docs/source/index.rst b/docs/source/index.rst index c6da15e..c7886ca 100644 --- a/docs/source/index.rst +++ b/docs/source/index.rst @@ -9,7 +9,8 @@ OpenWFS - a library for conducting and simulating wavefront shaping experiments core slms simulations + micromanager + troubleshooting development - pydevice api auto_examples/index diff --git a/docs/source/index_latex.rst b/docs/source/index_latex.rst index 5945f93..aad09ab 100644 --- a/docs/source/index_latex.rst +++ b/docs/source/index_latex.rst @@ -9,8 +9,7 @@ OpenWFS - a library for conducting and simulating wavefront shaping experiments core slms simulations - pydevice + micromanager + troubleshooting development conclusion - - auto_examples/index diff --git a/docs/source/pydevice.rst b/docs/source/micromanager.rst similarity index 95% rename from docs/source/pydevice.rst rename to docs/source/micromanager.rst index 7eca427..6069b5c 100644 --- a/docs/source/pydevice.rst +++ b/docs/source/micromanager.rst @@ -1,6 +1,6 @@ -.. _section-pydevice: +.. _micromanager: -OpenWFS in PyDevice +OpenWFS in μ-Manager ============================================== To smoothly enable end-user interaction with wavefront shaping algorithms, the Micro-Manager device adapter PyDevice was developed :cite:`PyDevice`. A more detailed description can be found in the mmCoreAndDevices source tree :cite:`mmCoreAndDevices`. 
In essence, PyDevice is a Micro-Manager adapter that imports objects from a Python script and integrates them as devices, e.g. a camera or stage. OpenWFS was written in compliance with the templates required for PyDevice, which means OpenWFS cameras, scanners and algorithms can be loaded into Micro-Manager as devices. Examples of this are found in the example gallery :cite:`readthedocsOpenWFS`. This seamless connection also opens the way for further developments, such as a dedicated Micro-Manager-based wavefront shaping GUI. \ No newline at end of file diff --git a/docs/source/readme.rst b/docs/source/readme.rst index 6284454..c709b38 100644 --- a/docs/source/readme.rst +++ b/docs/source/readme.rst @@ -17,12 +17,14 @@ Wavefront shaping (WFS) is a technique for controlling the propagation of light It stands out that an important driving force in WFS is the development of new algorithms, for example, to account for sample movement :cite:`valzania2023online`, experimental conditions :cite:`Anderson2016`, to be optimally resilient to noise :cite:`mastiani2021noise`, or to use digital twin models to compute the required correction patterns :cite:`salter2014exploring,ploschner2015seeing,Thendiyammal2020,cox2023model`. Much progress has been made towards developing fast and noise-resilient algorithms, or algorithms designed specifically for the methodology of wavefront shaping, such as using algorithms based on Hadamard patterns or Fourier-based approaches :cite:`Mastiani2022`. Fast techniques that enable wavefront shaping in dynamic samples :cite:`Liu2017,Tzang2019` have also been developed, and many potential applications have been prototyped, including endoscopy :cite:`ploschner2015seeing`, optical trapping :cite:`Cizmar2010`, Raman scattering :cite:`Thompson2016`, and deep-tissue imaging :cite:`Streich2021`. Applications extend beyond that of microscope imaging, such as in optimizing photoelectrochemical absorption :cite:`Liew2016` and tuning random lasers :cite:`Bachelard2014`.
-With the development of these advanced algorithms, however, the complexity of WFS software is steadily increasing as the field matures, which hinders cooperation as well as end-user adoption. Code for controlling wavefront shaping tends to be complex and setup-specific, and developing this code typically requires detailed technical knowledge and low-level programming. A recent c++ based contribution :cite:`Anderson2024`, highlights the growing need for software based tools that enable use and development. Moreover, since many labs use their own in-house programs to control the experiments, sharing and re-using code between different research groups is troublesome. +With the development of these advanced algorithms, however, the complexity of WFS software is steadily increasing as the field matures, which hinders cooperation as well as end-user adoption. Code for controlling wavefront shaping tends to be complex and setup-specific, and developing this code typically requires detailed technical knowledge and low-level programming. Moreover, since many labs use their own in-house programs to control the experiments, sharing and re-using code between different research groups is troublesome. + +Even though authors are increasingly sharing their code, for example for controlling spatial light modulators (SLMs) :cite:`PopoffslmPy`, or running genetic algorithms :cite:`Anderson2024`, a modular framework that combines all aspects of hardware control, simulation, and graphical user interface (GUI) integration is still lacking. What is OpenWFS? ---------------------- -OpenWFS is a Python package for performing and for simulating wavefront shaping experiments. It aims to accelerate wavefront shaping research by providing: +OpenWFS is a Python package that is primarily designed for performing and simulating wavefront shaping experiments. It aims to accelerate wavefront shaping research by providing: * **Hardware control**.
Modular code for controlling spatial light modulators, cameras, and other hardware typically encountered in wavefront shaping experiments. Highlights include: @@ -31,26 +33,34 @@ OpenWFS is a Python package for performing and for simulating wavefront shaping * **GenICam cameras**. The :class:`~.devices.Camera` object uses the `harvesters` backend :cite:`harvesters` to access any camera supporting the GenICam standard :cite:`genicam`. * **Automatic synchronization**. OpenWFS provides tools for automatic synchronization of actuators (e.g. an SLM) and detectors (e.g. a camera). The automatic synchronization makes it trivial to perform pipelined measurements that avoid the delay normally caused by the latency of the video card and SLM. -* **Wavefront shaping algorithms**. A (growing) collection of wavefront shaping algorithms. OpenWFS abstracts the hardware control, synchronization, and signal processing so that the user can focus on the algorithm itself. As a result, most algorithms can be implemented cleanly without hardware-specific programming. +* **Simulation**. OpenWFS provides an extensive framework for testing and simulating wavefront shaping algorithms, including the effect of measurement noise, stage drift, and user-defined aberrations. This allows for rapid prototyping and testing of new algorithms without the need for physical hardware. -* **Simulation**. OpenWFS provides an extensive framework for testing and simulating wavefront shaping algorithms, including the effect of measurement noise, stage drift, and user-defined aberrations. This allows for rapid prototyping and testing of new algorithms, without the need for physical hardware. +* **Wavefront shaping algorithms**. A (growing) collection of wavefront shaping algorithms. OpenWFS abstracts the hardware control, synchronization, and signal processing so that the user can focus on the algorithm itself. 
As a result, even advanced algorithms can be implemented in a few dozen lines of code, and automatically work with any combination of hardware and simulation tools that OpenWFS supports. -* **Platform for exchange and joint collaboration**. OpenWFS can be used as a platform for sharing and exchanging wavefront shaping algorithms. The package is designed to be modular and easy to expand, and it is our hope that the community will contribute to the package by adding new algorithms, hardware control modules, and simulation tools. Python was specifically chosen for this purpose for its active community, high level of abstraction and the ease of sharing tools. Further expansion of the supported hardware is of high priority, especially wrapping c-based software support with tools like ctypes and the Micro-Manager based device adapters. +* **Platform for exchange and joint collaboration**. OpenWFS can be used as a platform for sharing and exchanging wavefront shaping algorithms. The package is designed to be modular and easy to expand, and it is our hope that the community will contribute to the package by adding new algorithms, hardware control modules, and simulation tools. Python was specifically chosen for this purpose for its active community, high level of abstraction and the ease of sharing tools. * **Platform for simplifying use of wavefront shaping**. OpenWFS is compatible with the recently developed PyDevice :cite:`PyDevice`, and can therefore be controlled from Micro-Manager :cite:`MMoverview`, a commonly used microscopy control platform. * **Automated troubleshooting**. OpenWFS provides tools for automated troubleshooting of wavefront shaping experiments. This includes tools for measuring the performance of wavefront shaping algorithms, and for identifying common problems such as incorrect SLM calibration, drift, measurement noise, and other experimental imperfections. + + .. 
only:: latex - Here, we first show how to get started using OpenWFS for simulating and controlling wavefront shaping experiments. An in-depth discussion of the core design of OpenWFS is given in :numref:`Key concepts`. Key to any wavefront shaping experiment is the SLM. The support for advanced options like texture warping and the use of a software lookup table are explained in :numref:`section-slms`. + Here, we first show how to get started using OpenWFS for simulating and controlling wavefront shaping experiments. An in-depth discussion of the core design of OpenWFS is given in :numref:`Key concepts`. Key to any wavefront shaping experiment is the SLM. The support for advanced options like texture mapping and the use of a software lookup table are explained in :numref:`section-slms`. The ability to simulate optical experiments is essential for the rapid development and debugging of wavefront shaping algorithms. The built-in options for realistically simulating experiments are discussed in :numref:`section-simulations`. Finally, OpenWFS is designed to be modular and easy to extend. In :numref:`section-development`, we show how to write custom hardware control modules. Note that not all functionality of the package is covered in this document, and we refer to the API documentation :cite:`openwfsdocumentation` for a complete overview of the most recent version of the package. Getting started ---------------------- -OpenWFS is available on the PyPI repository, and it can be installed with the command ``pip install openwfs``. The latest documentation and the example code can be found on the `Read the Docs `_ website :cite:`openwfsdocumentation`, and the entire repository can be found on :cite:`openwfsgithub`. To use OpenWFS, you need to have Python 3.9 or later installed. At the time of writing, OpenWFS is tested up to Python version 3.11 (not all dependencies were available for Python 3.12 yet). OpenWFS is developed and tested on Windows 11 and Manjaro Linux.
Note that for certain hardware components, third party software needs to be installed. This is always mentioned in the documentation and docstrings of these functions. +To use OpenWFS, you need to have Python 3.9 or later installed. At the time of writing, OpenWFS is tested up to Python version 3.11 on Windows 11 and Manjaro Linux. OpenWFS is available on the PyPI repository. To install it, run the following command: + +.. code-block:: bash + + pip install openwfs[all] + +This will also install the optional dependencies for OpenWFS, such as ``PyOpenGL``, ``nidaqmx`` and ``harvesters``, which are used for OpenGL-accelerated SLM control, scanning microscopy, and camera control, respectively. If these dependencies cannot be installed on your system, the installation will fail. At the time of writing, this can happen with ``PyOpenGL`` on systems without an OpenGL driver installed, or for ``harvesters``, which currently only works for Python versions up to 3.11. You can instead install OpenWFS without dependencies by omitting ``[all]`` in the installation command, and then install only the required dependencies as indicated in the API documentation for each hardware component. The latest documentation and the example code can be found on the `Read the Docs `_ website :cite:`openwfsdocumentation`, and the source code can be found on GitHub :cite:`openwfsgithub`. :numref:`hello-wfs` shows an example of how to use OpenWFS to run a simple wavefront shaping experiment. This example illustrates several of the main concepts of OpenWFS. First, the code initializes objects to control a spatial light modulator (SLM) connected to a video port, and a camera that provides feedback to the wavefront shaping algorithm. @@ -59,31 +69,10 @@ OpenWFS is available on the PyPI repository, and it can be installed with the co
-This example uses the `~.StepwiseSequential` wavefront shaping algorithm :cite:`vellekoop2008phase`. The algorithm needs access to the SLM for controlling the wavefront. This feedback is obtained from a :class:`~.SingleRoi` object, which takes images from the camera, and averages them over the specified circular region of interest. The algorithm returns the measured transmission matrix in the field `results.t`, which is used to compute the optimal phase pattern to compensate the aberrations. Finally, the code measures the intensity at the detector before and after applying the optimized phase pattern. - -This code illustrates how OpenWFS separates the concerns of the hardware control (:class:`~.SLM` and :class:`~.Camera`), signal processing (:class:`~.SingleRoi(Processor)`) and the algorithm itself (:class:`~.StepwiseSequential`). A large variety of wavefront shaping experiments can be performed by using different types of feedback signals (such as optimizing multiple foci simultaneously using a :class:`~.MultiRoi(Processor)` object), using different algorithms, or different image sources, such as a :class:`~.ScanningMicroscope`. Notably, these objects can be replaced by *mock* objects, that simulate the hardware and allow for rapid prototyping and testing of new algorithms without direct access to wavefront shaping hardware (see :numref:`section-simulations`). - - -Analysis and troubleshooting ------------------------------------------------- -The principles of wavefront shaping are well established, and under close-to-ideal experimental conditions, it is possible to accurately predict the signal enhancement. In practice, however, there exist many practical issues that can negatively affect the outcome of the experiment. OpenWFS has built-in functions to analyze and troubleshoot the measurements from a wavefront shaping experiment. 
- -The ``result`` structure in :numref:`hello-wfs`, as returned by the wavefront shaping algorithm, was computed with the utility function :func:`analyze_phase_stepping`. This function extracts the transmission matrix from phase stepping measurements, and additionally computes a series of troubleshooting statistics in the form of a *fidelity*, which is a number that ranges from 0 (no sensible measurement possible) to 1 (perfect situation, optimal focus expected). These fidelities are: - -* :attr:`~.WFSResults.fidelity_noise`: The fidelity reduction due to noise in the measurements. -* :attr:`~.WFSResults.fidelity_amplitude`: The fidelity reduction due to unequal illumination of the SLM. -* :attr:`~.WFSResults.fidelity_calibration`: The fidelity reduction due to imperfect phase response of the SLM. - -If these fidelities are much lower than 1, this indicates a problem in the experiment, or a bug in the wavefront shaping experiment. For a comprehensive overview of the practical considerations in wavefront shaping and their effects on the fidelity, please see :cite:`Mastiani2024PracticalConsiderations`. - -Further troubleshooting can be performed with the :func:`~.troubleshoot` function, which estimates the following fidelities: - -* :attr:`~.WFSTroubleshootResult.fidelity_non_modulated`: The fidelity reduction due to non-modulated light., e.g. due to reflection from the front surface of the SLM. -* :attr:`~.WFSTroubleshootResult.fidelity_decorrelation`: The fidelity reduction due to decorrelation of the field during the measurement. +This example uses the :class:`~.StepwiseSequential` wavefront shaping algorithm :cite:`vellekoop2008phase`. The algorithm needs access to the SLM for controlling the wavefront. This feedback is obtained from a :class:`~.SingleRoi` object, which takes images from the camera, and averages them over the specified circular region of interest. 
The algorithm returns the measured transmission matrix in the field ``results.t``, which is used to compute the optimal phase pattern to compensate the aberrations. Finally, the code measures the intensity at the detector before and after applying the optimized phase pattern. -All fidelity estimations are combined to make an order of magnitude estimation of the expected enhancement. :func:`~.troubleshoot` returns a ``WFSTroubleshootResult`` object containing the outcome of the different tests and analyses, which can be printed to the console as a comprehensive troubleshooting report with the method :meth:`~.WFSTroubleshootResult.report()`. See ``examples/troubleshooter_demo.py`` for an example of how to use the automatic troubleshooter. +This code illustrates how OpenWFS separates the concerns of the hardware control (:class:`~.SLM` and :class:`~.Camera`), signal processing (:class:`~.SingleRoi`) and the algorithm itself (:class:`~.StepwiseSequential`). A large variety of wavefront shaping experiments can be performed by using different types of feedback signals (such as optimizing multiple foci simultaneously using a :class:`~.MultiRoi` object), using different algorithms, or different image sources, such as a :class:`~.ScanningMicroscope`. Notably, these objects can be replaced by *mock* objects that simulate the hardware and allow for rapid prototyping and testing of new algorithms without direct access to wavefront shaping hardware (see :numref:`section-simulations`). -Lastly, the :func:`~.troubleshoot` function computes several image frame metrics such as the *unbiased contrast to noise ratio* and *unbiased contrast enhancement*. These metrics are especially useful for scenarios where the contrast is expected to improve due to wavefront shaping, such as in multi-photon excitation fluorescence (multi-PEF) microscopy.
Furthermore, :func:`~.troubleshoot` tests the image capturing repeatability and runs a stability test by capturing and comparing many frames over a longer period of time. diff --git a/docs/source/references.bib b/docs/source/references.bib index 404a9f5..fc7d9b5 100644 --- a/docs/source/references.bib +++ b/docs/source/references.bib @@ -6,23 +6,22 @@ @book{goodman2015statistical } - @article{Piestun2012, - abstract = {We introduce genetic algorithms (GA) for wavefront control to focus light through highly scattering media. We theoretically and experimentally compare GAs to existing phase control algorithms and show that GAs are particularly advantageous in low signal-to-noise environments.}, - author = {Rafael Piestun and Albert N. Brown and Antonio M. Caravaca-Aguirre and Donald B. Conkey}, - doi = {10.1364/OE.20.004840}, - issn = {1094-4087}, - issue = {5}, - journal = {Optics Express, Vol. 20, Issue 5, pp. 4840-4849}, - keywords = {Optical trapping,Phase conjugation,Phase shift,Scattering media,Spatial light modulators,Turbid media}, - month = {2}, - pages = {4840-4849}, - pmid = {22418290}, - publisher = {Optica Publishing Group}, - title = {Genetic algorithm optimization for focusing through turbid media in noisy environments}, - volume = {20}, - url = {https://opg.optica.org/viewmedia.cfm?uri=oe-20-5-4840&seq=0&html=true https://opg.optica.org/abstract.cfm?uri=oe-20-5-4840 https://opg.optica.org/oe/abstract.cfm?uri=oe-20-5-4840}, - year = {2012}, + abstract = {We introduce genetic algorithms (GA) for wavefront control to focus light through highly scattering media. We theoretically and experimentally compare GAs to existing phase control algorithms and show that GAs are particularly advantageous in low signal-to-noise environments.}, + author = {Rafael Piestun and Albert N. Brown and Antonio M. Caravaca-Aguirre and Donald B. Conkey}, + doi = {10.1364/OE.20.004840}, + issn = {1094-4087}, + issue = {5}, + journal = {Optics Express}, + keywords = {Optical trapping,Phase conjugation,Phase shift,Scattering media,Spatial light modulators,Turbid media}, + month = {2}, + pages = {4840-4849}, + pmid = {22418290}, + publisher = {Optica Publishing Group}, + title = {Genetic algorithm optimization for focusing through turbid media in noisy environments}, + volume = {20}, + url = {https://opg.optica.org/oe/abstract.cfm?uri=oe-20-5-4840}, + year = {2012}, } @@ -368,6 +367,13 @@ @article{vellekoop2008phase publisher = {Elsevier} } +@misc{PopoffslmPy, + author = {S. Popoff}, + title = {slmPy: A simple Python module to interact with spatial light modulators}, + year = {2017}, + howpublished = {\url{https://github.com/wavefrontshaping/slmPy}}, +} + @article{Liu2017, author = {Yan Liu et al.}, journal = {Optica}, @@ -460,6 +466,7 @@ @misc{mmCoreAndDevices title = {Micro-Manager mmCoreAndDevices repository}, url = {https://github.com/micro-manager/mmCoreAndDevices}, } + @misc{MMoverview, author = {Mark Tsuchida and Sam Griffin}, title = {Micro-Manager Project Overview}, @@ -484,7 +491,7 @@ @article{Anderson2024 publisher = {IOP Publishing}, title = {A modular GUI-based program for genetic algorithm-based feedback-assisted wavefront shaping}, volume = {6}, - url = {https://iopscience.iop.org/article/10.1088/2515-7647/ad6ed3 https://iopscience.iop.org/article/10.1088/2515-7647/ad6ed3/meta}, + url = {https://iopscience.iop.org/article/10.1088/2515-7647/ad6ed3}, year = {2024}, } @@ -566,6 +573,15 @@ @ARTICLE{Astropy2022 adsnote = {Provided by the SAO/NASA Astrophysics Data System} } +@article{Cox2024, + author = {Dani{\"e}l W.S. Cox and Ivo M.
Vellekoop}, + title = {Orthonormalization of phase-only basis functions}, + journal = {ArXiv}, + volume = {2409.04565}, + year = {2024}, + doi = {10.48550/arXiv.2409.04565}, +} + @article{Lai2015, author = {Lai, Puxiang and Wang, Li and Wang, Lihong}, year = {2015}, diff --git a/docs/source/troubleshooting.rst b/docs/source/troubleshooting.rst new file mode 100644 index 0000000..78966e4 --- /dev/null +++ b/docs/source/troubleshooting.rst @@ -0,0 +1,22 @@ +.. _troubleshooting: + +Analysis and troubleshooting +================================================== +The principles of wavefront shaping are well established, and under close-to-ideal experimental conditions, it is possible to accurately predict the signal enhancement. In practice, however, there exist many practical issues that can negatively affect the outcome of the experiment. OpenWFS has built-in functions to analyze and troubleshoot the measurements from a wavefront shaping experiment. + +The ``result`` structure in :numref:`hello-wfs`, as returned by the wavefront shaping algorithm, was computed with the utility function :func:`analyze_phase_stepping`. This function extracts the transmission matrix from phase stepping measurements, and additionally computes a series of troubleshooting statistics in the form of a *fidelity*, which is a number that ranges from 0 (no sensible measurement possible) to 1 (perfect situation, optimal focus expected). These fidelities are: + +* :attr:`~.WFSResults.fidelity_noise`: The fidelity reduction due to noise in the measurements. +* :attr:`~.WFSResults.fidelity_amplitude`: The fidelity reduction due to unequal illumination of the SLM. +* :attr:`~.WFSResults.fidelity_calibration`: The fidelity reduction due to imperfect phase response of the SLM. + +If these fidelities are much lower than 1, this indicates a problem in the experiment, or a bug in the wavefront shaping code.
For a comprehensive overview of the practical considerations in wavefront shaping and their effects on the fidelity, please see :cite:`Mastiani2024PracticalConsiderations`. + +Further troubleshooting can be performed with the :func:`~.troubleshoot` function, which estimates the following fidelities: + +* :attr:`~.WFSTroubleshootResult.fidelity_non_modulated`: The fidelity reduction due to non-modulated light, e.g. due to reflection from the front surface of the SLM. +* :attr:`~.WFSTroubleshootResult.fidelity_decorrelation`: The fidelity reduction due to decorrelation of the field during the measurement. + +All fidelity estimations are combined to make an order-of-magnitude estimate of the expected enhancement. :func:`~.troubleshoot` returns a ``WFSTroubleshootResult`` object containing the outcome of the different tests and analyses, which can be printed to the console as a comprehensive troubleshooting report with the method :meth:`~.WFSTroubleshootResult.report()`. See ``examples/troubleshooter_demo.py`` for an example of how to use the automatic troubleshooter. + +Lastly, the :func:`~.troubleshoot` function computes several image frame metrics such as the *unbiased contrast to noise ratio* and *unbiased contrast enhancement*. These metrics are especially useful for scenarios where the contrast is expected to improve due to wavefront shaping, such as in multi-photon excitation fluorescence (multi-PEF) microscopy. Furthermore, :func:`~.troubleshoot` tests the image capturing repeatability and runs a stability test by capturing and comparing many frames over a longer period of time.
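The role of phase stepping in these fidelity estimates can be made concrete with a small NumPy sketch. It recovers a single transmission-matrix element from noisy phase-stepping measurements, and forms a crude noise-fidelity estimate from the harmonics that carry no signal. This illustrates the principle only; the variable names and the exact estimator are not those used by :func:`analyze_phase_stepping`.

```python
import numpy as np

# Single transmission-matrix element t measured against a reference field a,
# using P equidistant phase steps: I_p = |a + t * exp(i * 2*pi*p/P)|^2 + noise
rng = np.random.default_rng(0)
P = 16
phases = 2 * np.pi * np.arange(P) / P
a = 1.0                                  # co-propagating reference field
t = 0.5 * np.exp(0.7j)                   # unknown element to recover
sigma = 0.05                             # additive measurement noise level

I = np.abs(a + t * np.exp(1j * phases)) ** 2 + rng.normal(0.0, sigma, P)

spectrum = np.fft.fft(I) / P
t_est = spectrum[1] / np.conj(a)         # harmonic +1 carries conj(a) * t

# Harmonics 2 .. P-2 contain no signal, only noise; comparing their energy
# to the signal energy gives a rough noise-fidelity estimate
signal = np.abs(spectrum[1]) ** 2
noise = np.mean(np.abs(spectrum[2 : P - 1]) ** 2)
fidelity_noise_est = signal / (signal + noise)
```

With more noise, the energy in the signal-free harmonics grows and the estimated fidelity drops towards 0, mirroring the behavior of :attr:`~.WFSResults.fidelity_noise`.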