Update Examples.rst
LMBooth authored Sep 2, 2023
1 parent eaf9085 commit d816556
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions docs/BackgroundInformation/Examples.rst
@@ -4,7 +4,7 @@ Examples

The following examples can all be found on the `PyBCI github <https://github.com/LMBooth/pybci/tree/main/pybci/Examples>`_.

Note: all the examples shown that are not in a dedicated folder work with the PsuedoLSLStreamGenerator found in `PsuedoLSLSreamGenerator/mainSend.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/PsuedoLSLStreamGenerator/mainSend.py>`_; if using your own LSL-capable hardware you may need to adjust the scripts accordingly.
Note: all the examples shown that are not in a dedicated folder work with the pseudoLSLStreamGenerator found in `pseudoLSLSreamGenerator/mainSend.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/pseudoLSLStreamGenerator/mainSend.py>`_; if using your own LSL-capable hardware you may need to adjust the scripts accordingly.

.. list-table:: PyBCI Examples
:widths: 25 75
@@ -14,7 +14,7 @@ Note all the examples shown that are not in a dedicated folder work with the Psu
- Description
* - `ArduinoHandGrasp/ <https://github.com/LMBooth/pybci/tree/main/pybci/Examples/ArduinoHandGrasp>`_
- Folder contains an LSL marker creator in `MarkerMaker.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/ArduinoHandGrasp/MarkerMaker.py>`_, which uses PyQt5 as an on-screen text stimulus and illustrates how LSL markers can be used for training. `ServoControl.ino <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/ArduinoHandGrasp/ServoControl/ServoControl.ino>`_ is designed for an Arduino Uno controlling 5 servo motors, each of which sets the position of an individual finger on a 3D-printed hand driven by serial commands. A `Myoware Muscle Sensor <https://myoware.com/products/muscle-sensor/>`_ is attached to analog pin A0 and read continuously over the serial connection. `ArduinoToLSL.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/ArduinoHandGrasp/ArduinoToLSL.py>`_ sends and receives serial data to and from the Arduino whilst pushing the A0 data to an LSL outlet, which is classified in `testArduinoHand.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/ArduinoHandGrasp/testArduinoHand.py>`_; testArduinoHand.py simultaneously sends a marker stream back to indicate which hand position to perform.
* - `PsuedoLSLSreamGenerator/mainSend.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/PsuedoLSLStreamGenerator/mainSend.py>`_
* - `pseudoLSLSreamGenerator/mainSend.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/pseudoLSLStreamGenerator/mainSend.py>`_
- Generates multiple channels on a given stream type at a given sample rate. A baseline signal is generated on an LSL stream outlet, and a PyQt button can be pressed to flag this signal on a separate LSL marker stream. The signal can be altered by 5 distinct markers for a configurable amount of time, allowing the user to experiment with various signal patterns for classification. NOTE: requires `PyQt5` and `pyqtgraph` to be installed for the data viewer.
* - `PupilLabsRightLeftEyeClose/ <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/PupilLabsRightLeftEyeClose/>`_
- Folder contains a basic Pupil Labs example using the headset as an LSL input device, classifying left and right eye closure with a custom extractor class. `RightLeftMarkers.py` uses tkinter to generate on-screen visual stimuli for right, left or both eyes open and sends the same stimuli as LSL markers, ideal for testing the eye-closure classifier. `bciGazeExample.py` illustrates how a 'simple' custom Pupil Labs feature extractor class can be passed for the gaze data, where the mean pupil diameter is taken for each eye and for both eyes and used as feature data, with NaNs (no-confidence samples) set to a value of 0.
@@ -25,7 +25,7 @@ Note all the examples shown that are not in a dedicated folder work with the Psu
* - `testPytorch.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/testPytorch.py>`_
- Provides an example of how to use a PyTorch neural net model as the classifier. (testRaw.py also has a PyTorch example with a CNN.)
* - `testRaw.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/testRaw.py>`_
- This example shows how raw time series across multiple channels can be used as an input by utilising a custom feature extractor class combined with a custom CNN PyTorch model when initialising PyBCI. The raw data from the data receiver thread comes in the form [samples, channels]. The data receiver threads slice data based on relative timestamps, so the number of samples received in the buffer for each window can vary with how frequently the device pushes LSL chunks; to mitigate this, a desired length is set and the data should be trimmed to the shape expected by the created model. Several channels are also dropped (with the PsuedoLSLSreamGenerator in mind) to reduce computational complexity, as raw time series over large windows give the neural net a lot of parameters to train.
- This example shows how raw time series across multiple channels can be used as an input by utilising a custom feature extractor class combined with a custom CNN PyTorch model when initialising PyBCI. The raw data from the data receiver thread comes in the form [samples, channels]. The data receiver threads slice data based on relative timestamps, so the number of samples received in the buffer for each window can vary with how frequently the device pushes LSL chunks; to mitigate this, a desired length is set and the data should be trimmed to the shape expected by the created model. Several channels are also dropped (with the pseudoLSLSreamGenerator in mind) to reduce computational complexity, as raw time series over large windows give the neural net a lot of parameters to train.
* - `testSimple.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/testSimple.py>`_
- Provides the simplest setup, where no specific streams or epoch settings are given; everything defaults to the sklearn SVM classifier and `GlobalEpochSettings() <https://github.com/LMBooth/pybci/blob/main/pybci/Configuration/EpochSettings.py>`_.
* - `testSklearn.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/testSklearn.py>`_
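The ArduinoHandGrasp entry above bridges a serial connection to an LSL outlet. A minimal sketch of that pattern with pyserial and pylsl is below; the port, baud rate and stream name are assumptions, and the real ArduinoToLSL.py also writes servo commands back to the board, which is omitted here.

.. code-block:: python

   # Sketch of forwarding an analog reading from serial to an LSL outlet.
   # Port, baud rate and stream name are assumptions; ArduinoToLSL.py may differ.
   import serial                      # pyserial
   from pylsl import StreamInfo, StreamOutlet

   ser = serial.Serial("/dev/ttyACM0", 115200, timeout=1)                # assumed port/baud
   info = StreamInfo("MyowareA0", "EMG", 1, 0, "float32", "myoware-a0")  # irregular rate (0)
   outlet = StreamOutlet(info)

   while True:
       line = ser.readline().decode(errors="ignore").strip()  # e.g. "512" from analogRead(A0)
       if not line:
           continue
       try:
           outlet.push_sample([float(line)])                   # one A0 value per LSL sample
       except ValueError:
           pass                                                # ignore malformed serial lines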
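The PsuedoLSLSreamGenerator entry above boils down to a continuous multi-channel signal outlet plus a separate marker outlet, both standard pylsl calls. A hedged sketch of the same idea follows; the stream names, rate, channel count and marker label are illustrative assumptions, not the configuration used in mainSend.py (which also provides the PyQt GUI and data viewer).

.. code-block:: python

   # Minimal pseudo LSL generator: baseline noise on a signal outlet plus a marker
   # outlet. Names, rates and marker labels are assumptions, not mainSend.py's values.
   import time
   import numpy as np
   from pylsl import StreamInfo, StreamOutlet

   sample_rate = 250   # Hz, assumed
   n_channels = 8      # assumed channel count

   data_outlet = StreamOutlet(
       StreamInfo("PseudoSignal", "EMG", n_channels, sample_rate, "float32", "pseudo-signal-001"))
   marker_outlet = StreamOutlet(
       StreamInfo("PseudoMarkers", "Markers", 1, 0, "string", "pseudo-markers-001"))

   next_marker = time.time() + 5.0                 # stand-in for the PyQt button press
   while True:
       data_outlet.push_sample(np.random.normal(0.0, 1.0, n_channels).tolist())  # baseline noise
       if time.time() >= next_marker:
           marker_outlet.push_sample(["Marker1"])  # one of the distinct marker types
           next_marker = time.time() + 5.0
       time.sleep(1.0 / sample_rate)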
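The PupilLabsRightLeftEyeClose entry above describes its features as the mean pupil diameter per eye (and for both eyes) with no-confidence NaNs set to 0. A tiny numpy sketch of that calculation is below; the function name and argument layout are illustrative assumptions, not the interface used in bciGazeExample.py.

.. code-block:: python

   # Hypothetical helper for the Pupil Labs gaze features: mean diameter for the
   # left eye, right eye and both eyes, with NaNs (no-confidence samples) set to 0.
   import numpy as np

   def gaze_features(left_diameters, right_diameters):
       left = np.asarray(left_diameters, dtype=float)
       right = np.asarray(right_diameters, dtype=float)
       both = np.concatenate([left, right])
       # nanmean ignores NaN samples; nan_to_num covers the all-NaN (no confidence) case
       feats = np.array([np.nanmean(left), np.nanmean(right), np.nanmean(both)])
       return np.nan_to_num(feats, nan=0.0)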
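The testRaw.py entry above trims each received window to a fixed length and drops channels before handing raw data to the CNN. A small, library-agnostic sketch of that idea follows; the function name, target length and kept channels are illustrative assumptions rather than PyBCI's actual extractor interface.

.. code-block:: python

   # Hypothetical helper showing the trim-and-drop idea from testRaw.py; this is not
   # PyBCI's extractor API, just the underlying array manipulation.
   import numpy as np

   def trim_raw_window(window, target_samples=500, keep_channels=(0, 1)):
       """window: ndarray shaped [samples, channels] from the data receiver thread."""
       window = np.asarray(window, dtype=np.float32)
       window = window[:, list(keep_channels)]       # drop unused channels
       if window.shape[0] >= target_samples:         # too many samples: trim the excess
           window = window[:target_samples, :]
       else:                                         # too few samples: zero-pad to length
           pad = np.zeros((target_samples - window.shape[0], window.shape[1]), dtype=np.float32)
           window = np.vstack([window, pad])
       return window  # fixed shape [target_samples, len(keep_channels)] for the CNN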
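The testSimple.py entry above amounts to constructing PyBCI with defaults, waiting for the LSL streams, training on received epochs, then switching to test mode. A rough sketch of that flow is included below for orientation only; it is paraphrased from memory of the PyBCI quickstart, so the method names (`Connect`, `TrainMode`, `ReceivedMarkerCount`, `TestMode`, `StopThreads`) and the marker-count structure are assumptions that should be checked against the actual example.

.. code-block:: python

   # Rough sketch of a default PyBCI session; method names and the structure returned
   # by ReceivedMarkerCount() are assumed from the PyBCI quickstart and may differ.
   import time
   from pybci import PyBCI

   min_epochs = 10                 # assumed threshold before switching to test mode
   bci = PyBCI()                   # defaults: sklearn SVM and GlobalEpochSettings()

   while not bci.connected:        # wait for an LSL data stream and marker stream
       bci.Connect()
       time.sleep(1)

   bci.TrainMode()                 # collect epochs around received markers
   try:
       while True:
           time.sleep(1)
           markers = bci.ReceivedMarkerCount()   # assumed: dict of marker -> [id, count]
           if markers and min(c[1] for c in markers.values()) > min_epochs:
               bci.TestMode()                    # enough epochs of each marker: classify live
               break
   except KeyboardInterrupt:
       bci.StopThreads()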
