
Commit

Merge branch 'main' of https://github.com/LMBooth/pybci
LMBooth committed Sep 2, 2023
2 parents 86f015b + 375980d commit a8b30ee
Showing 2 changed files with 4 additions and 4 deletions.
6 changes: 3 additions & 3 deletions docs/BackgroundInformation/Examples.rst
@@ -4,7 +4,7 @@ Examples

The following examples can all be found on the `PyBCI github <https://github.com/LMBooth/pybci/tree/main/pybci/Examples>`_.

- Note all the examples shown that are not in a dedicated folder work with the PsuedoLSLStreamGenerator found in `PsuedoLSLSreamGenerator/mainSend.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/PsuedoLSLStreamGenerator/mainSend.py>`_, if using with own LSL capable hardware you may need to adjust the scripts accordingly.
+ Note all the examples shown that are not in a dedicated folder work with the pseudoLSLStreamGenerator found in `pseudoLSLSreamGenerator/mainSend.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/pseudoLSLStreamGenerator/mainSend.py>`_, if using with own LSL capable hardware you may need to adjust the scripts accordingly.

.. list-table:: PyBCI Examples
:widths: 25 75
@@ -14,7 +14,7 @@ Note all the examples shown that are not in a dedicated folder work with the Psu
- Description
* - `ArduinoHandGrasp/ <https://github.com/LMBooth/pybci/tree/main/pybci/Examples/ArduinoHandGrasp>`_
- Folder contains an LSL marker creator in `MarkerMaker.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/ArduinoHandGrasp/MarkerMaker.py>`_ using PyQt5 as an on screen text stimulus, illustrates how LSL markers can be used to train. `ServoControl.ino <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/ArduinoHandGrasp/ServoControl/ServoControl.ino>`_ is designed for an arduino uno which controls 5 servo motors, each of which control the position of an indidividual finger for a 3D printed hand which can be controlled via serial commands. There is also a `Myoware Muscle Sensor <https://myoware.com/products/muscle-sensor/>`_ attached to analog pin A0 being read continuously over the serial connection. `ArduinoToLSL.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/ArduinoHandGrasp/ArduinoToLSL.py>`_ is used to send and receive serial data to and from the arduino, whilst pushing the A0 data to an LSL outlet which is classified in `testArduinoHand.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/ArduinoHandGrasp/testArduinoHand.py>`_, whilst simultaneously receiving a marker stream from testArduinoHand.py to inform which hand position to do.
- * - `PsuedoLSLSreamGenerator/mainSend.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/PsuedoLSLStreamGenerator/mainSend.py>`_
+ * - `pseudoLSLSreamGenerator/mainSend.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/pseudoLSLStreamGenerator/mainSend.py>`_
- Generates multiple channels on a given stream type at a given sample rate. A baseline signal is generated on an LSL stream outlet and a PyQt button can be pressed to signify this signal on a separate LSL marker stream. The signal can be altered by 5 distinct markers for a configurable amount of time, allowing the user to play with various signal patterns for clasification. NOTE: Requires `PyQt5` and `pyqtgraph` installs for data viewer.
* - `PupilLabsRightLeftEyeClose/ <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/PupilLabsRightLeftEyeClose/>`_
- Folder contains example basic pupil labs example as LSL input device, classifying left and right eye closed with a custom extractor class. `RightLeftMarkers.py` uses tkinter to generate visual on-screen stimuli for only right, left or both eyes open, sends same onscreen stimuli as LSL markers, ideal for testing pupil-labs eyes classifier test. `bciGazeExample.py` Illustrates how a 'simple' custom pupil-labs feature extractor class can be passed for the gaze data, where the mean pupil diameter is taken for each eye and both eyes and used as feature data, where nans for no confidence are set to a value of 0.
@@ -25,7 +25,7 @@ Note all the examples shown that are not in a dedicated folder work with the Psu
* - `testPytorch.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/testPytorch.py>`_
- Provides an example of how to use a Pytorch Neural net Model as the classifier. (testRaw.py also has a Pytorch example with a C-NN).
* - `testRaw.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/testRaw.py>`_
- - This example shows how raw time series across multiple channels can be used as an input by utilising a custom feature extractor class, combined with a custom C-NN Pytorch model when initialising PyBCI. The raw data from the data receiver thread comes in the form [samples, channels], the data receiver threads slice data based on relative timestamps meaning depending on the devices frequency for pushing LSL chunks can vary the number of samples received in the buffer for each window, to mitigate this a desired length is set and data should be trimmed based on the expected data for the created model. Multiple channels are also dropped (with the PsuedoLSLSreamGenerator in mind) to save computational complexity as raw time series over large windows can give a lot of parameters for the neural net to train.
+ - This example shows how raw time series across multiple channels can be used as an input by utilising a custom feature extractor class, combined with a custom C-NN Pytorch model when initialising PyBCI. The raw data from the data receiver thread comes in the form [samples, channels], the data receiver threads slice data based on relative timestamps meaning depending on the devices frequency for pushing LSL chunks can vary the number of samples received in the buffer for each window, to mitigate this a desired length is set and data should be trimmed based on the expected data for the created model. Multiple channels are also dropped (with the pseudoLSLSreamGenerator in mind) to save computational complexity as raw time series over large windows can give a lot of parameters for the neural net to train.
* - `testSimple.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/testSimple.py>`_
- Provides the simplest setup, where no specific streams or epoch settings are given, all default to sklearn SVM classifier and `GlobalEpochSettings() <https://github.com/LMBooth/pybci/blob/main/pybci/Configuration/EpochSettings.py>`_.
* - `testSklearn.py <https://github.com/LMBooth/pybci/blob/main/pybci/Examples/testSklearn.py>`_
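The testRaw.py entry above notes that LSL devices push chunks at varying rates, so the number of samples sliced into each [samples, channels] window buffer varies, and buffers must be trimmed to the length the model expects. A minimal standalone sketch of that normalisation step, in plain Python (this helper and its name are illustrative, not part of the PyBCI API):

```python
def fix_window_length(window, desired_samples):
    """Trim or zero-pad a [samples, channels] window to a fixed row count.

    Devices push LSL chunks at varying rates, so the number of samples
    received per epoch window can differ; a model with a fixed input
    shape needs every window normalised to the same length first.
    """
    if len(window) >= desired_samples:
        return window[:desired_samples]  # drop surplus samples from the end
    n_channels = len(window[0]) if window else 0
    pad_rows = desired_samples - len(window)
    return window + [[0.0] * n_channels for _ in range(pad_rows)]  # zero-pad the shortfall
```

A custom feature extractor could apply something like this to each incoming buffer before flattening it into the neural net's input.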
2 changes: 1 addition & 1 deletion docs/index.rst
@@ -3,7 +3,7 @@ Welcome to the PyBCI documentation!

**PyBCI** is a Python package to create a Brain Computer Interface (BCI) with data synchronisation and pipelining handled by the `Lab Streaming Layer <https://github.com/sccn/labstreaminglayer>`_, machine learning with `Pytorch <https://pytorch.org/>`_, `scikit-learn <https://scikit-learn.org/stable/#>`_ or `TensorFlow <https://www.tensorflow.org/install>`_, leveraging packages like `Antropy <https://github.com/raphaelvallat/antropy>`_, `SciPy <https://scipy.org/>`_ and `NumPy <https://numpy.org/>`_ for generic time and/or frequency based feature extraction or optionally have the users own custom feature extraction class used.

- The goal of PyBCI is to enable quick iteration when creating pipelines for testing human machine and brain computer interfaces, namely testing applied data processing and feature extraction techniques on custom machine learning models. Training the BCI requires LSL enabled devices and an LSL marker stream for training stimuli. (The `examples folder <https://github.com/LMBooth/pybci/tree/main/pybci/Examples>`_ found on the github has a `pseudo LSL data generator and marker creator <https://github.com/LMBooth/pybci/tree/main/pybci/Examples/PsuedoLSLStreamGenerator>`_ in the `mainSend.py <https://github.com/LMBooth/pybci/tree/main/pybci/Examples/PsuedoLSLStreamGenerator/mainSend.py>`_ file so the examples can run without the need of LSL capable hardware.)
+ The goal of PyBCI is to enable quick iteration when creating pipelines for testing human machine and brain computer interfaces, namely testing applied data processing and feature extraction techniques on custom machine learning models. Training the BCI requires LSL enabled devices and an LSL marker stream for training stimuli. (The `examples folder <https://github.com/LMBooth/pybci/tree/main/pybci/Examples>`_ found on the github has a `pseudo LSL data generator and marker creator <https://github.com/LMBooth/pybci/tree/main/pybci/Examples/PseudoLSLStreamGenerator>`_ in the `mainSend.py <https://github.com/LMBooth/pybci/tree/main/pybci/Examples/PseudoLSLStreamGenerator/mainSend.py>`_ file so the examples can run without the need of LSL capable hardware.)

`Github repo here! <https://github.com/LMBooth/pybci/>`_

