Online Prediction

Faheem Ershad edited this page May 14, 2018 · 18 revisions

Running the Online Classifiers

This guide details the procedure for running the final version of the code developed for this project. Anyone who wants to continue, modify, or offer suggestions on any aspect of the project is welcome to open an issue in the repository. Keep in mind that this code and guide reflect our progress in the time allotted and can certainly be improved. This guide covers only the Two Binary Classifiers version (no-movement, left, and right classes); the Left/Right version follows a similar procedure.

  1. Set up OpenViBE if you have not already, following this link from OpenViBE's website or this link from OpenBCI's website. Their instructions have also been copied to the README of the Custom-OpenViBE-Scenarios branch in case the links are removed or changed. We used version 2.0.1.

  2. Turn on the Cyton Board and plug the dongle into a USB port on your computer.

  3. Run the OpenViBE acquisition program. Click Connect, then Play. A red LED should light up on the dongle beside the blue LED.

  4. Download this scenario, which has been customized from the provided OpenViBE scenarios, and save it in the directory shown in the screenshot below. Run the OpenViBE Designer, click File, then Open, navigate to the path shown below, select the scenario you downloaded, and click Open.

  5. This is what the customized scenario looks like. Compared with the original file, a Channel Selector box, two Temporal Filter boxes (a 60 Hz notch and an 8-30 Hz bandpass), and an LSL Export (Gipsa) box have been added.

  6. Click the play icon to run the scenario and wait until the raw and filtered signals appear. Only five electrodes, over Fc1, Fc2, C3, C4, and Cz (channels 3-7 for us), are used for this project. Once the raw signals can be observed and appear stable, move on to the next step.
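
The channel selection in step 6 can be sketched as follows. This is a pure-Python illustration, not code from the project (the actual scripts are in MATLAB, and the function and variable names here are hypothetical); it shows how the five motor-cortex channels would be picked out of an 8-channel Cyton sample.

```python
# Hypothetical sketch: pick channels 3-7 (one-based) out of an
# 8-channel Cyton sample, mirroring the Channel Selector box.
MOTOR_CHANNELS = [3, 4, 5, 6, 7]  # Fc1, Fc2, C3, C4, Cz in our montage

def select_motor_channels(sample, channels=MOTOR_CHANNELS):
    """Return only the motor-cortex channels from one 8-channel sample."""
    return [sample[ch - 1] for ch in channels]  # one-based -> zero-based
```
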

  7. The Cursor Control GUI was made to work with 100% display scaling. Different computers have different recommended scaling, so set yours to 100% if it is not already. On Windows (specifically Windows 10), right-click the desktop, select Display Settings, and choose 100% under the Scale and layout section.

  8. Download all the cursor control files from the Two Binary Classifiers folder in the Cursor-control branch. Open CursorMovement.m in MATLAB (R2018a at the time of writing), run the script, and maximize the GUI window. A two-monitor setup is recommended for running the cursor control and the online prediction together: the GUI should be on screen 1, and the online classifier (the second instance of MATLAB) should run on screen 2.

  9. Download all the files in the Two Binary Classifiers folder in the Online-Training-and-Prediction branch. Open a second instance of MATLAB and open the file SynchronousNLRClassifierFinal.m, which was built from the ReceiveData.m example provided with the Lab Streaming Layer (LSL) MATLAB examples. This step assumes that you have LSL set up; if you do not, please see the Streaming-via-LSL branch. To change the training time, alter the variables trialtime, traintime (recommended: a multiple of trialtime x numtrials, plus 1), and startprediction (recommended: a multiple of trialtime x numtrials, plus 8). These additions account for a 1-second delay before the start of the training session and allow enough time (7 seconds) for the classifiers to be trained between the training and prediction sessions. If your screen resolution is not 1920 x 1080, change the ScDim variable (near the beginning of the script) to your screen resolution. You may also need to change the positions of the no-movement, right, and left clicks, which can be found near the end of the script.
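
The timing relationship in step 9 can be sketched as below. This is an illustrative Python snippet, not the MATLAB script itself; the example values for trialtime and numtrials are hypothetical, and only the +1/+8 offsets described above are taken from the source.

```python
def training_schedule(trialtime, numtrials):
    """Compute traintime and startprediction as recommended above:
    a 1 s delay before the training session starts, plus 7 more
    seconds after training ends for the classifiers to be trained
    before prediction begins."""
    traintime = trialtime * numtrials + 1        # end of training session
    startprediction = trialtime * numtrials + 8  # start of prediction
    return traintime, startprediction

# Hypothetical values: 10 trials of 4 s each.
print(training_schedule(4, 10))  # -> (41, 48)
```
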

  10. Run the script. On screen 1 (left side of the screenshot) and in the MATLAB command window (screen 2, right), you will see directions to either rest or close/open your right/left hand. Note that this is motor execution, but it can easily be adapted for motor imagery. The cross-validation and training accuracies will be displayed in the command window and in a message box for each classifier.
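
One way to think about the cued training protocol in step 10 is as a fixed sequence of prompts cycled over the trials. This Python sketch is only illustrative; the actual cue order and wording in SynchronousNLRClassifierFinal.m may differ.

```python
# Hypothetical cue cycle: rest trials interleaved with right/left
# motor-execution trials (the real script's order may differ).
CUES = ["rest", "close/open right hand", "rest", "close/open left hand"]

def cue_for_trial(n, cues=CUES):
    """Return the cue displayed for trial n (0-based), cycling CUES."""
    return cues[n % len(cues)]
```
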

  11. Once the message box shows "Training session is finished", the prediction commands will appear. As each prediction shows up in the MATLAB command window on screen 2, the cursor moves in the predicted direction (currently a fixed amount) on screen 1 and then automatically clicks on the GUI. Note that the GUI targets and the commands from the message boxes do not correspond exactly: this code was initially written for an asynchronous BCI and was changed to a synchronous BCI due to time constraints.
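
The synchronous cursor update in step 11 amounts to moving a fixed amount per prediction. The following Python sketch illustrates the idea only; the step size, screen width, and names here are hypothetical and not taken from the MATLAB script.

```python
STEP = 100  # hypothetical fixed horizontal move per prediction, in pixels

def next_cursor_pos(pos, label, step=STEP, width=1920):
    """Move the cursor a fixed amount left/right (or stay put for
    "no movement") and clamp it to the screen."""
    x, y = pos
    if label == "left":
        x -= step
    elif label == "right":
        x += step
    # "no movement" leaves the cursor where it is
    return (max(0, min(width - 1, x)), y)
```

After each move, the script would then issue the automatic click at the new position.
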

  12. To stop the prediction, use your trackpad/mouse to pause the code in MATLAB, or navigate to the OpenViBE acquisition/designer programs and stop/close them. If you have a touch screen, it is easier to go back to the OpenViBE acquisition window and disconnect. That's it!

To analyze the data from the training session and online prediction (feature vs. feature plot), see the Analysis wiki page!
