
Inspecting BCILAB userdata imag_movements2 and running CSP calibration #46

Closed
faheemersh opened this issue Feb 9, 2018 · 2 comments
@faheemersh

So I'm following Christian Kothe's videos on YouTube as I do this. I'm trying to use CSP to create a model and perform feature extraction. Whenever I'm at the 'Calibrate a model' window, after specifying the approach and using the default parameters, I click on 'inspect data' and get the following error: Link

After that, I pressed OK and tried the calibration but received another error: Link

I have a few other questions about this data set in general, as I can't seem to find the details anywhere. Please excuse my lack of experience with BCI; this is my first real project in the area.

  1. What do the different sessions of the data mean? There are sessions 1, 2, 3, and 4.
  2. Is this a left- or right-handed movement? Basically, what exactly was the paradigm? The person waited a few seconds, a trigger was sent, and then the person moved their hand within a time window?
  3. Should I be using the default parameters after choosing CSP?
  4. How can we output the results of BCILAB into another MATLAB function? Should we just get the classification results from the lastmodel variable, if that's where the results are stored? (See the sketch after this list.)
  5. I'm trying to do asynchronous cursor control with an OpenBCI headset that my design group made. Below are the steps I'm planning on taking in BCILAB to do this.
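For questions 3 and 4, here's roughly what I imagine the scripted equivalent of the GUI workflow looks like, pieced together from the BCILAB scripting tutorial. The file path, the two marker names, and the epoch window are placeholders I made up for illustration, so please correct me if this is off:

```matlab
% load a calibration recording (placeholder path - point this at the actual tutorial file)
traindata = io_loadset('bcilab:/userdata/tutorial/imag_movements2/session1.set');

% define a CSP approach; the [0.5 3.5] s epoch after each cue is just a guessed default
myapproach = {'CSP' 'SignalProcessing',{'EpochExtraction',[0.5 3.5]}};

% calibrate a model; the marker names are placeholders for the data set's real event codes
[trainloss,lastmodel,laststats] = bci_train('Data',traindata, 'Approach',myapproach, ...
    'TargetMarkers',{'left','right'});
disp(['training mis-classification rate: ' num2str(trainloss*100,3) '%']);
```

If that's roughly right, then for question 4 the answer would be to call bci_train from a script and pass lastmodel on to my own function, rather than pulling it out of the GUI?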

First, I want to be able to do this offline, not asynchronously.

  • Once I am able to do CSP successfully with the provided tutorial data, I will import one of the text files created using the OpenBCI GUI recording (with the external trigger input to the cap to record markers)
  • Then I will create a new approach and choose CSP. The question I have here is what parameters I should be using, or at least how to determine them. Specifically, I'm referring to this: Link
  • After that, I will train the model on the recorded data from the cap
  • Then I will test the model with other recordings by going to 'apply model to data' (roughly as in the sketch after this list)
  • Finally I will take the classification results from the lastmodel variable and use them in a custom GUI for cursor control in MATLAB
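For the 'apply model to data' step, I'm assuming the scripted version is bci_predict, something like the sketch below. The test-file path is made up and I'm not sure about the exact layout of the prediction output, so treat this as a rough sketch rather than working code:

```matlab
% load a test recording (placeholder path to a converted OpenBCI recording)
testdata = io_loadset('/data/openbci/test_session.set');

% apply the previously calibrated model to the new recording
[prediction,loss,teststats,targets] = bci_predict(lastmodel,testdata);
disp(['test mis-classification rate: ' num2str(loss*100,3) '%']);

% if I understand the docs, prediction{2} holds the per-trial class probabilities and
% prediction{3} the class values; those would be what I feed into the cursor-control GUI
```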

Second, attempting to do it asynchronously:

  • Stream the time-series data into MATLAB via LSL (I can do this now). I'll go to online analysis --> read input from --> lab streaming layer. The data seems to come in batches.
  • Then I will define a new approach, CSP with LDA again, and choose the correct parameters
  • Take the classification results and use them in a custom GUI made in MATLAB for cursor movement control (see the sketch after this list)
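Here's my rough understanding of what the online part would look like in script form. The function names (run_readlsl, onl_newpredictor, onl_predict) come from my reading of the BCILAB online-processing documentation, but I'm not certain of the exact argument names or output format, so this is only a sketch:

```matlab
% pull the EEG stream from LSL into a BCILAB workspace stream named 'laststream'
run_readlsl('MatlabStream','laststream');

% attach a predictor that runs the previously calibrated model on that stream
onl_newpredictor('lastpredictor',lastmodel,'laststream');

% poll the predictor and forward its output to the cursor-control GUI
while true
    outputs = onl_predict('lastpredictor');  % e.g. class probabilities for left vs. right
    % ... update the cursor position in the custom MATLAB GUI from 'outputs' here ...
    pause(0.1);  % ~10 Hz update rate; just a placeholder value
end
```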

Is the general idea/process in the steps above correct?

Any help would be much appreciated!

@zuestal1 commented Mar 6, 2020

Hi faheemersh

I also work with OpenBCI and would like to use BCILAB.
It seems the tutorial does not work anymore, at least not without changes to the settings.

The required changes can be found in the Quick Start Guide:
https://sccn.ucsd.edu/wiki/BCILAB

Did you solve the problems described above? Did you continue working with BCILAB? It seems it's not really supported anymore. If you switched to another program, what did you choose for your project?

I would appreciate your insights!

@faheemersh (Author)

Hi zuestal1

I didn't end up solving any of those problems I had with BCILAB, and I actually decided against using it. I ended up using a program called OpenViBE instead.

You can check out my GitHub to see the information I have about OpenViBE. It worked out much better for me.
