
Anyone wanna help maintain this? #164

Open
qdot opened this issue May 11, 2016 · 24 comments

@qdot
Member

qdot commented May 11, 2016

So. Uh. Hi. Absentee repo maintainer here.

Just noticed there's tons of issues and pull requests, and I have no idea where to even start on cleaning this up. I haven't touched emokit in a long time.

Anyone interested in helping out on cleaning up issues and bringing in pull requests? I don't want to just start blindly doing so. :)

@olorin
Member

olorin commented May 11, 2016

As another absentee kinda-maintainer (I've had a lot less time/ability to contribute since I stopped using the EPOC myself), I'd like to suggest @bschumacher, if he is interested; he's done a lot of good work on this project.

Will also tag @nh2 in case he's interested in maintaining the C in addition to his Haskell version.

@nh2

nh2 commented May 11, 2016

Thanks for pinging me! While I would like to help, I currently don't have enough free capacity to maintain the C version in addition. I will continue to follow this with interest though.

@warrenarea

I know it doesn't answer your question, but I am starting to think the future of
EEG might be with OpenBCI -- http://openbci.com/ -- as they offer more sensor
positions, it's open source (fewer license restrictions), and it's helmet-like, which
means fewer artifacts from movement.

Although Emotiv has made strides in making the EPOC more sensitive, I still kind
of feel they are a little too rigid, especially when it comes to the raw data and
adapting to what users want, which I think is evident in how many programs are
available in their store. Couple that with the fact that they refuse to accept that
their plastic is brittle and cheap and can literally fall apart after minimal usage,
and it's not a good sign.

I did manage to port the Python 2.7 code to Python 3.3 and the latest release of
gevent (they updated gevent to work with 3.3), but for the sake of ease I eliminated
all the Linux code (hehe), so it's only good for Windows right now. Plus, I put in an
intro menu to let you select the interface you want to use, as there were some issues
with gevent trying to run two devices in parallel. My template for Python 3.3 should
get you started, though, if you decide to go that route.

Although, after porting to 3.3, I realized that there was actually no advantage in the
port other than keeping up with the new standards, so it's kind of a moot point and
I didn't consider it worth the trouble in the end.

I plan to keep working on my CyKit version soon. It does support connecting with
OpenVibe via TCP streaming, and I will probably release a plugin for Unity3D soon.

I'm not as fluent in C, so I probably couldn't be much help to you there, but I've got
a little knowledge about the Emotiv hardware: I've torn apart two and rebuilt mine twice. lol

Funny you messaged now, Kyle, as I just finished charging my headset yesterday.

@ghost

ghost commented May 11, 2016

I’ll help on the pull requests and issues for the Python code.


@qdot
Member Author

qdot commented May 11, 2016

@bschumacher, just added you to the org.

From the looks of the issues and pull requests, it looks like the python version is getting far more usage than the C version. Anyone know what kinda usage we're seeing of python vs. C vs. whatever else?

@nh2

nh2 commented May 11, 2016

Anyone know what kinda usage we're seeing of python vs. C vs. whatever else

I can't speak for Python vs C, but what people seemed to appreciate about my Haskell version was that I provide ready-built static Linux and Windows executables for my tool that exports the raw data in various ways (as JSON, via TCP for OpenVibe etc). Maybe that's something Emokit could also do? In that case, the C version would be handy.
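
The raw-data export nh2 describes (JSON, TCP for OpenVibe, etc.) is easy to sketch. The snippet below is only an illustration: `encode_sample` and its field names are made up for the example and are not emokit's or the Haskell tool's actual wire format.

```python
import json

def encode_sample(counter, sensors):
    """Serialize one EEG sample as a newline-delimited JSON record.

    counter is the 0-255 packet counter; sensors maps channel names
    (e.g. "AF3") to raw integer readings.  The field names here are
    illustrative only, not an established emokit format.
    """
    record = {"counter": counter, "sensors": sensors}
    return (json.dumps(record, sort_keys=True) + "\n").encode("utf-8")

# A TCP exporter would accept a connection and push each record, e.g.:
#   conn.sendall(encode_sample(n, readings))
```

A consumer such as an OpenVibe-style TCP reader would then connect and split the stream on newlines, one JSON record per sample.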

@qdot
Member Author

qdot commented May 11, 2016

Yup, that's totally something I'd meant to do in the first place. Would love to get binaries/packages together, also trying to do that on some other projects. Definitely worth filing more issues about. :)

@nikhiljay
nikhiljay commented Jun 17, 2016

I can help maintain this repo if you need any help. When I can, I try to answer questions and help with issues.

@kimmansur

I want to! I'm not working with it yet, but as soon as I finish my master's degree I will be working on it. Does anyone know how to do post-hoc analysis of the data with Affectiv, Expressiv, and Cognitiv? Or just with the raw EEG?


@mattrjohnson

I don't want to besmirch the fine work everyone has done so far, because it's a really impressive hack and works surprisingly well given all the weirdnesses of the Emotiv architecture, the differences between models, etc.

However -- in testing this for my own use, I have begun to wonder if this code base would benefit from a major refactoring. Part of this may come from my own biases, because I am a cranky old man who came up as a procedural C programmer and I'm not always a fan of the "Pythonic" way of doing things.

That said, there is some complexity intrinsic to this code base that could probably be taken out to make it easier to maintain. For one example, I think you could get by without any of the gevent stuff at all -- which could solve a lot of complications related to concurrency. There will still need to be some concurrency for Windows (if you can't use hidapi.hid_read() the way you can on *nix -- not sure of the history behind that decision), but it can be minimized.
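
The gevent-free design suggested here boils down to one blocking reader thread feeding a queue. A minimal sketch, assuming a blocking device read is available (as with hidapi's `hid_read()` on *nix): `read_packet` below is a stand-in callable, not emokit's API.

```python
import queue
import threading

def reader_loop(read_packet, out_queue, stop_event):
    """Single blocking reader: no gevent, no greenlets.

    read_packet stands in for a blocking device read (e.g. a hidapi
    read with a timeout); it returns a bytes packet, or a falsy
    value when nothing arrived before the timeout.
    """
    while not stop_event.is_set():
        packet = read_packet()
        if packet:
            out_queue.put(packet)

# Typical wiring: run the loop in one thread and decrypt/consume
# packets from the queue in the main thread, e.g.:
#   threading.Thread(target=reader_loop, args=(dev_read, q, stop)).start()
```

On Windows, where a blocking read may not be usable the same way, this one thread is the only concurrency the design needs.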

Similarly, I think the EmotivWriter class is probably unnecessary unless someone really intends to expand the writing-output options significantly (which I assume was the original intent). Right now, it's basically a wrapper for the built-in csv class. And I'm not sure if anyone actually uses the EmotivReader class in practice (maybe primarily useful for debugging?), but if not, that could go too.

Anyway, for my own private use, I'm currently working on a super-stripped-down version that simply logs all (fully-processed) values to a CSV file (and hopefully draws a display like render.py does, although that is currently dog-slow on Windows for me -- haven't gotten far enough to tell if that is an issue with pygame that I can't get around, or if it is specific to render.py's implementation).

If folks are interested, I can post it and see if anyone is interested in helping flesh it out or test it on their own systems. Trying to get a basic version done today, but we'll see if that actually happens...

@ppasler
Contributor

ppasler commented Nov 7, 2016

I used Emokit for the last year and really liked it. The changes @bschumacher made in an OO direction makes it easier to start with Emokit. As I am doing my master thesis at the moment using the EPOC+, I'd like to help improve Emokit and answer questions.

@mdtdev

mdtdev commented Nov 7, 2016

@mattrjohnson As mostly an end-user of this repo for getting data from an EPOC into a data processing scheme, what would help me the most would be really super simple tools, like command line programs, that wrap setup, connection, QC checking, and data logging (preferably with marker signals) into something like:

> emosetup -name emo1 -id 9A8DA8F1
   9A8DA8F1 as emo1 is active
> emoqc
   AF3   Green   95%
   ... 14 more channels ...
   AF4   Green   100%
> emolog -f experiment1.csv &
   Logging emo1 to experiment1.csv
   ...time passes...
> emolog -stop
   Logging emo 1 stopped.

So I am very interested in seeing your super simple logger as something that sort of fits into such a framework. But I am not enough of a developer to implement stuff like this myself so far. ☹️

That said, such tools are really more of a subproject, the project as it currently stands has a lot of extras that are good for ongoing development. But I would definitely like to see what you describe as a subproject!
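
The command sketch above could start life as a plain argparse skeleton. To be clear, everything below is hypothetical: the `emolog` program name and its flags mirror the sketch and are not part of emokit.

```python
import argparse

def build_parser():
    """Skeleton for a hypothetical emolog-style logger.

    The program name and flags mirror the sketch above; none of
    this exists in emokit today.
    """
    p = argparse.ArgumentParser(prog="emolog",
                                description="Log EPOC samples to CSV.")
    p.add_argument("-f", "--file", default="emokit.csv",
                   help="output CSV path")
    p.add_argument("--stop", action="store_true",
                   help="stop a running logger")
    return p
```

The setup and QC commands from the sketch would be sibling entry points built the same way.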

@mattrjohnson

@mdtdev You can check out our progress so far here: https://bitbucket.org/MattTheGr8/emofox

Right now, it appears to work fine with our EPOC on both Mac & Windows. Which isn't TOO surprising given that most of the core code was lifted straight from emokit and was just rearranged a bit to simplify it.

Currently it works like this: You run the single Python script (emofox.py), it asks you for a CSV filename for logging data, and then it pops up a window similar to render.py from emokit that displays the incoming data live (including quality values). It can also monitor and log a keyboard keypress (currently set to the '0' key) which our group uses for basic tagging of the beginning and end of events. Hitting the Q or Esc key quits the script and closes the log file.

It's not SUPER-flexible right now but it could be made more flexible with a medium-sized amount of work. If you know some Python, you could make some adaptations pretty easily (e.g. if you'd rather not show a visual graph, you could just comment out that stuff and replace it with some print statements). But things like that are not currently set up as explicit options on the user side. (There also aren't a lot of options for data logging -- it basically always logs, and logs everything it can. It would also be nice to add an option for logging keypresses for more than a single key, or things like mouse clicks or whatnot... but none of that is there yet.)

There also might be one or two bugs to track down, the main one being that occasionally it seems to drop a sample (or possibly get an extra sample) for reasons that aren't entirely clear to me. It might be happening on a lower level (e.g. the system just gives the script an incomplete or badly-formatted packet occasionally). I haven't delved too deeply into it because it happens pretty rarely (once every few minutes, if at all) and doesn't matter for what we're doing, but I don't think it will be TOO hard to figure out when we get the time to look at it.

The main benefit is that it is WAY faster than emokit, particularly when it comes to the graphical display (which for us was unusably slow on Windows in emokit). The code I wrote for that isn't particularly flexible, but it seems to work fine on all the computers I tested it on. Basically I pull some tricks to make it as fast as I reasonably can, the main one being that it renders the display in little bits and pieces between samples and then only draws the final display every 17 samples or so. Thus the graphing is not particularly smooth, but it seems to keep up with the pace of data acquisition just fine (i.e. we don't get behind on collecting data samples from the EPOC and the signal correctly renders in real time), even on the relatively slow Windows laptop we're using for data collection.
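
The batching trick described above (render incrementally, flush a full redraw roughly every 17 samples) is separable from pygame. A minimal sketch of just the throttling logic; the `draw` callback stands in for the actual blit and the class name is illustrative, not emofox's code:

```python
class ThrottledRenderer:
    """Buffer samples and flush a full redraw only every `interval`
    samples -- the trick for keeping the display from lagging behind
    acquisition.  `draw` receives the buffered batch and stands in
    for the expensive screen update.
    """

    def __init__(self, draw, interval=17):
        self.draw = draw
        self.interval = interval
        self.buffer = []

    def add_sample(self, sample):
        self.buffer.append(sample)
        if len(self.buffer) >= self.interval:
            self.draw(list(self.buffer))  # one expensive redraw per batch
            self.buffer = []
```

Because `draw` runs once per batch instead of once per sample, the per-sample cost stays close to a list append, which is why acquisition keeps up even on slow machines.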

Anyway, feel free to check it out and let me know (either here or on Bitbucket) how it works for you, if anyone would like to contribute or request features, etc. I don't have a ton of spare time to work on tweaks that aren't needed for our work anyway, but I'll try to contribute to the community when I can...

@Morgan243
Contributor

I'm interested in helping out. I have access to an EPOC+ which I'll be working with from a Linux machine. I've gotten some preliminary test data off of the device, so my next steps will likely be around validating results and improving some of the tooling.

@JuanaV

JuanaV commented Aug 8, 2017

Hi @Morgan243, I'm having a really bad time trying to get EEG raw data from an EPOC+ device. At this very moment, I can read the data but it doesn't look like reliable EEG data, it's more like random numbers, even the battery level oscillates randomly. I really don't know what the problem could be.
I'm working on Linux, any help would be highly appreciated.

This is the output I get from hid_info.py:
usage_page, 65535
product_id, 60674
interface_number, -1
manufacturer_string, Emotiv
vendor_id, 4660
release_number, 6
serial_number, UD20150217000AA2
usage, 2
path, USB_1234_ed02_14100000
product_string, Brain Computer Interface USB Receiver/Dongle

@warrenarea

Currently this version of emokit does not fully support the Epoc+.

I have posted a list of all 6 keys on another Issue page.
It uses model 6.

If the first number is counting up, then you'll know it's being decrypted
properly... then it's just a matter of selecting the correct columns.

I'm currently working on putting out a new revision that will be a huge
improvement over our current setup and will simplify everything.

@JuanaV

JuanaV commented Aug 9, 2017

Hi @warrenarea, I'm a bit lost here.
I have already changed the key to model 6, but which number should be counting up to confirm the data is decrypting correctly?
Thanks in advance.

@warrenarea

To see the decrypted data, you'll have to print out the "data" variable.

After the data = cipher.decrypt line, you'll want something like this (Python 2):

    # join every decrypted byte value into one printable line
    apacket = " ".join(str(ord(c)) for c in data)
    print apacket

If it is deciphering correctly, the first number will count up from 0 to 255 and
then wrap around.

After that, ignore every other number that sits around 127 for now, but the
numbers between the 127s should be giving you the raw EEG signals for each sensor.

If you get to that point, I'd recommend displaying it on a graph.
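
The counter check described here can be automated: the first decrypted byte should increase by one modulo 256 on every packet, so any gap means dropped packets or a bad key. A small sketch; `check_counters` is an illustrative helper, not part of emokit:

```python
def check_counters(counters):
    """Count gaps in a sequence of 0-255 packet counters.

    A healthy decrypted stream satisfies cur == (prev + 1) % 256 for
    every consecutive pair; a nonzero result means dropped packets
    or a wrong decryption key.  (Illustrative helper, not emokit API.)
    """
    gaps = 0
    for prev, cur in zip(counters, counters[1:]):
        if cur != (prev + 1) % 256:
            gaps += 1
    return gaps
```

Feeding it the first byte of each printed packet gives a quick pass/fail on both the key choice and sample continuity.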

@JuanaV

JuanaV commented Aug 9, 2017

I have finally solved it.

For some reason, this Emotiv works with the key of model 2 (#264).
In case it is useful to someone: the model I'm using is the Epoc+ Premium, Model 1.1.

Thanks a lot @warrenarea

@warrenarea

Ah yes, model 2 -- that would be for the regular Epoc model.

This means that your device has been put in Epoc mode, which can be set from
the Xavier control panel. Model 6 is for Epoc+ mode.

@warrenarea

The Epoc+, as you may well know, has the ability to change its model mode
from the old Epoc mode to Epoc+ mode, and to change how frequently it
sends MEMS (gyro) data in order to preserve battery life.

@warrenarea

warrenarea commented Aug 9, 2017

To change the mode, simply plug your device into a USB jack on your computer,
then open the Xavier Control Panel, click the upper-left menu button, and click
headset settings.

Remember: to change the mode, the Epoc+ must be connected to USB and turned ON.
If it is not turned on during the process, the mode change will not work.

I do have code in the Issues that allows you to change the modes from emokit.
You would also have to make a change to allow detection of Product_Name == Epoc+.

@warrenarea

Just to keep everyone updated... you may not have noticed much project activity lately.

However, I just want to let you all know I'm still working daily on a new script for this.
In fact, I've rebuilt it from the ground up, and I think you will all be pleasantly surprised
at what I have in store.

I've gotten past the major hurdles, and the coding should all be 'relatively' downhill from
here on out...

I think once you see what I have done with this, it will inspire a new wave of projects,
as this will open a few more doors for us.

Plus, I've made some pretty neat changes to the original emotiv code that should speed
things up a wee bit.

I'm still going to be using Python 2.7; I couldn't justify switching to Python 3. It will also
be limited to Windows systems at this time. I figure that if you are going to add support
for another system, it would make more sense to create a new script anyway -- no sense
in merging the code together and just making a mess for anyone reading it.

So stay tuned... If all goes smoothly, it shouldn't be too long before I put up a new release.

@warrenarea

warrenarea commented Nov 20, 2017

https://discordapp.com/invite/gTYNWc7
If anyone would like to assist me with the Epoc+ data,
I have a Discord chat server set up here.

You can either download the Discord chat messenger
or just click the link to use the browser interface.

Basically we have the input and the output... and just need to sort out how
it operates.

*** I have pinned a few of these files in the chat. ***

I also compiled a list of floating-point numbers that correspond to each decimal value
(i.e. 4200), and there seems to be a pattern that I have yet to sort out...

The floating-point numbers actually repeat and are not unique at all... I think if someone
with a bit more knowledge than I have about this sort of math could take a look, they could
easily tell what is going on.

Also, it seems like if you multiply or add some of them together, they match up, but with a
slight variance.

I would appreciate it if you could take a look...
