I don't understand the definition of the channel masks seen in emokit.c. For each channel there is a list of bit positions, but this list does not match the bits shown in the documentation: https://github.com/qdot/emokit/blob/master/doc/emotiv_protocol.asciidoc.
Anyway, it seems to work this way, but that would mean the raw data for a given channel may be scattered all around a frame? If so, how did you find these masks??
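For what it's worth, here's how I understand the mask-based extraction (a sketch of my reading of the code, not the actual emokit source): each sensor's mask is a list of 14 bit positions, and each position `i` is looked up as bit `i % 8` of byte `i // 8 + 1` (the `+ 1` skipping the counter byte), collected most-significant-first from the end of the list. So yes, one channel's sample really is scattered across the frame:

```python
def get_level(frame, bits):
    """Assemble a 14-bit sample by picking individual bits out of the frame.

    Each entry in `bits` maps to bit (entry % 8) of byte (entry // 8) + 1,
    the +1 skipping the counter byte. Bits are collected MSB-first, walking
    the list from the end -- at least, that's my reading of the code.
    """
    level = 0
    for i in range(13, -1, -1):
        level <<= 1
        byte = (bits[i] >> 3) + 1
        offset = bits[i] % 8
        level |= (frame[byte] >> offset) & 1
    return level

# Mask in the same shape as the F3 entry discussed in this thread.
f3_bits = [10, 11, 12, 13, 14, 15, 0, 1, 2, 3, 4, 5, 6, 7]

# Synthetic 32-byte frame: set only byte 1 to all ones. With this mask
# ordering, byte 1's bits end up as the HIGH 8 bits of the 14-bit sample.
frame = bytearray(32)
frame[1] = 0xFF
print(get_level(frame, f3_bits))  # -> 16320 == 0b11111111000000
```

If this is right, the "magic" is just that the mask lists are written in extraction order rather than frame order.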
Also I suppose these lines are wrong in the doc:
Well, maybe this part of the doc is just not up to date, or is it? I'm a little lost here..
If someone can confirm there are some problems here (or confirm I didn't understand a thing!), I may try to correct these in the coming days.
Yeah, the docs are clearly wrong... for example, they reference AF8, but that sensor does not exist in the code.
it appears to be magic.
I just tried reading the values with normal bit-shifting techniques, and I do not get wave-like patterns at all... there is one thing correct in the docs, and that is that the battery is part of the counter... I also managed to, it appears, get the signal strength on bits 112:119, i.e. when connected to the normal EPOC app I get a similar signal strength... (using normal bit shifting, not that crazy magic).
The reality is, I'm quite lost on the signal-strength deal... I'm just gonna go with it for now and hook it up to a visualizer. Seems something is working, 'cause I am getting wave-like numbers :)
The sensor not existing in the code doesn't mean the doc is wrong. The doc was not derived from the code, it was derived from information I was sent by someone who reworked the packet layout. :)
We are actually missing 2 sensors in the code (notice there's 16 sensors on the headset, 14 in the code).
In the end, I'd like to get a better correlation between the doc and the code. It does look pretty magic at the moment, but I swear it does actually make a very odd sort of sense.
yeah, I know :)
I spent a good amount of time trying to figure that magic out with nearly no success.
As for the first version of the prototype: come back later today, as I'm putting the final touches on the visualizer now. Just added the FFT for the spectrum data. WebGL/three.js rocks!
I think the two sensors that are missing from the code are only there to make sure the headset is in place, which then turns on the quality reading for all the other sensors. Take one out and the quality will not be reported. They are the two interchangeable sensors on each side. I'm pretty sure I've exhausted all possible bit mask combinations to get a value out of them. I spent the better part of a day trying to figure this out.
TP7 and TP8 are the question marks on either side, respectively.
Here is a crude mapping diagram:
I've documented some of the patterns for the bit masks on my fork's wiki. Might help someone figure it out.
Confirming that Emotiv's software doesn't show a readout for TP7 or TP8 either. This issue can be closed; the docs need to revert back to AF3 and AF4 as well, per the Emotiv documentation located here: http://emotiv.com/eeg/download_specs.php
I also don't understand the difference between the channel data bit indexes given in the doc
and the bit indexes used in the code (emotiv.py -> sensorBits).
For example, for the F3 sensor the doc indicates bits 8:21, but the code takes bits [10, 11, 12, 13, 14, 15, 0, 1, 2, 3, 4, 5, 6, 7].
Is the doc wrong, or did I miss something?
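To make the mismatch concrete, here's a quick sketch comparing which (byte, bit-in-byte) positions each scheme would touch. Both conventions are my assumptions: the doc is read as counting bits contiguously from the start of the frame, and each sensorBits entry `i` is read as byte `i // 8 + 1` (skipping the counter byte), bit `i % 8`, which is my reading of the extraction loop:

```python
# Doc: contiguous bits 8..21 of the raw frame, assuming bit n lives in
# byte n // 8 at position n % 8.
doc_positions = {(n // 8, n % 8) for n in range(8, 22)}

# Code: sensorBits-style list for F3; entry i maps to byte (i // 8) + 1
# (skipping the counter byte), bit i % 8.
f3_bits = [10, 11, 12, 13, 14, 15, 0, 1, 2, 3, 4, 5, 6, 7]
code_positions = {((i // 8) + 1, i % 8) for i in f3_bits}

print(sorted(doc_positions))
print(sorted(code_positions))
print(doc_positions == code_positions)  # -> False
```

Under these assumptions both schemes use all of byte 1, but the doc takes bits 0..5 of byte 2 while the code takes bits 2..7 of it, so they are close but not the same, which would explain why straightforward contiguous reads don't produce wave-like data.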