I have been experimenting with the input_gt911 device using the Arduino GIGA with the GIGA Display Shield under the Arduino Zephyr setup. In their code I added the input_gt911 input callback code into their fixup.c, which checks whether an Arduino sketch (or library) has registered a callback function; if so, the code in fixup.c forwards the callbacks along.
I now have the single finger touch paint sketch reasonably working.
So then I wondered about adding multiple touch support. I edited the config file for the GIGA under the display shield and set:
CONFIG_INPUT_GT911_MAX_TOUCH_POINTS=3
Rebuilt Zephyr, flashed my GIGA, rebuilt the sketch... and touch still worked. But I wondered how to detect when I have multiple fingers touching...
I was using code similar to the input example sketch, which looks like:
And I was not seeing anything new. Then I looked closer at the code in input_gt911.c and noticed that it also sends another event code:
Which I can check for like:
if (evt->code == INPUT_ABS_MT_SLOT) { ... }
But the question is: how should a sketch/library know when it should process these callbacks? Currently the callback code here sets a semaphore each time it receives a sync, which happens for every point, and each one may wake up my main thread, which is waiting on the semaphore.
I confirmed this with a simple sketch that receives all of the callbacks and simply stores the data away until I type something into the Serial monitor, at which point it prints them all out:
And I do see some multiple events when I put two fingers on the display, like:
The "3 47 0" line (type 3 = INPUT_EV_ABS, code 47 = INPUT_ABS_MT_SLOT, value 0) is the first finger, and "3 47 1" is the second finger.
a) Hope that the sketch or library calling this is not running at a higher, preemptive priority, so that we receive all of the callbacks before the user code tries to process them.
b) If not (a), maybe the user code should delay for a millisecond after waking to see whether another event comes in?
c) Maybe the function in input_gt911.c should only set sync on the last point it is going to output.
d) Have the gt911 driver send another report at the start saying how many points there are.
e) Have the code send the points in reverse order, so we only process when we see point 0.
Thoughts?
Secondary question: I believe this device supports gestures when the number of points is >= 2. Does this driver support that? I did not notice anything for it within the driver code.
Thanks
Kurt