
Any Stylus/Pen user? Looking for feedback to design storage/api for their inputs.. #2372

Open
ocornut opened this issue Feb 22, 2019 · 4 comments

Comments

@ocornut
Owner

ocornut commented Feb 22, 2019

I would like to eventually add fields in ImGuiIO to be able to distinguish Mouse from Stylus/Pen or Touch inputs. For practical and legacy purposes we may need to keep using the name 'Mouse' for several fields, but we could have a way to select the current input source. Then we could go and tweak some widget behaviors according to the type of input.

(In particular, we should also aim to improve quality of life with touch inputs, see #2334 - that is itself out of the scope of this discussion and will be done separately.)

Because I am not a stylus user at the moment, what I would like to clarify in this discussion is the data we need to add to ImGuiIO to hold all the required information provided by pen APIs. Even if default Dear ImGui widgets don't use any or all of them yet, their availability would facilitate the creation and sharing of custom widgets.

If you have experience with those, could you provide feedback on which pieces of information are useful or essential for programming applications?

Anything specific to know in terms of variation or "portability" across brands and technologies? (e.g. a Wacom pen may not provide the same info as an iPad Pro pen.)

The goal is to design the type/name of the fields that would be added to ImGuiIO, e.g.

struct ImGuiIO
{
   float PenPressure;      // Typically 0.0f...1.0f (1.0f is a normal press; some hardware may report values larger than 1.0f)
}

Aside from the pressure value, everything gets more complicated.
Apple reports a radius (1 float) + tolerance, while Windows seems to report it as a rectangle (4 values).
Does anyone know enough about this to help design a sensible common structure?

I'm just dropping these notes here; I haven't researched it much yet.

API References:


Apple Pencil
https://developer.apple.com/documentation/uikit/pencil_interactions/handling_input_from_apple_pencil
https://developer.apple.com/documentation/uikit/uitouch

var majorRadius: CGFloat
The radius (in points) of the touch.

var majorRadiusTolerance: CGFloat
The tolerance (in points) of the touch’s radius.

[...]

var force: CGFloat
The force of the touch, where a value of 1.0 represents the force of an average touch (predetermined by the system, not user-specific).

var maximumPossibleForce: CGFloat
The maximum possible force for a touch.

var altitudeAngle: CGFloat
The altitude (in radians) of the stylus.

func azimuthAngle(in: UIView?) -> CGFloat
Returns the azimuth angle (in radians) of the stylus.

func azimuthUnitVector(in: UIView?) -> CGVector
Returns a unit vector that points in the direction of the azimuth of the stylus.

Windows seems to have a POINTER_TOUCH_INFO structure:
https://docs.microsoft.com/en-us/windows/win32/api/winuser/ns-winuser-pointer_touch_info

typedef struct tagPOINTER_TOUCH_INFO {
  POINTER_INFO pointerInfo;
  TOUCH_FLAGS  touchFlags;
  TOUCH_MASK   touchMask;
  RECT         rcContact;
  RECT         rcContactRaw;
  UINT32       orientation;
  UINT32       pressure;
} POINTER_TOUCH_INFO;
rcContact
Type: RECT

The predicted screen coordinates of the contact area, in pixels. By default, if the device does not report a contact area, this field defaults to a 0-by-0 rectangle centered around the pointer location.
The predicted value is based on the pointer position reported by the digitizer and the motion of the pointer. This correction can compensate for visual lag due to inherent delays in sensing and processing the pointer location on the digitizer. This is applicable to pointers of type PT_TOUCH.

rcContactRaw
Type: RECT
The raw screen coordinates of the contact area, in pixels. For adjusted screen coordinates, see rcContact.

orientation
Type: UINT32
A pointer orientation, with a value between 0 and 359, where 0 indicates a touch pointer aligned with the x-axis and pointing from left to right; increasing values indicate degrees of rotation in the clockwise direction.
This field defaults to 0 if the device does not report orientation.

pressure
Type: UINT32
A pen pressure normalized to a range between 0 and 1024. The default is 0 if the device does not report pressure.

Wacom WinTab api
https://developer-docs.wacom.com/display/DevDocs/Windows+Wintab+Documentation

typedef struct tagPACKET {
  HCTX        pkContext;
  UINT        pkStatus;
  LONG        pkTime;
  WTPKT       pkChanged;
  UINT        pkSerialNumber;
  UINT        pkCursor;
  DWORD       pkButtons;
  DWORD       pkX;
  DWORD       pkY;
  DWORD       pkZ;
  UINT        pkNormalPressure;
  UINT        pkTangentPressure;
  ORIENTATION pkOrientation;
  ROTATION    pkRotation;  /* 1.1 */
} PACKET;

Namely

  | pkX, pkY, pkZ | In absolute mode, each is a DWORD containing the scaled cursor location along the x, y, and z axes, respectively. In relative mode, each is a LONG containing the scaled change in cursor position.
  | pkNormalPressure, pkTangentPressure | In absolute mode, each is a UINT containing the adjusted state of the normal and tangent pressures, respectively. In relative mode, each is an int containing the change in adjusted pressure state.
  | pkOrientation | Contains updated cursor orientation information. For details, see the description of the ORIENTATION data structure in section 7.4.2.
  | pkRotation (1.1) | Contains updated cursor rotation information. For details, see the description of the ROTATION data structure in section 7.4.3.

// The ORIENTATION data structure specifies the orientation of the cursor with respect to the tablet.
typedef struct tagORIENTATION {
  int   orAzimuth;
  int   orAltitude;
  int   orTwist;
} ORIENTATION;

  | orAzimuth | Specifies the clockwise rotation of the cursor about the z axis through a full circular range.
  | orAltitude | Specifies the angle with the x-y plane through a signed, semicircular range. Positive values specify an angle upward toward the positive z axis; negative values specify an angle downward toward the negative z axis.
  | orTwist | Specifies the clockwise rotation of the cursor about its own major axis.

// The ROTATION data structure specifies the rotation of the cursor with respect to the tablet.
typedef struct tagROTATION {
  int   roPitch;
  int   roRoll;
  int   roYaw;
} ROTATION;

  | roPitch | Specifies the pitch of the cursor.
  | roRoll | Specifies the roll of the cursor.
  | roYaw | Specifies the yaw of the cursor.

@ocornut ocornut added the inputs label Feb 22, 2019
@ifarbod

ifarbod commented Mar 3, 2019

Here's what Android styli can report (I briefly listed them in that mentioned issue):

Pressure: 0.0f to 1.0f

AXIS_DISTANCE
For a stylus, reports the distance of the stylus from the screen. A value of 0.0 indicates direct contact and larger values indicate increasing distance from the surface.

AXIS_ORIENTATION
For a stylus, the orientation indicates the direction in which the stylus is pointing in relation to the vertical axis of the current orientation of the screen. The range is from -PI radians to PI radians, where 0 is pointing up, -PI/2 radians is pointing left, -PI or PI radians is pointing down, and PI/2 radians is pointing right. See also AXIS_TILT.

AXIS_TILT
For a stylus, reports the tilt angle of the stylus in radians where 0 radians indicates that the stylus is being held perpendicular to the surface, and PI/2 radians indicates that the stylus is being held flat against the surface.

@BrutPitt

BrutPitt commented Mar 6, 2019

Personally, I currently use WebAssembly via Emscripten, which does not have a specific stylus/pen event, only touch ones.
There is currently no distinction between fingers and other pointers: no direct function to get pressure or tilt... but that data is acquirable via the HTML5 Pointer Events API, i.e. via direct JS code inside C/C++ using the EM_JS macro (I have personally never used pressure or tilt myself, yet).

typedef struct EmscriptenTouchPoint
{
  long identifier;   //An identification number for each touch point.
  long screenX;   //The touch coordinate relative to the whole screen origin, in pixels.
  long screenY;
  long clientX;   //The touch coordinate relative to the viewport, in pixels.
  long clientY;
  long pageX;   //The touch coordinate relative to the viewport, in pixels, and including any scroll offset.
  long pageY;
  EM_BOOL isChanged;   //Specifies whether the touch point changed during this event.
  EM_BOOL onTarget;   //Specifies whether this touch point is still above the original target on which it was initially pressed.
  long targetX;   //These fields give the touch coordinates mapped relative to the coordinate space of the target DOM element receiving the input events (Emscripten-specific extension).
  long targetY;
  long canvasX;   //The touch coordinates mapped to the Emscripten canvas client area, in pixels (Emscripten-specific extension).
  long canvasY;
} EmscriptenTouchPoint;

typedef struct EmscriptenTouchEvent {
  int numTouches;   //The number of valid elements in the touches array.
  EM_BOOL ctrlKey;   //Specifies which modifiers were active during the touch event.
  EM_BOOL shiftKey;
  EM_BOOL altKey;
  EM_BOOL metaKey;
  EmscriptenTouchPoint touches[32];   //An array of currently active touches, one for each finger.
} EmscriptenTouchEvent;

A pointer to an EmscriptenTouchEvent struct is passed to each registered touch event, and four events are available via installable callbacks:

EMSCRIPTEN_RESULT emscripten_set_touchstart_callback(const char *target, void *userData, EM_BOOL useCapture, em_touch_callback_func callback);
EMSCRIPTEN_RESULT emscripten_set_touchend_callback(const char *target, void *userData, EM_BOOL useCapture, em_touch_callback_func callback);
EMSCRIPTEN_RESULT emscripten_set_touchmove_callback(const char *target, void *userData, EM_BOOL useCapture, em_touch_callback_func callback);
EMSCRIPTEN_RESULT emscripten_set_touchcancel_callback(const char *target, void *userData, EM_BOOL useCapture, em_touch_callback_func callback);

And this is the form of the callback function to pass:

typedef EM_BOOL (*em_touch_callback_func)(int eventType, const EmscriptenTouchEvent *touchEvent, void *userData);

Note: There have been some recent internal changes in Emscripten; see "Upcoming breaking change: HTML5 API DOM element lookup rules are changing":

// Now deprecated -> always returns 0,0
long EmscriptenTouchPoint::canvasX;
long EmscriptenTouchPoint::canvasY;
// Use these instead, also for "#canvas"
long EmscriptenTouchPoint::targetX;
long EmscriptenTouchPoint::targetY;

Below is an example of how I use touch events: tap, move and pinch (for zoom).
My emsMDeviceClass class manages the touch events; it is declared in emsTouch.h and implemented in emsTouch.cpp.
(They are excluded from VS and/or desktop builds: an Emscripten CMake build is needed to compile them; see the main project page for instructions.)
The callbacks are registered in glApp.cpp.

@anael-seghezzi

I'm currently adding win/mac/linux pen tablet support to GLFW: glfw/glfw#1445

I developed some animation and digital painting tools for an in-house anim studio.
In general it comes down to:

Tablet events:

  • proximity: is the pen entering/exiting tablet proximity
  • pen 'cursor': was the pen cursor changed (pen or eraser, for example)

Pen data (a queue is important, as the frequency can be high on Wacom):

  • x, y, z position at full precision (not as pixels)
  • pressure
  • pitch / yaw (or altitude/azimuth) are more useful to the user than tiltx and tilty

@digitalsignalperson

Relevant issue for Emscripten (which has touch, but not pen): emscripten-core/emscripten#7278

Also noted: SDL 3 has pen support with SDL_pen.h.
