
The current NetLogo architecture for mouse primitives is terrific for simple, straightforward mouse interaction. For more complex interaction, however, it is lacking. As NetLogo moves toward a world where touch interactions become increasingly common, the NetLogo API should adjust to allow a rich exploration of that space.

Current API

The current NetLogo mouse API consists of the following primitives:

  • mouse-down? - returns true if the mouse is currently down
  • mouse-inside? - returns true if the mouse is inside the current view
  • mouse-xcor - returns the xcor of the current mouse location
  • mouse-ycor - returns the ycor of the current mouse location
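For reference, a minimal sketch of how these primitives are typically used today, e.g. from a forever button (the procedure name is illustrative):

    to handle-mouse
      ;; paint the patch under the cursor while the mouse button is held down
      if mouse-inside? and mouse-down? [
        ask patch mouse-xcor mouse-ycor [ set pcolor red ]
      ]
    end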

Possible API Designs and Considerations

Level of abstraction - NetLogo Model

The NetLogo mouse primitives are typically used to locate a patch or turtle within the given model. However, the mouse API provides little assistance in locating those agents, leaving model authors to write boilerplate like the sketch below.
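As an illustration, here is the kind of boilerplate authors currently write; the reporter names are hypothetical, not existing primitives:

    to-report patch-under-mouse
      report patch mouse-xcor mouse-ycor
    end

    to-report turtles-under-mouse
      ;; turtles within one patch-length of the mouse location
      report turtles with [ (distancexy mouse-xcor mouse-ycor) < 1 ]
    end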

Managing Multiple interaction streams

Similarly, whereas mouse events are typically atomic (each one is unrelated to the ones before and after it), touch events represent a connected series of events (a "gesture"). It is worth considering whether NetLogo model authors are interested in the full stream of touch events or whether some of that information is best filtered out. Also worth considering is whether touch events should carry an identifier for the stream they belong to, so that concurrent streams can be told apart.

Managing different types of interaction

The JavaFX UI toolkit enumerates 4 types of touch events:

  • Touch Moved
  • Touch Pressed
  • Touch Released
  • Touch Stationary

And 7 different types of mouse events:

  • Mouse Moved
  • Mouse Entered
  • Mouse Exited
  • Mouse Pressed
  • Mouse Released
  • Mouse Clicked
  • Mouse Dragged

Typically, NetLogo models make repeated calls to mouse-down? to determine whether the mouse is down at a given point. Using global variables, it is possible to determine whether the mouse has been pressed, released, or clicked (press + release), as sketched below. Since the mouse and touch paradigms have different expectations, it isn't clear that "pressed" represents a useful concept in the world of touch.
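A sketch of that global-variable pattern, with hypothetical handle-press and handle-release procedures standing in for model-specific behavior:

    globals [ was-down? ]

    to setup
      set was-down? false
    end

    to detect-mouse  ;; run from a forever button
      if mouse-down? and not was-down? [ handle-press ]    ;; "pressed"
      if was-down? and not mouse-down? [ handle-release ]  ;; "released"; press + release = "clicked"
      set was-down? mouse-down?
    end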

Multi-Threaded environment considerations

The current mouse API blocks the job thread while it polls for events on the UI thread. That is unacceptable for touch: the touch API must not lock the job thread while waiting on the UI thread. Instead, a thread-safe data structure should be used to transfer touch information between the UI thread and the job thread. An event handler, registered on the canvas/view at creation time, will tell JavaFX to listen for touch events and record them (probably into that thread-safe data structure). Because the number of touch events tends to grow quite large, the queue will need a fixed capacity (and perhaps some sort of stale-data removal protocol) to prevent excessive memory usage while preserving the ability to store and access relevant events.

Suggested API

Touch Event Extension Object

The touch event extension object encapsulates all relevant information about a salient touch event. It interacts with the various touch primitives. This could be represented as a list instead of as an extension object, but there are some major disadvantages to using a list:

  • Data Validation: the touch extension primitives will have a touch event passed in. If this is a list, it must be checked for validity on every use.
  • Data Immutability: we know that the data passed into a touch primitive originated with the touch extension, which gives us (further) confidence that the data is valid and correctly formatted.
  • API Stability: using a list to represent touch gives users the ability to "snoop" in the data, which means that changing the order or the number of elements in the list in the future may break user code. We don't want this.
  • Referential data: because lists (typically) hold serializable data, one list cannot hold a reference to another. It seems that we may want touch events to reference one another at some point, and preserving the ability to do so is valuable.

Primitives acting on touch events

These primitives will apply to touch events regardless of whether the poll-based or event-based API is used.

touch:ing? touch-event

Returns true if the touch event given as an argument is of type moved, stationary, or pressed. If the name is too cute, we can select a different one.

touch:x touch-event, touch:y touch-event

Return the x and y coordinates of the given touch.

touch:age touch-event

Gives the time delta (in milliseconds) between when the touch event happened and now. This will be based on Java System time, and so may have a granularity of up to 10ms.

touch:gesture-id touch-event

Returns a numeric ID specifying the gesture to which the touch belongs. All touch events within a single touch "frame" will have the same gesture ID, but gesture IDs will be reused.

touch:turtles-here touch-event, touch:patch-here touch-event

These return an agentset (for turtles-here) and an agent (for patch-here) representing the agent(s) located "at" the point where the touch occurs.
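A hedged sketch of how these per-event primitives might compose, assuming an event evt obtained via either of the API styles described below:

    to inspect-touch [ evt ]
      ;; act only on active touches younger than 100 ms
      if touch:ing? evt and (touch:age evt) < 100 [
        ask touch:patch-here evt [ set pcolor blue ]
        ask touch:turtles-here evt [ set color yellow ]
        show (word "gesture " touch:gesture-id evt
                   " at " touch:x evt " " touch:y evt)
      ]
    end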

Poll-based Touch Primitives

touch:events-waiting?

Returns true if there are one or more touch events waiting, and false otherwise.

touch:fetch-events

Returns a list of touch events. The events retrieved by this call are removed from the touch event queue and will not be returned by subsequent calls to touch:fetch-events.
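For instance, a go procedure might drain the queue once per tick; a sketch assuming the primitives behave as described:

    to go
      if touch:events-waiting? [
        foreach touch:fetch-events [ evt ->
          ;; each fetched event is consumed and will not be seen again
          ask touch:patch-here evt [ set pcolor blue ]
        ]
      ]
      tick
    end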

Event-based Touch Primitives

touch:on-press callback

Registers a handler for touch presses. The callback is a single-argument anonymous procedure which receives a touch event as an argument when run.

touch:on-move callback

Registers a handler for touch move events. The callback is a single-argument anonymous procedure which receives a touch event as an argument when run.

touch:on-release callback

Registers a handler for touch released events. The callback is a single-argument anonymous procedure which receives a touch event as an argument when run.

touch:on-gesture callback

Registers a handler for "gesture" events. Gestures are recognized at a level above the three basic handlers (above) and will call back to this separately as they are recognized.

touch:process-events

Runs the appropriate callback for each touch event in the touch buffer, starting with the oldest event and moving to the newest. Touch events whose event type has no registered handler are discarded.
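Putting the event-based primitives together, a model might register handlers at setup time and pump the buffer each tick; again a sketch under the semantics described above:

    to setup-touch
      touch:on-press   [ evt -> ask touch:patch-here evt [ set pcolor green ] ]
      touch:on-move    [ evt -> ask touch:patch-here evt [ set pcolor yellow ] ]
      touch:on-release [ evt -> ask touch:patch-here evt [ set pcolor black ] ]
    end

    to go
      touch:process-events  ;; runs each registered callback, oldest event first
      tick
    end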

Notes on 2-finger touch

In testing with the 3 developers in the lab, their 2-finger presses were 25, 33, and 49 pixels apart. The Surface device seems to treat any touches closer than 20 pixels as a single touch. This may mean that it is not practical to recognize 2-finger gestures from children, whose natural two-finger press distances may fall below that minimum resolution threshold.
