---
-api-id: T:Windows.UI.Input.GestureRecognizer
-api-type: winrt class
---

# Windows.UI.Input.GestureRecognizer

## -description

Provides gesture and manipulation recognition, event listeners, and settings.

## -remarks

You can create a gesture object for each appropriate element when your app starts. However, this approach might not scale well depending on the number of gesture objects you need to create (for example, a jigsaw puzzle with hundreds of pieces).

In this case, you can create gesture objects dynamically when a pointer down event occurs and destroy them when the gesture completes (for example, on a ManipulationCompleted event). This approach scales well, but incurs some overhead from creating and releasing these objects.

Alternatively, you can statically allocate and dynamically manage a pool of reusable gesture objects.
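
The following is a minimal sketch of the pooling approach. `GestureRecognizerPool` and its members are hypothetical helpers for illustration, not part of the Windows Runtime API.

```csharp
using System.Collections.Generic;
using Windows.UI.Input;

// Hypothetical helper: a statically allocated pool of GestureRecognizer
// objects that are reused across interactions instead of being recreated.
class GestureRecognizerPool
{
    readonly Stack<GestureRecognizer> available = new Stack<GestureRecognizer>();

    public GestureRecognizerPool(int capacity)
    {
        for (int i = 0; i < capacity; i++)
        {
            available.Push(new GestureRecognizer());
        }
    }

    // Acquire a recognizer when an interaction starts (for example, on PointerPressed).
    // Returns null if the pool is exhausted.
    public GestureRecognizer Acquire(GestureSettings settings)
    {
        if (available.Count == 0)
        {
            return null;
        }
        GestureRecognizer recognizer = available.Pop();
        recognizer.GestureSettings = settings;
        return recognizer;
    }

    // Return a recognizer to the pool when the interaction completes.
    public void Release(GestureRecognizer recognizer)
    {
        recognizer.CompleteGesture();                     // end any gesture in progress
        recognizer.GestureSettings = GestureSettings.None;
        available.Push(recognizer);
    }
}
```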

> [!NOTE]
> This class is not agile, which means that you need to consider its threading model and marshaling behavior. For more info, see Threading and Marshaling (C++/CX) and Using Windows Runtime objects in a multithreaded environment (.NET).

For more detail on how to use cross-slide functionality, see Guidelines for cross-slide. The threshold distances used by the cross-slide interaction are shown in the following diagram.

*Diagram showing the threshold distances used by the select and drag-and-drop (cross-slide) interactions.*
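
As a sketch of how these settings fit together, cross-slide can be enabled on a recognizer and its threshold distances tuned through the CrossSlideThresholds structure. The threshold values below are illustrative, not recommended defaults.

```csharp
using Windows.UI.Input;

// Enable horizontal cross-slide on an existing recognizer.
recognizer.GestureSettings |= GestureSettings.CrossSlide;
recognizer.CrossSlideHorizontally = true;

// Tune the interaction's threshold distances (in DIPs).
// These particular values are placeholders for illustration only.
CrossSlideThresholds thresholds = new CrossSlideThresholds();
thresholds.SelectionStart = 12;
thresholds.SpeedBumpStart = 12;
thresholds.SpeedBumpEnd = 24;
thresholds.RearrangeStart = 24;
recognizer.CrossSlideThresholds = thresholds;
```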

The PivotRadius and PivotCenter properties are used only when single-pointer input is detected. They have no effect on multiple-pointer input. The values of these properties should be updated regularly during the interaction.

Rotation is supported by a GestureRecognizer only when ManipulationRotate is set through the GestureSettings property.

Rotation is not supported for single-pointer input if PivotRadius is set to 0.
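
For example, single-finger rotation might be configured as follows. This is a sketch only; the pivot values are placeholders that an app would compute from the element being rotated.

```csharp
// Allow rotation, including single-pointer rotation around a pivot.
recognizer.GestureSettings |= GestureSettings.ManipulationRotate;

// PivotRadius must be nonzero for single-pointer rotation to work.
// Update PivotCenter and PivotRadius regularly as the interaction proceeds;
// the values below are placeholders for illustration.
recognizer.PivotCenter = new Windows.Foundation.Point(200, 200);
recognizer.PivotRadius = 100;
```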

### Version history

| Windows version | SDK version | Value added |
| -- | -- | -- |
| 2004 | 19041 | HoldMaxContactCount |
| 2004 | 19041 | HoldMinContactCount |
| 2004 | 19041 | HoldRadius |
| 2004 | 19041 | HoldStartDelay |
| 2004 | 19041 | TapMaxContactCount |
| 2004 | 19041 | TapMinContactCount |
| 2004 | 19041 | TranslationMaxContactCount |
| 2004 | 19041 | TranslationMinContactCount |
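
As an illustration of the contact-count properties added in version 2004, the following sketch restricts tap recognition to exactly two fingers.

```csharp
// Requires Windows 10, version 2004 (SDK 19041) or later.
recognizer.GestureSettings |= GestureSettings.Tap;
recognizer.TapMinContactCount = 2;  // ignore one-finger taps
recognizer.TapMaxContactCount = 2;  // ignore taps with three or more fingers
```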

## -examples

Here we set up a GestureRecognizer object with a collection of input event handlers for processing both pointer and gesture input. For more information on how to listen to and handle Windows Runtime events, see Events and routed events overview. See the Basic input sample for the full implementation.

```csharp
class ManipulationInputProcessor
{
    GestureRecognizer recognizer;
    UIElement element;
    UIElement reference;
    TransformGroup cumulativeTransform;
    MatrixTransform previousTransform;
    CompositeTransform deltaTransform;
    
    public ManipulationInputProcessor(GestureRecognizer gestureRecognizer, UIElement target, UIElement referenceFrame)
    {
        recognizer = gestureRecognizer;
        element = target;
        reference = referenceFrame;
        // Initialize the transforms that will be used to manipulate the shape
        InitializeTransforms();
        // The GestureSettings property dictates what manipulation events the
        // Gesture Recognizer will listen to.  This will set it to a limited
        // subset of these events.
        recognizer.GestureSettings = GenerateDefaultSettings();
        // Set up pointer event handlers. These receive input events that are used by the gesture recognizer.
        element.PointerPressed += OnPointerPressed;
        element.PointerMoved += OnPointerMoved;
        element.PointerReleased += OnPointerReleased;
        element.PointerCanceled += OnPointerCanceled;
        // Set up event handlers to respond to gesture recognizer output
        recognizer.ManipulationStarted += OnManipulationStarted;
        recognizer.ManipulationUpdated += OnManipulationUpdated;
        recognizer.ManipulationCompleted += OnManipulationCompleted;
        recognizer.ManipulationInertiaStarting += OnManipulationInertiaStarting;
    }
    
    public void InitializeTransforms()
    {
        cumulativeTransform = new TransformGroup();
        deltaTransform = new CompositeTransform();
        previousTransform = new MatrixTransform() { Matrix = Matrix.Identity };
        cumulativeTransform.Children.Add(previousTransform);
        cumulativeTransform.Children.Add(deltaTransform);
        element.RenderTransform = cumulativeTransform;
    }
    
    // Return the default GestureSettings for this sample
    GestureSettings GenerateDefaultSettings()
    {
        return GestureSettings.ManipulationTranslateX |
            GestureSettings.ManipulationTranslateY |
            GestureSettings.ManipulationRotate |
            GestureSettings.ManipulationTranslateInertia |
            GestureSettings.ManipulationRotateInertia;
    }
    
    // Route the pointer pressed event to the gesture recognizer.
    // The points are in the reference frame of the canvas that contains the rectangle element.
    void OnPointerPressed(object sender, PointerRoutedEventArgs args)
    {
        // Set the pointer capture to the element being interacted with so that only it
        // will fire pointer-related events
        element.CapturePointer(args.Pointer);
        // Feed the current point into the gesture recognizer as a down event
        recognizer.ProcessDownEvent(args.GetCurrentPoint(reference));
    }
    
    // Route the pointer moved event to the gesture recognizer.
    // The points are in the reference frame of the canvas that contains the rectangle element.
    void OnPointerMoved(object sender, PointerRoutedEventArgs args)
    {
        // Feed the set of points into the gesture recognizer as a move event
        recognizer.ProcessMoveEvents(args.GetIntermediatePoints(reference));
    }
    
    // Route the pointer released event to the gesture recognizer.
    // The points are in the reference frame of the canvas that contains the rectangle element.
    void OnPointerReleased(object sender, PointerRoutedEventArgs args)
    {
        // Feed the current point into the gesture recognizer as an up event
        recognizer.ProcessUpEvent(args.GetCurrentPoint(reference));
        // Release the pointer
        element.ReleasePointerCapture(args.Pointer);
    }
    
    // Route the pointer canceled event to the gesture recognizer.
    // The points are in the reference frame of the canvas that contains the rectangle element.
    void OnPointerCanceled(object sender, PointerRoutedEventArgs args)
    {
        recognizer.CompleteGesture();
        element.ReleasePointerCapture(args.Pointer);
    }
    
    // When a manipulation begins, change the color of the object to reflect
    // that a manipulation is in progress
    void OnManipulationStarted(object sender, ManipulationStartedEventArgs e)
    {
        Border b = element as Border;
        b.Background = new SolidColorBrush(Windows.UI.Colors.DeepSkyBlue);
    }
    
    // Process the change resulting from a manipulation
    void OnManipulationUpdated(object sender, ManipulationUpdatedEventArgs e)
    {
        previousTransform.Matrix = cumulativeTransform.Value;
        // Get the center point of the manipulation for rotation
        Point center = new Point(e.Position.X, e.Position.Y);
        deltaTransform.CenterX = center.X;
        deltaTransform.CenterY = center.Y;
        // Look at the Delta property of the ManipulationUpdatedEventArgs to retrieve
        // the rotation, X, and Y changes
        deltaTransform.Rotation = e.Delta.Rotation;
        deltaTransform.TranslateX = e.Delta.Translation.X;
        deltaTransform.TranslateY = e.Delta.Translation.Y;
    }
    
    // When a manipulation that's a result of inertia begins, change the color of
    // the object to reflect that inertia has taken over
    void OnManipulationInertiaStarting(object sender, ManipulationInertiaStartingEventArgs e)
    {
        Border b = element as Border;
        b.Background = new SolidColorBrush(Windows.UI.Colors.RoyalBlue);
    }
    
    // When a manipulation has finished, reset the color of the object
    void OnManipulationCompleted(object sender, ManipulationCompletedEventArgs e)
    {
        Border b = element as Border;
        b.Background = new SolidColorBrush(Windows.UI.Colors.LightGray);
    }
    
    // Modify the GestureSettings property to only allow movement on the X axis
    public void LockToXAxis()
    {
        recognizer.CompleteGesture();
        // Make sure X translation is enabled, then clear the Y translation flag
        recognizer.GestureSettings |= GestureSettings.ManipulationTranslateX;
        recognizer.GestureSettings &= ~GestureSettings.ManipulationTranslateY;
    }
    
    // Modify the GestureSettings property to only allow movement on the Y axis
    public void LockToYAxis()
    {
        recognizer.CompleteGesture();
        // Make sure Y translation is enabled, then clear the X translation flag
        recognizer.GestureSettings |= GestureSettings.ManipulationTranslateY;
        recognizer.GestureSettings &= ~GestureSettings.ManipulationTranslateX;
    }
    
    // Modify the GestureSettings property to allow movement on both the X and Y axes
    public void MoveOnXAndYAxes()
    {
        recognizer.CompleteGesture();
        recognizer.GestureSettings |= GestureSettings.ManipulationTranslateX | GestureSettings.ManipulationTranslateY;
    }
    
    // Modify the GestureSettings property to enable or disable inertia based on the passed-in value
    public void UseInertia(bool inertia)
    {
        if (!inertia)
        {
            recognizer.CompleteGesture();
            // Clear (rather than toggle) the inertia flags so that disabling
            // inertia twice in a row doesn't accidentally re-enable it
            recognizer.GestureSettings &= ~(GestureSettings.ManipulationTranslateInertia | GestureSettings.ManipulationRotateInertia);
        }
        else
        {
            recognizer.GestureSettings |= GestureSettings.ManipulationTranslateInertia | GestureSettings.ManipulationRotateInertia;
        }
    }
    
    public void Reset()
    {
        element.RenderTransform = null;
        recognizer.CompleteGesture();
        InitializeTransforms();
        recognizer.GestureSettings = GenerateDefaultSettings();
    }
}
```
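
For context, the processor might be wired up as follows; the element names manipulateMe and parent are assumptions for illustration.

```csharp
// manipulateMe is the UIElement to transform; parent is the reference
// Canvas that contains it. Both names are assumed for illustration.
GestureRecognizer recognizer = new GestureRecognizer();
ManipulationInputProcessor inputProcessor =
    new ManipulationInputProcessor(recognizer, manipulateMe, parent);
```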

### Samples

Archived samples

## -see-also

Windows.UI.Input Classes, Windows.Devices.Input, Windows.UI.Core, Windows.UI.Input, Windows.UI.Input.Inking, Windows.UI.Xaml.Input, Custom user interactions, UX guidelines for custom user interactions, Touch design guidelines