# $1 Unistroke Gesture Recognizer in Swift

★★ Star our GitHub repository to help us! ★★

[![CI Status](http://img.shields.io/travis/Daniele%20Margutti/SwiftUnistroke.svg?style=flat)](https://travis-ci.org/Daniele%20Margutti/SwiftUnistroke)


You may also like

Do you like SwiftUnistroke? I'm also working on several other open source libraries.

Take a look here:

  • SwiftDate - Full-featured Date & TimeZone management for iOS, macOS, tvOS and watchOS
  • Hydra - Promise, Async/Await on steroids!
  • SwiftLocation - CoreLocation and Beacon Monitoring on steroids!
  • SwiftScanner - String scanner in pure Swift with full Unicode support
  • SwiftSimplify - Tiny high-performance Swift Polyline Simplification Library
  • SwiftMsgPack - MsgPack Encoder/Decoder in Swift


A full video of the gesture recognizer is available here:

SwiftUnistroke Video


SwiftUnistroke is a pure Swift 2 implementation of the $1 Unistroke Algorithm developed by Jacob Wobbrock, Andy Wilson and Yang Li.

The $1 Unistroke Recognizer is a 2-D single-stroke recognizer designed for rapid prototyping of gesture-based user interfaces.

In machine learning terms, $1 is an instance-based nearest-neighbor classifier with a Euclidean scoring function, i.e., a geometric template matcher.
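A minimal sketch of that nearest-neighbor matching idea might look like the following. All types and names here are illustrative, not the library's actual API; the real $1 algorithm also rotates, scales and translates each stroke before scoring:

```swift
import Foundation

// Illustrative point type with a Euclidean distance helper.
struct Point {
    let x: Double
    let y: Double
    func distance(to other: Point) -> Double {
        let dx = x - other.x, dy = y - other.y
        return (dx * dx + dy * dy).squareRoot()
    }
}

// Average point-to-point Euclidean distance between two equally sized paths.
func pathDistance(_ a: [Point], _ b: [Point]) -> Double {
    precondition(a.count == b.count, "paths must be resampled to the same length")
    var total = 0.0
    for i in 0..<a.count {
        total += a[i].distance(to: b[i])
    }
    return total / Double(a.count)
}

// Nearest-neighbor classification: the template with the lowest score wins.
func bestMatch(for stroke: [Point],
               in templates: [String: [Point]]) -> (name: String, score: Double)? {
    var best: (name: String, score: Double)? = nil
    for (name, path) in templates {
        let score = pathDistance(stroke, path)
        if best == nil || score < best!.score {
            best = (name, score)
        }
    }
    return best
}
```

This is the whole "machine learning" of $1: no training phase, just a linear scan over stored templates with a geometric distance.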

Despite its simplicity, $1 requires very few templates to perform well and is only about 100 lines of code, making it easy to deploy. An optional enhancement called Protractor improves $1's speed.

A more detailed description of the algorithm is available both in the official project paper and in my blog article here.

This library also contains an example project which demonstrates how the algorithm works with a set of loaded templates; extending this library is pretty easy and does not involve any machine learning. Implementations in other languages can be found here.


  • Fast gesture recognition
  • Simple code, less than 200 lines
  • Easily extensible pattern template collection
  • High performance even on old hardware
  • No machine learning required
  • An optional enhancement called Protractor (more) improves speed.
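
The Protractor enhancement replaces $1's iterative rotation search with a closed-form solution. A minimal sketch of its scoring step, assuming both gestures have already been resampled and normalized into flat `[x0, y0, x1, y1, …]` unit vectors (names are illustrative, not the library's code):

```swift
import Foundation

// Closed-form Protractor distance between two normalized gesture vectors.
// The optimal rotation between the vectors is solved analytically, which
// is what makes Protractor faster than $1's golden-section rotation search.
func protractorDistance(_ v1: [Double], _ v2: [Double]) -> Double {
    precondition(v1.count == v2.count && v1.count % 2 == 0)
    var a = 0.0, b = 0.0
    for i in stride(from: 0, to: v1.count, by: 2) {
        a += v1[i] * v2[i] + v1[i + 1] * v2[i + 1]
        b += v1[i] * v2[i + 1] - v1[i + 1] * v2[i]
    }
    let angle = atan2(b, a) // optimal rotation between the two vectors
    // Clamp to 1.0 to guard acos against floating-point overshoot.
    return acos(min(1.0, a * cos(angle) + b * sin(angle)))
}
```

Identical vectors score 0, and so do vectors that differ only by a rotation, since the optimal angle cancels the rotation out.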


Daniele Margutti


  • If you found a bug, open an issue.
  • If you have a feature request, open an issue.
  • If you want to contribute, submit a pull request.

## Version History

### 1.0 (Oct 9, 2015)

  • First release


  • Mac OS X 10.10+ or iOS 8+
  • Swift 2+

## How to use it

SwiftUnistroke is really simple to use: first of all you need to provide a set of templates; each template is composed of a series of points which describe the path. You can create a new SwiftUnistrokeTemplate object from an array of CGPoint or StrokePoint values.
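Because templates and candidate strokes are compared point by point, $1-style recognizers first resample every path to a fixed number of evenly spaced points. A self-contained sketch of that preprocessing step (the `StrokePt` type and names are illustrative, not the library's own):

```swift
import Foundation

struct StrokePt { let x: Double; let y: Double }

func dist(_ a: StrokePt, _ b: StrokePt) -> Double {
    ((a.x - b.x) * (a.x - b.x) + (a.y - b.y) * (a.y - b.y)).squareRoot()
}

// Resample a path into `n` points evenly spaced along its length.
func resample(_ path: [StrokePt], to n: Int) -> [StrokePt] {
    let totalLength = zip(path, path.dropFirst()).reduce(0.0) { $0 + dist($1.0, $1.1) }
    let interval = totalLength / Double(n - 1)
    var points = path
    var newPoints = [points[0]]
    var accumulated = 0.0
    var i = 1
    while i < points.count {
        let d = dist(points[i - 1], points[i])
        if accumulated + d >= interval {
            // Interpolate a new point exactly `interval` along the path.
            let t = (interval - accumulated) / d
            let q = StrokePt(x: points[i - 1].x + t * (points[i].x - points[i - 1].x),
                             y: points[i - 1].y + t * (points[i].y - points[i - 1].y))
            newPoints.append(q)
            points.insert(q, at: i) // q becomes the start of the next segment
            accumulated = 0
        } else {
            accumulated += d
        }
        i += 1
    }
    // Rounding can leave the final point off; append it if needed.
    if newPoints.count == n - 1 { newPoints.append(points.last!) }
    return newPoints
}
```

After resampling, any two paths can be compared index by index regardless of how fast or slowly the original gesture was drawn.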

In this example we load a template from a JSON dictionary which contains `name` and `points` keys:

```swift
let templateDict = try NSJSONSerialization.JSONObjectWithData(jsonData!, options: NSJSONReadingOptions.AllowFragments) as! NSDictionary
let name = templateDict["name"]! as! String
let rawPoints: [AnyObject] = templateDict["points"]! as! [AnyObject]

var points: [StrokePoint] = []
for rawPoint in rawPoints {
	let x = (rawPoint as! [AnyObject]).first! as! Double
	let y = (rawPoint as! [AnyObject]).last! as! Double
	points.append(StrokePoint(x: x, y: y))
}
let templateObj = SwiftUnistrokeTemplate(name: name, points: points)
```

Now suppose you have an array of SwiftUnistrokeTemplate objects and an array of captured points (inputPoints, the path to recognize). In order to perform a search you need to allocate a new SwiftUnistroke instance and call its recognizeIn() method:

```swift
let recognizer = SwiftUnistroke(points: inputPoints!)
do {
	let (template, distance) = try recognizer.recognizeIn(self.templates, useProtractor: false)
	if template != nil {
		print("[FOUND] Template found is \(template!.name) with distance: \(distance!)")
	} else {
		print("[FAILED] Template not found")
	}
} catch (let error as NSError) {
	print("[FAILED] Error: \(error.localizedDescription)")
}
```

That's all: this method returns the best match from your templates bucket.


SwiftUnistroke is available through CocoaPods. To install it, simply add the following line to your Podfile:

```ruby
pod "SwiftUnistroke"
```


SwiftUnistroke is available under the MIT license. See the LICENSE file for more info.