
HaptiQ ("Haptic-Cue") - The first Vector-based display for the Visually Impaired





HaptiQ: A Haptic Device for Graph Exploration by People with Visual Disabilities

Student: Simone Ivan Conte

Supervisor: Miguel A. Nacenta

Project Description:

Traditional touchscreen devices do not provide tactile feedback, which makes them difficult to use for people with visual disabilities. Miguel A. Nacenta, now a Lecturer at the University of St Andrews, and his team at the University of Calgary developed the Haptic TableTop Puck (HTP), an inexpensive tactile feedback input device for use on digital tabletop surfaces. Friction, height, texture, and malleability are communicated through a combination of the HTP's components: a rod and a brake pad controlled by two servo motors, and a pressure sensor on the rod. The HTP, however, uses only one actuator on a single finger to convey information, which makes it unsuitable for people with visual disabilities, since a single contact point does not let users detect the directions and edges of tactile objects.

The aim of this project is to extend the HTP concept and overcome some of its limitations for use by people with visual disabilities. The extension consists of using multiple servo motors and rods, as well as redesigning the API to support new functionality (haptic objects, behaviours, etc.).

My hypothesis is that the HaptiQ (pronounced "Haptic Cue") will facilitate the recognition of directions and edges of displayed objects. The first main objective is to create a physical prototype adapted to the specific needs of people with visual disabilities.
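The hypothesis above rests on using multiple rods to convey direction. As a rough illustration of that idea, the sketch below maps a target direction onto activation levels for a ring of evenly spaced rods. This is a hypothetical example written in Python for brevity (the actual HaptiQ API is C#/WPF); the function name, the eight-rod layout, and the cosine falloff are all assumptions made for illustration, not part of the HaptiQ API.

```python
import math

def actuator_levels(direction_deg, num_actuators=8):
    """Hypothetical sketch: raise each rod in proportion to how closely
    it faces the target direction (cosine falloff, clamped at zero).

    Rod i points at i * (360 / num_actuators) degrees. Returns a list of
    activation levels in [0, 1], one per rod.
    """
    levels = []
    for i in range(num_actuators):
        rod_angle = i * 360.0 / num_actuators
        # Angular difference between the rod's heading and the target direction
        diff = math.radians(direction_deg - rod_angle)
        # Rods facing away from the target stay fully lowered (level 0)
        levels.append(max(0.0, math.cos(diff)))
    return levels

# A direction of 0 degrees fully raises rod 0, while the opposite rod stays down.
levels = actuator_levels(0)
```

The point of the cosine falloff is that neighbouring rods are partially raised, giving the finger a gradient to follow toward the indicated direction rather than a single binary cue.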



Primary objectives:

- Design and implement a haptic device for graph exploration by people with visual disabilities.
- Develop an API to control one or more haptic devices.
- Develop an API to easily develop WPF client applications.
- Design and implement a new set of dynamic behaviours.
- Design and implement a basic application for graph exploration.


Secondary objectives:

- Increase the haptic device's resolution.
- Allow the haptic device to provide audio feedback as well.
- Design and implement an application for the exploration of simple mathematical functions.


Tertiary objectives:

- Enable the haptic device to be used collaboratively.
- Enable the haptic device to sense textures.

Hardware used

  • Phidgets ServoBoard and InterfaceKit
  • Phidgets Servo Motors
  • Pressure Sensors
  • 3D-Printed models


This project is currently in alpha development and the design changes frequently.

Please create a branch of the project when implementing a new feature. Also, use the GitHub issue tracker to report bugs or request new functionality.

Getting started

Please refer to the manual to get started with the HaptiQ and the HaptiQ API.


Reporting bugs

Whenever you find a bug, please:

  • Assign a short, clear and meaningful name to the bug
  • Be precise and clear
  • Include only one problem per report

and follow this scheme:

| Field       | Content                                              |
|-------------|------------------------------------------------------|
| Description | Describe the issue (step by step)                    |
| Workaround  | Explain a way to temporarily work around the problem |
| Solution    | Explain what the solution should be                  |
| Attachments | Attach any relevant log/screenshot files if needed   |
| Labels      | bug, duplicate, invalid, etc.                        |


This work is supported by the School of Computer Science at the University of St Andrews and SISCA.

