charliegerard/gaze-detection

Gaze-detection

Use machine learning in JavaScript to detect eye movements and build gaze-controlled experiences!

Demo

Visit https://gaze-keyboard.netlify.app/ (Works well on mobile too!!) πŸ˜ƒ

Inspired by the Android application "Look to speak".

Uses TensorFlow.js's face landmark detection model.

Detection

This tool detects when the user looks right, left, up, or straight ahead.
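These four directions are returned as plain strings ('RIGHT', 'LEFT', 'STRAIGHT', 'TOP'), so you can branch on them however you like. As a minimal sketch, a hypothetical dispatcher helper (not part of gaze-detection) could map each direction to a callback:

```javascript
// Hypothetical helper: maps the direction strings the library returns
// to optional callbacks. Returns true for a known direction (whether or
// not a callback is registered for it), false for anything else.
function createGazeDispatcher(handlers) {
  const directions = ["RIGHT", "LEFT", "STRAIGHT", "TOP"];
  return (prediction) => {
    if (!directions.includes(prediction)) return false; // not a known direction
    const handler = handlers[prediction];
    if (handler) handler(); // run the callback registered for this direction
    return true;
  };
}

// Example: only react to horizontal gaze movements
const dispatch = createGazeDispatcher({
  RIGHT: () => console.log("next"),
  LEFT: () => console.log("previous"),
});
```

Keeping the mapping in one place like this makes it easy to swap actions (page turning, key selection, etc.) without touching the prediction loop.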

How to use

Install

As a module:

npm install gaze-detection --save

Code sample

Start by importing it:

import gaze from "gaze-detection";

Load the machine learning model:

await gaze.loadModel();

Then, set up the camera feed needed for the detection. The setUpCamera method needs a video HTML element and, optionally, a camera device ID if you want to use a camera other than the default webcam.

const videoElement = document.querySelector("video");

const init = async () => {
  // Using the default webcam
  await gaze.setUpCamera(videoElement);

  // Or, using another camera input device
  const mediaDevices = await navigator.mediaDevices.enumerateDevices();
  const camera = mediaDevices.find(
    (device) =>
      device.kind === "videoinput" &&
      device.label.includes("" /* the label from the list of available devices */)
  );

  await gaze.setUpCamera(videoElement, camera.deviceId);
};

Run the predictions:

let raf;

const predict = async () => {
  const gazePrediction = await gaze.getGazePrediction();
  console.log("Gaze direction: ", gazePrediction); // will return 'RIGHT', 'LEFT', 'STRAIGHT' or 'TOP'
  if (gazePrediction === "RIGHT") {
    // do something when the user looks to the right
  }
  raf = requestAnimationFrame(predict);
};
predict();

Stop the detection:

cancelAnimationFrame(raf);
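The requestAnimationFrame bookkeeping above can also be wrapped in a small start/stop controller. The sketch below is an assumed helper, not part of gaze-detection; the schedule and cancel functions are injectable parameters (defaulting to the browser's requestAnimationFrame / cancelAnimationFrame) so the loop logic stands on its own:

```javascript
// Assumed helper (not part of gaze-detection): wraps the predict / cancel
// pattern in a start/stop controller. schedule and cancel are injectable so
// the logic can run outside the browser; in a page, the defaults fall back
// to the requestAnimationFrame / cancelAnimationFrame globals.
function createLoop(
  step,
  schedule = (fn) => requestAnimationFrame(fn),
  cancel = (id) => cancelAnimationFrame(id)
) {
  let rafId = null;
  let running = false;
  const tick = () => {
    if (!running) return; // a stale frame after stop() does nothing
    step(); // e.g. get a gaze prediction and react to it
    rafId = schedule(tick);
  };
  return {
    start() {
      if (running) return;
      running = true;
      rafId = schedule(tick);
    },
    stop() {
      running = false;
      if (rafId !== null) cancel(rafId);
      rafId = null;
    },
  };
}
```

With a helper like this, the prediction loop is started with loop.start() and torn down with loop.stop(), without leaking a pending animation frame.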