For a demonstration, see this video, or try out some of the examples with a laptop that has a camera and a browser that supports webRTC/getUserMedia. For an overview of browsers supporting the getUserMedia standard, see http://caniuse.com/stream.
Download the minified library headtrackr.js and include it in your webpage.
The following code initializes headtrackr with a video element, which will receive the mediastream, and a canvas element that the video frames will be copied to.
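A minimal page wiring this up might look like the sketch below (the element ids, canvas size, and script path are placeholders, not requirements of the library):

```html
<html>
<head>
  <script src="./js/headtrackr.js"></script>
</head>
<body>
  <!-- hidden canvas that headtrackr copies video frames to -->
  <canvas id="inputCanvas" width="320" height="240" style="display:none"></canvas>
  <!-- video element that receives the camera mediastream -->
  <video id="inputVideo" autoplay loop></video>
  <script>
    var videoInput = document.getElementById('inputVideo');
    var canvasInput = document.getElementById('inputCanvas');

    var htracker = new headtrackr.Tracker();
    htracker.init(videoInput, canvasInput);
    htracker.start();
  </script>
</body>
</html>
```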
Once the head tracker is started, it will regularly dispatch the events headtrackingEvent and facetrackingEvent on the document. The headtrackingEvent has the attributes x, y and z, which give the estimated position of the user's head relative to the center of the screen, in centimeters. The facetrackingEvent has the attributes x, y, width, height and angle, which give the estimated position and size of the face in the video.
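A sketch of consuming these events is shown below. Note that `headToOffset` is a hypothetical helper, not part of headtrackr, and the assumed 30 cm view width is arbitrary; the listener registration is guarded so the snippet can also be loaded outside a browser.

```javascript
// Hypothetical helper (not part of headtrackr): normalize a
// headtrackingEvent position (in cm, relative to the screen center)
// to a unitless offset, given an assumed view width in cm.
function headToOffset(ev, viewWidthCm) {
  return {
    x: ev.x / viewWidthCm,
    y: ev.y / viewWidthCm,
    z: ev.z // distance from the screen passes through unchanged
  };
}

function onHeadTracking(ev) {
  var offset = headToOffset(ev, 30); // assume a 30 cm wide view
  // ...update your scene or UI with offset.x, offset.y, offset.z here
}

// Register the listener when running in a browser.
if (typeof document !== 'undefined') {
  document.addEventListener('headtrackingEvent', onHeadTracking);
}
```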
You can now either create an event listener to handle these events, or, if you're using three.js, try one of the pre-packaged controllers in this library to create pseudo-3D, aka head-coupled perspective, effects.
To get a better idea of usage, look at the source code for the examples above, this overview, or the reference.
Projects that have used headtrackr
Building from source
Make sure you have grunt and node installed.
To install the development dependencies, run `npm install`, and to build the library, run `grunt`, both from the root directory.
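The two steps above can be sketched as follows (assuming node, npm and grunt are already on your PATH):

```shell
# Run from the root of the headtrackr repository.
npm install   # install the development dependencies
grunt         # build the minified headtrackr.js
```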
Headtrackr.js is distributed under the MIT License. It includes some code (courtesy of Liu Liu and Benjamin Jung) that is licensed under the BSD-3-Clause License and the MIT License, respectively.