mandigab edited this page Feb 9, 2016 · 1 revision

Notch Android SDK


This is a brief preview of Notch Android SDK. The SDK will be available as a library that manages the communication and data transfer between the Notch devices and the phone/tablet, processes the measurements and provides 3D visualiser for the recorded motion.


Notch Device

Notch Devices (notches) are the physical layer of the system, actual devices that capture motion and deliver data to the host (mobile) application. Each module has a unique MAC address, a Network ID and Name.
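As a sketch only, the identity a module exposes could be modelled like this; the class and field names below are illustrative and are not the SDK's actual types:

```java
// Illustrative model of a Notch module's identity: MAC address, Network ID, Name.
// This is NOT an SDK class; it only mirrors the three attributes described above.
class DeviceIdentity {
    private final String macAddress; // unique hardware address, e.g. "C4:BE:84:12:34:56"
    private final int networkId;     // position of the device within the Notch network
    private final String name;       // user-visible label, e.g. "Notch 3"

    DeviceIdentity(String macAddress, int networkId, String name) {
        this.macAddress = macAddress;
        this.networkId = networkId;
        this.name = name;
    }

    String getMacAddress() { return macAddress; }
    int getNetworkId() { return networkId; }
    String getName() { return name; }
}
```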

Device Connectivity

A Notch device initiates a connection when it is turned on. Notches communicate over the Bluetooth 4.0 protocol.


Bone is a vector with a mandatory Name (e.g. “RightUpperArm”) and length (e.g. 0.3 m). Extra bones can be added to represent, for example, a piece of equipment.


Skeleton is a biomechanical model of the human body. It is a tree-structured hierarchy of bones originating from a single root bone. The following figure illustrates the hierarchy of the default Notch Skeleton, composed of 19 bones:
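In code, such a tree of bones can be sketched as follows. The bone names and lengths here are illustrative only; they are not the SDK's actual defaults.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of a tree-structured bone hierarchy.
// Bone names and lengths are illustrative, not the SDK's defaults.
class Bone {
    final String name;
    final double lengthMeters;
    final List<Bone> children = new ArrayList<>();

    Bone(String name, double lengthMeters) {
        this.name = name;
        this.lengthMeters = lengthMeters;
    }

    // Attaches a child bone and returns it, so chains can be built fluently.
    Bone addChild(Bone child) {
        children.add(child);
        return child;
    }

    // Total number of bones in the subtree rooted at this bone.
    int count() {
        int n = 1;
        for (Bone c : children) n += c.count();
        return n;
    }
}
```

A default skeleton would be built by chaining 19 such bones from the root; the count() traversal shows how the whole hierarchy can be walked from the single root bone.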


Workout is the measurement setting for tracking a pre-set motion or group of motions (e.g. baseball swing, punching drill, shoulder mobility test). The Workout is the core element of a Notch-compatible app's architecture. A new Workout always starts with a steady Measurement and is followed by a series of regular Measurements.


Measurement is the motion data collected from Notch devices, used for visualization, processing and setup. This document refers to two types of Measurements: Steady and Regular.

Steady measurement is the data collected during the steady phase, a brief sensor-alignment procedure the user must perform at the very beginning of the recording session. During the steady phase the user must stand with arms straightened and legs shoulder-width apart for about 2 seconds. Once the steady phase is complete, the user can perform as many recordings as necessary until they remove or move the Notch sensors.

Regular measurement is the data collected during any other recording.
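Purely as an illustration of what "standing steady" means numerically (the SDK performs the alignment itself), a host app could check that a window of sensor readings barely varies. The threshold and sample values below are made up:

```java
// Illustrative only: decide whether a window of accelerometer magnitudes is
// "steady" by checking that the spread (max - min) stays under a threshold.
// The threshold value is a made-up example, not an SDK constant.
class SteadyCheck {
    static boolean isSteady(double[] samples, double maxSpread) {
        double min = Double.POSITIVE_INFINITY;
        double max = Double.NEGATIVE_INFINITY;
        for (double s : samples) {
            if (s < min) min = s;
            if (s > max) max = s;
        }
        return (max - min) <= maxSpread;
    }
}
```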

General workflow

The general workflow of recording measurements is as follows:

1. Initialize the workout and build up the Notch network.
2. Prepare (configure) the devices for a steady or regular capture.
3. Start capturing.
4. Stop capturing.
5. Download and process the data.

Activities/fragments of the app have to be structured according to this logic and call the corresponding methods of the Notch SDK. Below we outline the steps necessary to integrate the Notch SDK into an existing Android application.
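One way to keep an app's screens aligned with this flow is an explicit state enum that only permits forward transitions. The states below mirror the sections that follow; they are an illustration for structuring your own app, not part of the SDK:

```java
// Illustrative workflow states mirroring the recording flow:
// init -> configure -> capture -> stop -> download. Not an SDK type.
enum WorkoutState {
    IDLE, INITIALIZED, CONFIGURED, CAPTURING, STOPPED, DOWNLOADED;

    // Returns the state that legally follows this one, or throws if the flow has ended.
    WorkoutState next() {
        WorkoutState[] all = values();
        if (ordinal() == all.length - 1) {
            throw new IllegalStateException("Workflow already finished");
        }
        return all[ordinal() + 1];
    }
}
```

Each activity/fragment can then assert the current state before calling the matching NotchService method, which makes out-of-order calls fail fast during development.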

Using the library

To use the Notch SDK library, include the latest version as a dependency in your build.gradle (the artifact coordinates below are placeholders; use the ones shipped with your SDK distribution):

dependencies {
    // Placeholder coordinates – substitute the artifact and version from your SDK distribution
    compile 'com.wearnotch:notch-sdk:+'
}

Using the NotchService

The NotchService interface manages the communication between the host device and the Notch devices: setup, start, stop and downloading of measurements. First, declare the service in the AndroidManifest.xml.

    <service
        android:name="com.wearnotch.service.network.NotchAndroidService" />

Then create an activity that binds to the NotchService.

public class MeasurementActivity extends AppCompatActivity implements ServiceConnection {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        Intent controlServiceIntent = new Intent(this, NotchAndroidService.class);
        bindService(controlServiceIntent, this, BIND_AUTO_CREATE);
    }

    // Implement the ServiceConnection callbacks (onServiceConnected, onServiceDisconnected) here
}

Once the service is connected you can make calls to the interface methods. The results are managed via NotchCallback callbacks.
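The result-callback pattern can be sketched in isolation like this. The interface and method below are stand-ins to show the shape of asynchronous result delivery; the real NotchCallback signature may differ:

```java
// Stand-in for an async result-callback pattern: an operation reports either
// success with a typed result or failure. Names here are illustrative, not SDK API.
class CallbackDemo {
    interface ResultCallback<T> {
        void onSuccess(T result);
        void onFailure(Exception error);
    }

    // A fake "service call" that completes immediately with a fixed result,
    // standing in for an operation that would normally complete asynchronously.
    static void fetchDeviceCount(ResultCallback<Integer> callback) {
        callback.onSuccess(6);
    }
}
```

In the real SDK the result arrives later, so the callback body (not the line after the call) is where the app should navigate to the next screen or trigger the next step.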

Initialising a workout

Once the specific Workout is selected/defined, initialize mNotchService to search for the required number of registered devices and build up the NotchNetwork.

mNotchService.init(mWorkout, mRegisteredDevices, mInitResult);

private NotchCallback<NotchNetwork> mInitResult = new NotchCallback<NotchNetwork>(this) {
    @Override
    public void onSuccess(final NotchNetwork notchNetwork) {
        // Navigate to the next fragment: preparing the `NotchNetwork` for steady measurement
    }
};

Preparing a capture

Before capturing a motion, the devices need to be set up according to the current Workout (sampling frequency, recording time, etc.). Depending on the type of measurement, you can do this with the configureSteady(...), configureCapture(...) or configureTimedCapture(...) methods.

// Preparing a steady measurement
mNotchService.configureSteady(true, mConfigureSteadyResult);

// Preparing a timed capture that automatically stops after `mTimerMilliSec` milliseconds 
mNotchService.configureTimedCapture(mTimerMilliSec, false, mConfigureTimedCaptureCallback);

Start capturing

Once the devices are set up for capture, call the corresponding steady(...) or capture(...) method to start the measurement.

private NotchCallback<Void> mConfigureSteadyResult = new NotchCallback<Void>(this) {
    @Override
    public void onSuccess(Void value) {
        // Start steady measurement
    }
};

Stop capturing

Call the stop(...) method to stop any type of recording and prepare the network for acquiring/downloading the data.


Call this method after regular recordings only; it is not needed for steady recordings.

Downloading data

Call the getSteadyData(...) function to get the data necessary for a steady measurement. To process and visualize regular measurement data, you need to download it from the Notch devices. The download(...) function handles the transfer and processing of the data so that it can be used for visualization; it also accepts an output file to which the downloaded data is written.

// Gathering data after a steady measurement
private NotchCallback<Session> mSteadyCallback = new NotchCallback<Session>(this) {
    @Override
    public void onSuccess(final Session session) {
        // Gather steady data
    }
};

// Downloading and processing data after a successful capture
private NotchCallback<Measurement> mStopCallback = new NotchCallback<Measurement>(this) {
    @Override
    public void onSuccess(Measurement measurement) {
        mNotchService.download(mOutputFile, measurement, mDownloadResult);
    }
};


Visualiser

The OpenGL-based visualiser creates renderable objects from the fully processed measurement data and maps the results onto the built-in 3D human model. You can perform standard graphic manipulations such as zooming and rotating the model, and control the playback (pause, stop, speed change, etc.) like you would in any other media player. The visualiser also gives access to per-bone orientations and positions for further processing (e.g. to calculate and display relative or absolute angles of the model’s joints). You can extend the visualiser via shaders to show paths, velocity or acceleration vectors, and other motion-related information.
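As an example of the kind of per-bone post-processing this enables, the angle between two bone direction vectors (say, upper arm and forearm) can be computed from their dot product. The vectors in the test are made-up inputs, not real measurement data:

```java
// Angle in degrees between two 3-D bone direction vectors, via the dot product:
// angle = acos( (a . b) / (|a| * |b|) ).
class JointAngle {
    static double angleDegrees(double[] a, double[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < 3; i++) {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        double cos = dot / (Math.sqrt(na) * Math.sqrt(nb));
        // Clamp to avoid NaN from floating-point drift just outside [-1, 1].
        cos = Math.max(-1.0, Math.min(1.0, cos));
        return Math.toDegrees(Math.acos(cos));
    }
}
```

Feeding per-frame bone orientations from the visualiser into a computation like this yields joint angles over time, which can then be displayed alongside the 3D model.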