
Releases: Munroe-Meyer-Institute-VR-Laboratory/Biosensor-Framework

Multi-Class Update

08 May 17:25
Pre-release

Updated Unity Package to use multi-class classifier with Empatica E4 streaming server.

Biosensor-Framework

05 Aug 00:23
ec79f53

Biosensor Framework: A C# Library for Affective Computing

Parties Involved

Institution: Munroe Meyer Institute in the University of Nebraska Medical Center
Laboratory: Virtual Reality Laboratory
Advisor: Dr. James Gehringer
Developer: Walker Arce

Dependencies

.NETStandard 2.0
    LightGBM (>= 2.2.3)
    MathNet.Numerics (>= 4.15.0)
    Microsoft.ML (>= 1.5.5)
    Microsoft.ML.FastTree (>= 1.5.5)
    Microsoft.ML.LightGbm (>= 1.5.5)

Installation

The library can be installed through the NuGet package manager, where it is publicly listed. For Unity, the best approach is to download the package and import it through the Unity UI. The package includes all of the dependency libraries, a test script, a test scene, and an example output file. The example performs the same tasks as Example2 and must be supplied with the E4 Streaming Server executable path, an output path, and the API key for your device.

The following command installs the package from the NuGet Package Manager Console:

Install-Package Biosensor-Framework -Version 1.0.0
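If you prefer the .NET CLI, the equivalent command (assuming the same package ID and version) is:

dotnet add package Biosensor-Framework --version 1.0.0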

Unity Package

The Unity package contains the required DLLs, a test script, and an example output file. To install it, clone this repository and open the target Unity project.

Click on Assets -> Import Package -> Custom Package...


Navigate to the Unity Package folder and select the "BiosensorFramework_Unity.unitypackage" file. Once the file has loaded, click "All" and then "Import" to import the package into your project.


Testing Installation

Once the package has been installed, this repo can be cloned to get the test datasets and the trained models.
Follow Example 1 to test the functionality of the Empatica E4 TCP server communication and window data gathering.
Follow Example 2 to test the functionality of Microsoft.ML for affective computing tasks with a connected body-worn sensor.
Follow Example 3 to demonstrate the functionality of Microsoft.ML for affective computing tasks using the WESAD dataset without a connected body-worn sensor.

Basic Usage

This repository has two example projects. Example1 shows the basic usage for communicating with the server and pulling data from a biosensor. Example2 extends Example1 by adding Microsoft.ML inferencing, and will be used to explain the operation of the library.

        using System;
        using System.Diagnostics;
        using System.IO;
        using System.Threading;
        using Microsoft.ML;

        using MMIVR.BiosensorFramework.Biosensors.EmpaticaE4;
        using MMIVR.BiosensorFramework.MachineLearningUtilities;
        
        namespace Example2_ComputingWithMicrosoftML
        {
            class Program
            {
                public static ServerClient Device1;
                static ServerClient client;
                public static MLContext mlContext;
                public static ITransformer Model;
                public static DataViewSchema ModelSchema;
                // TODO: Fill these out with your own values
                public static string APIKey = "";
                public static string ServerPath = @"";
                public static string WesadDirectory = @"";
                public static string SessionOutputPath = @"";
                public static string ModelDir = @"";

For each project, a client connection is established using the ServerClient class, which handles TCP initialization, command communication, and error handling. Each device gets its own ServerClient instance, so Device1 is represented by a device name and a TCP connection. Additionally, the API key and the path to the server executable need to be defined. The API key can be found by following Empatica's directions in the Installation section, and the server path is in the server's installation directory (i.e. @"{installation path}\EmpaticaBLEServer.exe").

Next, the Microsoft.ML variables are created.

        static void Main(string[] args)
        {
            mlContext = new MLContext();
            Train.RunBenchmarks(WesadDirectory, out ITransformer RegModel, out ITransformer MultiModel, out Model, ModelDir);

In this example, the models are trained on the WESAD data using the RunBenchmarks function, which outputs the best-performing model of each type (regression, multi-class, and binary). This example uses the binary classification model, captured here in the Model variable.

            Console.WriteLine("E4 Console Interface - Press ENTER to begin the client");
            Console.ReadKey();

            Console.WriteLine("Step 1 - Start Empatica server");
            Utilities.StartE4ServerGUI(ServerPath);

The E4 server is started through a Process call. This opens the GUI for the server; if that is not desired, the command-line variant can be used instead.

            client = new ServerClient();
            Console.ReadKey();
            client.StartClient();
            Utilities.StartE4Server(ServerPath, APIKey);

The E4 server can also be started through the command line variant as shown.

            Console.WriteLine("Step 2 - Getting connected E4 devices");
            Utilities.ListDiscoveredDevices(client);
            Console.ReadKey();

            Console.WriteLine("     Available Devices:");
            Utilities.AvailableDevices.ForEach(i => Console.WriteLine("     Device Name: {0}", i));
            Console.ReadKey();

Listing the connected devices will show all devices connected through the Bluetooth dongle. The discovered device names are managed internally in the Utilities.AvailableDevices list.

            Device1 = new ServerClient();
            Device1.StartClient();
            Device1.DeviceName = Utilities.AvailableDevices[0];
            Utilities.ConnectDevice(Device1);
            Console.ReadKey();

To connect a device, it needs to be assigned one of the available device names. Once that is done, the ConnectDevice function can be called and the device's TCP connection will be established.

            Utilities.SuspendStreaming(Device1);

Since the device still needs to be configured, streaming must first be suspended.

            Console.WriteLine("Step 3 - Adding biometric data streams");

            foreach (ServerClient.DeviceStreams stream in Enum.GetValues(typeof(ServerClient.DeviceStreams)))
            {
                Thread.Sleep(100);
                Console.WriteLine("     Adding new stream: " + stream.ToString());
                Utilities.SubscribeToStream(Device1, stream);
            }

This example subscribes to every available device stream, but any subset of streams can be subscribed to.
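For instance, to subscribe to only a subset of streams, the same call can be made with individual enum values. The member names below are illustrative assumptions; check ServerClient.DeviceStreams in the class documentation for the actual names:

            // Subscribe to two specific streams instead of all of them.
            // NOTE: the enum member names here are assumed, not verified.
            Utilities.SubscribeToStream(Device1, ServerClient.DeviceStreams.BVP);
            Utilities.SubscribeToStream(Device1, ServerClient.DeviceStreams.GSR);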

            Utilities.StartStreaming(Device1);
            var timer = Utilities.StartTimer(5);
            Utilities.WindowedDataReadyEvent += PullData;

            Console.WriteLine("ENTER to end program");
            Console.ReadKey();
        }

Once streaming is restarted, the device begins collecting data as quickly as the server sends it. The window size in this example is 5 seconds, but it can be any length. An internal timer triggers an event at each expiration and then resets. Press 'Enter' to end the program.
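Changing the window length only changes the argument to StartTimer; this sketch assumes, as above, that StartTimer takes the window length in seconds:

            Utilities.StartStreaming(Device1);
            var timer = Utilities.StartTimer(10);  // 10-second windows instead of 5
            Utilities.WindowedDataReadyEvent += PullData;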

        private static void PullData()
        {
            var watch = Stopwatch.StartNew();
            var WindowData = Utilities.GrabWindow(Device1, @"C:\Readings.data");
            var pred = Predict.PredictWindow(mlContext, Model, WindowData);
            watch.Stop();
            Console.WriteLine("Time: {0} | Prediction: {1}", DateTime.Now, pred.Prediction);
            Console.WriteLine("Processing Time: {0}", watch.ElapsedMilliseconds);
        }

Once the window timer expires, the buffered data can be retrieved with the GrabWindow function, which can optionally be given a file path to write the readings to a data file. Predictions are then made on the raw window with the PredictWindow function, which returns a simple structure containing the output bool.
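An application would typically branch on that bool inside the event handler; the sketch below assumes the binary model distinguishes stress from baseline, as in the WESAD benchmarks:

        private static void PullData()
        {
            var WindowData = Utilities.GrabWindow(Device1, @"C:\Readings.data");
            var pred = Predict.PredictWindow(mlContext, Model, WindowData);
            // Branch on the binary classification result to drive application logic.
            if (pred.Prediction)
                Console.WriteLine("Stress detected in the last window");
            else
                Console.WriteLine("No stress detected in the last window");
        }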

Class Documentation

Biosensor Framework Documentation

Contact

To address issues in the codebase, please create an issue in the repo and it will be addressed by a maintainer. For collaboration inquiries, please address Dr. James Gehringer (james.gehringer@unmc.edu). For technical questions, please address Walker Arce (walker.arce@unmc.edu).