Face and Emotion API Workshop
This is a simple app designed to showcase the capabilities of the Microsoft Cognitive Services Face and Emotion APIs. It is being used as part of the Cambridge University Microsoft Student Partners' workshop on the Cognitive Services Face APIs.
If you just want to watch how to build an app with Cognitive Services, just come along! If you want to follow along with the coding, make sure you have the following:
- Proficiency in Java. Understanding of the Observer design pattern is desirable.
- A working installation of Android Studio.
- Make sure you have Git installed. If not, follow the instructions here.
- Some way to run Android apps. A physical Android device connected over USB is preferred (don't forget your USB cable!); otherwise, an emulator will do.
Windows, macOS and Linux are all fine.
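The prerequisites mention the Observer design pattern, which the app uses to deliver asynchronous API results back to the UI. As a quick refresher, here is a minimal, hypothetical sketch (the names are illustrative, not taken from the workshop code) of an observer being notified when a task completes:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal Observer pattern sketch. All names are illustrative and
// not taken from the workshop code.
interface ResultObserver {
    void onResult(String result);
}

class ApiTask {
    private final List<ResultObserver> observers = new ArrayList<>();

    void addObserver(ResultObserver o) {
        observers.add(o);
    }

    // In a real app this would run asynchronously and deliver a network
    // response; here we just push a canned result to every observer.
    void run() {
        String result = "happiness";
        for (ResultObserver o : observers) {
            o.onResult(result);
        }
    }
}

public class ObserverDemo {
    public static void main(String[] args) {
        ApiTask task = new ApiTask();
        task.addObserver(r -> System.out.println("Observer received: " + r));
        task.run();
    }
}
```

The point is simply that the code producing a result does not need to know who consumes it; observers register themselves and are called back when the result arrives.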
Preparing for the Workshop
If you want to follow along with the coding, it is highly recommended you complete the following steps before coming along:
- Clone this repository by typing this command:
git clone https://github.com/cambridge-msp/face-emotion-api-workshop.git
- Navigate to the project directory:
cd face-emotion-api-workshop
- Open Android Studio and open this project from the directory you just cloned. Android Studio may complain that the correct SDK version is not installed; if so, follow its instructions to install the SDK.
- It is likely Android Studio will say the project requires a Gradle sync. Follow the prompts to sync the project.
- Check out the workshop branch:
git checkout workshop
If you don't want the "workshop version" but would rather see the complete version, run git checkout master at any point. To return to the workshop version, run git checkout workshop. (See below for details on the "workshop version".)
This repository contains two branches.
master
This branch contains a fully working implementation of the app. It is a thoroughly documented and stable version that follows good Android programming practices. It is good for looking over to see how a complete implementation works.
workshop
This branch contains gaps in the code that are filled in during the workshop to demonstrate how easy it is to integrate the Emotion API into apps. It is deliberately designed to require no knowledge of Android-specific APIs: the Android-specific parts are already implemented.
Note that this branch does not build until the gaps are filled in!
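To give a flavour of the kind of logic filled in during the workshop: the Emotion API reports a confidence score per emotion for each detected face, and the app simply picks the highest-scoring one. This hypothetical sketch (a plain Map stands in for the SDK's result type; none of these names come from the workshop code) shows the idea:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: the Emotion API returns a score per emotion for
// each face. The "dominant" emotion is the one with the highest score.
// A plain Map stands in for the SDK's result object here.
public class DominantEmotion {
    static String dominant(Map<String, Double> scores) {
        String best = null;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (Map.Entry<String, Double> e : scores.entrySet()) {
            if (e.getValue() > bestScore) {
                bestScore = e.getValue();
                best = e.getKey();
            }
        }
        return best;
    }

    public static void main(String[] args) {
        Map<String, Double> scores = new LinkedHashMap<>();
        scores.put("anger", 0.01);
        scores.put("happiness", 0.92);
        scores.put("neutral", 0.05);
        scores.put("sadness", 0.02);
        System.out.println(dominant(scores)); // prints "happiness"
    }
}
```

The Android-specific plumbing (camera capture, networking, UI updates) is already written on this branch, so the workshop gaps concentrate on small, self-contained logic like this.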
You can find the slide show we used here on Google Slides.
A recording of the workshop is available on YouTube.
About the Presentation
The original presentation was given by Henry Thompson and David Adeboye on Friday, February 24th, 2017.