
Cognitive Services

Erika Ehrli edited this page Aug 1, 2016 · 2 revisions

Use Microsoft Cognitive Services to personalize your app’s responses


Microsoft Cognitive Services (formerly Project Oxford) are a set of APIs, SDKs and services available to developers to make their applications more intelligent, engaging and discoverable. Microsoft Cognitive Services expands on Microsoft’s evolving portfolio of machine learning APIs and enables developers to easily add intelligent features – such as emotion and video detection; facial, speech and vision recognition; and speech and language understanding – into their applications. Our vision is for more personal computing experiences and enhanced productivity aided by systems that increasingly can see, hear, speak, understand and even begin to reason.

Microsoft Cognitive Services let you build apps with powerful algorithms using just a few lines of code. They work across devices and platforms such as iOS, Android, and Windows, keep improving, and are easy to set up.

The Shopping Demo app uses the Emotion API to analyze faces and detect a range of feelings to help rate the app. The app uses the camera to take a photo of the user's face and calculates a rating for the app based on happiness. The more you smile, the better you rate the app.
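The happiness score returned by the Emotion API is a value between 0 and 1, so the app can map it onto a rating scale. A minimal sketch of such a mapping, assuming a five-star scale (the `HappinessToRating` helper is hypothetical, not the app's actual code):

```csharp
using System;

public static class RatingHelper
{
    // Hypothetical helper: maps a happiness score in [0, 1]
    // to a 1-5 star rating. Not the Shopping Demo App's actual code.
    public static int HappinessToRating(float happiness)
    {
        // Clamp to the valid score range, then scale onto 1..5.
        var clamped = Math.Max(0f, Math.Min(1f, happiness));
        return 1 + (int)Math.Round(clamped * 4);
    }

    public static void Main()
    {
        Console.WriteLine(HappinessToRating(0.95f)); // prints 5
    }
}
```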


Sign up for Microsoft Azure

You need an Azure account to complete this tutorial.

If you want to get started with Azure App Service before you sign up for an Azure account, go to Try App Service. There, you can immediately create a short-lived starter mobile app in App Service: no credit card required, and no commitments.

Set up the development environment

To start, set up your development environment by installing the latest version of the Azure SDK and Xamarin.

If you do not have Visual Studio installed, use the Visual Studio 2015 link; Visual Studio will be installed along with the SDK.

Subscribe to Emotion API

You must subscribe to the Emotion API and get an API key in order to use it. Follow the steps described here to get the key:

Place that key in the AppSettings.cs file:

public const string EmotionApiKey = "ADD YOUR APP KEY";
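With the key in place, an EmotionServiceClient (the client type shipped in the Microsoft.ProjectOxford.Emotion package) can be constructed from it. A minimal sketch; the surrounding service class is illustrative, not the app's exact code:

```csharp
using Microsoft.ProjectOxford.Emotion;

public class EmotionService
{
    // The client reads the subscription key from AppSettings.
    private readonly EmotionServiceClient client =
        new EmotionServiceClient(AppSettings.EmotionApiKey);
}
```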

Add NuGet package

The Shopping Demo App already includes the Microsoft.ProjectOxford.Emotion NuGet package and is ready to use it:
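If you are wiring this up in a project of your own instead, the package can be installed from the Package Manager Console (package name as published on NuGet):

```
Install-Package Microsoft.ProjectOxford.Emotion
```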

In the shared project, there is a service the app uses:



using (var memoryStream = new MemoryStream(photoBytes))
{
    var emotionService = new EmotionService();
    var firstFaceEmotion = default(Emotion);

    try
    {
        // Send the photo stream to the Emotion API.
        var emotionList = await emotionService.RecognizeAsync(memoryStream);
        firstFaceEmotion = emotionList.FirstOrDefault();
    }
    catch
    {
        await UserDialogs.Instance.AlertAsync("There was an error analyzing your photo. Please, try again.");
    }
}

We get the photo stream and call the RecognizeAsync() method to retrieve the detected emotions, including the happiness value.




iOS

// Rotate the captured photo before sending it for analysis.
UIImage sendImage = UIImage.FromImage(CaptureImageView.Image.CGImage, 1.0f, UIImageOrientation.Right);

Emotion[] detectedEmotions = await emotionClient.RecognizeAsync(sendImage.AsJPEG().AsStream());
Emotion emotion = detectedEmotions.FirstOrDefault();

if (emotion != null)
{
    // Use emotion.Scores.Happiness to calculate the rating.
}

Windows 10 Mobile


// Capture, rotate, and display the photo.
var photoStream = await this.mediaCaptureHelper.TakePhotoAsync();
photoStream = await this.mediaCaptureHelper.RotatePhoto(photoStream);
await this.SetCapturedImage(photoStream);

// Send the photo to the Emotion API and take the first detected face.
var emotionService = new EmotionService();
var detectedEmotions = await emotionService.RecognizeAsync(photoStream.AsStream());
Emotion emotion = detectedEmotions.FirstOrDefault();

Learn more

Getting started with Emotion API

Microsoft Cognitive Services

