Cognitive Services
Microsoft Cognitive Services (formerly Project Oxford) are a set of APIs, SDKs and services available to developers to make their applications more intelligent, engaging and discoverable. Microsoft Cognitive Services expands on Microsoft’s evolving portfolio of machine learning APIs and enables developers to easily add intelligent features – such as emotion and video detection; facial, speech and vision recognition; and speech and language understanding – into their applications. Our vision is for more personal computing experiences and enhanced productivity aided by systems that increasingly can see, hear, speak, understand and even begin to reason.
Microsoft Cognitive Services let you build apps with powerful algorithms using just a few lines of code. They work across devices and platforms such as iOS, Android, and Windows, keep improving, and are easy to set up.
The Shopping Demo App uses the Emotion API to analyze faces and detect a range of feelings to help rate the app. The app uses the camera to get a photo of the user's face and calculates a rating for the app based on happiness. The more you smile, the better you rate the app.
You need an Azure account to complete this tutorial. You can:
- Open an Azure account for free. You get credits that can be used to try out paid Azure services. Even after the credits are used up, you can keep the account and use free Azure services and features, such as the Web Apps feature in Azure App Service.
- Activate Visual Studio subscriber benefits. Your Visual Studio subscription gives you credits every month that you can use for paid Azure services.
- Get credits every month by joining Visual Studio Dev Essentials.
If you want to get started with Azure App Service before you sign up for an Azure account, go to Try App Service. There, you can immediately create a short-lived starter mobile app in App Service—no credit card required, and no commitments.
To start, set up your development environment by installing the latest version of the Azure SDK and Xamarin.
If you do not have Visual Studio installed, use the link for Visual Studio 2015, and Visual Studio will be installed along with the SDK.
To use the Emotion API, you must subscribe to it and get an API key. Follow the steps described here to get the key:
Place that key in the AppSettings.cs file:
public const string EmotionApiKey = "ADD YOUR APP KEY";
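For context, AppSettings.cs might look like the following minimal sketch (the namespace and class layout here are assumptions; only the EmotionApiKey constant is used by the samples below):

namespace ShoppingDemoApp
{
    // Hypothetical app-wide configuration class.
    public static class AppSettings
    {
        // Emotion API subscription key obtained from the Azure portal.
        public const string EmotionApiKey = "ADD YOUR APP KEY";
    }
}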
The Shopping Demo App already includes the Microsoft.ProjectOxford.Emotion NuGet package and is ready to use it. The shared project contains an EmotionService that the platform apps call:
// Wrap the captured photo bytes in a stream and send them to the Emotion API.
using (var memoryStream = new MemoryStream(photoBytes))
{
    var emotionService = new EmotionService();
    var firstFaceEmotion = default(Emotion);
    try
    {
        // RecognizeAsync returns one Emotion result per face detected in the photo.
        var emotionList = await emotionService.RecognizeAsync(memoryStream);
        firstFaceEmotion = emotionList.FirstOrDefault();
    }
    catch
    {
        await UserDialogs.Instance.AlertAsync("There was an error analyzing your photo. Please, try again.");
    }
    [...]
}
We need to get the photo stream and call the RecognizeAsync() method to get the emotions and read the happiness value.
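The EmotionService in the shared project is a thin wrapper over the SDK's EmotionServiceClient. A minimal sketch of such a wrapper, assuming it simply creates the client with the key from AppSettings.cs and forwards the stream (the actual implementation in the app may differ), could be:

using System.IO;
using System.Threading.Tasks;
using Microsoft.ProjectOxford.Emotion;
using Microsoft.ProjectOxford.Emotion.Contract;

public class EmotionService
{
    // EmotionServiceClient comes from the Microsoft.ProjectOxford.Emotion NuGet package.
    private readonly EmotionServiceClient client = new EmotionServiceClient(AppSettings.EmotionApiKey);

    // Sends the photo stream to the Emotion API and returns one Emotion per detected face.
    public Task<Emotion[]> RecognizeAsync(Stream photoStream)
    {
        return client.RecognizeAsync(photoStream);
    }
}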
On iOS, in RatingProcessingViewController.cs:
// Show a progress indicator while the photo is analyzed.
UserDialogs.Instance.ShowLoading("Processing...");

// Re-orient the captured image before sending it to the Emotion API.
UIImage sendImage = UIImage.FromImage(CaptureImageView.Image.CGImage, 1.0f, UIImageOrientation.Right);

Emotion[] detectedEmotions = await emotionClient.RecognizeAsync(sendImage.AsJPEG().AsStream());
Emotion emotion = detectedEmotions.FirstOrDefault();

UserDialogs.Instance.HideLoading();

if (emotion != null)
{
    // Rate the app using the happiness score of the first detected face.
    SetRating(emotion.Scores.Happiness);
}
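SetRating converts the happiness score returned by the API into the app rating. The real method is not shown here; a possible sketch, assuming the 0.0–1.0 happiness score maps linearly to a 1–5 rating and that a RatingLabel control exists (both are assumptions), is:

// Hypothetical sketch: map the 0.0–1.0 happiness score to a 1–5 rating.
private void SetRating(float happiness)
{
    int rating = Math.Max(1, (int)Math.Round(happiness * 5));
    RatingLabel.Text = $"Your rating: {rating} / 5"; // RatingLabel is an assumed UI label
}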
On the other platforms the flow is the same: take the photo, rotate it, and pass its stream to the EmotionService:

// Capture a photo with the device camera and correct its orientation.
var photoStream = await this.mediaCaptureHelper.TakePhotoAsync();
photoStream = await this.mediaCaptureHelper.RotatePhoto(photoStream);

// Show the captured image in the UI.
await this.SetCapturedImage(photoStream);

// Send the photo stream to the Emotion API and take the first detected face.
var emotionService = new EmotionService();
var detectedEmotions = await emotionService.RecognizeAsync(photoStream.AsStream());
Emotion emotion = detectedEmotions.FirstOrDefault();
Getting started with the Emotion API
For more information, please visit http://azure.com/xamarin