---
title: Detect Adult, racy, or gory content - Azure AI Vision
titleSuffix: Azure AI services
description: Concepts related to detecting adult content in images using the Azure AI Vision API.
author: PatrickFarley
manager: nitinme
ms.service: azure-ai-vision
ms.topic: conceptual
ms.date: 01/19/2024
ms.author: pafarley
---

# Adult content detection

Azure AI Vision can detect adult material in images so that developers can restrict the display of these images in their software. Content flags are applied with a score between zero and one so developers can interpret the results according to their own preferences.
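
For example, an application might apply its own thresholds to the returned scores before deciding whether to hide an image. The following sketch is illustrative only: the score values and the 0.70/0.50 thresholds are invented, and the property names come from the API response described later in this article.

```python
# Illustrative sketch only: the score values and thresholds below are made up.
# The property names match the Analyze Image response described later in this article.
scores = {"adultScore": 0.02, "racyScore": 0.64, "goreScore": 0.01}
thresholds = {"adultScore": 0.70, "racyScore": 0.50, "goreScore": 0.50}

# Flag each category when its score meets or exceeds your own threshold.
flagged = {name: score >= thresholds[name] for name, score in scores.items()}
print(flagged)  # {'adultScore': False, 'racyScore': True, 'goreScore': False}
```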

Try out the adult content detection features quickly and easily in your browser using Vision Studio.

> [!div class="nextstepaction"]
> Try Vision Studio

> [!TIP]
> Azure AI Content Safety is the latest offering in AI content moderation. For more information, see the Azure AI Content Safety overview.

## Content flag definitions

The "adult" classification contains several different categories:

- **Adult** images are explicitly sexual in nature and often show nudity and sexual acts.
- **Racy** images are sexually suggestive in nature and often contain less sexually explicit content than images tagged as **Adult**.
- **Gory** images show blood/gore.

## Use the API

You can detect adult content with the Analyze Image 3.2 API. When you add the value `Adult` to the `visualFeatures` query parameter, the API returns three boolean properties in its JSON response: `isAdultContent`, `isRacyContent`, and `isGoryContent`. The method also returns the corresponding properties `adultScore`, `racyScore`, and `goreScore`, which are confidence scores between zero and one for each respective category.
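
As a concrete illustration, the following Python sketch calls the Analyze Image 3.2 REST endpoint with the `Adult` visual feature and reads the `adult` object from the JSON response. The endpoint, key, and image URL are placeholders, and the `requests` library is assumed; adapt it to your own resource or client SDK.

```python
import requests

# Minimal sketch, assuming a standard Azure AI Vision resource. The endpoint,
# key, and image URL below are placeholders, not real values.
endpoint = "https://<your-resource-name>.cognitiveservices.azure.com"
key = "<your-key>"

response = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Adult"},
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"},
    json={"url": "https://example.com/sample-image.jpg"},
)
response.raise_for_status()

# The "adult" object in the response carries the flags and confidence scores.
adult = response.json()["adult"]
print(adult["isAdultContent"], adult["adultScore"])
print(adult["isRacyContent"], adult["racyScore"])
print(adult["isGoryContent"], adult["goreScore"])
```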