An NSFW (aka porn) detector with CoreML

NSFWDetector is a small (17 kB) CoreML model to scan images for nudity. It was trained using CreateML to distinguish between porn/nudity and appropriate pictures, with the main focus on telling apart Instagram-model-like pictures and porn.


Usage

guard #available(iOS 12.0, *), let detector = NSFWDetector.shared else {
    return
}

detector.check(image: image, completion: { result in
    switch result {
    case let .success(nsfwConfidence: confidence):
        if confidence > 0.9 {
            // 😱🙈😏
        } else {
            // ¯\_(ツ)_/¯
        }
    default:
        break
    }
})

If you want to enforce stricter boundaries for your platform, just apply a lower threshold for the confidence.
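For example, reusing the detector and image from the snippet above, a stricter platform might flag everything above 0.5 instead of 0.9 (the 0.5 cut-off here is purely illustrative, not a value recommended by the library):

detector.check(image: image, completion: { result in
    // Lower threshold: treat the image as NSFW already at moderate confidence.
    if case let .success(nsfwConfidence: confidence) = result, confidence > 0.5 {
        // Hide, blur or reject the image here.
    }
})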


Installation

NSFWDetector is available through CocoaPods. To install it, simply add the following line to your Podfile:

pod 'NSFWDetector'

⚠️ Because the model was trained with CreateML, you need Xcode 10 or later to compile the project.

App Size

The machine learning model is only 17 kB in size, so app size won't be noticeably affected, unlike other libraries that use the Yahoo model.

Using just the Model

If you don't want to use the Detection Code, you can also just download the MLModel file directly from the latest Release.
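Below is a rough sketch of how the raw model could be wired up with Vision directly, assuming the Xcode-generated model class is named NSFW (the class name follows the .mlmodel file name, so yours may differ) and that the classifier's positive label is "NSFW"; treat it as an illustration under those assumptions, not the library's own code:

import CoreML
import UIKit
import Vision

@available(iOS 12.0, *)
func nsfwConfidence(for image: UIImage, completion: @escaping (Float) -> Void) throws {
    // `NSFW` is the class Xcode generates from the downloaded .mlmodel file (assumed name).
    let model = try VNCoreMLModel(for: NSFW().model)

    let request = VNCoreMLRequest(model: model) { request, _ in
        // The model is an image classifier, so results arrive as classification observations.
        let observations = request.results as? [VNClassificationObservation] ?? []
        // The label name "NSFW" is an assumption; inspect the model's class labels to be sure.
        let confidence = observations.first(where: { $0.identifier == "NSFW" })?.confidence ?? 0
        completion(confidence)
    }

    guard let cgImage = image.cgImage else {
        completion(0)
        return
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
}

Using the pod instead saves you this boilerplate, since the detection code above is essentially what the library wraps for you.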


Feedback

If you recognize issues with certain kinds of pictures, feel free to reach out via Mail or Twitter.


Author

Michael Berg, michael.berg@lovoo.com


License

NSFWDetector is available under the BSD license. See the LICENSE file for more info.