# UnityVision-iOS

This native plugin lets Unity take advantage of specific features of Core ML and the Vision framework on iOS. It works with or without Unity's ARKit plugin: when ARKit is in use, image analysis is performed on ARKit's CoreVideo pixel buffer; when it is not available, the plugin also accepts native pointers to Unity textures (a sketch of this path follows the feature list below).

Currently supported features:

- Image classification
- Rectangle detection
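The sketch below illustrates the texture-pointer path mentioned above. It is a minimal, hypothetical example: the commented-out manager call is a placeholder, not the plugin's actual API, so consult the example scenes for the real component and method names.

```csharp
using System;
using UnityEngine;

public class TexturePointerExample : MonoBehaviour
{
    [SerializeField] private Texture2D sourceTexture;

    private void Start()
    {
        // When ARKit's CoreVideo pixel buffer is not available, the plugin
        // accepts a native pointer to a Unity texture instead.
        IntPtr nativePtr = sourceTexture.GetNativeTexturePtr();

        // Hypothetical call -- replace with the evaluation method actually
        // exposed by the plugin's manager class (see the example scenes).
        // visionManager.EvaluateTexture(nativePtr);
        Debug.Log("Native texture pointer: " + nativePtr);
    }
}
```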

## Installation

Requirements:

The plugin was tested with Unity 2018.1.0f2. It should also work with Unity 2017, but this has not been confirmed. Core ML requires iOS 11.0 or above.

Follow the steps below to integrate the plugin into your Unity project:

1. Copy the contents of `UnityVision-iOS/Assets/Scripts/Possible` to `YourProject/Assets/Scripts/Possible`.
2. Set the following values in Player Settings (an editor-script alternative is sketched after this list):
   - Scripting backend: IL2CPP
   - Target minimum iOS version: 11.0
   - Architecture: ARM64
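If you prefer to apply these values from code, the editor script below is one way to do it. It is a minimal sketch using Unity's standard `PlayerSettings` API; the menu path and class name are arbitrary, and the file must live in an `Editor` folder.

```csharp
using UnityEditor;

// Place this file in an Editor folder.
public static class IOSPlayerSettingsSetup
{
    [MenuItem("Tools/Configure iOS Player Settings")]
    public static void Configure()
    {
        // Scripting backend: IL2CPP
        PlayerSettings.SetScriptingBackend(BuildTargetGroup.iOS, ScriptingImplementation.IL2CPP);

        // Target minimum iOS version: 11.0
        PlayerSettings.iOS.targetOSVersionString = "11.0";

        // Architecture: ARM64 (1 corresponds to ARM64 for the iOS target group)
        PlayerSettings.SetArchitecture(BuildTargetGroup.iOS, 1);
    }
}
```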

## Usage guide

To learn how to use the plugin, study the example scenes located in `UnityVision-iOS/Assets/Examples`. Note that the plugin's functionality cannot be tested by running the scenes in the editor; to see the plugin in action, build and deploy one of the example scenes to a device.
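For example, a build can be started from an editor script like the one below. This is only a sketch: the scene path is an assumption, so substitute the actual example scene you want to deploy.

```csharp
using UnityEditor;

// Place this file in an Editor folder.
public static class ExampleSceneBuilder
{
    [MenuItem("Tools/Build iOS Example")]
    public static void BuildExample()
    {
        var options = new BuildPlayerOptions
        {
            // Assumed scene path -- point this at one of the example scenes
            // under Assets/Examples in your project.
            scenes = new[] { "Assets/Examples/ImageClassification.unity" },
            target = BuildTarget.iOS,
            locationPathName = "Builds/iOS"
        };

        // Generates the Xcode project; open it in Xcode to deploy to a device.
        BuildPipeline.BuildPlayer(options);
    }
}
```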

For image classification, the plugin uses the InceptionV3 machine learning model, which is provided in this repository in the MLModel folder. Add the model to the Xcode project generated by Unity after building by dragging it into the project navigator, and make sure it is added to the Unity-iPhone build target.
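If you would rather not drag the model in by hand after every build, a post-process build script can copy it into the generated project automatically. The sketch below assumes the model file is named `Inceptionv3.mlmodel` and that the MLModel folder sits one level above the Unity project; adjust both paths to your setup.

```csharp
using System.IO;
using UnityEditor;
using UnityEditor.Callbacks;
using UnityEditor.iOS.Xcode;

// Place this file in an Editor folder.
public static class AddMLModelToXcode
{
    [PostProcessBuild]
    public static void OnPostProcessBuild(BuildTarget target, string buildPath)
    {
        if (target != BuildTarget.iOS) return;

        // Assumed location and file name of the model -- adjust as needed.
        const string modelSource = "../MLModel/Inceptionv3.mlmodel";
        string fileName = Path.GetFileName(modelSource);
        File.Copy(modelSource, Path.Combine(buildPath, fileName), true);

        // Register the model with the Unity-iPhone target of the generated project.
        string projPath = PBXProject.GetPBXProjectPath(buildPath);
        var proj = new PBXProject();
        proj.ReadFromFile(projPath);

        string targetGuid = proj.TargetGuidByName("Unity-iPhone");
        string fileGuid = proj.AddFile(fileName, fileName, PBXSourceTree.Source);
        proj.AddFileToBuild(targetGuid, fileGuid);

        proj.WriteToFile(projPath);
    }
}
```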

The InceptionV3 model is quite large, so you may prefer a smaller one; any image classification model will work. To switch models, edit `VisionNative.swift`, located under `UnityVision-iOS/Assets/Plugins/iOS/Vision/Native`: at line 49 of the source file, change the `VNCoreMLModel` object's initialization parameter from `InceptionV3().model` to your own model. This is only necessary if you want to replace the underlying machine learning model; the plugin works with InceptionV3 out of the box.

## Troubleshooting

If you get linker errors for ARKit when building the Xcode project, the particular version of Unity you are using did not add ARKit.framework to the generated project's linked binaries. Go to Build Phases > Link Binary With Libraries and add ARKit.framework.
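Alternatively, the framework can be linked automatically after each build with a small post-process script. This is a sketch using Unity's standard `PBXProject` API; place the file in an `Editor` folder.

```csharp
using UnityEditor;
using UnityEditor.Callbacks;
using UnityEditor.iOS.Xcode;

// Place this file in an Editor folder.
public static class LinkARKitFramework
{
    [PostProcessBuild]
    public static void OnPostProcessBuild(BuildTarget target, string buildPath)
    {
        if (target != BuildTarget.iOS) return;

        string projPath = PBXProject.GetPBXProjectPath(buildPath);
        var proj = new PBXProject();
        proj.ReadFromFile(projPath);

        // Link ARKit.framework against the Unity-iPhone target (not weakly linked).
        string targetGuid = proj.TargetGuidByName("Unity-iPhone");
        proj.AddFrameworkToProject(targetGuid, "ARKit.framework", false);

        proj.WriteToFile(projPath);
    }
}
```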