openFrameworks + ARKit
A basic toolkit for building iOS apps with ARKit support.
This is an openFrameworks addon that provides some helper classes for working within ARKit.
Installation and project setup
- Download openFrameworks from openframeworks.cc
- Clone this repo into your addons folder
- Start a new project with the Project Generator
After you've opened up the project file:
- Add a permission setting in your ofxiOS-Info.plist file. See Permissions below.
- Set the project target to iOS 11 / 12.
- You may need to do two things with the file:
  - add it to the compiled sources
  - make sure to set the file type back to its default (for some reason it's treated as "Data" in the Project Generator-generated project)

Note that you may have to repeat these steps if you make any changes to your project via the generator.
To get started, you need to initialize the ARKit framework. This can be done a couple of different ways; ofxARKit provides a helper API to quickly initialize a session without too much fuss.
```cpp
ofxARKit::core::SFormat format;
format.enablePlaneTracking().enableLighting();
auto session = ARCore::generateNewSession(format);
```
An `SFormat` object provides a straightforward way to enable various ARKit features. Passing an `SFormat` instance to `ARCore::generateNewSession` will automatically generate a new `ARSession` object while ensuring the specified features are usable on your device.

You can, of course, write things by hand, which isn't too difficult either:
```objc
@interface <your view controller name>()
@property (nonatomic, strong) ARSession *session;
@end

// then somewhere in your implementation block...
// The official example shows you ought to declare the session in viewWillLoad
// and initialize it in viewWillAppear, but it probably doesn't matter.
self.session = [ARSession new];

// World tracking is used for 6DOF; there are other tracking configurations as well, see
// https://developer.apple.com/documentation/arkit/arconfiguration
ARWorldTrackingConfiguration *configuration = [ARWorldTrackingConfiguration new];

// set up horizontal plane detection - note that this is optional
configuration.planeDetection = ARPlaneDetectionHorizontal;

// start the session
[self.session runWithConfiguration:configuration];
```
As for where to initialize, it really doesn't matter all that much. If your project is set up more like a traditional iOS Objective-C app, you can set things up in your view controller; if your app is more like a normal oF app, you should be able to just as easily set things up in your ofApp.
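As a rough sketch of the ofApp route, building on the `SFormat` / `ARCore::generateNewSession` helpers shown above. Note that the `ARRef` type and the `ARProcessor::create` / `setup` / `update` / `draw` calls here are assumptions about the addon's processor API and may differ in your version; check the wiki and the bundled examples for the exact names.

```cpp
// Sketch only - assumes an ARKit-capable iOS device and the ofxARKit addon.
#include "ofxiOS.h"
#include "ofxARKit.h"

class ofApp : public ofxiOSApp {
public:
    ARSession * session; // ARKit session, retained for the app's lifetime
    ARRef processor;     // assumed ofxARKit processor handle

    void setup() {
        // Build the session with the features we need (see above).
        ofxARKit::core::SFormat format;
        format.enablePlaneTracking().enableLighting();
        session = ARCore::generateNewSession(format);

        // Hand the session to the addon's processor (assumed API).
        processor = ARProcessor::create(session);
        processor->setup();
    }

    void update() { processor->update(); } // pull the latest ARKit frame
    void draw()   { processor->draw(); }   // draw the camera image
};
```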
See the wiki for a brief description of current functionality.
Potential Hurdles in setup of ARKit
Though ARKit is supported on all devices with an A9 chip or later (the iPhone 6s onwards, I believe), it is helpful to have a fairly recent device or you may experience near-immediate degradation of tracking performance. That being said, ARKit is helpful in that regard: it warns you when you're losing performance by spitting out a message to the effect of
...tracking performance reduced due to resource constraints...
FPS appears to be minimally affected, but like the message says, things might not work as well.
If you see that message pop up, the ARKit API offers a limited set of functions for finding the reason behind the degradation in tracking quality:
- You can log the current tracking status via `ARCam`, which will log a basic string describing the status to the console.
- You can also call `getTrackingState` in either class to get the raw tracking state from ARKit.
- There is also a `debugInfo` object, an instance of `ARDebugInfo`, which can be used as well; it additionally provides information about FPS, etc.

Note that in order for those functions to work, you'll need to call the setup function of either `ARCam` or `ARProcessor` and pass in the boolean `true`.
Permissions

For ARKit, you'll have to add the Privacy - Camera Usage Description entry to your ofxiOS-Info.plist file. The value for this field is just the string you want to show users when you ask for camera permission. If you've never touched a plist file before, no worries! It's very easy to change.

If the permission isn't there, all you need to do is hover over one of the items already in the list and click the plus sign. This will add a new field, and you can start typing Privacy - Camera Usage Description; Xcode will attempt to autocomplete it for you.
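If you'd rather edit the raw plist XML, the Privacy - Camera Usage Description entry corresponds to the `NSCameraUsageDescription` key; the description string below is just an example and should be replaced with your own wording:

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera for AR tracking.</string>
```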
Deploying to the App Store
By default, the `AR_FACE_TRACKING` macro is turned on, allowing you to try out examples like example-face-tracking (if you have an iPhone X). We keep this macro on by default in order to make the addon easy to experiment with, but if you're not using the TrueDepth API for face tracking in your app, you'll run into issues when trying to publish to the Apple App Store:
"We noticed your app contains the TrueDepth APIs but we were unable to locate these features in your app. Please provide information about how your app uses the TrueDepth APIs."
To avoid this, if you're not using TrueDepth and plan to publish to the App Store, change the macro defined in ARFaceTrackingBool.h in your openFrameworks addons directory to false:
```diff
// Line 2 of ARFaceTrackingBool.h
- #define AR_FACE_TRACKING true
+ #define AR_FACE_TRACKING false
```
This will remove the code from compilation so you don't get flagged by Apple for including code you're not using.
As I'm certainly not the most knowledgeable on many of the topics required to work in AR, and with ARKit still being in beta, if there's something you feel you can contribute, by all means feel free to make PRs! As long as it doesn't break anything, I'll most likely accept it. Please make all PRs against the
A big thank you to all contributors thus far!