Demo app for the 'Introduction to ARKit' talk (https://ryandavis.io/introduction-to-arkit/). It demonstrates a number of the core ARKit features (to varying degrees of sophistication) using theming and elements from everybody's favourite '90s SNES RPG, Earthbound.
Appropriately, the demos are found in the 'Demos' folder, organised by the ARKit feature they cover. The demos use sound quite heavily, so it's best to have sound enabled. Since these are ARKit demos, you will need to run them on a physical device.
A single demo that demonstrates the setup and function of a basic ARKit world tracking session, without any extras ("all R, no A"). It includes a couple of debugging features, such as the continuous display of position and tracking state, and the ability to enable/disable the display of ARKit debugging information (tap the tracking state indicator to toggle).
This demo's view controller (BaseARViewController) is the base class for all other demos, so they get the debugging and setup 'for free' and you can focus on just the incremental code required to use each feature.
You can return to the menu from this and any other scene by tapping the door icon in the top left.
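For orientation, the basic world-tracking setup looks roughly like this in Xamarin.iOS C#. This is a minimal sketch with a hypothetical view controller name; the app's real setup (plus the debugging overlays) lives in BaseARViewController:

```csharp
using ARKit;
using UIKit;

public class MinimalARViewController : UIViewController
{
    ARSCNView sceneView;

    public override void ViewDidLoad()
    {
        base.ViewDidLoad();
        sceneView = new ARSCNView { Frame = View.Bounds };
        // Optional debug visuals, like the toggleable overlay in the demo
        sceneView.DebugOptions = ARSCNDebugOptions.ShowFeaturePoints
                               | ARSCNDebugOptions.ShowWorldOrigin;
        View.AddSubview(sceneView);
    }

    public override void ViewWillAppear(bool animated)
    {
        base.ViewWillAppear(animated);
        // World tracking: 6DOF device tracking, no plane/image detection yet
        sceneView.Session.Run(new ARWorldTrackingConfiguration(),
                              ARSessionRunOptions.ResetTracking);
    }

    public override void ViewWillDisappear(bool animated)
    {
        base.ViewWillDisappear(animated);
        sceneView.Session.Pause();
    }
}
```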
These demos show how to include virtual SceneKit content in an AR session, including:
- loading a SceneKit scene from file (.scn) straight into the AR world
- loading a SceneKit scene and using contents as 'prefabs' to place in the AR world
- using SCNActions to control movement of virtual content (kind of janky because I'm bad at math/ran out of time)
- a broken 'Earthbound home designer' that lets you select Earthbound sprites and tap to place them in the virtual world. If you try it you'll find it doesn't work as you expect, because we haven't got any awareness of real-world features (like walls and floors), so virtual content gets placed behind/below them in 3D space.
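The loading and animation patterns above look roughly like this in Xamarin.iOS SceneKit (a sketch with made-up asset and node names, assuming sceneView is the demo's ARSCNView):

```csharp
using SceneKit;

// Load a whole SceneKit scene from a .scn file straight into the AR view
var scene = SCNScene.FromFile("Scenes/ship");
sceneView.Scene = scene;

// Or pull a node out of a loaded scene to use as a 'prefab'
var prefab = scene.RootNode.FindChildNode("ship", true);
var copy = prefab.Clone();
copy.Position = new SCNVector3(0, 0, -0.5f); // half a metre in front of the camera
sceneView.Scene.RootNode.AddChildNode(copy);

// Use SCNActions to move the node back and forth forever
var forward = SCNAction.MoveBy(new SCNVector3(0, 0, -0.2f), 2);
var loop = SCNAction.RepeatActionForever(
    SCNAction.Sequence(new[] { forward, forward.ReversedAction() }));
copy.RunAction(loop);
```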
These demos show how ARKit plane tracking works, in three stages. Each stage builds slightly on the previous so you can focus on the change in code:
- plane detection: receiving planes and updates from ARKit and visualising them by placing transparent overlays in the virtual scene. Different sounds play depending on whether ARKit is adding, updating or removing detected planes
- hit testing support: the above + support for continuous (per AR frame) hit testing to determine whether a surface is being pointed at. Tap the 'column' icon in the bottom left to switch to hit testing mode after you have detected some surfaces.
- content placement support: the above + the ability to add virtual content to the planes by tapping when pointing at a detected surface in hit testing mode.
You can toggle showing/hiding the transparent overlays by long-pressing the 'column' icon.
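The three stages boil down to a configuration flag, a delegate callback, and a hit test. A rough Xamarin.iOS sketch (the real demo classes add the sounds and overlay management):

```csharp
using System;
using ARKit;
using SceneKit;
using CoreGraphics;

// 1. Ask ARKit to detect horizontal planes
var config = new ARWorldTrackingConfiguration
{
    PlaneDetection = ARPlaneDetection.Horizontal
};
sceneView.Session.Run(config, ARSessionRunOptions.ResetTracking);

// 2. Receive plane anchors through the ARSCNView delegate
class PlaneDelegate : ARSCNViewDelegate
{
    public override void DidAddNode(ISCNSceneRenderer renderer, SCNNode node, ARAnchor anchor)
    {
        if (anchor is ARPlaneAnchor plane)
        {
            // Transparent overlay sized to the detected extent
            var overlay = SCNNode.FromGeometry(
                SCNPlane.Create(plane.Extent.X, plane.Extent.Z));
            overlay.EulerAngles = new SCNVector3((float)(-Math.PI / 2), 0, 0);
            overlay.Opacity = 0.3f;
            node.AddChildNode(overlay);
        }
        // DidUpdateNode / DidRemoveNode handle plane updates and removals
    }
}

// 3. Hit test from the screen centre against detected planes (run per frame)
var centre = new CGPoint(sceneView.Bounds.GetMidX(), sceneView.Bounds.GetMidY());
var hits = sceneView.HitTest(centre, ARHitTestResultType.ExistingPlaneUsingExtent);
bool pointingAtSurface = hits.Length > 0;
```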
Demonstrates the image detection feature of ARKit. At the event, attendees held up their Meetup profile picture, which was recognised and used to anchor their position in the room and produce a floating version of the avatar. It was then possible to point at any of the detected avatars and tap to display a brief summary of their Meetup profile retrieved from the Meetup API.
These calls have been replaced in the sample with hard-coded data. You can print (or display on screen) the Earthbound character files in the Resources folder (ness, paula, jeff, poo) and they will work instead.
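Image detection is world tracking plus a set of reference images. A minimal sketch (the group name here is an assumption; match it to the app's asset catalog):

```csharp
using ARKit;
using Foundation;

var config = new ARWorldTrackingConfiguration
{
    // Reference images loaded from an asset catalog resource group
    DetectionImages = ARReferenceImage.GetReferenceImagesInGroup(
        "AR Resources", NSBundle.MainBundle)
};
sceneView.Session.Run(config, ARSessionRunOptions.ResetTracking);
// Recognised images then arrive in the delegate as ARImageAnchors
```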
Demonstrates the image tracking feature of ARKit. These demos used four jumbo playing cards (the King of each suit), which I photographed to use as reference images. It's unlikely that you'll have access to the same deck of cards, so the best approach to running this demo is probably to take photos of your own set of cards and replace the reference images I've provided. These can be found in the Assets/cards reference group - don't forget to update the physical size properties to match the size of your cards!
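Unlike image detection, image tracking uses its own configuration that follows the images frame-to-frame. A sketch of the setup (group name taken from the Assets/cards reference group mentioned above):

```csharp
using ARKit;
using Foundation;

var config = new ARImageTrackingConfiguration
{
    TrackingImages = ARReferenceImage.GetReferenceImagesInGroup(
        "cards reference", NSBundle.MainBundle),
    MaximumNumberOfTrackedImages = 4 // one per King
};
sceneView.Session.Run(config, ARSessionRunOptions.ResetTracking);
```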
Demonstrates the object detection feature of ARKit. This is the least reliable of all the demos in the app. The Assets/objects reference group contains scanned models of a Xamarin monkey + two amiibos - ness and lucas. These objects are 'reasonably' easy to get your hands on in the real world, but you could easily replace them with other objects. In any case, my experience was that scanned models were only reliably recognised in the original environment they were scanned in, so you'll be better off replacing and re-scanning them. You can use Apple's ARScanner app to generate detection objects. In order to get the Earthbound dialog, name your models 'ninten', 'ness' and 'lucas', or add your own mappings to the ReplacementNameFor method.
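Object detection follows the same pattern as image detection, with scanned .arobject files as the reference set. A sketch, assuming the group name matches the Assets/objects reference group:

```csharp
using ARKit;
using Foundation;

var config = new ARWorldTrackingConfiguration
{
    // Scanned reference objects from the asset catalog group
    DetectionObjects = ARReferenceObject.GetReferenceObjectsInGroup(
        "objects reference", NSBundle.MainBundle)
};
sceneView.Session.Run(config, ARSessionRunOptions.ResetTracking);
// Recognised objects arrive as ARObjectAnchors; the anchor's
// ReferenceObject.Name is what gets mapped to Earthbound dialog
```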
Demonstrates some of the face tracking features of ARKit, in several steps:
- detecting a face: plays a spooky sound when a face is detected
- tracking a face: the above + receiving updates to a face as it moves and displaying a peppermint mesh over the face in 3D
- expression detection: the above + display of the top five strongest 'blend shapes' detected by ARKit and very naive/rudimentary detection of some expressions. Try making 'mouth wide open', 'big smile', 'frown' or 'winking' expressions. These were only tested on me and only consider a single blend shape, so may not work well for others.
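Blend shapes are per-feature coefficients in the 0..1 range exposed on each face anchor. A sketch of single-blend-shape expression checks in the spirit of the demo (the thresholds here are guesses, not the demo's actual values):

```csharp
using System;
using ARKit;

void CheckExpression(ARFaceAnchor anchor)
{
    // Each blend shape is a coefficient from 0 (neutral) to 1 (fully expressed)
    var shapes = anchor.BlendShapes;
    if (shapes.JawOpen > 0.6f)
        Console.WriteLine("mouth wide open");
    else if (shapes.MouthSmileLeft > 0.5f && shapes.MouthSmileRight > 0.5f)
        Console.WriteLine("big smile");
}
```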
Besides helping satisfy a long-unfulfilled desire to make something Earthbound-themed, AR-bound was designed to make it easy to quickly test different ARKit features. If you want to try out different features, you can easily add a screen to the menu by creating a view controller that derives from BaseARViewController (this gets you the debugging overlays, etc.) and then adding a DisplayInMenu attribute to the class to give it a title etc. Just take a look at any of the existing ones if you get stuck. Menu items are ordered by namespace, so use that knowledge to put your screens before or after the baked-in ones.
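A minimal new screen would look something like this (the DisplayInMenu arguments are an assumption - check the existing demos for the attribute's real signature):

```csharp
// Hypothetical demo screen; BaseARViewController and DisplayInMenu
// come from the app itself
[DisplayInMenu("My Demo")]
public class MyDemoViewController : BaseARViewController
{
    public override void ViewDidLoad()
    {
        base.ViewDidLoad();
        // BaseARViewController has already set up the session and the
        // debugging overlays; add your incremental feature code here
    }
}
```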
- the scrolling background on the home screen is a hack that lasted long enough for the meetup, but the sad truth will be revealed if you leave it running long enough
- the flying ship doesn't know how to face the direction it's flying 🧐
- various other bugs and bits of jank due to my cripplingly poor familiarity with 3D math, vectors, matrices etc. and the fact that this is just a demo app 😎
- 'Apple Kid' font from the Ultimate Earthbound Font Pack
- 'Choose a File' bgm, assorted Earthbound sounds and sprites from Starman.net
Larry O'Brien's posts on ARKit with Xamarin were a good help, and so was ToSCNMatrix4.
P.S. You can toggle the menu BGM by tapping the 'AR Bound' title.