iOS-Depth-Sampler


Code examples of Depth APIs in iOS

Requirement

Use a device that has a dual camera (e.g. iPhone 8 Plus) or a TrueDepth camera (e.g. iPhone X)

How to build

Open iOS-Depth-Sampler.xcworkspace with Xcode 10 and build it!

It can NOT run on the Simulator (because it uses Metal).

Contents

Real-time Depth

Depth visualization in real time using AVFoundation.
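
As a rough illustration (a sketch, not the sample's exact code), depth frames can be streamed with AVCaptureDepthDataOutput roughly like this; the device selection and filtering setting are assumptions:

```swift
import AVFoundation
import CoreImage

final class DepthCapture: NSObject, AVCaptureDepthDataOutputDelegate {
    private let session = AVCaptureSession()
    private let depthOutput = AVCaptureDepthDataOutput()
    private let queue = DispatchQueue(label: "depth.output")

    func start() throws {
        // Prefer the TrueDepth camera; fall back to the dual camera.
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front)
            ?? AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back) else { return }

        session.beginConfiguration()
        session.addInput(try AVCaptureDeviceInput(device: device))
        session.addOutput(depthOutput)
        depthOutput.isFilteringEnabled = true // smoothed, hole-filled depth
        depthOutput.setDelegate(self, callbackQueue: queue)
        session.commitConfiguration()
        session.startRunning()
    }

    // Called for every depth frame the session delivers.
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // Convert to 32-bit float disparity and wrap it for visualization.
        let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
        let depthImage = CIImage(cvPixelBuffer: converted.depthDataMap)
        _ = depthImage // normalize and render, e.g. with Core Image or Metal
    }
}
```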

Real-time Depth Mask

Blending a background image with a mask created from depth.
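
One way to do this kind of blend (a sketch under assumptions, not necessarily how the sample does it) is Core Image's CIBlendWithMask filter, with the low-resolution depth mask scaled up to the frame size:

```swift
import CoreImage

/// Composites a camera frame over a background using a depth-based mask.
func blend(foreground: CIImage, background: CIImage, depthMask: CIImage) -> CIImage? {
    // Scale the (low-resolution) depth mask up to the video frame size.
    let scaleX = foreground.extent.width / depthMask.extent.width
    let scaleY = foreground.extent.height / depthMask.extent.height
    let mask = depthMask.transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))

    let filter = CIFilter(name: "CIBlendWithMask", parameters: [
        kCIInputImageKey: foreground,
        kCIInputBackgroundImageKey: background,
        kCIInputMaskImageKey: mask
    ])
    return filter?.outputImage
}
```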

Depth from Camera Roll

Depth visualization from pictures in the camera roll.

Please try this after taking a picture with the Camera app using the PORTRAIT mode.
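
For reference, the depth stored in such a photo can be read from its auxiliary image data; this sketch assumes you already have the photo's raw Data (e.g. fetched via PHImageManager), and the function name is illustrative:

```swift
import ImageIO
import AVFoundation

/// Loads the depth (disparity) data embedded in a Portrait-mode photo.
func loadDepthData(from imageData: Data) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any]
    else { return nil }
    return try? AVDepthData(fromDictionaryRepresentation: auxInfo)
}
```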

Portrait Matte

Background removal demo using the Portrait Effects Matte.

Please try this after taking a picture of a HUMAN with the PORTRAIT mode.

Available in iOS 12 or later.
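
The matte itself is another piece of auxiliary image data; a minimal sketch of extracting it from a photo's Data (the function name is illustrative):

```swift
import ImageIO
import AVFoundation

/// Extracts the Portrait Effects Matte (iOS 12+) embedded in a Portrait photo.
func loadPortraitMatte(from imageData: Data) -> AVPortraitEffectsMatte? {
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let matteInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypePortraitEffectsMatte) as? [AnyHashable: Any]
    else { return nil }
    return try? AVPortraitEffectsMatte(fromDictionaryRepresentation: matteInfo)
}
```

The matte's mattingImage is a high-resolution alpha mask of the person, so it can be used like the depth mask above (e.g. with CIBlendWithMask) to remove the background.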

ARKit Depth

Depth visualization on ARKit. The depth on ARKit is available only when using ARFaceTrackingConfiguration.
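
In practice this means an ARFrame's capturedDepthData is only populated while face tracking runs, and only on a subset of frames; a minimal sketch (class and variable names are illustrative):

```swift
import UIKit
import ARKit
import CoreImage

final class FaceDepthViewController: UIViewController, ARSessionDelegate {
    private let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        // Depth here comes from the TrueDepth camera, so face tracking is required.
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // capturedDepthData is delivered at a lower rate than video frames,
        // so it is nil for most frames.
        guard let depthData = frame.capturedDepthData else { return }
        let depthImage = CIImage(cvPixelBuffer: depthData.depthDataMap)
        _ = depthImage // visualize, e.g. as a grayscale overlay
    }
}
```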

2D image in 3D space

A demo to render a 2D image in 3D space.
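
The underlying idea is to back-project each pixel into 3D using its depth value; a simplified pinhole-camera sketch (the intrinsics parameters are assumptions, not taken from the sample):

```swift
import simd

/// Back-projects a pixel with a depth value into a 3D point,
/// assuming a pinhole camera with focal lengths (fx, fy) and principal point (cx, cy).
func unproject(pixel: SIMD2<Float>, depth: Float,
               fx: Float, fy: Float, cx: Float, cy: Float) -> SIMD3<Float> {
    let x = (pixel.x - cx) * depth / fx
    let y = (pixel.y - cy) * depth / fy
    return SIMD3<Float>(x, y, depth)
}
```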

AR occlusion

[WIP] An occlusion sample on ARKit using depth.

Author

Shuichi Tsutsumi

Freelance iOS programmer from Japan.

Support via PayPal