Facial Detection/Computer Vision Chat
Uses https://github.com/aaronabentheuer/AAFaceDetection (Visage), converted to Swift 3.0.
-
SmileChat lets you use your face to show how you feel while you chat with friends.
-
It currently uses the above repo to classify facial states and provide an approximate result.
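Visage is built on Core Image's `CIDetector` face detection. A minimal sketch of the underlying idea, classifying a smile from a still image with `CIDetector` directly (function and variable names here are illustrative, not SmileChat's actual code):

```swift
import CoreImage
import UIKit

// Illustrative sketch: classify a facial state from a still image
// using Core Image's face detector, which Visage builds on.
func detectSmile(in image: UIImage) -> Bool? {
    guard let ciImage = CIImage(image: image) else { return nil }

    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

    // CIDetectorSmile asks Core Image to evaluate the mouth region.
    let features = detector?.features(in: ciImage,
                                      options: [CIDetectorSmile: true])

    // Take the first detected face, if any, and report its smile state.
    guard let face = features?.first as? CIFaceFeature else { return nil }
    return face.hasSmile
}
```

Visage itself does this continuously on the front-camera feed and publishes state changes, which is where the "approximate result" above comes from.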
-
It has a few UI issues (labels update very rapidly, quirks in basic functionality), but it is mostly a proof of concept.
-
Supports Facebook login; contributions adding other providers are welcome.
To get started and run the app, follow these steps (your setup may require more):
- Open the SmileChat workspace in Xcode.
- Change the Bundle Identifier to match your domain.
- Go to Firebase and create a new project.
- Select the "Add Firebase to your iOS app" option, enter the bundle identifier, and click Continue.
- Download the "GoogleService-Info.plist" file and add it to the project. Make sure the file name is exactly "GoogleService-Info.plist".
- Edit the Model files to make sure they conform to your database schema.
- Go to the Firebase Console, select your project, choose "Authentication" from the left menu, open "SIGN-IN METHOD", and enable the Facebook option. Then follow the tutorial to set up Facebook on both the Firebase and Facebook Developer sites.
- Run on a physical device to make use of the front-facing camera.
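Once the steps above are done, the app-side wiring typically looks something like the sketch below, using the Swift 3-era Firebase and Facebook SDK APIs (`FIRApp`, `FBSDKApplicationDelegate`); exact names depend on your SDK versions, and this is an assumption-laden sketch rather than SmileChat's actual code:

```swift
import UIKit
import Firebase
import FBSDKCoreKit

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -> Bool {
        // Reads the GoogleService-Info.plist added in the steps above.
        FIRApp.configure()
        // Lets the Facebook SDK finish its own setup.
        return FBSDKApplicationDelegate.sharedInstance()
            .application(application, didFinishLaunchingWithOptions: launchOptions)
    }

    func application(_ app: UIApplication, open url: URL,
                     options: [UIApplicationOpenURLOptionsKey: Any] = [:]) -> Bool {
        // Hands the Facebook login redirect back to the Facebook SDK.
        return FBSDKApplicationDelegate.sharedInstance()
            .application(app, open: url,
                         sourceApplication: options[.sourceApplication] as? String,
                         annotation: options[.annotation])
    }
}

// After Facebook login succeeds, its access token is exchanged
// for a Firebase credential (Swift 3-era Firebase Auth API):
func signInToFirebase(facebookToken: String) {
    let credential = FIRFacebookAuthProvider.credential(withAccessToken: facebookToken)
    FIRAuth.auth()?.signIn(with: credential) { user, error in
        if let error = error {
            print("Firebase sign-in failed: \(error)")
            return
        }
        print("Signed in as \(user?.uid ?? "unknown")")
    }
}
```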
This project is written in Swift 3.0 and requires Xcode 8.2 to build and run.
Copyright 2017 Thomas Hocking.
Licensed under MIT License: https://opensource.org/licenses/MIT