# Facelytics

[![Version](https://img.shields.io/cocoapods/v/Facelytics.svg?style=flat)](http://cocoapods.org/pods/Facelytics)
[![License](https://img.shields.io/cocoapods/l/Facelytics.svg?style=flat)](http://cocoapods.org/pods/Facelytics)
[![Platform](https://img.shields.io/cocoapods/p/Facelytics.svg?style=flat)](http://cocoapods.org/pods/Facelytics)

Facelytics is an SDK that lets mobile apps detect facial criteria of people by analyzing the front camera video feed in real time. Facelytics can track multiple faces and, for each detected face, detect gender, some emotions, age range, and accessories. For more information, see the [Facelytics website](http://face-lytics.com). You can download a sample application from the [App Store](https://itunes.apple.com/ai/app/facelytics/id997764123) to see usage examples for the SDK.

## Installation

### CocoaPods

[CocoaPods](http://www.cocoapods.org) is the recommended way to add Facelytics to your project.

1. Add a pod entry for Facelytics to your *Podfile* (a complete Podfile sketch follows this list):

    ```ruby
    pod "Facelytics"
    ```

2. Install the pod(s) by running `pod install`.
3. Include Facelytics wherever you need it with `#import <Facelytics/Facelytics.h>` from Objective-C or `import Facelytics` from Swift.
4. The library embedded in the CocoaPod is compiled in debug mode so that you can attach the debugger during development. A release version of the library is available in the **Pod/lib** directory of the [following archive](https://github.com/wassafr/Facelytics-ios/archive/master.zip) if you want better performance in your release build.
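If you are starting from scratch, a minimal complete Podfile might look like the sketch below. The platform version and target name are illustrative assumptions (the platform matches the iOS 7 requirement stated below), not part of the SDK documentation:

```ruby
# Hypothetical minimal Podfile; replace 'YourApp' with your actual target name.
platform :ios, '7.0'

target 'YourApp' do
  pod 'Facelytics'
end
```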
### Manual installation

1. Download the [latest code version](https://github.com/wassafr/Facelytics-ios/archive/master.zip) or add the repository as a git submodule to your git-tracked project.
2. Drag and drop the **Pod** directory from the archive into your project navigator. Make sure to select *Copy items* when asked, if you extracted the code archive outside of your project.
3. Under `Pod/lib` there are two versions of the library; add to your target only the one you need (depending on whether you are debugging or making a release).
4. Download the [OpenCV library](http://sourceforge.net/projects/opencvlibrary/files/opencv-ios/2.4.11/opencv2.framework.zip/download) and drag and drop opencv2.framework into your project navigator.
5. Add the OpenCV dependencies in your project properties under *Build Phases* > *Link with libraries*:
    * libstdc++
    * Accelerate
    * AssetsLibrary
    * AVFoundation
    * CoreGraphics
    * CoreImage
    * CoreMedia
    * CoreVideo
    * Foundation
    * QuartzCore
    * ImageIO
    * MobileCoreServices
    * UIKit
6. Include Facelytics wherever you need it with `#import "Facelytics.h"` from Objective-C or `import Facelytics` from Swift.

## Usage

To run the example project, clone the repo and run `pod install` from the Example directory first. With [CocoaPods](http://www.cocoapods.org), you can also run `pod try Facelytics` from the command line.

Make sure you also see the [Facelytics documentation on CocoaDocs](http://cocoadocs.org/docsets/Facelytics).

**Attention:** To use the SDK, you need an API key that you can get for free on the [Facelytics website](http://face-lytics.com).

The sample code is commented and shows usage examples of the SDK.

### Basics

1. Add the following import to the top of the file, or to the bridging header for Swift:

    ```objc
    #import <Facelytics/Facelytics.h>
    ```

    The main SDK entry point is the `FLYCaptureManager` object. You have to keep a strong reference to it while the session is running. You will need a new license request for each `FLYCaptureManager` creation, so it is recommended to create a new `FLYCaptureManager` for each session.

2. Optional: if your entire interface is based on Facelytics, you can check whether the device can run Facelytics before showing any Facelytics-related UI:

    ```objc
    if([FLYCaptureManager deviceSupportsFacelytics])
    {
        //go to steps 3-4
    }
    else
    {
        //fallback if the device can't use Facelytics
    }
    ```

3. Optional: if your entire interface is based on Facelytics, you can check whether you are authorized to launch Facelytics before showing any Facelytics-related UI. You need an API key to launch the SDK; visit the [Facelytics website](http://face-lytics.com) to get a free demo key:

    ```objc
    self.currentManager = [[FLYCaptureManager alloc] init];
    [self.currentManager requestLicenceAuthorisation:@"<your_key>" completion:^(NSError *error) {
        if(!error)
        {
            //show the Facelytics-related UI
        }
        else
        {
            //handle the error (can be: no camera, camera access not granted, device not powerful enough, or provided licence invalid)
        }
    }];
    ```

4. Optional: you can show a Facelytics-related UI (i.e. a view with the live video feed and appropriate drawing). If you want to show a fullscreen preview, simply show a UIViewController that inherits from `FLYVideoPreviewViewController`. If you want to show a non-fullscreen preview, you can attach an instance of `FLYVideoPreviewViewController` to the capture manager with `- (NSError*)attachPreview:(FLYVideoPreviewViewController*)preview` (see the sample code for different examples).

5. You have to start the SDK to begin analyzing faces. You can do it in the `viewDidAppear` of the related view controller, for example (a combined sketch of steps 3 to 7 follows this list).

    If you didn't perform step 3:

    ```objc
    [self.currentManager startCapturewithDefaultCameraAndLicenceKey:@"<your_key>" completion:^(NSError *error) {
        if(error)
        {
            //handle the error (can be: no camera, camera access not granted, device not powerful enough, or provided licence invalid)
        }
        else
        {
            //start the face detection; by default only the camera feed is started
            [self.currentManager startFaceRecognition];

            //step 6
        }
    }];
    ```

    If you already performed step 3:

    ```objc
    [self.currentManager startCapturewithDefaultCameraCompletion:^(NSError *error) {
        if(error)
        {
            //handle the error (can be: no camera, camera access not granted, device not powerful enough, or provided licence invalid)
        }
        else
        {
            //start the face detection; by default only the camera feed is started
            [self.currentManager startFaceRecognition];

            //step 6
        }
    }];
    ```

    Look at the documentation of `FLYCamera` to customize the camera settings.

6. Optional: you can assign a `FLYDetectionDelegate` to the capture manager to receive face-related events and to know when the session is over, so you can hide the related UI:

    ```objc
    //yourDelegate is any object conforming to FLYDetectionDelegate
    [self.currentManager setDetectionDelegate:yourDelegate];
    ```

    You should implement the method `- (void)detectionDidStopAfterLicenceElapsedTime` to know when the session is over and update the UI accordingly.

7. When you're done, stop the session by calling the `[self.currentManager stopFaceRecognition]` method.
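For orientation, here is a minimal sketch tying steps 3 to 7 together in one view controller. It is an illustration under assumptions, not code from the SDK distribution: `FaceViewController` is a hypothetical class name, `<your_key>` is a placeholder, the other `FLYDetectionDelegate` methods are assumed optional, and error handling is reduced to logging:

```objc
#import <Facelytics/Facelytics.h>

//Hypothetical fullscreen preview controller (step 4) driving a whole session.
@interface FaceViewController : FLYVideoPreviewViewController <FLYDetectionDelegate>
//Keep a strong reference to the manager while the session is running.
@property (nonatomic, strong) FLYCaptureManager *currentManager;
@end

@implementation FaceViewController

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    //Step 3: one manager and one licence request per session.
    self.currentManager = [[FLYCaptureManager alloc] init];
    [self.currentManager requestLicenceAuthorisation:@"<your_key>" completion:^(NSError *error) {
        if(error)
        {
            //No camera, access not granted, device not powerful enough, invalid licence...
            NSLog(@"Licence error: %@", error);
            return;
        }
        //Step 5: start the camera feed (licence already checked in step 3).
        [self.currentManager startCapturewithDefaultCameraCompletion:^(NSError *error) {
            if(error)
            {
                NSLog(@"Capture error: %@", error);
                return;
            }
            //Step 6: receive face events and the end-of-session callback.
            [self.currentManager setDetectionDelegate:self];
            //By default only the camera feed is started, so start the detection too.
            [self.currentManager startFaceRecognition];
        }];
    }];
}

- (void)viewWillDisappear:(BOOL)animated
{
    //Step 7: stop the session when leaving the screen.
    [self.currentManager stopFaceRecognition];
    [super viewWillDisappear:animated];
}

//FLYDetectionDelegate: called when the licenced session time is over.
- (void)detectionDidStopAfterLicenceElapsedTime
{
    [self dismissViewControllerAnimated:YES completion:nil];
}

@end
```

In a real app you would typically capture a `__weak` reference to `self` inside the blocks and present an error message to the user instead of logging.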
## Requirements

* Xcode 5
* iOS 7
* ARC
* Devices responding to `[FLYCaptureManager deviceSupportsFacelytics]`, typically the iPhone 4s and newer and the iPad 3 and newer

## License

Facelytics is available under a commercial license. See the LICENSE file for more info.

## Author

Wassa, contact@wassa.fr