Alan iOS SDK

Alan Platform · Alan Studio · Docs · FAQ · Blog · Twitter


Create a voice script for your application in Alan Studio and then add it to your app.

Setup

  1. Download the Alan iOS SDK framework here
  2. Copy the framework to your project
  3. Add the framework to "Embedded Binaries" (General tab of your target)
  4. Add the framework to "Linked Frameworks and Libraries" (General tab of your target)
  5. Add the NSMicrophoneUsageDescription key to your application's Info.plist (required to request microphone access)

Integrate into Swift

  1. Import AlanSDK
import AlanSDK
  2. Define an AlanButton variable
fileprivate var button: AlanButton!
  3. Set up AlanButton
let config = AlanConfig(key: "YOUR_KEY_FROM_ALAN_STUDIO_HERE")
self.button = AlanButton(config: config)
  4. Lay out the button the same way as any UIView in your app (see the sketch after this list)
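
Below is a minimal sketch of a view controller that puts these steps together. The Auto Layout constraints, button size, and placement are example values only, not requirements of the SDK:

import UIKit
import AlanSDK

class ViewController: UIViewController {
    fileprivate var button: AlanButton!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Create the button with the project key from Alan Studio
        let config = AlanConfig(key: "YOUR_KEY_FROM_ALAN_STUDIO_HERE")
        self.button = AlanButton(config: config)

        // Lay the button out like any other UIView, here in the bottom-right corner
        self.button.translatesAutoresizingMaskIntoConstraints = false
        self.view.addSubview(self.button)
        NSLayoutConstraint.activate([
            self.button.widthAnchor.constraint(equalToConstant: 64),
            self.button.heightAnchor.constraint(equalToConstant: 64),
            self.button.trailingAnchor.constraint(equalTo: self.view.safeAreaLayoutGuide.trailingAnchor, constant: -20),
            self.button.bottomAnchor.constraint(equalTo: self.view.safeAreaLayoutGuide.bottomAnchor, constant: -20)
        ])
    }
}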

Integrate into Objective-C

  1. Import AlanSDK
@import AlanSDK;
  2. Define an AlanButton property
@property (nonatomic) AlanButton* button;
  3. Set up AlanButton in viewDidLoad
AlanConfig* config = [[AlanConfig alloc] initWithKey:@"YOUR_KEY_FROM_ALAN_STUDIO_HERE"];
self.button = [[AlanButton alloc] initWithConfig:config];
  4. Lay out the button the same way as any UIView in your app

AlanSDK classes

AlanConfig

An object that describes the parameters to be provided to AlanButton.

Create a new AlanConfig instance with the given project key:
- (instancetype)initWithKey:(NSString *)key;

Name | Type | Description
key | NSString | Project key from Alan Studio

AlanButton

This class provides a view with the voice button and instance methods for communicating with the voice script in Alan Studio.

Create a new AlanButton instance with the given config object:

- (instancetype)initWithConfig:(AlanConfig *)config;
Name | Type | Description
config | AlanConfig | The AlanConfig object described above

Play text via Alan:

- (void)playText:(NSString *)text;
Name | Type | Description
text | NSString | Text to be played
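
For example, in Swift (assuming the default Objective-C-to-Swift bridging of this method):

self.button.playText("Hello! How can I help you?")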

Send a voice-synchronized data event:

- (void)playCommand:(NSDictionary *)command;
Name | Type | Description
command | NSDictionary | Data event to be sent
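
A minimal Swift sketch, assuming the default bridging; the dictionary keys and values below are purely illustrative and should match whatever your voice script handles:

self.button.playCommand(["command": "navigation", "screen": "settings"])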

Set the visual state of the application:

- (void)setVisualState:(NSDictionary *)visualStateData;
Name | Type | Description
visualStateData | NSDictionary | Data with visual state description
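
For example, in Swift (default bridging assumed); the keys below are hypothetical and should mirror the visual state your script reads:

self.button.setVisualState(["screen": "orders", "selectedItem": "pizza"])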

Call a project API function defined in the Alan Studio script:

- (void)callProjectApi:(NSString *)method withData:(NSDictionary*)data callback:(void(^)(NSError *error, NSString *object))callback;
Name | Type | Description
method | NSString | Function name
data | NSDictionary | Function params
callback | void(^)(NSError *error, NSString *object) | Callback to handle result
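
A hedged Swift sketch, assuming the default bridging; greetUser is a hypothetical function exposed through the project API in your Alan Studio script:

self.button.callProjectApi("greetUser", withData: ["name": "John"]) { error, result in
    if let error = error {
        print("callProjectApi error: \(error)")
    } else {
        print("callProjectApi result: \(result ?? "")")
    }
}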

Handle events from AlanSDK:

[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(handleEvent:) name:@"kAlanSDKEventNotification" object:nil];
- (void)handleEvent:(NSNotification*)notification
{
    NSDictionary *userInfo = notification.userInfo;
    NSLog(@"%@", userInfo);
}
Name | Description
kAlanSDKEventNotification | Notification name for Alan SDK events
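
The same subscription in Swift; handleEvent must be exposed to Objective-C so it can be used as a selector:

NotificationCenter.default.addObserver(self,
                                       selector: #selector(handleEvent(_:)),
                                       name: NSNotification.Name("kAlanSDKEventNotification"),
                                       object: nil)

@objc func handleEvent(_ notification: Notification) {
    // userInfo carries the event payload coming from the voice script
    print(notification.userInfo ?? [:])
}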

Print debug log information from Alan Studio:

[AlanLog setEnableLogging:YES];
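
In Swift, the same call (assuming the default bridging) would be:

AlanLog.setEnableLogging(true)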

Links

  1. Alan iOS SDK documentation
  2. Integration with Swift documentation
  3. Integration with Objective-C documentation

Other platforms:

Have questions?

If you have any questions, or if something is missing in the documentation, please contact us or tweet us @alanvoiceai. We love hearing from you!