IBM Watson Ruby SDK


Ruby gem to quickly get started with the various IBM Watson services.


Before you begin

Watson Assistant v2 API is released in beta. For details, see the "Introducing Watson Assistant" blog post.

Installation

Install the gem:

gem install ibm_watson

Install with development dependencies:

gem install --dev ibm_watson

In your Ruby program, require the gem:

require "ibm_watson"

Examples

The examples folder contains basic and advanced examples for each service. The examples assume that you already have service credentials.

Running in IBM Cloud

If you run your app in IBM Cloud, the SDK gets credentials from the VCAP_SERVICES environment variable.
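The SDK parses those credentials for you, so no code is required. Purely for illustration, here is a standard-library sketch of the shape of that variable and how the fields could be extracted; the "discovery" service name and credential values are placeholders, not part of the SDK:

```ruby
# Sketch only: the SDK reads VCAP_SERVICES itself. The service name
# ("discovery") and the credential values below are illustrative placeholders.
require "json"

sample_vcap = {
  "discovery" => [
    {
      "credentials" => {
        "url" => "https://gateway.watsonplatform.net/discovery/api",
        "apikey" => "<apikey>"
      }
    }
  ]
}.to_json

vcap = JSON.parse(sample_vcap)
credentials = vcap["discovery"].first["credentials"]
puts credentials["url"] # the endpoint the SDK would use
```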

Authentication

Watson services are migrating to token-based Identity and Access Management (IAM) authentication.

  • With some service instances, you authenticate to the API by using IAM.
  • In other instances, you authenticate by providing the username and password for the service instance.

Getting credentials

To find out which authentication mechanism to use, view your service credentials. You find the service credentials the same way for all Watson services:

  1. Go to the IBM Cloud Dashboard page.
  2. Either click an existing Watson service instance or click Create resource > AI and create a service instance.
  3. Copy the url and either apikey or username and password. Click Show if the credentials are masked.

IAM

IBM Cloud is migrating to token-based Identity and Access Management (IAM) authentication. IAM authentication uses a service API key to get an access token that is passed with the call. Access tokens are valid for approximately one hour and must be regenerated.

You supply either an IAM service API key or an access token:

  • Use the API key to have the SDK manage the lifecycle of the access token. The SDK requests an access token, ensures that the access token is valid, and refreshes it if necessary.
  • Use the access token if you want to manage the lifecycle yourself. For details, see Authenticating with IAM tokens.
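Under the hood, the token exchange is a plain HTTPS form POST to the IAM endpoint. Here is a hedged sketch of that exchange; the request is only constructed, not sent, and the apikey value is a placeholder:

```ruby
# Sketch of the IAM token exchange the SDK performs for you. The request is
# built but not sent here; the apikey value is a placeholder.
require "net/http"
require "uri"

iam_url = URI("https://iam.ng.bluemix.net/identity/token")
request = Net::HTTP::Post.new(iam_url)
request["Accept"] = "application/json"
request.set_form_data(
  "grant_type" => "urn:ibm:params:oauth:grant-type:apikey",
  "apikey" => "<iam_apikey>"
)
# The SDK sends a request like this, caches the access_token from the JSON
# response, and refreshes it before the roughly one-hour expiry:
# Net::HTTP.start(iam_url.host, iam_url.port, use_ssl: true) { |http| http.request(request) }
puts request.body
```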

Supplying the IAM API key

# In the constructor, letting the SDK manage the IAM token
discovery = IBMWatson::DiscoveryV1.new(
  version: "2017-10-16",
  iam_apikey: "<iam_apikey>",
  iam_url: "<iam_url>" # optional - the default value is https://iam.ng.bluemix.net/identity/token
)
# after instantiation, letting the SDK manage the IAM token
discovery = IBMWatson::DiscoveryV1.new(version: "2017-10-16")
discovery.iam_apikey(iam_apikey: "<iam_apikey>")

Supplying the access token

# in the constructor, assuming control of managing IAM token
discovery = IBMWatson::DiscoveryV1.new(
  version: "2017-10-16",
  iam_access_token: "<iam_access_token>"
)
# after instantiation, assuming control of managing IAM token
discovery = IBMWatson::DiscoveryV1.new(version: "2017-10-16")
discovery.iam_access_token(iam_access_token: "<access_token>")

Username and password

require "ibm_watson"
include IBMWatson
# In the constructor
discovery = DiscoveryV1.new(version: "2017-10-16", username: "<username>", password: "<password>")
# After instantiation
discovery = DiscoveryV1.new(version: "2017-10-16")
discovery.username = "<username>"
discovery.password = "<password>"

Sending requests asynchronously

Requests can be sent asynchronously. Two asynchronous methods are available, async and await; both return an Ivar object.

  • To call a method asynchronously, insert .async or .await into the call: service.translate becomes service.async.translate.
  • To access the response from an Ivar object called future, call future.value.

When await is used, the request is made synchronously:

speech_to_text = IBMWatson::SpeechToTextV1.new(
  username: "username",
  password: "password"
)
audio_file = File.open(Dir.getwd + "/resources/speech.wav")
future = speech_to_text.await.recognize(
  audio: audio_file
)
p future.complete? # If the request is successful, then this will be true
output = future.value # The response is accessible at future.value

When async is used, the request is made asynchronously:

speech_to_text = IBMWatson::SpeechToTextV1.new(
  username: "username",
  password: "password"
)
audio_file = File.open(Dir.getwd + "/resources/speech.wav")
future = speech_to_text.async.recognize(
  audio: audio_file
)
p future.complete? # Can be false if the request is still running
future.wait # Wait for the asynchronous call to finish
p future.complete? # If the request is successful, then this will now be true
output = future.value

Sending request headers

Custom headers can be passed with any request as a Hash argument to the chainable headers method. For example, to send a header called Custom-Header with a Watson Assistant call:

require "ibm_watson"
include IBMWatson

assistant = AssistantV1.new(
  username: "xxx",
  password: "yyy",
  version: "2017-04-21"
)

response = assistant.headers(
  "Custom-Header" => "custom_value"
  ).list_workspaces

Parsing HTTP response info

All HTTP requests return DetailedResponse objects, which expose a result, status, and headers:

require "ibm_watson"
include IBMWatson

assistant = AssistantV1.new(
  username: "xxx",
  password: "yyy",
  version: "2017-04-21"
)

response = assistant.headers(
  "Custom-Header" => "custom_value"
  ).list_workspaces

p "Status: #{response.status}"
p "Headers: #{response.headers}"
p "Result: #{response.result}"

This produces output with the structure:

Status: 200
Headers: "<http response headers>"
Result: "<response returned by service>"

Configuring the HTTP client

To set client configurations such as a timeout or proxy, use the configure_http_client method and pass in the configurations:

require "ibm_watson/assistant_v1"
include IBMWatson

assistant = AssistantV1.new(
  username: "{username}",
  password: "{password}",
  version: "2018-07-10"
)

assistant.configure_http_client(
  timeout: {
    # Accepts either :per_operation or :global
    per_operation: { # The individual timeouts for each operation
      read: 5,
      write: 7,
      connect: 10
    }
    # global: 30 # The total timeout time
  },
  proxy: {
    address: "bogus_address.com",
    port: 9999,
    username: "username",
    password: "password",
    headers: {
      bogus_header: true
    }
  }
)

Using Websockets

The Speech to Text service supports WebSockets through the recognize_using_websocket method, which accepts a custom callback class. The EventMachine loop that the WebSocket uses blocks the main thread by default, so run it inside a separate thread:

require "ibm_watson"

callback = IBMWatson::RecognizeCallback.new
audio_file = "<Audio File for Analysis>"
speech_to_text = IBMWatson::SpeechToTextV1.new(
  username: "<username>",
  password: "<password>"
)
websocket = speech_to_text.recognize_using_websocket(
  audio: audio_file,
  recognize_callback: callback,
  interim_results: true
)
thr = Thread.new do # Start the websocket inside of a thread
  websocket.start # Starts the websocket and begins sending audio to the server.
  # The `callback` processes the data from the server
end
thr.join # Wait for the thread to finish before ending the program or running other code

Note: recognize_with_websocket is deprecated in favor of recognize_using_websocket.

Ruby version

Tested on:

  • MRI Ruby (RVM): 2.3.7, 2.4.4, 2.5.1
  • RubyInstaller (Windows x64): 2.3.3, 2.4.4, 2.5.1

Contributing

See CONTRIBUTING.md.

License

This library is licensed under the Apache 2.0 license.