- Create a Bluemix Account. Sign up in Bluemix, or use an existing account. Watson Beta or Experimental Services are free to use.
- If it is not already installed on your system, download and install the Cloud Foundry CLI tool.
- You will need to have the cURL package installed to make HTTP requests from the terminal.
- If it is not already installed on your system, install Node.js. Installing Node.js will also install the npm command. Make sure to use Node version 4.2.1, as specified in package.json, or you may run into installation problems.
- Connect to Bluemix in the command line tool
cf api https://api.ng.bluemix.net
cf login -u <your user ID>
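If your Bluemix account has more than one organization or space, cf login will prompt you to pick one; you can also pass them directly on the command line (replace the placeholders with your own organization and space names):
cf login -u <your user ID> -o <your organization> -s <your space>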
- Create the Visual Recognition service in Bluemix. Note: if you deployed the pre-built application that was used in the other exercise in this lab, a service instance named visual-recognition-free will have already been created, and you can skip to the next step.
cf create-service watson_vision_combined free visual-recognition-free
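You can verify that the instance exists by listing the services in your space:
cf services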
- Create a set of service credentials under the id myKey
cf create-service-key visual-recognition-free myKey
- Retrieve your service credentials from Bluemix
cf service-key visual-recognition-free myKey
Keep the API key that is returned handy for the next section.
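The output should look roughly like the following; the values are placeholders and your credentials will differ. The api_key field is the value you will need below.
{
  "url": "https://gateway-a.watsonplatform.net/visual-recognition/api",
  "api_key": "<your api key>"
}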
For the following sections, you can refer to the cURL commands for the Visual Recognition service here: http://www.ibm.com/watson/developercloud/visual-recognition/api/v3/?curl#
Let's try sending a sample image of a bowl of fruit to the /classify endpoint. You will need to replace the {api-key} value with the API key that was returned earlier.
curl -X GET "https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classify?api_key={api-key}&url=https://github.com/watson-developer-cloud/doc-tutorial-downloads/raw/master/visual-recognition/fruitbowl.jpg&version=2016-05-19"
The response from the service should look something like this:
{
  "custom_classes": 0,
  "images": [
    {
      "classifiers": [
        {
          "classes": [
            {
              "class": "fruit",
              "score": 0.937027,
              "type_hierarchy": "/foods/fruit"
            },
            {
              "class": "apple",
              "score": 0.668188
            },
            {
              "class": "banana",
              "score": 0.549834,
              "type_hierarchy": "/foods/fruits/banana"
            },
            {
              "class": "food",
              "score": 0.524979
            },
            {
              "class": "orange",
              "score": 0.5,
              "type_hierarchy": "/colors/orange"
            }
          ],
          "classifier_id": "default",
          "name": "default"
        }
      ],
      "resolved_url": "https://raw.githubusercontent.com/watson-developer-cloud/doc-tutorial-downloads/master/visual-recognition/fruitbowl.jpg",
      "source_url": "https://github.com/watson-developer-cloud/doc-tutorial-downloads/raw/master/visual-recognition/fruitbowl.jpg"
    }
  ],
  "images_processed": 1
}
In this response, the identified classes are returned inside the classes array, and include the identified class, the score or confidence in that class, and the hierarchy of that class, if applicable.
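Since Node.js is installed as part of the prerequisites, you can also make the same classify request programmatically. The following is a minimal sketch using only Node's built-in https module; the {api-key} placeholder and the image URL are the same values used in the cURL command above, and the file name classify.js is just an illustration.
// classify.js - a minimal sketch of the classify request above.
// Replace {api-key} with the API key returned by cf service-key.
var https = require('https');

var apiKey = '{api-key}';
var imageUrl = 'https://github.com/watson-developer-cloud/doc-tutorial-downloads/raw/master/visual-recognition/fruitbowl.jpg';

var requestUrl = 'https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classify' +
  '?api_key=' + apiKey +
  '&url=' + encodeURIComponent(imageUrl) +
  '&version=2016-05-19';

https.get(requestUrl, function (res) {
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () {
    var result = JSON.parse(body);
    // Walk the classifiers array and print each class with its confidence score.
    result.images[0].classifiers.forEach(function (classifier) {
      classifier.classes.forEach(function (c) {
        console.log(classifier.name + ': ' + c['class'] + ' (score ' + c.score + ')');
      });
    });
  });
}).on('error', console.error);
Running node classify.js should print the same class/score pairs seen in the JSON response above.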
- Download the sample training images to your local machine by clicking the links below:
Fruit Bowls - Positive
Not Fruit Bowls - Negative
Visual Recognition takes in a set of zip files representing images that depict each class of objects, along with a zip containing those images that do not conform to any of these classes.
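The sample zips linked above are already prepared, but if you were assembling your own training data, each class's images would be zipped up on their own, for example (the folder names here are purely illustrative):
zip -r fruitbowl.zip my-fruitbowl-images/
zip -r not-fruit-bowls.zip my-negative-images/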
- Using the zips downloaded above, execute the following call against the /classifiers endpoint to create a new instance of a classifier.
curl -X POST -F "fruitbowl_positive_examples=@fruitbowl.zip" -F "negative_examples=@not-fruit-bowls.zip" -F "name=fruitbowls" "https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classifiers?api_key={api_key}&version=2016-05-20"
The response to creating a classifier will contain the classifier_id of your newly created instance, along with its current status.
{
  "classifier_id": "fruitbowls_206030696",
  "name": "fruitbowls",
  "owner": "0e03a4bf-6b50-461d-a21d-ccdc32fe0906",
  "status": "training",
  "created": "2016-11-08T22:23:39.034Z",
  "classes": [{"class": "fruitbowl"}]
}
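If you would rather create the classifier from Node.js, the multipart upload can be streamed with the form-data package. This is a sketch only; it assumes you have run npm install form-data and that fruitbowl.zip and not-fruit-bowls.zip are in the current directory.
// create-classifier.js - a sketch of the same POST /v3/classifiers request from Node.js.
var fs = require('fs');
var https = require('https');
var FormData = require('form-data'); // npm install form-data

var apiKey = '{api_key}';

var form = new FormData();
form.append('name', 'fruitbowls');
form.append('fruitbowl_positive_examples', fs.createReadStream('fruitbowl.zip'));
form.append('negative_examples', fs.createReadStream('not-fruit-bowls.zip'));

var req = https.request({
  method: 'POST',
  host: 'gateway-a.watsonplatform.net',
  path: '/visual-recognition/api/v3/classifiers?api_key=' + apiKey + '&version=2016-05-20',
  headers: form.getHeaders()  // sets the multipart/form-data boundary header
});

// Stream the multipart body into the request and print the JSON reply.
form.pipe(req);
req.on('response', function (res) {
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () { console.log(body); });
});
req.on('error', console.error);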
You can check the status of your classifier by listing all your classifiers as follows:
curl -X GET "https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classifiers?api_key={api_key}&version=2016-05-20"
{
  "classifiers": [{
    "classifier_id": "fruitbowls_206030696",
    "name": "fruitbowls",
    "status": "ready"
  }]
}
Once your classifier is ready, you can start classifying images with it.
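Training usually takes a few minutes, so you may want to poll the classifiers list until the status changes from training to ready. A small sketch of such a check, reusing the Node.js style from the earlier example:
// check-status.js - list classifiers and report each one's status.
var https = require('https');

var apiKey = '{api_key}';
var listUrl = 'https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classifiers' +
  '?api_key=' + apiKey + '&version=2016-05-20';

https.get(listUrl, function (res) {
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () {
    JSON.parse(body).classifiers.forEach(function (c) {
      console.log(c.classifier_id + ': ' + c.status);
    });
  });
}).on('error', console.error);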
We are now ready to re-classify our image using our new custom classifier. As before, we will invoke the classify endpoint. Unlike before, this time we will pass the set of classifier_ids to use for classification. Here we will send in the default classifier along with the classifier_id of our custom classifier.
curl -X GET "https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classify?api_key={api_key}&url=https://github.com/watson-developer-cloud/doc-tutorial-downloads/raw/master/visual-recognition/fruitbowl.jpg&version=2016-05-19&classifier_ids={classifier_id},default"
Our response will look similar to before, but will now include the results from our custom classifier.
{
  "custom_classes": 1,
  "images": [
    {
      "classifiers": [
        {
          "classes": [
            {
              "class": "fruit",
              "score": 0.937027,
              "type_hierarchy": "/foods/fruit"
            },
            {
              "class": "apple",
              "score": 0.668188
            },
            {
              "class": "banana",
              "score": 0.549834,
              "type_hierarchy": "/foods/fruits/banana"
            },
            {
              "class": "food",
              "score": 0.524979
            },
            {
              "class": "orange",
              "score": 0.5,
              "type_hierarchy": "/colors/orange"
            }
          ],
          "classifier_id": "default",
          "name": "default"
        },
        {
          "classes": [
            {
              "class": "fruitbowl",
              "score": 0.55598
            }
          ],
          "classifier_id": "fruitbowls_206030696",
          "name": "fruitbowls"
        }
      ],
      "resolved_url": "https://raw.githubusercontent.com/watson-developer-cloud/doc-tutorial-downloads/master/visual-recognition/fruitbowl.jpg",
      "source_url": "https://github.com/watson-developer-cloud/doc-tutorial-downloads/raw/master/visual-recognition/fruitbowl.jpg"
    }
  ],
  "images_processed": 1
}
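If you are making the request from Node.js as in the earlier classify sketch, the only change is the extra classifier_ids query parameter; replace {classifier_id} with the id returned when you created your classifier.
// Same request as the earlier classify sketch, with classifier_ids added.
// apiKey and imageUrl are the variables defined in that sketch.
var requestUrl = 'https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classify' +
  '?api_key=' + apiKey +
  '&url=' + encodeURIComponent(imageUrl) +
  '&version=2016-05-19' +
  '&classifier_ids=' + encodeURIComponent('{classifier_id},default');
// Pass requestUrl to https.get() exactly as before; the response will now also
// include a classifiers entry for your custom classifier.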