This is a simple web application running on Bluemix that:
- uses a Bootstrap front-end (asking for a link to a picture)
- passes that link to a node.js back-end application running on the IBM Bluemix platform
- then uses the Watson Visual Recognition service API to analyze the picture's content
- passes the result back to the front-end
- displays the result in the front-end
I'm using this application during workshops to quickly illustrate:
- code hosting on GitHub
- the IBM Open Toolchain for CI/CD
- deployment of a node.js back-end on IBM Bluemix
- the mechanics of how front-end, back-end and an API interact
This code is not meant to be production-worthy, so don't use any of it for real-life applications.
If you are looking for a detailed "getting started" guide for the Watson Visual Recognition service, head on over here. Looking for a demo of the service? Go here.
- Sign up for a free IBM Bluemix account here: https://console.ng.bluemix.net/registration
- Sign up for a free GitHub account here: https://github.com/join
- Download and install GitHub Desktop (instructions here: https://desktop.github.com)
- Download and install the Atom editor for free here: https://atom.io
- Download and install node.js on your laptop: https://nodejs.org/en/download/
- Verify the node.js installation:
node -v
- Update npm to the latest version:
sudo npm install npm@latest -g
Below are rudimentary instructions. They are not meant to be detailed enough to be followed without some face-to-face guidance.
- Log into Bluemix
- Create and deploy a node.js starter app at https://console.ng.bluemix.net/catalog/starters/sdk-for-nodejs
- Wait for the application to deploy
- Add a toolchain to the app (from the application's overview page) and configure it correctly:
  - GitHub integration
  - Delivery pipeline
- Check that a repo has been created at https://github.com
- Download (clone) the source code with the GitHub Desktop tool
- Run `npm install --save` to have the npm modules installed locally
- Copy and paste the provided `.gitignore`, `.cfignore` and `.eslintrc` files
- Start the node.js back-end locally: `node app.js` (a sketch of what a starter `app.js` typically contains follows below)
- Check that the web app is served locally (e.g. at http://localhost:6015)

==> We now have a functional development environment set up
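For orientation, here is a minimal sketch of what a node.js starter `app.js` typically looks like at this point: an Express server that serves static files from `/public` and reads its port from the Cloud Foundry environment. This is illustrative only; the file generated by the SDK for Node.js starter may differ in detail (it often uses the `cfenv` module).

```javascript
// Minimal sketch of a Bluemix node.js starter back-end (illustrative only;
// the generated app.js may use cfenv and look slightly different).
var express = require('express');
var app = express();

// Serve the static front-end from the /public directory
app.use(express.static(__dirname + '/public'));

// Cloud Foundry injects the port via the PORT (or VCAP_APP_PORT) env variable;
// fall back to a fixed port for local development.
var port = process.env.PORT || process.env.VCAP_APP_PORT || 6015;

app.listen(port, function () {
  console.log('Server listening on port ' + port);
});
```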
- Open `index.html` and change the welcome message
- Reload the web app locally and check that the change worked
- Commit the change in GitHub Desktop
- Watch it trigger the toolchain and deploy the change to Bluemix

==> We've shown that we can propagate code changes to deployment
We will not be spending much time and effort on coding or understanding the details of the front-end. It is built on the Bootstrap framework and does little more than prompt for a link, which it passes (via the JavaScript contained in `/public/src/client-js.js`) to the back-end node.js application.
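As an illustration, that client-side logic essentially sends the entered URL to the back-end and renders the response. The snippet below is a minimal sketch, not the actual contents of `/public/src/client-js.js`; the element ids, the route name `/api/classify` and the use of `fetch` are assumptions made for this example.

```javascript
// Minimal sketch of the client-side logic (illustrative only; the real
// client-js.js may differ). The element ids and the back-end route name
// /api/classify are assumptions made for this example.
document.getElementById('analyze-button').addEventListener('click', function () {
  var imageUrl = document.getElementById('image-url').value;

  // Send the picture URL to the node.js back-end and show the classification result
  fetch('/api/classify?url=' + encodeURIComponent(imageUrl))
    .then(function (response) { return response.json(); })
    .then(function (result) {
      document.getElementById('result').textContent = JSON.stringify(result, null, 2);
    })
    .catch(function (err) {
      console.error('Call to the back-end failed:', err);
    });
});
```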
- Copy all files from the `/public` directory of this repo into your `/public` directory
- Re-start your node.js application locally and check that the new front-end is being served (e.g. at http://localhost:6015)
- Commit the changes and deploy the app to Bluemix via the toolchain

==> We now have a functioning front-end for our application
Next, we are going to "build up" the back-end application. We'll start with the simple node.js server that we already have, and add a route (API) to it that will accept the link (to the picture being analyzed) from the front-end.
- Replace your current `app.js` with `app-with-route.js` (still using the name `app.js`)
- Go through the `app.js` source code to understand how the URL is being received by the back-end (a sketch of such a route follows below)
- Replace your current `package.json` with the `package.json` file from this repo
- Install all required npm modules with `npm install --save`
- Re-start your node.js application locally and check that the new back-end is being served (e.g. at http://localhost:6015)
- Commit the changes and deploy the app to Bluemix via the toolchain

==> We now have a back-end that receives the link from the front-end
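To give an idea of what `app-with-route.js` adds, here is a minimal sketch of an Express route that receives the picture URL from the front-end. It is not the repo's actual code; the route name `/api/classify` and the query parameter `url` are assumptions for illustration.

```javascript
// Minimal sketch of a route that accepts the picture URL from the front-end
// (illustrative only; the route name and parameter are assumptions).
app.get('/api/classify', function (req, res) {
  var imageUrl = req.query.url;

  if (!imageUrl) {
    return res.status(400).json({ error: 'No image URL provided' });
  }

  // At this stage the back-end simply echoes the URL back;
  // the call to the Watson service is added in a later step.
  res.json({ received: imageUrl });
});
```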
Next, we need to create an instance of the Watson Visual Recognition service that we will be calling to analyze the picture for us.
- In the Bluemix dashboard, navigate to the app and (in the "Connections" tab) add the Watson service to it
- Copy the `vcap.json.example` file from this repo to your directory and name it `vcap.json`
- Add the Watson API credentials to `vcap.json` (so that we have them accessible when running the app locally; see the sketch below for how the back-end can pick them up)
- Add the Watson service to the `manifest.yml` file

==> We now have the Watson Visual Recognition service connected to our back-end application
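As an illustration of why `vcap.json` is useful locally, the snippet below is a minimal sketch of how a back-end can read the Watson credentials from Bluemix's `VCAP_SERVICES` environment variable when deployed, and fall back to the local `vcap.json` file on a laptop. The service label `watson_vision_combined` and the exact structure of `vcap.json.example` are assumptions here; check the file in this repo for the real layout.

```javascript
// Minimal sketch of loading Watson credentials both on Bluemix and locally
// (illustrative only; the service label and file structure are assumptions).
var fs = require('fs');

function getWatsonCredentials() {
  // On Bluemix, bound service credentials are injected via VCAP_SERVICES;
  // locally, fall back to the vcap.json file copied from vcap.json.example.
  var vcap = process.env.VCAP_SERVICES
    ? JSON.parse(process.env.VCAP_SERVICES)
    : JSON.parse(fs.readFileSync('./vcap.json', 'utf8'));

  return vcap.watson_vision_combined[0].credentials; // e.g. { api_key: '...', url: '...' }
}
```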
Next, we need to add code to pass our URL to the Watson service via its REST API.
- Replace your current `app.js` with the `app.js` from this repo
- Go through the `app.js` source code to understand how the Watson service is being called and how the results are then passed to the front-end (a sketch of such a call follows below)
- Re-start your node.js application locally; you should now have a fully functional visual recognition app (e.g. at http://localhost:6015)
- Commit the changes and deploy the app to Bluemix via the toolchain

==> We're done and have a running Watson-based application on Bluemix
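To round things off, here is a minimal sketch of what a call to the Watson Visual Recognition `classify` endpoint can look like inside the route from the earlier step. It is not the repo's actual `app.js`; the endpoint URL, the API version date and the `getWatsonCredentials()` helper from the previous sketch are assumptions, and the real code may well use the `watson-developer-cloud` npm module instead.

```javascript
// Minimal sketch of calling the Watson Visual Recognition REST API
// (illustrative only; endpoint, version date and helper are assumptions).
var https = require('https');

app.get('/api/classify', function (req, res) {
  var imageUrl = req.query.url;
  var creds = getWatsonCredentials(); // see the earlier credentials sketch

  // Build a classify request with the API key, the image URL and an API version
  var watsonUrl = 'https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classify' +
    '?api_key=' + creds.api_key +
    '&url=' + encodeURIComponent(imageUrl) +
    '&version=2016-05-20';

  https.get(watsonUrl, function (watsonRes) {
    var body = '';
    watsonRes.on('data', function (chunk) { body += chunk; });
    // Pass Watson's JSON response straight back to the front-end
    watsonRes.on('end', function () { res.type('json').send(body); });
  }).on('error', function (err) {
    res.status(500).json({ error: err.message });
  });
});
```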