Setup Node Server
This Node.js server gives you a quick and easy way to set up a REST API service that's integrated with Windows Azure Active Directory for API protection using the OAuth2 protocol. The sample server included in the download is designed to run on any platform.
By the end of these walkthroughs, you should be able to build a running REST API server with the following features:
- A Node.js server running a REST API interface with JSON, using MongoDB as persistent storage
- REST APIs leveraging OAuth2 API protection for endpoints using Windows Azure Active Directory
We've released all of the source code for this running example in GitHub under an Apache 2.0 license, so feel free to clone (or even better, fork!) and provide feedback on the forums.
We will be using Node.js modules in this walkthrough. Modules are loadable JavaScript packages that provide specific functionality for your application. You usually install modules with the Node.js npm command-line tool, but some modules, such as the HTTP module, are included in the core Node.js package. Installed modules are saved in the node_modules directory at the root of your Node.js installation directory. Each module in the node_modules directory maintains its own node_modules directory containing the modules it depends on, and each required module in turn has its own node_modules directory. This recursive directory structure represents the dependency chain.
This dependency chain structure results in a larger application footprint, but it guarantees that all dependencies are met and that the version of the modules used in development will also be used in production. This makes the production app behavior more predictable and prevents versioning problems that might affect users.
To use this sample you will need a Windows Azure Active Directory Tenant. If you're not sure what a tenant is or how you would get one, see What is a Windows Azure AD tenant? and Sign up for Windows Azure as an organization to get you started with using Windows Azure AD.
After you get your Windows Azure AD tenant, add this sample app to your tenant so you can use it to protect your API endpoints. If you need help with this step, see: Register the REST API Service Windows Azure Active Directory
As part of registration you will be asked for your replyURL. For this walkthrough, you should have used http://localhost:8888/auth/provider/callback
To successfully use this sample, you must have a working installation of Node.js.
Install Node.js from http://nodejs.org.
To successfully use this sample, you must have a working installation of MongoDB. We will use MongoDB to make our REST API persistent across server instances.
Install MongoDB from http://mongodb.org.
NOTE: This walkthrough assumes that you use the default installation and server endpoints for MongoDB, which at the time of this writing is: mongodb://localhost
We will be using Restify to build our REST API. Restify is a minimal and flexible Node.js application framework derived from Express that has a robust set of features for building REST APIs on top of Connect.
From the command-line, change directories to the azuread directory. If the azuread directory does not exist, create it.
cd azuread
-or-
mkdir azuread; cd azuread
Type the following command:
npm install restify
This command installs Restify.
When using npm on some operating systems, you may receive an error of Error: EPERM, chmod '/usr/local/bin/..' along with a request to run the command as an administrator. If this occurs, use the sudo command to run npm at a higher privilege level.
You may see something like this when installing Restify:
clang: error: no such file or directory: 'HD/azuread/node_modules/restify/node_modules/dtrace-provider/libusdt'
make: *** [Release/DTraceProviderBindings.node] Error 1
gyp ERR! build error
gyp ERR! stack Error: `make` failed with exit code: 2
gyp ERR! stack at ChildProcess.onExit (/usr/local/lib/node_modules/npm/node_modules/node-gyp/lib/build.js:267:23)
gyp ERR! stack at ChildProcess.EventEmitter.emit (events.js:98:17)
gyp ERR! stack at Process.ChildProcess._handle.onexit (child_process.js:789:12)
gyp ERR! System Darwin 13.1.0
gyp ERR! command "node" "/usr/local/lib/node_modules/npm/node_modules/node-gyp/bin/node-gyp.js" "rebuild"
gyp ERR! cwd /Volumes/Development HD/azuread/node_modules/restify/node_modules/dtrace-provider
gyp ERR! node -v v0.10.11
gyp ERR! node-gyp -v v0.10.0
gyp ERR! not ok
npm WARN optional dep failed, continuing dtrace-provider@0.2.8
Restify provides a powerful mechanism to trace REST calls using DTrace. However, many operating systems do not have DTrace available. You can safely ignore these errors.
The output of this command should appear similar to the following:
restify@2.6.1 node_modules/restify
├── assert-plus@0.1.4
├── once@1.3.0
├── deep-equal@0.0.0
├── escape-regexp-component@1.0.2
├── qs@0.6.5
├── tunnel-agent@0.3.0
├── keep-alive-agent@0.0.1
├── lru-cache@2.3.1
├── node-uuid@1.4.0
├── negotiator@0.3.0
├── mime@1.2.11
├── semver@2.2.1
├── spdy@1.14.12
├── backoff@2.3.0
├── formidable@1.0.14
├── verror@1.3.6 (extsprintf@1.0.2)
├── csv@0.3.6
├── http-signature@0.10.0 (assert-plus@0.1.2, asn1@0.1.11, ctype@0.5.2)
└── bunyan@0.22.0 (mv@0.0.5)
Passport is authentication middleware for Node.js. Extremely flexible and modular, Passport can be unobtrusively dropped into any Express-based or Restify web application. A comprehensive set of strategies supports authentication using a username and password, Facebook, Twitter, and more. We have developed a strategy for Windows Azure Active Directory. We will install this module and then add the Windows Azure Active Directory strategy plug-in.
From the command-line, change directories to the azuread directory.
Enter the following command to install passport.js
npm install passport
The output of the command should appear similar to the following:
passport@0.1.17 node_modules\passport
├── pause@0.0.1
└── pkginfo@0.2.3
Next, we will add the OAuth strategy, using passport-oauth, an OAuth2 handler for Passport. We will also add JWT token handler support by using node-jwt, and install support for Bearer tokens.
NOTE: Although OAuth2 provides a framework in which any known token type can be issued, only certain token types have gained wide-spread use. For protecting endpoints, that has turned out to be Bearer Tokens. Bearer tokens are the most widely issued type of token in OAuth2, and many implementations assume that bearer tokens are the only type of token issued.
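To make the idea concrete, here is a sketch of how a bearer token travels: the client sends it in the Authorization header under the Bearer scheme, and the server splits the scheme from the token before validating it. The token value below is a placeholder, not a real token:

```javascript
// Hypothetical placeholder token -- whoever holds ("bears") it gets access,
// so the token itself is the only credential sent with the request.
var accessToken = 'placeholder-access-token';

// The client presents the token on every request in the Authorization header:
var headers = { 'Authorization': 'Bearer ' + accessToken };

// The server splits the scheme from the token before validating it:
var parts = headers.Authorization.split(' ');
var scheme = parts[0]; // 'Bearer'
var token = parts[1];  // the bearer token itself

console.log(scheme + ' / ' + token);
```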
From the command-line, change directories to the azuread directory.
Type the following commands to install the Passport.js modules:
npm install passport-oauth
npm install passport-http-bearer
npm install node-jwt
The output of the command should appear similar to the following:
ms-passport-wsfed-saml2@0.3.8 node_modules\passport-oauth
├── xtend@2.0.3
├── xml-crypto@0.0.9
├── xmldom@0.1.13
└── xml2js@0.1.14 (sax@0.5.2)
We will be using MongoDB as our datastore. For that reason, we need to install both the widely used plug-in for managing models and schemas, called Mongoose, and the database driver for MongoDB, also called MongoDB.
npm install mongoose
npm install mongodb
Next, we'll install the remaining required modules.
From the command-line, change directories to the azuread folder if not already there:
cd azuread
Enter the following commands to install the following modules in your node_modules directory:
npm install crypto
npm install assert-plus
npm install posix-getopt
npm install util
npm install path
npm install connect
npm install xml-crypto
npm install xml2js
npm install xmldom
npm install async
npm install request
npm install underscore
npm install grunt-contrib-jshint@0.1.1
npm install grunt-contrib-nodeunit@0.1.2
npm install grunt-contrib-watch@0.2.0
npm install grunt@0.4.1
npm install xtend@2.0.3
npm install bunyan
npm update
The server.js file will provide the majority of the functionality for our Web API server, and we will be adding most of our code to this file. For production purposes you would refactor this functionality into smaller files, such as separate routes and controllers; for the purposes of this demo we will keep it all in server.js.
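As a hypothetical sketch of that refactoring, a routes module would export a function that registers its handlers on the server, and server.js would wire each module up with one require call. The file name routes/tasks.js and the registerTaskRoutes function below are invented for illustration, and a plain object stands in for the Restify server so the sketch is self-contained:

```javascript
// In a real refactoring, routes/tasks.js would contain:
//   module.exports = function (server) { server.get('/tasks', listTasks); ... };
// and server.js would call require('./routes/tasks')(server);
var registered = [];

// A plain object standing in for the Restify server.
var fakeServer = {
  get: function (path, handler) { registered.push('GET ' + path); },
  post: function (path, handler) { registered.push('POST ' + path); }
};

// What a hypothetical routes/tasks.js module would export:
function registerTaskRoutes(server) {
  server.get('/tasks', function listTasks(req, res, next) { next(); });
  server.post('/tasks/:name/:task', function createTask(req, res, next) { next(); });
}

registerTaskRoutes(fakeServer);
console.log(registered); // [ 'GET /tasks', 'POST /tasks/:name/:task' ]
```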
From the command-line, change directories to the azuread folder if not already there:
cd azuread
Create a server.js file in your favorite editor and add the following information:
'use strict';
/**
* Module dependencies.
*/
var fs = require('fs');
var path = require('path');
var util = require('util');
var assert = require('assert-plus');
var bunyan = require('bunyan');
var getopt = require('posix-getopt');
var mongoose = require('mongoose/');
var restify = require('restify');
Save the file. We will return to it shortly.
This code file passes the configuration parameters from your Windows Azure Active Directory Portal to Passport.js. You created these configuration values when you added the Web API to the portal in the first part of the walkthrough. We will explain what to put in the values of these parameters after you've copied the code.
From the command-line, change directories to the azuread folder if not already there:
cd azuread
Create a config.js file in your favorite editor and add the following information:
// Don't commit this file to your public repos
exports.creds = {
mongoose_auth_local: 'mongodb://localhost/tasklist', // Your mongo auth uri goes here
token_endpoint: '',
auth_endpoint: '',
client_secret: '', // this is the Secret you generated when configuring your Web API app on Azure AAD
// required options
federation_metadata: '', // this is the metadata URL from the AAD Portal
loginCallback: '', // this is the Callback URI you entered for APP ID URI when configuring your Web API app on Azure AAD
issuer: '', // this is the URI you entered for APP ID URI when configuring your Web API app on Azure AAD
client_id: '' // this is the Client ID you received after configuring your Web API app on Azure AAD
}
Required Values
FederationMetadata: Enter the URL of the federation metadata document.
To find the federation metadata document URL in the Windows Azure Management Portal, click Active Directory, click Integrated Apps, and in the black menu bar at the bottom of the page, click View endpoints. Then, copy the value of the Federation Metadata Document. The code will do its best to derive TokenEndpoint and AuthEndpoint from this metadata; if you have problems, you can enter these values yourself by looking at the metadata XML in a web browser.
NOTE: We will add the code to parse the Federation Metadata in Step 13, known as aadutils.js.
LoginCallback: Enter the physical address of your app. Windows Azure AD sends the response for authenticated users to this URL. For this walkthrough, you should have used http://localhost:8888/auth/provider/callback. To find this value in the Windows Azure Management Portal, click Active Directory, click Integrated Apps, click your app, and click Configure. The App ID URI is at the bottom of the page in the Single Sign-On section.
If you used a different LoginCallback when configuring your application in the previous steps, you can change it. See: Updating an App
issuer: Enter the App ID URI of your application. To find this value in the Windows Azure Management Portal, click Active Directory, click Integrated Apps, click your app, and click Configure. The App ID URI is at the bottom of the page in the Single Sign-On section. See: Updating an App
ClientID: This is the Client ID of your application which you saw in the portal when you first registered your application.
ClientSecret: The secret you generated when configuring your Web API app in the portal. It is required if you are using flows such as Client Credentials.
We need to read these values from the config.js file you just created throughout our application. To do this, we simply add config.js as a required resource in our application and then set global variables from the values in the config.js document.
From the command-line, change directories to the azuread folder if not already there:
cd azuread
Open your server.js file in your favorite editor and add the following information:
var config = require('./config');
Then, add a new section to server.js
with the following code:
/**
* Setup some configuration
*/
var serverPort = process.env.PORT || 8888;
var serverURI = ( process.env.PORT ) ? config.creds.mongoose_auth_mongohq : config.creds.mongoose_auth_local;
var tokenEndpoint = config.creds.token_endpoint;
var authEndpoint = config.creds.auth_endpoint;
var fedEndpoint = config.creds.federation_metadata;
var clientID = config.creds.client_id;
var clientSecret = config.creds.client_secret;
var callbackURL = config.creds.loginCallback;
Since the goal is to keep only application logic in the server.js file, it makes sense to put helper methods in a separate file. These methods simply help us parse the Federation Metadata and do not relate to the core scenario, so it's better to tuck them away. We will be adding more to this file as we go through the walkthrough.
From the command-line, change directories to the azuread folder if not already there:
cd azuread
Create an aadutils.js file in your favorite editor and add the following information:
'use strict';
var xml2js = require('xml2js');
var request = require('request');
var aadutils = require('./aadutils');
var async = require('async');
exports.getElement = function (parentElement, elementName) {
if (parentElement['saml:' + elementName]) {
return parentElement['saml:' + elementName];
} else if (parentElement['samlp:'+elementName]) {
return parentElement['samlp:'+elementName];
}
return parentElement[elementName];
};
exports.getFirstElement = function (parentElement, elementName) {
var element = null;
if (parentElement['saml:' + elementName]) {
element = parentElement['saml:' + elementName];
} else if (parentElement['samlp:'+elementName]) {
element = parentElement['samlp:'+elementName];
} else {
element = parentElement[elementName];
}
return Array.isArray(element) ? element[0] : element;
};
var Metadata = function (url) {
if(!url) {
throw new Error("Metadata: url is a required argument");
}
this.url = url;
this.metadata = null;
};
Object.defineProperty(Metadata, 'url', {
get: function () {
return this.url;
}
});
Object.defineProperty(Metadata, 'saml', {
get: function () {
return this.saml;
}
});
Object.defineProperty(Metadata, 'wsfed', {
get: function () {
return this.wsfed;
}
});
Object.defineProperty(Metadata, 'oauth', {
get: function () {
return this.oauth;
}
});
Object.defineProperty(Metadata, 'metadata', {
get: function () {
return this.metadata;
}
});
Metadata.prototype.updateSamlMetadata = function(doc, next) {
try {
this.saml = {};
var entity = aadutils.getElement(doc, 'EntityDescriptor');
var idp = aadutils.getElement(entity, 'IDPSSODescriptor');
var signOn = aadutils.getElement(idp[0], 'SingleSignOnService');
var signOff = aadutils.getElement(idp[0], 'SingleLogoutService');
var keyDescriptor = aadutils.getElement(idp[0], 'KeyDescriptor');
this.saml.loginEndpoint = signOn[0].$.Location;
this.saml.logoutEndpoint = signOff[0].$.Location;
// copy the x509 certs from the metadata
this.saml.certs = [];
for (var j=0;j<keyDescriptor.length;j++) {
this.saml.certs.push(keyDescriptor[j].KeyInfo[0].X509Data[0].X509Certificate[0]);
}
next(null);
} catch (e) {
next(new Error('Invalid SAMLP Federation Metadata ' + e.message));
}
};
Metadata.prototype.updateWsfedMetadata = function(doc, next) {
try {
this.wsfed = {};
var entity = aadutils.getElement(doc, 'EntityDescriptor');
var roles = aadutils.getElement(entity, 'RoleDescriptor');
for(var i = 0; i < roles.length; i++) {
var role = roles[i];
if(role['fed:SecurityTokenServiceEndpoint']) {
var endpoint = role['fed:SecurityTokenServiceEndpoint'];
var endPointReference = aadutils.getFirstElement(endpoint[0],'EndpointReference');
this.wsfed.loginEndpoint = aadutils.getFirstElement(endPointReference,'Address');
var keyDescriptor = aadutils.getElement(role, 'KeyDescriptor');
// copy the x509 certs from the metadata
this.wsfed.certs = [];
for (var j=0;j<keyDescriptor.length;j++) {
this.wsfed.certs.push(keyDescriptor[j].KeyInfo[0].X509Data[0].X509Certificate[0]);
}
break;
}
}
return next(null);
} catch (e) {
next(new Error('Invalid WSFED Federation Metadata ' + e.message));
}
};
Metadata.prototype.updateOAuthMetadata = function(doc, next) {
try {
this.oauth = {};
var entity = aadutils.getElement(doc, 'EntityDescriptor');
var roles = aadutils.getElement(entity, 'RoleDescriptor');
for(var i = 0; i < roles.length; i++) {
var role = roles[i];
if(role['fed:SecurityTokenServiceEndpoint']) {
var endpoint = role['fed:SecurityTokenServiceEndpoint'];
var endPointReference = aadutils.getFirstElement(endpoint[0],'EndpointReference');
this.oauth.loginEndpoint = aadutils.getFirstElement(endPointReference,'Address');
var keyDescriptor = aadutils.getElement(role, 'KeyDescriptor');
// copy the x509 certs from the metadata
this.oauth.certs = [];
for (var j=0;j<keyDescriptor.length;j++) {
this.oauth.certs.push(keyDescriptor[j].KeyInfo[0].X509Data[0].X509Certificate[0]);
}
break;
}
}
return next(null);
} catch (e) {
next(new Error('Invalid OAuth Federation Metadata ' + e.message));
}
};
Metadata.prototype.fetch = function(callback) {
var self = this;
async.waterfall([
// fetch the Federation metadata for the AAD tenant
function(next){
request(self.url, function (err, response, body) {
if(err) {
next(err);
} else if(response.statusCode !== 200) {
next(new Error("Error:" + response.statusCode + " Cannot get AAD Federation metadata from " + self.url));
} else {
next(null, body);
}
});
},
function(body, next){
// parse the AAD Federation metadata xml
var parser = new xml2js.Parser({explicitRoot:true});
// Note: xml responses from Azure AAD have a leading \ufeff which breaks xml2js parser!
parser.parseString(body.replace("\ufeff", ""), function (err, data) {
self.metadata = data;
next(err);
});
},
function(next){
// update the SAML SSO endpoints and certs from the metadata
self.updateSamlMetadata(self.metadata, next);
},
function(next){
// update the WS-Federation endpoints and certs from the metadata
self.updateWsfedMetadata(self.metadata, next);
},
function(next){
// update the OAuth endpoints and certs from the metadata
self.updateOAuthMetadata(self.metadata, next);
}
], function (err) {
// return err or success (err === null) to callback
callback(err);
});
};
exports.Metadata = Metadata;
As you can see from the code, it simply takes the FederationMetadata URL you passed in config.js and then parses it for the information we will use in the server.js file. You are more than welcome to investigate this code and expand it if needed.
We need to tell our server where to find the methods you just wrote.
From the command-line, change directories to the azuread folder if not already there:
cd azuread
Open your server.js file in your favorite editor and add the following information:
var aadutils = require('./aadutils');
Next, add to the end of the Configuration
section this call to send the metadata document in our config.js
to the parser we just wrote:
this.aadutils = new aadutils.Metadata(config.creds.federation_metadata);
Now all this preparation is going to start paying off as we weave these three files together into a REST API service.
For this walkthrough we will be using MongoDB to store our Tasks as discussed in Step 4.
If you recall from the config.js file we created in Step 11, we called our database tasklist, as that was what we put at the end of our mongoose_auth_local connection URL. You don't need to create this database beforehand; MongoDB will create it on the first run of our server application (assuming it does not already exist).
Now that we've told the server what MongoDB database we'd like to use, we need to write some additional code to create the model and schema for our server's Tasks.
Our schema model is very simple, and you can expand it as required.
NAME - The name of who is assigned to the task. A String
TASK - The task itself. A String
DATE - The date that the task is due. A DATETIME
COMPLETED - If the Task is completed or not. A BOOLEAN
From the command-line, change directories to the azuread folder if not already there:
cd azuread
Open your server.js file in your favorite editor and add the following information below the configuration entry:
/**
*
* Connect to MongoDB
*/
global.db = mongoose.connect(serverURI);
var Schema = mongoose.Schema;
This will connect to the MongoDB server and hand us back Mongoose's Schema constructor.
Below the code you wrote above, add the following code:
/**
* Here we create a schema to store our tasks. Pretty simple schema for now.
*/
var TaskSchema = new Schema({
name: String,
task: String,
completed: Boolean,
date: Date
});
// Use the schema to register a model
mongoose.model('Task', TaskSchema);
var Task = mongoose.model('Task');
As you can tell from the code, we create our Schema and then create a model object we will use to store our data throughout the code when we define our Routes.
Now that we have a database model to work with, let's add the routes we will use for our REST API server.
Routes in Restify work in exactly the same way they do in the Express stack. You define routes using the URI that you expect the client applications to call. Usually, you define your routes in a separate file. For our purposes, we will put our routes in the server.js file; we recommend you factor them into their own file for production use.
A typical pattern for a Restify Route is:
function createObject(req, res, next) {
// do work on Object
_object.name = req.params.object; // passed value is in req.params under object
///...
return next(); // keep the server going
}
....
server.post('/service/:add/:object', createObject); // calls createObject on routes that match this.
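The :add-style placeholders in a route URI are matched against the incoming path, and each captured segment lands in req.params. Here is a toy sketch of that idea; the matchRoute function is ours for illustration only, not Restify's actual implementation:

```javascript
// Toy route matcher: compares a '/service/:add/:object' pattern against a
// concrete URL path, capturing each ':'-prefixed segment into a params object.
function matchRoute(pattern, url) {
  var patternParts = pattern.split('/');
  var urlParts = url.split('/');
  if (patternParts.length !== urlParts.length) return null;
  var params = {};
  for (var i = 0; i < patternParts.length; i++) {
    if (patternParts[i].charAt(0) === ':') {
      // Placeholder segment: capture the URL segment under the placeholder name.
      params[patternParts[i].slice(1)] = urlParts[i];
    } else if (patternParts[i] !== urlParts[i]) {
      // Literal segment mismatch: this route does not apply.
      return null;
    }
  }
  return params;
}

console.log(matchRoute('/service/:add/:object', '/service/new/widget'));
// { add: 'new', object: 'widget' }
```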
This is the pattern at the most basic level. Restify (and Express) provide much deeper functionality, such as defining application types and doing complex routing across different endpoints. For our purposes, we will keep these routes very simple.
We will now add the basic CRUD routes of Create, Retrieve, Update, and Delete.
From the command-line, change directories to the azuread folder if not already there:
cd azuread
Open your server.js file in your favorite editor and add the following information below the database entries you made above:
/**
*
* APIs
*/
function createTask(req, res, next) {
// Restify currently has a bug which doesn't allow you to set default headers
// These headers comply with CORS and allow us to serve our response to any origin
res.header("Access-Control-Allow-Origin", "*");
res.header("Access-Control-Allow-Headers", "X-Requested-With");
// Create a new task model, fill it up and save it to Mongodb
var _task = new Task();
if (!req.params.task) {
req.log.warn('createTask: missing task');
next(new MissingTaskError());
return;
}
/* if (Task.find(req.params.task)) {
req.log.warn('%s already exists', req.params.task);
next(new TaskExistsError(req.params.task));
return;
}*/
_task.name = req.params.name;
_task.task = req.params.task;
_task.date = new Date();
_task.save(function (err) {
if (err) {
req.log.warn(err, 'createTask: unable to save');
next(err);
} else {
res.send(201, _task);
next();
}
});
}
/**
* Deletes a Task by name
*/
function removeTask(req, res, next) {
Task.remove( { task:req.params.task }, function (err) {
if (err) {
req.log.warn(err,
'removeTask: unable to delete %s',
req.params.task);
next(err);
} else {
res.send(204);
next();
}
});
}
/**
* Deletes all Tasks. A wipe
*/
function removeAll(req, res, next) {
Task.remove({}, function (err) {
if (err) {
req.log.warn(err, 'removeAll: unable to delete tasks');
next(err);
} else {
res.send(204);
next();
}
});
}
/**
* Gets a Task by name
*/
function getTask(req, res, next) {
Task.find({ name: req.params.name }, function (err, data) {
if (err) {
req.log.warn(err, 'getTask: unable to read %s', req.params.name);
next(err);
return;
}
res.json(data);
next();
});
}
/**
* Simply returns the list of TODOs that were loaded.
*
*/
function listTasks(req, res, next) {
// Restify currently has a bug which doesn't allow you to set default headers
// These headers comply with CORS and allow us to serve our response to any origin
res.header("Access-Control-Allow-Origin", "*");
res.header("Access-Control-Allow-Headers", "X-Requested-With");
console.log("server getTasks");
Task.find().limit(20).sort('date').exec(function (err, data) {
if (err) {
return next(err);
}
if (!data.length) {
console.log("There are no tasks in the database. Did you initialize the database as stated in the README?");
} else {
console.log(data);
}
res.json(data);
next();
});
}
It makes sense to add some error handling so we can communicate back to the client the problem we encountered in a way it can understand.
Add the following code underneath the code you've written above:
///--- Errors for communicating something interesting back to the client
function MissingTaskError() {
restify.RestError.call(this, {
statusCode: 409,
restCode: 'MissingTask',
message: '"task" is a required parameter',
constructorOpt: MissingTaskError
});
this.name = 'MissingTaskError';
}
util.inherits(MissingTaskError, restify.RestError);
function TaskExistsError(name) {
assert.string(name, 'name');
restify.RestError.call(this, {
statusCode: 409,
restCode: 'TaskExists',
message: name + ' already exists',
constructorOpt: TaskExistsError
});
this.name = 'TaskExistsError';
}
util.inherits(TaskExistsError, restify.RestError);
function TaskNotFoundError(name) {
assert.string(name, 'name');
restify.RestError.call(this, {
statusCode: 404,
restCode: 'TaskNotFound',
message: name + ' was not found',
constructorOpt: TaskNotFoundError
});
this.name = 'TaskNotFoundError';
}
util.inherits(TaskNotFoundError, restify.RestError);
We have our database defined, we have our routes in place, and the last thing to do is add our server instance that will manage our calls.
Restify (and Express) have a lot of deep customization you can do for a REST API server, but again we will use the most basic setup for our purposes.
/**
* Our Server
*/
var server = restify.createServer({
name: "Windows Azure Active Directory TODO Server",
version: "1.0.0",
formatters: {
'application/json': function(req, res, body){
if(req.params.callback){
var callbackFunctionName = req.params.callback.replace(/[^A-Za-z0-9_\.]/g, '');
return callbackFunctionName + "(" + JSON.stringify(body) + ");";
} else {
return JSON.stringify(body);
}
},
'text/html': function(req, res, body){
if (body instanceof Error)
return body.stack;
if (Buffer.isBuffer(body))
return body.toString('base64');
return util.inspect(body);
},
'application/x-www-form-urlencoded': function(req, res, body){
if (body instanceof Error) {
res.statusCode = body.statusCode || 500;
body = body.message;
} else if (typeof (body) === 'object') {
body = body.task || JSON.stringify(body);
} else {
body = body.toString();
}
res.setHeader('Content-Length', Buffer.byteLength(body));
return (body);
}
}
});
// Ensure we don't drop data on uploads
server.pre(restify.pre.pause());
// Clean up sloppy paths like //todo//////1//
server.pre(restify.pre.sanitizePath());
// Handles annoying user agents (curl)
server.pre(restify.pre.userAgentConnection());
// Set a per request bunyan logger (with requestid filled in)
server.use(restify.requestLogger());
// Allow 5 requests/second by IP, and burst to 10
server.use(restify.throttle({
burst: 10,
rate: 5,
ip: true,
}));
// Use the common stuff you probably want
server.use(restify.acceptParser(server.acceptable));
server.use(restify.dateParser());
server.use(restify.queryParser());
server.use(restify.gzipResponse());
/// Now the real handlers. Here we just CRUD
server.get('/tasks', listTasks);
server.head('/tasks', listTasks);
server.get('/tasks/:name', getTask);
server.head('/tasks/:name', getTask);
server.post('/tasks/:name/:task', createTask);
server.del('/tasks/:name/:task', removeTask);
server.del('/tasks/:name', removeTask);
server.del('/tasks', removeAll, function respond(req, res, next) { res.send(204); next(); });
// Register a default '/' handler
server.get('/', function root(req, res, next) {
var routes = [
'GET /',
'POST /tasks/:name/:task',
'GET /tasks',
'PUT /tasks/:name',
'GET /tasks/:name',
'DELETE /tasks/:name/:task'
];
res.send(200, routes);
next();
});
server.listen(serverPort, function() {
var consoleMessage = '\n Windows Azure Active Directory Tutorial'
consoleMessage += '\n +++++++++++++++++++++++++++++++++++++++++++++++++++++'
consoleMessage += '\n %s server is listening at %s';
consoleMessage += '\n Open your browser to %s/tasks\n';
consoleMessage += '+++++++++++++++++++++++++++++++++++++++++++++++++++++ \n'
consoleMessage += '\n !!! why not try a $curl -isS %s | json to get some ideas? \n'
consoleMessage += '+++++++++++++++++++++++++++++++++++++++++++++++++++++ \n\n'
console.log(consoleMessage, server.name, server.url, server.url, server.url);
});
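The application/json formatter in the server setup above doubles as a JSONP formatter: when a client passes a ?callback= query parameter, the response body is wrapped in a call to that function, and the replace() strips any characters that aren't legal identifier characters so a hostile callback value cannot inject script. A quick standalone sketch of that sanitization (the jsonpBody helper is ours, not part of Restify):

```javascript
// Wrap a JSON body in a JSONP callback, stripping unsafe characters from the
// client-supplied callback name exactly as the formatter above does.
function jsonpBody(callbackParam, body) {
  var callbackFunctionName = callbackParam.replace(/[^A-Za-z0-9_\.]/g, '');
  return callbackFunctionName + '(' + JSON.stringify(body) + ');';
}

console.log(jsonpBody('handleTasks', { task: 'Hello' }));
// handleTasks({"task":"Hello"});

// A malicious callback value loses its dangerous characters:
console.log(jsonpBody('alert(1);//x', { task: 'Hello' }));
// alert1x({"task":"Hello"});
```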
It's a good idea to make sure we have no mistakes before we continue on to the OAuth part of the walkthrough.
The easiest way to do this is by using curl at the command line. Before we do that, we need a simple utility that lets us parse output as JSON. Install the json tool, as all the examples below use it:
$ npm install -g jsontool
This installs the JSON tool globally. Now that we've accomplished that, let's play with the server.
First, make sure that your MongoDB instance is running:
$ sudo mongod
Then, change to the directory and start curling:
$ cd azuread
$ node server.js
$ curl -isS http://127.0.0.1:8888 | json
HTTP/1.1 200 OK
Connection: close
Content-Type: application/x-www-form-urlencoded
Content-Length: 145
Date: Wed, 29 Jan 2014 03:41:24 GMT
[
"GET /",
"POST /tasks/:name/:task",
"GET /tasks",
"DELETE /tasks",
"PUT /tasks/:name",
"GET /tasks/:name",
"DELETE /tasks/:task"
]
Then, we can add a task this way:
$ curl -isS -X POST http://127.0.0.1:8888/tasks/brandon/Hello
The response should be:
HTTP/1.1 201 Created
Connection: close
Access-Control-Allow-Origin: *
Access-Control-Allow-Headers: X-Requested-With
Content-Type: application/x-www-form-urlencoded
Content-Length: 5
Date: Tue, 04 Feb 2014 01:02:26 GMT
Hello
And we can list tasks for Brandon this way:
$ curl -isS http://127.0.0.1:8888/tasks/brandon/
If all this works out, we are ready to add OAuth to the REST API server.
Now that we have a running REST API (congrats, btw!) let's get to making it useful against Windows Azure AD.
From the command-line, change directories to the azuread folder if not already there:
cd azuread
Open your server.js file in your favorite editor and add the following information below where you previously listed the modules to load. This is toward the top of the file, right after the var aadutils = require('./aadutils'); import.
/*
*
* Load our old friend Passport for OAuth2 flows
*/
var passport = require('passport')
, OAuth2Strategy = require('passport-oauth').OAuth2Strategy;
Open your server.js file in your favorite editor and add the following information below the server.get() calls where you defined your routes, but above the server.listen() method.
We need to tell Restify to begin using its authorizationParser()
and look at the contents of the Authorization header.
server.use(restify.authorizationParser());
server.use(restify.bodyParser({ mapParams: false }));
Here we use the OAuth2 parameters we added to the config.js file. If our aadutils.js file did its job parsing the Federation Metadata document, all of these values will have been populated for us even if they were left blank in config.js.
// Now our own handlers for authentication/authorization
// Here we only use Oauth2 from Passport.js
passport.use('provider', new OAuth2Strategy({
authorizationURL: authEndpoint,
tokenURL: tokenEndpoint,
clientID: clientID,
clientSecret: clientSecret,
callbackURL: callbackURL
},
function(accessToken, refreshToken, profile, done) {
User.findOrCreate({ UserId: profile.id }, function(err, user) {
done(err, user);
});
}
));
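The verify callback above calls User.findOrCreate(), which is not something Mongoose provides out of the box; the sample assumes you supply it. Here is a minimal sketch of the contract it must satisfy, using a hypothetical in-memory store standing in for a MongoDB query:

```javascript
// Hypothetical in-memory store; in the sample this would be a MongoDB query.
var users = {};

var User = {
  // Look the user up by UserId, create a record if none exists, then invoke
  // the callback with (err, user) -- the shape Passport's verify callback expects.
  findOrCreate: function (query, callback) {
    var id = query.UserId;
    if (!users[id]) {
      users[id] = { UserId: id };
    }
    callback(null, users[id]);
  }
};

// Same call shape as in the verify callback above; err is null on success.
User.findOrCreate({ UserId: 'brandon' }, function (err, user) {
  console.log(user.UserId);
});
```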
// Let's start using Passport.js
server.use(passport.initialize());
// Redirect the user to the OAuth 2.0 provider for authentication. When
// complete, the provider will redirect the user back to the application at
// /auth/provider/callback
server.get('/auth/provider', passport.authenticate('provider'));
// The OAuth 2.0 provider has redirected the user back to the application.
// Finish the authentication process by attempting to obtain an access
// token. If authorization was granted, the user will be logged in.
// Otherwise, authentication has failed.
server.get('/auth/provider/callback',
passport.authenticate('provider', { successRedirect: '/',
failureRedirect: '/login' }));
// Simple route middleware to ensure user is authenticated.
// Use this route middleware on any resource that needs to be protected. If
// the request is authenticated (typically via a persistent login session),
// the request will proceed. Otherwise, the user will be redirected to the
// login page.
var ensureAuthenticated = function(req, res, next) {
if (req.isAuthenticated()) {
return next();
}
res.redirect('/login');
};
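You can convince yourself of the middleware's behavior without a browser by driving it with stubbed request/response objects (the stubs below are for illustration only and are not part of the sample):

```javascript
// ensureAuthenticated, as defined in server.js above.
var ensureAuthenticated = function (req, res, next) {
  if (req.isAuthenticated()) {
    return next();
  }
  res.redirect('/login');
};

// Hypothetical stubs recording what the middleware does with each request.
var calls = [];
var authedReq = { isAuthenticated: function () { return true; } };
var anonReq = { isAuthenticated: function () { return false; } };
var res = { redirect: function (url) { calls.push(url); } };

ensureAuthenticated(authedReq, res, function () { calls.push('next'); });
ensureAuthenticated(anonReq, res, function () { calls.push('next'); });

console.log(calls); // prints [ 'next', '/login' ]
```

An authenticated request falls through to the route handler; an anonymous one is redirected to /login.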
// Passport session setup.
// To support persistent login sessions, Passport needs to be able to
// serialize users into and deserialize users out of the session. Typically,
// this will be as simple as storing the user ID when serializing, and finding
// the user by ID when deserializing.
passport.serializeUser(function(user, done) {
done(null, user.email);
});
passport.deserializeUser(function(id, done) {
findByEmail(id, function (err, user) {
done(err, user);
});
});
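The deserializer above calls a findByEmail() helper that the walkthrough assumes you have defined. A minimal in-memory sketch of what it must do (the store here is hypothetical; in the sample you would query MongoDB):

```javascript
// Hypothetical in-memory user store keyed by email address.
var usersByEmail = {
  'brandon@example.com': { email: 'brandon@example.com', name: 'Brandon' }
};

// Look a user up by email and invoke the callback with (err, user),
// passing null for the user when no record is found.
var findByEmail = function (email, callback) {
  var user = usersByEmail[email] || null;
  callback(null, user);
};

findByEmail('brandon@example.com', function (err, user) {
  console.log(user.name); // prints Brandon
});
```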
You protect endpoints by specifying the passport.authenticate() call with the protocol you wish to use.
Let's edit our route in our server code to do something more interesting:
server.get('/tasks', passport.authenticate('provider', { session: false }), listTasks);
Let's use curl again to verify that our endpoints now have OAuth2 protection. We will do this before running any of our client SDKs against this endpoint. The headers returned should be enough to tell us we are on the right path.
First, make sure that your MongoDB instance is running:
$ sudo mongod
Then, change to the sample directory and start the server:
$ cd azuread
$ node server.js
Try a basic GET:
$ curl -isS http://127.0.0.1:8888/tasks/
HTTP/1.1 302 Moved Temporarily
Connection: close
Location: https://login.windows.net/468a75f4-f9a7-4dc4-a527-4f4522734790/oauth2/authorize?response_type=code&redirect_uri=&client_id=123
Content-Length: 0
Date: Tue, 04 Feb 2014 02:15:14 GMT
A 302 is the response you are looking for here, as that indicates that the Passport layer is trying to redirect to the authorize endpoint, which is exactly what you want.
You've gone as far as you can with this server without using an OAuth2-compatible client; to continue, you will need to go through an additional walkthrough.
If you were just looking for information on how to implement a REST API using Restify and OAuth2, you have more than enough code to keep developing your service and learning how to build on this example.
If you are interested in the next steps in your ADAL journey, here are some supported ADAL clients we recommend you continue with. Simply clone one down to your developer machine and configure it as described in its walkthrough.