Here you can find answers to many questions you may have about the project. If your question is not answered here, feel free to raise an issue.
After you fork/clone the project, you can run it locally with `node index.js` in the `scores` folder. You should include a `.env` file that sets the necessary environment variables. Mine is listed below (with sensitive values hidden, of course):
```
MONGODB_CONNECTION_STRING=mongodb://nodecosmos:LALALALA@myurlcosmos.documents.azure.com:10255/?ssl=true&replicaSet=globaldb
PORT=3000
AZURE_FUNCTIONS_RUNTIME=false
NODE_ENV=development
```
Moreover, you can check here for details on how to run the Functions runtime locally and test the project within its context (the method I personally prefer). After you install the Azure Functions Core Tools, simply run `func host start` in the Function app's root directory (for our project it's the `scoresFunctionApp` directory) and your Function will start accepting requests. Don't forget to create a `local.settings.json` file that contains your environment variables in the `leaderboardsFunctionApp` folder. Here is a sample file:
```json
{
    "IsEncrypted": false,
    "Values": {
        "MONGODB_CONNECTION_STRING": "mongodb://nodecosmos:LALALALA@myurlcosmos.documents.azure.com:10255/?ssl=true&replicaSet=globaldb",
        "AZURE_FUNCTIONS_RUNTIME": "true",
        "NODE_ENV": "development"
    },
    "Host": {
        "LocalHttpPort": 7071,
        "CORS": "*"
    }
}
```
We use the mocha test framework and the chai assertion library. To execute the tests, just run `npm test` at the shell prompt. The tests run against a test database (its name is set in the `config.js` file). Also, do not forget to create a `.env` file in the `scores` directory and assign your environment variables there. Below is a sample `.env` file used for development and testing:
```
MONGODB_CONNECTION_STRING=mongodb://nodecosmos:PASSWORD@nodecosmos.documents.azure.com:10255/?ssl=true&replicaSet=globaldb
PORT=3000
AZURE_FUNCTIONS_RUNTIME=false
NODE_ENV=development
```
Azure Storage Explorer is a free and cross-platform tool that allows you to browse your Azure Storage accounts as well as your Cosmos DB databases. You can also use familiar MongoDB related tools, like MongoChef and Robomongo.
If you want to protect your game leaderboards from unauthorized access, you should implement an appropriate mechanism. Azure App Service (the service that Azure Functions is built on) has an excellent implementation that you can use to protect your backend, documented here. To use it in this Functions app, comment the appropriate lines in the `authhelper.js` file.
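As a rough illustration only (a hypothetical sketch, not the project's actual `authhelper.js` implementation), such a check could rely on the identity headers that App Service Authentication injects into requests it has already authenticated:

```javascript
// Hypothetical sketch: Express-style middleware that rejects requests lacking
// the x-ms-client-principal-id header, which App Service Authentication adds
// to authenticated requests. Header names are lowercased by Node's HTTP parser.
function requireAuthenticatedUser(req, res, next) {
  if (!req.headers['x-ms-client-principal-id']) {
    return res.status(401).json({ error: 'Unauthorized' });
  }
  next();
}

module.exports = requireAuthenticatedUser;
```

You would register such a middleware before your protected routes so unauthenticated calls never reach the controllers.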
Originally I was using the expressjs/compression middleware. However, I encountered some instability during local development and I'm not sure why (maybe it doesn't play well with the Azure Functions runtime?). Give it a shot and let me know if it works for you! If you use a reverse proxy, you may want to delegate this CPU-intensive work to it; the same goes if you want to use a certificate for SSL connections (shameless plug: check here for another article of mine on how to easily configure SSL for a Kubernetes cluster).
- Add the desired route in the `api/routes/leaderboardsRoutes.js` file
- The route you just added should correspond to a method in `api/controllers/leaderboardsController.js`
- You may wish to use one of the helper methods in `api/controllers/controllerHelpers.js`
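To make the steps concrete, here is a minimal, hypothetical sketch of wiring a new GET endpoint. The route path, controller name, and handler payload are made up for illustration, and both pieces are inlined here so the example is self-contained (in the project they would live in the routes and controllers files named above):

```javascript
// Hypothetical sketch: a controller method plus its route registration.
const leaderboardsController = {
  // Express-style (req, res) handler responding with a made-up payload.
  getTopPlayer: function (req, res) {
    res.json({ player: 'dimitris', score: 42 });
  }
};

// The routes file exports a function that receives the Express app
// and registers routes on it.
function registerRoutes(app) {
  app.route('/api/players/top').get(leaderboardsController.getTopPlayer);
}

module.exports = { registerRoutes, leaderboardsController };
```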
Me too, it's awesome, isn't it? If you don't know Postman, it's a free app to test your APIs, highly recommended. To get started, you can find and import my set of requests from the `various/nodeleaderboardscores.postman_collection.json` file. To get this working, you need to create a `{{URL}}` Postman variable (details).
Easy! Find the relevant JavaScript file in the `api/models` folder and update it to your preferences. Added fields/properties will 'automagically' be persisted in the database.
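For example, adding a hypothetical `platform` field to a Mongoose-style schema definition object might look like the sketch below (the field names are illustrative, not the project's actual schema):

```javascript
// Hypothetical sketch: the kind of field definitions you might add to a
// Mongoose schema object in api/models. 'platform' is a made-up new field;
// once it is part of the schema, saved documents will persist it.
const scoreSchemaFields = {
  value: { type: Number, required: true },
  createdAt: { type: Date, default: Date.now },
  // New field added for illustration:
  platform: { type: String, default: 'unknown' }
};

module.exports = scoreSchemaFields;
```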
Sure, if you want to contribute via a pull request, go ahead! For bugs/features/complaints, I would be really grateful if you reported them here.
Since your API is stateless, you need a shared store to preserve state in order to properly rate-limit client requests and protect your API. A good option is one of these excellent Express modules: strict-rate-limiter, express-brute, or rate-limiter. As a fast and scalable backend for these modules (or if you just want something to cache your data), we recommend the Azure Redis Cache service.
Cosmos DB is a multi-tenant database. As such, you don't rent a Cosmos DB server with specific hardware specifications. When you use Cosmos DB, you pay for something called 'Request Units' (RUs) which is a measure of the computing resources that are needed in order to serve a specific client request. To find out more about RUs, check the official documentation here.
For the most up-to-date pricing details, check out the official pricing documentation here. Be aware that the collection created by the project uses 1,000 RUs (this is the default); you can modify it via the Azure Portal or programmatically and scale it down to 400 RUs (the minimum) if you don't need the extra horsepower. If you do need it, you can just as easily scale it up to 10,000 RUs.
There are various ways to monitor your RU consumption; check the official documentation here to see some of them. If you happen to have many similar queries hitting the database in a short amount of time, you should consider refactoring the project to add a caching layer (we recommend Azure Redis Cache) for your data.
Please check the official and frequently updated documentation here.
The Function is configured to use Application Insights for instrumentation; check here. Below you can see two screenshots that contain some of the performance metrics Application Insights can generate for you.
You can check here for a detailed guide about the Azure Functions + Application Insights integration. Also, you can take a look at the Storage account that's created with the Azure Function; some logs about the Function's calls are kept there.
You should see the following resources created in your Azure subscription:
- A Cosmos DB database account
- A Storage account (that backs your Azure Function)
- An App Service Plan (uses the Consumption plan)
- An App Service (hosts your Function)
- An Application Insights service that monitors your Function
Check here for a reference screenshot.
If you are new to Azure, you should check the ARM documentation here.
There is an idle timeout for Azure Functions hosted on a Consumption plan; check here for details. This can lead to your Azure Functions host being removed (due to the timeout) and re-added (due to a new API call). You can alter the default timeout (which is 5 minutes) by modifying the `functionTimeout` setting in `host.json`. It can be set to a maximum of 10 minutes, as in the sample below.
```json
{
    "functionTimeout": "00:10:00"
}
```
Two other options to prevent this behavior would be:
- Use an App Service plan instead of the Consumption plan to host your Function
- Create a separate Function using a timer trigger. This Function would be triggered every couple of minutes and just send an HTTP request to your Leaderboards API Function. This way, the idle timeout would never occur, so your Leaderboards API Function calls would always be fast.
Moreover, on the very first call to your Azure Function (the so-called "cold start") there will be a delay as Node reads and loads all module files. They are cached, though, so subsequent executions have significantly better performance. You can check here for a way this can be improved (even though this approach hasn't been tested with the current project).
To save you some money: Cosmos DB charges per collection; check here for a relevant blog post.
You can read here in order to understand Cosmos DB pricing.
Yup! Check here. Also, check here to learn more about MongoDB API for Cosmos DB. For Cosmos DB use cases, check here. For some free Azure resources, check here.
Check here. When you deploy the Function via the ARM template provided, you are billed by the Azure Functions Consumption plan. A great thing about the Consumption plan is that the first million calls per month are free. It's a pretty cost-effective plan; for the rest of the specifics, you should check the relevant pricing page here.
Microsoft Azure operates in many datacenters around the globe; you can check them here. If you want to see the latency between them and your location, you can use various online tools such as azurespeed.com or azurespeedtest.azurewebsites.net.
Check `architecture.vsdx` in the `various` folder.
You can check here for a Unity client that can communicate with various Azure PaaS services like App Service Easy Tables, Event Hubs and Table Storage. Check here for the relevant blog post.