Repository for the Google Cloud Functions used by various Canonn tools, mainly EDMC-Canonn and the maps, providing data that is not available in the CAPI.
Canonn-GCloud uses Google's serverless Cloud Functions technology.
Cloud Functions can be written in Node.js, Python, Go, Java, .NET, Ruby, and PHP, and are executed in language-specific runtimes. Most of the cloud functions here are written in Python.
- No server infrastructure to maintain
- Frequently executed functions stay resident
- Automatic scaling: the number of instances is scaled up and down as needed.
- Infrequently used functions can be slow to execute because the runtime needs to be set up before first use (a cold start).
- The pay-as-you-go model means that compromises and optimisations need to be made to keep costs down.
- The pay-as-you-go model also means that software changes in Elite can ramp up costs.
- While the platform is designed for small, simple functions, performance and scaling considerations can make it better to build complex, multi-purpose functions.
When capturing events, batch them if possible, as this helps keep function executions below the billing threshold. When fleet carriers were introduced, the number of FSS events per system went up by two orders of magnitude; by December 2020 this had pushed monthly invocations to 4.5 million. Batching the FSS events in the EDMC-Canonn plugin reduced invocations to 1.5 million. Likewise, if you need to fetch multiple items from a database, create a single function that returns all the data in one go.
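As an illustration, a client can accumulate events and submit them in a single call. This is only a sketch: the endpoint URL, payload shape, and batch size are assumptions, not the real EDMC-Canonn protocol.

```python
import requests

BATCH_SIZE = 100  # illustrative; tune so one invocation stays within limits


class EventBatcher:
    """Collect events and submit them in one function invocation."""

    def __init__(self, url):
        self.url = url  # hypothetical batch endpoint
        self.events = []

    def add(self, event):
        self.events.append(event)
        if len(self.events) >= BATCH_SIZE:
            self.flush()

    def flush(self):
        if self.events:
            # One POST replaces len(self.events) separate invocations
            requests.post(self.url, json={"events": self.events})
            self.events = []
```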
When using cloud functions to access a MySQL database, the connection is cached so that it can be re-used by subsequent function calls. If each function scales up its number of instances, it is possible for the system overall to run out of connections. Ensure that all functions accessing limited shared resources have scaling limits applied and, where possible, create functions that batch.
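A minimal sketch of the caching pattern, assuming pymysql and the environment variables described later in this document; the module-level global survives between invocations on a warm instance:

```python
import os

import pymysql

db = None  # module-level: re-used while the instance stays warm


def get_connection():
    global db
    if db is None:
        db = pymysql.connect(
            host=os.environ["MYSQL_HOST"],
            user=os.environ["MYSQL_USER"],
            password=os.environ["MYSQL_PASSWORD"],
            database=os.environ["MYSQL_DATABASE"],
        )
    db.ping(reconnect=True)  # re-open if the cached connection has dropped
    return db
```

Scaling limits can be applied at deploy time, for example with the `--max-instances` flag on `gcloud functions deploy`.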
Batched functions can take longer to run, so limit batch sizes to keep execution times below the limits and to stop such functions blocking when scaling is capped.
It is possible to implement lazy caching so that data is stored between function executions and can be re-used. This can reduce the need to access the database and improve performance.
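A sketch of lazy caching with a time-to-live; the cache key, loader, and TTL are illustrative:

```python
import time

_cache = {}  # survives between invocations while the instance stays warm
TTL = 300  # seconds; illustrative


def get_cached(key, loader):
    """Return a cached value, falling back to the loader on a miss or expiry."""
    entry = _cache.get(key)
    if entry and time.time() - entry[0] < TTL:
        return entry[1]
    value = loader(key)  # e.g. a database query
    _cache[key] = (time.time(), value)
    return value
```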
When exporting large amounts of data, for instance for the 3D maps, use a paging model so that the function keeps working if the data grows beyond the function limits.
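One way to page is an offset parameter on the request, with the response saying where the next page starts. This is a sketch only; `fetch_rows` is a hypothetical database helper and the page size is illustrative:

```python
import json

PAGE_SIZE = 1000  # illustrative; keep each page well inside the function limits


def payload(request):
    offset = int(request.args.get("offset", 0))
    # Fetch one extra row to detect whether another page exists
    rows = fetch_rows(limit=PAGE_SIZE + 1, offset=offset)
    body = {
        "data": rows[:PAGE_SIZE],
        "next_offset": offset + PAGE_SIZE if len(rows) > PAGE_SIZE else None,
    }
    return json.dumps(body), 200, {"Content-Type": "application/json"}
```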
To save on bandwidth, please use the following headers in GET requests so that gzip transport is used.
```python
headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:20.0) Gecko/20100101 Firefox/20.0",
    "Accept-Encoding": "gzip, deflate",
}
```
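For example, with the requests library (the URL here is a placeholder):

```python
import requests

headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:20.0) Gecko/20100101 Firefox/20.0",
    "Accept-Encoding": "gzip, deflate",
}

# requests transparently decompresses the gzipped response body
r = requests.get("https://example.cloudfunctions.net/payload", headers=headers)
data = r.json()
```

Note that requests sends an Accept-Encoding header by default; setting it explicitly just makes the convention visible.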
It is possible to test functions locally before deploying to the cloud. However, as most functions need MySQL access, you will need to have a few things set up.
Most functions have been deployed with Python 3.7, but Google supports up to 3.9.
I have no idea how to run the Node.js functions locally and will probably migrate them to Python, unless you figure it out and document it here.
The Cloud SQL Proxy is used to provide access to the MySQL database. You will need a secrets file and a login provided by @NoFoolLikeOne.
I run the proxy as a service on a Linux box so that I can access it from any PC on my home network. The command it executes is as follows:
```bash
/usr/local/bin/cloud_sql_proxy -dir=/usr/local/bin/cloud_sql_proxy -instances=canonn-api-236217:europe-north1:canonnpai=tcp:10.0.0.72:3306 -credential_file=/var/local/cloud-sql-proxy/mysql_secret.json
```
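As a sketch, a systemd unit along these lines would run the proxy as a service; the unit name and file location are assumptions:

```ini
# /etc/systemd/system/cloud-sql-proxy.service (illustrative)
[Unit]
Description=Google Cloud SQL Proxy
After=network.target

[Service]
ExecStart=/usr/local/bin/cloud_sql_proxy -dir=/usr/local/bin/cloud_sql_proxy -instances=canonn-api-236217:europe-north1:canonnpai=tcp:10.0.0.72:3306 -credential_file=/var/local/cloud-sql-proxy/mysql_secret.json
Restart=always

[Install]
WantedBy=multi-user.target
```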
Windows users may need to do something else
The Functions Framework lets you execute functions in your own environment.
In this example the current directory has a file main.py with a function called payload.
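A minimal main.py might look like the following. This is a sketch only, assuming pymysql; the table name is a placeholder, not the real schema:

```python
import json
import os

import pymysql


def payload(request):
    """HTTP entry point, also runnable locally via functions-framework."""
    system = request.args.get("system")
    conn = pymysql.connect(
        host=os.environ["MYSQL_HOST"],
        user=os.environ["MYSQL_USER"],
        password=os.environ["MYSQL_PASSWORD"],
        database=os.environ["MYSQL_DATABASE"],
        cursorclass=pymysql.cursors.DictCursor,
    )
    with conn.cursor() as cursor:
        # "poi" is a placeholder table name
        cursor.execute("SELECT * FROM poi WHERE system = %s", (system,))
        rows = cursor.fetchall()
    conn.close()
    return json.dumps(rows, default=str), 200, {"Content-Type": "application/json"}
```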
First set the environment variables:
```bash
export MYSQL_USER=yourusername
export MYSQL_PASSWORD=yourpassword
export MYSQL_HOST=localhost # or another ip address
export MYSQL_DATABASE=canonn
export INSTANCE_CONNECTION_NAME=canonn-api-236217:europe-north1:canonnpai
export GOOGLE_APPLICATION_CREDENTIALS=/var/local/cloud-sql-proxy/mysql_secret.json
```
Then start the Functions Framework with the target:
```bash
functions-framework --target payload --debug
```
NB: The functions don't currently have consistent --target names. We will standardise on payload.
To execute the function, put the URL into a browser or use curl.
Use the following URL http://localhost:8080/?system=Merope&cmdr=LCU No Fool Like One
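With curl, quote the URL and encode the spaces in the commander name:

```bash
curl "http://localhost:8080/?system=Merope&cmdr=LCU%20No%20Fool%20Like%20One"
```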
If all went well you will see a list of POIs for that system.
For POST functions I would recommend using Postman.
- Sanitise functions to remove sensitive details and check them in to source control.
- Change functions to use the host environment variable so that the database host can be specified.
- Create a new function to make querying the database more standard and to pool connections.