Template and documentation to build an Azure Function App in Python.
It assumes that the function needs to be triggered at regular intervals; if an HTTP trigger is required, see the Azure documentation.
Locally:
- Python
- Visual Studio Code, with the Azure Functions extension
- Azure Functions Core Tools (which provide the `func` command used below)
In Azure:
- An Azure Resource Group for Linux resources (BEST PRACTICE: make a new resource group for each project)
- The role of "Contributor" in that resource group (ask Maarten)
- Create an empty Python function app with Visual Studio Code
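If you prefer the command line, the same empty project can be scaffolded with Azure Functions Core Tools (a sketch; `<my-function-app>` and `<my-function>` are placeholders, and "Timer trigger" is the Core Tools template name for Python timer functions):
```
$ func init <my-function-app> --python
$ cd <my-function-app>
$ func new --name <my-function> --template "Timer trigger"
```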
- Copy-paste your script into `__init__.py`, see this template (a minimal sketch is also given below)
- Copy-paste the modules required to run your Python script into `requirements.txt`. BEST PRACTICE: add only what is needed. If in doubt, start a new virtual environment for your project, install only what is needed and then get the requirements with
```
$ pip freeze > requirements.txt
```
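For reference, a minimal sketch of a timer-triggered `__init__.py`, closely following the default template generated by Visual Studio Code; the binding name `mytimer` must match the `name` field in `function.json`:
```python
import datetime
import logging

import azure.functions as func


def main(mytimer: func.TimerRequest) -> None:
    # timestamp of the current run, in UTC
    utc_timestamp = datetime.datetime.utcnow().replace(
        tzinfo=datetime.timezone.utc).isoformat()

    if mytimer.past_due:
        logging.info('The timer is past due!')

    # replace the line below with your own script
    logging.info('Python timer trigger function ran at %s', utc_timestamp)
```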
- Configure how often your function will run in `function.json`: edit the cron expression in the field `schedule` (see the sketch below)
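A minimal sketch of `function.json` for a timer trigger; the binding name `mytimer` and the six-field NCRONTAB expression (here: every five minutes) are example values:
```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "mytimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */5 * * * *"
    }
  ]
}
```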
- Debug locally. For example, to run the function on your machine, execute this command from the project root folder:
```
$ func start --functions <my-function> --python --verbose --build remote
```
[OPTIONAL] If you need to fetch the function settings from an existing function app, execute
```
$ func azure functionapp fetch-app-settings <my-function-app>
```
- Deploy to Azure using Visual Studio Code
- Store your code on rodekruis' GitHub: create a new repo named `<my-function-app>-function-app` and put your code there
You will now be able to monitor your function in the Azure portal. A new resource of type "Application Insights" will be created, where you can monitor runs, errors, etc. Good to know: in the Azure portal you can also check the logs within the Function App (Functions > `<my-function>` > Code + Test > Logs).
If your function takes data as input/output, the recommended workflow is to store the data in an Azure storage account and download/upload it from/to there.
When you create a new function app with Visual Studio Code, a new storage account will be created automatically in the same resource group; you can use this one or an existing one. Good to know: individual files in Azure storage are called 'blobs' and directories 'containers'.
- Create a container within the storage via the Azure portal
- Configure the function in `__init__.py`
The way your function exchanges data with the storage is via azure-storage-blob, which needs the right credentials. If you are using the default storage that is created with the function, the credentials are already accessible as an environment variable named `AzureWebJobsStorage`; you can get them and then get your data with e.g.
```python
import os
import pickle
from azure.storage.blob import BlobServiceClient

credentials = os.environ['AzureWebJobsStorage']
blob_service_client = BlobServiceClient.from_connection_string(credentials)
blob_client = blob_service_client.get_blob_client(container='<my-container>', blob='<my-data-file>')
data = pickle.loads(blob_client.download_blob().readall())
```
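Uploading works the other way around; a minimal sketch, reusing the `blob_service_client` from above and assuming your data is picklable:
```python
# serialize and upload the result, overwriting the blob if it already exists
blob_client = blob_service_client.get_blob_client(container='<my-container>', blob='<my-output-file>')
blob_client.upload_blob(pickle.dumps(data), overwrite=True)
```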
[OPTIONAL] If you are using a different storage account:
- Copy the credentials from the Azure portal
- Add them in the function settings, so that they will be accessible within the function as environment variables
- Add them in `local.settings.json`, in order to be able to run the function locally (see the sketch below)
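For illustration, a minimal sketch of `local.settings.json`; the setting name `MY_STORAGE_CONNECTION_STRING` is a hypothetical example for the extra storage account, and both connection strings are placeholders:
```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "<connection-string-of-default-storage>",
    "MY_STORAGE_CONNECTION_STRING": "<connection-string-of-other-storage>"
  }
}
```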
If your function needs to use an API (e.g. Google Maps) and requires credentials, do NOT store them in `__init__.py`, since this will expose them to whoever has access to the resource group. The recommended workflow is to store them in an Azure Key Vault.
- Create an Azure Key Vault in the same resource group
- Ask to be given the role of "Key Vault Secrets Officer" in the vault (ask the admin of the resource group)
- Add your credentials in the vault under `Secrets`, via the Azure portal
- Integrate the credentials in your function app. N.B. remove curly braces when adding the "Secret Identifier" to the "Configuration" of the Function App, see this issue (an example setting is shown below)
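For illustration, an application setting that references a Key Vault secret looks like this (the setting name `MY_API_KEY` and the vault, secret, and version names are placeholders; note the absence of curly braces around the reference):
```
MY_API_KEY = @Microsoft.KeyVault(SecretUri=https://<my-vault>.vault.azure.net/secrets/<my-secret>/<version>)
```
Within the function, Azure resolves the reference, so the secret can be read like any other environment variable:
```python
import os

# the Key Vault reference is resolved by Azure before the function runs
api_key = os.environ['MY_API_KEY']
```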