This repository was archived by the owner on Oct 14, 2023. It is now read-only.
Merged
1 change: 1 addition & 0 deletions README.md
@@ -25,6 +25,7 @@ This is a collection of Python function samples on Azure Functions 2.X. For a co
| [timer-trigger-cosmos-output-binding](v2functions/timer-trigger-cosmosdb-output-binding) | Azure Functions Timer Trigger Python Sample. The function gets blog RSS feed and store the results into CosmosDB using Cosmos DB output binding | Timer | NONE | CosmosDB |
| [http-trigger-blob-sas-token](v2functions/http-trigger-blob-sas-token) | Azure Function HTTP Trigger Python Sample that returns a SAS token for Azure Storage for the specified container and blob name | HTTP | NONE | HTTP |
| [http-trigger-dump-request](v2functions/http-trigger-dump-request) | Azure Function HTTP Trigger Python Sample that returns request dump info with JSON format | HTTP | NONE | HTTP |
| [http-trigger-onnx-model](v2functions/http-trigger-onnx-model) | This function demonstrates running an inference using an ONNX model. It is triggered by an HTTP request. | HTTP | NONE | HTTP |
| [blob-trigger-watermark-blob-out-binding](v2functions/blob-trigger-watermark-blob-out-binding) | Azure Function Python Sample that watermarks an image. This function triggers on an input blob (image) and adds a watermark by calling into the Pillow library. The resulting composite image is then written back to blob storage using a blob output binding. | Blob Storage | Blob Storage | Blob Storage |
| [sbqueue-trigger-sbqueue-out-binding](v2functions/sbqueue-trigger-sbqueue-out-binding) | Azure Functions Service Bus Queue Trigger Python Sample. The function demonstrates reading from a Service Bus queue and placing a message into an output Service Bus queue. | Service Bus Queue | None | Service Bus Queue |

72 changes: 72 additions & 0 deletions v2functions/http-trigger-onnx-model/__init__.py
@@ -0,0 +1,72 @@
import logging
import azure.functions as func
import onnxruntime
from PIL import Image
import numpy as np
import io

def main(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    body = req.get_body()

    try:
        image = Image.open(io.BytesIO(body))
    except IOError:
        return func.HttpResponse(
            "Bad input. Unable to cast request body to an image format.",
            status_code=400
        )

    result = run_inference(image, context)

    return func.HttpResponse(result)


def run_inference(image, context):
    # See https://github.com/onnx/models/tree/master/vision/style_transfer/fast_neural_style
    # for implementation details
    model_path = f'{context.function_directory}/rain_princess.onnx'
    session = onnxruntime.InferenceSession(model_path)
    metadata = session.get_modelmeta()
    logging.info('Model metadata:\n'
                 f'  Graph name: {metadata.graph_name}\n'
                 f'  Model version: {metadata.version}\n'
                 f'  Producer: {metadata.producer_name}')

    # Preprocess image
    original_image_size = image.size[0], image.size[1]
    logging.info('Preprocessing image...')
    # Model expects a 224x224 shape input
    image = image.resize((224, 224), Image.LANCZOS)
    bands = image.getbands()
    if bands == ('R', 'G', 'B'):
        logging.info('Image is RGB. No conversion necessary.')
    else:
        logging.info(f'Image is {bands}, converting to RGB...')
        image = image.convert('RGB')

    x = np.array(image).astype('float32')
    x = np.transpose(x, [2, 0, 1])  # HWC -> CHW
    x = np.expand_dims(x, axis=0)   # add batch dimension

    output_name = session.get_outputs()[0].name
    input_name = session.get_inputs()[0].name
    logging.info('Running inference on ONNX model...')
    result = session.run([output_name], {input_name: x})[0][0]

    # Postprocess image
    result = np.clip(result, 0, 255)
    result = result.transpose(1, 2, 0).astype('uint8')
    img = Image.fromarray(result)
    max_width = 800
    height = int(max_width * original_image_size[1] / original_image_size[0])
    # Upsample and correct aspect ratio for final image
    img = img.resize((max_width, height), Image.BICUBIC)

    # Store inferred image as an in-memory byte array
    img_byte_arr = io.BytesIO()
    # Convert to RGB so we can return JPEG
    img.convert('RGB').save(img_byte_arr, format='JPEG')
    final_image = img_byte_arr.getvalue()

    return final_image
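The tensor shuffling in `run_inference` is easy to get wrong, so here is the HWC-to-NCHW preprocessing isolated as a minimal standalone sketch (a zero-filled dummy array stands in for a real resized image):

```python
import numpy as np

def to_nchw(pixels: np.ndarray) -> np.ndarray:
    """Convert an HxWxC uint8 image array to a 1xCxHxW float32 batch."""
    x = pixels.astype("float32")
    x = np.transpose(x, [2, 0, 1])    # HWC -> CHW
    return np.expand_dims(x, axis=0)  # add batch dimension

# Dummy 224x224 RGB image in place of the resized input
batch = to_nchw(np.zeros((224, 224, 3), dtype="uint8"))
print(batch.shape)  # (1, 3, 224, 224)
```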
Binary file added v2functions/http-trigger-onnx-model/example.png
20 changes: 20 additions & 0 deletions v2functions/http-trigger-onnx-model/function.json
@@ -0,0 +1,20 @@
{
"scriptFile": "__init__.py",
"bindings": [
{
"authLevel": "function",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": [
"get",
"post"
]
},
{
"type": "http",
"direction": "out",
"name": "$return"
}
]
}
Binary file not shown.
93 changes: 93 additions & 0 deletions v2functions/http-trigger-onnx-model/readme.md
@@ -0,0 +1,93 @@
# http-trigger-onnx-model (Python)

| Sample | Description | Trigger | In Bindings | Out Bindings |
| ------------- | ------------- | ------------- | ----------- | ----------- |
| `http-trigger-onnx-model` | This function demonstrates running an inference using an ONNX model. It is triggered by an HTTP request. See _[Try it out](#try-it-out)_ for usage. | HTTP | NONE | HTTP |

The style transfer model used in this function is called _Rain Princess_. It is downloaded from the [ONNX Model Zoo][3].

Artistic style transfer models mix the content of an image with the style of another image. Examples of the styles can be seen [here][4].

Open Neural Network Exchange (ONNX) is an open standard format for representing machine learning models. ONNX is supported by a community of partners who have implemented it in many frameworks and tools.

The ONNX Model Zoo is a collection of pre-trained, state-of-the-art models in the ONNX format contributed by community members like you. See https://github.com/onnx/models for more.

You should be able to use other ONNX models in your function by rewriting the preprocess/postprocess code and wiring the expected inputs and outputs.
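As a starting point for that rewiring, a small hypothetical helper like the one below can report what tensor shapes a given model expects, which tells you what your preprocess step must produce (the import is deferred so the sketch loads even without the runtime installed):

```python
def describe_model(model_path: str) -> None:
    """Print input/output names, shapes, and types for an ONNX model.

    Hypothetical helper: use it to discover what shape your
    preprocess step must produce for a new model.
    """
    import onnxruntime  # deferred import; same dependency as the function itself
    session = onnxruntime.InferenceSession(model_path)
    for inp in session.get_inputs():
        print("input:", inp.name, inp.shape, inp.type)
    for out in session.get_outputs():
        print("output:", out.name, out.shape, out.type)
```

For the _Rain Princess_ model this reports a `[1, 3, 224, 224]` float input, which is exactly what the resize/transpose/expand steps in `__init__.py` construct.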

## Sample run
![Screenshot](example.png)
This example is probably not going to age well. However, the pun stands on its own. Shown here: [httpie][1], [imgcat][2].

## Dependencies
```
Pillow==7.0.0
onnxruntime==1.1.0
numpy==1.18.1
```

## Configuration
As specified in `function.json`, this function is triggered by an HTTP request. It expects a POST request with raw image bytes (JPEG/PNG/whatever the Pillow library can open). Output is an HTTP response with the resulting style-transferred image (JPEG encoded).

```json
{
"scriptFile": "__init__.py",
"bindings": [
{
"authLevel": "function",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": [
"post"
]
},
{
"type": "http",
"direction": "out",
"name": "$return"
}
]
}
```
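If you want to call the function from Python rather than curl, a minimal client sketch looks like this (the helper name and the assumption of a locally running host are hypothetical; substitute your deployed function URL and key as needed):

```python
import urllib.request

def stylize(image_path: str, function_url: str) -> bytes:
    """POST raw image bytes to the function and return the stylized JPEG bytes."""
    with open(image_path, "rb") as f:
        body = f.read()
    req = urllib.request.Request(function_url, data=body, method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Example (assumes `func host start` is running locally):
# jpeg_bytes = stylize("babyyoda.jpg",
#                      "http://localhost:7071/api/http-trigger-onnx-model")
```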

## How to develop and publish the functions

### Local development

```sh
func host start
```

### Try it out
```bash
# Make a POST request
$ curl -s --data-binary @babyyoda.jpg http://localhost:7071/api/http-trigger-onnx-model -o out.jpg

# Open the resulting image (on a Mac)
# Use feh or xdg-open on Linux
$ open out.jpg
```

### Publish the function to the cloud

```sh
FUNCTION_APP_NAME="MyFunctionApp"
func azure functionapp publish $FUNCTION_APP_NAME --build-native-deps --no-bundler
```

Add Function App settings
```sh
FUNCTION_STORAGE_CONNECTION="*************"
az webapp config appsettings set \
-n $FUNCTION_APP_NAME \
-g $RESOURCE_GROUP \
--settings \
MyStorageConnectionString=$FUNCTION_STORAGE_CONNECTION
```


[1]: https://httpie.org/
[2]: https://iterm2.com/documentation-images.html
[3]: https://github.com/onnx/models/tree/master/vision/style_transfer/fast_neural_style
[4]: https://github.com/pytorch/examples/tree/master/fast_neural_style#models
4 changes: 3 additions & 1 deletion v2functions/requirements.txt
@@ -8,4 +8,6 @@ six==1.11.0
# Additional packages
requests==2.20.1
feedparser==5.2.1
pillow>=6.2.0
Pillow==7.0.0
numpy==1.18.1
onnxruntime==1.1.0