
Adding Trigger documentation and Modifying Output documentation #325

Merged: 12 commits, May 9, 2022
78 changes: 41 additions & 37 deletions docs/public-documentation/Output.md
**NOTE:** The Kafka bindings are only fully supported on [Premium](https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan) and [Dedicated App Service](https://docs.microsoft.com/en-us/azure/azure-functions/dedicated-plan) plans. Consumption plans are not supported. Kafka bindings are only supported for Azure Functions version 3.x and later.

Use the Kafka output binding to send messages to a Kafka topic.
For information on setup and configuration details, see the overview page.

# Examples


\<-- placeholder for the examples --\>

|Setting|Description|
|-|-|
|Topic|Topic name used for the Kafka output binding|
|BrokerList|Server address for the Kafka broker|
|AvroSchema|Should be used only if a generic record should be generated|
|MaxMessageBytes|Maximum transmit message size. Default: 1MB|
|BatchSize|Maximum number of messages batched in one MessageSet. Default: 10000|
|EnableIdempotence|When set to `true`, the producer will ensure that messages are successfully produced exactly once and in the original produce order. Default: false|
|MessageTimeoutMs|Local message timeout. This value is only enforced locally and limits the time a produced message waits for successful delivery. A time of 0 is infinite. This is the maximum time used to deliver a message (including retries); a delivery error occurs when either the retry count or the message timeout is exceeded. Default: 300000|
|RequestTimeoutMs|The acknowledgement timeout of the producer request in milliseconds. Default: 5000|
|MaxRetries|How many times to retry sending a failing message. Default: 2. **Note:** Retrying may cause reordering unless `EnableIdempotence` is set to `true`.|


For connection to a secure Kafka Broker -

|Setting|librdkafka property|Description|
|-|-|-|
| AuthenticationMode | sasl.mechanism | SASL mechanism to use for authentication |
| Username | sasl.username | SASL username for use with the PLAIN and SASL-SCRAM |
| Password | sasl.password | SASL password for use with the PLAIN and SASL-SCRAM |
| Protocol | security.protocol | Security protocol used to communicate with brokers |
The following Java function uses the @KafkaOutput annotation from the Azure Functions Java runtime library to send a message to a Kafka topic.

## Annotation

|Parameter|Description|
|-|-|
|name|The variable name used in function code for the request or request body.|
|dataType| <p>Defines how the Functions runtime should treat the parameter value. Possible values are:</p><ul><li>"" or string: treat it as a string whose value is serialized from the parameter</li><li>binary: treat it as binary data whose value comes from, for example, OutputBinding&lt;byte[]&gt;</li></ul>|
|topic|Defines the topic.|
|brokerList|Defines the broker list.|
|maxMessageBytes|Defines the maximum transmit message size. Default: 1MB|
|batchSize|Defines the maximum number of messages batched in one MessageSet. Default: 10000|
|enableIdempotence|When set to `true`, the producer will ensure that messages are successfully produced exactly once and in the original produce order. Default: false|
|messageTimeoutMs|Local message timeout. This value is only enforced locally and limits the time a produced message waits for successful delivery. A time of 0 is infinite. This is the maximum time used to deliver a message (including retries); a delivery error occurs when either the retry count or the message timeout is exceeded. Default: 300000|
|requestTimeoutMs|The acknowledgement timeout of the producer request in milliseconds. Default: 5000|
|maxRetries|How many times to retry sending a failing message. Default: 2. **Note:** Retrying may cause reordering unless enableIdempotence is set to `true`.|

For connection to a secure Kafka Broker -

|Setting|librdkafka property|Description|
|-|-|-|
| authenticationMode | sasl.mechanism | SASL mechanism to use for authentication |
| username | sasl.username | SASL username for use with the PLAIN and SASL-SCRAM |
| password | sasl.password | SASL password for use with the PLAIN and SASL-SCRAM |
| protocol | security.protocol | Security protocol used to communicate with brokers |
| sslCertificateLocation | ssl.certificate.location | Path to client's certificate |
| sslCaLocation | ssl.ca.location | Path to CA certificate file for verifying the broker's certificate |

# JavaScript/TypeScript/Python/PowerShell

Here's the binding data in the _function.json_ file:

\<-- placeholder for the examples --\>

| **function.json property** | **Description** |
|-|-|
|type|Must be set to kafkaOutput.|
|direction|Must be set to out.|
|name|Name of the variable that represents request or request body in the function code.|
|brokerList|Defines the broker list.|
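
As an illustrative sketch of how these properties fit together, a _function.json_ entry for a Kafka output binding might look like the following; the binding, topic, and app-setting names are assumptions for this example, not values from this document:

```json
{
  "bindings": [
    {
      "type": "kafkaOutput",
      "direction": "out",
      "name": "outputKafkaMessage",
      "topic": "users",
      "brokerList": "%BrokerList%"
    }
  ]
}
```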

For connection to a secure Kafka Broker -

| **function.json property** | **librdkafka property** | **Description** |
|-|-|-|
| authenticationMode | sasl.mechanism | SASL mechanism to use for authentication |
| username | sasl.username | SASL username for use with the PLAIN and SASL-SCRAM |
| password | sasl.password | SASL password for use with the PLAIN and SASL-SCRAM |
| protocol | security.protocol | Security protocol used to communicate with brokers |
| sslKeyLocation | ssl.key.location | Path to client's private key (PEM) used for authentication |
| sslKeyPassword | ssl.key.password | Password for client's certificate |
| sslCertificateLocation | ssl.certificate.location | Path to client's certificate |
| sslCaLocation | ssl.ca.location | Path to CA certificate file for verifying the broker&#39;s certificate |

# USAGE
128 changes: 128 additions & 0 deletions docs/public-documentation/Trigger.md
# Public Documentation

**NOTE:** The Kafka bindings are only fully supported on [Premium](https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan) and [Dedicated App Service](https://docs.microsoft.com/en-us/azure/azure-functions/dedicated-plan) plans. Consumption plans are not supported.

**NOTE:** Kafka bindings are only supported for Azure Functions version 3.x and later versions.

Use the Kafka trigger to run your function in response to messages sent to a Kafka topic.
For information on setup and configuration details, see the overview page.

# Examples

# C# Attributes

|Setting|Description|
|-|-|
|Topic|Topic name used for the Kafka trigger|
|BrokerList|Server address for the Kafka broker|
|ConsumerGroup|Name for the consumer group|
|AvroSchema|Should be used only if a generic record should be generated|
|LagThreshold|Threshold for lag (default: 1000)|

For connection to a secure Kafka Broker -

|Authentication Setting|librdkafka property|Description|
|-|-|-|
|AuthenticationMode|sasl.mechanism|SASL mechanism to use for authentication|
|Username|sasl.username|SASL username for use with the PLAIN and SASL-SCRAM|
|Password|sasl.password|SASL password for use with the PLAIN and SASL-SCRAM|
|Protocol|security.protocol|Security protocol used to communicate with brokers|
|SslKeyLocation|ssl.key.location|Path to client's private key (PEM) used for authentication|
|SslKeyPassword|ssl.key.password|Password for client's certificate|
|SslCertificateLocation|ssl.certificate.location|Path to client's certificate|
|SslCaLocation|ssl.ca.location|Path to CA certificate file for verifying the broker's certificate|

# Java Annotations
|Parameter|Description|
|-|-|
|name|The variable name used in function code for the request or request body.|
|topic|Defines the topic.|
|brokerList|Defines the broker list.|
|consumerGroup|Name for the Consumer Group.|
|cardinality|Cardinality of the trigger input. Choose 'One' if the input is a single message or 'Many' if the input is an array of messages. If you choose 'Many', please set a dataType. Default: 'One'|
|dataType| <p>Defines how Functions runtime should treat the parameter value. Possible values are:</p><ul><li>""(Default): Get the value as a string, and try to deserialize to actual parameter type like POJO.</li><li>string: Always get the value as a string</li><li>binary: Get the value as a binary data, and try to deserialize to actual parameter type byte[].</li></ul>|
|avroSchema|Avro schema for generic record deserialization|

For connection to a secure Kafka Broker -

|Authentication Setting|librdkafka property|Description|
|-|-|-|
|authenticationMode|sasl.mechanism|SASL mechanism to use for authentication|
|username|sasl.username|SASL username for use with the PLAIN and SASL-SCRAM|
|password|sasl.password|SASL password for use with the PLAIN and SASL-SCRAM|
|protocol|security.protocol|Security protocol used to communicate with brokers|
|sslKeyLocation|ssl.key.location|Path to client's private key (PEM) used for authentication|
|sslKeyPassword|ssl.key.password|Password for client's certificate|
|sslCertificateLocation|ssl.certificate.location|Path to client's certificate|
|sslCaLocation|ssl.ca.location|Path to CA certificate file for verifying the broker's certificate|

# JavaScript/TypeScript/PowerShell/Python Configuration

The following tables explain the binding configuration properties that you set in the [function.json](https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference?tabs=blob#function-code) file -

|function.json property|Description|
|-|-|
|type|Must be set to kafkaTrigger.|
|direction|Must be set to in.|
|name|Name of the variable that represents request or request body in the function code.|
|brokerList|Defines the broker list.|
|cardinality|Cardinality of the trigger input. Choose 'One' if the input is a single message or 'Many' if the input is an array of messages. If you choose 'Many', please set a dataType. Default: 'One'|
|dataType|<p>Defines how Functions runtime should treat the parameter value. Possible values are:</p><ul><li>""(Default): Get the value as a string, and try to deserialize to actual parameter type like POJO.</li><li>string: Always get the value as a string</li><li>binary: Get the value as a binary data, and try to deserialize to actual parameter type byte[].</li></ul>|
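
As an illustrative sketch, a _function.json_ entry for a Kafka trigger might look like the following; the binding name, topic, and consumer group values are assumptions for this example:

```json
{
  "bindings": [
    {
      "type": "kafkaTrigger",
      "direction": "in",
      "name": "event",
      "topic": "users",
      "brokerList": "%BrokerList%",
      "consumerGroup": "myConsumerGroup",
      "dataType": "string"
    }
  ]
}
```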

For connection to a secure Kafka Broker -

|function.json property|librdkafka property|Description|
|-|-|-|
|authenticationMode|sasl.mechanism|SASL mechanism to use for authentication|
|username|sasl.username|SASL username for use with the PLAIN and SASL-SCRAM|
|password|sasl.password|SASL password for use with the PLAIN and SASL-SCRAM|
|protocol|security.protocol|Security protocol used to communicate with brokers|
|sslKeyLocation|ssl.key.location|Path to client's private key (PEM) used for authentication|
|sslKeyPassword|ssl.key.password|Password for client's certificate|
|sslCertificateLocation|ssl.certificate.location|Path to client's certificate|
|sslCaLocation|ssl.ca.location|Path to CA certificate file for verifying the broker's certificate|

When you are developing locally, add your application settings in the [local.settings.json](https://docs.microsoft.com/en-us/azure/azure-functions/functions-develop-local#local-settings-file) file in the Values collection.

**NOTE:** Username and password should reference an Azure Functions application setting and not be hardcoded.
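
For example, a local.settings.json might store the broker address and credentials as app settings, which binding properties then reference with the `%SettingName%` syntax; the setting names and values below are illustrative assumptions:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "node",
    "BrokerList": "mybroker.example.com:9092",
    "KafkaUsername": "<username>",
    "KafkaPassword": "<password>"
  }
}
```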


# host.json settings

This section describes the configuration settings available for this binding in versions 2.x and higher. Settings in the host.json file apply to all functions in a function app instance. For more information about function app configuration settings in versions 2.x and later versions, see [host.json reference for Azure Functions](https://docs.microsoft.com/en-us/azure/azure-functions/functions-host-json).

|Setting|Description|Default Value|
|-|-|-|
|MaxBatchSize|Maximum batch size when calling a Kafka trigger function|64|
|SubscriberIntervalInSeconds|Defines the minimum frequency at which messages are dispatched to the function, applied only when the message volume is less than MaxBatchSize / SubscriberIntervalInSeconds|1|
|ExecutorChannelCapacity|Defines the capacity of the channel in which messages are sent to functions. Once the capacity is reached, the Kafka subscriber pauses until the function catches up|1|
|ChannelFullRetryIntervalInMs|Defines the interval in milliseconds at which the subscriber retries adding items to the channel once it reaches capacity|50|

The settings exposed here customize how librdkafka works. See the [librdkafka documentation](https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md) for information on each setting.

|Setting|librdkafka property|
|-|-|
|ReconnectBackoffMs|reconnect.backoff.ms|
|ReconnectBackoffMaxMs|reconnect.backoff.max.ms|
|StatisticsIntervalMs|statistics.interval.ms|
|SessionTimeoutMs|session.timeout.ms|
|MaxPollIntervalMs|max.poll.interval.ms|
|QueuedMinMessages|queued.min.messages|
|QueuedMaxMessagesKbytes|queued.max.messages.kbytes|
|MaxPartitionFetchBytes|max.partition.fetch.bytes|
|FetchMaxBytes|fetch.max.bytes|
|AutoCommitIntervalMs|auto.commit.interval.ms|
|LibkafkaDebug|debug|
|MetadataMaxAgeMs|metadata.max.age.ms|
|SocketKeepaliveEnable|socket.keepalive.enable|
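
These settings live under the Kafka extension section of host.json. The following sketch combines a few of the settings above; the values shown are illustrative, not recommendations:

```json
{
  "version": "2.0",
  "extensions": {
    "kafka": {
      "maxBatchSize": 64,
      "subscriberIntervalInSeconds": 1,
      "executorChannelCapacity": 1,
      "channelFullRetryIntervalInMs": 50,
      "sessionTimeoutMs": 10000,
      "autoCommitIntervalMs": 200
    }
  }
}
```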

# Enable Runtime Scaling
In order for the Kafka trigger to scale out to multiple instances, the Runtime Scale Monitoring setting must be enabled.

In the portal, this setting can be found under Configuration > Function runtime settings for your function app.

![My image](../images/virtual-network-trigger-toggle.png)

In the CLI, you can enable Runtime Scale Monitoring by using the following command:

```shell
az resource update -g <resource_group> -n <function_app_name>/config/web --set properties.functionsRuntimeScaleMonitoringEnabled=1 --resource-type Microsoft.Web/sites
```