
chore: Introduce Kamelet input/output data types #1162

Merged 28 commits on Dec 2, 2022
Commits
e79d447 Introduce Kamelet input/output data types (christophd, Nov 14, 2022)
4760cf6 Refine Kamelet data type solution with review comments (christophd, Nov 17, 2022)
609eb4f Fix Jitpack coordinates replacement and use KinD cluster v0.14.0 (christophd, Nov 17, 2022)
239a377 Add CloudEvent output type on AWS S3 Kamelet source (christophd, Nov 17, 2022)
911ac23 Use log-sink Kamelet and show headers (christophd, Nov 18, 2022)
a8c2214 Fail on missing data type and add log output (christophd, Nov 18, 2022)
7a42538 Make sure data type resolver works on all runtimes (christophd, Nov 18, 2022)
2795728 Load S3 converters via annotation scan (christophd, Nov 18, 2022)
8420067 Preserve AWS S3 Key header as it is required during onCompletion (christophd, Nov 18, 2022)
5f84f84 Remove AWS S3 Json output type (christophd, Nov 18, 2022)
e82a3c0 Load AWS DDB converters via annotation scan (christophd, Nov 18, 2022)
c94bee7 Fix AWS DDB sink Kamelet (christophd, Nov 21, 2022)
d14ae28 Enhance YAKS tests with AWS S3 data type test (christophd, Nov 21, 2022)
495ddf2 Fix cloud event type and do not set data content type (christophd, Nov 21, 2022)
4e28c94 Enhance data type AWS S3 YAKS tests (christophd, Nov 22, 2022)
14cd806 Add option to disable data type registry classpath scan (christophd, Nov 23, 2022)
b67651e Set proper media types (christophd, Nov 24, 2022)
0f2b888 Fix rest-openapi-sink YAKS test (christophd, Nov 25, 2022)
26b6166 Remove camel-cloudevents dependency (christophd, Nov 25, 2022)
0f99d4b Move AWS S3 binary output type to generic level (christophd, Nov 29, 2022)
4fd0681 Do cache ObjectMapper instance in JsonModelDatType converter (christophd, Nov 29, 2022)
29e2cc9 Enhance documentation on data type SPI (christophd, Nov 29, 2022)
4cc1de4 Improve CloudEvents output produced by AWS S3 source (christophd, Nov 30, 2022)
dd0c65e Simplify Json model data type (christophd, Nov 30, 2022)
9dd3251 Fix Knative YAKS tests (christophd, Nov 30, 2022)
11a8450 Revert existing Kamelets to not use data type converter (christophd, Nov 30, 2022)
c8e3f16 Add experimental Kamelets using data type converter API (christophd, Nov 30, 2022)
df62f1a Include experimental Kamelets in the catalog (christophd, Dec 1, 2022)
26 changes: 26 additions & 0 deletions .github/actions/install-knative/action.yml
@@ -0,0 +1,26 @@
# ---------------------------------------------------------------------------
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ---------------------------------------------------------------------------

name: install-knative
description: 'Install Knative serving and eventing'
runs:
using: "composite"
steps:
- name: Install Knative
shell: bash
run: |
./.github/actions/install-knative/install-knative.sh
142 changes: 142 additions & 0 deletions .github/actions/install-knative/install-knative.sh
@@ -0,0 +1,142 @@
#!/bin/bash

# ---------------------------------------------------------------------------
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ---------------------------------------------------------------------------

####
#
# Install the knative setup
#
####

set -e

# Prerequisites
sudo wget https://github.com/mikefarah/yq/releases/download/v4.26.1/yq_linux_amd64 -O /usr/bin/yq && sudo chmod +x /usr/bin/yq

set +e

export SERVING_VERSION=knative-v1.6.0
export EVENTING_VERSION=knative-v1.6.0
export KOURIER_VERSION=knative-v1.6.0

apply() {
local file="${1:-}"
if [ -z "${file}" ]; then
echo "Error: Cannot apply. No file."
exit 1
fi

kubectl apply --filename ${file}
if [ $? != 0 ]; then
sleep 5
echo "Re-applying ${file} ..."
kubectl apply --filename ${file}
if [ $? != 0 ]; then
echo "Error: Application of resource failed."
exit 1
fi
fi
}

SERVING_CRDS="https://github.com/knative/serving/releases/download/${SERVING_VERSION}/serving-crds.yaml"
SERVING_CORE="https://github.com/knative/serving/releases/download/${SERVING_VERSION}/serving-core.yaml"
KOURIER="https://github.com/knative-sandbox/net-kourier/releases/download/${KOURIER_VERSION}/kourier.yaml"
EVENTING_CRDS="https://github.com/knative/eventing/releases/download/${EVENTING_VERSION}/eventing-crds.yaml"
EVENTING_CORE="https://github.com/knative/eventing/releases/download/${EVENTING_VERSION}/eventing-core.yaml"
IN_MEMORY_CHANNEL="https://github.com/knative/eventing/releases/download/${EVENTING_VERSION}/in-memory-channel.yaml"
CHANNEL_BROKER="https://github.com/knative/eventing/releases/download/${EVENTING_VERSION}/mt-channel-broker.yaml"

# Serving
apply "${SERVING_CRDS}"

YAML=$(mktemp serving-core-XXX.yaml)
curl -L -s ${SERVING_CORE} | head -n -1 | yq e 'del(.spec.template.spec.containers[].resources)' - > ${YAML}
if [ -s ${YAML} ]; then
apply ${YAML}
echo "Waiting for pods to be ready in knative-serving (dependency for kourier)"
kubectl wait --for=condition=Ready pod --all -n knative-serving --timeout=60s
else
echo "Error: Failed to correctly download ${SERVING_CORE}"
exit 1
fi

# Kourier
apply "${KOURIER}"

sleep 5

kubectl patch configmap/config-network \
--namespace knative-serving \
--type merge \
--patch '{"data":{"ingress.class":"kourier.ingress.networking.knative.dev"}}'
if [ $? != 0 ]; then
echo "Error: Failed to patch configmap"
exit 1
fi

# Eventing
apply "${EVENTING_CRDS}"

YAML=$(mktemp eventing-XXX.yaml)
curl -L -s ${EVENTING_CORE} | head -n -1 | yq e 'del(.spec.template.spec.containers[].resources)' - > ${YAML}
if [ -s ${YAML} ]; then
apply ${YAML}
else
echo "Error: Failed to correctly download ${EVENTING_CORE}"
exit 1
fi

# Eventing channels
YAML=$(mktemp in-memory-XXX.yaml)
curl -L -s ${IN_MEMORY_CHANNEL} | head -n -1 | yq e 'del(.spec.template.spec.containers[].resources)' - > ${YAML}
if [ -s ${YAML} ]; then
apply ${YAML}
else
echo "Error: Failed to correctly download ${IN_MEMORY_CHANNEL}"
exit 1
fi

# Eventing broker
YAML=$(mktemp channel-broker-XXX.yaml)
curl -L -s ${CHANNEL_BROKER} | head -n -1 | yq e 'del(.spec.template.spec.containers[].resources)' - > ${YAML}
if [ -s ${YAML} ]; then
apply ${YAML}
else
echo "Error: Failed to correctly download ${CHANNEL_BROKER}"
exit 1
fi

# Eventing sugar controller configuration
echo "Patching Knative eventing configuration"
kubectl patch configmap/config-sugar \
-n knative-eventing \
--type merge \
-p '{"data":{"namespace-selector":"{\"matchExpressions\":[{\"key\":\"eventing.knative.dev/injection\",\"operator\":\"In\",\"values\":[\"enabled\"]}]}"}}'

kubectl patch configmap/config-sugar \
-n knative-eventing \
--type merge \
-p '{"data":{"trigger-selector":"{\"matchExpressions\":[{\"key\":\"eventing.knative.dev/injection\",\"operator\":\"In\",\"values\":[\"enabled\"]}]}"}}'

# Wait for installation completed
echo "Waiting for all pods to be ready in kourier-system"
kubectl wait --for=condition=Ready pod --all -n kourier-system --timeout=60s
echo "Waiting for all pods to be ready in knative-serving"
kubectl wait --for=condition=Ready pod --all -n knative-serving --timeout=60s
echo "Waiting for all pods to be ready in knative-eventing"
kubectl wait --for=condition=Ready pod --all -n knative-eventing --timeout=60s
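The `apply()` helper in `install-knative.sh` above retries a failed `kubectl apply` once after a short pause before giving up. The same pattern in isolation, as a minimal stand-alone sketch (`retry_once` and `flaky` are illustrative names, not part of the PR):

```shell
#!/bin/bash
# Retry-once pattern: run a command, and on failure back off briefly and
# try exactly one more time before reporting the final status.
retry_once() {
  "$@" && return 0          # first attempt succeeded, done
  sleep 1                   # brief back-off, like the script's `sleep 5`
  echo "Retrying: $*" >&2
  "$@"                      # second and final attempt
}

marker=$(mktemp -u)         # absent file marks "first call"
flaky() {
  if [ ! -e "$marker" ]; then
    touch "$marker"         # fail exactly once
    return 1
  fi
  echo "ok"
}

out=$(retry_once flaky)
echo "$out"                 # prints: ok
rm -f "$marker"
```

A single bounded retry like this absorbs transient races (for example, CRDs not yet registered when the dependent resources are applied) without masking persistent failures.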
27 changes: 18 additions & 9 deletions .github/workflows/yaks-tests.yaml
@@ -43,7 +43,7 @@ concurrency:
env:
CAMEL_K_VERSION: 1.10.3
YAKS_VERSION: 0.11.0
-YAKS_IMAGE_NAME: "docker.io/yaks/yaks"
+YAKS_IMAGE_NAME: "docker.io/citrusframework/yaks"
YAKS_RUN_OPTIONS: "--timeout=15m"

jobs:
@@ -61,10 +61,10 @@ jobs:
HEAD_REF: ${{ github.head_ref }}
HEAD_REPO: ${{ github.event.pull_request.head.repo.full_name }}
run: |
-echo "Set JitPack dependency coordinates to ${HEAD_REPO/\//.}:camel-kamelets-utils:${HEAD_REF/\//'~'}-SNAPSHOT"
+echo "Set JitPack dependency coordinates to ${HEAD_REPO/\//.}:camel-kamelets-utils:${HEAD_REF//\//'~'}-SNAPSHOT"

# Overwrite JitPack coordinates in the local Kamelets so the tests can use the utility classes in this PR
-find kamelets -maxdepth 1 -name '*.kamelet.yaml' -exec sed -i "s/github:apache.camel-kamelets:camel-kamelets-utils:${BASE_REF}-SNAPSHOT/github:${HEAD_REPO/\//.}:camel-kamelets-utils:${HEAD_REF/\//'~'}-SNAPSHOT/g" {} +
+find kamelets -maxdepth 1 -name '*.kamelet.yaml' -exec sed -i "s/github:apache.camel-kamelets:camel-kamelets-utils:${BASE_REF}-SNAPSHOT/github:${HEAD_REPO/\//.}:camel-kamelets-utils:${HEAD_REF//\//'~'}-SNAPSHOT/g" {} +
- name: Get Camel K CLI
run: |
curl --fail -L --silent https://github.com/apache/camel-k/releases/download/v${CAMEL_K_VERSION}/camel-k-client-${CAMEL_K_VERSION}-linux-64bit.tar.gz -o kamel.tar.gz
@@ -83,23 +83,24 @@ jobs:
rm -r _yaks
- name: Kubernetes KinD Cluster
uses: container-tools/kind-action@v1
with:
version: v0.14.0
node_image: kindest/node:v1.23.6@sha256:b1fa224cc6c7ff32455e0b1fd9cbfd3d3bc87ecaa8fcb06961ed1afb3db0f9ae
- name: Info
run: |
kubectl version
kubectl cluster-info
kubectl describe nodes
- name: Install Knative
uses: ./.github/actions/install-knative
- name: Install Camel K
run: |
# Configure install options
export KAMEL_INSTALL_BUILD_PUBLISH_STRATEGY=Spectrum
export KAMEL_INSTALL_REGISTRY=$KIND_REGISTRY
export KAMEL_INSTALL_REGISTRY_INSECURE=true

-kamel install -w
-
-# TODO replaces the below statement with --operator-env-vars KAMEL_INSTALL_DEFAULT_KAMELETS=false
-# when we use camel k 1.8.0
-kubectl delete kamelets --all
+kamel install -w --operator-env-vars KAMEL_INSTALL_DEFAULT_KAMELETS=false

# Install the local kamelets
find kamelets -maxdepth 1 -name '*.kamelet.yaml' -exec kubectl apply -f {} \;
@@ -108,15 +109,23 @@
yaks install --operator-image $YAKS_IMAGE_NAME:$YAKS_VERSION
- name: YAKS Tests
run: |
-echo "Running tests"
+echo "Running tests for Kamelets"
yaks run test/aws-ddb-sink $YAKS_RUN_OPTIONS

yaks run test/aws-s3 $YAKS_RUN_OPTIONS

yaks run test/extract-field-action $YAKS_RUN_OPTIONS
yaks run test/insert-field-action $YAKS_RUN_OPTIONS
yaks run test/mail-sink $YAKS_RUN_OPTIONS
yaks run test/timer-source $YAKS_RUN_OPTIONS
yaks run test/earthquake-source $YAKS_RUN_OPTIONS
yaks run test/rest-openapi-sink $YAKS_RUN_OPTIONS
yaks run test/kafka $YAKS_RUN_OPTIONS
- name: YAKS Tests experimental Kamelets
run: |
echo "Running tests for experimental Kamelets"
yaks run test/experimental/aws-ddb-sink-exp $YAKS_RUN_OPTIONS
yaks run test/experimental/aws-s3-exp $YAKS_RUN_OPTIONS
- name: YAKS Report
if: failure()
run: |
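One easy-to-miss change in this workflow is the JitPack coordinates fix: `${HEAD_REF/\//'~'}` replaces only the first slash in the branch name, while `${HEAD_REF//\//'~'}` replaces all of them. A minimal sketch with a made-up branch name:

```shell
#!/bin/bash
# Bash ${var/pattern/string} vs ${var//pattern/string}:
# a single slash after the variable name substitutes only the first match,
# a double slash substitutes every match.
HEAD_REF="feat/data-types/v2"   # hypothetical multi-segment branch name

first=${HEAD_REF/\//'~'}        # first "/" only
all=${HEAD_REF//\//'~'}         # every "/"

echo "$first"                   # feat~data-types/v2
echo "$all"                     # feat~data-types~v2
```

Only the double-slash form yields a well-formed JitPack branch coordinate when the branch name contains more than one path segment.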
148 changes: 148 additions & 0 deletions kamelets/aws-ddb-experimental-sink.kamelet.yaml
@@ -0,0 +1,148 @@
# ---------------------------------------------------------------------------
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ---------------------------------------------------------------------------

apiVersion: camel.apache.org/v1alpha1
kind: Kamelet
metadata:
name: aws-ddb-experimental-sink
annotations:
camel.apache.org/kamelet.support.level: "Experimental"
camel.apache.org/catalog.version: "main-SNAPSHOT"
camel.apache.org/kamelet.icon: "data:image/svg+xml;base64,PHN2ZyBoZWlnaHQ9IjEwMCIgd2lkdGg9IjEwMCIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj48cGF0aCBmaWxsPSIjMkQ3MkI4IiBkPSJNNzQuMTc0IDMxLjgwN2w3LjQzNyA1LjM2N3YtNy42MDJsLTcuNDgtOC43NjV2MTAuOTU3bC4wNDMuMDE1eiIvPjxwYXRoIGZpbGw9IiM1Mjk0Q0YiIGQ9Ik01OS44MzggODUuNjY2bDE0LjI5My03LjE0NlYyMC43OTFsLTE0LjMwMy03LjEyNHoiLz48cGF0aCBmaWxsPSIjMjA1Qjk4IiBkPSJNMzkuNDk2IDg1LjY2NkwyNS4yMDMgNzguNTJWMjAuNzkxbDE0LjMwMy03LjEyNHoiLz48cGF0aCBmaWxsPSIjMkQ3MkI4IiBkPSJNMzkuNTA2IDEzLjY2N2gyMC4zMjF2NzEuOTk5SDM5LjUwNnpNNzQuMTMxIDY3LjU2NFY3OC41Mmw3LjQ4LTguNzY0di03LjYwMmwtNy40MzcgNS4zOTd6TTc0LjEzMSA2Mi45MzZsLjA0My0uMDEgNy40MzctNHYtNy42NDlsLTcuNDguNjg4ek03NC4xNzQgMzYuNDI5bC0uMDQzLS4wMVY0Ny4zNWw3LjQ4LjY5OXYtNy42NDV6Ii8+PHBhdGggZmlsbD0iIzFBNDc2RiIgZD0iTTgxLjYxMSA0OC4wNDlsLTcuNDgtLjY5OS0xNC4zMDMtLjU3MkgzOS41MDZsLTE0LjMwMy41NzJWMzYuNDQzbC0uMDE1LjAwOC4wMTUtLjAzMiAxNC4zMDMtMy4zMTRINTkuODI4bDE0LjMwMyAzLjMxNCA1LjI1OCAyLjc5NXYtMS43OTdsMi4yMjItLjI0My03LjQ4LTUuNDEtMTQuMzAzLTQuNDMySDM5LjUwNmwtMTQuMzAzIDQuNDMyVjIwLjgwN2wtNy40OCA4Ljc2M3Y3LjY1M2wuMDU4LS4wNDIgMi4xNjQuMjM2djEuODM0bC0yLjIyMiAxLjE4OXY3LjYxNWwuMDU4LS4wMDYgMi4xNjQuMDMydjMuMTk2bC0xLjg2Ny4wMjgtLjM1NS0uMDM0djcuNjE4bDIuMjIyIDEuMTk1djEuODU1bC0yLjEyOS4yMzUtLjA5My0uMDd2Ny42NTJsNy40OCA4Ljc2NFY2Ny41NjRsMTQuMzAzIDQuNDMySDU5LjgyOGwxNC4zNDUtNC40NDUgNy40MzgtNS4zNjctMi4yMjItLjI0NXYtMS44MThsLTUuMjE2IDIuODA1LTE0LjM0NSAzLjI5NXYuMDA0SDM5LjUwNnYtLjAwNGwtMTQuMzQ4LTMuMjk1LS4wMjUtLjA1MS4wNy4wMzdWNTEuOTY1bDE0LjMwMy41N3YuMDE0SDU5LjgyOHYtLjAxNGwxNC4zMDMtLjU3IDcuNDgtLjY1Ni0yLjIyMi0uMDMydi0zLjE5NnoiLz48L3N2Zz4="
camel.apache.org/provider: "Apache Software Foundation"
camel.apache.org/kamelet.group: "AWS DynamoDB Streams"
labels:
camel.apache.org/kamelet.type: "sink"
spec:
definition:
title: "AWS DynamoDB Experimental Sink"
description: |-
Send data to Amazon DynamoDB. The sent data inserts, updates, or deletes an item on the specified AWS DynamoDB table.

The basic authentication method for the AWS DynamoDB service is to specify an access key and a secret key. These parameters are optional because the Kamelet provides a default credentials provider.

If you use the default credentials provider, the DynamoDB client loads the credentials through this provider and doesn't use the basic authentication method.

This Kamelet expects a JSON-formatted body and it must include the primary key values that define the DynamoDB item. The mapping between the JSON fields and table attribute values is done by key. For example, for '{"username":"oscerd", "city":"Rome"}' input, the Kamelet inserts or updates an item in the specified AWS DynamoDB table and sets the values for the 'username' and 'city' attributes.

This Kamelet supports an experimental input format option that specifies the data type given to this sink. The Kamelet makes a best effort to convert the provided input into the type that the sink requires.
required:
- table
- region
type: object
properties:
table:
title: Table
description: The name of the DynamoDB table.
type: string
accessKey:
title: Access Key
description: The access key obtained from AWS.
type: string
format: password
x-descriptors:
- urn:alm:descriptor:com.tectonic.ui:password
- urn:camel:group:credentials
secretKey:
title: Secret Key
description: The secret key obtained from AWS.
type: string
format: password
x-descriptors:
- urn:alm:descriptor:com.tectonic.ui:password
- urn:camel:group:credentials
region:
title: AWS Region
description: The AWS region to access.
type: string
enum: ["ap-south-1", "eu-south-1", "us-gov-east-1", "me-central-1", "ca-central-1", "eu-central-1", "us-iso-west-1", "us-west-1", "us-west-2", "af-south-1", "eu-north-1", "eu-west-3", "eu-west-2", "eu-west-1", "ap-northeast-3", "ap-northeast-2", "ap-northeast-1", "me-south-1", "sa-east-1", "ap-east-1", "cn-north-1", "us-gov-west-1", "ap-southeast-1", "ap-southeast-2", "us-iso-east-1", "ap-southeast-3", "us-east-1", "us-east-2", "cn-northwest-1", "us-isob-east-1", "aws-global", "aws-cn-global", "aws-us-gov-global", "aws-iso-global", "aws-iso-b-global"]
operation:
title: Operation
description: "The operation to perform. The options are PutItem, UpdateItem, or DeleteItem."
type: string
default: PutItem
example: PutItem
writeCapacity:
title: Write Capacity
description: The provisioned throughput to reserve for writing resources to your table.
type: integer
default: 1
useDefaultCredentialsProvider:
title: Default Credentials Provider
description: If true, the DynamoDB client loads credentials through a default credentials provider. If false, it uses the basic authentication method (access key and secret key).
type: boolean
x-descriptors:
- 'urn:alm:descriptor:com.tectonic.ui:checkbox'
default: false
uriEndpointOverride:
title: Overwrite Endpoint URI
description: The overriding endpoint URI. To use this option, you must also select the `overrideEndpoint` option.
type: string
overrideEndpoint:
title: Endpoint Overwrite
description: Select this option to override the endpoint URI. To use this option, you must also provide a URI for the `uriEndpointOverride` option.
type: boolean
x-descriptors:
- 'urn:alm:descriptor:com.tectonic.ui:checkbox'
default: false
inputFormat:
title: Input Type
description: Specify the input type for this Kamelet. The Kamelet will automatically apply conversion logic in order to transform message content to this data type.
type: string
default: json
example: json
types:
in:
mediaType: application/json
dependencies:
- github:apache.camel-kamelets:camel-kamelets-utils:main-SNAPSHOT
- "camel:core"
- "camel:jackson"
- "camel:aws2-ddb"
- "camel:kamelet"
template:
beans:
- name: dataTypeRegistry
type: "#class:org.apache.camel.kamelets.utils.format.DefaultDataTypeRegistry"
- name: inputTypeProcessor
type: "#class:org.apache.camel.kamelets.utils.format.DataTypeProcessor"
property:
- key: scheme
value: 'aws2-ddb'
- key: format
value: '{{inputFormat}}'
- key: registry
value: '#bean:{{dataTypeRegistry}}'
from:
uri: "kamelet:source"
steps:
- set-property:
name: operation
constant: "{{operation}}"
- process:
ref: "{{inputTypeProcessor}}"
- to:
uri: "aws2-ddb:{{table}}"
parameters:
secretKey: "{{?secretKey}}"
accessKey: "{{?accessKey}}"
region: "{{region}}"
operation: "{{operation}}"
writeCapacity: "{{?writeCapacity}}"
useDefaultCredentialsProvider: "{{useDefaultCredentialsProvider}}"
uriEndpointOverride: "{{?uriEndpointOverride}}"
overrideEndpoint: "{{overrideEndpoint}}"
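For context, a sink Kamelet of this shape is typically referenced from a KameletBinding. The sketch below is hypothetical (the binding name, table, region, and timer payload are made up, not from the PR); it feeds the JSON body whose `username` and `city` keys map to DynamoDB attributes as described above, with `inputFormat` left at its `json` default:

```yaml
apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: ddb-experimental-demo
spec:
  source:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: timer-source
    properties:
      # JSON body; its keys become the DynamoDB item's attributes
      message: '{"username":"oscerd","city":"Rome"}'
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-ddb-experimental-sink
    properties:
      table: my-table        # hypothetical table name
      region: eu-west-1      # hypothetical region
      operation: PutItem
      inputFormat: json
```

With this wiring, the `inputTypeProcessor` bean in the Kamelet template resolves the `json` data type from the registry and converts the timer payload before the `aws2-ddb` endpoint is invoked.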