eduardocerqueira/s3-pull-processor

s3-pull-processor

This project is a proof of concept (POC) for a client that uploads pipeline artifacts to an object store such as S3 and consumes those artifacts efficiently. The consumer is scalable, containerized, and ready to run on Kubernetes/OpenShift.

The diagram below shows the high-level workflow; the uploader CLI and processor boxes are the components implemented here.

The infrastructure needed to run this POC is essentially AWS S3 and AWS SQS.
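The README does not specify the SQS message format, but a minimal sketch of the kind of payload the uploader might publish is below. The bucket name, object key, and field names are assumptions for illustration, not the project's actual schema.

```python
import json

# Hypothetical SQS message body: tells the consumer where the
# artifact lives in S3 and which action to run on it.
message = {
    "bucket": "pipeline-artifacts",    # assumed bucket name
    "key": "ci/run-1234/results.xml",  # assumed object key
    "action": "import_to_ibutsu",      # action used in the demo below
}

body = json.dumps(message)
print(body)
```

Keeping only a reference to the object in the message (rather than the artifact itself) sidesteps the SQS message-size limit and lets S3 hold the payload.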

See the developer guide for instructions on running the project or contributing.

diagram

demo

Items 1 to 4 below show an end-to-end (e2e) execution from the uploader to the processor (consumer).

1. SQS and S3 are empty: no messages and no files

    sqs_empty s3_empty

2. simulating HOST A, where the artifact exists and needs to be uploaded to S3; a message is also sent to SQS. For this step the code test_host_producer was executed:

    host_A

3. check that there are messages in SQS and files in S3

    sqs_full s3_full

4. simulating HOST B, which consumes messages from SQS. For each message it downloads the artifact from S3, runs an action (in this case import_to_ibutsu), deletes the file from S3, and finally deletes the message from SQS. For this step the code test_host_consumer was executed:

    host_B
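The HOST A and HOST B steps above can be sketched as two small functions over boto3-style S3 and SQS clients. This is an illustrative sketch, not the project's actual code: the function names, the message schema, and the /tmp download path are assumptions.

```python
import json

def produce(s3, sqs, queue_url, bucket, key, path):
    """HOST A: upload an artifact to S3, then announce it on SQS."""
    s3.upload_file(path, bucket, key)
    sqs.send_message(
        QueueUrl=queue_url,
        MessageBody=json.dumps({"bucket": bucket, "key": key}),
    )

def consume_one(s3, sqs, queue_url, action):
    """HOST B: receive one message, download the artifact, run the
    action, then delete the object and the message."""
    resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1)
    for msg in resp.get("Messages", []):
        ref = json.loads(msg["Body"])
        local = "/tmp/" + ref["key"].rsplit("/", 1)[-1]
        s3.download_file(ref["bucket"], ref["key"], local)
        action(local)  # e.g. import_to_ibutsu
        # Clean up only after the action succeeds, so a crash leaves the
        # message in SQS to be retried after its visibility timeout.
        s3.delete_object(Bucket=ref["bucket"], Key=ref["key"])
        sqs.delete_message(
            QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"]
        )
```

Because each artifact is a self-contained message, any number of consumer pods can run this loop concurrently, which is what makes the processor scalable on k8s/OpenShift.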

See also the scenarios in the developer guide.

links
