This repository is for educational purposes only. The goal is to touch on microservices, cloud, scalability, and cost-efficiency topics through the lens of MLOps. If you actually want to build an ML pipeline on GCP, just write a Kubeflow pipeline natively on Vertex AI like a normal human being.
A blog post describing this simple GCP pipeline example can be found here.
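
For reference, here is a minimal sketch of what that "native" approach could look like: a tiny KFP v2 pipeline compiled and submitted to Vertex AI Pipelines. It assumes the `kfp` and `google-cloud-aiplatform` packages are installed, and the project ID, region, and bucket below are placeholders, not values from this repo.

```python
# Minimal sketch: a trivial Kubeflow (KFP v2) pipeline submitted to Vertex AI Pipelines.
# Placeholder project/bucket values -- replace with your own before running.
from kfp import dsl, compiler
from google.cloud import aiplatform


@dsl.component(base_image="python:3.11")
def say_hello(name: str) -> str:
    # A stand-in component; a real pipeline would preprocess data, train, evaluate, etc.
    message = f"Hello, {name}!"
    print(message)
    return message


@dsl.pipeline(name="hello-vertex-pipeline")
def hello_pipeline(name: str = "MLOps"):
    say_hello(name=name)


if __name__ == "__main__":
    # Compile the pipeline definition to a spec file, then launch it on Vertex AI.
    compiler.Compiler().compile(hello_pipeline, "hello_pipeline.json")

    aiplatform.init(project="your-gcp-project", location="us-central1")  # placeholders
    job = aiplatform.PipelineJob(
        display_name="hello-vertex-pipeline",
        template_path="hello_pipeline.json",
        pipeline_root="gs://your-bucket/pipeline-root",  # placeholder bucket
    )
    job.run()
```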