This application is based on this repository by Google.
Serverless Store is a basic prototype of an e-commerce web application, built on various Google Cloud Platform products and several third-party services. On this web app, users can both upload products to sell and purchase products from other users. For payments, Serverless Store connects to Stripe to process credit card payments and to SendGrid to send confirmation emails to users.
The Serverless Store web application runs on Kubernetes (Google Kubernetes Engine). It consists of two microservices that communicate internally using HTTP requests. There is also a load generator inside the Kubernetes cluster that constantly simulates user activity.
Outside of the web application (and the Kubernetes cluster), this application also uses several other GCP products:
Product | Usage |
---|---|
Cloud Functions | Performs event-driven jobs such as sending payments to Stripe, sending emails to customers, etc. |
Firebase Authentication | An OAuth 2.0-based authentication service that lets users log in through various providers (Email, Google, Facebook, etc.) |
Firestore | A NoSQL document database that stores user data |
Cloud AutoML | A machine learning service for training custom image classification models |
Cloud Vision | A pre-trained image classification service that can label numerous store products accurately |
Cloud Storage | Object storage service for files |
Stackdriver Logging | Centralized storage and analysis of program logs |
Serverless Store runs on several interconnecting products and components. The diagram below visualizes the relationship between those components:
Serverless Store is designed to be easy to set up and tear down as soon as the developer has a GCP project with billing enabled.
Before running this set-up guide, please make sure that you have the following:
- A Google Cloud Platform project with billing enabled.
- An account with the Owner role on the project mentioned above.
- Cloud SDK installed on your local machine (you can follow the installation guide here).
In order to use Firebase products for your own Serverless Store, you need to set up a Firebase project for your GCP project.
- Go to Firebase Console.
- Click "Add project" card on the Web UI.
- On the first step of project creation, select your GCP Project, and click next.
- You will be given a reminder on the effects of adding Firebase to a Google Cloud Project. Select Continue.
- Turn off the "Enable Google Analytics for this project" switch and click Add Firebase.
- Wait a few minutes. You will be automatically redirected to your project's Firebase dashboard.
- On the overview page, click Add App, then select the web button (</>).
- Set the app name as you wish (e.g. "serverless-store"), and click Register App.
- Jot down the configuration given afterwards, specifically this part of the code:
```js
var firebaseConfig = {
  apiKey: "YOUR_API_KEY",
  authDomain: "YOUR_AUTH_DOMAIN",
  databaseURL: "YOUR_DATABASE_URL",
  projectId: "YOUR_PROJECT_ID",
  storageBucket: "YOUR_STORAGE_BUCKET",
  messagingSenderId: "YOUR_MESSAGE_SENDER_ID",
  appId: "YOUR_APP_ID",
  measurementId: "YOUR_MEASUREMENT_ID"
};
```
- Download the `firebase_config.json` file for your new app.
- Place the configuration file into both your front end (`/src/frontend`) and back end (`/src/backend`) folders.
There are some additional steps required to get your Firestore database up and running:
- In your Cloud Console page, navigate to Firestore.
- Select Native Mode (important: this choice is permanent).
- Select a location, such as `us-central1`.
- Click Create Database. Your database will be created shortly.
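If you prefer the command line, recent versions of the Cloud SDK can create the database as well; a sketch (flags may differ across gcloud versions):

```shell
# Sketch: create the Firestore database from the CLI instead of the Console.
# Like the Console flow, this choice of mode and location is permanent.
gcloud firestore databases create \
  --location=us-central1
```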
Aside from creating the database, a few composite indexes are needed in order to ensure smooth data transactions between your app and Firestore:
- Open Firebase Console
- On the left drawer, go to Cloud Firestore
- Select the Indexes tab, and click Add Index.
- A pop-up will appear, asking you to specify the new index.
- Create the following five indexes (order does not matter):
Collection ID | First field | Second field |
---|---|---|
carts | uid (Ascending) | item_id (Ascending) |
carts | uid (Ascending) | modify_time (Descending) |
carts | uid (Ascending) | modify_time (Ascending) |
promos | label (Ascending) | score (Descending) |
promos | label (Ascending) | score (Ascending) |
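Alternatively, the same indexes can be created with the Cloud SDK; a sketch for the first `carts` index (the remaining four follow the same pattern):

```shell
# Sketch: create one composite index from the CLI instead of the Console.
# Repeat with the other field combinations for the remaining four indexes.
gcloud firestore indexes composite create \
  --collection-group=carts \
  --field-config=field-path=uid,order=ascending \
  --field-config=field-path=item_id,order=ascending
```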
Serverless Store connects with Stripe API for credit card payment processing. Doing so requires a developer to have a Stripe account.
- Go to Stripe main page.
- Sign up for a new Stripe account, following the instructions given.
- (Optional) If you so desire, you can further activate your Stripe account. This requires giving Stripe several more pieces of information, including but not limited to credit card information, bank details, and business purpose.
- Write down your test API secret key for later use. (Note: if you activated your Stripe account in the previous step, you can use your live API secret key instead.)
Note: If you are using a test API key, your Serverless Store will only accept test credit card numbers. The details are described here.
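To check that your test key works before wiring it into the app, you can create a test PaymentIntent directly against Stripe's API; a sketch, assuming your test secret key is exported as `STRIPE_SECRET_KEY` (an environment variable name chosen here for illustration):

```shell
# Sketch: confirm a test PaymentIntent using Stripe's built-in test
# payment method token. Only works with a test (sk_test_...) key.
curl https://api.stripe.com/v1/payment_intents \
  -u "${STRIPE_SECRET_KEY}:" \
  -d amount=1099 \
  -d currency=usd \
  -d payment_method=pm_card_visa \
  -d confirm=true
```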
Cloud AutoML is a suite of machine learning products that enables developers with limited machine learning expertise to train high-quality models specific to their business needs. It relies on Google’s state-of-the-art transfer learning and neural architecture search technology.
One of the more specific use cases of AutoML is image classification. In this step, you will train a machine learning model with AutoML by feeding it a set of labeled images.
- Go to Cloud Storage, click Create Bucket.
- Fill out the name of the bucket and set the location to `us-central1`.
- Collect a few images of pet and non-pet products. There should be at least 10 images for each group (so 20 in total).
- Create a new file named `manifest.csv`, and write out its content using the pattern `[TYPE],[FILE_PATH],[LABEL]`:
  - The `TYPE` element defines which dataset the image should go to. This can be `TRAIN`, `VALIDATION`, or `TEST`. AutoML requires at least 1 `VALIDATION` image, 1 `TEST` image, and 8 `TRAIN` images for each category.
  - The `FILE_PATH` element defines the image location. This would normally be in the format `gs://[bucket_name]/[image_name].png`.
  - The `LABEL` element defines the type of object represented in the image. This element would be `pets` for images of pet products, and `not_for_pets` for images of non-pet products.
  - This CSV file defines the full list of images to be used by AutoML. Make sure that you list every image in the file.
- Upload the `manifest.csv` file and all of the collected images to your newly created bucket.
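The steps above can be sketched as follows. The bucket name `my-pet-bucket` and the image file names are placeholders; a real manifest needs at least 10 rows per label:

```shell
# Sketch: generate a minimal manifest.csv following the
# [TYPE],[FILE_PATH],[LABEL] pattern described above.
cat > manifest.csv <<'EOF'
TRAIN,gs://my-pet-bucket/dog_toy_01.png,pets
TRAIN,gs://my-pet-bucket/cat_bed_01.png,pets
VALIDATION,gs://my-pet-bucket/leash_01.png,pets
TEST,gs://my-pet-bucket/collar_01.png,pets
TRAIN,gs://my-pet-bucket/mug_01.png,not_for_pets
VALIDATION,gs://my-pet-bucket/lamp_01.png,not_for_pets
EOF

# Then upload it alongside the images, e.g.:
#   gsutil cp manifest.csv gs://my-pet-bucket/
```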
Now that you have your training set inside a Cloud Storage bucket, you can create your AutoML model.
- Go to Cloud Vision Datasets, click New Data Set.
- Write the name of your new model (e.g. `pet_recommendations`).
- Ensure that the `Single-Label Classification` option is chosen, and click `Create Dataset`.
- On the `Import` tab, choose `Select a CSV file on Cloud Storage`.
- Write or browse your bucket's location, and click `Continue`.
- Wait a few minutes until all images are imported.
- Go to the `Train` tab and click `Start Training`.
- If asked, allocate 1 node for your AutoML model.
- Once training is done, go to the `Model` page and write down the `Model ID`. It should be in the format `ICANXXXXXXXXXX...`.
Google also offers an image classification service for more general purposes. Google Cloud’s Vision API offers powerful pre-trained machine learning models through REST and RPC APIs. For this application, we will use Cloud Vision to assign labels to product images and quickly classify them into several categories predefined by Google.
Before you can deploy your Kubernetes cluster, you need to push your container images to Google Container Registry:
- Go to folder `/src/frontend`.
- Run the command `gcloud builds submit --tag gcr.io/[YOUR_GCP_PROJECT_NAME]/frontend:latest`.
- Go to folder `/src/backend`.
- Run the command `gcloud builds submit --tag gcr.io/[YOUR_GCP_PROJECT_NAME]/backend:latest`.
- Go to folder `/src/loadgen`.
- Run the command `gcloud builds submit --tag gcr.io/[YOUR_GCP_PROJECT_NAME]/loadgen:latest`.
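Since the three builds only differ by folder and image name, they can also be run in one pass; a sketch, assuming the commands are run from the repository root and `PROJECT` holds your GCP project ID:

```shell
# Sketch: build and push all three images with Cloud Build in one loop.
PROJECT=my-gcp-project   # placeholder: replace with your GCP project ID
for svc in frontend backend loadgen; do
  (cd "src/${svc}" && gcloud builds submit --tag "gcr.io/${PROJECT}/${svc}:latest")
done
```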
The remaining components (Pub/Sub, Cloud Functions, Service Accounts, GKE, Cloud Storage) will be created using Terraform. For details on these components, and how you can deploy them separately, refer to their respective docs. Keep in mind, though, that you will need to sort out the components' dependencies yourself. This means identifying which components should be deployed before which, and adjusting each component's variables for your own configuration (e.g. changing the project name, GCS bucket name, etc. yourself).
- Ensure that you have Terraform installed on your local machine by running `terraform version`. If your machine does not have Terraform installed, you can install it here.
- Go to `/terraform/microservices` and run `terraform init` on the command line.
- Run `terraform apply`.
- Fill out your GCP project name, the AutoML model ID created in step 4, and the Stripe API key created in step 3 when prompted. If there are variables that you do not have or do not wish to use, you can input placeholder values for them.
- Type `yes` when prompted, and wait until Terraform finishes creating the remaining components. This may take 5-10 minutes.
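If you prefer a non-interactive run, Terraform also accepts variables on the command line. The variable names below are illustrative only; check `variables.tf` in `/terraform/microservices` for the actual names your configuration expects:

```shell
# Sketch: non-interactive apply. All -var names and values here are
# hypothetical placeholders; consult variables.tf for the real ones.
terraform apply -auto-approve \
  -var="project=my-gcp-project" \
  -var="automl_model_id=ICAN0000000000" \
  -var="stripe_api_key=sk_test_placeholder"
```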
Firebase Authentication restricts its service to specific domains only. In order to let your front end module use Firebase Authentication, you need to add its address to the authorized domain list.
- Go to Kubernetes Engine Services & Ingress.
- Copy the front end's LoadBalancer endpoint.
- Go to the Firebase Authentication Sign-In Method page by clicking Authentication on the main page navigation drawer.
- Enable Google as a Sign-In Provider.
- On the Authorized Domain list, click `Add Domain`.
- Paste the endpoint as the domain name, then click `Add`.
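The LoadBalancer endpoint can also be fetched from the cluster directly. The Service name `frontend` is an assumption here; list your Services with `kubectl get svc` to confirm the actual name:

```shell
# Sketch: read the front end's external IP from its LoadBalancer status.
# Requires kubectl to be pointed at your GKE cluster.
kubectl get service frontend \
  -o jsonpath='{.status.loadBalancer.ingress[0].ip}'
```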
Try to access your Serverless Store application by going to the front end endpoint you obtained in step 10.
If you can access the webpage without any problems, then congratulations! You have successfully set up your own Serverless Store application.
This repository also integrates tools related to DevOps (Jenkins, Locust) and Data Science (BigQuery, Data Studio). Learn more about how this repository utilizes these tools here: