joshuachoy/BT3103-Prototype

Project name: Panduh by Team Pikachu (BT3103)

Description: A free and comprehensive platform for learning Python's pandas library through real-world business problems faced by a multitude of industries.

Below are the steps to take in order to deploy our website.

  • Note that Steps 1 to 3 are to be completed before forking our project from GitHub.
  • Note that you will need to activate your GitHub Actions prior to forking; this ensures the correct setup in the AWS environment.
  1. AWS S3 Set-up.
  2. AWS IAM Set-up.
  3. AWS Lambda Set-up.
  4. Edit template.yaml file from GitHub.
  5. Upload all files onto S3 Bucket.
  6. Update links on Lambda function.
  7. Making Changes to .HTML File.
  8. Making Changes to lambda_function.py File.

Setting Up

1. AWS S3 Set-up.

S3 will store the files needed: the .html, .css, and .js files, the data CSV file, and the pandas library.

Go to S3 services.

a. Create a bucket in a particular region (remember this region, as you will need it later).
b. Go to "Permissions" of this bucket:
     i. Ensure that all checkboxes are unchecked under "Block public access".
     ii. Ensure that you have the following policy in "Bucket policy", replacing yourBucketName with your bucket's name.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::yourBucketName/*"
        }
    ]
}

c. Upload pandas_xlrd_3.7.zip (found in the repo) into the current bucket.
d. After uploading, click on the file and copy its Object URL.
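If you prefer to script the bucket policy instead of editing it by hand, a small helper can generate the JSON for any bucket name. This is just a convenience sketch (the function name and the example bucket name are hypothetical); it produces the same policy shown above:

```python
import json

def bucket_policy(bucket_name):
    """Build the public-read bucket policy shown above for the given bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            }
        ],
    }

# Print the policy for a hypothetical bucket name, ready to paste into the console.
print(json.dumps(bucket_policy("my-panduh-bucket"), indent=4))
```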

2. AWS IAM Set-up.

This policy will be used to form the connection between the different AWS services.

Go to IAM services.

a. Go to "Policies":
     i. Create policy.
     ii. Choose S3 services and select all S3 actions and all resources.
     iii. Choose DynamoDB services and select all actions and all resources.
     iv. Choose CloudWatch Logs and select all actions and all resources.
     v. Give your policy a name, then create policy.
b. Go to "Roles":
     i. Select type of trusted entity: AWS service
     ii. Select service: Lambda
     iii. Click on "Next: Permissions".
     iv. Select the policy that was created earlier.
     v. Proceed on to create role.
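For reference, selecting Lambda as the trusted entity attaches a trust relationship equivalent to the following policy. You should not need to edit it yourself; it is shown here only so you can verify it under the role's "Trust relationships" tab:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": { "Service": "lambda.amazonaws.com" },
            "Action": "sts:AssumeRole"
        }
    ]
}
```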

3. AWS Lambda Set-up.

Go to Lambda services and ensure that you are in the same region as before.

a. Create function:
     i. Give your lambda function a name.
     ii. Runtime: Python3.7
     iii. Execution role -> Use an existing role -> Select the one you have created earlier.
     iv. Add trigger -> API Gateway -> Create a new API -> Security: "Open"
     v. Click "Layers" -> Add a layer -> Give your layer a name. -> Upload a file from Amazon S3. -> Paste the link that was copied in 1. AWS S3 Set-up -> Runtime: Python3.7 -> Add layer to lambda function.
b. Go to "Basic settings" and set the Timeout to 5 min.
c. You can now use the following code to import pandas.

import pandas as pd
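To confirm the layer is wired up correctly, you can temporarily use a minimal smoke-test handler like the sketch below. The DataFrame contents are placeholder values (not from the project); the point is only that `import pandas` succeeds via the layer and the handler returns a valid response:

```python
import json
import pandas as pd  # resolved from the Lambda layer at runtime

def lambda_handler(event, context):
    # Placeholder data, just to prove the pandas import from the layer works.
    df = pd.DataFrame({"industry": ["retail", "finance"], "rows": [120, 80]})
    return {
        "statusCode": 200,
        "body": json.dumps({"total_rows": int(df["rows"].sum())}),
    }
```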

4. Edit template.yaml file from GitHub

Lines 21-29

Resources:
  PandaFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: industries/lambda_functions
      Handler: lambda_function.lambda_handler
      Runtime: python3.7
      Layers: 
        - arn:aws:lambda:us-east-1:895200778545:layer:pandas_layer:1      

a. Change PandaFunction to the name of your own AWS Lambda function.
b. Change 895200778545 to your AWS account number.
c. Change pandas_layer to the name of your created AWS Lambda layer.
d. Change 1 to the version number of your created AWS Lambda layer.

5. Upload all files onto S3 Bucket

a. After forking the repo and editing template.yaml, upload all files onto the S3 bucket created previously:
     i. Head to the S3 bucket created previously
     ii. Click on Upload
     iii. Select all the files that are present in this GitHub repository

b. On the S3 bucket, change ALL files to allow public access (note: using the Change All button may not be reliable):
     i. Click on an uploaded file that is listed on the S3 Bucket
     ii. Click on Actions -> Make public
     iii. Repeat the steps i and ii for all other files

6. Update links on Lambda function

a. Head back to the lambda function created previously
b. At line 28, replace mylambdajosh with the name of your S3 bucket created previously

obj = s3.get_object(Bucket = '<your bucket name>', Key = "short_data.csv")

c. At line 261, replace "mylambdajosh" with the name of your S3 bucket created previously as well

file_obj = s3.get_object(Bucket = '<your bucket name>', Key = "project-stock.html")
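For context, lambda_function.py then parses the object returned by s3.get_object with pandas. The sketch below simulates that step locally: an in-memory io.BytesIO with made-up CSV contents stands in for the streaming Body of the S3 response, so you can check the parsing logic without touching AWS:

```python
import io
import pandas as pd

# Stand-in for the streaming Body that s3.get_object returns;
# the CSV contents here are invented purely for illustration.
fake_body = io.BytesIO(b"industry,revenue\nretail,100\nfinance,250\n")

# pandas reads the file-like object exactly as it would read obj["Body"].
df = pd.read_csv(fake_body)
total = int(df["revenue"].sum())
```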

Making Changes to Files

1. Making Changes to .HTML File

1.1 Re-upload the new .html file onto S3 Bucket

     i. Head to S3 Bucket created previously
     ii. Click on Upload
     iii. Choose your newly updated .html file to upload it

1.2 Making the .html file publicly accessible

     i. Click on the file that is now listed on the S3 Bucket
     ii. Click on Actions
     iii. Under Actions click on Make public

1.3 Updating links on all other .html files

a. After uploading the edited .html file onto the S3 bucket, you now need to update the links in all other .html files that reference this newly edited .html file.
b. Replace the links with the S3 Object URL of this newly added .html file:
     i. To get the S3 Object URL link, click once on the file listed on the S3 Bucket
     ii. A panel will appear on the right of the screen
     iii. Copy the link under Overview -> Object URL

<link rel="stylesheet" href="https://mylambdajosh.s3.amazonaws.com/css/animate.css">

    Change mylambdajosh.s3.amazonaws.com to your own S3 bucket URL

c. If edits were made to multiple files, repeat the same steps: upload all the newly edited .html files and update their links to the corresponding S3 Object URLs.

2. Making Changes to lambda_function.py File

a. There is no need to re-upload the .py file.
b. Simply commit on GitHub and the Lambda function on AWS will be updated automatically.
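The automatic update is driven by the GitHub Actions workflow in the fork. As a rough sketch only (the actual workflow file, stack name, and secret names in this repo may differ; everything below is an assumption), a SAM-based deploy job might look like:

```yaml
name: deploy
on:
  push:
    branches: [master]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # Build and deploy the template.yaml edited in Step 4.
      - name: Deploy with SAM
        run: |
          sam build
          sam deploy --no-confirm-changeset \
            --stack-name panduh-stack \
            --capabilities CAPABILITY_IAM
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```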
