compounds-data

Deployed API: https://g2nabpqw52.execute-api.eu-west-2.amazonaws.com/v1/

The API is built with the AWS Chalice Python framework; documentation can be found at https://aws.github.io/chalice/. Chalice provides a Flask-like routing syntax and deploys and configures the Python application onto AWS services (Amazon API Gateway and AWS Lambda).
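
For orientation, a Chalice route looks like the following. This is an illustrative sketch rather than this repository's actual application code; the app name, route, and payload are placeholders.

    # app.py -- illustrative Chalice sketch; names and payload are placeholders
    from chalice import Chalice

    app = Chalice(app_name="compounds-data")  # app_name here is an assumption


    @app.route("/compounds/{compound_id}")
    def get_compound(compound_id):
        # The real application would look the compound up in DynamoDB;
        # this sketch just echoes the path parameter back.
        return {"id": compound_id}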

Development

Quick reference

Initial setup

  1. Clone this repository

    git clone git@github.com:tom-clements/compounds-data.git
  2. Change into the api directory

    cd compounds-data/api
  3. Create python environment

    The Python version has to be supported by AWS Lambda: https://docs.aws.amazon.com/lambda/latest/dg/lambda-python.html.

    Latest supported version as of 2022-04-05: Python 3.9

    Please use a Python environment manager of your choice. The instructions here use Anaconda;
    installing it from https://www.anaconda.com/products/individual provides the conda CLI.

    conda create -n <venv_name> python=3.9
    conda activate <venv_name>
    conda install pip
  4. Install python dependencies

    pip install pytest black flake8 pytest-mypy
    pip install -r requirements.txt
  5. Start a local development environment

    chalice local --stage local

    The environment should be accessible from http://localhost:8000/ by default.
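
    Once the local server is running, you can sanity-check it with a request to the root path
    (this assumes a root route exists; adjust the path to any real route otherwise):

    curl http://localhost:8000/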

Module overview

chalicelib/

This folder has a reserved name: Chalice automatically packages its contents and deploys them onto AWS Lambda.
All application code should live inside it.

tests/

This folder contains all the tests for the application. Run them with:

pytest --mypy tests/
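
Tests can exercise routes through Chalice's built-in test client. The sketch below is illustrative only; it assumes the Chalice app object is exposed as app in app.py and that a root route exists.

    # tests/test_app.py -- illustrative sketch; assumes app.py exposes `app`
    from chalice.test import Client

    from app import app


    def test_index_returns_ok():
        # Spin up an in-memory test client and hit the (assumed) root route.
        with Client(app) as client:
            response = client.http.get("/")
            assert response.status_code == 200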

.chalice/config.json

The configuration file for the Chalice deployment. It defines the stages (such as local and v1) referenced by the chalice commands in this README.
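
As a rough illustration, a minimal Chalice config defining those two stages could look like the sketch below; the app name and per-stage options are assumptions, not this repository's actual values.

    {
      "version": "2.0",
      "app_name": "compounds-data",
      "stages": {
        "local": {
          "api_gateway_stage": "api"
        },
        "v1": {
          "api_gateway_stage": "v1"
        }
      }
    }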

Deployment

Configure AWS credentials

Follow the official guide at https://aws.github.io/chalice/quickstart.html and refer to the Credentials section.

Ensure the access key and secret key are set up for your local profile.
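
In practice this means an AWS config file along the lines of the sketch below (placeholder values; the region matches the deployed endpoint's eu-west-2, but confirm it against your own account):

    # ~/.aws/config -- placeholder values, illustrative only
    [default]
    aws_access_key_id=YOUR_ACCESS_KEY
    aws_secret_access_key=YOUR_SECRET_KEY
    region=eu-west-2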

Chalice deploy

Deploy using:

chalice deploy --stage v1

Endpoint

The endpoint is configured at: https://g2nabpqw52.execute-api.eu-west-2.amazonaws.com/v1/

Data ETL

The data is loaded into an Amazon DynamoDB table. Run the ETL process with:

cd etl
python run_pipeline.py --table-name "compounds"
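
Under the hood, a DynamoDB load of this kind boils down to batched put_item calls. The snippet below is an illustrative boto3 sketch, not the pipeline's actual code; the item shape (an id key plus arbitrary attributes) is an assumption.

    # Illustrative boto3 sketch of a DynamoDB batch load (not the actual pipeline code).
    import boto3


    def load_compounds(items, table_name="compounds"):
        # "compounds" matches the --table-name argument above; the item shape is assumed.
        table = boto3.resource("dynamodb").Table(table_name)
        with table.batch_writer() as batch:
            for item in items:
                batch.put_item(Item=item)


    if __name__ == "__main__":
        load_compounds([{"id": "example-compound", "name": "placeholder"}])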
