ExchangeRatesAPI

Dirk Steynberg edited this page Jun 9, 2020 · 1 revision

ExchangeRatesAPI to S3

Preamble

DBtoS3 is proud to announce support for full load and replication for exchangeratesapi.io sources. exchangeratesapi.io is a free service for current and historical foreign exchange rates published by the European Central Bank.

Requirements

Database & AWS

  • No special access is required for the exchangeratesapi.io replication method
  • An AWS user with Read/Write access to S3 is required

Installation

The first step is to install DBtoS3 in your Python environment:

pip install dbtos3

Usage

Environment Setup

To begin, create an appropriate Python project directory. It is crucial that this directory is not moved or tampered with once replication has begun: DBtoS3 keeps its catalogue and logs here, and both are essential for successful loading, replication and continuous logging of the replication process.

In this directory, create an empty .env file and an empty app.py file, where we will set up our DBtoS3 ExchangeRatesAPI full load and replication tasks.

ProjectDirectory/
    |_ .env
    |_ app.py
    |_ catalog.db
    |_ Logs/
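Assuming a Unix-like shell, the layout above can be scaffolded as follows. Note that only the two empty files need to be created by hand; catalog.db and the Logs/ folder are generated by DBtoS3 itself once the application runs.

```shell
# create the project directory and the two empty files;
# catalog.db and Logs/ are generated by DBtoS3 on first run
mkdir -p ProjectDirectory
touch ProjectDirectory/.env ProjectDirectory/app.py
```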

We will first have a look at our .env file. Copy the text below into your .env file, then fill in the relevant credentials for the API and S3 sources.

#[s3 credentials]
AWS_SECRET_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_REGION=
S3_BUCKET=

#[exchange rates api]
EXCHANGE_S3_MAIN_KEY=

Once you have added your credentials, we will load them in our app.py file before making use of the DBtoS3 features.

import os

from dotenv import load_dotenv

import dbtos3

APP_ROOT = os.path.join(os.path.dirname(__file__))  # refers to application_top
load_dotenv(os.path.join(APP_ROOT, '.env'))


if __name__ == '__main__':
    pass

Now that we have our app ready to consume credentials, we can import the ExchangeRatesAPI Model and assign the max historical date needed.

It is good practice to set up two methods, one for the initial full load and a second that will run from then onwards as the standard replication process.

It is important to start with a full load covering a set number of days: this sets up the catalog that monitors changes and pushes any new data to S3 every time the application runs.

  • "start_at" refers to the max historical date of exchange rates for initial load
import os

from dotenv import load_dotenv

import dbtos3

APP_ROOT = os.path.join(os.path.dirname(__file__))  # refers to application_top
load_dotenv(os.path.join(APP_ROOT, '.env'))

###
# Setting up ExchangeRatesAPI replication and full-load
###

exchange_rates = dbtos3.ExchangesRatesReplicationMethod(
    region_name=os.getenv('AWS_REGION'),
    aws_access_key_id=os.getenv('AWS_SECRET_KEY_ID'),
    aws_secret_access_key=os.getenv('AWS_SECRET_ACCESS_KEY'),
    s3bucket=os.getenv('S3_BUCKET'),
    main_key=os.getenv('EXCHANGE_S3_MAIN_KEY')
)

def exchange_rates_full_load_methods():
    print('--   new process {}  ----------------------------'.format('exchange rate'))
    exchange_rates.full_load(start_at='2018-01-01')


def exchange_rates_replication_methods():
    print('--   new process {}  ----------------------------'.format('exchange rate'))
    exchange_rates.replicate()



if __name__ == '__main__':
    exchange_rates_full_load_methods()
    # exchange_rates_replication_methods()

Once you have run the application, you can find logs and process details in the Logs folder, and you should begin to see .json data arriving in the relevant bucket directory!
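After the initial full load has completed, swap the call under __main__ to exchange_rates_replication_methods() and run app.py on a schedule so new rates are picked up continuously. As an illustrative sketch (the schedule and path below are assumptions, not part of DBtoS3), a crontab entry that replicates hourly might look like:

```shell
# Hypothetical crontab entry: run replication hourly, changing into the
# project directory first so DBtoS3 finds catalog.db and Logs/ next to app.py
0 * * * * cd /path/to/ProjectDirectory && python app.py
```

Running from inside the project directory matters because, as noted above, DBtoS3 relies on the catalogue and logs living alongside app.py.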