Why?
The immediate goal is to gain preliminary insights from the data. The data consists of transactional data plus some reference data, essentially written once but read many times for analytical purposes.
How?
It makes sense to load the transactional data (and the reference data as well) into S3, then use a variety of tools to explore it.
We start with Amazon Redshift: data is copied from S3 into Redshift for exploratory data analysis (EDA). Later, we will try Redshift Spectrum to query the data in place on S3 instead of copying it into Redshift.
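The S3-to-Redshift load is done with Redshift's COPY command. Below is a minimal sketch of a helper that assembles the statement; the table name, S3 path, and IAM role are illustrative assumptions, and the real load would execute the statement against the cluster over JDBC.

```java
// Sketch: assembling a Redshift COPY statement to load data from S3.
// All names below (table, bucket, role) are hypothetical placeholders.
public class CopyCommandBuilder {

    /** Builds a COPY statement; execute it against Redshift over JDBC. */
    static String buildCopy(String table, String s3Path, String iamRole) {
        return "COPY " + table
             + " FROM '" + s3Path + "'"
             + " IAM_ROLE '" + iamRole + "'"
             + " FORMAT AS CSV IGNOREHEADER 1";
    }

    public static void main(String[] args) {
        System.out.println(buildCopy(
                "transaction_log",
                "s3://example-bucket/transactions/",
                "arn:aws:iam::123456789012:role/RedshiftCopyRole"));
    }
}
```

COPY loads the files under the S3 prefix in parallel across the cluster's slices, which is why it is preferred over row-by-row inserts for this write-once workload.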
The Redshift query results will be consumed by a thin API layer (REST APIs).
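As a sketch of that thin API layer, the following framework-free example uses the JDK's built-in HttpServer to expose a single insight endpoint. The actual implementation would be a Spring Boot REST controller backed by Redshift queries; the endpoint path and JSON payload here are assumptions for illustration.

```java
// Framework-free sketch of the thin API layer over the analytics store.
// In the real service this would be a Spring Boot controller and the
// JSON would come from a Redshift query; path and payload are assumed.
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class InsightsApi {

    /** Starts the server on the given port (0 picks an ephemeral port). */
    public static HttpServer start(int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/insights/top-merchants", exchange -> {
            // Hard-coded stand-in for a Redshift query result.
            byte[] body = "[{\"merchant\":\"example\",\"txnCount\":42}]"
                    .getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }
}
```

Keeping the layer thin means it only translates HTTP requests into queries and query results into JSON, with no analytical logic of its own.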
What?
Model card data, customer data, merchant data, and transaction log data.
Do EDA to get some meaningful insights.
Expose the insights via REST APIs, built on a Spring Boot stack.
Time permitting, build a ReactJS UI.
The overarching objective is a fully working, end-to-end system that can be demonstrated. The demonstration should exhibit sound engineering practices, architectural maturity, design, logical thinking, and coding capability.
Exclusions
The system in this challenge is not expected to handle massive scale. The analytical queries can be tuned progressively for scale at a later time.
What next?
See #18