Junior Python Developer Assessment

Overview

This project was created for the Junior Python Developer Assessment. It includes three main components: a database setup script, a REST API, and an ETL export script. The application uses SQLite as the database and FastAPI for the API layer.

Project Structure

python-assessment/
│
├── data/
│   ├── customers.csv
│   └── orders.csv
│
├── output/
│   └── export.csv
│
├── database_setup.py
├── api.py
├── etl_export.py
├── requirements.txt
└── README.md

How to Run the Application

  1. Create and activate a virtual environment

Windows

python -m venv venv
venv\Scripts\activate

  2. Install dependencies

pip install -r requirements.txt

  3. Run the database setup script

python database_setup.py

This will create the SQLite database and load the sample customer and order data.
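The repository's database_setup.py is not reproduced in this README; a minimal sketch of what such a script might look like is shown below. The table and column names follow the example API response further down; the database filename, function name, and SQL details are assumptions, not the project's actual code.

```python
import csv
import sqlite3


def setup_database(db_path, customers_csv, orders_csv):
    """Create the customers and orders tables and load the sample CSV data.

    Schema follows the example API response in this README; the real
    database_setup.py may differ in detail.
    """
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()
    cur.executescript("""
        CREATE TABLE IF NOT EXISTS customers (
            id INTEGER PRIMARY KEY,
            first_name TEXT NOT NULL,
            surname TEXT NOT NULL,
            email TEXT,
            status TEXT
        );
        CREATE TABLE IF NOT EXISTS orders (
            id INTEGER PRIMARY KEY,
            customer_id INTEGER REFERENCES customers(id),
            product_name TEXT,
            quantity INTEGER,
            unit_price REAL
        );
    """)
    # csv.DictReader yields one dict per row, which sqlite3.executemany
    # binds to the named :placeholders.
    with open(customers_csv, newline="") as f:
        cur.executemany(
            "INSERT INTO customers VALUES (:id, :first_name, :surname, :email, :status)",
            csv.DictReader(f),
        )
    with open(orders_csv, newline="") as f:
        cur.executemany(
            "INSERT INTO orders VALUES (:id, :customer_id, :product_name, :quantity, :unit_price)",
            csv.DictReader(f),
        )
    conn.commit()
    conn.close()
```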

  4. Start the API

uvicorn api:app --reload

Open the API documentation in your browser:

http://127.0.0.1:8000/docs

You can test the endpoint:

GET /customers/{customer_id}

Example:

/customers/1

  5. Run the ETL export script

Stop the API first (CTRL + C), then run:

python etl_export.py

The script extracts active customers and their orders from the database, transforms the data, and exports the results to a CSV file in the output folder.
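The extract-transform-export step described above might look something like this. The schema matches the example API response in this README, but the function name, output columns, and the use of a single SQL join are assumptions about how etl_export.py works, not its actual code.

```python
import csv
import os
import sqlite3


def export_active_customers(db_path, out_path):
    """Extract active customers and their orders, compute order totals,
    and write the combined rows to a CSV file."""
    conn = sqlite3.connect(db_path)
    conn.row_factory = sqlite3.Row
    # Transform in SQL: combine first_name and surname, and compute
    # each order's total as quantity * unit_price.
    rows = conn.execute(
        """
        SELECT c.first_name || ' ' || c.surname AS customer_name,
               o.product_name,
               o.quantity,
               o.unit_price,
               o.quantity * o.unit_price AS order_total
        FROM customers c
        JOIN orders o ON o.customer_id = c.id
        WHERE c.status = 'active'
        """
    ).fetchall()
    conn.close()
    # Create the output folder if it does not exist yet.
    out_dir = os.path.dirname(out_path)
    if out_dir:
        os.makedirs(out_dir, exist_ok=True)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(
            ["customer_name", "product_name", "quantity", "unit_price", "order_total"]
        )
        writer.writerows(tuple(r) for r in rows)
```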

Choices and Reasoning

SQLite was chosen because it is lightweight, requires no additional installation, and is suitable for a small local project like this assessment.

FastAPI was selected for the REST API because it is simple to implement, fast, and automatically generates interactive API documentation using Swagger UI.

The built-in Python csv module was used for reading and writing CSV files to avoid unnecessary dependencies.

Application Flow

  1. Sample customer and order data are stored in CSV files inside the data folder.
  2. The database_setup.py script creates the customers and orders tables and loads the sample data into a SQLite database.
  3. The api.py application reads from the database and provides an endpoint to retrieve a customer and their orders using the customer ID.
  4. The etl_export.py script queries the database for active customers and their orders, transforms the data by combining the first name and surname and calculating order totals, and exports the results to a CSV file.

Potential Improvements

If more time were available, the following improvements could be made:

  • Add automated tests
  • Use SQLAlchemy models instead of raw SQL queries
  • Add input validation and error handling
  • Add logging for debugging and monitoring
  • Containerise the project using Docker
  • Support environment variables for configuration

Example API Response

Request:

GET /customers/1

Response:

{
  "customer": {
    "id": 1,
    "first_name": "John",
    "surname": "Smith",
    "email": "john.smith@email.com",
    "status": "active"
  },
  "orders": [
    {
      "id": 1,
      "customer_id": 1,
      "product_name": "Laptop",
      "quantity": 1,
      "unit_price": 900
    },
    {
      "id": 2,
      "customer_id": 1,
      "product_name": "Keyboard",
      "quantity": 1,
      "unit_price": 75
    }
  ]
}
