A basic Product Logistics API that can be used to track product transactions and their delivery to different cities.
The `app/data/data.csv` file can be replaced according to your needs, since the `data_table` is constructed from this CSV file. For testing purposes, `dumpCSV.py` uses the `faker` library to generate a fake dataset that is used to run the demo. This CSV file is saved at `app/data/data.csv`.
```
product_logistics_db=> \d data_table;
                          Table "public.data_table"
      Column       |            Type             | Collation | Nullable | Default
-------------------+-----------------------------+-----------+----------+---------
 transaction_id    | uuid                        |           | not null |
 transaction_time  | timestamp without time zone |           |          |
 product_name      | character varying           |           |          |
 quantity          | integer                     |           |          |
 unit_price        | double precision            |           |          |
 total_price       | double precision            |           |          |
 delivered_to_city | character varying           |           |          |
```
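Since the `.env` section below mentions `SQLALCHEMY_DATABASE_URL`, the project presumably maps this table with SQLAlchemy. A sketch of a matching declarative model follows; the class name is illustrative, and treating `transaction_id` as the primary key is an assumption (the `\d` output above only shows it as `not null`):

```python
import uuid

from sqlalchemy import Column, DateTime, Float, Integer, String
from sqlalchemy.dialects.postgresql import UUID
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class DataTable(Base):
    """Mirrors the public.data_table schema shown above."""

    __tablename__ = "data_table"

    # Assumed primary key; psql output only shows NOT NULL.
    transaction_id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    transaction_time = Column(DateTime)   # timestamp without time zone
    product_name = Column(String)         # character varying
    quantity = Column(Integer)
    unit_price = Column(Float)            # double precision
    total_price = Column(Float)
    delivered_to_city = Column(String)
```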
- Install PostgreSQL on your OS.
- Run the following commands to create a user as well as a database:

```shell
sudo su postgres
psql -c "CREATE USER product_logistics_api WITH ENCRYPTED PASSWORD 'product_logistics_api';"
psql -c "CREATE DATABASE product_logistics_db;"
psql -c "GRANT ALL PRIVILEGES ON DATABASE product_logistics_db TO product_logistics_api;"
```
- Create a `.env` file at the root directory which will contain `SQLALCHEMY_DATABASE_URL` and `SECRET_KEY` for the API.
- Check `.env.ex` for an example.
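For the user and database created above, the `.env` file might look like the sketch below; the exact variable format depends on `.env.ex`, and the secret key is a placeholder you must replace (e.g. with the output of `openssl rand -hex 32`):

```
SQLALCHEMY_DATABASE_URL=postgresql://product_logistics_api:product_logistics_api@localhost/product_logistics_db
SECRET_KEY=replace-with-a-long-random-string
```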
- This API was developed with Python 3.8.5; your system is expected to have Python 3.8.5 or above installed.
- A default admin user with username `admin` and password `admin` will be created. You are expected to update the password using the `/user/update/` endpoint before using the API in production.
- Install dependencies using `pip install -r requirements.txt`.
- Run the API using `python main.py`.
- Go to the URL below to view the Swagger UI. It lists all the endpoints, and you can also execute the GET and POST requests from the UI itself: `http://0.0.0.0:5000/docs#/`
- `/access_token`: Get the access token required to use the `/user` and `/data` endpoints. The access token expires in 30 minutes, after which you will need to request it again. This is hardcoded and can be changed according to your needs by editing the `ACCESS_TOKEN_EXPIRE_MINUTES` variable in `main.py`.
- `/users/me`: Get info about the currently logged-in user.
- `/user/create/`: Create a new admin user.
- `/user/update/`: Update the password of an existing admin user.
- `/user/delete/`: Delete an existing admin user.
- `/users/`: Get info of all admin users.
- `/users/{user_id}`: Get info of an admin user by `user_id`.
- `/data/upload`: Upload the created CSV file to the server. The contents of the file are moved to a database (either PostgreSQL or MySQL, as configured).
- `/data/all`: Query all rows from the database with pagination (entries per page should be passed as a query parameter).
- `/data/filter_by`:
  - Query rows from the database with filters. The following filters are supported: (a) date range, (b) total price range, (c) quantity range, (d) city name.
  - The filter parameters are `city`, `date`, `total_price` and `quantity`.
  - The query result is downloaded as a CSV file if the `save_as_csv=true` parameter is passed in the query. It is saved as `app/data/filter_result.csv`.
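As a sketch of how a `/data/filter_by` query string could be assembled from those parameters (the value formats, e.g. how a range is encoded, are assumptions not specified by this README):

```python
from urllib.parse import urlencode


def build_filter_query(city=None, date=None, total_price=None,
                       quantity=None, save_as_csv=False):
    """Assemble the query string for /data/filter_by from the documented parameters."""
    params = {}
    if city is not None:
        params["city"] = city
    if date is not None:
        params["date"] = date                # e.g. "2021-01-01,2021-12-31" (format assumed)
    if total_price is not None:
        params["total_price"] = total_price  # e.g. "10.0,500.0" (format assumed)
    if quantity is not None:
        params["quantity"] = quantity        # e.g. "1,100" (format assumed)
    if save_as_csv:
        params["save_as_csv"] = "true"       # result saved to app/data/filter_result.csv
    return "/data/filter_by?" + urlencode(params)


# Example: filter by city and save the result as a CSV file.
print(build_filter_query(city="Boston", save_as_csv=True))
# → /data/filter_by?city=Boston&save_as_csv=true
```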
- `/data/update/{transaction_id}`: Update any row in the database.
- `/data/delete`: Delete an entry in the database based on the given input.
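Assuming the API follows the standard FastAPI OAuth2 password flow (a form-encoded `username`/`password` body; this README does not show the actual request shape, so treat it as an assumption), a minimal client for the endpoints above might look like:

```python
import requests

BASE_URL = "http://0.0.0.0:5000"  # adjust to wherever the API is running


def get_access_token(username: str, password: str) -> str:
    """Request a bearer token from /access_token (OAuth2 password flow assumed)."""
    resp = requests.post(
        f"{BASE_URL}/access_token",
        data={"username": username, "password": password},  # form-encoded body
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def auth_header(token: str) -> dict:
    """Build the Authorization header used by the /user and /data endpoints."""
    return {"Authorization": f"Bearer {token}"}


if __name__ == "__main__":
    # Uses the default admin credentials; change the password before production.
    token = get_access_token("admin", "admin")
    print(requests.get(f"{BASE_URL}/users/me", headers=auth_header(token)).json())
```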
`dumpCSV.py`, which can be found in the `/app/utils` directory, uses the `faker` Python library to build 1000 records of fake data, saved as a CSV file at `app/data/data.csv`.
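A stdlib-only sketch of what `dumpCSV.py` does follows; the real script uses `faker`, and while the column order matches the `data_table` schema, the product names, cities, and value ranges here are illustrative:

```python
import csv
import random
import uuid
from datetime import datetime, timedelta

PRODUCTS = ["Widget", "Gadget", "Sprocket"]  # placeholder names
CITIES = ["Boston", "Chicago", "Denver"]     # placeholder names


def make_row():
    """Build one fake transaction matching the data_table columns."""
    quantity = random.randint(1, 50)
    unit_price = round(random.uniform(1.0, 100.0), 2)
    timestamp = datetime(2021, 1, 1) + timedelta(minutes=random.randint(0, 525600))
    return {
        "transaction_id": str(uuid.uuid4()),
        "transaction_time": timestamp.isoformat(sep=" "),
        "product_name": random.choice(PRODUCTS),
        "quantity": quantity,
        "unit_price": unit_price,
        "total_price": round(quantity * unit_price, 2),
        "delivered_to_city": random.choice(CITIES),
    }


def dump_csv(path, n=1000):
    """Write n fake rows to a CSV file (app/data/data.csv in the real project)."""
    rows = [make_row() for _ in range(n)]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
    return rows


rows = dump_csv("data.csv", n=1000)
```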