- I am a Data Engineer & Mentor at Northcoders
- Passionate about problem-solving and seeing code through to fruition
- Eager to utilise recently acquired skills in a new and engaging professional setting
console.log(javascriptSkills)
{API, array methods, asynchronous, classes, conditional logic, constructors, functions, fundamentals, error handling, export/import, iteration, postgresql, recursion, regex, TDD};
console.log(javascriptModules)
{express, jest, node-postgres, supertest};
print(python_skills)
{AWS, comprehension, decorators, exceptions, generators, iterators, Mock, OOP, postgresql, pg8000, TDD, terraform}
print(python_modules)
{autopep8, bandit, boto3, flake8, moto, pandas, pg8000, pytest}
aws s3 ls s3://skills
Athena, CLI, CloudWatch, Deployment, EC2, Glue, Lambda, S3, Terraform
NC-DE-DataBakers
My latest project involved Python, YAML, PostgreSQL, AWS, and Terraform. Following an ETL approach, data was extracted from a hosted database using stored credentials and uploaded to an AWS S3 bucket, then pulled, transformed into star-schema tables, and converted to Parquet format. Finally, the data was loaded into a final AWS S3 bucket. The entire project was written to be deployed automatically using AWS Lambda and Terraform, with CloudWatch alerts.
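As a rough sketch of the extract and load steps (the bucket, table, and credential names here are illustrative, not the project's actual configuration), assuming pg8000, boto3, and pandas with pyarrow are available:

```python
# Minimal sketch of the extract and load steps; all names are illustrative.
import io

import boto3
import pandas as pd
import pg8000.native


def extract_table(table_name, creds):
    """Pull one source table into a DataFrame via pg8000."""
    conn = pg8000.native.Connection(
        user=creds["user"],
        password=creds["password"],
        host=creds["host"],
        database=creds["database"],
    )
    try:
        # table_name would come from a fixed allow-list, never user input
        rows = conn.run(f"SELECT * FROM {table_name};")
        columns = [col["name"] for col in conn.columns]
    finally:
        conn.close()
    return pd.DataFrame(rows, columns=columns)


def load_as_parquet(df, bucket, key):
    """Serialise a DataFrame to Parquet in memory and upload it to S3 (needs pyarrow)."""
    buffer = io.BytesIO()
    df.to_parquet(buffer, index=False)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=buffer.getvalue())


# e.g. load_as_parquet(extract_table("sales_order", creds), "processed-bucket", "sales_order.parquet")
```

In the project itself, these stages ran as separate Lambda handlers deployed by Terraform rather than as a single script.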
python-sql-pg8000
This project involved writing SQL ranging from fundamental queries to more advanced features such as UNION, INTERSECT, EXCEPT, window functions, conditional expressions, and COALESCE. PostgreSQL was integrated via the pg8000 module, and functions were written in Python using a TDD approach.
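A minimal sketch of that TDD pattern, using a hypothetical query helper and table name rather than the project's actual functions:

```python
# Hypothetical example: a pg8000-backed query helper and the pytest test written first.
from unittest.mock import Mock


def get_staff_names(conn):
    """Return a flat list of staff names from a pg8000 connection's run() result."""
    rows = conn.run("SELECT first_name FROM staff ORDER BY first_name;")
    return [row[0] for row in rows]


def test_get_staff_names_returns_flat_sorted_list():
    # The database is replaced with a Mock connection so the test stays fast and isolated.
    mock_conn = Mock()
    mock_conn.run.return_value = [["Ana"], ["Bob"]]
    assert get_staff_names(mock_conn) == ["Ana", "Bob"]
```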
SQL-Data-Normalisation
This project involves writing SQL to refactor and normalise data without mutating the original data. Further code transforms the normalised tables into a star schema.
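As an illustration of the idea (the table and column names are invented, not taken from the project), a denormalised source can be split into a dimension table and a fact table without altering the source rows:

```python
# Illustrative SQL only: a hypothetical raw_orders source normalised into a
# dimension and a fact table. The source table is read but never mutated.

CREATE_DIM_CUSTOMER = """
CREATE TABLE dim_customer (
    customer_id     SERIAL PRIMARY KEY,
    customer_name   TEXT NOT NULL,
    customer_email  TEXT NOT NULL UNIQUE
);
"""

POPULATE_DIM_CUSTOMER = """
INSERT INTO dim_customer (customer_name, customer_email)
SELECT DISTINCT customer_name, customer_email
FROM raw_orders;
"""

CREATE_FACT_ORDERS = """
CREATE TABLE fact_orders AS
SELECT r.order_id,
       r.order_total,
       d.customer_id            -- surrogate key from the new dimension
FROM raw_orders AS r
JOIN dim_customer AS d
  ON d.customer_email = r.customer_email;
"""

# Each statement would be executed in turn, e.g. with pg8000's Connection.run().
```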