
Hi 👋, I'm Abdelnaem Alaref

Business Intelligence Engineer

After I grab my morning coffee, I start my work with the goal of building a data mart. So I check out a branch and start writing SQL with dbt, which usually looks like this:

{{
  config(
    alias='cool_data_mart_name',
    materialized='table'
  )
}}

WITH fact_table AS (
  SELECT * FROM {{ref('fact_table_name')}} WHERE [condition]
  ),

dimension_table_1 AS (
  SELECT
    {{ dbt_utils.star(
        from=ref('dimension_table_1_name'),
        except=[UNWANTED_COLUMN_NAMES],
        relation_alias='d1'
        )}}
  FROM
    {{ref('dimension_table_1_name')}} AS d1
  WHERE
    [condition]
  ),

dimension_table_2 AS (
  SELECT * FROM {{ref('dimension_table_2_name')}} WHERE [condition]
  ),

final AS (
  SELECT
    f.col1,
    d1.*,
    COALESCE(d2.col123, 'default_value') AS col123,
    ROW_NUMBER() OVER( PARTITION BY id ORDER BY event_at DESC) AS deduplicate
  FROM
    fact_table AS f
  INNER JOIN
    dimension_table_1 AS d1
  ON
    f.foreign_1 = d1.primary
  LEFT JOIN
    dimension_table_2 AS d2
  ON
    f.foreign_2 = d2.primary
  WHERE
    [insert mart conditions]
  QUALIFY
   deduplicate = 1
)
SELECT * FROM final
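
Under the hood, dbt resolves each {{ ref(...) }} above at compile time into the fully qualified relation name in the warehouse. As a minimal sketch, assuming the project builds into a hypothetical analytics database and marts schema, the compiled fact_table CTE would look roughly like:

WITH fact_table AS (
  SELECT * FROM analytics.marts.fact_table_name WHERE [condition]
  ),
-- ...and the other CTEs compile the same way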

Now onto cool_data_mart_name.yml to write some tests and documentation!

version: 2

models:
  - name: cool_data_mart_name
    columns:
      - name: id
        description: Unique identifier of the record
        tests:
          - unique
          - not_null
          - dbt_utils.relationships_where:
              to: source('mixpanel', 'events')
              field: event_id

      - name: is_condition
        description: Boolean column showing whether some condition is true or false
        tests:
          - accepted_values:
              values: [TRUE, FALSE]
              quote: false

      - name: first_name
        description: String column showing the name of the action taker
        tests:
          - trimmed_spaces


    tests:
      - dbt_utils.expression_is_true:
          expression: "price >= 0"
          condition: "price IS NOT NULL"

But it is not always this easy, is it?

As you can see above, we needed a custom schema test / macro to check that string columns coming from the backend have no leading or trailing spaces (so that we can easily group by them).

Implementation:

{% macro test_trimmed_spaces(model, column_name) %}

-- Distinct raw values of the column (NULLs coalesced to empty strings)
WITH untrimmed AS (

    SELECT DISTINCT
        COALESCE({{ column_name }}, '') AS untrimmed_name
    FROM {{ model }}

),

-- The same values with leading / trailing spaces removed
trimmed AS (

    SELECT DISTINCT
        TRIM(COALESCE({{ column_name }}, '')) AS trimmed_name
    FROM {{ model }}

)

-- Any raw value with no exact match in the trimmed set must contain
-- leading or trailing spaces, so count those values as failures
SELECT
    COUNT(*)
FROM
    untrimmed
LEFT JOIN
    trimmed
ON
    untrimmed.untrimmed_name = trimmed.trimmed_name
WHERE
    trimmed_name IS NULL

{% endmacro %}
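
When dbt runs this test for the first_name column above, it renders the macro with the model and column substituted in. A minimal sketch of the rendered query, again assuming a hypothetical analytics.marts schema:

WITH untrimmed AS (
    SELECT DISTINCT COALESCE(first_name, '') AS untrimmed_name
    FROM analytics.marts.cool_data_mart_name
),
trimmed AS (
    SELECT DISTINCT TRIM(COALESCE(first_name, '')) AS trimmed_name
    FROM analytics.marts.cool_data_mart_name
)
SELECT COUNT(*)
FROM untrimmed
LEFT JOIN trimmed
    ON untrimmed.untrimmed_name = trimmed.trimmed_name
WHERE trimmed_name IS NULL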

This is just a sample of how I fully cover the columns with documentation, explanations, and tests that communicate and check assumptions about the data, then fix any problems that come up and investigate any broken assumptions.
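
For example, if trimmed_spaces fails on first_name, a quick ad-hoc query like this (the schema name is again a stand-in) surfaces the offending values so the upstream source can be fixed:

SELECT DISTINCT
    first_name
FROM
    analytics.marts.cool_data_mart_name
WHERE
    first_name != TRIM(first_name)   -- values with leading or trailing spaces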

Speaking of data warehouses, I previously used Amazon Redshift, so I am experienced with that as well :)

No, I am not done yet!

The rest of my role is building charts and dashboards to help the business, using Tableau and Power BI.

Connect with me:

abdelnaem-alaref abdelnaemalaref abdelnaemalaref

Languages and Tools:

Databases, Power BI, Excel, Figma, Git, HTML5, MySQL, pandas, Python, Seaborn


Popular repositories

  1. ETL_Data_Pipline_Using_Python (Jupyter Notebook): ETL data pipeline
  2. Telecom_dashboard (TSQL)
  3. Import-Company_Vis: Used Power BI to create visual representations of data, uncovering patterns for analysis and enhancing data clarity and readability
  4. card.robot (HTML)
  5. Investigate_a_Dataset_movie (Jupyter Notebook)
  6. traffic_violaions (Jupyter Notebook): This dataset contains around 65k+ traffic-related violation records.