14:13:06 Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [, , ]}
14:13:06 Running with dbt=1.8.0
14:13:06 running dbt with arguments {'printer_width': '80', 'indirect_selection': 'eager', 'write_json': 'True', 'log_cache_events': 'False', 'partial_parse': 'True', 'cache_selected_only': 'False', 'profiles_dir': '/Users/john-nk/Projects/ra_dbt/ra_dbt', 'version_check': 'True', 'fail_fast': 'False', 'log_path': '/Users/john-nk/Projects/ra_dbt/ra_dbt/logs', 'debug': 'True', 'warn_error': 'None', 'use_colors': 'True', 'use_experimental_parser': 'False', 'no_print': 'None', 'quiet': 'False', 'warn_error_options': 'WarnErrorOptions(include=[], exclude=[])', 'introspect': 'True', 'invocation_command': 'dbt test --select test_type:unit --debug', 'log_format': 'default', 'target_path': 'None', 'static_parser': 'True', 'send_anonymous_usage_stats': 'True'}
14:13:06 target not specified in profile 'ra_dbt', using 'default'
14:13:06 Redshift adapter: Setting redshift_connector to ERROR
14:13:06 Redshift adapter: Setting redshift_connector.core to ERROR
14:13:06 Sending event: {'category': 'dbt', 'action': 'project_id', 'label': 'e8c67e90-84d0-4d49-a2fa-b316d6e2a2a9', 'context': []}
14:13:06 Sending event: {'category': 'dbt', 'action': 'adapter_info', 'label': 'e8c67e90-84d0-4d49-a2fa-b316d6e2a2a9', 'context': []}
14:13:06 Registered adapter: redshift=1.8.0
14:13:06 checksum: dbb12d416fdb55e3665e4828703bbedf40c5c90ed7ca6ff39bbbfb29b12b0a92, vars: {}, profile: , target: , version: 1.8.0
14:13:06 Partial parsing enabled: 0 files deleted, 0 files added, 0 files changed.
14:13:06 Partial parsing enabled, no changes found, skipping parsing
14:13:06 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
There are 1 unused configuration paths:
- models.example
14:13:06 Sending event: {'category': 'dbt', 'action': 'load_project', 'label': 'e8c67e90-84d0-4d49-a2fa-b316d6e2a2a9', 'context': []}
14:13:06 Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': 'e8c67e90-84d0-4d49-a2fa-b316d6e2a2a9', 'context': []}
14:13:06 Found 24 models, 24 data tests, 2 seeds, 60 sources, 603 macros, 1 unit test
14:13:06 Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'e8c67e90-84d0-4d49-a2fa-b316d6e2a2a9', 'context': []}
14:13:06
14:13:06 Acquiring new redshift connection 'master'
14:13:06 Acquiring new redshift connection 'list_data_warehouse_production_dbt_jnk'
14:13:06 Using redshift connection "list_data_warehouse_production_dbt_jnk"
14:13:06 On list_data_warehouse_production_dbt_jnk: BEGIN
14:13:06 Opening a new connection, currently in state init
14:13:06 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
14:13:06 Redshift adapter: Connecting to redshift with username/password based auth...
14:13:06 SQL status: SUCCESS in 0.0 seconds
14:13:06 Using redshift connection "list_data_warehouse_production_dbt_jnk"
14:13:06 On list_data_warehouse_production_dbt_jnk: /* {"app": "dbt", "dbt_version": "1.8.0", "profile_name": "ra_dbt", "target_name": "default", "connection_name": "list_data_warehouse_production_dbt_jnk"} */ select table_catalog as database, table_name as name, table_schema as schema, 'table' as type from information_schema.tables where table_schema ilike 'dbt_jnk' and table_type = 'BASE TABLE' union all select table_catalog as database, table_name as name, table_schema as schema, case when view_definition ilike '%create materialized view%' then 'materialized_view' else 'view' end as type from information_schema.views where table_schema ilike 'dbt_jnk'
14:13:07 SQL status: SUCCESS in 0.0 seconds
14:13:07 On list_data_warehouse_production_dbt_jnk: ROLLBACK
14:13:07 On list_data_warehouse_production_dbt_jnk: Close
14:13:07 Using redshift connection "master"
14:13:07 On master: BEGIN
14:13:07 Opening a new connection, currently in state init
14:13:07 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
14:13:07 Redshift adapter: Connecting to redshift with username/password based auth...
14:13:07 SQL status: SUCCESS in 0.0 seconds
14:13:07 Using redshift connection "master"
14:13:07 On master: /* {"app": "dbt", "dbt_version": "1.8.0", "profile_name": "ra_dbt", "target_name": "default", "connection_name": "master"} */ with relation as ( select pg_class.oid as relation_id, pg_class.relname as relation_name, pg_class.relnamespace as schema_id, pg_namespace.nspname as schema_name, pg_class.relkind as relation_type from pg_class join pg_namespace on pg_class.relnamespace = pg_namespace.oid where pg_namespace.nspname != 'information_schema' and pg_namespace.nspname not like 'pg\_%' ), dependency as ( select distinct coalesce(pg_rewrite.ev_class, pg_depend.objid) as dep_relation_id, pg_depend.refobjid as ref_relation_id, pg_depend.refclassid as ref_class_id from pg_depend left join pg_rewrite on pg_depend.objid = pg_rewrite.oid where coalesce(pg_rewrite.ev_class, pg_depend.objid) != pg_depend.refobjid ) select distinct dep.schema_name as dependent_schema, dep.relation_name as dependent_name, ref.schema_name as referenced_schema, ref.relation_name as referenced_name from dependency join relation ref on dependency.ref_relation_id = ref.relation_id join relation dep on dependency.dep_relation_id = dep.relation_id
14:13:07 SQL status: SUCCESS in 0.0 seconds
14:13:07 Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'e8c67e90-84d0-4d49-a2fa-b316d6e2a2a9', 'context': []}
14:13:07 On master: ROLLBACK
14:13:07 Using redshift connection "master"
14:13:07 On master: BEGIN
14:13:07 SQL status: SUCCESS in 0.0 seconds
14:13:07 On master: COMMIT
14:13:07 Using redshift connection "master"
14:13:07 On master: COMMIT
14:13:07 SQL status: SUCCESS in 0.0 seconds
14:13:07 On master: Close
14:13:07 Concurrency: 8 threads (target='default')
14:13:07
14:13:07 Began running node unit_test.ra_dbt.kevel_attribution.test_valid_outputs
14:13:07 1 of 1 START unit_test kevel_attribution::test_valid_outputs ...................
[RUN]
14:13:07 Re-using an available connection from the pool (formerly list_data_warehouse_production_dbt_jnk, now unit_test.ra_dbt.kevel_attribution.test_valid_outputs)
14:13:07 Began compiling node unit_test.ra_dbt.kevel_attribution.test_valid_outputs
14:13:07 Began executing node unit_test.ra_dbt.kevel_attribution.test_valid_outputs
14:13:07 Using redshift connection "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"
14:13:07 On unit_test.ra_dbt.kevel_attribution.test_valid_outputs: BEGIN
14:13:07 Opening a new connection, currently in state closed
14:13:07 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
14:13:07 Redshift adapter: Connecting to redshift with username/password based auth...
14:13:07 SQL status: SUCCESS in 0.0 seconds
14:13:07 Using redshift connection "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"
14:13:07 On unit_test.ra_dbt.kevel_attribution.test_valid_outputs: /* {"app": "dbt", "dbt_version": "1.8.0", "profile_name": "ra_dbt", "target_name": "default", "node_id": "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"} */ with bound_views as ( select ordinal_position, table_schema, column_name, data_type, character_maximum_length, numeric_precision, numeric_scale from information_schema."columns" where table_name = 'kevel_clean' ), unbound_views as ( select ordinal_position, view_schema, col_name, case when col_type ilike 'character varying%' then 'character varying' when col_type ilike 'numeric%' then 'numeric' else col_type end as col_type, case when col_type like 'character%' then nullif(REGEXP_SUBSTR(col_type, '[0-9]+'), '')::int else null end as character_maximum_length, case when col_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(col_type, '[0-9,]+'), ',', 1), '')::int else null end as numeric_precision, case when col_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(col_type, '[0-9,]+'), ',', 2), '')::int else null
end as numeric_scale from pg_get_late_binding_view_cols() cols(view_schema name, view_name name, col_name name, col_type varchar, ordinal_position int) where view_name = 'kevel_clean' ), external_views as ( select columnnum, schemaname, columnname, case when external_type ilike 'character varying%' or external_type ilike 'varchar%' then 'character varying' when external_type ilike 'numeric%' then 'numeric' else external_type end as external_type, case when external_type like 'character%' or external_type like 'varchar%' then nullif( REGEXP_SUBSTR(external_type, '[0-9]+'), '')::int else null end as character_maximum_length, case when external_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(external_type, '[0-9,]+'), ',', 1), '')::int else null end as numeric_precision, case when external_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(external_type, '[0-9,]+'), ',', 2), '')::int else null end as numeric_scale from pg_catalog.svv_external_columns where schemaname = 'dbt_jnk' and tablename = 'kevel_clean' ), unioned as ( select * from bound_views union all select * from unbound_views union all select * from external_views ) select column_name, data_type, character_maximum_length, numeric_precision, numeric_scale from unioned where table_schema = 'dbt_jnk' order by ordinal_position
14:13:07 SQL status: SUCCESS in 0.0 seconds
14:13:07 Writing injected SQL for node "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"
14:13:07 Using redshift connection "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"
14:13:07 On unit_test.ra_dbt.kevel_attribution.test_valid_outputs: /* {"app": "dbt", "dbt_version": "1.8.0", "profile_name": "ra_dbt", "target_name": "default", "node_id": "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"} */ with bound_views as ( select ordinal_position, table_schema, column_name, data_type, character_maximum_length, numeric_precision, numeric_scale from information_schema."columns" where table_name = 'mixpanel_clean' ),
unbound_views as ( select ordinal_position, view_schema, col_name, case when col_type ilike 'character varying%' then 'character varying' when col_type ilike 'numeric%' then 'numeric' else col_type end as col_type, case when col_type like 'character%' then nullif(REGEXP_SUBSTR(col_type, '[0-9]+'), '')::int else null end as character_maximum_length, case when col_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(col_type, '[0-9,]+'), ',', 1), '')::int else null end as numeric_precision, case when col_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(col_type, '[0-9,]+'), ',', 2), '')::int else null end as numeric_scale from pg_get_late_binding_view_cols() cols(view_schema name, view_name name, col_name name, col_type varchar, ordinal_position int) where view_name = 'mixpanel_clean' ), external_views as ( select columnnum, schemaname, columnname, case when external_type ilike 'character varying%' or external_type ilike 'varchar%' then 'character varying' when external_type ilike 'numeric%' then 'numeric' else external_type end as external_type, case when external_type like 'character%' or external_type like 'varchar%' then nullif( REGEXP_SUBSTR(external_type, '[0-9]+'), '')::int else null end as character_maximum_length, case when external_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(external_type, '[0-9,]+'), ',', 1), '')::int else null end as numeric_precision, case when external_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(external_type, '[0-9,]+'), ',', 2), '')::int else null end as numeric_scale from pg_catalog.svv_external_columns where schemaname = 'dbt_jnk' and tablename = 'mixpanel_clean' ), unioned as ( select * from bound_views union all select * from unbound_views union all select * from external_views ) select column_name, data_type, character_maximum_length, numeric_precision, numeric_scale from unioned where table_schema = 'dbt_jnk' order by ordinal_position
14:13:08 SQL status: SUCCESS in 0.0 seconds
14:13:08 Writing injected SQL for node "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"
14:13:08 Using redshift connection "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"
14:13:08 On unit_test.ra_dbt.kevel_attribution.test_valid_outputs: /* {"app": "dbt", "dbt_version": "1.8.0", "profile_name": "ra_dbt", "target_name": "default", "node_id": "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"} */ with bound_views as ( select ordinal_position, table_schema, column_name, data_type, character_maximum_length, numeric_precision, numeric_scale from information_schema."columns" where table_name = 'event_id_currency_mapping' ), unbound_views as ( select ordinal_position, view_schema, col_name, case when col_type ilike 'character varying%' then 'character varying' when col_type ilike 'numeric%' then 'numeric' else col_type end as col_type, case when col_type like 'character%' then nullif(REGEXP_SUBSTR(col_type, '[0-9]+'), '')::int else null end as character_maximum_length, case when col_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(col_type, '[0-9,]+'), ',', 1), '')::int else null end as numeric_precision, case when col_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(col_type, '[0-9,]+'), ',', 2), '')::int else null end as numeric_scale from pg_get_late_binding_view_cols() cols(view_schema name, view_name name, col_name name, col_type varchar, ordinal_position int) where view_name = 'event_id_currency_mapping' ), external_views as ( select columnnum, schemaname, columnname, case when external_type ilike 'character varying%' or external_type ilike 'varchar%' then 'character varying' when external_type ilike 'numeric%' then 'numeric' else external_type end as external_type, case when external_type like 'character%' or external_type like 'varchar%' then nullif( REGEXP_SUBSTR(external_type, '[0-9]+'), '')::int else null end as character_maximum_length, case when external_type like 'numeric%' then nullif(
SPLIT_PART(REGEXP_SUBSTR(external_type, '[0-9,]+'), ',', 1), '')::int else null end as numeric_precision, case when external_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(external_type, '[0-9,]+'), ',', 2), '')::int else null end as numeric_scale from pg_catalog.svv_external_columns where schemaname = 'dbt_jnk' and tablename = 'event_id_currency_mapping' ), unioned as ( select * from bound_views union all select * from unbound_views union all select * from external_views ) select column_name, data_type, character_maximum_length, numeric_precision, numeric_scale from unioned where table_schema = 'dbt_jnk' order by ordinal_position
14:13:08 SQL status: SUCCESS in 0.0 seconds
14:13:08 Writing injected SQL for node "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"
14:13:08 Writing injected SQL for node "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"
14:13:08 Using redshift connection "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"
14:13:08 On unit_test.ra_dbt.kevel_attribution.test_valid_outputs: /* {"app": "dbt", "dbt_version": "1.8.0", "profile_name": "ra_dbt", "target_name": "default", "node_id": "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"} */ create temporary table "test_valid_outputs__dbt_tmp151308545339" as ( select * from ( WITH __dbt__cte__kevel_clean as (
-- Fixture for kevel_clean
select cast('1' as integer) as flight, cast(null as timestamp without time zone) as date_time, cast('2024-03-15' as date) as flight_start_date, cast('2024-03-15' as date) as date, cast('1' as double precision) as hour, cast('2' as integer) as user_click_platform, cast('5' as numeric(38,2)) as clicks, cast('5' as numeric(38,2)) as spend union all select cast('1' as integer) as flight, cast(null as timestamp without time zone) as date_time, cast('2024-03-15' as date) as flight_start_date, cast('2024-03-16' as date) as date, cast('5' as double precision) as hour, cast('2' as integer) as user_click_platform, cast('9' as numeric(38,2)) as
clicks, cast('9' as numeric(38,2)) as spend ), __dbt__cte__mixpanel_clean as (
-- Fixture for mixpanel_clean
select cast(null as character varying(128)) as user_id, cast('mmEkDFFyxzaBApuhnaEhpgajycnqlFlzyCtr' as character varying(128)) as mp_reserved_insert_id, cast(null as timestamp without time zone) as date_time, cast('2024-03-16' as date) as date, cast('5' as double precision) as hour, cast(null as super) as cart_item_event_ids, cast('1834396' as integer) as event_id, cast(null as super) as cart_item_ids, cast(null as integer) as cart_item_id, cast('22' as character varying(128)) as cart_value, cast('1' as character varying(128)) as cart_size, cast(null as character varying(128)) as ticket_fee, cast('USD' as character varying(128)) as currency union all select cast(null as character varying(128)) as user_id, cast('hjpsrhwlrokylgapthAoytmirpCbdtahutkg' as character varying(128)) as mp_reserved_insert_id, cast(null as timestamp without time zone) as date_time, cast('2024-03-16' as date) as date, cast('5' as double precision) as hour, cast(null as super) as cart_item_event_ids, cast('1834396' as integer) as event_id, cast(null as super) as cart_item_ids, cast(null as integer) as cart_item_id, cast('34' as character varying(128)) as cart_value, cast('1' as character varying(128)) as cart_size, cast(null as character varying(128)) as ticket_fee, cast('USD' as character varying(128)) as currency ), __dbt__cte__event_id_currency_mapping as (
-- Fixture for event_id_currency_mapping
select cast('1' as integer) as event_id, cast('USD' as character varying(128)) as currency ), flights AS ( SELECT flight FROM __dbt__cte__kevel_clean GROUP BY 1 ), mixpanel AS ( SELECT CAST(mixpanel_clean.date AS DATE) AS date, mixpanel_clean.hour, mixpanel_clean.currency, mixpanel_clean.event_id,
-- , mixpanel_clean.cart_item_id
SUM(mixpanel_clean.cart_value) AS cart_value, --+ COALESCE(CAST(mixpanel_clean.ticket_fee as float),0.0)) as cart_value --now taking cart value from mixpanel but
including the ticket fee
SUM(mixpanel_clean.cart_size) AS cart_size, COUNT(DISTINCT mixpanel_clean.mp_reserved_insert_id) AS orders FROM __dbt__cte__mixpanel_clean AS mixpanel_clean
-- LEFT JOIN cart_value_table
-- ON cart_value_table.cart_item_id = mixpanel_clean.cart_item_id
-- AND cart_value_table.row_num = 1
INNER JOIN flights ON mixpanel_clean.event_id = CAST(flights.flight AS INT) GROUP BY 1, 2, 3, 4 ), kevel AS ( SELECT flight, --event_id in RA DB
date, hour, user_click_platform, flight_start_date, SUM(clicks) AS clicks, SUM(spend) AS spend FROM __dbt__cte__kevel_clean GROUP BY 1, 2, 3, 4, 5 ) SELECT COALESCE(CAST(kevel.flight AS INT), CAST(mixpanel.event_id AS INT)) AS event_id, --event_id in RA DB
COALESCE(kevel.date, mixpanel.date) AS date, COALESCE(kevel.hour, mixpanel.hour) AS hour, kevel.clicks, kevel.spend, kevel.spend / NULLIF(kevel.clicks, 0) AS cost_per_click, kevel.user_click_platform, mixpanel.cart_value AS conversion_value, mixpanel.cart_size AS number_of_conversions, COALESCE(mapping.currency, mixpanel.currency) AS currency, mixpanel.event_id AS mixpanel_event_id, kevel.flight_start_date, mixpanel.orders FROM kevel FULL JOIN mixpanel ON CAST(kevel.flight AS INT) = CAST(mixpanel.event_id AS INT) AND kevel.date = CAST(mixpanel.date AS DATE) AND kevel.hour = mixpanel.hour LEFT JOIN __dbt__cte__event_id_currency_mapping AS mapping ON CAST(kevel.flight AS INT) = mapping.event_id ) as __dbt_sbq where false limit 0 );
14:13:08 SQL status: SUCCESS in 0.0 seconds
14:13:08 Using redshift connection "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"
14:13:08 On unit_test.ra_dbt.kevel_attribution.test_valid_outputs: /* {"app": "dbt", "dbt_version": "1.8.0", "profile_name": "ra_dbt", "target_name": "default", "node_id": "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"} */ with bound_views as ( select ordinal_position, table_schema, column_name, data_type, character_maximum_length, numeric_precision, numeric_scale from
information_schema."columns" where table_name = 'test_valid_outputs__dbt_tmp151308545339' ), unbound_views as ( select ordinal_position, view_schema, col_name, case when col_type ilike 'character varying%' then 'character varying' when col_type ilike 'numeric%' then 'numeric' else col_type end as col_type, case when col_type like 'character%' then nullif(REGEXP_SUBSTR(col_type, '[0-9]+'), '')::int else null end as character_maximum_length, case when col_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(col_type, '[0-9,]+'), ',', 1), '')::int else null end as numeric_precision, case when col_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(col_type, '[0-9,]+'), ',', 2), '')::int else null end as numeric_scale from pg_get_late_binding_view_cols() cols(view_schema name, view_name name, col_name name, col_type varchar, ordinal_position int) where view_name = 'test_valid_outputs__dbt_tmp151308545339' ), external_views as ( select columnnum, schemaname, columnname, case when external_type ilike 'character varying%' or external_type ilike 'varchar%' then 'character varying' when external_type ilike 'numeric%' then 'numeric' else external_type end as external_type, case when external_type like 'character%' or external_type like 'varchar%' then nullif( REGEXP_SUBSTR(external_type, '[0-9]+'), '')::int else null end as character_maximum_length, case when external_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(external_type, '[0-9,]+'), ',', 1), '')::int else null end as numeric_precision, case when external_type like 'numeric%' then nullif( SPLIT_PART(REGEXP_SUBSTR(external_type, '[0-9,]+'), ',', 2), '')::int else null end as numeric_scale from pg_catalog.svv_external_columns where schemaname = 'None' and tablename = 'test_valid_outputs__dbt_tmp151308545339' ), unioned as ( select * from bound_views union all select * from unbound_views union all select * from external_views ) select column_name, data_type, character_maximum_length, 
numeric_precision, numeric_scale from unioned order by ordinal_position
14:13:08 SQL status: SUCCESS in 0.0 seconds
14:13:08 Writing runtime sql for node "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"
14:13:08 Using redshift connection "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"
14:13:08 On unit_test.ra_dbt.kevel_attribution.test_valid_outputs: /* {"app": "dbt", "dbt_version": "1.8.0", "profile_name": "ra_dbt", "target_name": "default", "node_id": "unit_test.ra_dbt.kevel_attribution.test_valid_outputs"} */
-- Build actual result given inputs
with dbt_internal_unit_test_actual as ( select event_id,mixpanel_event_id,flight_start_date,date,hour,user_click_platform,currency,conversion_value,number_of_conversions,orders,clicks,spend,cost_per_click, 'actual' as "actual_or_expected" from ( WITH __dbt__cte__kevel_clean as (
-- Fixture for kevel_clean
select cast('1' as integer) as flight, cast(null as timestamp without time zone) as date_time, cast('2024-03-15' as date) as flight_start_date, cast('2024-03-15' as date) as date, cast('1' as double precision) as hour, cast('2' as integer) as user_click_platform, cast('5' as numeric(38,2)) as clicks, cast('5' as numeric(38,2)) as spend union all select cast('1' as integer) as flight, cast(null as timestamp without time zone) as date_time, cast('2024-03-15' as date) as flight_start_date, cast('2024-03-16' as date) as date, cast('5' as double precision) as hour, cast('2' as integer) as user_click_platform, cast('9' as numeric(38,2)) as clicks, cast('9' as numeric(38,2)) as spend ), __dbt__cte__mixpanel_clean as (
-- Fixture for mixpanel_clean
select cast(null as character varying(128)) as user_id, cast('mmEkDFFyxzaBApuhnaEhpgajycnqlFlzyCtr' as character varying(128)) as mp_reserved_insert_id, cast(null as timestamp without time zone) as date_time, cast('2024-03-16' as date) as date, cast('5' as double precision) as hour, cast(null as super) as cart_item_event_ids, cast('1834396' as integer) as event_id,
cast(null as super) as cart_item_ids, cast(null as integer) as cart_item_id, cast('22' as character varying(128)) as cart_value, cast('1' as character varying(128)) as cart_size, cast(null as character varying(128)) as ticket_fee, cast('USD' as character varying(128)) as currency union all select cast(null as character varying(128)) as user_id, cast('hjpsrhwlrokylgapthAoytmirpCbdtahutkg' as character varying(128)) as mp_reserved_insert_id, cast(null as timestamp without time zone) as date_time, cast('2024-03-16' as date) as date, cast('5' as double precision) as hour, cast(null as super) as cart_item_event_ids, cast('1834396' as integer) as event_id, cast(null as super) as cart_item_ids, cast(null as integer) as cart_item_id, cast('34' as character varying(128)) as cart_value, cast('1' as character varying(128)) as cart_size, cast(null as character varying(128)) as ticket_fee, cast('USD' as character varying(128)) as currency ), __dbt__cte__event_id_currency_mapping as (
-- Fixture for event_id_currency_mapping
select cast('1' as integer) as event_id, cast('USD' as character varying(128)) as currency ), flights AS ( SELECT flight FROM __dbt__cte__kevel_clean GROUP BY 1 ), mixpanel AS ( SELECT CAST(mixpanel_clean.date AS DATE) AS date, mixpanel_clean.hour, mixpanel_clean.currency, mixpanel_clean.event_id,
-- , mixpanel_clean.cart_item_id
SUM(mixpanel_clean.cart_value) AS cart_value, --+ COALESCE(CAST(mixpanel_clean.ticket_fee as float),0.0)) as cart_value --now taking cart value from mixpanel but including the ticket fee
SUM(mixpanel_clean.cart_size) AS cart_size, COUNT(DISTINCT mixpanel_clean.mp_reserved_insert_id) AS orders FROM __dbt__cte__mixpanel_clean AS mixpanel_clean
-- LEFT JOIN cart_value_table
-- ON cart_value_table.cart_item_id = mixpanel_clean.cart_item_id
-- AND cart_value_table.row_num = 1
INNER JOIN flights ON mixpanel_clean.event_id = CAST(flights.flight AS INT) GROUP BY 1, 2, 3, 4 ), kevel AS ( SELECT flight, --event_id in RA DB
date, hour,
user_click_platform, flight_start_date, SUM(clicks) AS clicks, SUM(spend) AS spend FROM __dbt__cte__kevel_clean GROUP BY 1, 2, 3, 4, 5 ) SELECT COALESCE(CAST(kevel.flight AS INT), CAST(mixpanel.event_id AS INT)) AS event_id, --event_id in RA DB
COALESCE(kevel.date, mixpanel.date) AS date, COALESCE(kevel.hour, mixpanel.hour) AS hour, kevel.clicks, kevel.spend, kevel.spend / NULLIF(kevel.clicks, 0) AS cost_per_click, kevel.user_click_platform, mixpanel.cart_value AS conversion_value, mixpanel.cart_size AS number_of_conversions, COALESCE(mapping.currency, mixpanel.currency) AS currency, mixpanel.event_id AS mixpanel_event_id, kevel.flight_start_date, mixpanel.orders FROM kevel FULL JOIN mixpanel ON CAST(kevel.flight AS INT) = CAST(mixpanel.event_id AS INT) AND kevel.date = CAST(mixpanel.date AS DATE) AND kevel.hour = mixpanel.hour LEFT JOIN __dbt__cte__event_id_currency_mapping AS mapping ON CAST(kevel.flight AS INT) = mapping.event_id ) _dbt_internal_unit_test_actual ),
-- Build expected result
dbt_internal_unit_test_expected as ( select event_id, mixpanel_event_id, flight_start_date, date, hour, user_click_platform, currency, conversion_value, number_of_conversions, orders, clicks, spend, cost_per_click, 'expected' as "actual_or_expected" from ( select cast('1' as integer) as event_id, cast(null as integer) as mixpanel_event_id, cast('2024-03-15' as date) as flight_start_date, cast('2024-03-16' as date) as date, cast('1' as double precision) as hour, cast('2' as integer) as user_click_platform, cast('USD' as character varying(128)) as currency, cast(null as character varying(128)) as conversion_value, cast(null as character varying(128)) as number_of_conversions, cast(null as bigint) as orders, cast('5' as numeric(38,2)) as clicks, cast('5' as numeric(38,2)) as spend, cast('1' as numeric(38,4)) as cost_per_click union all select cast('1' as integer) as event_id, cast('1' as integer) as mixpanel_event_id, cast('2024-03-15' as date) as flight_start_date,
cast('2024-03-16' as date) as date, cast('5' as double precision) as hour, cast('2' as integer) as user_click_platform, cast('USD' as character varying(128)) as currency, cast('56' as character varying(128)) as conversion_value, cast('2' as character varying(128)) as number_of_conversions, cast('2' as bigint) as orders, cast('9' as numeric(38,2)) as clicks, cast('9' as numeric(38,2)) as spend, cast('1' as numeric(38,4)) as cost_per_click ) _dbt_internal_unit_test_expected )
-- Union actual and expected results
select * from dbt_internal_unit_test_actual union all select * from dbt_internal_unit_test_expected
14:13:09 Redshift adapter: Redshift error: UNION types numeric and character varying cannot be matched
14:13:09 On unit_test.ra_dbt.kevel_attribution.test_valid_outputs: ROLLBACK
14:13:09 On unit_test.ra_dbt.kevel_attribution.test_valid_outputs: Close
14:13:09 Runtime Error in unit_test test_valid_outputs (models/unit_tests.yml)
  An error occurred during execution of unit test 'test_valid_outputs'. There may be an error in the unit test definition: check the data types.
  Database Error
    UNION types numeric and character varying cannot be matched
14:13:09 1 of 1 ERROR kevel_attribution::test_valid_outputs ............................. [ERROR in 1.56s]
14:13:09 Finished running node unit_test.ra_dbt.kevel_attribution.test_valid_outputs
14:13:09 Using redshift connection "master"
14:13:09 On master: BEGIN
14:13:09 Opening a new connection, currently in state closed
14:13:09 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
14:13:09 Redshift adapter: Connecting to redshift with username/password based auth...
14:13:09 SQL status: SUCCESS in 0.0 seconds
14:13:09 On master: COMMIT
14:13:09 Using redshift connection "master"
14:13:09 On master: COMMIT
14:13:09 SQL status: SUCCESS in 0.0 seconds
14:13:09 On master: Close
14:13:09 Connection 'master' was properly closed.
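The failure comes from the final `union all` between the actual and expected CTEs: on the expected side, dbt casts `conversion_value` and `number_of_conversions` to `character varying(128)`, while the actual side produces numeric values from the model's aggregations. A minimal sketch of the same Redshift error, with column names chosen for illustration (this is an assumption about which columns mismatch, inferred from the casts visible in the compiled SQL above):

```sql
-- Reproduces the error: one branch yields numeric, the other varchar.
select cast(1 as numeric(38,2)) as conversion_value
union all
select cast('1' as character varying(128)) as conversion_value;
-- Redshift error: UNION types numeric and character varying cannot be matched

-- The branches combine cleanly once both sides agree on an explicit type:
select cast(1 as numeric(38,2)) as conversion_value
union all
select cast('1' as numeric(38,2)) as conversion_value;
```

Redshift (unlike some engines) will not implicitly reconcile `numeric` with `character varying` across a set operation, which is why dbt's hint points at the data types in the unit test definition.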
14:13:09 Connection 'unit_test.ra_dbt.kevel_attribution.test_valid_outputs' was properly closed.
14:13:09
14:13:09 Finished running 1 unit test in 0 hours 0 minutes and 2.50 seconds (2.50s).
14:13:09 Command end result
14:13:09
14:13:09 Completed with 1 error and 0 warnings:
14:13:09
14:13:09 Runtime Error in unit_test test_valid_outputs (models/unit_tests.yml)
  An error occurred during execution of unit test 'test_valid_outputs'. There may be an error in the unit test definition: check the data types.
  Database Error
    UNION types numeric and character varying cannot be matched
14:13:09
14:13:09 Done. PASS=0 WARN=0 ERROR=1 SKIP=0 TOTAL=1
14:13:09 Resource report: {"command_name": "test", "command_wall_clock_time": 3.1816416, "process_user_time": 1.549311, "process_kernel_time": 0.157369, "process_mem_max_rss": "117866496", "command_success": false, "process_in_blocks": "0", "process_out_blocks": "0"}
14:13:09 Command `dbt test` failed at 15:13:09.325323 after 3.18 seconds
14:13:09 Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [, , ]}
14:13:09 Flushing usage events
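One plausible fix, given that the fixture feeds `cart_value` and `cart_size` in as `character varying(128)` (see the `__dbt__cte__mixpanel_clean` fixture above), is to cast those columns to a concrete numeric type in the model before aggregating, so the actual output's types are unambiguous and match what the expected fixture is cast to. A standalone sketch using the same fixture values from the log (`'22'` and `'34'`); the target types here are illustrative, not confirmed against the project's sources:

```sql
-- Explicit casts before SUM make the aggregate's output type deterministic,
-- so dbt's actual/expected union compares numeric against numeric.
with mixpanel_clean as (
    select cast('22' as character varying(128)) as cart_value,
           cast('1' as character varying(128)) as cart_size
    union all
    select cast('34' as character varying(128)) as cart_value,
           cast('1' as character varying(128)) as cart_size
)
select
    sum(cast(cart_value as numeric(38,2))) as conversion_value,
    sum(cast(cart_size as integer))        as number_of_conversions
from mixpanel_clean;
-- conversion_value = 56.00, number_of_conversions = 2
```

Note that 56 and 2 are exactly the values the expected fixture asserts for the second row, so with agreeing types the unit test's comparison has a chance to pass rather than fail at the union step.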