The default implementation of `is_incremental()` bundles four conditions. The second condition is always false on Spark, because relation objects don't naturally have a value for `type`. Instead, relation type is handled by the `spark_get_relation_type` macro, which infers it from:

```sql
show tblproperties ... ('view.default.database')
```
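For context, the default macro in dbt-core reads roughly as follows (paraphrased; the exact code varies by dbt version, so treat this as a sketch rather than the canonical source). The `relation.type == 'table'` check is the second condition described above:

```jinja
{% macro is_incremental() %}
  {#-- do not run introspective queries at parse time #}
  {% if not execute %}
    {{ return(False) }}
  {% else %}
    {% set relation = adapter.get_relation(this.database, this.schema, this.table) %}
    {{ return(relation is not none
              and relation.type == 'table'
              and model.config.materialized == 'incremental'
              and not flags.FULL_REFRESH) }}
  {% endif %}
{% endmacro %}
```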
There are a few potential fixes:

1. Reimplement `is_incremental()` for dbt-spark
2. Update the `relation` object with its `type` as part of the `incremental` materialization, since we're already calling `spark_get_relation_type` here
3. At the beginning of runs, cache all info on dbt relations, including relation types (similar to `get_catalog`, although this takes a while to run)
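The second option could look something like the following sketch inside the incremental materialization (hypothetical, untested code; it assumes `BaseRelation.incorporate` accepts a `type` argument and that `spark_get_relation_type` can be called on the relation, neither of which is confirmed in this issue):

```jinja
{#-- Sketch of fix (2): stamp the inferred type onto the cached relation
     so that is_incremental()'s relation.type == 'table' check can pass. #}
{%- set old_relation = adapter.get_relation(database=this.database,
                                            schema=this.schema,
                                            identifier=this.identifier) -%}
{%- if old_relation is not none and old_relation.type is none -%}
  {#-- incorporate() returns a copy of the relation with the given fields set #}
  {%- set old_relation = old_relation.incorporate(
        type=spark_get_relation_type(old_relation)) -%}
{%- endif -%}
```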