We have a few interesting changes coming in and we decided to increase the dlt version to 0.5, which is a major release in our versioning scheme. Only a few changes can be considered breaking, and they may be important for people accessing library internals and building advanced data platforms with it.
ETA: first week of July
We will be announcing alpha releases, details at the bottom.
0.5.1
## Possible breaking changes

- `PageNumberPaginator` takes `base_page` and `page` arguments instead of `initial_page`. This allows paginating APIs that number pages ie. from 0 or from 1. (RESTClient: add integrations tests for paginators #1509)
- if a `dlt.source` or `dlt.resource` decorated function is passed a `None` in a default argument during a function call, it will be handled exactly like in a regular Python function call. Previously such `None` would request argument injection from configuration. Please read more here: (fixes config injection edge cases #1430)
- `dlt.config.value` and `dlt.secrets.value` were evaluating to `None` at runtime. Now they will evaluate to a sentinel value. All the existing code should be backward compatible. (fixes config injection edge cases #1430)
- the `full_refresh` flag of `dlt.pipeline` will be deprecated and replaced with `dev_mode`. (New "refresh" mode and "dev_mode" #1063) and (https://dlthub.com/devel/general-usage/pipeline#do-experiments-with-dev-mode)
- the `credentials` parameter is removed from `dlt.pipeline`. Please use destination factories to instantiate destinations with explicit credentials. (https://dlthub.com/devel/general-usage/destination#pass-explicit-credentials)
- the default resource extraction sequence has changed to `round_robin` from `fifo` as a default setting. You can switch back to the previous behaviour and learn more about what this means here: (https://dlthub.com/docs/reference/performance#resources-extraction-fifo-vs-round-robin)
- if you create an instance of a SPEC (ie. `SnowflakeCredentials`), it will not be marked as resolved even if all required fields are provided. Previously some were resolving and some were not. (Fix/1465 fixes snowflake auth credentials #1489)
- `parse_native_representation` never marks a config as resolved. Previously some were resolving and some were not. (Fix/1465 fixes snowflake auth credentials #1489)
- the deprecated `credentials` argument was removed from `dlt.pipeline`. (removes deprecated credentials argument from Pipeline #1537)
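The `None`-default and sentinel changes above can be sketched in plain Python. This is an illustrative model only, not dlt's actual injection machinery; `_CONFIG_VALUE`, `CONFIG`, and `get_data` are hypothetical names:

```python
# Illustrative sketch (plain Python, not dlt internals): a sentinel default
# distinguishes "argument not supplied -> inject from config" from an
# explicit None passed by the caller.
_CONFIG_VALUE = object()  # stands in for the sentinel that dlt.config.value now evaluates to

CONFIG = {"api_key": "key-from-config"}  # hypothetical configuration store

def get_data(api_key=_CONFIG_VALUE):
    if api_key is _CONFIG_VALUE:
        # argument not supplied by the caller -> inject from configuration
        return CONFIG["api_key"]
    # any explicit argument, including an explicit None, wins
    # (regular Python function-call semantics)
    return api_key

print(get_data())            # key-from-config (injected)
print(get_data("override"))  # override
print(get_data(None))        # None - no longer triggers config injection
```

Because the sentinel is a unique object rather than `None`, an explicit `None` can be passed through untouched, which is the behavior change described above.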
## Other changes

- `orjson` dependency range
- `upsert` merge strategy using the `MERGE` SQL statement will be available (Add `upsert` merge strategy #1466)

## Alpha release: 0.5.1a0
The following changes are available for testing:
- support `delta` tables with `delta-rs` on top of the `filesystem` destination. (Add Delta table support for `filesystem` destination #1382)
- `LanceDB` destination and examples (LanceDB Destination #1375)
- external files may be imported and loaded without extraction and normalization (https://dlthub.com/devel/general-usage/resource#import-external-files) - includes jsonl, csv, and parquet
- pick the loader file format for a particular resource (https://dlthub.com/devel/general-usage/resource#pick-loader-file-format-for-a-particular-resource)
- extended support for various csv formats (https://dlthub.com/devel/dlt-ecosystem/file-formats/csv#change-settings)
- csv support for snowflake (support csv file format in snowflake #1470, https://dlthub.com/devel/dlt-ecosystem/destinations/snowflake#custom-csv-formats)
- we'll support case sensitive and case insensitive modes for our destinations, ie. snowflake, redshift, bigquery, mssql etc. may work in both modes (allows naming conventions to be changed #998, https://dlthub.com/devel/general-usage/naming-convention)
- you'll be able to fully change the naming convention, ie. to have a LATIN-1 character set or create collision-free names (https://dlthub.com/devel/general-usage/naming-convention#write-your-own-naming-convention)
- two new naming conventions: `sql_cs_v1` (case sensitive) and `sql_ci_v1` (case insensitive) to create SQL-safe identifiers without snake case transformation (https://dlthub.com/devel/general-usage/naming-convention#available-naming-conventions)
- you'll be able to modify destination capabilities via destination factories (https://dlthub.com/devel/general-usage/destination#inspect-destination-capabilities)
- schemas will be reflected with a single SQL statement, which will make schema migrations faster
- the loader can handle many more jobs (files) than before; we tested with 30k jobs and it looks fine
- we are adding `refresh` modes to `pipeline.run` that allow dropping and recreating tables, with different granularity. (https://dlthub.com/devel/general-usage/pipeline#refresh-pipeline-data-and-state)
- if a `dlt.source` or `dlt.resource` decorated function is passed a `None` in a default argument during a function call, it will be handled exactly like in a regular Python function call. Previously such `None` would request argument injection from configuration. Please read more here: (fixes config injection edge cases #1430)
- `dlt.config.value` and `dlt.secrets.value` were evaluating to `None` at runtime. Now they will evaluate to a sentinel value. All the existing code should be backward compatible. (fixes config injection edge cases #1430)
- the default resource extraction sequence has changed to `round_robin` from `fifo` as a default setting. You can switch back to the previous behaviour and learn more about what this means here: (https://dlthub.com/docs/reference/performance#resources-extraction-fifo-vs-round-robin)
- if you create an instance of a SPEC (ie. `SnowflakeCredentials`), it will not be marked as resolved even if all required fields are provided. Previously some were resolving and some were not. (Fix/1465 fixes snowflake auth credentials #1489)
- `parse_native_representation` never marks a config as resolved. Previously some were resolving and some were not. (Fix/1465 fixes snowflake auth credentials #1489)
- when generating the fingerprint for the `filesystem` destination, only the bucket component is taken into account (feat(filesystem): use only netloc and scheme for fingerprint #1516)
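The difference between the old `fifo` and the new default `round_robin` extraction order can be illustrated with a small stdlib sketch. The `fifo` and `round_robin` helpers below are hypothetical stand-ins, not dlt's actual scheduler:

```python
# Illustrative sketch (plain Python, not dlt's extract machinery):
# how items from two resources are ordered under each mode.
from itertools import chain, zip_longest

resource_a = ["a1", "a2", "a3"]
resource_b = ["b1", "b2"]

def fifo(*resources):
    # exhaust each resource fully before moving on to the next one
    return list(chain(*resources))

def round_robin(*resources):
    # interleave: take one item from each resource in turn,
    # skipping resources that are already exhausted
    missing = object()
    interleaved = chain.from_iterable(zip_longest(*resources, fillvalue=missing))
    return [item for item in interleaved if item is not missing]

print(fifo(resource_a, resource_b))         # ['a1', 'a2', 'a3', 'b1', 'b2']
print(round_robin(resource_a, resource_b))  # ['a1', 'b1', 'a2', 'b2', 'a3']
```

Round-robin keeps all resources progressing together, which evens out memory use and backpressure when one resource is much larger than the others; the linked performance docs explain how to switch back to `fifo`.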