
schema contract #594

Merged
merged 85 commits into devel from d#/data_contracts on Nov 21, 2023

Commits
78d6d71 basic schema freezing (sh-rp, Aug 27, 2023)
09d8c63 small changes (sh-rp, Aug 27, 2023)
aec10b1 temp (sh-rp, Aug 28, 2023)
9441dcb add new schema update mode (sh-rp, Aug 28, 2023)
edad4ad fix linting errors and one bug (sh-rp, Aug 31, 2023)
bef7ea4 move freeze code to schema (sh-rp, Aug 31, 2023)
fc6f083 some work on schema evolution modes (sh-rp, Sep 4, 2023)
b36a74f add tests (sh-rp, Sep 4, 2023)
5659827 small tests change (sh-rp, Sep 4, 2023)
5343f4b Merge branch 'devel' into d#/data_contracts (sh-rp, Sep 4, 2023)
894875f small fix (sh-rp, Sep 5, 2023)
8bccfe5 fix some tests (sh-rp, Sep 5, 2023)
cfd3f64 add global override for schema evolution (sh-rp, Sep 5, 2023)
e05855c finish implemention of global override (sh-rp, Sep 5, 2023)
3f07127 better tests (sh-rp, Sep 6, 2023)
6308369 carry over schema settings on update (sh-rp, Sep 6, 2023)
ab0b8d7 add tests for single values (sh-rp, Sep 6, 2023)
6e99ed9 small changes to tests and code (sh-rp, Sep 6, 2023)
1a10ec4 fix small error (sh-rp, Sep 6, 2023)
175bee5 add tests for data contract interaction (sh-rp, Sep 6, 2023)
18b9341 fix tests (sh-rp, Sep 7, 2023)
6dcaa7d Merge branch 'devel' into d#/data_contracts (sh-rp, Sep 12, 2023)
881d79a some PR work (sh-rp, Sep 12, 2023)
e707580 update schema management (sh-rp, Sep 13, 2023)
b3dc41d fix schema related tests (sh-rp, Sep 13, 2023)
ac8f766 add nice schema tests (sh-rp, Sep 13, 2023)
869f278 add docs page (sh-rp, Sep 13, 2023)
0f22ba0 small test fix (sh-rp, Sep 13, 2023)
463c447 smaller PR fixes (sh-rp, Sep 14, 2023)
db3f447 more work (sh-rp, Sep 17, 2023)
c17577a tests update (rudolfix, Sep 18, 2023)
fb1d224 almost there (sh-rp, Sep 23, 2023)
6a15fa2 tmp (sh-rp, Sep 26, 2023)
d66c2e6 fix freeze tests (sh-rp, Sep 26, 2023)
5e238c5 Merge branch 'devel' into d#/data_contracts (sh-rp, Sep 26, 2023)
bf9da7e cleanup (sh-rp, Sep 26, 2023)
9ba2496 Merge branch 'devel' into d#/data_contracts (sh-rp, Sep 26, 2023)
038d03a create data contracts page (sh-rp, Sep 26, 2023)
84384c3 small cleanup (sh-rp, Sep 26, 2023)
44dfb69 add pydantic dep to destination tests (sh-rp, Sep 26, 2023)
ece2bfc Merge branch 'devel' into d#/data_contracts (sh-rp, Sep 28, 2023)
7b8f2d2 rename contract settings (sh-rp, Sep 29, 2023)
00c540b rename schema contract dict keys (sh-rp, Sep 29, 2023)
3ed4630 some work (sh-rp, Sep 29, 2023)
333217c more work... (sh-rp, Sep 29, 2023)
d69e54d more work (sh-rp, Oct 1, 2023)
041da6d move checking of new tables into extract function (sh-rp, Oct 2, 2023)
b72a1a9 fix most tests (sh-rp, Oct 2, 2023)
f2abadf Merge branch 'devel' into d#/data_contracts (sh-rp, Oct 2, 2023)
2ae36e3 fix linter after merge (sh-rp, Oct 2, 2023)
d85e04f small cleanup (sh-rp, Oct 2, 2023)
1d7be25 Merge branch 'devel' into d#/data_contracts (rudolfix, Oct 10, 2023)
96785c4 Merge branch 'devel' into d#/data_contracts (sh-rp, Oct 17, 2023)
302d909 post merge code updates (sh-rp, Oct 17, 2023)
4a1fab0 small fixes (sh-rp, Oct 17, 2023)
903f000 some cleanup (sh-rp, Oct 18, 2023)
912dd8b update docs (sh-rp, Oct 18, 2023)
ef1b10f Merge branch 'devel' into d#/data_contracts (rudolfix, Oct 29, 2023)
256920e Merge branch 'devel' into d#/data_contracts (rudolfix, Nov 1, 2023)
168f0da makes bumping version optional in Schema, preserves hashes on replace… (rudolfix, Nov 1, 2023)
760cc43 extracts on single pipeline schema (rudolfix, Nov 1, 2023)
b645927 allows to control relational normalizer descend with send (rudolfix, Nov 12, 2023)
8989a75 refactors data contract apply to generate filters instead of actual f… (rudolfix, Nov 12, 2023)
57842cc detects if bytes string possibly contains pue characters (rudolfix, Nov 12, 2023)
441299f applies schema contracts in item normalizer, uses binary stream, dete… (rudolfix, Nov 12, 2023)
fc0eb47 methods to remove and rename arrow columns, need arrow 12+ (rudolfix, Nov 12, 2023)
9977227 implements contracts in extract, fixes issues in apply hints, arrow d… (rudolfix, Nov 12, 2023)
d74242a always uses pipeline schema when extracting (rudolfix, Nov 12, 2023)
e9344ee returns new items count from buffered write (rudolfix, Nov 12, 2023)
e980396 bumps pyarrow to 12, temporary removes snowflake extra (rudolfix, Nov 12, 2023)
5e2d131 Merge branch 'devel' into d#/data_contracts (rudolfix, Nov 12, 2023)
3dc4fa5 fixes arrow imports and normalizer config (rudolfix, Nov 13, 2023)
c76788a fixes normalizer config tests and pipeline state serialization (rudolfix, Nov 13, 2023)
423a163 normalizes arrow tables before saving (rudolfix, Nov 13, 2023)
c1c32d6 adds validation and model synth for contracts to pydantic helper (rudolfix, Nov 16, 2023)
c24b643 splits extractor into files, improves pydantic validator (rudolfix, Nov 16, 2023)
2b97a4f runs tests on ci with minimal dependencies (rudolfix, Nov 16, 2023)
c35ec2d fixes deps in ci workflows (rudolfix, Nov 16, 2023)
340ed3d re-adds snowflake connector (rudolfix, Nov 16, 2023)
35c10b7 Merge branch 'devel' into d#/data_contracts (rudolfix, Nov 18, 2023)
2d260e8 updates pydantic helper (rudolfix, Nov 19, 2023)
aa990f5 improves contract violation exception (rudolfix, Nov 19, 2023)
a6a782b splits source and resource in extract, adds more tests (rudolfix, Nov 19, 2023)
f4d2cac temp disable pydantic 1 tests (rudolfix, Nov 19, 2023)
f556584 fixes generic type parametrization on 3.8 (rudolfix, Nov 20, 2023)
2 changes: 1 addition & 1 deletion .github/workflows/test_destinations.yml
@@ -81,7 +81,7 @@ jobs:

- name: Install dependencies
# if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
run: poetry install --no-interaction -E redshift -E gs -E s3 -E az -E parquet -E duckdb -E cli
run: poetry install --no-interaction -E redshift -E gs -E s3 -E az -E parquet -E duckdb -E cli -E pydantic

# - name: Install self
# run: poetry install --no-interaction
2 changes: 1 addition & 1 deletion .github/workflows/test_local_destinations.yml
@@ -84,7 +84,7 @@ jobs:
key: venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('**/poetry.lock') }}-local-destinations

- name: Install dependencies
run: poetry install --no-interaction -E postgres -E duckdb -E parquet -E filesystem -E cli -E weaviate
run: poetry install --no-interaction -E postgres -E duckdb -E parquet -E filesystem -E cli -E weaviate -E pydantic

- run: poetry run pytest tests/load && poetry run pytest tests/cli
name: Run tests Linux
5 changes: 3 additions & 2 deletions dlt/common/pipeline.py
@@ -16,7 +16,7 @@
from dlt.common.destination import DestinationReference, TDestinationReferenceArg
from dlt.common.exceptions import DestinationHasFailedJobs, PipelineStateNotAvailable, ResourceNameNotAvailable, SourceSectionNotAvailable
from dlt.common.schema import Schema
from dlt.common.schema.typing import TColumnNames, TColumnSchema, TWriteDisposition
from dlt.common.schema.typing import TColumnNames, TColumnSchema, TWriteDisposition, TSchemaContractSettings
from dlt.common.source import get_current_pipe_name
from dlt.common.storages.load_storage import LoadPackageInfo
from dlt.common.typing import DictStrAny, REPattern
@@ -209,7 +209,8 @@ def run(
columns: Sequence[TColumnSchema] = None,
primary_key: TColumnNames = None,
schema: Schema = None,
loader_file_format: TLoaderFileFormat = None
loader_file_format: TLoaderFileFormat = None,
schema_contract_settings: TSchemaContractSettings = None,
) -> LoadInfo:
...

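A minimal sketch of the new parameter in use, as of this PR. The pipeline name and data are hypothetical, and the shorthand string form relies on TSchemaContractSettings accepting a single mode (see the typing.py changes below):

import dlt

pipeline = dlt.pipeline(pipeline_name="contracts_demo", destination="duckdb")

# shorthand: one mode for new tables, new columns and new variant columns alike
pipeline.run([{"id": 1, "name": "a"}], table_name="items", schema_contract_settings="freeze")

# dict form: set each entity separately
pipeline.run(
    [{"id": 2, "name": "b"}],
    table_name="items",
    schema_contract_settings={"table": "evolve", "column": "discard_row", "data_type": "freeze"},
)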
2 changes: 1 addition & 1 deletion dlt/common/schema/__init__.py
@@ -1,4 +1,4 @@
from dlt.common.schema.typing import TSchemaUpdate, TSchemaTables, TTableSchema, TStoredSchema, TTableSchemaColumns, TColumnHint, TColumnSchema, TColumnSchemaBase # noqa: F401
from dlt.common.schema.typing import COLUMN_HINTS # noqa: F401
from dlt.common.schema.schema import Schema # noqa: F401
from dlt.common.schema.schema import Schema, DEFAULT_SCHEMA_CONTRACT_MODE # noqa: F401
from dlt.common.schema.utils import verify_schema_hash # noqa: F401
7 changes: 7 additions & 0 deletions dlt/common/schema/exceptions.py
@@ -69,3 +69,10 @@ def __init__(self, schema_name: str, init_engine: int, from_engine: int, to_engi
self.from_engine = from_engine
self.to_engine = to_engine
super().__init__(f"No engine upgrade path in schema {schema_name} from {init_engine} to {to_engine}, stopped at {from_engine}")


class SchemaFrozenException(SchemaException):
def __init__(self, schema_name: str, table_name: str, msg: str) -> None:
super().__init__(msg)
self.schema_name = schema_name
self.table_name = table_name
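
A sketch of handling the new exception, continuing the hypothetical pipeline above; the attribute names follow the constructor in this diff:

from dlt.common.schema.exceptions import SchemaFrozenException

try:
    pipeline.run([{"id": 3, "unexpected": True}], table_name="items", schema_contract_settings="freeze")
except SchemaFrozenException as exc:
    # schema_name and table_name are set by the constructor above
    print(f"contract violation in {exc.schema_name}.{exc.table_name}: {exc}")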
107 changes: 104 additions & 3 deletions dlt/common/schema/schema.py
@@ -1,6 +1,6 @@
import yaml
from copy import copy, deepcopy
from typing import ClassVar, Dict, List, Mapping, Optional, Sequence, Tuple, Any, cast
from typing import ClassVar, Dict, List, Mapping, Optional, Sequence, Tuple, Any, cast, Literal
from dlt.common import json

from dlt.common.utils import extend_list_deduplicated
@@ -11,12 +11,19 @@
from dlt.common.schema import utils
from dlt.common.data_types import py_type_to_sc_type, coerce_value, TDataType
from dlt.common.schema.typing import (COLUMN_HINTS, SCHEMA_ENGINE_VERSION, LOADS_TABLE_NAME, VERSION_TABLE_NAME, STATE_TABLE_NAME, TPartialTableSchema, TSchemaSettings, TSimpleRegex, TStoredSchema,
TSchemaTables, TTableSchema, TTableSchemaColumns, TColumnSchema, TColumnProp, TColumnHint, TTypeDetections)
TSchemaTables, TTableSchema, TTableSchemaColumns, TColumnSchema, TColumnProp, TColumnHint, TTypeDetections, TSchemaContractModes, TSchemaContractSettings)
from dlt.common.schema.exceptions import (CannotCoerceColumnException, CannotCoerceNullException, InvalidSchemaName,
ParentTableNotFoundException, SchemaCorruptedException)
from dlt.common.validation import validate_dict
from dlt.common.schema.exceptions import SchemaFrozenException


DEFAULT_SCHEMA_CONTRACT_MODE: TSchemaContractModes = {
"table": "evolve",
"column": "evolve",
"data_type": "evolve"
}

class Schema:
ENGINE_VERSION: ClassVar[int] = SCHEMA_ENGINE_VERSION

@@ -61,6 +68,7 @@ def __init__(self, name: str, normalizers: TNormalizersConfig = None) -> None:

@classmethod
def from_dict(cls, d: DictStrAny) -> "Schema":

# upgrade engine if needed
stored_schema = utils.migrate_schema(d, d["engine_version"], cls.ENGINE_VERSION)
# verify schema
@@ -187,7 +195,93 @@ def coerce_row(self, table_name: str, parent_table: str, row: StrAny) -> Tuple[D

return new_row, updated_table_partial

def update_schema(self, partial_table: TPartialTableSchema) -> TPartialTableSchema:
def resolve_contract_settings_for_table(self, parent_table: str, table_name: str) -> TSchemaContractModes:
"""Resolve the exact applicable schema contract settings for the table during the normalization stage."""

def resolve_single(settings: TSchemaContractSettings) -> TSchemaContractModes:
settings = settings or {}
if isinstance(settings, str):
return TSchemaContractModes(table=settings, column=settings, data_type=settings)
return settings

# find table settings
table = parent_table or table_name
if table in self.tables:
table = utils.get_top_level_table(self.tables, parent_table or table_name)["name"]

# modes
table_contract_modes = resolve_single(self.tables.get(table, {}).get("schema_contract_settings", {}))
schema_contract_modes = resolve_single(self._settings.get("schema_contract_settings", {}))

# resolve to correct settings dict
settings = cast(TSchemaContractModes, {**DEFAULT_SCHEMA_CONTRACT_MODE, **schema_contract_modes, **table_contract_modes})

return settings

def is_table_populated(self, table_name: str) -> bool:
return table_name in self.tables and (self.tables[table_name].get("populated") is True)

def apply_schema_contract(self, contract_modes: TSchemaContractModes, table_name: str, row: DictStrAny, partial_table: TPartialTableSchema, table_populated: bool) -> Tuple[DictStrAny, TPartialTableSchema]:
"""
Checks if the contract mode allows the requested changes to the data and the schema. Depending on the mode, it will allow all changes to pass, filter out
the row, filter out columns from both the data and the schema update, or reject the update completely. An example setting could be:

{
"table": "freeze",
"column": "evolve",
"data_type": "discard_row"
}

The table setting affects new tables, the column setting affects new columns, and the data_type setting affects new variant columns. Each setting can be set to one of:
* evolve: allow all changes
* freeze: allow no change and fail the load
* discard_row: allow no schema change and filter out the row
* discard_value: allow no schema change and filter out the value but load the rest of the row
"""

assert partial_table

# default settings allow all evolutions, skip all else
if contract_modes == DEFAULT_SCHEMA_CONTRACT_MODE:
return row, partial_table

# check case where we have a new table
if not table_populated:
if contract_modes["table"] in ["discard_row", "discard_value"]:
return None, None
if contract_modes["table"] == "freeze":
raise SchemaFrozenException(self.name, table_name, f"Trying to add table {table_name} but new tables are frozen.")

# check columns
for item in list(row.keys()):
# if this is a new column for an existing table...
if table_populated and (item not in self.tables[table_name]["columns"] or not utils.is_complete_column(self.tables[table_name]["columns"][item])):
is_variant = (item in partial_table["columns"]) and partial_table["columns"][item].get("variant")
if contract_modes["column"] == "discard_value" or (is_variant and contract_modes["data_type"] == "discard_value"):
row.pop(item)
partial_table["columns"].pop(item)
elif contract_modes["column"] == "discard_row" or (is_variant and contract_modes["data_type"] == "discard_row"):
return None, None
elif is_variant and contract_modes["data_type"] == "freeze":
raise SchemaFrozenException(self.name, table_name, f"Trying to create new variant column {item} to table {table_name} data_types are frozen.")
elif contract_modes["column"] == "freeze":
raise SchemaFrozenException(self.name, table_name, f"Trying to add column {item} to table {table_name} but columns are frozen.")

return row, partial_table

def update_schema(self, schema: "Schema") -> None:
"""
Update this schema from another schema.
Note: properties like max nesting or column propagation are not merged.
"""

for table in schema.data_tables(include_incomplete=True):
self.update_table(
self.normalize_table_identifiers(table)
)
self.set_schema_contract_settings(schema._settings.get("schema_contract_settings", {}))

def update_table(self, partial_table: TPartialTableSchema) -> TPartialTableSchema:
table_name = partial_table["name"]
parent_table_name = partial_table.get("parent")
# check if parent table present
@@ -376,6 +470,13 @@ def update_normalizers(self) -> None:
normalizers["json"] = normalizers["json"] or self._normalizers_config["json"]
self._configure_normalizers(normalizers)

def set_schema_contract_settings(self, settings: TSchemaContractSettings, update_table_settings: bool = False) -> None:
self._settings["schema_contract_settings"] = settings
if update_table_settings:
for table in self.tables.values():
if not table.get("parent"):
table["schema_contract_settings"] = settings

def _infer_column(self, k: str, v: Any, data_type: TDataType = None, is_variant: bool = False) -> TColumnSchema:
column_schema = TColumnSchema(
name=k,
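
The precedence implemented by resolve_contract_settings_for_table can be checked in isolation. A standalone sketch that mirrors the merge above rather than calling the dlt API:

DEFAULT_SCHEMA_CONTRACT_MODE = {"table": "evolve", "column": "evolve", "data_type": "evolve"}

def resolve(schema_settings, table_settings):
    def expand(settings):
        # a plain string applies one mode to tables, columns and data types alike
        if isinstance(settings, str):
            return {"table": settings, "column": settings, "data_type": settings}
        return settings or {}
    # table-level settings override schema-level settings, which override the defaults
    return {**DEFAULT_SCHEMA_CONTRACT_MODE, **expand(schema_settings), **expand(table_settings)}

print(resolve({"column": "discard_row"}, None))
# {'table': 'evolve', 'column': 'discard_row', 'data_type': 'evolve'}
print(resolve({"column": "discard_row"}, "freeze"))
# {'table': 'freeze', 'column': 'freeze', 'data_type': 'freeze'}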
18 changes: 15 additions & 3 deletions dlt/common/schema/typing.py
@@ -11,7 +11,7 @@


# current version of schema engine
SCHEMA_ENGINE_VERSION = 6
SCHEMA_ENGINE_VERSION = 7

# dlt tables
VERSION_TABLE_NAME = "_dlt_version"
@@ -70,6 +70,15 @@ class TColumnSchema(TColumnSchemaBase, total=False):
TColumnName = NewType("TColumnName", str)
SIMPLE_REGEX_PREFIX = "re:"

TSchemaEvolutionMode = Literal["evolve", "discard_value", "freeze", "discard_row"]

class TSchemaContractModes(TypedDict, total=False):
"""TypedDict defining the schema update settings"""
table: Optional[TSchemaEvolutionMode]
Review comment (rudolfix): sorry for that but I have idea for another rename. it IMO tells way more what is happening
table -> tables
column -> columns
data_type -> data (?)
column: Optional[TSchemaEvolutionMode]
data_type: Optional[TSchemaEvolutionMode]

TSchemaContractSettings = Union[TSchemaEvolutionMode, TSchemaContractModes]

class TRowFilters(TypedDict, total=True):
excludes: Optional[List[TSimpleRegex]]
@@ -81,11 +90,12 @@ class TTableSchema(TypedDict, total=False):
name: Optional[str]
description: Optional[str]
write_disposition: Optional[TWriteDisposition]
table_sealed: Optional[bool]
schema_contract_settings: Optional[TSchemaContractSettings]


parent: Optional[str]
filters: Optional[TRowFilters]
columns: TTableSchemaColumns
resource: Optional[str]
populated: Optional[bool]


class TPartialTableSchema(TTableSchema):
@@ -95,8 +105,9 @@ class TPartialTableSchema(TTableSchema):
TSchemaTables = Dict[str, TTableSchema]
TSchemaUpdate = Dict[str, List[TPartialTableSchema]]


class TSchemaSettings(TypedDict, total=False):
schema_sealed: Optional[bool]
schema_contract_settings: Optional[TSchemaContractSettings]
detections: Optional[List[TTypeDetections]]
default_hints: Optional[Dict[TColumnHint, List[TSimpleRegex]]]
preferred_types: Optional[Dict[TSimpleRegex, TDataType]]
@@ -113,3 +124,4 @@ class TStoredSchema(TypedDict, total=False):
settings: Optional[TSchemaSettings]
tables: TSchemaTables
normalizers: TNormalizersConfig

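Both shapes of TSchemaContractSettings type-check against the union above. A short, self-contained sketch of the two forms a user can pass:

from typing import Literal, Optional, TypedDict, Union

TSchemaEvolutionMode = Literal["evolve", "discard_value", "freeze", "discard_row"]

class TSchemaContractModes(TypedDict, total=False):
    table: Optional[TSchemaEvolutionMode]
    column: Optional[TSchemaEvolutionMode]
    data_type: Optional[TSchemaEvolutionMode]

TSchemaContractSettings = Union[TSchemaEvolutionMode, TSchemaContractModes]

shorthand: TSchemaContractSettings = "discard_row"         # one mode for all entities
per_entity: TSchemaContractSettings = {"table": "freeze"}  # partial dict is valid since total=False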
21 changes: 18 additions & 3 deletions dlt/common/schema/utils.py
@@ -16,7 +16,7 @@
from dlt.common.schema import detections
from dlt.common.schema.typing import (COLUMN_HINTS, SCHEMA_ENGINE_VERSION, LOADS_TABLE_NAME, SIMPLE_REGEX_PREFIX, VERSION_TABLE_NAME, TColumnName, TPartialTableSchema, TSchemaTables, TSchemaUpdate,
TSimpleRegex, TStoredSchema, TTableSchema, TTableSchemaColumns, TColumnSchemaBase, TColumnSchema, TColumnProp,
TColumnHint, TTypeDetectionFunc, TTypeDetections, TWriteDisposition)
TColumnHint, TTypeDetectionFunc, TTypeDetections, TWriteDisposition, TSchemaContractSettings, TSchemaContractModes)
from dlt.common.schema.exceptions import (CannotCoerceColumnException, ParentTableNotFoundException, SchemaEngineNoUpgradePathException, SchemaException,
TablePropertiesConflictException, InvalidSchemaName)

@@ -340,6 +340,15 @@ def migrate_filters(group: str, filters: List[str]) -> None:
# replace loads table
schema_dict["tables"][LOADS_TABLE_NAME] = load_table()
from_engine = 6
if from_engine == 6 and to_engine > 6:
# migrate from sealed properties to schema evolution settings
schema_dict["settings"].pop("schema_sealed", None)
schema_dict["settings"]["schema_contract_settings"] = {}
for table in schema_dict["tables"].values():
table.pop("table_sealed", None)
if not table.get("parent"):
table["schema_contract_settings"] = {}
from_engine = 7

schema_dict["engine_version"] = from_engine
if from_engine != to_engine:
@@ -426,7 +435,6 @@ def diff_tables(tab_a: TTableSchema, tab_b: TPartialTableSchema) -> TPartialTabl
continue
existing_v = tab_a.get(k)
if existing_v != v:
print(f"{k} ==? {v} ==? {existing_v}")
partial_table[k] = v # type: ignore

# this should not really happen
@@ -637,7 +645,9 @@ def new_table(
write_disposition: TWriteDisposition = None,
columns: Sequence[TColumnSchema] = None,
validate_schema: bool = False,
resource: str = None
resource: str = None,
schema_contract_settings: TSchemaContractSettings = None,
populated: bool = None
) -> TTableSchema:

table: TTableSchema = {
@@ -648,10 +658,15 @@
table["parent"] = parent_table_name
assert write_disposition is None
assert resource is None
assert schema_contract_settings is None
else:
# set write disposition only for root tables
table["write_disposition"] = write_disposition or DEFAULT_WRITE_DISPOSITION
table["resource"] = resource or table_name
if schema_contract_settings:
table["schema_contract_settings"] = schema_contract_settings
if populated is not None:
table["populated"] = populated
if validate_schema:
validate_dict_ignoring_xkeys(
spec=TColumnSchema,
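
The engine 6 -> 7 migration step above, isolated as a sketch over a minimal hand-written schema dict (the real migrate_schema chains all engine upgrades):

def migrate_6_to_7(schema_dict: dict) -> dict:
    # sealed flags are replaced by (empty) schema contract settings
    schema_dict["settings"].pop("schema_sealed", None)
    schema_dict["settings"]["schema_contract_settings"] = {}
    for table in schema_dict["tables"].values():
        table.pop("table_sealed", None)
        if not table.get("parent"):  # only root tables carry contract settings
            table["schema_contract_settings"] = {}
    schema_dict["engine_version"] = 7
    return schema_dict

old = {"engine_version": 6, "settings": {"schema_sealed": True}, "tables": {"items": {"table_sealed": True}}}
print(migrate_6_to_7(old))
# {'engine_version': 7, 'settings': {'schema_contract_settings': {}}, 'tables': {'items': {'schema_contract_settings': {}}}}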
17 changes: 7 additions & 10 deletions dlt/common/typing.py
@@ -66,41 +66,38 @@ def asstr(self, verbosity: int = 0) -> str:
...


def is_union_type(t: Type[Any]) -> bool:
return get_origin(t) is Union

def is_optional_type(t: Type[Any]) -> bool:
return get_origin(t) is Union and type(None) in get_args(t)


def is_final_type(t: Type[Any]) -> bool:
return get_origin(t) is Final


def extract_optional_type(t: Type[Any]) -> Any:
return get_args(t)[0]

def extract_union_types(t: Type[Any], no_none: bool = False) -> List[Any]:
if no_none:
return [arg for arg in get_args(t) if arg is not type(None)] # noqa: E721
return list(get_args(t))

def is_literal_type(hint: Type[Any]) -> bool:
return get_origin(hint) is Literal


def is_union(hint: Type[Any]) -> bool:
return get_origin(hint) is Union


def is_newtype_type(t: Type[Any]) -> bool:
return hasattr(t, "__supertype__")


def is_typeddict(t: Type[Any]) -> bool:
return isinstance(t, _TypedDict)


def is_list_generic_type(t: Type[Any]) -> bool:
try:
return issubclass(get_origin(t), C_Sequence)
except TypeError:
return False


def is_dict_generic_type(t: Type[Any]) -> bool:
try:
return issubclass(get_origin(t), C_Mapping)
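
The reshuffled typing helpers are easy to sanity-check in isolation. A sketch reusing the implementations above:

from typing import List, Optional, Union, get_args, get_origin

def is_union_type(t) -> bool:
    return get_origin(t) is Union

def extract_union_types(t, no_none: bool = False) -> List:
    if no_none:
        return [arg for arg in get_args(t) if arg is not type(None)]  # noqa: E721
    return list(get_args(t))

print(is_union_type(Union[int, str]))                    # True
print(is_union_type(int))                                # False
print(extract_union_types(Optional[int]))                # [<class 'int'>, <class 'NoneType'>]
print(extract_union_types(Optional[int], no_none=True))  # [<class 'int'>]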