Add initial unit tests for `ModelRunner` class (#10196)
* Add unit test for `ModelRunner.print_result_line`

* Add (and skip) unit test for `ModelRunner.execute`

  An attempt at testing `ModelRunner.execute`. We should probably also be asserting that the model has been executed. However, before we can get there, we're running into runtime errors during `ModelRunner.execute`. The current struggle is ensuring the adapter exists in the global factory when `execute` goes looking for it. The error we're getting looks like the following:

  ```
  def test_execute(self, table_model: ModelNode, manifest: Manifest, model_runner: ModelRunner) -> None:
  >       model_runner.execute(model=table_model, manifest=manifest)

  tests/unit/task/test_run.py:121:
  ----
  core/dbt/task/run.py:259: in execute
      context = generate_runtime_model_context(model, self.config, manifest)
  core/dbt/context/providers.py:1636: in generate_runtime_model_context
      ctx = ModelContext(model, config, manifest, RuntimeProvider(), None)
  core/dbt/context/providers.py:834: in __init__
      self.adapter = get_adapter(self.config)
  venv/lib/python3.10/site-packages/dbt/adapters/factory.py:207: in get_adapter
      return FACTORY.lookup_adapter(config.credentials.type)
  ----
  self = <dbt.adapters.factory.AdapterContainer object at 0x106e73280>, adapter_name = 'postgres'

      def lookup_adapter(self, adapter_name: str) -> Adapter:
  >       return self.adapters[adapter_name]
  E       KeyError: 'postgres'

  venv/lib/python3.10/site-packages/dbt/adapters/factory.py:132: KeyError
  ```

* Add `postgres_adapter` fixture for use in `TestModelRunner`

  Previously we were running into an issue where, during `ModelRunner.execute`, the mock adapter we were using wouldn't be found in the global adapter factory. We've gotten past this error by supplying a "real" adapter, a `PostgresAdapter` instance. However, we're now running into a new error in which the materialization macro can't be found.
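  The `KeyError: 'postgres'` above is the classic global-registry failure mode: the code under test looks an adapter up by name in a module-level container that the test never populated, so a patched-in mock is invisible to it. A minimal sketch of the pattern (illustrative names only, not dbt's actual `AdapterContainer` API):

  ```python
  class FakeAdapter:
      """Stand-in for a real adapter; only what the sketch needs."""

      def __init__(self, name: str) -> None:
          self.name = name


  class AdapterRegistry:
      """Toy global registry mirroring the lookup in the traceback above."""

      def __init__(self) -> None:
          self.adapters: dict[str, FakeAdapter] = {}

      def register(self, adapter: FakeAdapter) -> None:
          self.adapters[adapter.name] = adapter

      def lookup_adapter(self, adapter_name: str) -> FakeAdapter:
          # A bare dict access, so an unregistered name surfaces as KeyError.
          return self.adapters[adapter_name]


  REGISTRY = AdapterRegistry()

  # A test that only mocks the adapter, without registering it, fails here:
  try:
      REGISTRY.lookup_adapter("postgres")
  except KeyError as exc:
      print(f"KeyError: {exc}")  # same shape as the failure above

  # Registering the adapter first (what the fixture change does) fixes it:
  REGISTRY.register(FakeAdapter("postgres"))
  assert REGISTRY.lookup_adapter("postgres").name == "postgres"
  ```

  This is why the commit moves from a bare mock to a fixture that actually registers a `PostgresAdapter` in the factory before `execute` runs.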
  This error looks like:

  ```
  model_runner = <dbt.task.run.ModelRunner object at 0x106746650>

      def test_execute(
          self, table_model: ModelNode, manifest: Manifest, model_runner: ModelRunner
      ) -> None:
  >       model_runner.execute(model=table_model, manifest=manifest)

  tests/unit/task/test_run.py:129:
  ----
  self = <dbt.task.run.ModelRunner object at 0x106746650>
  model = ModelNode(database='dbt', schema='dbt_schema', name='table_model', resource_type=<NodeType.Model: 'model'>, package_na...ected'>, constraints=[], version=None, latest_version=None, deprecation_date=None, defer_relation=None, primary_key=[])
  manifest = Manifest(nodes={'seed.pkg.seed': SeedNode(database='dbt', schema='dbt_schema', name='seed', resource_type=<NodeType.Se...s(show=True, node_color=None), patch_path=None, arguments=[], created_at=1718229810.21914, supported_languages=None)}})

      def execute(self, model, manifest):
          context = generate_runtime_model_context(model, self.config, manifest)

          materialization_macro = manifest.find_materialization_macro_by_name(
              self.config.project_name, model.get_materialization(), self.adapter.type()
          )

          if materialization_macro is None:
  >           raise MissingMaterializationError(
                  materialization=model.get_materialization(), adapter_type=self.adapter.type()
              )
  E           dbt.adapters.exceptions.compilation.MissingMaterializationError: Compilation Error
  E             No materialization 'table' was found for adapter postgres! (searched types 'default' and 'postgres')

  core/dbt/task/run.py:266: MissingMaterializationError
  ```

* Add spoofed macro fixture `materialization_table_default` for `test_execute` test

  Previously the `TestModelRunner:test_execute` test was running into a runtime error due to the `materialization_table_default` macro not existing in the project. This commit adds that macro to the project (though it should ideally get loaded via interactions between the manifest and adapter). Manually adding it resolved our previous issue, but created a new one.
  The macro appears to not be properly loaded into the manifest, and thus isn't discoverable later on when gathering the macros for the Jinja context. This leads to an error that looks like the following:

  ```
  model_runner = <dbt.task.run.ModelRunner object at 0x1080a4f70>

      def test_execute(
          self, table_model: ModelNode, manifest: Manifest, model_runner: ModelRunner
      ) -> None:
  >       model_runner.execute(model=table_model, manifest=manifest)

  tests/unit/task/test_run.py:129:
  ----
  core/dbt/task/run.py:287: in execute
      result = MacroGenerator(
  core/dbt/clients/jinja.py:82: in __call__
      return self.call_macro(*args, **kwargs)
  venv/lib/python3.10/site-packages/dbt_common/clients/jinja.py:294: in call_macro
      macro = self.get_macro()
  ----
  self = <dbt.clients.jinja.MacroGenerator object at 0x1080f3130>

      def get_macro(self):
          name = self.get_name()
          template = self.get_template()
          # make the module. previously we set both vars and local, but that's
          # redundant: They both end up in the same place
          # make_module is in jinja2.environment. It returns a TemplateModule
          module = template.make_module(vars=self.context, shared=False)
  >       macro = module.__dict__[get_dbt_macro_name(name)]
  E       KeyError: 'dbt_macro__materialization_table_default'

  venv/lib/python3.10/site-packages/dbt_common/clients/jinja.py:277: KeyError
  ```

  It's becoming apparent that we need a better way to either mock or legitimately load the default and adapter macros. At this point I think I've exhausted the time box I should be using to figure out whether testing the `ModelRunner` class is currently possible; the result is that more work has yet to be done.

* Begin adding the `LogModelResult` event catcher to the event manager class fixture
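  The last bullet refers to an event-catcher pattern: a callback registered on the event manager that records every event of a given type so a test can assert on what was fired. A minimal sketch of the idea (hypothetical names, not dbt's actual `EventManager` API):

  ```python
  from dataclasses import dataclass, field
  from typing import Any, Callable, List


  @dataclass
  class LogModelResult:
      """Stand-in for the real dbt event; just carries a message here."""
      msg: str


  @dataclass
  class EventCatcher:
      """Callback that records matching events for later assertions."""
      event_type: type
      caught_events: List[Any] = field(default_factory=list)

      def catch(self, event: Any) -> None:
          if isinstance(event, self.event_type):
              self.caught_events.append(event)


  @dataclass
  class EventManager:
      """Toy event manager: fans each fired event out to its callbacks."""
      callbacks: List[Callable[[Any], None]] = field(default_factory=list)

      def fire_event(self, event: Any) -> None:
          for callback in self.callbacks:
              callback(event)


  # Wiring the catcher into the manager, as a test fixture might:
  catcher = EventCatcher(event_type=LogModelResult)
  manager = EventManager(callbacks=[catcher.catch])
  manager.fire_event(LogModelResult(msg="model ran"))
  assert len(catcher.caught_events) == 1
  ```

  With this shape, a `print_result_line` test can assert on the captured `LogModelResult` events instead of patching loggers.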