
Integration requirement check refactor #25626

Merged
8 commits merged into home-assistant:dev from elupus:req_check_refactor on Aug 7, 2019

Conversation

@elupus (Contributor) commented Aug 1, 2019

Description:

  • Factor out the code that gets an integration with installed requirements, since this is done in multiple places.
  • Use specific exceptions to propagate errors instead of return values, since this leads to easier-to-read code (a rough sketch of the pattern follows below).
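
A minimal, self-contained sketch of that pattern is shown below. The helper and exception names mirror the ones used in the code excerpts later in this conversation (async_get_integration_with_requirements, IntegrationNotFound, RequirementsNotFound); the registry and installer are hypothetical stand-ins, not the actual Home Assistant internals.

    import asyncio


    class IntegrationNotFound(Exception):
        """No integration exists for the requested domain."""


    class RequirementsNotFound(Exception):
        """The integration's requirements could not be installed."""


    # Hypothetical registry of integrations and their requirements,
    # standing in for the real integration loader.
    KNOWN_INTEGRATIONS = {"hue": ["aiohue==1.9.1"]}


    async def install(hass, requirement):
        """Stand-in for installing a requirement; always succeeds here."""
        return True


    async def async_get_integration_with_requirements(hass, domain):
        """Return the integration for domain once its requirements are installed.

        Failures propagate as specific exceptions instead of return values,
        so callers can handle every failure mode in a single except clause.
        """
        if domain not in KNOWN_INTEGRATIONS:
            raise IntegrationNotFound(domain)
        for requirement in KNOWN_INTEGRATIONS[domain]:
            if not await install(hass, requirement):
                raise RequirementsNotFound(domain, [requirement])
        return domain


    async def main():
        try:
            integration = await async_get_integration_with_requirements(None, "hue")
            print("loaded", integration)
        except (IntegrationNotFound, RequirementsNotFound) as ex:
            print("failed to load:", ex)


    asyncio.run(main())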

Checklist:

  • The code change is tested and works locally.
  • Local tests pass with tox. Your PR cannot be merged unless tests pass
  • There is no commented out code in this PR.
  • I have followed the development checklist

If user-exposed functionality or configuration variables are added/changed:

If the code communicates with devices, web services, or third-party tools:

  • The manifest file has all fields filled out correctly. Update and include derived files by running python3 -m script.hassfest.
  • New or updated dependencies have been added to requirements_all.txt by running python3 -m script.gen_requirements_all.
  • Untested files have been added to .coveragerc.

If the code does not interact with devices:

  • Tests have been added to verify that the new code works.

@probot-home-assistant probot-home-assistant bot added the core label Aug 1, 2019

@project-bot project-bot bot added this to Needs review in Dev Aug 1, 2019

@elupus elupus force-pushed the elupus:req_check_refactor branch 2 times, most recently from b49c3c5 to c872c9a Aug 1, 2019

@elupus elupus force-pushed the elupus:req_check_refactor branch from a137426 to aeb6042 Aug 1, 2019

elupus added 2 commits Aug 1, 2019
elupus added 3 commits Aug 1, 2019
@elupus (Contributor, Author) commented Aug 1, 2019

Now that it's raising explicit exceptions, things could be simplified further by combining the multiple exception catches into one:

            try:
                p_validated = component_platform_schema(p_config)

                # Not all platform components follow same pattern for platforms
                # So if p_name is None we are not going to validate platform
                # (the automation component is one of them)
                if p_name is None:
                    platforms.append(p_validated)
                    continue

                p_integration = await async_get_integration_with_requirements(
                    hass, p_name
                )
                platform = p_integration.get_platform(domain)

                # Validate platform specific schema
                platform_schema = getattr(platform, "PLATFORM_SCHEMA", None)
                if platform_schema is not None:
                    p_validated = platform_schema(p_validated)

            except (
                loader.IntegrationNotFound,
                RequirementsNotFound,
                ImportError,
            ) as ex:
                result.add_error("Platform error {}.{} - {}".format(domain, p_name, ex))
                continue
            except vol.Invalid as ex:
                _comp_error(ex, "{}.{}".format(domain, p_name), p_validated)
                continue

            platforms.append(p_validated)

vs before:

            # Validate component specific platform schema
            try:
                p_validated = component_platform_schema(p_config)
            except vol.Invalid as ex:
                _comp_error(ex, domain, config)
                continue

            # Not all platform components follow same pattern for platforms
            # So if p_name is None we are not going to validate platform
            # (the automation component is one of them)
            if p_name is None:
                platforms.append(p_validated)
                continue

            try:
                p_integration = await async_get_integration_with_requirements(
                    hass, p_name
                )
                platform = p_integration.get_platform(domain)
            except (
                loader.IntegrationNotFound,
                RequirementsNotFound,
                ImportError,
            ) as ex:
                result.add_error("Platform error {}.{} - {}".format(domain, p_name, ex))
                continue

            # Validate platform specific schema
            platform_schema = getattr(platform, "PLATFORM_SCHEMA", None)
            if platform_schema is not None:
                try:
                    p_validated = platform_schema(p_validated)
                except vol.Invalid as ex:
                    _comp_error(ex, "{}.{}".format(domain, p_name), p_validated)
                    continue

            platforms.append(p_validated)
@elupus (Contributor, Author) commented Aug 1, 2019

I updated to an example instead of a diff above. The diff was unreadable.

@balloob (Member) commented Aug 1, 2019

I prefer not to combine try…except if we don't have to.

@MartinHjelmare (Member) commented Aug 1, 2019

I like as narrow a scope as possible on try...except, i.e. the before example.

Take other opinions too.

@elupus (Contributor, Author) commented Aug 1, 2019

I don't really agree about narrow scope for exceptions when the exceptions are specific and won't catch anything other than the intended errors.

Narrow scope causes the OK code path to be less readable: you end up having to jump through all the try/except and if/else blocks that are only there to handle the exceptional case.

It's a completely different story if you catch generic exceptions, though, such as TypeError (or ImportError, which should have been wrapped further in). In that case the scope must be very narrow.
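
A small, generic illustration of that point (not code from this PR, and all names here are hypothetical): with a specific, purpose-built exception, a wide try block only intercepts the failures it names, so the happy path reads straight through, while an unrelated bug still surfaces as its own error.

    class RequirementsNotFound(Exception):
        """A specific failure type, safe to catch over a wide scope."""


    def load(name):
        """Stand-in loader that fails for one known name."""
        if name == "missing":
            raise RequirementsNotFound(name)
        return {"name": name}


    def configure(names):
        loaded = []
        for name in names:
            try:
                integration = load(name)
                # More happy-path steps can live here without extra nesting;
                # only RequirementsNotFound is intercepted below.
                loaded.append(integration["name"].upper())
            except RequirementsNotFound as ex:
                # An actual bug, e.g. a KeyError from integration["typo"],
                # would still escape instead of being silently swallowed.
                print("skipping", ex)
        return loaded


    print(configure(["hue", "missing"]))  # prints "skipping missing", then ['HUE']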

@elupus elupus marked this pull request as ready for review Aug 1, 2019

@elupus elupus requested a review from home-assistant/core as a code owner Aug 1, 2019

@elupus (Contributor, Author) commented Aug 1, 2019

Why is the code coverage check failing? I'm increasing test coverage.

@pvizeli (Member) approved these changes Aug 5, 2019, leaving a comment:

Looks good to me. @balloob?

Dev automation moved this from Needs review to Reviewer approved Aug 5, 2019

@balloob balloob merged commit d1b9ebc into home-assistant:dev Aug 7, 2019

10 of 11 checks passed:

  • codecov/patch: 91.66% of diff hit (target 94%)
  • CI Build #20190801.108 succeeded
  • CI (FullCheck Mypy): FullCheck Mypy succeeded
  • CI (FullCheck Pylint): FullCheck Pylint succeeded
  • CI (Overview CheckFormat): Overview CheckFormat succeeded
  • CI (Overview Lint): Overview Lint succeeded
  • CI (Overview Validate): Overview Validate succeeded
  • CI (Tests PyTest Python36): Tests PyTest Python36 succeeded
  • CI (Tests PyTest Python37): Tests PyTest Python37 succeeded
  • cla-bot: Everyone involved has signed the CLA
  • codecov/project: 94.01% (target 90%)

Dev automation moved this from Reviewer approved to Done Aug 7, 2019

@lock lock bot locked and limited conversation to collaborators Aug 8, 2019

@elupus elupus deleted the elupus:req_check_refactor branch Aug 10, 2019
