ci: add pytest support on windows; change: add meltano elt guidance for windows users #6155

Merged
52 commits merged into meltano:main on Jun 28, 2022

Conversation

@visch visch (Collaborator) commented Jun 10, 2022

Closes #5940

Old MR from GitLab: https://gitlab.com/meltano/meltano/-/merge_requests/2639/diffs

(Note: this has been rebased since that MR.)

Pipeline is running here: https://gitlab.com/vischous/meltano/-/merge_requests/3/pipelines

@netlify netlify bot commented Jun 10, 2022

👷 Deploy request for meltano pending review. Visit the deploys page to approve it.

🔨 Latest commit: b029791

@visch visch changed the title from "2743 windows elt" to "2743 windows pytest" on Jun 10, 2022
@visch visch changed the title from "2743 windows pytest" to "2743 windows ELT, and pytest working on Windows" on Jun 10, 2022
.github/workflows/test.yml: review thread (outdated, resolved)
tests/fixtures/core.py: review thread (outdated, resolved)
@visch visch requested a review from WillDaSilva June 24, 2022 19:49
@visch visch (Collaborator, Author) commented Jun 24, 2022

How do you feel about punting that to #3444? I added a comment there. I think that issue is going to end up being a bunch of sub-issues addressing individual failures, and one of those should be switching to xfail for everything.

Works for me, though an alternative that lets us use xfail and avoid having the process hang would be adding something like this at the start of each of these failing tests:

def test_something(...):
    if platform.system() == "Windows":
        pytest.xfail("reason...")
    # The original body of the test function continues here

Went ahead and added xfail instead of skip!
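
For reference, a marker-based form of the same idea is sketched below. This is only an illustrative sketch, not the actual change merged in this PR: the test name and reason string are placeholders, and #3444 is simply the tracking issue mentioned above. One difference worth noting: an imperative pytest.xfail() call at the top of a test (as in the snippet above) stops the test immediately, which is what avoids the hang, whereas the marker form still runs the test body, so it only suits tests that fail without hanging.

import platform

import pytest


# Expected failure on Windows only; on other platforms the test runs normally.
# Names and the reason string are placeholders, not the real tests from this PR.
@pytest.mark.xfail(
    platform.system() == "Windows",
    reason="Known Windows failure, tracked in #3444",
    strict=False,  # an unexpected pass is reported as XPASS, not as an error
)
def test_something_windows_sensitive():
    # The real test body would go here.
    assert True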

@visch visch changed the title from "2743 windows ELT(Directs users to use run instead), and pytest working on Windows" to "feat: 2743 windows ELT(Directs users to use run instead), and pytest working on Windows" on Jun 24, 2022
@WillDaSilva WillDaSilva (Member) left a comment


Thanks for getting this done @visch! I'm glad we'll be able to make these Windows tests available in CI in time for me to run the Python 3.10 changes against them.

tests/meltano/cli/test_elt.py: three review threads (outdated, resolved)
@visch visch (Collaborator, Author) commented Jun 26, 2022

Code coverage isn't quite accurate here (or maybe I misunderstand), as we are testing these things but it's dependent on our GitHub Actions setup. We could dive into this as part of this PR as well? (This is new as well!)

@edgarrmondragon edgarrmondragon (Collaborator)

Code coverage isn't quite accurate here (or maybe I misunderstand), as we are testing these things but it's dependent on our GitHub Actions setup. We could dive into this as part of this PR as well? (This is new as well!)

@visch could it be because we're only analyzing coverage for the results of testing in 3.9?

@visch visch (Collaborator, Author) commented Jun 27, 2022

Code coverage isn't quite accurate here (or maybe I misunderstand), as we are testing these things but it's dependent on our GitHub Actions setup. We could dive into this as part of this PR as well? (This is new as well!)

@visch could it be because we're only analyzing coverage for the results of testing in 3.9?

I don't know, I wouldn't think so but I have no idea.

I guess my question here is do we need to fix code coverage here or not?

@edgarrmondragon edgarrmondragon (Collaborator)

I guess my question here is do we need to fix code coverage here or not?

@visch not at all. It's currently only a guideline and not a strict requirement.

@WillDaSilva WillDaSilva (Member)

@visch The new test failures are my bad: #6306

Sorry about that. I'll have it sorted out as soon as possible.

@aaronsteers aaronsteers changed the title from "feat: 2743 windows ELT(Directs users to use run instead), and pytest working on Windows" to "ci: add pytest support on windows; change: add meltano elt guidance for windows users" on Jun 28, 2022
@aaronsteers aaronsteers merged commit 64fda9f into meltano:main Jun 28, 2022
Development

Successfully merging this pull request may close these issues.

Meltano ELT fails when running on native windows
5 participants