
fix: add description for subcommands #1120

Merged: 4 commits merged into commitizen-tools:master on May 22, 2024

Conversation

marcosdotme (Contributor)

Description

Checklist

  • Add test cases for all the changes you introduce
  • Run ./scripts/format and ./scripts/test locally to ensure this change passes the linter check and tests
  • Test the changes manually on the local machine
  • Update the documentation for the changes (needs remaking a bunch of gifs/images @Lee-W)

Expected behavior

The command description is shown when running the command with the --help option.

Steps to Test This Pull Request

  1. cz commit --help

Screenshot

[Screenshot: cz commit --help output showing the command description]

@marcosdotme (Contributor Author) commented May 16, 2024

@Lee-W I deleted tests/commands/test_other_commands.py and moved the tests to the corresponding files without asking you. Let me know if you need me to revert or squash this.

codecov bot commented May 16, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 97.54%. Comparing base (120d514) to head (bbf19ad).
Report is 327 commits behind head on master.

Additional details and impacted files
```diff
@@            Coverage Diff             @@
##           master    #1120      +/-   ##
==========================================
+ Coverage   97.33%   97.54%   +0.20%
==========================================
  Files          42       55      +13
  Lines        2104     2486     +382
==========================================
+ Hits         2048     2425     +377
- Misses         56       61       +5
```
Flag       Coverage Δ
unittests  97.54% <100.00%> (+0.20%) ⬆️

Flags with carried forward coverage won't be shown.


commitizen/cli.py (review thread, resolved)

```python
# excerpt from the test under review
cli.main()

out, _ = capsys.readouterr()
assert "bump semantic version based on the git log" in str(out).replace("\n", " ")
```
Lee-W (Member)

Non-blocker, but we could also try something like:

```python
with open(multiple_versions_increase_string, encoding="utf-8") as f:
    file_regression.check(f.read(), extension=".txt")
```

marcosdotme (Contributor Author)

I'm not sure I understand... I wrote test_bump_command_shows_description_when_use_help_option to assert that the command description shows up when running --help against the command. Can you explain your idea in other words?

Lee-W (Member)

pytest-regressions is a tool that lets you verify whether content (e.g., a str or data frame) is exactly the same as the last time it was generated. It does so by creating a text file that records the last result. https://pytest-regressions.readthedocs.io/en/latest/overview.html
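
For readers unfamiliar with the fixture, here is a minimal sketch of how it behaves (the test name and checked string are illustrative, not code from this PR):

```python
# pytest-regressions provides the file_regression fixture. On the first run,
# check() writes its input to a .txt file (stored in a directory named after
# the test module); on later runs the test fails if the input no longer
# matches that recorded file exactly.
def test_help_text(file_regression):
    help_text = "bump semantic version based on the git log\n"
    file_regression.check(help_text, extension=".txt")
```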

marcosdotme (Contributor Author)

I will look into this later today. Thanks!

@marcosdotme (Contributor Author) commented May 19, 2024

@Lee-W pytest-regressions is a game changer! The assert-only approach isn't enough to check that the command descriptions match. For example:

Using assert "create new commit" in str(out).replace("\n", " ") isn't reliable, because if we add a dot to the command description ("create new commit.") it still matches.

But using file_regression we can check the entire output, and adding the dot to the end of the command description will cause the test to fail.
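
A two-line illustration of the difference (hypothetical strings, not code from this PR):

```python
out = "create new commit."          # description gained a stray trailing dot
assert "create new commit" in out   # the substring check still passes
```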

I replaced the assertion approach with file_regression everywhere except two places: changelog and bump.

Analyzing the failure, I found the "problem" is in the file commitizen/version_schemes.py. There's a constant called KNOWN_SCHEMES that is a set of all available scheme providers. The problem is: set is an unordered structure, so the order changes on every call. Since the order changes on every call, we can't get stable results with file_regression:

[Screenshot: file_regression test failure caused by the changing KNOWN_SCHEMES order]

To fix this we can just replace the set comprehension with a list comprehension:

```diff
- KNOWN_SCHEMES = {ep.name for ep in metadata.entry_points(group=SCHEMES_ENTRYPOINT)}
+ KNOWN_SCHEMES = [ep.name for ep in metadata.entry_points(group=SCHEMES_ENTRYPOINT)]
```

I changed the set to a list and it passes all the tests, and now we can use file_regression. Is there a reason for this to be a set? Can we change it to a list?
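
A self-contained sketch of the flakiness described above (generic example, not commitizen code):

```python
# str hashes are randomized per interpreter run (PYTHONHASHSEED), so the
# iteration order of a set of strings can differ from one test run to the
# next. Run this script twice and the printed order may change:
schemes = {"pep440", "semver", "custom"}
print(list(schemes))

# A list comprehension (as in the diff above) keeps a deterministic order;
# an explicit sort is another way to pin it:
print(sorted(schemes))
```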

marcosdotme (Contributor Author)

@Lee-W I will wait for #1126 to be merged, because this branch will also fail on CI due to trailing whitespace in *.svg files.

Lee-W (Member)

It's merged 🎉

marcosdotme (Contributor Author)

@Lee-W After running ./scripts/test, it fails because of a missing ":" in a commit message:

[Screenshot: commit lint failure about the missing ":"]

The commit is:

[Screenshot: the offending commit message]

How to proceed? 🥲

marcosdotme (Contributor Author)

Maybe a rebase to edit the commit message?

@Lee-W (Member) commented May 20, 2024

Oh, just ignore it and push. You can rebase from the master branch instead of merging. DO NOT edit the commit message, though.
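
(In practice that means something like fetching the upstream remote and running git rebase against its master branch; the exact remote name depends on the contributor's setup.)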

…sion

Set is an unordered structure, so the order changes on every call.
Because of this unordered behavior we can't use `file_regression`
to test the command output.
@marcosdotme (Contributor Author)

@Lee-W Can you help me? I don't know what's happening here lol. Everything works fine on my local branch, all tests pass:

[Screenshot: all tests passing locally]

But it's still failing on CI.

@Lee-W (Member) commented May 20, 2024


Sure, let me take a look 🙂

@Lee-W (Member) commented May 20, 2024

I guess it's due to the latest main branch change. Could you please run poetry run pytest --force-regen and see whether it fixes the CI? If not, I'll take a deeper look.

@marcosdotme (Contributor Author)


I tried poetry run pytest --force-regen, but it fails, and I think it's because of the pytest-xdist plugin:

[Screenshot: pytest --force-regen failing under pytest-xdist]

So I tried to put --force-regen directly in ./scripts/test:

[Screenshot: --force-regen added to ./scripts/test]

But nothing changed:

[Screenshot: CI still failing]
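
(If pytest-xdist really is interfering, one possible workaround is disabling the plugin for just the regeneration run, e.g. poetry run pytest -p no:xdist --force-regen; -p no:<plugin> is pytest's standard switch for turning a plugin off.)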

@Lee-W (Member) commented May 20, 2024

Got it, let me take a look.

@Lee-W (Member) commented May 20, 2024

I know what's happening: argparse's help message differs across Python versions.

@Lee-W (Member) commented May 20, 2024

It changed in Python 3.10. We could probably generate two sets of expected results and run the test based on the Python version. We could try https://docs.pytest.org/en/latest/how-to/skipping.html#id1. Not sure whether there's something like parametrize we can use in this case. @marcosdotme do you want me to resolve it, or do you want to take a look?
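
A hedged sketch of the version-based skip being proposed here (test name and wiring are illustrative, not the exact code merged in this PR):

```python
import sys

import pytest

from commitizen import cli


@pytest.mark.skipif(
    sys.version_info < (3, 10),
    reason="argparse help layout changed in Python 3.10",
)
def test_commit_help_matches_regression(mocker, capsys, file_regression):
    mocker.patch.object(sys, "argv", ["cz", "commit", "--help"])
    # argparse prints the help text and then exits, so catch SystemExit
    with pytest.raises(SystemExit):
        cli.main()
    out, _ = capsys.readouterr()
    file_regression.check(out, extension=".txt")
```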

@marcosdotme (Contributor Author) commented May 20, 2024


Can you resolve this one, so I can learn by watching this time?

@Lee-W (Member) commented May 20, 2024


Sure 🙂

…changes in python 3.10

Optional arguments -> options
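
For context, the change this commit message refers to: in Python 3.10 argparse renamed the help section previously headed "optional arguments:" to "options:". A self-contained illustration:

```python
# Under Python 3.9 this prints a section headed "optional arguments:";
# under Python 3.10+ the same section is headed "options:", which is why
# regression files captured on different Python versions diverge.
import argparse

parser = argparse.ArgumentParser(prog="cz")
parser.add_argument("--name", help="an example option")
parser.print_help()
```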
@Lee-W (Member) commented May 20, 2024

@marcosdotme Done. I decided not to cover < 3.10, as it's not important enough to check.

@Lee-W added the pr-status: ready-to-merge label (almost ready to merge; just kept for a few days for others to check) and removed the pr-status: wait-for-review and pr-status: wait-for-modification labels on May 20, 2024
@marcosdotme (Contributor Author)


Ok! Thanks 🤝🏻

@Lee-W (Member) commented May 20, 2024

@woile @noirbizarre I'm planning on merging this in the next few days. Please let me know if you'd like to take a look. 🙂

@Lee-W merged commit 8f4dbe9 into commitizen-tools:master on May 22, 2024
18 checks passed
Labels: pr-status: ready-to-merge (almost ready to merge; just kept for a few days for others to check)

3 participants