Verify test-backend doesn't produce extra output spam in CI #1587

Closed
timabbott opened this issue Aug 10, 2016 · 12 comments

Comments

@timabbott
Sponsor Member

timabbott commented Aug 10, 2016

We occasionally introduce bugs where extra stdout output shows up in the test-backend output, which makes it harder to read our test output. It'd be great to add a quick check in tools/ci/backend for this type of issue (e.g. just redirect the output to a file via tee and then verify that output), so we can avoid merging new bugs of this form.
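
A rough sketch of that kind of wrapper (the is_expected_line() helper and its allowlist are placeholders rather than existing Zulip tooling, and var/test-backend-output.txt is just one possible location for the captured output):

# Hypothetical CI wrapper: run test-backend, tee its output to a file,
# and then verify that file contains only expected test-runner lines.
import subprocess
import sys

OUTPUT_PATH = "var/test-backend-output.txt"

def is_expected_line(line: str) -> bool:
    # Placeholder: a real check would match "Running ...", "OK",
    # "Ran N tests in ...", and similar test-runner output.
    return line.startswith("Running ")

with open(OUTPUT_PATH, "w") as log:
    proc = subprocess.Popen(
        ["./tools/test-backend"],
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        text=True,
    )
    assert proc.stdout is not None
    for line in proc.stdout:
        sys.stdout.write(line)  # behave like `tee`: echo to the console
        log.write(line)         # and record for later verification
    proc.wait()

with open(OUTPUT_PATH) as log:
    extra = [line for line in log if line.strip() and not is_expected_line(line)]

if proc.returncode != 0 or extra:
    sys.stderr.write("Unexpected test-backend output:\n" + "".join(extra))
    sys.exit(1)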

@sahildua2305
Contributor

@timabbott can you please explain this issue a bit?

@timabbott
Sponsor Member Author

Sure, if you run test-backend, there are a few tests that end up printing some unnecessary spammy output to the console. Often that can be a sign that something is wrong with the test, but in a few cases the output is harmless.

Below are the main examples:

Running zerver.tests.test_upload.FileUploadTest.test_delete_old_unclaimed_attachments
2016-09-18 14:08:40,211 WARNING  dummy_2.txt does not exist. Its entry in the database will be removed.
Running zerver.tests.test_upload.FileUploadTest.test_download_non_existent_file
...
Running zerver.tests.test_email_mirror.TestDigestEmailMessages.test_receive_digest_email_messages
2016-09-18 14:08:27,938 INFO     Sending digest email for othello@zulip.com
Running zerver.tests.test_email_mirror.TestMissedHuddleMessageEmailMessages.test_receive_missed_huddle_message_email_messages
2016-09-18 14:08:28,002 INFO     Successfully processed email from cordelia@zulip.com to cordelia@zulip.com,iago@zulip.com,othello@zulip.com
Running zerver.tests.test_email_mirror.TestMissedMessageAddressWithEmptyGateway.test_address_with_empty_gateway
2016-09-18 14:08:28,052 WARNING  EMAIL_GATEWAY_PATTERN is an empty string, using NOREPLY_EMAIL_ADDRESS in the 'from' field.
Running zerver.tests.test_email_mirror.TestMissedPersonalMessageEmailMessages.test_receive_missed_personal_message_email_messages
2016-09-18 14:08:28,106 INFO     Successfully processed email from othello@zulip.com to hamlet@zulip.com
Running zerver.tests.test_email_mirror.TestReplyExtraction.test_reply_is_extracted_from_html
2016-09-18 14:08:28,283 INFO     Successfully processed email to Denmark (zulip.com)
Running zerver.tests.test_email_mirror.TestReplyExtraction.test_reply_is_extracted_from_plain
2016-09-18 14:08:28,326 INFO     Successfully processed email to Denmark (zulip.com)
Running zerver.tests.test_email_mirror.TestStreamEmailMessagesEmptyBody.test_receive_stream_email_messages_empty_body
Running zerver.tests.test_email_mirror.TestStreamEmailMessagesSuccess.test_receive_stream_email_messages_success
2016-09-18 14:08:28,386 INFO     Successfully processed email to Denmark (zulip.com)
Running zerver.tests.test_events.EventQueueTest.test_collapse_event
...
Running zerver.tests.test_management_commands.TestSendWebhookFixtureMessage.test_check_if_command_exits_when_fixture_path_does_not_exist
Fixture some/fake/path.json does not exist
Running zerver.tests.test_management_commands.TestSendWebhookFixtureMessage.test_check_if_command_exits_when_url_param_is_empty

The problem is that unexpected logging output appearing in the test output is often a sign that a bug has been introduced, and we'd like to catch those bugs automatically.

(There are a few types of output, like that generated by @skip or the "TOO SLOW" warnings, that we probably want to let be).

Ideally, we'd adjust those tests to not end up printing output, usually by changing the test code to capture the logging output and verify it (e.g. using mock.patch("logging.info") to mock the logging module). And then we'd make some sort of change to the test runner to make sure new regressions of this form are not introduced.
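
For example, a minimal, self-contained sketch of that pattern (do_work() here is a stand-in for whatever code under test currently logs, not real Zulip code):

# Patch logging so the test asserts on the message instead of printing it.
import logging
import unittest
from unittest import mock


def do_work() -> None:
    # Stand-in for production code that logs while the test runs.
    logging.info("Sending digest email for othello@zulip.com")


class ExampleTest(unittest.TestCase):
    def test_no_console_spam(self) -> None:
        with mock.patch("logging.info") as mock_info:
            do_work()
        mock_info.assert_called_once_with(
            "Sending digest email for othello@zulip.com"
        )


if __name__ == "__main__":
    unittest.main()

self.assertLogs() is another standard-library option for capturing and asserting on log output inside a test without letting it reach the console.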

aayushagrawal9 added a commit to aayushagrawal9/zulip that referenced this issue Mar 11, 2018
…p#1587

Here we create a new tool, tools/test-backend-wrapper, which calls test-backend
and saves the output to var/test-backend-output.txt. A Python file,
tools/test-backend-output-check.py, prints all lines of output that do
not match any of the regular expressions defined inside, which correspond
to all valid output formats.
aayushagrawal9 added a commit to aayushagrawal9/zulip that referenced this issue Mar 11, 2018
…p#1587.

Create new wrapper tools/test-backend-wrapper to save output of
test-backend to var/test-backend-output.txt
Create new tool tools/test-backend-output-check to parse output of
test-backend and print any invalid output lines
timabbott changed the title from "Verify test-backend doesn't produce extra output spam in Travis CI" to "Verify test-backend doesn't produce extra output spam in CI" on Jun 19, 2020
@thedeveloperr
Collaborator

@timabbott what should we do if we find spammy output during verification? Should we raise some kind of error?

@timabbott
Sponsor Member Author

Yes, that's the plan, though of course 95% of the work here is going to be clearing the existing spammy output.

@thedeveloperr
Collaborator

@zulipbot claim

@zulipbot
Member

zulipbot commented Jul 9, 2020

Hello @thedeveloperr, you have been unassigned from this issue because you have not updated this issue or any referenced pull requests for over 14 days.

You can reclaim this issue or claim any other issue by commenting @zulipbot claim on that issue.

Thanks for your contributions, and hope to see you again soon!

@thedeveloperr
Collaborator

Working

@thedeveloperr
Collaborator

thedeveloperr commented Jul 11, 2020

@timabbott
Before fixing this, I want to confirm: are any of the line patterns below spam according to you, i.e. output we don't want in the test output?

I took this from PR #8653:

valid_line_patterns = [
    r"^Running zerver",  # Example: Running zerver.tests.test_attachments.AttachmentsTests.test_delete_unauthenticated
    r"^Running analytics",  # Example: Running analytics.tests.test_counts.TestActiveUsersAudit.test_empty_realm_or_user_with_no_relevant_activity
    r"^\*\* Test is TOO slow: ",  # Example: ** Test is TOO slow: analytics.tests.test_counts.TestRealmActiveHumans.test_end_to_end (0.581 s)
    r"^----------------------------------------------------------------------",  # Example: ----------------------------------------------------------------------
    r"^INFO: URL coverage report is in",  # Example: INFO: URL coverage report is in var/url_coverage.txt
    r"^INFO: Try running:",  # Example: INFO: Try running: ./tools/create-test-api-docs
    r"^-- Running tests in",  # Example: -- Running tests in parallel mode with 4 processes
    r"^OK",  # Example: OK
    r"^Ran [0-9]+ tests in",  # Example: Ran 2139 tests in 115.659s
    r"^Testing management command:",  # Testing management command: sync_api_key
    r"^Running Slack Message Conversion test:",  # Running Slack Message Conversion test: bold_and_strike_conversion
    r"^Destroying test database for alias ",  # Destroying test database for alias 'default'...
    r"^Running corporate",
]
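
A hypothetical check built on that list (assuming valid_line_patterns above is in scope, and an output file like the one produced by the wrapper idea earlier in the thread) might look like:

import re
import sys
from typing import List


def find_extra_output(path: str, patterns: List[str]) -> List[str]:
    # Return every non-empty line of the captured output that matches
    # none of the valid patterns.
    compiled = [re.compile(pattern) for pattern in patterns]
    with open(path) as f:
        return [
            line.rstrip("\n")
            for line in f
            if line.strip() and not any(p.match(line) for p in compiled)
        ]


if __name__ == "__main__":
    extra = find_extra_output("var/test-backend-output.txt", valid_line_patterns)
    if extra:
        print("Unexpected output from test-backend:")
        for line in extra:
            print("    " + line)
        sys.exit(1)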

@thedeveloperr
Collaborator

https://gist.github.com/thedeveloperr/cdc55a319d1782d12e5442c2b6c0125d/revisions

Here is the current potential spam I found. Please take a look and flag any lines that should not be considered spam.

@timabbott
Sponsor Member Author

The Running lines come from test-backend itself, and I don't consider them to be spam; nor the OK, Ran, etc. that test-backend itself outputs. The spam is other output printed while running individual tests.

@timabbott
Sponsor Member Author

"Using existing clone for alias 'default'..." is from the test system, as is "Skipping", but the rest of that looks like spam to me.

@thedeveloperr
Collaborator

@zulipbot claim

Still working on it.

thedeveloperr added a commit to thedeveloperr/zulip that referenced this issue Aug 22, 2020
This commit adds automatic spam detection in output printed to
stderr and stdout by test-backend when it is run with flag
--ban-extra-output.
It also prints the test file and method that produced the
spam even in parallel running mode.
Fixes: zulip#1587.
thedeveloperr added a commit to thedeveloperr/zulip that referenced this issue Aug 26, 2020
This commit adds automatic detection of extra output (other than that
printed by the testing library or tools) on stderr and stdout by code
under test, when test-backend is run with the --ban-console-output flag.
It also prints the test that produced the extra console output.

Fixes: zulip#1587.
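
For reference, the general technique described in that commit message — capturing anything written to stdout/stderr during a test and reporting which test produced it — can be sketched roughly as follows (a simplified illustration, not the actual implementation from the PR):

# Simplified sketch: redirect stdout/stderr around each test and report
# any captured output together with the id of the offending test.
import io
import sys
import unittest
from contextlib import redirect_stderr, redirect_stdout


class BanConsoleOutputTestCase(unittest.TestCase):
    def run(self, result=None):
        captured = io.StringIO()
        with redirect_stdout(captured), redirect_stderr(captured):
            super().run(result)
        output = captured.getvalue()
        if output:
            # A real runner would turn this into a test failure; here we
            # just report the test that produced the extra console output.
            sys.stderr.write(f"Extra console output in {self.id()}:\n{output}")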
timabbott pushed a commit to thedeveloperr/zulip that referenced this issue Aug 26, 2020
This commit adds automatic detection of extra output (other than that
printed by the testing library or tools) on stderr and stdout by code
under test, when test-backend is run with the --ban-console-output flag.
It also prints the test that produced the extra console output.

Fixes: zulip#1587.