
Linux web_long_running_tests_2_5 is 2.04% flaky #143834

Closed
fluttergithubbot opened this issue Feb 21, 2024 · 16 comments
Labels: c: flake (Tests that sometimes, but not always, incorrectly pass), P0 (Critical issues such as a build break or regression), team-web (Owned by Web platform team)

@fluttergithubbot (Contributor)

The post-submit test builder Linux web_long_running_tests_2_5 had a flaky ratio of 2.04% over the past (up to) 100 commits, which is above our 2.00% threshold.

One recent flaky example for the same commit: https://ci.chromium.org/ui/p/flutter/builders/prod/Linux%20web_long_running_tests_2_5/14957
Commit: fe4ab13

Flaky builds:
https://ci.chromium.org/ui/p/flutter/builders/prod/Linux%20web_long_running_tests_2_5/14957
https://ci.chromium.org/ui/p/flutter/builders/prod/Linux%20web_long_running_tests_2_5/14953

Recent test runs:
https://flutter-dashboard.appspot.com/#/build?taskFilter=Linux%20web_long_running_tests_2_5

Please follow https://github.com/flutter/flutter/wiki/Reducing-Test-Flakiness#fixing-flaky-tests to fix the flakiness and re-enable the test after validating the fix (internal dashboard for validation: go/flutter_test_flakiness).
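
For context on how the ratio above is computed: it is simply the number of flaky commits divided by the number of commits considered, so, for example, 2 flaky commits out of 98 works out to roughly 2.04%. A minimal sketch of that arithmetic in Dart (hypothetical names, not the actual flutter-dashboard code):

// Hypothetical sketch of the flakiness metric; not the dashboard's real code.
class CommitResult {
  CommitResult(this.sha, {required this.flaky});
  final String sha;
  final bool flaky; // a commit counts as flaky if its builds both failed and passed
}

double flakyRatio(List<CommitResult> commits) {
  if (commits.isEmpty) return 0;
  final int flakyCount = commits.where((c) => c.flaky).length;
  return flakyCount / commits.length;
}

void main() {
  // 2 flaky commits out of 98 considered => ~2.04%, above the 2.00% threshold.
  final List<CommitResult> commits = List<CommitResult>.generate(
    98,
    (int i) => CommitResult('sha$i', flaky: i < 2),
  );
  print('${(flakyRatio(commits) * 100).toStringAsFixed(2)}%'); // 2.04%
}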

fluttergithubbot added the c: flake, P0, and team-web labels on Feb 21, 2024
@yjbanov (Contributor) commented Feb 27, 2024

No update yet. Just started tracking this yesterday.

@yjbanov (Contributor) commented Mar 6, 2024

Still no update.

@fluttergithubbot (Contributor, Author)

[prod pool] flaky ratio for the past (up to) 100 commits between 2024-02-28 and 2024-03-05 is 0.00%. Flaky number: 0; total number: 100.

@yjbanov (Contributor) commented Mar 12, 2024

I haven't made any progress on this.

@fluttergithubbot (Contributor, Author)

@harryterkelsen (Contributor)

Possibly due to #109474?

The test just times out while trying to connect.

@fluttergithubbot (Contributor, Author)

[prod pool] flaky ratio for the past (up to) 100 commits between 2024-03-13 and 2024-03-19 is 1.01%. Flaky number: 1; total number: 99.
One recent flaky example for the same commit: https://ci.chromium.org/ui/p/flutter/builders/prod/Linux%20web_long_running_tests_2_5/15456
Commit: ca864c0
Flaky builds:
https://ci.chromium.org/ui/p/flutter/builders/prod/Linux%20web_long_running_tests_2_5/15456

Recent test runs:
https://flutter-dashboard.appspot.com/#/build?taskFilter=Linux%20web_long_running_tests_2_5

@eyebrowsoffire (Contributor)

This has gotten even more flaky lately. It seems to always flake on text_editing_integration.dart, so I'm just going to disable that test for now. I spent some time reproducing this locally, and with the browser open a lot of wacky stuff is clearly going on, but I didn't figure out the exact root cause. We should dig deeper on this and get the test re-enabled.

@harryterkelsen (Contributor)

Continuing to investigate this issue. The cause is still unknown.

@eyebrowsoffire (Contributor) commented Apr 8, 2024

I seem to be reproducing this locally now, so I have a bit more info to share. For one thing, I believe these flakes happen only when running flutter drive in --debug mode. Doing this in --release, which uses dart2js, doesn't seem to produce the same kind of flaky hang.

My repro steps are as follows:

flutter drive --driver=test_driver/integration_test.dart --target=integration_test/example_test.dart --browser-name=chrome -d web-server --debug --web-renderer=canvaskit --no-headless

This doesn't reproduce 100% of the time, but it usually takes no more than a couple of tries before the browser stalls. When this happens, the JS console usually shows some sort of error related to module loading. Here is one such example:

dart_sdk.js:56 Uncaught TypeError: Cannot read properties of undefined (reading 'Symbol(_privateNames)')
    at dart.privateName (dart_sdk.js:56:24)
    at load__packages__flutter_test__src__test_text_input_key_handler_dart (test_text_input_key_handler.dart.js:1057:23)
    at Object.execCb (require.js:1696:33)
    at Module.check (require.js:883:51)
    at Module.<anonymous> (require.js:1139:34)
    at require.js:134:23
    at require.js:1189:21
    at each (require.js:59:31)
    at Module.emit (require.js:1188:17)
    at Module.check (require.js:938:30)
    at Module.<anonymous> (require.js:1139:34)
    at require.js:134:23
    at require.js:1189:21
    at each (require.js:59:31)
    at Module.emit (require.js:1188:17)
    at Module.check (require.js:938:30)
    at Module.<anonymous> (require.js:1139:34)
    at require.js:134:23
    at require.js:1189:21
    at each (require.js:59:31)
    at Module.emit (require.js:1188:17)
    at Module.check (require.js:938:30)
    at Module.<anonymous> (require.js:1139:34)
    at require.js:134:23
    at require.js:1189:21
    at each (require.js:59:31)
    at Module.emit (require.js:1188:17)
    at Module.check (require.js:938:30)
    at Module.<anonymous> (require.js:1139:34)
    at require.js:134:23
    at require.js:1189:21
    at each (require.js:59:31)
    at Module.emit (require.js:1188:17)
    at Module.check (require.js:938:30)
    at Module.<anonymous> (require.js:1139:34)
    at require.js:134:23
    at require.js:1189:21
    at each (require.js:59:31)
    at Module.emit (require.js:1188:17)
    at Module.check (require.js:938:30)
    at Module.<anonymous> (require.js:1139:34)
    at require.js:134:23
    at require.js:1189:21
    at each (require.js:59:31)
    at Module.emit (require.js:1188:17)
    at Module.check (require.js:938:30)
    at Module.<anonymous> (require.js:1139:34)
    at require.js:134:23
    at require.js:1189:21
    at each (require.js:59:31)
    at Module.emit (require.js:1188:17)
    at Module.check (require.js:938:30)
    at Module.enable (require.js:1176:22)
    at Module.init (require.js:788:26)
    at callGetModule (require.js:1203:63)
    at Object.completeLoad (require.js:1590:21)
    at HTMLScriptElement.onScriptLoad (require.js:1717:29)

The stacks vary across repros, but it always appears to be some sort of module-loading issue: the JavaScript seems to assume that some other script has already run and set up some global state.
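
Since the hang only shows up intermittently, a small retry script makes it easier to catch in the act. The sketch below simply re-runs the flutter drive invocation from the repro steps above until an attempt fails or stalls; it is a rough, hypothetical helper (the 10-minute hang timeout is an arbitrary choice), not something checked into the repo:

// Rough repro helper (hypothetical): repeat the `flutter drive` command from
// the repro steps above until a run fails or hangs.
import 'dart:io';

Future<void> main() async {
  const List<String> args = <String>[
    'drive',
    '--driver=test_driver/integration_test.dart',
    '--target=integration_test/example_test.dart',
    '--browser-name=chrome',
    '-d', 'web-server',
    '--debug',
    '--web-renderer=canvaskit',
    '--no-headless',
  ];
  for (int attempt = 1; ; attempt++) {
    stdout.writeln('Attempt $attempt...');
    final Process process =
        await Process.start('flutter', args, mode: ProcessStartMode.inheritStdio);
    // Treat a run that takes longer than 10 minutes (arbitrary) as the hang.
    final int code = await process.exitCode.timeout(
      const Duration(minutes: 10),
      onTimeout: () {
        process.kill(ProcessSignal.sigkill);
        return -1;
      },
    );
    if (code != 0) {
      stdout.writeln('Reproduced on attempt $attempt (exit code $code).');
      break;
    }
  }
}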

@fluttergithubbot (Contributor, Author)

[prod pool] flaky ratio for the past (up to) 100 commits between 2024-04-03 and 2024-04-09 is 0.00%. Flaky number: 0; total number: 94.

@fluttergithubbot (Contributor, Author)

[prod pool] flaky ratio for the past (up to) 100 commits between 2024-04-08 and 2024-04-16 is 0.00%. Flaky number: 0; total number: 98.

@yjbanov (Contributor) commented Apr 22, 2024

It seems these flakes migrated into #146877. Deduping.

yjbanov closed this as completed on Apr 22, 2024
github-actions bot commented May 6, 2024

This thread has been automatically locked since there has not been any recent activity after it was closed. If you are still experiencing a similar issue, please open a new bug, including the output of flutter doctor -v and a minimal reproduction of the issue.

github-actions bot locked as resolved and limited conversation to collaborators on May 6, 2024