
tests: Wait for all metrics to show up in ITCompositeTracer #12880

Merged
blakeli0 merged 4 commits into main from fix-otel-tests on Apr 22, 2026
Conversation

@blakeli0 (Contributor) commented Apr 21, 2026

Add a 100ms sleep to wait for all metrics to show up in ITCompositeTracer.
This is because metrics are captured in a separate thread rather than the main thread. See #12657

Skip formatter check in java-dataplex since it is flaky.

@blakeli0 blakeli0 requested a review from a team as a code owner April 21, 2026 19:25
@gemini-code-assist (Bot) left a comment


Code Review

This pull request introduces a 100ms sleep in the ITCompositeTracer test to give asynchronous metrics collection time to complete. Using Thread.sleep() is brittle and can lead to flaky tests; a polling mechanism such as Awaitility would make the test more robust and efficient.

```java
        .get(AttributeKey.stringKey(ObservabilityAttributes.SERVER_ADDRESS_ATTRIBUTE)))
    .isEqualTo(SHOWCASE_SERVER_ADDRESS);

Thread.sleep(100);
```
Contributor


Severity: medium

Using Thread.sleep() to handle asynchronous metrics collection is brittle and can lead to flaky tests. In a slow or resource-constrained CI environment, 100ms may not be sufficient, while in faster environments, it introduces unnecessary delays. A more robust approach is to use a polling mechanism, such as Awaitility, to wait for the metrics to be available with a reasonable timeout. This ensures the test proceeds as soon as the condition is met and is less sensitive to environment speed. Additionally, when testing for metric generation, it is acceptable to assert the exact number of metrics to ensure no unexpected metrics are produced.

References
  1. When testing for metric generation, it is acceptable to assert the exact number of metrics to ensure no unexpected metrics are produced.

Member


+1

Contributor Author


Agree with this in general, but we haven't introduced Awaitility yet. This PR is meant to unblock the current CI; introducing Awaitility should be a separate effort.
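Until Awaitility is on the classpath, the polling idea the reviewer suggests can be approximated with a small hand-rolled helper; a sketch (the `PollUntil` name, timeouts, and the simulated condition are illustrative, not part of this PR):

```java
import java.util.function.BooleanSupplier;

// Minimal stand-in for Awaitility-style polling: re-check a condition
// at a fixed interval until it holds or a deadline passes, instead of
// a single fixed Thread.sleep(100).
public final class PollUntil {
  /**
   * Polls {@code condition} every {@code pollMillis} until it returns true
   * or {@code timeoutMillis} elapses; returns whether it became true.
   */
  public static boolean pollUntil(
      BooleanSupplier condition, long timeoutMillis, long pollMillis)
      throws InterruptedException {
    long deadline = System.nanoTime() + timeoutMillis * 1_000_000L;
    while (System.nanoTime() < deadline) {
      if (condition.getAsBoolean()) {
        return true;
      }
      Thread.sleep(pollMillis);
    }
    return condition.getAsBoolean(); // one last check at the deadline
  }

  public static void main(String[] args) throws InterruptedException {
    // Simulate metrics that become visible asynchronously after ~80ms.
    long start = System.nanoTime();
    boolean ready =
        pollUntil(() -> System.nanoTime() - start > 80_000_000L, 2000, 10);
    System.out.println(ready); // condition is met well before the timeout
  }
}
```

This returns as soon as the condition holds (fast environments don't pay the full delay) while tolerating slow CI machines up to the timeout, which is the same trade-off Awaitility's `await().atMost(...).until(...)` makes.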

@diegomarquezp (Contributor) left a comment


Good as a temporary solution. Also +1 to Awaitility.


@blakeli0 blakeli0 enabled auto-merge (squash) April 22, 2026 22:36
@blakeli0 blakeli0 merged commit 27feef5 into main Apr 22, 2026
157 checks passed
@blakeli0 blakeli0 deleted the fix-otel-tests branch April 22, 2026 22:39
