
"enableSoftAssert": true does not allow cy.vrtTrack() function to run in async and allow tests to run independently #170

Open
alexniculae opened this issue Mar 29, 2022 · 5 comments
Labels: bug (Something isn't working), question (Further information is requested)



alexniculae commented Mar 29, 2022

Tested the below use case with the Cypress-Agent:

  • in the vrt.json file in the root of the project add "enableSoftAssert": true
  • for a simple test that runs in 1 sec create 10 vrtTracks at the end of the test
  • the test now takes 70-80 seconds to finish
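For reference, a minimal vrt.json with that flag might look like the sketch below; the connection fields shown here are the usual SDK settings from memory, and their exact names and values should be checked against the js-sdk docs:

```json
{
  "apiUrl": "http://localhost:4200",
  "project": "Default project",
  "apiKey": "YOUR_API_KEY",
  "enableSoftAssert": true
}
```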

Is there any way to allow the tests to run async and independently of the VRT engine, at least when enableSoftAssert is set to true?

If we don't want to fail the tests because of the results of the visual test, then at least we could leverage the speed of the tests.

Let me know if this issue should actually be moved to the js-sdk - I was not 100% sure, but that might be best 🙂

pashidlos (Member) commented:

@alexniculae thanks for your report!

there is no restriction on async test execution when using this option
my bet is that this is happening because of the retry mechanism, which currently does not respect the enableSoftAssert option

could you try changing the default retry number, like here, and check the timing?

cy.vrtTrack("Whole page with additional options", {
  retryLimit: 0,
});

pashidlos added the bug (Something isn't working) and question (Further information is requested) labels on Mar 30, 2022
alexniculae (Author) commented Mar 30, 2022

@pashidlos, thank you 🙂 the retries are already set to 0 in this case

what I see, though, is that with enableSoftAssert set to true or false, the time it takes for the tests to run is the same

because of this, I assumed that, with the option set to either true or false, the tests wait for the VRT response/result before being allowed to finish/proceed

I tried looking into the VRT code to see if there is anything that keeps the tests ‘blocked’ until VRT finishes processing, but I'm not entirely sure whether this logic could be coming from here:

(result) => checkResult(result),

so rather than a bug, I'd see this ticket as a feature request: allow the (Cypress) tests to finish/proceed before VRT finishes its processing and replies with its results

so the VRT engine does its work in the background, independently from (in this case) Cypress, as the Cypress test does not depend on the VRT result/response

looking forward to your thoughts

later edit: the Cypress agent should probably skip waiting for the VRT response, but only when both of the following conditions are met:

enableSoftAssert: true && retryLimit: 0

pashidlos (Member) commented:

@alexniculae you are right, the test waits for the response from VRT in order to know the result
not sure how this would work if the image comparison result were sent async
so far I see one change for sure:

  • ignore retry count if enableSoftAssert is true

another thing is to make the image comparison async - not sure if this is currently possible
do you have any examples of other plugins or visual testing tools that do it this way?

I would first try to review whether it's possible to speed up the response from VRT
what do you think?

alexniculae (Author) commented:

not sure how this would work if the image comparison result were sent async

@pashidlos, I see your point here; but in the following scenario:

enableSoftAssert: true
&&
retryLimit: 0

there is no longer any reason to wait for the answer from the image comparison; no matter what the result of the visual test is, it is irrelevant to the e2e test; or is there something that I didn't see? 🙂

for the rest of the configuration options (e.g. enableSoftAssert: false OR retryLimit > 0), this would no longer be a valid option, as (in this case) Cypress needs to know the result in order to handle the behavior.

possibly this is a simplistic approach, but a ternary that avoids waiting for the visual test result in the first configuration example above might save a few ms per screenshot
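a hypothetical sketch of that decision (function and option names here are assumptions for illustration, not the actual agent code):

```javascript
// Hypothetical sketch (assumed names, not the actual Cypress agent code):
// the runner only needs to block on the VRT comparison result when that
// result can change test behavior, i.e. when it can fail the test
// (soft assert off) or trigger a retry.
function shouldAwaitResult(config, options) {
  const softAssert = config.enableSoftAssert === true;
  const retryLimit =
    options && options.retryLimit !== undefined ? options.retryLimit : 0;
  return !(softAssert && retryLimit === 0);
}
```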

what do you think?

so far I see one change for sure:

  • ignore retry count if enableSoftAssert is true

I would avoid doing this, for the following scenario(s):

  • I treat e2e tests and visual regression tests differently and separately - even though they live together (e2e tests test functionality, visual tests test the visuals 🙂)
  • but I do want to leverage the power and the (already existing) scripts of the e2e tests for the visual tests (no better way)
  • so I never want my e2e tests to fail because of the visual test results, but I do want a report for my visual tests to see what happens there (this is where the VRT dashboard has all this power behind it)
  • given this, let's say I want enableSoftAssert: true && retryLimit: 2 - because I want Cypress to take care of the functional tests (e2e) but also want VRT to try again in case the visual test has failed (and so have a stable and reliable visual test report)
  • just to mention, though: I would very rarely set the retry on VRT to anything but 0; to me, the e2e test should make all the necessary assertions and be prepared in such a way that the state of the app when taking the visual test is always the same - so there is no real reason for the visual test to fail (and if it does fail, it probably means there's a flake somewhere in the test); of course this is not 100% always true, which is why I would recommend against removing the retries logic when enableSoftAssert: true

another thing is to make image comparison async - not sure if this is currently possible

from my point of view, this is only doable in the case where e2e tests don't need to wait for the response from VRT (as per the first point in this comment)

I assume the comparison engine only needs to receive the screenshot (+details/options) from the e2e test; then (if the e2e test does not need to wait for the visual test response) the engine does the comparison by itself and is in no way dependent on the e2e test

so, again, the (probably over-simplistic) solution of adding a decision on whether to wait for the response in some cases (enableSoftAssert: true && retryLimit: 0)
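the fire-and-forget idea could look roughly like this (a sketch with assumed helper names, not the real SDK API):

```javascript
// Hypothetical fire-and-forget sketch (assumed helper names, not the real
// SDK API): submit the screenshot, handle the comparison result in the
// background, and let the test proceed immediately.
function trackWithoutWaiting(submitScreenshot, onResult, onError) {
  submitScreenshot()
    .then(onResult)  // e.g. record the outcome for the dashboard/report
    .catch(onError); // log only; never fail the e2e test
  // resolve right away so the test is not blocked on the comparison
  return Promise.resolve();
}
```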

I would first try to review if it's possible to speed up response from VRT

that would also be great, but I see my proposal above as going a long way toward optimizing performance in some cases as well (such as the scenario of monitoring the visual tests independently of the e2e ones) 🙂

long message, but I wanted to share the details, as I am pretty sure we have different ways of using and leveraging the power of VRT, which is great, because good things come from mixing opinions and approaches 🙂

in case you'd like to discuss this further, I'd be glad to have a call on the topic

alexniculae (Author) commented:

another way this could be made easier for users (though these are separate issues to consider and log in their own time): the visual tests would no longer block the e2e tests while running the image comparison, the comparison could run async, and we could:

  • have the results be integrated in the CI pipeline, so we’d see the end results of the comparison in near real time there
  • have a Slack/Teams/others integration to receive a results report summary on a channel there
  • continue enhancing the dashboard for it to become a structured reporting dashboard for the comparison (along with the management features for the comparisons)
