
Separate implementation of jest-snapshot #2497

lithin opened this issue Jan 4, 2017 · 5 comments


commented Jan 4, 2017

Do you want to request a feature or report a bug?

Having implemented jest-snapshot in AVA, I think a bit more work could be done to separate snapshots into their own package. The three main issues I've run into are:

  • When a snapshot comparison fails, jest returns a single fully formatted message, which makes it difficult to tie into the reporting of other test runners
  • When a test has been removed (or even a whole file's worth of snapshot tests), jest-snapshot doesn't delete the corresponding snapshots. It leaves this to the implementing framework, although it really is a core feature of snapshot testing
  • Watching isn't interactive by default - this definitely needs to happen in the implementation, but it would be nice to have a hint on how to get there.

Solution for Reporting
Reporting a failure from jest-snapshot would be better done as a data structure (an object?) so that the implementing framework can decide how to display the information.
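To make the idea concrete, here is a minimal sketch of what such a structured result could look like. This is purely illustrative - the function and field names are assumptions, not jest-snapshot's actual API:

```javascript
// Hypothetical structured snapshot-comparison result. Instead of a
// pre-rendered message, the matcher returns raw data and leaves
// rendering to the consuming test runner.
function compareSnapshot(received, stored) {
  const pass = received === stored;
  return {
    pass,
    expected: stored, // serialized snapshot stored on disk
    actual: received, // serialized value produced by this run
  };
}
```

A runner like AVA could then feed `expected`/`actual` into its own diffing and coloring instead of printing Jest's message verbatim.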

Solution for Deleting?
I'm not sure about this one to be honest. These are my thoughts:

When it comes to tests being deleted from a file (so index.test.js changes but its snapshot file doesn't), it might be sufficient for the state to check, at the end of the process, that the tests it has run exactly match the snapshots that exist, deleting any redundant snapshots. That would mean, however, that at some point the implementation needs to notify the state that the relevant test file has finished running.

Deleting redundant files poses a similar issue, only at a higher level. When all tests have finished, we need to check that there are no orphaned snapshot files lying around anywhere in the project codebase.

Another solution would be to "scan" the test suite before running it and resolving these issues before the tests start running.
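The "check at the end of the file" idea above could be sketched roughly as follows. All names here are illustrative assumptions, not the real jest-snapshot state API:

```javascript
// Sketch: compare the snapshot keys a test run actually touched against
// the keys stored on disk for that file, and report the rest as
// obsolete so they can be deleted.
function pruneObsolete(storedSnapshots, touchedKeys) {
  const touched = new Set(touchedKeys);
  const kept = {};
  const removed = [];
  for (const key of Object.keys(storedSnapshots)) {
    if (touched.has(key)) {
      kept[key] = storedSnapshots[key];
    } else {
      removed.push(key);
    }
  }
  return { kept, removed };
}
```

The implementing framework would call this once per test file, after signaling that the file has finished running.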

I'm sure jest has sorted this out somehow already. Could you briefly explain to me how/where the deletion happens? It might only be a case of moving things around a bit.

Solution for Interactive Watch
Could you point me to where this is implemented in jest? It might be worth investigating whether it is part of the same problem as the other two issues.


commented Jan 4, 2017

Deleting snapshots while keeping the state consistent is what we do in Jest, and it's not as straightforward as it sounds because there are exceptions. I remember @dmitriiabramov added some code to detect when one runs a focused test (a real-world scenario might be debugging an issue or writing a brand-new test); in that case you don't want to delete all the other snapshots.
Also, if a test fails, you don't want to delete the associated snapshot.
I think everything we know about a test run should now be handled by the State object here:

Edit: I think this is the commit I was thinking about 26478b5
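The exceptions described above (focused runs, failing tests) amount to a guard in front of any cleanup step. A minimal sketch, with field names that are assumptions rather than Jest's actual state shape:

```javascript
// Illustrative guard: only prune obsolete snapshots when the run
// exercised every test (no .only/focused tests) and everything passed.
// Otherwise, "obsolete-looking" snapshots may simply not have run.
function shouldPruneSnapshots(runInfo) {
  return !runInfo.hasFocusedTests && runInfo.numFailingTests === 0;
}
```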


commented Jan 5, 2017

We spent a ton of time at Facebook getting the UX for snapshot testing just right. I think we did well, even though we are still at the beginning – there is so much more we can do. I believe it is critical to make the snapshot feature more generic so that bringing snapshots to other frameworks will also result in a great experience. My main worry about adopting jest-snapshot in other projects has been that the integration may not be as good as in Jest, and that it would steer people away from using this assertion for the wrong reasons. Thanks @lithin for bringing this up; here are my thoughts:

  • Reporting: I believe we now already return the expected and received values so you can do your own pretty-printing. If not, that should be easy to add.
  • Deleting: Jest does a ton of work to keep individual snapshots and snapshot files consistent. We designed the feature so that you never end up with a test file without a snapshot file, a snapshot file without a test file, or obsolete snapshots within a file. I believe that keeping a project consistent is imperative for the snapshot feature to be attractive. One thing we haven't gotten to yet: when an obsolete snapshot (file) is detected, we fail the test run but we don't fail the individual test (if available) that should be failing – this would help troubleshoot which snapshot Jest has a problem with. Cleanup and consistency checks can be supported if we make the cleanup interface generic: as Cristian points out, the state management for individual snapshots associated with a test should already be exposed, and it is up to the test library to decide how to handle failures and skipped tests.
  • Watch: the watch mode is specific to Jest and can potentially be extracted to be re-used by ava. We are rewriting it in #2362 and there is nothing that prevents it from being extracted into a separate jest-watch package (cc @rogeliog). This of course is only going to work if you are ok with taking on jest-haste-map as a dependency (this is what does all the static analysis for Jest).
  • Summary: While extracting the watch mode would be cool, I think it is much more important to also extract the code from the summary reporter that shows the status of snapshots in the project. See – can we find a way to make this generic so it can be consumed by other frameworks? I think there is value in sharing the same status report.
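On the summary point: if the per-file snapshot status were exposed as plain data, any framework could aggregate and render it. A hypothetical sketch of such a generic aggregation (all field names are assumptions, not the summary reporter's real interface):

```javascript
// Illustrative: fold per-file snapshot stats into one project-wide
// summary that any test runner could render in its own style.
function summarizeSnapshots(fileResults) {
  return fileResults.reduce(
    (acc, r) => ({
      added: acc.added + r.added,
      updated: acc.updated + r.updated,
      obsolete: acc.obsolete + r.obsolete,
    }),
    { added: 0, updated: 0, obsolete: 0 }
  );
}
```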

There are also a ton more things we'd like to add to the snapshot feature, such as an interactive update mode, but those should be discussed in separate issues.

@thymikee thymikee added the Discussion label Jan 13, 2017


commented Feb 3, 2017

When a snapshot comparison fails, jest returns a whole message which makes it difficult to tie in with reporting in other test runners

As @cpojer mentioned, the actual and expected strings are returned now, we needed it in wallaby.js as well.


commented May 26, 2018

This is one of our oldest issues without an update, and we've made a number of improvements to snapshots in the last year. To make some progress, I think it would be helpful to understand which parts of this issue are still relevant.


commented May 26, 2018

I think some points here are still valid, but there has been little incentive to go after them. We can close this and revive it when we find motivation (i.e. another test runner wanting to double down on Jest).

@cpojer cpojer closed this May 26, 2018
