Run Jest in production mode #11616

Merged: 35 commits into facebook:master on Nov 22, 2017

Conversation

gaearon (Member) commented Nov 21, 2017

Supersedes #10273.

See individual commits as there’s quite a bit here.

This is almost ready. The remaining failures are actual small behavior differences (e.g. less detailed messages) that I’ll need to account for in the tests themselves. After those are fixed, I’ll add yarn test-prod to the CI runs.

Test Suites: 17 failed, 104 passed, 121 total
Tests:       41 failed, 26 skipped, 2489 passed, 2556 total
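
For reference, a minimal sketch of how the two entry points might be wired up in package.json. The config paths are assumptions based on the scripts/jest/dev and scripts/jest/prod folders discussed below, not the actual scripts:

    {
      "scripts": {
        "test": "jest --config ./scripts/jest/dev/config.js",
        "test-prod": "jest --config ./scripts/jest/prod/config.js"
      }
    }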

gaearon added some commits Nov 21, 2017

* Fix the equivalence test
It fails because the config is now passed to Jest explicitly.
But the test doesn't know about the config.

To fix this, we just run it via `yarn test` (which includes the config).
We already depend on Yarn for development anyway.

* Actually flip the production tests to run in prod environment
This produces a bunch of errors:

Test Suites: 64 failed, 58 passed, 122 total
Tests:       740 failed, 26 skipped, 1809 passed, 2575 total
Snapshots:   16 failed, 4 passed, 20 total

* Ignore expectDev() calls in production
Down from 740 to 175 failed.

Test Suites: 44 failed, 78 passed, 122 total
Tests:       175 failed, 26 skipped, 2374 passed, 2575 total
Snapshots:   16 failed, 4 passed, 20 total
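
A minimal sketch of the idea behind this commit, assuming the stub lives in the prod-only Jest setup file (the actual implementation may differ):

    // Hypothetical stub: every property access or call on the returned object
    // is a no-op, so chains like expectDev(x).toBe(y) silently pass in
    // production instead of failing on DEV-only behavior.
    const noopExpect = new Proxy(function() {}, {
      get: () => noopExpect,
      apply: () => noopExpect,
    });
    global.expectDev = () => noopExpect;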

* Decode errors so tests can assert on their messages
Down from 175 to 129.

Test Suites: 33 failed, 89 passed, 122 total
Tests:       129 failed, 1029 skipped, 1417 passed, 2575 total
Snapshots:   16 failed, 4 passed, 20 total
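
As an illustration of the decoding step, assuming React's error-code map and minified message format (the path and regex here are guesses, not the actual code):

    // Hypothetical decoder: turn a minified production error back into its
    // full message so tests can assert on the original text.
    const errorMap = require('../error-codes/codes.json');

    function decodeErrorMessage(message) {
      const match = message.match(
        /error-decoder\.html\?invariant=(\d+)((?:&args\[\]=[^&]*)*)/,
      );
      if (!match) {
        return message; // Not a minified error; leave it untouched.
      }
      const args = match[2]
        .split('&args[]=')
        .slice(1)
        .map(decodeURIComponent);
      let i = 0;
      return errorMap[match[1]].replace(/%s/g, () => args[i++]);
    }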

* Remove ReactDOMProduction-test
There is no need for it now. The only test that was special is moved into ReactDOM-test.

* Remove production switches from ReactErrorUtils
The tests now run in production in a separate pass.

* Add and use spyOnDev() for warnings
This ensures that by default we expect no warnings in production bundles.
If the warning *is* expected, use the regular spyOn() method.

This currently breaks all expectDev() assertions without __DEV__ blocks so we go back to:

Test Suites: 56 failed, 65 passed, 121 total
Tests:       379 failed, 1029 skipped, 1148 passed, 2556 total
Snapshots:   16 failed, 4 passed, 20 total
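
A minimal sketch of what such a helper can look like in the shared test setup (the real implementation may differ):

    // Only install the spy in __DEV__. In production bundles the spy is never
    // created, so DEV-only warning assertions are skipped and an unexpected
    // production warning is not silently swallowed by a spy.
    global.spyOnDev = function(object, methodName) {
      if (__DEV__) {
        return spyOn(object, methodName);
      }
    };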

* Replace expectDev() with expect() in __DEV__ blocks
We started using spyOnDev() for console warnings to ensure we don't *expect* them to occur in production. As a consequence, expectDev() assertions on console.error.calls fail because console.error.calls doesn't exist. This is actually good because it would help catch accidental warnings in production.

To solve this, we are getting rid of expectDev() altogether, and instead introduce explicit expectation branches. We'd need them anyway for testing intentional behavior differences.

This commit replaces all expectDev() calls with expect() calls in __DEV__ blocks. It also removes a few unnecessary expect() checks that no warnings were produced (by also removing the corresponding spyOnDev() calls).

Some DEV-only assertions used plain expect(). Those were also moved into __DEV__ blocks.

ReactFiberErrorLogger was special because it console.error()'s in production too. So in that case I intentionally used spyOn() instead of spyOnDev(), and added extra assertions.

This gets us down to:

Test Suites: 21 failed, 100 passed, 121 total
Tests:       72 failed, 26 skipped, 2458 passed, 2556 total
Snapshots:   16 failed, 4 passed, 20 total
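
The resulting pattern looks roughly like this (hypothetical test, shown only to illustrate the branching; ComponentThatWarns and container are made up):

    spyOnDev(console, 'error');
    ReactDOM.render(<ComponentThatWarns />, container);
    if (__DEV__) {
      // Assert on the spy only in DEV; in production the spy was never
      // installed and no warning is expected at all.
      expect(console.error.calls.count()).toBe(1);
      expect(console.error.calls.argsFor(0)[0]).toContain('Warning:');
    }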

* Enable User Timing API for production testing
We could've disabled it, but seems like a good idea to test since we use it at FB.

* Test for explicit Object.freeze() differences between PROD and DEV
This is one of the few places where DEV and PROD behavior differs for performance reasons.
Now we explicitly test both branches.
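
For example, a branched test for this might look like the following sketch (illustrative only; the real test may assert on different objects):

    const element = <div prop="value" />;
    if (__DEV__) {
      // DEV freezes elements and props to catch accidental mutation early.
      expect(Object.isFrozen(element)).toBe(true);
      expect(Object.isFrozen(element.props)).toBe(true);
    } else {
      // PROD skips Object.freeze() for performance.
      expect(Object.isFrozen(element)).toBe(false);
      expect(Object.isFrozen(element.props)).toBe(false);
    }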

bvaughn (Contributor) commented Nov 21, 2017

I see a couple of failures from yarn test-prod that I wouldn't expect to be related to this change, e.g. "should throw with non-object in the initial state property" (from the various class-equivalency tests):

    Expected value to be (using ===):
      "span"
    Received:
      "SPAN"

That's a bit odd, though, because you didn't even change this line in the diff.

Edit: FWIW, it looks like the checkClassInstance method that throws this error only runs in dev, so these tests should probably be DEV-only. Maybe we should change them too? E.g.:

diff --git a/packages/react/src/__tests__/ReactES6Class-test.js b/packages/react/src/__tests__/ReactES6Class-test.js
index 0df8eaa9e..4f4a93e4e 100644
--- a/packages/react/src/__tests__/ReactES6Class-test.js
+++ b/packages/react/src/__tests__/ReactES6Class-test.js
@@ -163,7 +163,7 @@ describe('ReactES6Class', () => {
     expect(renderCount).toBe(1);
   });
 
-  it('should throw with non-object in the initial state property', () => {
+  it('should throw (in dev) with non-object in the initial state property', () => {
     [['an array'], 'a string', 1234].forEach(function(state) {
       class Foo extends React.Component {
         constructor() {
@@ -174,9 +174,13 @@ describe('ReactES6Class', () => {
           return <span />;
         }
       }
-      expect(() => test(<Foo />, 'span', '')).toThrowError(
-        'Foo.state: must be set to an object or null',
-      );
+      if (__DEV__) {
+        expect(() => test(<Foo />, 'SPAN', '')).toThrowError(
+          'Foo.state: must be set to an object or null',
+        );
+      } else {
+        expect(() => test(<Foo />, 'SPAN', '')).not.toThrow();
+      }
     });
   }); 

bvaughn (Contributor) commented Nov 21, 2017

There are almost no differences between scripts/jest/dev and scripts/jest/prod except for process.env.NODE_ENV in preprocessor.js and some lines in test-framework-setup.js.

I wonder if you considered using a shared folder for config.js, environment.js, preprocessor.js, setup.js to make maintenance easier later?

I know our team generally seems to be in favor of forking things, but that still sits weirdly with me.
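
For illustration, the shared-folder version could look something like this (file and key names are hypothetical):

    // scripts/jest/prod/config.js: reuse a shared base config and override
    // only what actually differs between the dev and prod runs.
    const baseConfig = require('../config.shared');

    module.exports = Object.assign({}, baseConfig, {
      // The prod run points at its own setup file, which flips NODE_ENV to
      // 'production' before the preprocessor runs.
      setupFiles: [require.resolve('./setup.js')],
    });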


bvaughn (Contributor) commented Nov 21, 2017

I think CI might be failing because you added require('shared/ReactFeatureFlags').enableUserTimingAPI to ReactIncrementalPerf-test.js, and packages/shared/ReactFeatureFlags.js uses ES module imports?


bvaughn (Contributor) reviewed Nov 21, 2017

In general, this looks great.

I'm on the fence about some of the duplication (left comments), but I wouldn't block the PR for it. I also left a note about why, at a glance, I think CI is failing.

Happy to help clean things up further, particularly the component stack assertions, after this PR is merged if you'd like a hand.

Going to give it the 👍 now even though CI is failing because I trust you.

gaearon (Member) commented Nov 21, 2017

I wonder if you considered using a shared folder for config.js, environment.js, preprocessor.js, setup.js to make maintenance easier later?

That was the plan per our earlier conversation in the chat—to unify shared parts after we know everything works (it doesn't yet). Thanks for reviewing!


gaearon (Member) commented Nov 21, 2017

To clarify, the outstanding TODOs as I see them in the scope of this PR:

  • Fix CI
  • Fix remaining tests in PROD mode
  • Remove duplication between configurations

gaearon (Member) commented Nov 21, 2017

I think CI might be failing because you added require('shared/ReactFeatureFlags').enableUserTimingAPI to ReactIncrementalPerf-test.js and packages/shared/ReactFeatureFlags.js uses ES module imports?

Can you expand on this? We override feature flags this way in a few other tests and it seems to work. npm test also passes locally.
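
The override pattern under discussion looks like this (illustrative; the flag name comes from the comment above):

    // Flip the feature flag before the modules under test are required, so
    // they read the overridden value.
    jest.resetModules();
    const ReactFeatureFlags = require('shared/ReactFeatureFlags');
    ReactFeatureFlags.enableUserTimingAPI = true;
    const React = require('react');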


bvaughn (Contributor) commented Nov 21, 2017

To clarify, the outstanding TODOs as I see them in the scope of this PR...

Those TODOs look good.

Can you expand on this? We override feature flags this way in a few other tests and it seems to work.

Shoot. You're right. I was mistaken about that. ☹️ Sorry!


gaearon added some commits Nov 22, 2017

* Fix error handling tests
This logic is really complicated because of the global ReactFiberErrorLogger mock.
I understand it now, so I added TODOs for later.

It can be much simpler if we change the rest of the tests that assert uncaught errors to also assert they are logged as warnings, which mirrors what happens in practice anyway.

gaearon merged commit 6041f48 into facebook:master on Nov 22, 2017

1 check passed

ci/circleci: Your tests passed on CircleCI!

gaearon deleted the gaearon:prod-tests-2 branch on Nov 22, 2017

gaearon (Member) commented Nov 22, 2017

Filed #11626 and #11618 as follow-ups.


raphamorim added a commit to raphamorim/react that referenced this pull request Nov 27, 2017

Run Jest in production mode (facebook#11616)
* Move Jest setup files to /dev/ subdirectory

* Clone Jest /dev/ files into /prod/

* Move shared code into scripts/jest

* Move Jest config into the scripts folder

* Fix the equivalence test

It fails because the config is now passed to Jest explicitly.
But the test doesn't know about the config.

To fix this, we just run it via `yarn test` (which includes the config).
We already depend on Yarn for development anyway.

* Add yarn test-prod to run Jest with production environment

* Actually flip the production tests to run in prod environment

This produces a bunch of errors:

Test Suites: 64 failed, 58 passed, 122 total
Tests:       740 failed, 26 skipped, 1809 passed, 2575 total
Snapshots:   16 failed, 4 passed, 20 total

* Ignore expectDev() calls in production

Down from 740 to 175 failed.

Test Suites: 44 failed, 78 passed, 122 total
Tests:       175 failed, 26 skipped, 2374 passed, 2575 total
Snapshots:   16 failed, 4 passed, 20 total

* Decode errors so tests can assert on their messages

Down from 175 to 129.

Test Suites: 33 failed, 89 passed, 122 total
Tests:       129 failed, 1029 skipped, 1417 passed, 2575 total
Snapshots:   16 failed, 4 passed, 20 total

* Remove ReactDOMProduction-test

There is no need for it now. The only test that was special is moved into ReactDOM-test.

* Remove production switches from ReactErrorUtils

The tests now run in production in a separate pass.

* Add and use spyOnDev() for warnings

This ensures that by default we expect no warnings in production bundles.
If the warning *is* expected, use the regular spyOn() method.

This currently breaks all expectDev() assertions without __DEV__ blocks so we go back to:

Test Suites: 56 failed, 65 passed, 121 total
Tests:       379 failed, 1029 skipped, 1148 passed, 2556 total
Snapshots:   16 failed, 4 passed, 20 total

* Replace expectDev() with expect() in __DEV__ blocks

We started using spyOnDev() for console warnings to ensure we don't *expect* them to occur in production. As a consequence, expectDev() assertions on console.error.calls fail because console.error.calls doesn't exist. This is actually good because it would help catch accidental warnings in production.

To solve this, we are getting rid of expectDev() altogether, and instead introduce explicit expectation branches. We'd need them anyway for testing intentional behavior differences.

This commit replaces all expectDev() calls with expect() calls in __DEV__ blocks. It also removes a few unnecessary expect() checks that no warnings were produced (by also removing the corresponding spyOnDev() calls).

Some DEV-only assertions used plain expect(). Those were also moved into __DEV__ blocks.

ReactFiberErrorLogger was special because it console.error()'s in production too. So in that case I intentionally used spyOn() instead of spyOnDev(), and added extra assertions.

This gets us down to:

Test Suites: 21 failed, 100 passed, 121 total
Tests:       72 failed, 26 skipped, 2458 passed, 2556 total
Snapshots:   16 failed, 4 passed, 20 total

* Enable User Timing API for production testing

We could've disabled it, but seems like a good idea to test since we use it at FB.

* Test for explicit Object.freeze() differences between PROD and DEV

This is one of the few places where DEV and PROD behavior differs for performance reasons.
Now we explicitly test both branches.

* Run Jest via "yarn test" on CI

* Remove unused variable

* Assert different error messages

* Fix error handling tests

This logic is really complicated because of the global ReactFiberErrorLogger mock.
I understand it now, so I added TODOs for later.

It can be much simpler if we change the rest of the tests that assert uncaught errors to also assert they are logged as warnings, which mirrors what happens in practice anyway.

* Fix more assertions

* Change tests to document the DEV/PROD difference for state invariant

It is very likely unintentional but I don't want to change behavior in this PR.
Filed a follow up as facebook#11618.

* Remove unnecessary split between DEV/PROD ref tests

* Fix more test message assertions

* Make validateDOMNesting tests DEV-only

* Fix error message assertions

* Document existing DEV/PROD message difference (possible bug)

* Change mocking assertions to be DEV-only

* Fix the error code test

* Fix more error message assertions

* Fix the last failing test due to known issue

* Run production tests on CI

* Unify configuration

* Fix coverage script

* Remove expectDev from eslintrc

* Run everything in band

We used to before, too. I just forgot to add the arguments after deleting the script.

Ethan-Arrowood added a commit to Ethan-Arrowood/react that referenced this pull request Dec 8, 2017

Run Jest in production mode (facebook#11616)