
Mephisto review testing suite #422

Merged
merged 4 commits into master from review-template-testing on Apr 13, 2021
Conversation

@sghmk12 (Contributor) commented Mar 26, 2021

Overview

The Mephisto review testing suite incorporates end-to-end testing for the mephisto review service. The suite tests for major UI elements on both the AllItemView and ItemView routes, as well as the server's ability to record reviews correctly.

Usage

  1. Find the path to the Python binary you use to run mephisto review and set the PYTHON_PATH property on line 34 of tests/test_constants.js to that path
  2. Create a new directory to store the app build for tests: mkdir test/integration/review/test_app
  3. Create a review app named test-review by running the following command from /Mephisto:
    npx create-react-app test-review --template file:packages/cra-template-mephisto-review
  4. Change directory to the new app: cd test-review
  5. Build the app: npm run build
  6. Copy the build folder into /Mephisto/test/integration/review/test_app/build:
    cp -r build ../test/integration/review/test_app/build
  7. Move to the test directory: cd ../test/integration/review
  8. Run the tests: npm test
  9. Make note of the time constants on lines 20 and 22 of tests/test_constants.js. These are the times to wait for the server to start and stop. On some devices the tests may fail with the default values because the server does not start within the time the tests expect; increase the values as necessary so the server has enough time to start and stop.
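The setup steps above can be collected into one shell helper. This is a hypothetical sketch, not a script shipped with the repository; the function name is illustrative, and step 1 (editing PYTHON_PATH) remains manual:

```shell
#!/usr/bin/env bash
# Hypothetical helper for the setup steps above. Run from the Mephisto
# repository root. Step 1 (setting PYTHON_PATH in tests/test_constants.js)
# must still be done by hand before running the tests.
set -euo pipefail

build_review_test_app() {
  mkdir -p test/integration/review/test_app                # step 2
  npx create-react-app test-review \
    --template file:packages/cra-template-mephisto-review  # step 3
  cd test-review                                           # step 4
  npm run build                                            # step 5
  cp -r build ../test/integration/review/test_app/build    # step 6
  cd ../test/integration/review                            # step 7
  npm test                                                 # step 8
}

# Uncomment to run:
# build_review_test_app
```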

Key files

  • test_types.js: a file of strings representing the different types of tests performed by each test suite. Used to maintain a link between the data in the test_constants file and the data used in tests of similar types.
  • test_constants.js: an object of constant values shared between tests of the same test type. New constants or test types can be added by naming an attribute after a test type and giving it another object containing that type's constants as its value.
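The shape described above can be sketched as follows. The test type names match those in the test run output later in this thread, but the attribute contents (such as the QUERY value) are illustrative, not the actual contents of test_constants.js:

```javascript
// Illustrative sketch of how test_constants.js keys constants by test type.
// Test type names come from test_types.js; the constant values here are
// placeholders, not the real file contents.
const TEST_TYPES = {
  CSV_MULTI_PAGE: "Multiple pages of CSV data",
  JSON_SINGLE_PAGE: "A single page of JSON data",
};

const TEST_CONSTANTS = {
  // one attribute per test type; its value is an object of that type's constants
  [TEST_TYPES.CSV_MULTI_PAGE]: {
    SEARCH_TEST_OPTIONS: { QUERY: "example query" }, // hypothetical value
  },
  [TEST_TYPES.JSON_SINGLE_PAGE]: {
    SEARCH_TEST_OPTIONS: { QUERY: "example query" }, // hypothetical value
  },
};

// A test file can then look up its constants by test type:
const options = TEST_CONSTANTS[TEST_TYPES.CSV_MULTI_PAGE].SEARCH_TEST_OPTIONS;
console.log(options.QUERY);
```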

Behaviour

  • npm test invokes the following command: jest --runInBand --verbose
    • the --runInBand option is used because each test requires exclusive interaction with the server; the flag runs the tests serially in the current process rather than in parallel workers, so the tests do not interfere with each other.
  • jest runs all available .test.js files
  • utilities:
    • server.js:
      • file description: JavaScript class for spawning child processes running mephisto review servers for the purpose of these tests.
      • available methods:
        • start(server_start_args, data_collection_args, delay_in_ms, expected_server_output, stdoutCallback):
          starts a child process with the given arguments, waits delay_in_ms milliseconds for the server to set up, and returns true if the server started successfully, false otherwise.
          • server_start_args (required): an array of arguments for a Node.js spawn call that starts a mephisto review server.
          • data_collection_args (optional): an array of arguments for a Node.js spawn call whose standard output can be piped into the review server's standard input to provide the server with data.
          • delay_in_ms (required): an integer; how long in milliseconds to wait after creating the review server before testing for a successful start.
          • expected_server_output (required): a string; the expected standard output from the review server upon a successful start.
          • stdoutCallback (optional): a callback attached to the review server's standard output stream; must be a function and receives a data buffer as its parameter.
        • stop(delay_in_ms):
          kills the child process review server; returns true if it was successfully killed, false otherwise. delay_in_ms is how long the function waits before checking whether the server was killed; at least 500 ms is recommended.
    • test_tools.js:
      • file description: collection of methods to assist test operations.
      • available methods:
        • delay(time): delays program by the specified time in milliseconds.
    • file_management.js:
      • file description: collection of methods to manage file system during tests.
      • available methods:
        • fileAsArray(path, type): parses input files for mephisto review in a manner that mimics how data is retrieved in the mephisto review Python server. The type parameter can be "CSV" or null to indicate special file parsing behaviour. File lines are loaded into the array with the 0-indexed line number corresponding to the array index. METHOD SHOULD ONLY BE USED FOR SMALL TEXT FILES.
  • tests:
    • all_item_view_ui.test.js:
      uses server.js to start a review server and puppeteer to navigate to the root route of the review server. Checks if all major UI elements are present on AllItemView. Checks include search bar functionality and the presence of all given data points with a unique ID identifier. Checks are performed for single page and multiple pages of both CSV and JSON data. Then proceeds to stop the child process.
    • item_view_ui.test.js:
      uses server.js to start a review server, then uses puppeteer to navigate to the root route of the review server. For each data card present, navigates to that card's ItemView route and checks that all major UI elements are present. Checks include the presence of the correct data point and its unique ID, as well as the availability of the two default review buttons. Checks are performed for single and multiple pages of both CSV and JSON data. Then proceeds to stop the child process.
    • review__results.test.js:
      uses server.js to start a review server and puppeteer to navigate to the root route of the review app. Navigates to the ItemView for each data point and approves or rejects it depending on the test, then stops the review server. Review data is logged in the review server's standard output; the recorded review output is matched against the review performed on each data point. Checks are performed for single and multiple pages of both CSV and JSON data.

TODO

  • Add argument safety checks in functions.
  • Automate test_app build to reduce usage related errors.
  • Current tests are all for mephisto review in version 2 or "ALL" mode usage. Add tests for legacy or "OBO/one-by-one" mode usage.
  • Add tests for usage with mephistoDB. Might be tricky to make sure all users have the same mephistoDB tasks for the test.

Issues

  • If the app does not have an HTML element that a test expects, puppeteer times out the test and jest runs the test cleanup (shutting down the review server). However, this cleanup runs past the time jest allots for the test, so jest displays an error about the process potentially not shutting down correctly and remains running.
    • Solution: unknown, more research is necessary for how to handle this edge case safely.

…tes as well as review functionality.

Review template and server have been adjusted to accommodate the testing suite.
@sghmk12 sghmk12 requested a review from pringshia March 26, 2021 21:55
@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Mar 26, 2021
@codecov-io commented Mar 28, 2021

Codecov Report

Merging #422 (2be2d88) into master (533d33d) will not change coverage.
The diff coverage is n/a.


@@           Coverage Diff           @@
##           master     #422   +/-   ##
=======================================
  Coverage   65.59%   65.59%           
=======================================
  Files          76       76           
  Lines        6970     6970           
=======================================
  Hits         4572     4572           
  Misses       2398     2398           


@pringshia (Contributor) commented:

If app does not have HTML element present as expected in test, puppeteer will time out the test and jest will run test clean up (shut down review server). However this process runs past the time jest expects the test to finish in and jest will display an error regarding potentially not shutting down correctly and jest will remain running.

What does the error message look like for this? Can you paste in the description?

@sghmk12 (Contributor, Author) commented Mar 31, 2021

If app does not have HTML element present as expected in test, puppeteer will time out the test and jest will run test clean up (shut down review server). However this process runs past the time jest expects the test to finish in and jest will display an error regarding potentially not shutting down correctly and jest will remain running.

What does the error message look like for this? Can you paste in the description?

This is a portion of the output near the summary at the end. The key line of interest is the final one: once it is printed, jest does not exit by itself and the user must press Ctrl-C.


    TimeoutError: waiting for selector `#mephisto-search-button` failed: timeout 30000ms exceeded

      147 |       expect(searchBar).toBeDefined();
      148 |       await page.type("#mephisto-search", searchTestOptions.QUERY);
    > 149 |       var searchButton = await page.waitForSelector("#mephisto-search-button");
          |                                     ^
      150 |       expect(searchButton).toBeDefined();
      151 |       await page.$eval("#mephisto-search-button", (button) => button.click());
      152 |       //wait for server to generate results

      at new WaitTask (node_modules/puppeteer/src/common/DOMWorld.ts:776:28)
      at DOMWorld.waitForSelectorInPage (node_modules/puppeteer/src/common/DOMWorld.ts:628:22)
      at Object.internalHandler.waitFor (node_modules/puppeteer/src/common/QueryHandler.ts:78:19)
      at DOMWorld.waitForSelector (node_modules/puppeteer/src/common/DOMWorld.ts:486:25)
      at Frame.waitForSelector (node_modules/puppeteer/src/common/FrameManager.ts:1159:47)
      at Page.waitForSelector (node_modules/puppeteer/src/common/Page.ts:1971:29)
      at Object.<anonymous> (tests/all_item_view_ui.test.js:149:37)
          at runMicrotasks (<anonymous>)

   Testing if AllItemView has all major UI elements › when searching data with: A single page of JSON data

    : Timeout - Async callback was not invoked within the 15000 ms timeout specified by jest.setTimeout.Timeout - Async callback was not invoked within the 15000 ms timeout specified by jest.setTimeout.Error:

      126 |
      127 |   //tests the ability to use the search bar to filter data based on search tests specified for each test type
    > 128 |   test(`when searching data with: ${CONSTANT_KEY}`, async () => {
          |   ^
      129 |     //get search test for specific test type
      130 |     const searchTestOptions = TEST_CONSTANTS[CONSTANT_KEY].SEARCH_TEST_OPTIONS;
      131 |

      at new Spec (node_modules/jest-jasmine2/build/jasmine/Spec.js:116:22)
      at tests/all_item_view_ui.test.js:128:3
          at Array.forEach (<anonymous>)
      at Object.<anonymous> (tests/all_item_view_ui.test.js:17:1)

 PASS  tests/item_view_ui.test.js (29.769 s)
  Testing if ItemView has all major UI elements
    ✓ with: Multiple pages of CSV data (6390 ms)
    ✓ with: A single page of CSV data (4386 ms)
    ✓ with: Multiple pages of JSON data (6394 ms)
    ✓ with: A single page of JSON data (4394 ms)

Test Suites: 1 failed, 2 passed, 3 total
Tests:       4 failed, 16 passed, 20 total
Snapshots:   0 total
Time:        170.884 s
Ran all test suites.
Jest did not exit one second after the test run has completed.

This usually means that there are asynchronous operations that weren't stopped in your tests. Consider running Jest with `--detectOpenHandles` to troubleshoot this issue.

@pringshia (Contributor) commented:

I wonder if any of the solutions here could be useful: jestjs/jest#7287

Either calling a done() callback from the async, or using the why-is-node-running package to debug.

@sghmk12 (Contributor, Author) commented Apr 1, 2021

I wonder if any of the solutions here could be useful: facebook/jest#7287

Either calling a done() callback from the async, or using the why-is-node-running package to debug.

I can try it out, but I think once the timeout is surpassed the program never makes it to that part of the function. I was considering reducing the timeout for puppeteer and increasing the timeout for jest, so that even if every iteration in a test timed out, the test would remain within the jest timeout; we could then use expect statements to check whether puppeteer timed out or threw an error.
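A self-contained sketch of that idea, with plain promises standing in for puppeteer's waitForSelector (the helper name is illustrative, not an existing API):

```javascript
// Sketch: give the inner wait a shorter timeout than the outer (jest) one,
// catch it, and assert on the result, so the test fails cleanly instead of
// hitting jest's own timeout. A never-resolving promise stands in for a
// waitForSelector call on a missing element like #mephisto-search-button.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error("TimeoutError")), ms);
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

(async () => {
  const neverResolves = new Promise(() => {}); // element never appears
  let failure = null;
  try {
    await withTimeout(neverResolves, 100); // inner timeout: 100 ms
  } catch (err) {
    failure = err.message;
  }
  // in a real test: expect(failure).toBe("TimeoutError")
  console.log(failure);
})();
```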

@pringshia (Contributor) commented:

FYI - Directions above should probably read:

cp -r build ../test/integration/review/test_app/

@pringshia (Contributor) commented:

This looks great! Thanks for adding this testing suite into Mephisto. Excited to get it working as part of the GitHub build process in the future. This should also be a good foundational reference for those building upon these tests.

You may have to resolve a few conflicts before merging this in, though I don't think they should be too bad.

@sghmk12 sghmk12 merged commit 5a2eb97 into master Apr 13, 2021
@sghmk12 sghmk12 deleted the review-template-testing branch April 13, 2021 16:17
@pringshia pringshia changed the title [WIP] Mephisto review testing suite Mephisto review testing suite Apr 29, 2021