
feat: add tests for happy paths in critical workflows #1110

Merged · 63 commits into develop from frontend-tests-dashboard · Apr 26, 2021

Conversation

@zwliew (Contributor) commented Apr 1, 2021

Problem

The following critical workflows are not tested. If one of them breaks, it would be difficult to notice without the tedium of manually testing each one. Hence, let's add automated tests for them.

Workflows:

  1. Creating and sending a new email campaign
  2. Creating and sending a new SMS campaign
  3. Creating and sending a new Telegram campaign
  4. Creating and sending a new protected email campaign
  5. Viewing a protected email message
  6. Unsubscribing from a mailing list

Partially resolves issue #1079

Solution

Features:

  • Added integration tests for happy paths in critical workflows
  • Added unit tests

Before & After Screenshots

There are no visual changes.

Tests

These tests have been verified on Travis and Amplify.

  1. Deploy the change on Travis and Amplify.
  2. All of the following tests should run and pass:
  • Integration test: creating and sending a new email campaign
  • Integration test: creating and sending a new SMS campaign
  • Integration test: creating and sending a new Telegram campaign
  • Integration test: creating and sending a new protected email campaign
  • Unit test: viewing a protected email
  • Unit test: successfully unsubscribing from a campaign
  • Unit test: user choosing not to unsubscribe

Deploy Notes

The aforementioned tests have been added and should be run during every deployment on Travis and Amplify. If any test fails, the deployment should not go through.

Specifically, Amplify should not publish the new build if any test fails.

@zwliew changed the title from "feat(dashboard): test happy paths for entire campaign creation workflows" to "feat: add integration tests for successful campaign creation workflows" on Apr 6, 2021
@zwliew force-pushed the frontend-tests-dashboard branch 2 times, most recently from dd6503b to 2b438a0, on April 8, 2021 07:40
@zwliew changed the title to "feat: add tests for campaign creation workflows" on Apr 8, 2021
@zwliew changed the title to "feat: add frontend tests for campaign creation workflows" on Apr 8, 2021
@zwliew force-pushed the frontend-tests-dashboard branch 2 times, most recently from cff80eb to 44269ba, on April 9, 2021 03:42
@zwliew marked this pull request as ready for review on April 9, 2021 03:42
@zwliew changed the title back to "feat: add integration tests for successful campaign creation workflows" on Apr 9, 2021
@zwliew changed the base branch from develop to frontend-tests on April 9, 2021 03:54
@zwliew assigned and then unassigned zwliew on Apr 9, 2021
@zwliew added the "frontend" label on Apr 9, 2021
@zwliew force-pushed the frontend-tests-dashboard branch 2 times, most recently from cd9502b to ad81c87, on April 12, 2021 08:45
@zwliew changed the title to "feat: add tests for successful campaign creation workflows" on Apr 12, 2021
The following dependencies were installed:
- msw to mock backend API responses
- @testing-library/react for idiomatic React testing APIs
- @testing-library/jest-dom for idiomatic DOM assertion APIs
- @testing-library/user-event for idiomatic DOM user-event APIs
- jest-canvas-mock to mock Canvas for Lottie, as jsdom doesn't implement it
- @peculiar/webcrypto, as jsdom doesn't implement SubtleCrypto

Also install the necessary TypeScript definitions.
The purpose is to pass in the testMode option during tests.

Note: I've also shifted the declaration of initialChecks() into the useEffect()
call where it is used. This is mainly to silence the React Hooks ESLint
warnings on missing dependencies in the useEffect() dependency list.
This allows us to specify the initial modal content when setting up test
environments.
We don't care about these environment variables in tests.

Also, setting the axios baseURL for tests will result in flakiness in
tests that involve API responses.
This is mainly an issue in test environments. React treats 'muted' as a
property and sets it after the test environment has been torn down,
resulting in console warnings [1].

Fix this by mocking the setter property to do nothing. This doesn't pose
any issue in test environments.

[1]: testing-library/react-testing-library#470
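The fix above amounts to replacing the property's setter with a no-op via Object.defineProperty. A minimal sketch of the technique, using a plain stand-in class (MediaElementLike is hypothetical) since HTMLMediaElement itself only exists inside jsdom:

```typescript
// Sketch of no-op'ing a property setter, analogous to silencing
// HTMLMediaElement.prototype.muted in the jsdom test environment.
// MediaElementLike is a hypothetical stand-in for HTMLMediaElement.
const setterCalls: string[] = []

class MediaElementLike {
  set muted(_value: boolean) {
    // Stand-in for the original setter, which in jsdom runs after
    // teardown and triggers a console warning.
    setterCalls.push('muted')
  }
}

// Replace the setter with a no-op so late writes are silently ignored.
Object.defineProperty(MediaElementLike.prototype, 'muted', {
  set: () => {},
  configurable: true,
})

const el = new MediaElementLike()
el.muted = true // no longer reaches the original setter
console.log(setterCalls.length) // 0
```

In the real test setup the same defineProperty call would target HTMLMediaElement.prototype instead of a stand-in class.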
Make sure to set CI=true explicitly as Amplify doesn't set it by
default. CI=true will run all the tests exactly once and exit.

Also compile translations before running the tests.
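A build step along these lines captures both points; the script names and exact commands are illustrative, and the real Amplify build spec may differ:

```shell
# Illustrative CI test step; the actual Amplify build spec may differ.
# CI=true makes react-scripts run the tests once (non-interactively) and exit.
export CI=true

# Compile translations first so the app under test can load them.
# `compile` is an assumed package script name here.
npm run compile

# Run the test suite; a non-zero exit code fails the build, so a
# failing test blocks the deployment from going through.
npm test
```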
We want to render App.tsx in a test environment with separate providers.
Hence, move the providers to index.tsx.

Also, move the `import 'locales'` statement to the top of index.tsx as
well.
@miazima (Contributor) commented Apr 15, 2021

Since the Dashboard component is actually a collection of routes, I was thinking the campaign creation tests could fall under the respective channel Create components, e.g. EmailCreate or any other component specific to the channel.

Base automatically changed from frontend-tests to develop April 15, 2021 07:14
@zwliew (Contributor, Author) commented Apr 15, 2021

> Since the Dashboard component is actually a collection of routes, I was thinking the campaign creation tests could fall under the respective channel Create components, e.g. EmailCreate or any other component specific to the channel.

Yup, we could do that. However, one issue is that there is logic in the common Create component (rendered from Dashboard) which loads the campaign from the API and other stuff.

Specifically for these tests, I thought that it would be better if they were complete integration tests, which included all the components being used in the campaign creation workflow. Hence, I thought it would be better if we also included the Dashboard and Create components in the tests. What do you think?

Going along the same line of thought, I also considered placing these tests in the App component to really test the entire component pipeline. However, since App does very little (it only acts as a router switch), I figured it would be fine to leave it out of the integration test.

@zwliew zwliew changed the title feat: add tests for successful campaign creation workflows feat: add tests for happy paths in critical workflows Apr 19, 2021
@miazima (Contributor) commented Apr 19, 2021

Yup I think these tests are integration tests rather than individual component unit tests. Perhaps you can create a new folder to house integration tests similar to this?

@zwliew (Contributor, Author) commented Apr 19, 2021

> Yup I think these tests are integration tests rather than individual component unit tests. Perhaps you can create a new folder to house integration tests similar to this?

Sounds good. I guess the main goal is to make the intent of the tests clearer.

I'm not very sure what would be a good way to do this though. My initial thought would be to split the tests up into smaller files like this:

frontend/
  - src/
    - components/
      - dashboard/
        - Dashboard.tsx
        - tests/
          - integration/
            - utils.ts
            - email.test.tsx
            - sms.test.tsx
            - telegram.test.tsx

I feel like it is quite an inadequate solution though. How would you suggest we do this?

1. Rename `templates` to `{email|sms|telegram}_templates` to match the
   actual API response
2. Convert `params` to an array to match the actual API
3. Add `has_credential` to match the actual API
4. Add a LOGGED job to the `job_queue` once a campaign is sent to better
   mimic the API
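The reshaped mock from the four points above might look roughly like this; the field values, template sub-fields, and job shape are made up for illustration:

```typescript
// Illustrative shape of the mocked campaign API response after the
// changes above. Field values and template sub-fields are invented.
interface MockEmailCampaign {
  email_templates: { subject: string; body: string }[] // was `templates`
  params: string[]                                     // now an array
  has_credential: boolean                              // newly added
  job_queue: { status: string }[]                      // gains a LOGGED job on send
}

const campaign: MockEmailCampaign = {
  email_templates: [{ subject: 'Hello {{name}}', body: 'Hi {{name}}!' }],
  params: ['name'],
  has_credential: true,
  job_queue: [],
}

// Mimic the API: once the campaign is sent, a LOGGED job is queued.
function markSent(c: MockEmailCampaign): void {
  c.job_queue.push({ status: 'LOGGED' })
}

markSent(campaign)
console.log(campaign.job_queue[0].status) // LOGGED
```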
@miazima (Contributor) commented Apr 19, 2021

I think it's fine to place them as you suggested for now; I can't think of a strong reason to place them elsewhere. We can always move our tests around as needed in the future!

@zwliew (Contributor, Author) commented Apr 26, 2021

I've split the Dashboard integration tests into separate files.

@miazima miazima self-requested a review April 26, 2021 04:38
@miazima miazima merged commit 52f4a73 into develop Apr 26, 2021
@miazima miazima deleted the frontend-tests-dashboard branch April 26, 2021 04:39
@zwliew (Contributor, Author) commented Apr 26, 2021

Thanks for the review!

lamkeewei added a commit that referenced this pull request May 5, 2021
* develop:
  refactor: use shared function to initialize models (#1172)
  chore: setup scaffolding for backend tests (#940)
  1.23.0
  fix(frontend): fix frontend test flakiness (#1162)
  fix: update successful delivery status only if error does not exist (#1150)
  chore: upgrade dependencies (#1153)
  feat: add unit tests for error states in critical workflows (#1118)
  feat: support whitelisting domains through `agencies` table (#1141)
  feat: add tests for happy paths in critical workflows (#1110)
  fix: prevent campaign names from causing dashboard rows to overflow (#1147)
  fix(email): Fix SendGrid fallback integration (#1026)
lamkeewei added a commit that referenced this pull request May 5, 2021
* develop:
  feat: refactor msg template components; add telegram character limit (#1148)
  refactor: use shared function to initialize models (#1172)
  chore: setup scaffolding for backend tests (#940)
  1.23.0
  fix(frontend): fix frontend test flakiness (#1162)
  fix: update successful delivery status only if error does not exist (#1150)
  chore: upgrade dependencies (#1153)
  feat: add unit tests for error states in critical workflows (#1118)
  feat: support whitelisting domains through `agencies` table (#1141)
  feat: add tests for happy paths in critical workflows (#1110)
  fix: prevent campaign names from causing dashboard rows to overflow (#1147)