# MongoDB Node Driver Test Automation

This repo contains a test automation suite with a variety of tests. In this readme, you'll learn about the types of tests and how to run them.
All of our test automation is powered by the Mocha test framework.
Some of the tests require a particular topology (e.g., standalone server, replica set, or sharded cluster). These tests check the topology of the MongoDB server that is being used. If the topology does not match, the tests will be skipped.
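Many tests in this repo declare their topology requirements through a metadata object on the test definition. The exact shape is defined by the repo's Mocha configuration, so the snippet below is only an illustrative sketch of the idea:

```js
// Hypothetical sketch: the test runner skips this test unless the
// connected deployment is a replica set (the exact metadata shape may differ).
it('only runs against replica sets', {
  metadata: { requires: { topology: 'replicaset' } },
  test: async function () {
    /* test body */
  }
});
```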
Below is a summary of the types of test automation in this repo.
| Type of Test | Test Location | About the Tests | How to Run Tests |
| --- | --- | --- | --- |
| Unit | `/test/unit` | The unit tests test individual pieces of code, typically functions. These tests do not interact with a real database, so mocks are used instead. The unit test directory mirrors the `/src` directory structure with test file names matching the source file names of the code they test. | `npm run check:unit` |
| Integration | `/test/integration` | The integration tests test that a given feature or piece of a feature is working as expected. These tests do not use mocks; instead, they interact with a real database. The integration test directory follows the `test/spec` directory structure representing the different functional areas of the driver. Note: The `.gitkeep` files are intentionally left to ensure that this directory structure is preserved even as the actual test files are moved around. | `npm run check:test` |
| Benchmark | `/test/benchmarks` | The benchmark tests report how long a designated set of tests take to run. They are used to measure performance. | `npm run check:bench` |
| Specialized Environment | `/test/manual` | The specialized environment tests are functional tests that require specialized environment setups in Evergreen. Note: "manual" in the directory path does not refer to tests that should be run manually. These tests are automated. These tests have a special Evergreen configuration and run in isolation from the other tests. | There is no single script for running all of the specialized environment tests. Instead, you can run the appropriate script based on the specialized environment you want to use:<br>- `npm run check:atlas` to test Atlas<br>- `npm run check:adl` to test Atlas Data Lake<br>- `npm run check:kerberos` to test Kerberos<br>- `npm run check:tls` to test TLS<br>- `npm run check:ldap` to test LDAP authorization |
| TypeScript Definition | `/test/types` | The TypeScript definition tests verify the type definitions are correct. | `npm run check:tsd` |
| GitHub Actions | `/test/action` | Tests that run as GitHub Actions such as dependency checking. | Currently, only `npm run check:dependencies` but could be expanded to more in the future. |
| Code Examples | `/test/integration/node-specific/examples` | Code examples that are also paired with tests that show they are working examples. | Currently, `npm run check:lambda` to test the AWS Lambda example with default auth and `npm run check:lambda:aws` to test the AWS Lambda example with AWS auth. |
| Explicit Resource Management | `/test/explicit-resource-management` | Tests that use explicit resource management with the driver's disposable resources. | `bash .evergreen/run-resource-management-feature-integration.sh` |
All of the MongoDB drivers follow the same specifications (specs). Each spec has tests associated with it. Some of the tests are prose (written, descriptive) tests, which must be implemented on a case-by-case basis by the developers on the driver teams. Other tests are written in a standardized form as YAML and converted to JSON, which can be read by the specialized spec test runners that are implemented in each driver.
The input test specifications are stored in `test/spec`.

The actual implementations of the spec tests can be unit tests or integration tests depending on the requirements, and they can be found in the corresponding test directory according to their type. Regardless of whether they are located in the `/unit` or `/integration` test directory, test files named `spec_name.spec.test` contain spec test implementations that use a standardized runner, and `spec_name.prose.test` files contain prose test implementations.
The easiest way to get started running the tests locally is to start a standalone server and run all of the tests.

Start a `mongod` standalone with our `cluster_setup.sh` script:

```sh
./test/tools/cluster_setup.sh server
```

Then run the tests:

```sh
npm test
```

Note: the command above will run a subset of the tests that work with the standalone server topology since the tests are being run against a standalone server.
The output will show how many tests passed, failed, and are pending. Tests that we have indicated should be skipped using `.skip()` will appear as pending in the test results. See Mocha's documentation for more information.
In the following subsections, we'll dig into the details of running the tests.
By default, the integration tests run with auth enabled, and the `cluster_setup.sh` script defaults to starting servers with auth enabled. Tests can be run locally without auth by setting the environment variable `AUTH` to the value `noauth`. This is a two-step process: start the server without auth enabled, and then run the tests with auth disabled as well.

```sh
AUTH='noauth' ./test/tools/cluster_setup.sh <server>
AUTH='noauth' npm run check:test
```
As we mentioned earlier, the tests check the topology of the MongoDB server being used and run the tests associated with that topology. Tests that don't have a matching topology will be skipped.
In the steps above, we started a standalone server:

```sh
./test/tools/cluster_setup.sh server
```

You can use the same `cluster_setup.sh` script to start a replica set or sharded cluster by passing the appropriate option:

```sh
./test/tools/cluster_setup.sh replica_set
```

or

```sh
./test/tools/cluster_setup.sh sharded_cluster
```
If you are running more than a standalone server, make sure your `ulimit` settings are in accordance with MongoDB's recommendations. Changing the settings on the latest versions of macOS can be tricky. See this article for tips. (You likely don't need to do the complicated `maxproc` steps.)
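As a quick check before starting a replica set or sharded cluster, you can inspect and raise the open-files limit for your current shell. This is only a suggested sketch; the `64000` value mirrors MongoDB's commonly recommended minimum:

```sh
# Show the current open-files limit for this shell
ulimit -n

# Raise it for this session before running cluster_setup.sh
ulimit -n 64000
```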
The `cluster_setup.sh` script automatically stores the files associated with the MongoDB server in the `data` directory, which is stored at the top level of this repository. You can delete this directory if you want to ensure you're running a clean configuration. If you delete the directory, the associated database server will be stopped, and you will need to run `cluster_setup.sh` again.
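For example, a clean restart of a standalone server might look like this:

```sh
# Remove the server's data files, then start a fresh standalone server
rm -rf data
./test/tools/cluster_setup.sh server
```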
You can prefix `npm test` with a `MONGODB_URI` environment variable to point the tests to a specific deployment. For example, for a standalone server, you might use:

```sh
MONGODB_URI=mongodb://localhost:27017 npm test
```

For a replica set, you might use:

```sh
MONGODB_URI=mongodb://localhost:31000,localhost:31001,localhost:31002/?replicaSet=rs npm test
```
The easiest way to run a single test is by appending `.only()` to the test context you want to run. For example, you could update a test function to be:

```js
it.only('cool test', function () {});
```

Then, run the test using `npm run check:test` for a functional or integration test or `npm run check:unit` for a unit test. See Mocha's documentation for more detailed information on `.only()`.
Another way to run a single test is to use Mocha's `grep` flag. For functional or integration tests, run:

```sh
npm run check:test -- -g <test name>
```

For unit tests, run:

```sh
npm run check:unit -- -g <test name>
```

See the Mocha documentation for information on the `grep` flag.
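For example, to run only the tests whose titles contain a given phrase, quote the phrase so the shell passes it to Mocha as a single argument:

```sh
# Runs every test whose title matches "cool test"
npm run check:test -- -g "cool test"
```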
Evergreen is the continuous integration (CI) system we use. Evergreen builds are automatically run whenever a pull request is created or when commits are pushed to particular branches (e.g., `main`, `4.0`, and `3.6`).

Each Evergreen build runs the test suite against a variety of build variants that include a combination of topologies, special environments, and operating systems. By default, commits in pull requests only run a subset of the build variants in order to save time and resources. To configure a build, update `.evergreen/config.yml.in` and then generate a new Evergreen config via:

```sh
node .evergreen/generate_evergreen_tasks.js
```
Occasionally, you will want to manually kick off an Evergreen build in order to debug a test failure or to run tests against uncommitted changes.
You can use the Evergreen UI to choose to rerun a task (an entire set of test automation for a given topology and environment). Evergreen does not allow you to rerun an individual test.
You can also choose to run a build against code on your local machine that you have not yet committed by running a pre-commit patch build.
Begin by setting up the Evergreen CLI.

- Download and install the Evergreen CLI according to the instructions in the Evergreen documentation.
- Be sure to create `evergreen.yml` as described in the documentation.
- Add the Evergreen binary to your path.

Once you have the Evergreen CLI set up, you are ready to run a build. Keep in mind that if you want to run only a few tests, you can append `.only()` as described in the section above on running individual tests.
- In a terminal, navigate to your node driver directory:

  ```sh
  cd node-mongodb-native
  ```

- Use the Evergreen `patch` command:

  - `-y` skips the confirmation dialog.
  - `-u` includes uncommitted changes.
  - `-p [project name]` specifies the Evergreen project.
  - `--browse` opens the patch URL in your browser.

  ```sh
  evergreen patch -y -u -p mongo-node-driver-next --browse
  ```

- In your browser, select the build variants and tasks to run.
You may want to test the driver with a pre-release version of a dependent library (e.g., `bson`). Follow the steps below to do so.

- Open `package.json`.
- Identify the line that specifies the dependency.
- Replace the version number with the commit hash of the dependent library. For example, you could use a particular commit for the js-bson project on GitHub:

  ```json
  "bson": "mongodb/js-bson#e29156f7438fa77c1672fd70789d7ade9ca65061"
  ```

- Run `npm install` to install the dependency.

Now you can run the automated tests, run manual tests, or kick off an Evergreen build from your local repository.
You may want to manually test changes you have made to the driver. The steps below will walk you through how to create a new Node project that uses your local copy of the Node driver. You can modify the steps to work with existing Node projects.

- Navigate to a new directory and create a new Node project by running `npm init` in a terminal and working through the interactive prompts. A new file named `package.json` will be created for you.
- In `package.json`, create a new dependency for `mongodb` that points to your local copy of the driver. For example:

  ```json
  "dependencies": {
    "mongodb": "/path-to-your-copy-of-the-driver-repo/node-mongodb-native"
  }
  ```

- Run `npm install` to install the dependency.
- Create a new file that uses the driver to test your changes (see the sketch below). See the MongoDB Node.js Quick Start Repo for example scripts you can use.

Note: When making driver changes, you will need to run `npm run build:ts` with each change in order for it to take effect.
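As a reference point, a minimal script for such a project might look like the following, assuming a standalone server on the default port (the file name, database, and collection names are illustrative):

```js
// test-driver-changes.js
const { MongoClient } = require('mongodb');

async function main() {
  const client = new MongoClient('mongodb://localhost:27017');
  try {
    await client.connect();
    // Exercise whatever driver change you are testing; this is just a round trip.
    const result = await client.db('test').collection('docs').insertOne({ hello: 'world' });
    console.log('inserted document with _id:', result.insertedId);
  } finally {
    await client.close();
  }
}

main().catch(console.error);
```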
TODO: flesh this section out more
We use `mocha` to construct our test suites and `chai` to assert expectations.
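For example, a test file typically pairs Mocha's `describe`/`it` blocks with `chai`'s `expect` assertions. This snippet is purely illustrative and not an actual test in the repo:

```js
const { expect } = require('chai');

describe('my feature', function () {
  it('returns the expected result', function () {
    // Arrange and act against the unit under test, then assert with chai.
    const result = { ok: 1 };
    expect(result).to.have.property('ok', 1);
  });
});
```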
Some special notes on how mocha works with our testing setup:

- `before` hooks will run even if a test is skipped by the environment it runs on.
  - So, for example, if your `before` hook does logic that can only run on a certain server version, you can't depend on your test block metadata to filter for that.
- `after` hooks cannot be used to clean up clients because the session leak checker currently runs in an `afterEach` hook, which would be executed before any `after` hook has a chance to run.
Not all tests are able to run in all environments, and some are unable to run at all due to known bugs. When marking a test to be skipped, be sure to include a `skipReason` so that it can be added to the test run printout.

```js
// skipping an individual test
it.skip('should not run', () => { /* test */ }).skipReason = 'TODO: NODE-1234';

// skipping a set of tests via beforeEach
// (use a regular function so `this` is bound to the Mocha context)
beforeEach(function () {
  if (/* some condition */) {
    this.currentTest.skipReason = 'requires <run condition> to run';
    this.skip();
  }
});
```
```sh
npm run check:bench
```

Refer to the `run-spec-benchmark-tests-node-server` task for the Node.js version, MongoDB server version, and platform that we run benchmarks against in CI.
The server is run in standalone mode, and the server versions are aliased by this script: https://github.com/mongodb-labs/drivers-evergreen-tools/blob/5048cca80e9ca62642409de2d401058bbd7057fa/.evergreen/mongodl.py#L58. Check the latest version of that script to see which alias the driver is running against.

The host used is described in detail here: https://spruce.mongodb.com/distro/rhel90-dbx-perf-large/settings/general (auth required to view).
Here is a rough list of the key configurations:
- cpu: Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz
- cores: 16
- arch: x64
- os: RHEL 9.0 linux (5.14.0-70.75.1.el9_0.x86_64)
- ram: 64 GB
It is best to try reproductions against as similar a deployment as possible to isolate regressions.
The benchmarks can be directed to test different settings and driver versions.
The following are environment variables and how the benchmark runner uses them (see the example after this list):

- `MONGODB_DRIVER_PATH` - (default: the current working driver) if set, MUST be set to the directory a driver version is in, usually another clone of the driver checked out to a different revision.
- `MONGODB_CLIENT_OPTIONS` - (default: empty object) if set, MUST be a JSON string that will be parsed and passed as the second argument to the MongoClient constructor.
- `MONGODB_URI` - (default: `mongodb://127.0.0.1:27017`) if set, MUST be a valid MongoDB connection string, and it will be used as the host the benchmarks run against.
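For example, a local benchmark run against a standalone server with custom client options might look like this (the option value is illustrative):

```sh
# Point the benchmarks at a local standalone and pass extra MongoClient options
MONGODB_URI='mongodb://127.0.0.1:27017' \
MONGODB_CLIENT_OPTIONS='{"appName":"local-bench"}' \
npm run check:bench
```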
It may be desirable to test how changes to `bson` impact the driver's performance. To do this (a combined sketch follows the list):

- clone the changed version of BSON
- run the build script for that repo (usually done by `npm install` for you)
- run `npm link`
- over in the driver repo run `npm link bson`
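Putting those steps together, a minimal sketch might look like the following (the paths are illustrative):

```sh
# In your local js-bson checkout: install (which runs the build) and register the link
cd /path/to/js-bson
npm install
npm link

# In the driver repo: point the bson dependency at the linked copy
cd /path/to/node-mongodb-native
npm link bson
```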
When you run the benchmarks, verify that the BSON version has been picked up by checking the version references that are printed out:

```txt
- cpu: Apple M1 Max
- cores: 10
- arch: arm64
- os: darwin (23.6.0)
- ram: 32GB
- node: v22.6.0
- driver: 6.11.0 (df3ea32a9): .../mongodb
- options {}
- bson: 6.10.1 (installed from npm): (.../mongodb/node_modules/bson)
```
Secrets needed for testing in special environments are managed in a drivers-wide AWS secrets manager vault.
The `.evergreen/secrets_handling` folder in drivers-evergreen-tools contains scripts that can fetch secrets from the secrets manager for local use and for use in CI.

Local use of the secrets manager tooling requires:

- the AWS SDK installed
- an AWS profile with access to the AWS vault configured (see instructions in the secrets handling readme)
Here's an example usage of the tooling in drivers-evergreen-tools that configures credentials for CSFLE:

```sh
bash ${DRIVERS_TOOLS}/.evergreen/secrets_handling/setup-secrets.sh drivers/csfle
source secrets-export.sh
```
- The `setup-secrets.sh` script authenticates with AWS, fetches credentials, and writes them to a bash file called `secrets-export.sh`.
- The script accepts a space-separated list of all the vaults from which to fetch credentials. In this case, we fetch credentials from the `drivers/csfle` vault.
- Source `secrets-export.sh` to load the credentials into the environment.
Important
Make sure `secrets-export.sh` is in the `.gitignore` of any GitHub repo you might be using these tools in to avoid leaking credentials. This is already done for this repo.
In order to test some features, you will need to generate and set a specialized group of environment variables. The subsections below will walk you through how to generate and set the environment variables for these features.
We recommend using a different terminal for each specialized environment to avoid the environment variables from one specialized environment impacting the test runs for another specialized environment.
Before you begin any of the subsections below, clone the drivers-evergreen-tools repo.
We recommend creating an environment variable named `DRIVERS_TOOLS` that stores the path to your local copy of the `drivers-evergreen-tools` repo (code examples in this section will assume this has been done):

```sh
export DRIVERS_TOOLS="/path/to/your/copy/of/drivers-evergreen-tools"
```
The following steps will walk you through how to create and test a MongoDB Serverless instance.
Important
If you set up an Atlas cluster for local use, you MUST delete it when you are finished with it using the delete-instance script.
This script uses AWS Secrets Manager to fetch credentials. Make sure you are logged into AWS and have your profile set correctly.
- Run the setup-serverless script:

  ```sh
  bash .evergreen/setup-serverless.sh
  ```

- Source the expansions and secrets:

  ```sh
  source secrets-export.sh
  source serverless.env
  ```

- Comment out the line in `.evergreen/run-serverless-tests.sh` that sources `install-dependencies.sh` (this downloads node and npm and is only used in CI).

- Run the `.evergreen/run-serverless-tests.sh` script directly to test serverless instances from your local machine.
The following steps will walk you through how to start and test a load balancer.
- Start a sharded cluster with two `mongos`, so you have a URI similar to `MONGODB_URI=mongodb://host1,host2/`. The server must be version 5.2.0 or higher.

  Create the config server:

  ```sh
  mongod --configsvr --replSet test --dbpath config1 --bind_ip localhost --port 27217
  ```

  Initiate the config server in the shell:

  ```sh
  mongosh "mongodb://localhost:27217" --eval "rs.initiate( { _id: 'test', configsvr: true, members: [ { _id: 0, host: 'localhost:27217' } ] })"
  ```

  Create shard replica sets:

  ```sh
  mongod --shardsvr --replSet testing --dbpath repl1 --bind_ip localhost --port 27218 --setParameter enableTestCommands=true
  mongod --shardsvr --replSet testing --dbpath repl2 --bind_ip localhost --port 27219 --setParameter enableTestCommands=true
  mongod --shardsvr --replSet testing --dbpath repl3 --bind_ip localhost --port 27220 --setParameter enableTestCommands=true
  ```

  Initiate the replica set in the shell:

  ```sh
  mongosh "mongodb://localhost:27218" --eval "rs.initiate( { _id: 'testing', members: [ { _id: 0, host: 'localhost:27218' }, { _id: 1, host: 'localhost:27219' }, { _id: 2, host: 'localhost:27220' }] })"
  ```

  Create two `mongos` running on ports `27017` and `27018`:

  ```sh
  mongos --configdb test/localhost:27217 --bind_ip localhost --setParameter enableTestCommands=1 --setParameter loadBalancerPort=27050
  mongos --configdb test/localhost:27217 --port 27018 --bind_ip localhost --setParameter enableTestCommands=1 --setParameter loadBalancerPort=27051
  ```

  Initiate the cluster on `mongos` in the shell:

  ```sh
  mongosh "mongodb://localhost:27017" --eval "sh.addShard('testing/localhost:27218,localhost:27219,localhost:27220')"
  mongosh "mongodb://localhost:27017" --eval "sh.enableSharding('test')"
  ```
- An alternative to the fully manual cluster setup is to use `mlaunch`. Initialize the sharded cluster via `mlaunch` in a new empty directory:

  ```sh
  mlaunch init --dir data --ipv6 --replicaset --nodes 2 --port 51000 --name testing --setParameter enableTestCommands=1 --sharded 1 --mongos 2
  ```

  `mlaunch` will then start up the sharded cluster. Once it finishes, stop the cluster:

  ```sh
  mlaunch stop
  ```

  When `mlaunch` has stopped the cluster, navigate to the `data` directory and edit the `.mlaunch_startup` file:

  - Add `--setParameter loadBalancerPort=27050` to the first `mongos` configuration at the bottom of the file.
  - Add `--setParameter loadBalancerPort=27051` to the second `mongos` configuration at the bottom of the file.

  Navigate back up to the root directory where `mlaunch` was initialized and restart:

  ```sh
  mlaunch start
  ```
- Create an environment variable named `MONGODB_URI` that stores the URI of the sharded cluster you just created. For example:

  ```sh
  export MONGODB_URI="mongodb://host1,host2/"
  ```

- Install the HAProxy load balancer. For those on macOS, you can install HAProxy with:

  ```sh
  brew install haproxy
  ```

- Start the load balancer by using the run-load-balancer script provided in `drivers-evergreen-tools`:

  ```sh
  $DRIVERS_TOOLS/.evergreen/run-load-balancer.sh start
  ```

  A new file named `lb-expansion.yml` will be automatically created. The contents of the file will be similar in structure to the code below.

  ```yaml
  SINGLE_MONGOS_LB_URI: 'mongodb://127.0.0.1:8000/?loadBalanced=true'
  MULTI_MONGOS_LB_URI: 'mongodb://127.0.0.1:8001/?loadBalanced=true'
  ```

- Generate a sourceable environment file from `lb-expansion.yml` by running the following command:

  ```sh
  cat lb-expansion.yml | sed 's/: /=/g' > lb.env
  ```

  A new file named `lb.env` is automatically created.

- Source the environment variables using a command like `source lb.env`.

- Export each of the environment variables that were created in `lb.env`. For example:

  ```sh
  export SINGLE_MONGOS_LB_URI
  ```

- Set the `LOAD_BALANCER` environment variable to `true`:

  ```sh
  export LOAD_BALANCER='true'
  ```

- Disable auth for tests:

  ```sh
  export AUTH='noauth'
  ```

- Run the test suite as you normally would:

  ```sh
  npm run check:test
  ```

  Verify that the output from Mocha includes `[ topology type: load-balanced ]`. This indicates the tests successfully accessed the specialized environment variables for load balancer testing.

- When you are done testing, shut down the HAProxy load balancer:

  ```sh
  $DRIVERS_TOOLS/.evergreen/run-load-balancer.sh stop
  ```
The following steps will walk you through how to run the tests for CSFLE.
- Install MongoDB Client Encryption if you haven't already:

  ```sh
  npm install mongodb-client-encryption
  ```

  Note

  If developing changes in `mongodb-client-encryption`, you can link it locally using `etc/tooling/fle.sh`.

- Load FLE credentials and download `crypt_shared`. This must be run inside a bash or zsh shell.

  ```sh
  source .evergreen/setup-fle.sh
  ```

  Note

  By default, `setup-fle.sh` installs `crypt_shared`. If you want to test with `mongocryptd` instead, set the `RUN_WITH_MONGOCRYPTD` environment variable before sourcing `setup-fle.sh`.

- Run the functional tests:

  ```sh
  export TEST_CSFLE=true
  npm run check:test
  ```

The output of the tests will include sections like "Client-Side Encryption Corpus", "Client-Side Encryption Functional", "Client-Side Encryption Prose Tests", and "Client-Side Encryption".
CSFLE supports automatic KMS credential fetching for Azure, GCP, and AWS. In order to e2e test GCP and Azure, we must run the tests on an actual GCP or Azure host. This is supported by drivers-evergreen-tools.

The basic idea is to:

- Provision an Azure or GCP server.
- Set up a cluster on the server.
- Copy the driver and tests to the server and run the tests on the server.
- Copy the results back.

All of this is handled in the csfle/azurekms and csfle/gcpkms folders in drivers-evergreen-tools.

Important

Azure VMs and GCP VMs must be destroyed with their corresponding `teardown.sh` scripts.
- Provision an Azure server. You must set the `AZUREKMS_VMNAME_PREFIX` variable:

  ```sh
  export AZUREKMS_VMNAME_PREFIX="NODE_DRIVER"
  bash ${DRIVERS_TOOLS}/.evergreen/csfle/azurekms/setup.sh
  ```

- Comment out the following line in `run-deployed-azure-kms-tests.sh`:

  ```sh
  source $DRIVERS_TOOLS/.evergreen/init-node-and-npm-env.sh
  ```

- Run the tests:

  ```sh
  bash .evergreen/run-deployed-azure-kms-tests.sh
  ```
- Provision a GCP server:

  ```sh
  bash ${DRIVERS_TOOLS}/.evergreen/csfle/gcpkms/setup.sh
  ```

- Comment out the following line in `run-deployed-gcp-kms-tests.sh`:

  ```sh
  source $DRIVERS_TOOLS/.evergreen/init-node-and-npm-env.sh
  ```

- Run the tests:

  ```sh
  bash .evergreen/run-deployed-gcp-kms-tests.sh
  ```
Using drivers-evergreen-tools, run the `setup-atlas-cluster` script. You must also set the `CLUSTER_PREFIX` environment variable:

```sh
CLUSTER_PREFIX=dbx-node-lambda bash ${DRIVERS_TOOLS}/.evergreen/atlas/setup-atlas-cluster.sh
```

The URI of the cluster is available in the `atlas-expansions.yml` file.

- Set up an Atlas cluster, as outlined in the "Launching an Atlas Cluster" section.
- Add the URI of the cluster to the environment as the `MONGODB_URI` environment variable.
- Run the tests with `npm run check:search-indexes`.
TODO(NODE-6698): Update deployed lambda test section.
You must be in an office or connected to the VPN to run these tests.
Run `.evergreen/run-kerberos-tests.sh`.
Note
AWS ECS tests have a different setup process. Don't even bother running these locally; just pray to the CI gods that things work and you never have to touch these tests.
AWS tests require a cluster configured with MONGODB-AWS auth enabled. This is easy to set up using `drivers-evergreen-tools` by specifying the `aws-auth.json` orchestration file (this is what CI does).
- Set up your cluster and export the URI of your cluster as `MONGODB_URI`.
- Choose your configuration and set the relevant environment variables:
  - Do you want the AWS SDK to be installed while running auth? If not, set `MONGODB_AWS_SDK` to `false`.
  - Choose your AWS authentication credential type and export `AWS_CREDENTIAL_TYPE` with the chosen value:
| AWS Credential Type | Explanation |
| --- | --- |
| regular | The AWS credentials are present in the URI as username:password |
| env-creds | AWS credentials are loaded into the environment as AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY |
| assume-role | The machine assumes a particular authentication role, associated with the machine |
| ec2 | The driver authenticates against a local endpoint (on an AWS EC2 instance) |
| web-identity | Credentials are sourced from an AssumeRoleWithWebIdentity request |
| session-creds | Similar to env-creds, but the credentials are temporary and include a session token |
- Run `bash .evergreen/run-mongodb-aws-tests.sh`.
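Putting the steps together, a local run that uses environment credentials might look like the following (the URI value is illustrative):

```sh
export MONGODB_URI="mongodb://localhost:27017/?authMechanism=MONGODB-AWS"
export AWS_CREDENTIAL_TYPE="env-creds"
# Optionally skip installing the AWS SDK during the run:
# export MONGODB_AWS_SDK="false"
bash .evergreen/run-mongodb-aws-tests.sh
```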
- TLS
- Atlas Data Lake
- LDAP
- Snappy (maybe in general, how to test optional dependencies)
- Atlas connectivity
These steps require `mongosh` to be available locally. Clone it from GitHub.

`mongosh` uses a `lerna` monorepo. As a result, `mongosh` contains multiple references to the `mongodb` package in its `package.json` files.

Set up `mongosh` by following the steps in the `mongosh` readme.
`mongosh` contains a script that replaces these references with a local copy of the driver. To use the script, create an environment variable `REPLACE_PACKAGE` that contains a string in the form `mongodb:<path to your local instance of the driver>`. The package replacement script will replace all occurrences of `mongodb` with the local path of your driver.
An alternative, which can be useful for testing a release, is to first run `npm pack` on the driver. This generates a tarball containing all the code that would be uploaded to `npm` if it were released. Then, set the environment variable `REPLACE_PACKAGE` with the full path to the file.
Once the environment variable is set, run the package replacement script in `mongosh` with:

```sh
npm run replace:package
```
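For example, pointing the replacement at a local checkout of the driver might look like this (the path is illustrative):

```sh
# Point mongosh's mongodb dependency at your local driver checkout
export REPLACE_PACKAGE="mongodb:/path/to/your/node-mongodb-native"
npm run replace:package
```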
`mongosh`'s readme documents how to run its tests. Most likely, it isn't necessary to run all of `mongosh`'s tests. The `mongosh` readme also documents how to run tests for a particular scope. The scopes are listed in the `generate_mongosh_tasks.js` Evergreen generation script.

For example, to run the tests for the `service-provider-server` package, run the following command in `mongosh`:

```sh
lerna run test --scope @mongosh/service-provider-server
```