Writing Performance Tests

To add perf tests for the sdk/<service>/<service-sdk> package, follow the steps below.

  1. Create a new folder for the perf tests.

    Path: sdk/<service>/perf-tests/<service-sdk>

    (Create the perf-tests folder if it doesn't already exist.)

  2. This new project will be part of rush, with the package name @azure-tests/<service-sdk>. Since it is part of rush, add the following entry to rush.json:

        {
          "packageName": "@azure-tests/<service-sdk>",
          "projectFolder": "sdk/<service>/perf-tests/<service-sdk>",
          "versionPolicyName": "test"
        }
    
    
  3. Tests will live under sdk/<service>/perf-tests/<service-sdk>/test

  4. Add a package.json, such as example-perf-package.json, in the sdk/<service>/perf-tests/<service-sdk> folder (a minimal sketch is shown after this list).

    Make sure to import your <service-sdk> and the test-utils-perfstress project.

      "devDependencies": {
         "@azure/<service-sdk>": "^<version-in-master-branch>",
         "@azure/test-utils-perfstress": "^1.0.0"
       }

    Note: "@azure/test-utils-perfstress" is not a published npm package.

    Set the name of the package and mark it as private.

     "name": "@azure-tests/<service-sdk>",
     "private": true,
  5. Run rush update and commit the changes to the pnpm-lock file.

  6. Copy the tsconfig.json and sample.env (and .env) files that are present at sdk/<service>/<service-sdk> to sdk/<service>/perf-tests/<service-sdk>.

    Set the compilerOptions.module to commonjs in the tsconfig to allow running the tests with ts-node.

      "module": "commonjs"

(Skip this section if your service does not have or does not care about a track-1 version.)

  1. If there is an old major version of your package that needs to be compared, create the folder sdk/<service>/perf-tests/<service-sdk>-track-1.

  2. The track-1 perf tests are expected to be counterparts of the track-2 tests, so give them the same names as their track-2 equivalents for convenience.

  3. Add a package.json, such as example-track-1-perf-package.json, in the sdk/<service>/perf-tests/<service-sdk>-track-1 folder (a sketch is shown after this list).

    Make sure to import your <service-sdk> and the test-utils-perfstress project.

      "devDependencies": {
         "@azure/<service-sdk>": "^<latest-track-1-version>",
         "@azure/test-utils-perfstress": "file:../../../test-utils/perfstress/azure-test-utils-perfstress-1.0.0.tgz",
       }

    Set the name of the package and mark it as private.

     "name": "@azure-tests/<service-sdk>-track-1",
     "private": true,

    Note: Track-1 packages are not managed by rush; npm is used to manage and run the track-1 tests instead. You can copy a readme such as the storage-blob-perf-tests-readme for instructions.

    Make sure to add the "setup" step in package.json.

        "setup": "node ../../../../common/tools/perf-tests-track-1-setup.js",
  4. Run rush update and then npm run setup to be able to use the perf framework for the track-1 perf tests.

    npm run setup installs the dependencies specified in package.json.

  5. Repeat step 6 from the previous section for the track-1 project too.
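
For reference, the track-1 project's package.json could follow the same shape as the track-2 sketch, swapping in the tgz dependency and the setup script from the notes above (version numbers are again only illustrative):

  {
    "name": "@azure-tests/<service-sdk>-track-1",
    "version": "1.0.0",
    "private": true,
    "devDependencies": {
      "@azure/<service-sdk>": "^<latest-track-1-version>",
      "@azure/test-utils-perfstress": "file:../../../test-utils/perfstress/azure-test-utils-perfstress-1.0.0.tgz",
      "dotenv": "^8.2.0",
      "ts-node": "^8.3.0",
      "typescript": "~4.1.2"
    },
    "scripts": {
      "setup": "node ../../../../common/tools/perf-tests-track-1-setup.js",
      "perf-test:node": "ts-node test/index.spec.ts"
    }
  }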

Add an index.spec.ts at sdk/<service>/perf-tests/<service-sdk>/test/. This is the entry point that selects and runs the requested test:

import { PerfStressProgram, selectPerfStressTest } from "@azure/test-utils-perfstress";
import { ServiceNameAPINameTest } from "./api-name.spec";
import { ServiceNameAPIName2Test } from "./api-name2.spec";

console.log("=== Starting the perfStress test ===");

const perfStressProgram = new PerfStressProgram(
  selectPerfStressTest([ServiceNameAPINameTest, ServiceNameAPIName2Test])
);

perfStressProgram.run();

The base class holds the common code that would otherwise be repeated in every test, such as creating the client and creating a base resource.

Create a new file such as serviceNameTest.spec.ts at sdk/<service>/perf-tests/<service-sdk>/test/.

import { PerfStressTest, getEnvVar } from "@azure/test-utils-perfstress";
import {
  ServiceNameClient
} from "@azure/<service-sdk>";

// Expects the .env file at the same level
import * as dotenv from "dotenv";
dotenv.config();

export abstract class ServiceNameTest<TOptions> extends PerfStressTest<TOptions> {
  serviceNameClient: ServiceNameClient;

  constructor() {
    super();
    // Setting up the serviceNameClient
  }

  public async globalSetup() {
    // .createResources() using serviceNameClient
  }

  public async globalCleanup() {
    // .deleteResources() using serviceNameClient
  }
}
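
As a concrete illustration of the constructor placeholder above, the client could be built from an environment variable via the getEnvVar helper that is already imported. This is only a sketch; the SERVICE_NAME_CONNECTION_STRING variable and the ServiceNameClient constructor signature are placeholders for whatever your service actually needs:

constructor() {
  super();
  // Placeholder: read credentials from the environment (loaded by dotenv above)
  // and construct the client that all tests will share.
  const connectionString = getEnvVar("SERVICE_NAME_CONNECTION_STRING");
  this.serviceNameClient = new ServiceNameClient(connectionString);
}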

The following code shows what an individual perf test file would look like.

import { ServiceNameClient } from "@azure/<service-sdk>";
import { PerfStressOptionDictionary } from "@azure/test-utils-perfstress";
import { ServiceNameTest } from "./serviceNameTest.spec";

interface ServiceNameAPINameTestOptions {
  newOption: number;
}

export class ServiceNameAPINameTest extends ServiceNameTest<ServiceNameAPINameTestOptions> {
  public options: PerfStressOptionDictionary<ServiceNameAPINameTestOptions> = {
    newOption: {
      required: true,
      description: "A new option",
      shortName: "sz",
      longName: "newOption",
      defaultValue: 10240
    }
  };

  serviceNameClient: ServiceNameClient;

  constructor() {
    super();
    // Setting up the client
  }

  public async globalSetup() {
    await super.globalSetup(); // Calling the base class' setup
    // Add any additional setup
  }

  async runAsync(): Promise<void> {
    // Call the method on `serviceNameClient` that you're interested in testing
  }
}

It is not mandatory to have separate base and test classes. If there is nothing common among your service's test scenarios, feel free to merge the base class into the test class and have a single test class instead.

As seen in the previous section, you can specify custom options along with the default options from the performance framework. You can access the options in the class using this.parsedOptions.

Parsed options include the default options (such as duration, iterations, and parallel) offered by the perf framework, as well as the custom options provided in the test class.

interface ServiceNameAPINameTestOptions {
  newOption: number;
}

export class ServiceNameAPINameTest extends ServiceNameTest<ServiceNameAPINameTestOptions> {
  public options: PerfStressOptionDictionary<ServiceNameAPINameTestOptions> = {
    newOption: {
      required: true,
      description: "A new option",
      shortName: "sz",
      longName: "newOption",
      defaultValue: 10240
    }
  };
}
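
Inside runAsync, the custom option declared above can then be read from this.parsedOptions. This is a sketch; it assumes the parsed value is exposed on the option's value property, as in the existing perf test projects in the repo:

async runAsync(): Promise<void> {
  // this.parsedOptions carries the framework options (duration, iterations,
  // parallel, ...) as well as the custom options declared in `options`.
  const newOption = this.parsedOptions.newOption.value;
  // ...pass `newOption` to the client method under test
}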

To run a particular test, use npm run perf-test:node, which takes the test class name as an argument along with any command-line options you want to pass.

  • Run npm run perf-test:node -- TestClassName --warmup 2 --duration 7 --iterations 2 --parallel 2

Refer to the storage-blob-perf-tests-readme and storage-blob-perf-tests-readme-track-1 and add a similar set of instructions for your perf project.

To run the perf tests against an older version of the track-2 package:

  • Example: @azure/<service-sdk> is currently at 12.4.0 on master and you want to test version 12.2.0.
    • In the track-2 perf tests project, update the @azure/<service-sdk> dependency version in package.json to 12.2.0.
    • Add a new exception in common\config\rush\common-versions.json under allowedAlternativeVersions (see the sketch after this list).
      • "@azure/<service-sdk>": [..., "12.2.0"]
  • Run rush update (generates a new pnpm-lock file).
  • Navigate to sdk\<service>\perf-tests\<service-sdk>.
  • Run rush build -t perf-test-<service-sdk>.
  • Run the tests as suggested before, for example npm run perf-test:node -- TestClassName --warmup 2 --duration 7 --iterations 2 --parallel 2.
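
The allowedAlternativeVersions entry in common\config\rush\common-versions.json would then contain something like the sketch below (keep any versions that are already listed for other packages):

  "allowedAlternativeVersions": {
    "@azure/<service-sdk>": ["12.2.0"]
  }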