feat(integ-runner): test update path when running tests (#19915)
This PR adds an "update path" workflow to the integration test
workflow. When a change is made to an existing integration test, we
want to ensure that the change works not only for _new_ stacks, but
also for _existing_ stacks. To accomplish this, the runner will first
deploy the existing snapshot (i.e. `cdk deploy --app /path/to/snapshot`)
and then perform a stack update by deploying the integration test.

The update workflow can be disabled by passing the `--disable-update-workflow`
command line option.
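The two-step deploy described above can be sketched as follows. This is an illustrative TypeScript sketch, not the runner's actual API: the function name and return shape are hypothetical, and it only captures the order of the `cdk deploy` invocations.

```typescript
// Hypothetical sketch of the update workflow: first deploy the existing
// snapshot, then deploy the integration test app to exercise a stack update.
function updateWorkflowCommands(snapshotDir: string, testApp: string): string[][] {
  return [
    // step 1: deploy the checked-in snapshot as-is
    ['cdk', 'deploy', '--app', snapshotDir],
    // step 2: deploy the integration test, updating the stack in place
    ['cdk', 'deploy', '--app', testApp],
  ];
}
```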

This PR adds a couple of pieces of functionality in order to enable
testing the "update path".

1. The runner will attempt to check out the snapshot from the origin
   before running the test. This ensures that you are actually testing
   an update of the existing snapshot and not an intermediate snapshot
   (e.g. if you have been iterating locally).
2. When the runner performs the snapshot diff, it will check for any
   destructive changes and return that information as a warning to the
   user.
3. If any destructive changes are found, the runner will record that
   information as trace metadata in the `manifest.json` file. This will
   enable us to create automation that checks for this as part of the PR
   workflow.

The logic in items 2 and 3 honors the `allowDestroy` option: if a
resource type is included in the `allowDestroy` list, its destruction
will not be reported.
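A minimal sketch of the `allowDestroy` filtering described above. The `DestructiveChange` shape here is simplified and the helper is hypothetical; the runner's real types may carry different fields.

```typescript
// Hypothetical filter: destructive changes whose resource type appears
// in the allowDestroy list are dropped before reporting.
interface DestructiveChange {
  stackName: string;
  logicalId: string;
  resourceType: string;
  impact: string;
}

function reportableChanges(changes: DestructiveChange[], allowDestroy: string[] = []): DestructiveChange[] {
  const allowed = new Set(allowDestroy);
  return changes.filter(change => !allowed.has(change.resourceType));
}
```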


----

### All Submissions:

* [x] Have you followed the guidelines in our [Contributing guide](https://github.com/aws/aws-cdk/blob/master/CONTRIBUTING.md)?

### Adding new Unconventional Dependencies:

* [ ] This PR adds new unconventional dependencies following the process described [here](https://github.com/aws/aws-cdk/blob/master/CONTRIBUTING.md/#adding-new-unconventional-dependencies)

### New Features

* [ ] Have you added the new feature to an [integration test](https://github.com/aws/aws-cdk/blob/master/INTEGRATION_TESTS.md)?
	* [ ] Did you use `yarn integ` to deploy the infrastructure and generate the snapshot (i.e. `yarn integ` without `--dry-run`)?

*By submitting this pull request, I confirm that my contribution is made under the terms of the Apache-2.0 license*
corymhall committed Apr 14, 2022
1 parent 423d72f commit d0ace8f
Showing 18 changed files with 1,019 additions and 258 deletions.
33 changes: 33 additions & 0 deletions packages/@aws-cdk/integ-runner/README.md
@@ -60,6 +60,8 @@ to be a self contained CDK app. The runner will execute the following for each f
If this is set to `true` then the list of tests provided will be excluded
- `--from-file`
Read the list of tests from this file
- `--disable-update-workflow` (default=`false`)
If this is set to `true` then the [update workflow](#update-workflow) will be disabled

Example:

@@ -150,6 +152,37 @@ Test Results:
Tests: 1 passed, 1 total
```

#### Update Workflow

By default, integration tests are run with the "update workflow" enabled. This can be disabled by using the `--disable-update-workflow` command line option.

If an existing snapshot is being updated, the integration test runner will first deploy the existing snapshot and then perform a stack update
with the new changes. This is to test for cases where an update would cause a breaking change only on a stack update.

The `integ-runner` will also attempt to warn you if you are making any destructive changes with a message like:

```bash
!!! This test contains destructive changes !!!
Stack: aws-cdk-lambda-1 - Resource: MyLambdaServiceRole4539ECB6 - Impact: WILL_DESTROY
Stack: aws-cdk-lambda-1 - Resource: AliasAliasPermissionAF30F9E8 - Impact: WILL_REPLACE
Stack: aws-cdk-lambda-1 - Resource: AliasFunctionUrlDC6EC566 - Impact: WILL_REPLACE
Stack: aws-cdk-lambda-1 - Resource: Aliasinvokefunctionurl4CA9917B - Impact: WILL_REPLACE
!!! If these destructive changes are necessary, please indicate this on the PR !!!
```

If the destructive changes are expected (and required) then please indicate this on your PR.

##### New tests

If you are adding a new test which creates a new snapshot then you should run that specific test with `--disable-update-workflow`.
For example, if you are working on a new test `integ.new-test.js` then you would run:

```bash
yarn integ --update-on-failed --disable-update-workflow integ.new-test.js
```

This is because a new test has no existing snapshot, so there is nothing for the update workflow to update.

### integ.json schema

See [@aws-cdk/cloud-assembly-schema/lib/integ-tests/schema.ts](../cloud-assembly-schema/lib/integ-tests/schema.ts)
52 changes: 44 additions & 8 deletions packages/@aws-cdk/integ-runner/lib/cli.ts
@@ -1,9 +1,10 @@
// Exercise all integ stacks and if they deploy, update the expected synth files
import * as path from 'path';
import * as chalk from 'chalk';
import * as workerpool from 'workerpool';
import * as logger from './logger';
import { IntegrationTests, IntegTestConfig } from './runner/integ-tests';
import { runSnapshotTests, runIntegrationTests, IntegRunnerMetrics } from './workers';
import { runSnapshotTests, runIntegrationTests, IntegRunnerMetrics, IntegTestWorkerConfig, DestructiveChange } from './workers';

// https://github.com/yargs/yargs/issues/1929
// https://github.com/evanw/esbuild/issues/1492
@@ -27,14 +28,16 @@ async function main() {
.options('max-workers', { type: 'number', desc: 'The max number of workerpool workers to use when running integration tests in parallel', default: 16 })
.options('exclude', { type: 'boolean', desc: 'All tests should be run, except for the list of tests provided', default: false })
.options('from-file', { type: 'string', desc: 'Import tests to include or exclude from a file' })
.option('disable-update-workflow', { type: 'boolean', default: false, desc: 'If this is "true" then the stack update workflow will be disabled' })
.argv;

const pool = workerpool.pool(path.join(__dirname, '../lib/workers/extract/index.js'), {
maxWorkers: argv['max-workers'],
});

// list of integration tests that will be executed
const testsToRun: IntegTestConfig[] = [];
const testsToRun: IntegTestWorkerConfig[] = [];
const destructiveChanges: DestructiveChange[] = [];
const testsFromArgs: IntegTestConfig[] = [];
const parallelRegions = arrayFromYargs(argv['parallel-regions']);
const testRegions: string[] = parallelRegions ?? ['us-east-1', 'us-east-2', 'us-west-2'];
@@ -43,7 +46,7 @@ async function main() {
const fromFile: string | undefined = argv['from-file'];
const exclude: boolean = argv.exclude;

let failedSnapshots: IntegTestConfig[] = [];
let failedSnapshots: IntegTestWorkerConfig[] = [];
if (argv['max-workers'] < testRegions.length * (profiles ?? [1]).length) {
logger.warning('You are attempting to run %s tests in parallel, but only have %s workers. Not all of your profiles+regions will be utilized', testRegions.length * (profiles ?? [1]).length, argv['max-workers']);
}
@@ -65,14 +68,19 @@ async function main() {
testsFromArgs.push(...(await new IntegrationTests(argv.directory).fromCliArgs(argv._.map((x: any) => x.toString()), exclude)));
}

// If `--force` is not used then first validate the snapshots and gather
// the failed snapshot tests. If `--force` is used then we will skip snapshot
// tests and run integration tests for all tests
// always run snapshot tests, but if '--force' is passed then
// run integration tests on all failed tests, not just those that
// failed snapshot tests
failedSnapshots = await runSnapshotTests(pool, testsFromArgs);
for (const failure of failedSnapshots) {
destructiveChanges.push(...failure.destructiveChanges ?? []);
}
if (!argv.force) {
failedSnapshots = await runSnapshotTests(pool, testsFromArgs);
testsToRun.push(...failedSnapshots);
} else {
testsToRun.push(...testsFromArgs);
// if any of the test failed snapshot tests, keep those results
// and merge with the rest of the tests from args
testsToRun.push(...mergeTests(testsFromArgs, failedSnapshots));
}

// run integration tests if `--update-on-failed` OR `--force` is used
@@ -85,6 +93,7 @@ async function main() {
clean: argv.clean,
dryRun: argv['dry-run'],
verbose: argv.verbose,
updateWorkflow: !argv['disable-update-workflow'],
});
if (!success) {
throw new Error('Some integration tests failed!');
@@ -101,13 +110,28 @@ async function main() {
void pool.terminate();
}

if (destructiveChanges.length > 0) {
printDestructiveChanges(destructiveChanges);
throw new Error('Some changes were destructive!');
}
if (failedSnapshots.length > 0) {
let message = '';
if (!runUpdateOnFailed) {
message = 'To re-run failed tests run: yarn integ-runner --update-on-failed';
}
throw new Error(`Some snapshot tests failed!\n${message}`);
}

}

function printDestructiveChanges(changes: DestructiveChange[]): void {
if (changes.length > 0) {
logger.warning('!!! This test contains %s !!!', chalk.bold('destructive changes'));
changes.forEach(change => {
logger.warning(' Stack: %s - Resource: %s - Impact: %s', change.stackName, change.logicalId, change.impact);
});
logger.warning('!!! If these destructive changes are necessary, please indicate this on the PR !!!');
}
}

function printMetrics(metrics: IntegRunnerMetrics[]): void {
@@ -134,6 +158,18 @@ function arrayFromYargs(xs: string[]): string[] | undefined {
return xs.filter(x => x !== '');
}

/**
 * Merge the tests we received from command line arguments with the
 * tests that failed snapshot tests. The failed snapshot tests carry
 * additional information that we want to keep, so they should override
 * any matching test from the args.
 */
function mergeTests(testFromArgs: IntegTestConfig[], failedSnapshotTests: IntegTestWorkerConfig[]): IntegTestWorkerConfig[] {
const failedTestNames = new Set(failedSnapshotTests.map(test => test.fileName));
const final: IntegTestWorkerConfig[] = failedSnapshotTests;
final.push(...testFromArgs.filter(test => !failedTestNames.has(test.fileName)));
return final;
}
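For illustration, the merge behavior of `mergeTests` can be exercised with simplified config objects. The real `IntegTestConfig`/`IntegTestWorkerConfig` types carry more fields; this standalone sketch keeps only `fileName` and a stand-in for the destructive-change info.

```typescript
interface TestConfig {
  fileName: string;
  destructiveChanges?: string[];
}

// Same merge logic as above: failed snapshot results come first because
// they carry destructive-change information we want to keep; tests that
// only appear in the args follow, deduplicated by file name.
function mergeTestConfigs(fromArgs: TestConfig[], failed: TestConfig[]): TestConfig[] {
  const failedNames = new Set(failed.map(t => t.fileName));
  return [...failed, ...fromArgs.filter(t => !failedNames.has(t.fileName))];
}
```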

export function cli() {
main().then().catch(err => {
logger.error(err);
@@ -2,6 +2,19 @@ import * as path from 'path';
import { AssemblyManifest, Manifest, ArtifactType, AwsCloudFormationStackProperties, ArtifactManifest, MetadataEntry } from '@aws-cdk/cloud-assembly-schema';
import * as fs from 'fs-extra';

/**
* Trace information for stack
* map of resource logicalId to trace message
*/
export type StackTrace = Map<string, string>;

/**
 * Trace information for an assembly
*
* map of stackId to StackTrace
*/
export type ManifestTrace = Map<string, StackTrace>;
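A small usage example for the two trace types above; the stack id, logical id, and message are made up for illustration.

```typescript
type StackTrace = Map<string, string>;
type ManifestTrace = Map<string, StackTrace>;

// Map a resource logicalId to its trace message...
const stackTrace: StackTrace = new Map([
  ['MyLambdaServiceRole4539ECB6', 'WILL_DESTROY'],
]);
// ...and a stackId to that stack's trace.
const manifestTrace: ManifestTrace = new Map([
  ['aws-cdk-lambda-1', stackTrace],
]);
```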

/**
* Reads a Cloud Assembly manifest
*/
@@ -64,6 +77,17 @@ export class AssemblyManifestReader {
return stacks;
}

/**
* Write trace data to the assembly manifest metadata
*/
public recordTrace(trace: ManifestTrace): void {
const newManifest = {
...this.manifest,
artifacts: this.renderArtifacts(trace),
};
Manifest.saveAssemblyManifest(newManifest, this.manifestFileName);
}

/**
 * Clean the manifest of any unnecessary data. Currently that includes
 * the metadata trace information since this includes trace information like
@@ -77,26 +101,51 @@
Manifest.saveAssemblyManifest(newManifest, this.manifestFileName);
}

private renderArtifactMetadata(metadata: { [id: string]: MetadataEntry[] } | undefined): { [id: string]: MetadataEntry[] } {
private renderArtifactMetadata(artifact: ArtifactManifest, trace?: StackTrace): { [id: string]: MetadataEntry[] } | undefined {
const newMetadata: { [id: string]: MetadataEntry[] } = {};
for (const [metadataId, metadataEntry] of Object.entries(metadata ?? {})) {
if (!artifact.metadata) return artifact.metadata;
for (const [metadataId, metadataEntry] of Object.entries(artifact.metadata ?? {})) {
newMetadata[metadataId] = metadataEntry.map((meta: MetadataEntry) => {
if (meta.type === 'aws:cdk:logicalId' && trace && meta.data) {
const traceData = trace.get(meta.data.toString());
if (traceData) {
trace.delete(meta.data.toString());
return {
type: meta.type,
data: meta.data,
trace: [traceData],
};
}
}
// return metadata without the trace data
return {
type: meta.type,
data: meta.data,
};
});
}
if (trace && trace.size > 0) {
for (const [id, data] of trace.entries()) {
newMetadata[id] = [{
type: 'aws:cdk:logicalId',
data: id,
trace: [data],
}];
}
}
return newMetadata;
}

private renderArtifacts(): { [id: string]: ArtifactManifest } | undefined {
private renderArtifacts(trace?: ManifestTrace): { [id: string]: ArtifactManifest } | undefined {
const newArtifacts: { [id: string]: ArtifactManifest } = {};
for (const [artifactId, artifact] of Object.entries(this.manifest.artifacts ?? {})) {
let stackTrace: StackTrace | undefined = undefined;
if (artifact.type === ArtifactType.AWS_CLOUDFORMATION_STACK && trace) {
stackTrace = trace.get(artifactId);
}
newArtifacts[artifactId] = {
...artifact,
metadata: this.renderArtifactMetadata(artifact.metadata),
metadata: this.renderArtifactMetadata(artifact, stackTrace),
};
}
return newArtifacts;
28 changes: 28 additions & 0 deletions packages/@aws-cdk/integ-runner/lib/runner/private/utils.ts
@@ -0,0 +1,28 @@
// Helper functions for CDK Exec
import { spawnSync } from 'child_process';

/**
* Our own execute function which doesn't use shells and strings.
*/
export function exec(commandLine: string[], options: { cwd?: string, verbose?: boolean, env?: any } = { }): any {
const proc = spawnSync(commandLine[0], commandLine.slice(1), {
stdio: ['ignore', 'pipe', options.verbose ? 'inherit' : 'pipe'], // inherit STDERR in verbose mode
env: {
...process.env,
...options.env,
},
cwd: options.cwd,
});

if (proc.error) { throw proc.error; }
if (proc.status !== 0) {
    if (proc.stderr) { // will be 'null' in verbose mode
      process.stderr.write(proc.stderr);
    }
throw new Error(`Command exited with ${proc.status ? `status ${proc.status}` : `signal ${proc.signal}`}`);
}

const output = proc.stdout.toString('utf-8').trim();

return output;
}
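For reference, here is how the `exec` helper above might be used. The function is re-declared (slightly condensed, without the stderr passthrough) so the snippet is self-contained, and `node --version` is just an illustrative child process.

```typescript
import { spawnSync } from 'child_process';

// Condensed copy of the exec helper above: run a command without a shell
// and return its trimmed stdout, throwing on a non-zero exit.
function exec(commandLine: string[], options: { cwd?: string; verbose?: boolean; env?: any } = {}): string {
  const proc = spawnSync(commandLine[0], commandLine.slice(1), {
    stdio: ['ignore', 'pipe', options.verbose ? 'inherit' : 'pipe'],
    env: { ...process.env, ...options.env },
    cwd: options.cwd,
  });
  if (proc.error) { throw proc.error; }
  if (proc.status !== 0) {
    throw new Error(`Command exited with ${proc.status ? `status ${proc.status}` : `signal ${proc.signal}`}`);
  }
  return proc.stdout.toString('utf-8').trim();
}

// Example: capture the output of a child process.
const version = exec(['node', '--version']);
```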
