Merge branch 'master' into apigw-sfn-method
mergify[bot] committed Mar 21, 2022
2 parents 37e694d + 01b538e commit 05c2b67
Showing 11 changed files with 402 additions and 17 deletions.
9 changes: 7 additions & 2 deletions .github/PULL_REQUEST_TEMPLATE.md
@@ -3,10 +3,15 @@

### All Submissions:

* [ ] Have you followed the guidelines in our [Contributing guide?](../CONTRIBUTING.md)
* [ ] Have you followed the guidelines in our [Contributing guide?](https://github.com/aws/aws-cdk/blob/master/CONTRIBUTING.md)

### Adding new Unconventional Dependencies:

* [ ] This PR adds new unconventional dependencies following the process described [here](../CONTRIBUTING.md/#adding-new-unconventional-dependencies)
* [ ] This PR adds new unconventional dependencies following the process described [here](https://github.com/aws/aws-cdk/blob/master/CONTRIBUTING.md/#adding-new-unconventional-dependencies)

### New Features

* [ ] Have you added the new feature to an [integration test](https://github.com/aws/aws-cdk/blob/master/INTEGRATION_TESTS.md)?
* [ ] Did you use `cdk-integ` to deploy the infrastructure and generate the snapshot (i.e. `cdk-integ` without `--dry-run`)?

*By submitting this pull request, I confirm that my contribution is made under the terms of the Apache-2.0 license*
2 changes: 1 addition & 1 deletion .github/workflows/yarn-upgrade.yml
@@ -27,7 +27,7 @@ jobs:
run: echo "::set-output name=dir::$(yarn cache dir)"

- name: Restore Yarn cache
uses: actions/cache@v2.1.7
uses: actions/cache@v3
with:
path: ${{ steps.yarn-cache.outputs.dir }}
key: ${{ runner.os }}-yarn-${{ hashFiles('**/yarn.lock') }}
24 changes: 21 additions & 3 deletions CONTRIBUTING.md
@@ -234,9 +234,17 @@ Integration tests perform a few functions in the CDK code base -
3. (Optionally) Acts as a way to validate that constructs set up the CloudFormation resources as expected. A successful
CloudFormation deployment does not mean that the resources are set up correctly.

If you are working on a new feature that is using previously unused CloudFormation resource types, or involves
configuring resource types across services, you need to write integration tests that use these resource types or
features.
**When are integration tests required?**

The following list contains common scenarios where we _know_ that integration tests are required.
This is not an exhaustive list and we will, by default, require integration tests for all
new features unless there is a good reason why one is not needed.

1. Adding a new feature that is using previously unused CloudFormation resource types
2. Adding a new feature that is using previously unused (or untested) CloudFormation properties
3. Adding a new feature that involves configuring resource types across services (i.e. integrations)
4. Adding a new supported version (e.g. a new [AuroraMysqlEngineVersion](https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.aws_rds.AuroraMysqlEngineVersion.html))
5. Adding any functionality via a [Custom Resource](https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.custom_resources-readme.html)

To the extent possible, include a section (like below) in the integration test file that specifies how the successfully
deployed stack can be verified for correctness. Correctness here implies that the resources have been set up correctly.
@@ -254,6 +262,16 @@ Examples:
* [integ.destinations.ts](https://github.com/aws/aws-cdk/blob/master/packages/%40aws-cdk/aws-lambda-destinations/test/integ.destinations.ts#L7)
* [integ.token-authorizer.lit.ts](https://github.com/aws/aws-cdk/blob/master/packages/%40aws-cdk/aws-apigateway/test/authorizers/integ.token-authorizer.lit.ts#L7-L12)
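
In practice, such a verification section is a comment block at the top of the integration test file. A rough sketch of what it might look like (the commands shown here are hypothetical placeholders, to be replaced with ones relevant to your stack):

```ts
/*
 * Stack verification steps:
 * * `aws lambda invoke --function-name <function name from the stack output> out.json`
 *   -- should return a 200 status code and write the expected payload to out.json
 * * `aws stepfunctions start-execution --state-machine-arn <ARN from the stack output>`
 *   -- the started execution should eventually reach the SUCCEEDED state
 */
```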

**What to do if you cannot run integration tests**

If you are working on a PR that requires an update to an integration test and you are unable
to run the `cdk-integ` tool to perform a real deployment, please call this out on the pull request
so a maintainer can run the tests for you. Please **do not** run the `cdk-integ` tool with `--dry-run`
or manually update the snapshot.

See the [integration test guide](./INTEGRATION_TESTS.md) for a more complete guide on running
CDK integration tests.

#### yarn watch (Optional)

We've added a watch feature to the CDK that builds your code as you type it. Start this by running `yarn watch` for
225 changes: 225 additions & 0 deletions INTEGRATION_TESTS.md
@@ -0,0 +1,225 @@
# Integration Tests

This document describes the purpose of integration tests and acts as a guide
to which types of changes require integration tests and how you should write them.

- [What are CDK Integration Tests](#what-are-cdk-integration-tests)
- [When are integration tests required](#when-are-integration-tests-required)
- [How to write Integration Tests](#how-to-write-integration-tests)
- [Creating a test](#creating-a-test)
- [New L2 Constructs](#new-l2-constructs)
- [Existing L2 Constructs](#existing-l2-constructs)
- [Assertions](#assertions)

## What are CDK Integration Tests

All Construct libraries in the CDK code base have integration tests that serve to -

1. Acts as a regression detector. It does this by running `cdk synth` on the integration test and comparing it against
the `*.expected.json` file. This highlights how a change affects the synthesized stacks.
2. Allows for a way to verify if the stacks are still valid CloudFormation templates, as part of an intrusive change.
This is done by running `yarn integ` which will run `cdk deploy` across all of the integration tests in that package.
If you are developing a new integration test, or for some other reason want to work on a single integration test
over and over again without running through all the integration tests, you can do so using
`yarn integ integ.test-name.js`. Remember to set up AWS credentials before doing this.
3. (Optionally) Acts as a way to validate that constructs set up the CloudFormation resources as expected.
A successful CloudFormation deployment does not mean that the resources are set up correctly.


## When are Integration Tests Required

The following list contains common scenarios where we _know_ that integration tests are required.
This is not an exhaustive list and we will, by default, require integration tests for all
new features unless there is a good reason why one is not needed.

**1. Adding a new feature that is using previously unused CloudFormation resource types**
For example, adding a new L2 construct for an L1 resource. There should be a new integration test
that verifies the new L2 successfully creates the resources in AWS.

**2. Adding a new feature that is using previously unused (or untested) CloudFormation properties**
For example, there is an existing L2 construct for a CloudFormation resource and you are adding
support for a new property. This could be either a new property that has been added to CloudFormation
or an existing property that the CDK did not have coverage for. You should either update an existing
integration test to cover this new property or create a new test.

Sometimes the CloudFormation documentation is incorrect or unclear on the correct way to configure
a property. This can lead to introducing new features that don't actually work. Creating
an integration test for the new feature can ensure that it works and avoid unnecessary bugs.

**3. Adding a new feature that involves configuring resource types across services (i.e. integrations)**
For example, you are adding functionality that allows for service x to integrate with service y.
A good example of this is the [aws-stepfunctions-tasks](./packages/@aws-cdk/aws-stepfunctions-tasks) or
[aws-apigatewayv2-integrations](./packages/@aws-cdk/aws-apigatewayv2-integrations) modules. Both of these
have L2 constructs that provide functionality to integrate services.

Sometimes these integrations involve configuring/formatting JSON/VTL or some other type of data.
For these types of features, it is important to create an integration test that not only validates
that the infrastructure deploys successfully, but that the intended functionality works. This could
mean deploying the integration test and then manually making an HTTP request or invoking a Lambda function.
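
As a rough sketch (the construct and stack names here are illustrative, not taken from an existing test), an integration test for a cross-service feature might wire the two services together, with functional verification done after deployment:

```ts
import * as lambda from '@aws-cdk/aws-lambda';
import * as sfn from '@aws-cdk/aws-stepfunctions';
import * as tasks from '@aws-cdk/aws-stepfunctions-tasks';
import * as cdk from '@aws-cdk/core';

const app = new cdk.App();
const stack = new cdk.Stack(app, 'integ-sfn-lambda-integration');

const handler = new lambda.Function(stack, 'Handler', {
  code: lambda.Code.fromInline('exports.handler = async () => "ok";'),
  handler: 'index.handler',
  runtime: lambda.Runtime.NODEJS_14_X,
});

// The integration under test: a Step Functions task that invokes the Lambda function.
new sfn.StateMachine(stack, 'StateMachine', {
  definition: new tasks.LambdaInvoke(stack, 'InvokeHandler', {
    lambdaFunction: handler,
  }),
});

app.synth();
```

Deploying this only proves the template is valid; starting an execution afterwards is what confirms the integration actually works.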

**4. Adding a new supported version (e.g. a new [AuroraMysqlEngineVersion](https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.aws_rds.AuroraMysqlEngineVersion.html))**
Sometimes new versions introduce new CloudFormation properties or new required configuration.
For example, Aurora MySQL version 8 introduced a new parameter and was not compatible with the
existing parameter (see [#19145](https://github.com/aws/aws-cdk/pull/19145)).
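
An integration test covering a newly added engine version might look roughly like the following sketch (the specific version member and cluster configuration are assumptions for illustration only):

```ts
import * as ec2 from '@aws-cdk/aws-ec2';
import * as rds from '@aws-cdk/aws-rds';
import * as cdk from '@aws-cdk/core';

const app = new cdk.App();
const stack = new cdk.Stack(app, 'integ-aurora-mysql-new-version');
const vpc = new ec2.Vpc(stack, 'Vpc', { maxAzs: 2 });

new rds.DatabaseCluster(stack, 'Cluster', {
  // VER_3_01_0 (Aurora MySQL 8.0) is only an illustrative choice here; use the
  // version member that your change actually introduces.
  engine: rds.DatabaseClusterEngine.auroraMysql({
    version: rds.AuroraMysqlEngineVersion.VER_3_01_0,
  }),
  instanceProps: { vpc },
});

app.synth();
```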

**5. Adding any functionality via a [Custom Resource](https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.custom_resources-readme.html)**
Custom resources involve non-standard functionality and are at a higher risk of introducing bugs.

## How to write Integration Tests

This section will detail how to write integration tests, how they are executed and how to ensure
you have good test coverage.

### Creating a Test

An integration test is any file located in the `test/` directory whose name starts with `integ.`
(e.g. `integ.*.ts`).

To create a new integration test, first create a new file, for example `integ.my-new-construct.ts`.
The contents of this file should be a CDK app. For example, a very simple integration test for a
Lambda Function would look like this:

_integ.lambda.ts_
```ts
import * as cdk from '@aws-cdk/core';
import * as lambda from '../lib';

const app = new cdk.App();

const stack = new cdk.Stack(app, 'aws-cdk-lambda-1');

new lambda.Function(stack, 'MyLambda', {
  code: new lambda.InlineCode('foo'),
  handler: 'index.handler',
  runtime: lambda.Runtime.NODEJS_10_X,
});

app.synth();
```

To run the test you would run:

*Note - filename must be `*.js`*
```
npm run cdk-integ integ.lambda.js
```

This will:
1. Synthesize the CDK app
2. `cdk deploy` to your AWS account
3. `cdk destroy` to delete the stack
4. Save a snapshot of the synthesized CloudFormation template to `integ.lambda.expected.json`

Now when you run `npm test` it will synth the integ app and compare the result with the snapshot.
If the snapshot has changed, the same process must be followed to update the snapshot.

### New L2 Constructs

When creating a new L2 construct (or new construct library) it is important to ensure you have a good
coverage base that future contributions can build on.

Some general rules to follow are:

- **1 test with all default values**
One test for each L2 that only populates the required properties. For a Lambda Function this would look like:

```ts
new lambda.Function(this, 'Handler', {
  code,
  handler,
  runtime,
});
```

- **1 test with all values provided**
One test for each L2 that populates non-default properties. Some of this will come down to judgement, but this should
be based on major functionality. For example, when testing a Lambda Function there are 37 (at the time of this writing) different
input parameters. Some of these can be tested together and don't represent large pieces of functionality,
while others do.

For example, the test for a Lambda Function might look like this. For most of these properties we are probably fine
testing them together and just testing one of their values. For example, we don't gain much by testing a bunch of
different `memorySize` settings; as long as we test that we can set `memorySize`, we should be good.

```ts
new lambda.Function(this, 'Handler', {
  code,
  handler,
  runtime,
  architecture,
  description,
  environment,
  environmentEncryption,
  functionName,
  initialPolicy,
  insightsVersion,
  layers,
  maxEventAge,
  memorySize,
  reservedConcurrentExecutions,
  retryAttempts,
  role,
  timeout,
  tracing,
});
```

Other parameters might represent larger pieces of functionality and might create other resources for us or configure
integrations with other services. For these it might make sense to split them out into separate tests so it is easier
to reason about them.

A couple of examples would be
(you could also mix in different configurations of the above parameters with each of these):

_testing filesystems_
```ts
new lambda.Function(this, 'Handler', {
  filesystem,
});
```

_testing event sources_
```ts
new lambda.Function(this, 'Handler', {
  events,
});
```

_testing VPCs_
```ts
new lambda.Function(this, 'Handler', {
  securityGroups,
  vpc,
  vpcSubnets,
});
```

### Existing L2 Constructs

Updating an existing L2 Construct could consist of:

1. **Adding coverage for a new (or previously uncovered) CloudFormation property.**
In this case you would want to either add this new property to an existing integration test or create a new
integration test (see the sketch at the end of this section). A new integration test is preferred for larger updates (e.g. adding VPC connectivity).

2. **Updating functionality for an existing property.**
In this case you should first check if you are already covered by an existing integration test. If not, then you would follow the
same process as adding new coverage.

3. **Changing functionality that affects asset bundling**
Some constructs deal with asset bundling (e.g. `aws-lambda-nodejs`, `aws-lambda-python`). There are some updates that may not
touch any CloudFormation property, but instead change the way that code is bundled. While these types of changes may not require
a change to an integration test, you need to make sure that the integration tests and assertions are rerun.

An example of this would be making a change to the way `aws-lambda-nodejs` bundles Lambda code. A couple of things could go wrong that would
only be caught by rerunning the integration tests.

1. The bundling commands only run when performing a real synth (not as part of unit tests). Running the integration test confirms
that the actual bundling was not broken.
2. When deploying Lambda Functions, CloudFormation will only update the Function configuration with the new code,
but it will not validate that the Lambda function can be invoked. Because of this, it is important to rerun the integration test
to deploy the Lambda Function _and_ then rerun the assertions to ensure that the function can still be invoked.
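
As referenced in item 1 above, a sketch of an integ app extended to cover a newly supported property might look like the following (the `architecture` property and the names used here are only illustrative choices, not taken from an existing test):

```ts
import * as cdk from '@aws-cdk/core';
import * as lambda from '../lib';

const app = new cdk.App();
const stack = new cdk.Stack(app, 'aws-cdk-lambda-arch');

new lambda.Function(stack, 'MyLambda', {
  code: new lambda.InlineCode('foo'),
  handler: 'index.handler',
  runtime: lambda.Runtime.NODEJS_14_X,
  architecture: lambda.Architecture.ARM_64, // the newly covered property
});

app.synth();
```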

### Assertions
...Coming soon...
1 change: 1 addition & 0 deletions packages/@aws-cdk/aws-iot/README.md
@@ -36,6 +36,7 @@ Import it into your code:

```ts nofixture
import * as iot from '@aws-cdk/aws-iot';
import * as actions from '@aws-cdk/aws-iot-actions-alpha';
```

## `TopicRule`
18 changes: 14 additions & 4 deletions packages/@aws-cdk/custom-resources/README.md
@@ -354,7 +354,7 @@ This sample demonstrates the following concepts:

### Customizing Provider Function name

In multi-account environments or when the custom resource may be re-utilized across several
In multi-account environments or when the custom resource may be re-utilized across several
stacks it may be useful to manually set a name for the Provider Function Lambda and therefore
have a predefined service token ARN.

@@ -401,9 +401,19 @@ the `installLatestAwsSdk` prop to `false`.
You must provide the `policy` property defining the IAM Policy that will be applied to the API calls.
The library provides two factory methods to quickly configure this:

* **`AwsCustomResourcePolicy.fromSdkCalls`** - Use this to auto-generate IAM Policy statements based on the configured SDK calls.
Note that you will have to either provide specific ARN's, or explicitly use `AwsCustomResourcePolicy.ANY_RESOURCE` to allow access to any resource.
* **`AwsCustomResourcePolicy.fromStatements`** - Use this to specify your own custom statements.
* **`AwsCustomResourcePolicy.fromSdkCalls`** - Use this to auto-generate IAM
Policy statements based on the configured SDK calls. Keep two things in mind
when using this policy:
  * This policy variant assumes the IAM action name is the same as the API
    call name. This is true in 99% of cases, but there are exceptions (for
    example, S3's `PutBucketLifecycleConfiguration` requires
    `s3:PutLifecycleConfiguration` permissions, and Lambda's `Invoke` requires
    `lambda:InvokeFunction` permissions). Use `fromStatements` if you want to
    make a call that requires different IAM action names.
  * You will have to either provide specific ARNs, or explicitly use
    `AwsCustomResourcePolicy.ANY_RESOURCE` to allow access to any resource.
* **`AwsCustomResourcePolicy.fromStatements`** - Use this to specify your own
custom statements.
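
For example, a sketch of using `fromStatements` for a call whose IAM action name differs from the SDK call name (the function name below is a placeholder):

```ts
import * as cdk from '@aws-cdk/core';
import * as iam from '@aws-cdk/aws-iam';
import * as cr from '@aws-cdk/custom-resources';

const app = new cdk.App();
const stack = new cdk.Stack(app, 'CrPolicyExample');

// `Lambda.invoke` maps to the IAM action `lambda:InvokeFunction`, not
// `lambda:Invoke`, so the policy auto-generated by `fromSdkCalls` would not
// match. Build the statement explicitly with `fromStatements` instead.
new cr.AwsCustomResource(stack, 'InvokeMyFunction', {
  onCreate: {
    service: 'Lambda',
    action: 'invoke',
    parameters: { FunctionName: 'my-function' }, // placeholder function name
    physicalResourceId: cr.PhysicalResourceId.of('InvokeMyFunction'),
  },
  policy: cr.AwsCustomResourcePolicy.fromStatements([
    new iam.PolicyStatement({
      actions: ['lambda:InvokeFunction'],
      resources: ['*'], // scope down to the target function ARN in real code
    }),
  ]),
});
```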

The custom resource also implements `iam.IGrantable`, making it possible to use the `grantXxx()` methods.

@@ -199,6 +199,13 @@ export class AwsCustomResourcePolicy {
*
* Each SDK call will be translated to an IAM Policy Statement in the form of: `call.service:call.action` (e.g. `s3:PutObject`).
*
* This policy generator assumes the IAM action name is the same as the API
* call name. This is true in 99% of cases, but there are exceptions (for example,
* S3's `PutBucketLifecycleConfiguration` requires
* `s3:PutLifecycleConfiguration` permissions, and Lambda's `Invoke` requires
* `lambda:InvokeFunction` permissions). Use `fromStatements` if you want to
* make a call that requires different IAM action names.
*
* @param options options for the policy generation
*/
public static fromSdkCalls(options: SdkCallsPolicyOptions) {
@@ -1 +1 @@
awscli==1.22.73
awscli==1.22.77
14 changes: 9 additions & 5 deletions packages/aws-cdk/test/integ/helpers/test-helpers.ts
@@ -15,14 +15,18 @@ export function integTest(
timeoutMillis?: number,
) {

// Integ tests can run concurrently, and are responsible for blocking themselves if they cannot.
// Because `test.concurrent` executes the test code immediately (to obtain a promise), we allow
// setting the `JEST_TEST_CONCURRENT` environment variable to 'false' in order to use `test`
// instead of `test.concurrent` (this is necessary when specifying a test pattern to verify).
const testKind = process.env.JEST_TEST_CONCURRENT === 'false' ? test : test.concurrent;
// Integ tests can run concurrently, and are responsible for blocking
// themselves if they cannot. Because `test.concurrent` executes the test
// code immediately, regardless of any `--testNamePattern`, this cannot be the
// default: test filtering simply does not work with `test.concurrent`.
// Instead, we make it opt-in only for the pipeline where we don't do any
// selection, but execute all tests unconditionally.
const testKind = process.env.JEST_TEST_CONCURRENT === 'true' ? test.concurrent : test;
const runner = shouldSkip(name) ? testKind.skip : testKind;

runner(name, async () => {
// eslint-disable-next-line no-console
console.log(`running test ${name} using ${runner.name}`);
const output = new MemoryStream();

output.write('================================================================\n');
