Add docs for writing test cases #791

Merged on Feb 16, 2024 (25 commits)
180 changes: 180 additions & 0 deletions docs/authoring_test_cases.md
@@ -0,0 +1,180 @@
# Authoring test cases

Test cases for the conformance runner are configured in YAML files and are located [here][testsuites]. Each file
represents a single suite of tests.

Authoring new test cases involves either a new file, if a new suite is warranted, or an additional test case in an
existing file/suite.

For the Protobuf definitions for the conformance runner, see the [connectrpc repository][connectrpc-repo] in the BSR.

## Test suites

A suite represents a series of tests that all exercise the same general functionality (cancellation, timeouts, etc.). When
defining a test suite in a YAML file, the only required values are the suite name (which should be unique across all suites)
and at least one test case.

Each suite can be configured with various directives which will apply to all tests within that suite. These directives
can be used to constrain the tests run by the conformance runner or to signal various configurations to the runner that
may be needed to execute the tests. The runner will use these directives to expand the tests in the suite into multiple
permutations. This means that a single test defined in a suite will be run several times across various permutations.

The below directives are used to constrain tests within a suite:

* `mode` can be used to specify a suite as applying only to a specific mode (i.e. client or server). For example,
if you are writing a suite to target only clients under test, you would specify `mode: TEST_MODE_CLIENT` and the
tests in the suite will be run only when the `mode` specified to the runner on the command line is `client`. If not
specified in a suite, tests are run regardless of the command line `mode`.

* `relevantProtocols` is used to limit tests only to a specific protocol, such as **Connect**, **gRPC**, or **gRPC-web**. If not
specified, tests are run for all protocols.

* `relevantHttpVersions` is used to limit tests to certain HTTP versions, such as **HTTP 1.1**, **HTTP/2**, or **HTTP/3**. If not
specified, tests are run for all HTTP versions.

* `relevantCodecs` is used to limit tests to certain codec formats, such as **JSON** or **binary**. If not specified, tests are
run for all codecs.

* `relevantCompressions` is used to limit tests to certain compression algorithms, such as **gzip**, **brotli**, or **snappy**. If not
specified, tests are run for all compressions.

* `connectVersionMode` allows you to either require or ignore validation of the Connect version header or query param. This
should be left unspecified if the suite is agnostic to this validation behavior.
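Putting these constraint directives together, a suite limited to servers speaking gRPC over HTTP/2 with the binary codec
might start like the following sketch. The suite name is illustrative, and enum value names such as `HTTP_VERSION_2`,
`PROTOCOL_GRPC`, and `CODEC_PROTO` are assumptions inferred from the naming pattern in the example at the end of this
document; check the [suite Protobuf definitions][suite-proto] for the exact values.

```yaml
# Hypothetical suite header showing only constraint directives.
name: Example gRPC Server Suite
mode: TEST_MODE_SERVER          # only run against servers under test
relevantProtocols:
  - PROTOCOL_GRPC               # skip Connect and gRPC-web
relevantHttpVersions:
  - HTTP_VERSION_2              # assumed enum name; gRPC requires HTTP/2
relevantCodecs:
  - CODEC_PROTO                 # assumed enum name for the binary codec
```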

The below `reliesOn` directives are used to signal to the test runner how the reference client or server should be
configured when running tests:

* `reliesOnTls` specifies that a suite relies on TLS. Defaults to `false`.

* `reliesOnTlsClientCerts` specifies that the suite relies on the _client_ using TLS certificates to authenticate with
the server. Note that if this is set to `true`, `reliesOnTls` must also be `true`. Defaults to `false`.

* `reliesOnConnectGet` specifies that the suite relies on the Connect GET protocol. Defaults to `false`.

* `reliesOnMessageReceiveLimit` specifies that the suite relies on support for limiting the size of received messages.
When `true`, the `mode` property must be set to indicate whether the client or server should support the limit. Defaults
to `false`.
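For example, a suite exercising client certificates could declare the following sketch; note that
`reliesOnTlsClientCerts: true` also requires `reliesOnTls: true`, as described above.

```yaml
# Hypothetical suite header showing reliesOn directives.
name: Example TLS Client Certs Suite
reliesOnTls: true             # reference server must serve TLS
reliesOnTlsClientCerts: true  # reference server must verify a client cert
```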

## Test cases

Test cases are specified in the `testCases` property of the suite. Each test case starts with the `request` property
which defines the request which will be sent to a client during the test run. Each `request` must specify the following
fields:

* `testName` - The name of the test. For naming conventions, see [below](#naming-conventions).
* `service` - The fully-qualified path of the service this test will interact with, typically the `ConformanceService`.
* `method` - The method on `service` that will be called, specified as a string.
* `streamType` - One of `STREAM_TYPE_UNARY`, `STREAM_TYPE_CLIENT_STREAM`, `STREAM_TYPE_SERVER_STREAM`,
`STREAM_TYPE_HALF_DUPLEX_BIDI_STREAM`, or `STREAM_TYPE_FULL_DUPLEX_BIDI_STREAM`.
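A minimal test case containing only the required fields therefore looks like the sketch below (the test name is
illustrative; the service and method come from the example at the end of this document):

```yaml
testCases:
  - request:
      testName: unary/minimal-example
      service: connectrpc.conformance.v1.ConformanceService
      method: Unary
      streamType: STREAM_TYPE_UNARY
```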

Once the above are specified, you can then define your request. For a full list of fields to specify in the request,
see the [`ClientCompatRequest`][client-compat-proto] message in the Conformance Protobuf definitions.

> [!IMPORTANT]
> The `ClientCompatRequest` message contains some fields that should _not_ be specified in test cases.
> These fields include:
> * Fields 1 through 8 in the message definition. These fields are automatically populated by the test runner.
> If a test is specific to one of these values, it should instead be indicated in the directives for the test suite
> itself.
> * Field 20 (`raw_request`). This field is only used by the reference client for sending anomalous requests.

## Naming conventions

Test suites and the tests within them follow a loose naming convention.

### Test files (Suites)

Test files should be named according to the suite inside and the general functionality being tested. In addition:

* If a suite applies only to a certain protocol, the file name should be prefixed with that protocol. If the suite applies
to all protocols, this can be omitted.
* If a suite only contains client or server tests, the file name prefix should include `client` or `server`. If the suite is
for both client and server, this can be omitted.

For example:
* `connect_with_get.yaml` contains tests that test GET requests for the Connect protocol only.
* `grpc_web_client.yaml` contains client tests for the gRPC-web protocol.
* `client_message_size.yaml` contains a suite of client-only tests across all protocols.

### Tests

Test names should be hyphen-delimited. If a suite contains tests of multiple stream types, the test name should be
prefixed with the stream type and a forward slash (`/`).

> [!NOTE]
> In the case of Bidi tests with separate cases for half-duplex operation vs. full-duplex
> operation, you should also add `full-duplex` or `half-duplex` to the test name.

For example:

* `unary/error`
* `server-stream/error-with-responses`
* `bidi-stream/full-duplex/stream-error-returns-success-http-code`

If a suite contains tests for just a single stream type, the stream type can be omitted from the test name.

These conventions allow for more granular control over running tests via the conformance runner, such as only running tests
for a specific protocol or only running the unary tests within a suite.

## Expected responses

The expected response for a test is auto-generated based on the request details. The conformance runner will determine
what the response should be according to the values specified in the test suite and individual test cases.

You can also explicitly specify your own expected response directly in the test definition itself. However, this is
typically only needed for exceptional test cases. If the expected response would mostly re-state the response definition
that appears in the request, rely on the auto-generation if possible; specifying an expected response by hand can make
the test YAML overly verbose and harder to read, write, and maintain.

If a test induces behavior that prevents the server from sending, or the client from receiving, the full response
definition, it will be necessary to define the expected response explicitly. Timeouts, cancellations, and exceeding
message size limits are good examples of this.

If you do need to specify an explicit response, define an `expectedResponse` block for your test case; it will
override the auto-generated expected response in the test runner.

To see tests denoting an explicit response, search the [test suites][testsuites] directory for the word `expectedResponse`.
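As a rough sketch, a timeout test might pin its expected response to a specific error code, as below. The `timeoutMs`
field and the `error`/`code` shape of `expectedResponse` are assumptions; consult the
[`ClientCompatRequest`][client-compat-proto] message and the surrounding Conformance Protobuf definitions for the
actual field names.

```yaml
- request:
    testName: unary/timeout
    service: connectrpc.conformance.v1.ConformanceService
    method: Unary
    streamType: STREAM_TYPE_UNARY
    timeoutMs: 200          # assumed field name for the RPC deadline
  # Override the auto-generated expectation: the call should time out.
  expectedResponse:
    error:
      code: CODE_DEADLINE_EXCEEDED   # assumed enum value
```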

## Example

Taking all of the above into account, here is an example test suite that verifies a server returns the specified headers
and trailers. The below test will only run when `mode` is `server` and will be limited to the Connect protocol using the
JSON format and identity compression. In addition, the suite requires that the server verify the Connect version header.

```yaml
name: Example Test Suite
# Constrain to only servers under test.
mode: TEST_MODE_SERVER
# Constrain these tests to only run over the Connect protocol.
relevantProtocols:
  - PROTOCOL_CONNECT
# Constrain these tests to only use 'identity' compression.
relevantCompressions:
  - COMPRESSION_IDENTITY
# Constrain these tests to only use the JSON codec.
relevantCodecs:
  - CODEC_JSON
# Require Connect version header verification.
connectVersionMode: CONNECT_VERSION_MODE_REQUIRE
testCases:
  - request:
      testName: returns-headers-and-trailers
      service: connectrpc.conformance.v1.ConformanceService
      method: Unary
      streamType: STREAM_TYPE_UNARY
      requestMessages:
        - "@type": type.googleapis.com/connectrpc.conformance.v1.UnaryRequest
          responseDefinition:
            responseHeaders:
              - name: x-custom-header
                value: ["foo"]
            responseData: "dGVzdCByZXNwb25zZQ=="
            responseTrailers:
              - name: x-custom-trailer
                value: ["bing"]
```

[testsuites]: https://github.com/connectrpc/conformance/tree/main/internal/app/connectconformance/testsuites/data
[suite-proto]: https://buf.build/connectrpc/conformance/file/main:connectrpc/conformance/v1/suite.proto
[client-compat-proto]: https://buf.build/connectrpc/conformance/file/main:connectrpc/conformance/v1/client_compat.proto
[connectrpc-repo]: https://buf.build/connectrpc/conformance
15 changes: 14 additions & 1 deletion internal/app/connectconformance/test_case_library.go
@@ -160,7 +160,7 @@ func (lib *testCaseLibrary) expandSuite(suite *conformancev1.TestSuite, configCa
 		if _, ok := configCases[cfgCase]; ok {
 			namePrefix := generateTestCasePrefix(suite, cfgCase)
 			if err := lib.expandCases(cfgCase, namePrefix, suite.TestCases); err != nil {
-				return err
+				return fmt.Errorf("failed to expand test cases for suite %s: %w", suite.Name, err)
 			}
 		}
 	}
@@ -174,6 +174,18 @@ func (lib *testCaseLibrary) expandSuite(suite *conformancev1.TestSuite, configCa

 func (lib *testCaseLibrary) expandCases(cfgCase configCase, namePrefix []string, testCases []*conformancev1.TestCase) error {
 	for _, testCase := range testCases {
+		if testCase.Request.TestName == "" {
> **Review comment (author):** Added a bit more validation so that we could explicitly state these fields were required. Notably, there was no validation at all that a test name be specified. If it was omitted, the tests ran fine. If the tests happened to fail, the output just didn't show a test name, which probably isn't good. For example:
>
>     FAILED: TLS Client Certs/HTTPVersion:3/Protocol:PROTOCOL_GRPC_WEB/Codec:CODEC_TEXT/Compression:COMPRESSION_ZSTD/TLS:true:
>             expecting 0 response messages but instead got 1
>
> Also for the other validation I added, the tests would run the whole suite but then fail with a cryptic error like:
>
>     FAILED: TLS Client Certs/HTTPVersion:3/Protocol:PROTOCOL_GRPC_WEB/Codec:CODEC_TEXT/Compression:COMPRESSION_ZSTD:
>             service name  is not a valid service
>
> I figured this validation was better because it catches it before any tests are actually run.

> **Review comment (member):** You could provide a little bit more context for these errors if you change the loop vars from `_, testCase` to `i, testCase`:
>
>     return fmt.Errorf("test case #%d: test case has no name", i+1)

+			return errors.New("suite contains a test with no name")
+		}
+		if testCase.Request.Service == "" {
+			return fmt.Errorf("test %s has no service specified", testCase.Request.TestName)
+		}
+		if testCase.Request.Method == "" {
+			return fmt.Errorf("test %s has no method specified", testCase.Request.TestName)
+		}
+		if testCase.Request.StreamType == conformancev1.StreamType_STREAM_TYPE_UNSPECIFIED {
+			return fmt.Errorf("test %s has no stream type specified", testCase.Request.TestName)
+		}
 		if testCase.Request.StreamType != cfgCase.StreamType {
 			continue
 		}
@@ -441,6 +453,7 @@ func populateExpectedResponse(testCase *conformancev1.TestCase) error {
 	// message in this situation, where the response data value is some fixed string (such as "no response definition")
 	// and whose request info will still be present, but we expect it to indicate zero request messages.
 	if len(testCase.Request.RequestMessages) == 0 {
+		testCase.ExpectedResponse = &conformancev1.ClientResponseResult{}
> **Review comment (author):** This was causing a panic if you omitted requestMessages from a test case. Since ExpectedResponse was nil here, once we returned, the assert function in results.go was blowing up on line ~170.

 		return nil
 	}

30 changes: 30 additions & 0 deletions internal/app/connectconformance/test_case_library_test.go
@@ -39,31 +39,47 @@ func TestNewTestCaseLibrary(t *testing.T) {
 testCases:
 - request:
     testName: basic-unary
+    service: connectrpc.conformance.v1.ConformanceService
+    method: Unary
     streamType: STREAM_TYPE_UNARY
 - request:
     testName: basic-client-stream
+    service: connectrpc.conformance.v1.ConformanceService
+    method: ClientStream
     streamType: STREAM_TYPE_CLIENT_STREAM
 - request:
     testName: basic-server-stream
+    service: connectrpc.conformance.v1.ConformanceService
+    method: ServerStream
     streamType: STREAM_TYPE_SERVER_STREAM
 - request:
     testName: basic-bidi-stream
+    service: connectrpc.conformance.v1.ConformanceService
+    method: BidStream
     streamType: STREAM_TYPE_FULL_DUPLEX_BIDI_STREAM`,
 "tls.yaml": `
 name: TLS
 reliesOnTls: true
 testCases:
 - request:
     testName: tls-unary
+    service: connectrpc.conformance.v1.ConformanceService
+    method: Unary
     streamType: STREAM_TYPE_UNARY
 - request:
     testName: tls-client-stream
+    service: connectrpc.conformance.v1.ConformanceService
+    method: ClientStream
     streamType: STREAM_TYPE_CLIENT_STREAM
 - request:
     testName: tls-server-stream
+    service: connectrpc.conformance.v1.ConformanceService
+    method: ServerStream
     streamType: STREAM_TYPE_SERVER_STREAM
 - request:
     testName: tls-bidi-stream
+    service: connectrpc.conformance.v1.ConformanceService
+    method: BidiStream
     streamType: STREAM_TYPE_FULL_DUPLEX_BIDI_STREAM`,
 "tls-client-certs.yaml": `
 name: TLS Client Certs
@@ -72,6 +88,8 @@ func TestNewTestCaseLibrary(t *testing.T) {
 testCases:
 - request:
     testName: tls-client-cert-unary
+    service: connectrpc.conformance.v1.ConformanceService
+    method: Unary
     streamType: STREAM_TYPE_UNARY`,
 "connect-get.yaml": `
 name: Connect GET
@@ -80,6 +98,8 @@ func TestNewTestCaseLibrary(t *testing.T) {
 testCases:
 - request:
     testName: connect-get-unary
+    service: connectrpc.conformance.v1.ConformanceService
+    method: Unary
     streamType: STREAM_TYPE_UNARY`,
 "connect-version-client-required.yaml": `
 name: Connect Version Required (client)
@@ -89,6 +109,8 @@ func TestNewTestCaseLibrary(t *testing.T) {
 testCases:
 - request:
     testName: unary-without-connect-version-header
+    service: connectrpc.conformance.v1.ConformanceService
+    method: Unary
     streamType: STREAM_TYPE_UNARY`,
 "connect-version-server-required.yaml": `
 name: Connect Version Required (server)
@@ -98,6 +120,8 @@ func TestNewTestCaseLibrary(t *testing.T) {
 testCases:
 - request:
     testName: unary-without-connect-version-header
+    service: connectrpc.conformance.v1.ConformanceService
+    method: Unary
     streamType: STREAM_TYPE_UNARY`,
 "connect-version-client-not-required.yaml": `
 name: Connect Version Optional (client)
@@ -107,6 +131,8 @@ func TestNewTestCaseLibrary(t *testing.T) {
 testCases:
 - request:
     testName: unary-without-connect-version-header
+    service: connectrpc.conformance.v1.ConformanceService
+    method: Unary
     streamType: STREAM_TYPE_UNARY`,
 "connect-version-server-not-required.yaml": `
 name: Connect Version Optional (server)
@@ -116,6 +142,8 @@ func TestNewTestCaseLibrary(t *testing.T) {
 testCases:
 - request:
     testName: unary-without-connect-version-header
+    service: connectrpc.conformance.v1.ConformanceService
+    method: Unary
     streamType: STREAM_TYPE_UNARY`,
 "max-receive-limit": `
 name: Max Receive Size (server)
@@ -124,6 +152,8 @@ func TestNewTestCaseLibrary(t *testing.T) {
 testCases:
 - request:
     testName: unary-exceeds-limit
+    service: connectrpc.conformance.v1.ConformanceService
+    method: Unary
     streamType: STREAM_TYPE_UNARY`,
 	}
 	testSuiteData := make(map[string][]byte, len(testData))