Merge branch 'main' into exception-xcontent-depth

tvernum committed Jan 10, 2024
2 parents c02af09 + 5457fc2 commit f9a67d1

Showing 1,676 changed files with 28,311 additions and 14,884 deletions.
41 changes: 41 additions & 0 deletions .buildkite/README.md
@@ -0,0 +1,41 @@
# Elasticsearch CI Pipelines

This directory contains pipeline definitions and scripts for running Elasticsearch CI on Buildkite.

## Directory Structure

- [pipelines](pipelines/) - pipeline definitions (YAML)
- [scripts](scripts/) - scripts invoked by pipeline steps
- [hooks](hooks/) - [Buildkite hooks](https://buildkite.com/docs/agent/v3/hooks), where global env vars and secrets are set

## Pipeline Definitions

Pipelines are defined using YAML files residing in [pipelines](pipelines/). These are mostly static definitions that are used as-is, but there are a few dynamically-generated exceptions (see below).

### Dynamically Generated Pipelines

Pull request pipelines are generated dynamically based on labels, files changed, and other properties of pull requests.

Non-pull-request pipelines that include BWC version matrices must also be regenerated whenever the [list of BWC versions](../.ci/bwcVersions) is updated.

#### Pull Request Pipelines

Pull request pipelines are generated dynamically at CI time based on numerous properties of the pull request. See [scripts/pull-request](scripts/pull-request) for details.

#### BWC Version Matrices

For pipelines that include BWC version matrices, you will see one or more template files (e.g. [periodic.template.yml](pipelines/periodic.template.yml)) and a corresponding generated file (e.g. [periodic.yml](pipelines/periodic.yml)). The generated file is the one that is actually used by Buildkite.

These files are updated by running:

```bash
./gradlew updateCIBwcVersions
```

This also runs automatically during release procedures.

You should always make changes to the template files and then run the above command to update the generated files.

## Node / TypeScript

Node (technically `bun`), TypeScript, and related files are currently used to generate pipelines for pull request CI. See [scripts/pull-request](scripts/pull-request) for details.
1 change: 0 additions & 1 deletion .buildkite/package.json
@@ -1,6 +1,5 @@
{
"name": "buildkite-pipelines",
"module": "index.ts",
"type": "module",
"devDependencies": {
"@types/node": "^20.6.0",
10 changes: 10 additions & 0 deletions .buildkite/pipelines/dra-workflow.yml
@@ -7,3 +7,13 @@ steps:
image: family/elasticsearch-ubuntu-2204
machineType: custom-32-98304
buildDirectory: /dev/shm/bk
- wait
# The hadoop build depends on the ES artifact
# So let's trigger the hadoop build any time we build a new staging artifact
- trigger: elasticsearch-hadoop-dra-workflow
async: true
build:
branch: "${BUILDKITE_BRANCH}"
env:
DRA_WORKFLOW: staging
if: build.env('DRA_WORKFLOW') == 'staging'
7 changes: 4 additions & 3 deletions .buildkite/pipelines/periodic.template.yml
@@ -73,6 +73,7 @@ steps:
- openjdk19
- openjdk20
- openjdk21
- openjdk22
GRADLE_TASK:
- checkPart1
- checkPart2
@@ -180,14 +181,14 @@ steps:
image: family/elasticsearch-ubuntu-2004
machineType: n2-standard-8
buildDirectory: /dev/shm/bk
if: build.branch == "main" || build.branch =~ /^[0-9]+\.[0-9]+\$/
- label: Check branch consistency
if: build.branch == "main" || build.branch == "7.17"
- label: check-branch-consistency
command: .ci/scripts/run-gradle.sh branchConsistency
timeout_in_minutes: 15
agents:
provider: gcp
image: family/elasticsearch-ubuntu-2004
machineType: n2-standard-2
- label: Check branch protection rules
- label: check-branch-protection-rules
command: .buildkite/scripts/branch-protection.sh
timeout_in_minutes: 5
7 changes: 4 additions & 3 deletions .buildkite/pipelines/periodic.yml
@@ -1194,6 +1194,7 @@ steps:
- openjdk19
- openjdk20
- openjdk21
- openjdk22
GRADLE_TASK:
- checkPart1
- checkPart2
@@ -1301,14 +1302,14 @@ steps:
image: family/elasticsearch-ubuntu-2004
machineType: n2-standard-8
buildDirectory: /dev/shm/bk
if: build.branch == "main" || build.branch =~ /^[0-9]+\.[0-9]+\$/
- label: Check branch consistency
if: build.branch == "main" || build.branch == "7.17"
- label: check-branch-consistency
command: .ci/scripts/run-gradle.sh branchConsistency
timeout_in_minutes: 15
agents:
provider: gcp
image: family/elasticsearch-ubuntu-2004
machineType: n2-standard-2
- label: Check branch protection rules
- label: check-branch-protection-rules
command: .buildkite/scripts/branch-protection.sh
timeout_in_minutes: 5
68 changes: 60 additions & 8 deletions .buildkite/scripts/pull-request/README.md
@@ -6,12 +6,7 @@ Each time a pull request build is triggered, such as via commit or comment, we u

The generator handles the following:

- `allow-labels` - only trigger a step if the PR has one of these labels
- `skip-labels` - don't trigger the step if the PR has one of these labels
- `excluded-regions` - don't trigger the step if **all** of the changes in the PR match these paths/regexes
- `included-regions` - trigger the step if **all** of the changes in the PR match these paths/regexes
- `trigger-phrase` - trigger this step, and ignore all other steps, if the build was triggered by a comment and that comment matches this regex
- Note that each step has an automatic phrase of `.*run\\W+elasticsearch-ci/<step-name>.*`
- Various configurations for filtering/activating steps based on labels, changed files, etc. See below.
- Replacing `$SNAPSHOT_BWC_VERSIONS` in pipelines with an array of versions from `.ci/snapshotBwcVersions`
- Duplicating any step with `bwc_template: true` for each BWC version in `.ci/bwcVersions`
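
For illustration, a rough sketch of how those two mechanisms could appear in a pipeline file (the step names, commands, and the way the version reaches each step are assumptions, not copied from a real pipeline):

```yaml
# Sketch only -- step bodies and version wiring are assumptions.
steps:
  # $SNAPSHOT_BWC_VERSIONS is replaced by the generator with the array of
  # versions from .ci/snapshotBwcVersions
  - label: bwc-snapshots
    command: .ci/scripts/run-gradle.sh bwcTestSnapshots
    matrix:
      setup:
        BWC_VERSION: $SNAPSHOT_BWC_VERSIONS
  # A step with bwc_template: true is duplicated once per version listed in
  # .ci/bwcVersions (assumed here to be exposed to the step as BWC_VERSION)
  - label: bwc
    command: .ci/scripts/run-gradle.sh bwcTest
    bwc_template: true
    env:
      BWC_VERSION: $BWC_VERSION
```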

@@ -21,18 +16,75 @@ The generator handles the following:

Pipelines are in [`.buildkite/pipelines`](../../pipelines/pull-request). They are automatically picked up and given a name based on their filename.


## Setup

- [Install bun](https://bun.sh/)
- `npm install -g bun` will work if you already have `npm`
- `cd .buildkite; bun install` to install dependencies

## Run tests
## Testing

Testing the pipeline generator is done mostly with snapshot tests. These generate pipeline objects from the pipeline configurations in `mocks/pipelines` and compare them to previously-generated snapshots in `__snapshots__` to confirm that they are correct.

The mock pipeline configurations should therefore aim to cover all of the generator's features (allow-labels, skip-labels, etc.).

Snapshots are generated and managed automatically whenever you create a new test that makes a snapshot assertion. They work much like Jest snapshots.

### Run tests

```bash
cd .buildkite
bun test
```

If you need to regenerate the snapshots, run `bun test --update-snapshots`.

## Pipeline Configuration

The `config:` property at the top of pipelines inside `.buildkite/pipelines/pull-request` is a custom property used by our pipeline generator. It is not used by Buildkite.

All of the pipelines in this directory are evaluated whenever CI for a pull request is started, and the steps are filtered and combined into one pipeline based on the properties in `config:` and the state of the pull request.

The various configurations available mirror what we were using in our Jenkins pipelines.
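
As a hedged illustration (the file name, step, and Gradle task below are invented; only the `config:` keys come from the properties documented in the next section), a pull-request pipeline might look roughly like this:

```yaml
# Hypothetical .buildkite/pipelines/pull-request/example.yml -- illustrative only
config:
  skip-labels: ">test-mute"
  excluded-regions:
    - "^docs/.*"            # skip this step when the PR only touches docs
    - "^x-pack/docs/.*"
steps:
  # Everything under steps: is ordinary Buildkite YAML; only config: is custom
  - label: example
    command: .ci/scripts/run-gradle.sh exampleTask   # exampleTask is made up
    timeout_in_minutes: 60
    agents:
      provider: gcp
      image: family/elasticsearch-ubuntu-2004
      machineType: n2-standard-8
      buildDirectory: /dev/shm/bk
```

With a filename of `example.yml`, the step would also get a default trigger phrase along the lines of `.*run\\W+elasticsearch-ci/example.*`, as described under `trigger-phrase` below.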

### Config Properties

#### `allow-labels`

- Type: `string|string[]`
- Example: `["test-full-bwc"]`

Only trigger a step if the PR has one of these labels.

#### `skip-labels`

- Type: `string|string[]`
- Example: `>test-mute`

Don't trigger the step if the PR has one of these labels.

#### `excluded-regions`

- Type: `string|string[]` - must be JavaScript regexes
- Example: `["^docs/.*", "^x-pack/docs/.*"]`

Exclude the pipeline if all of the changed files in the PR match at least one regex. E.g. for the example above, don't run the step if all of the changed files are docs changes.

#### `included-regions`

- Type: `string|string[]` - must be JavaScript regexes
- Example: `["^docs/.*", "^x-pack/docs/.*"]`

Only include the pipeline if all of the changed files in the PR match at least one regex. E.g. for the example above, only run the step if all of the changed files are docs changes.

This is particularly useful for having a step that only runs, for example, when all of the other steps get filtered out because of the `excluded-regions` property.

#### `trigger-phrase`

- Type: `string` - must be a JavaScript regex
- Example: `"^run\\W+elasticsearch-ci/test-full-bwc.*"`
- Default: `.*run\\W+elasticsearch-ci/<step-name>.*` (`<step-name>` is generated from the filename of the yml file).

Trigger this step, and ignore all other steps, if the build was triggered by a comment and that comment matches this regex.

Note that the entire build itself is triggered via [`.buildkite/pull-requests.json`](../pull-requests.json). So, a comment has to first match the trigger configured there.
@@ -64,10 +64,9 @@ public class AggregatorBenchmark {
private static final int OP_COUNT = 1024;
private static final int GROUPS = 5;

private static final BigArrays BIG_ARRAYS = BigArrays.NON_RECYCLING_INSTANCE; // TODO real big arrays?
private static final BlockFactory blockFactory = BlockFactory.getInstance(
new NoopCircuitBreaker("noop"),
BigArrays.NON_RECYCLING_INSTANCE
BigArrays.NON_RECYCLING_INSTANCE // TODO real big arrays?
);

private static final String LONGS = "longs";
@@ -155,25 +154,25 @@ private static Operator operator(DriverContext driverContext, String grouping, S

private static AggregatorFunctionSupplier supplier(String op, String dataType, int dataChannel) {
return switch (op) {
case COUNT -> CountAggregatorFunction.supplier(BIG_ARRAYS, List.of(dataChannel));
case COUNT -> CountAggregatorFunction.supplier(List.of(dataChannel));
case COUNT_DISTINCT -> switch (dataType) {
case LONGS -> new CountDistinctLongAggregatorFunctionSupplier(BIG_ARRAYS, List.of(dataChannel), 3000);
case DOUBLES -> new CountDistinctDoubleAggregatorFunctionSupplier(BIG_ARRAYS, List.of(dataChannel), 3000);
case LONGS -> new CountDistinctLongAggregatorFunctionSupplier(List.of(dataChannel), 3000);
case DOUBLES -> new CountDistinctDoubleAggregatorFunctionSupplier(List.of(dataChannel), 3000);
default -> throw new IllegalArgumentException("unsupported data type [" + dataType + "]");
};
case MAX -> switch (dataType) {
case LONGS -> new MaxLongAggregatorFunctionSupplier(BIG_ARRAYS, List.of(dataChannel));
case DOUBLES -> new MaxDoubleAggregatorFunctionSupplier(BIG_ARRAYS, List.of(dataChannel));
case LONGS -> new MaxLongAggregatorFunctionSupplier(List.of(dataChannel));
case DOUBLES -> new MaxDoubleAggregatorFunctionSupplier(List.of(dataChannel));
default -> throw new IllegalArgumentException("unsupported data type [" + dataType + "]");
};
case MIN -> switch (dataType) {
case LONGS -> new MinLongAggregatorFunctionSupplier(BIG_ARRAYS, List.of(dataChannel));
case DOUBLES -> new MinDoubleAggregatorFunctionSupplier(BIG_ARRAYS, List.of(dataChannel));
case LONGS -> new MinLongAggregatorFunctionSupplier(List.of(dataChannel));
case DOUBLES -> new MinDoubleAggregatorFunctionSupplier(List.of(dataChannel));
default -> throw new IllegalArgumentException("unsupported data type [" + dataType + "]");
};
case SUM -> switch (dataType) {
case LONGS -> new SumLongAggregatorFunctionSupplier(BIG_ARRAYS, List.of(dataChannel));
case DOUBLES -> new SumDoubleAggregatorFunctionSupplier(BIG_ARRAYS, List.of(dataChannel));
case LONGS -> new SumLongAggregatorFunctionSupplier(List.of(dataChannel));
case DOUBLES -> new SumDoubleAggregatorFunctionSupplier(List.of(dataChannel));
default -> throw new IllegalArgumentException("unsupported data type [" + dataType + "]");
};
default -> throw new IllegalArgumentException("unsupported op [" + op + "]");
@@ -45,7 +45,6 @@
@State(Scope.Thread)
@Fork(1)
public class MultivalueDedupeBenchmark {
private static final BigArrays BIG_ARRAYS = BigArrays.NON_RECYCLING_INSTANCE; // TODO real big arrays?
private static final BlockFactory blockFactory = BlockFactory.getInstance(
new NoopCircuitBreaker("noop"),
BigArrays.NON_RECYCLING_INSTANCE