Fix incremental compilation of runtime/test (#8975)
Akirathan committed Feb 13, 2024
1 parent cf19115 commit 5919eda
Showing 354 changed files with 1,126 additions and 1,175 deletions.
538 changes: 292 additions & 246 deletions build.sbt

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions build/build/paths.yaml
@@ -91,6 +91,7 @@
 Factorial.enso:
 runtime/:
 target/:
+runtime-benchmarks/:
 bench-report.xml:
 lib/:
 rust/:
2 changes: 1 addition & 1 deletion build/build/src/engine.rs
@@ -105,7 +105,7 @@ impl Benchmarks {
     pub fn sbt_task(self) -> Option<&'static str> {
         match self {
             Benchmarks::All => Some("bench"),
-            Benchmarks::Runtime => Some("runtime/bench"),
+            Benchmarks::Runtime => Some("runtime-benchmarks/bench"),
             Benchmarks::Enso => None,
             Benchmarks::EnsoJMH => Some("std-benchmarks/bench"),
         }
10 changes: 7 additions & 3 deletions build/build/src/engine/context.rs
@@ -448,7 +448,7 @@ impl RunContext {
         } else {
             if self.config.build_benchmarks {
                 // Check Runtime Benchmark Compilation
-                sbt.call_arg("runtime/Benchmark/compile").await?;
+                sbt.call_arg("runtime-benchmarks/compile").await?;
 
                 // Check Language Server Benchmark Compilation
                 sbt.call_arg("language-server/Benchmark/compile").await?;
@@ -478,8 +478,12 @@ impl RunContext {
         for bench in &self.config.execute_benchmarks {
             match bench {
                 Benchmarks::Runtime => {
-                    let runtime_bench_report =
-                        &self.paths.repo_root.engine.runtime.bench_report_xml;
+                    let runtime_bench_report = &self
+                        .paths
+                        .repo_root
+                        .engine
+                        .join("runtime-benchmarks")
+                        .join("bench-report.xml");
                     if runtime_bench_report.exists() {
                         ide_ci::actions::artifacts::upload_single_file(
                             runtime_bench_report,
48 changes: 30 additions & 18 deletions docs/infrastructure/benchmarks.md
@@ -13,26 +13,38 @@ To track the performance of the engine, we use
 benchmarks:
 
 - [micro benchmarks](#engine-jmh-microbenchmarks) located directly in the
-  `runtime` SBT project. These benchmarks are written in Java, and are used to
-  measure the performance of specific parts of the engine.
+  `runtime-benchmarks` SBT project. These benchmarks are written in Java, and
+  are used to measure the performance of specific parts of the engine.
 - [standard library benchmarks](#standard-library-benchmarks) located in the
-  `test/Benchmarks` Enso project. These benchmarks are entirelly written in
-  Enso, along with the harness code.
+  `test/Benchmarks` Enso project. These benchmarks are entirely written in Enso,
+  along with the harness code.
 
 ## Engine JMH microbenchmarks
 
 These benchmarks are written in Java and are used to measure the performance of
-specific parts of the engine. The sources are located in the `runtime` SBT
-project, under `src/bench` source directory.
+specific parts of the engine. The sources are located in the
+`runtime-benchmarks` SBT project, under the `engine/runtime-benchmarks`
+directory.
 
 ### Running the benchmarks
 
-To run the benchmarks, use `bench` or `benchOnly` command - `bench` runs all the
-benchmarks and `benchOnly` runs only one benchmark specified with the fully
-qualified name. The parameters for these benchmarks are hard-coded inside the
-JMH annotations in the source files. In order to change, e.g., the number of
-measurement iterations, you need to modify the parameter to the `@Measurement`
-annotation.
+To run the benchmarks, use the `bench` or `benchOnly` command in the
+`runtime-benchmarks` project - `bench` runs all the benchmarks, while
+`benchOnly` runs only the one benchmark specified by its fully qualified name.
+These commands are mere shortcuts to the
+[standard JMH launcher](https://github.com/openjdk/jmh/blob/master/jmh-core/src/main/java/org/openjdk/jmh/Main.java).
+To get the full power of the JMH launcher, simply invoke `run` with command-line
+options passed to the launcher. For the full options summary, see the
+[JMH source code](https://github.com/openjdk/jmh/blob/master/jmh-core/src/main/java/org/openjdk/jmh/runner/options/CommandLineOptions.java),
+or invoke `run -h`.
+
+You can change the parameters of the benchmarks either by modifying the
+annotations directly in the source code, or by passing the options to the JMH
+runner. For example, to run the benchmarks with 3 seconds of warmup time and 2
+measurement iterations, use:
+
+```
+sbt:runtime-benchmarks> run -w 3 -i 2 <bench-name>
+```
 
 ### Debugging the benchmarks
 

@@ -94,12 +106,12 @@ the full options summary, either see the
 or run the launcher with `-h` option.
 
 The `std-benchmarks` SBT project supports `bench` and `benchOnly` commands, that
-work the same as in the `runtime` project, with the exception that the benchmark
-name does not have to be specified as a fully qualified name, but as a regular
-expression. To access the full flexibility of the JMH launcher, run it via
-`Bench/run` - for example, to see the help message: `Bench/run -h`. For example,
-you can run all the benchmarks that have "New_Vector" in their name with just 3
-seconds for warmup iterations and 2 measurement iterations with
+work the same as in the `runtime-benchmarks` project, except that the benchmark
+name can be given as a regular expression rather than a fully qualified name.
+To access the full flexibility of the JMH launcher, run it via `Bench/run` -
+for example, to see the help message: `Bench/run -h`. For instance, you can run
+all the benchmarks that have "New_Vector" in their name, with 3 seconds of
+warmup time and 2 measurement iterations, with
 `Bench/run -w 3 -i 2 New_Vector`.
 
 Whenever you add or delete any benchmarks from `test/Benchmarks` project, the
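Putting the commands documented in this file together, a typical sbt session might look as follows. This transcript is illustrative only: `<fully-qualified-bench-name>` and `<bench-name>` are placeholders, not classes guaranteed to exist in the project.

```
sbt> project runtime-benchmarks
sbt:runtime-benchmarks> benchOnly <fully-qualified-bench-name>
sbt:runtime-benchmarks> run -w 3 -i 2 <bench-name>
sbt> project std-benchmarks
sbt:std-benchmarks> Bench/run -w 3 -i 2 New_Vector
```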
@@ -0,0 +1,10 @@
+package org.enso.interpreter.bench.benchmarks;
+
+import org.enso.interpreter.bench.BenchmarksRunner;
+import org.openjdk.jmh.runner.RunnerException;
+
+public class RuntimeBenchmarksRunner {
+  public static void main(String[] args) throws RunnerException {
+    BenchmarksRunner.run(args);
+  }
+}
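The new runner above delegates to the project's own `BenchmarksRunner`. For readers unfamiliar with JMH, a similar effect can be sketched with JMH's public programmatic API alone. The class below is illustrative and not part of this commit; it assumes `jmh-core` and compiled benchmarks are on the classpath, and `ProgrammaticJmhLauncher` is an invented name.

```java
import org.openjdk.jmh.runner.Runner;
import org.openjdk.jmh.runner.RunnerException;
import org.openjdk.jmh.runner.options.Options;
import org.openjdk.jmh.runner.options.OptionsBuilder;
import org.openjdk.jmh.runner.options.TimeValue;

public class ProgrammaticJmhLauncher {
  public static void main(String[] args) throws RunnerException {
    // Rough programmatic equivalent of `run -w 3 -i 2 <bench-name>`:
    // 3 seconds of warmup time and 2 measurement iterations for every
    // benchmark whose name matches the include pattern.
    Options options = new OptionsBuilder()
        .include(".*") // replace with a benchmark-name regular expression
        .warmupTime(TimeValue.seconds(3))
        .measurementIterations(2)
        .build();
    new Runner(options).run();
  }
}
```

`OptionsBuilder` mirrors the command-line flags one-to-one, which is why the `bench`/`benchOnly` shortcuts can stay thin wrappers over the standard launcher.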
