
Codebase Framework Files


There are a few files that are necessary to add a framework to the Framework Benchmarks. The contents of these files vary because they are specific to each framework.

File Summary

  • Benchmark Config File: Defines test instructions and metadata for the framework benchmarks program.
  • Test Docker Files: Each creates a docker container that installs the test and runs the framework's server.

Benchmark Config File

The benchmark_config.json file is used by our scripts to identify available tests - it should exist at the root of the framework directory.

Here is an example benchmark_config.json from the Compojure framework, which lists two different tests.

{
  "framework": "compojure",
  "maintainers": ["somegithubuser"],
  "tests": [{
    "default": {
      "json_url": "/compojure/json",
      "db_url": "/compojure/db",
      "query_url": "/compojure/queries/",
      "update_url": "/compojure/updates/",
      "fortune_url": "/compojure/fortunes",
      "plaintext_url": "/compojure/plaintext",
      "port": 8080,
      "debug_port": 8081,
      "approach": "Realistic",
      "classification": "Micro",
      "database": "MySQL",
      "framework": "compojure",
      "language": "Clojure",
      "flavor": "None",
      "orm": "Micro",
      "platform": "Servlet",
      "webserver": "Resin",
      "os": "Linux",
      "database_os": "Linux",
      "display_name": "compojure",
      "notes": "",
      "versus": "servlet"
    },
    "raw": {
      "db_url": "/compojure/raw/db",
      "query_url": "/compojure/raw/queries/",
      "update_url": "/compojure/raw/updates/",
      "fortune_url": "/compojure/raw/fortunes",
      "port": 8080,
      "approach": "Realistic",
      "classification": "Micro",
      "database": "MySQL",
      "framework": "compojure",
      "language": "Clojure",
      "flavor": "None",
      "orm": "Raw",
      "platform": "Servlet",
      "webserver": "Resin",
      "os": "Linux",
      "database_os": "Linux",
      "display_name": "compojure-raw",
      "notes": "",
      "versus": "servlet"
    }
  }]
}
  • framework: Specifies the framework name, which is used to name and identify the tests within TFB. The default test takes the framework name, so you can run it with tfb --mode verify --test compojure; other tests combine the framework name and the test name, so the raw test is run with tfb --mode verify --test compojure-raw.

  • maintainers: A list of GitHub users that should be pinged when a PR is opened that edits any of the files in that framework's directory.

  • tests: A list of tests that can be run for this framework. In many cases this contains a single element for the "default" test, but additional tests can be specified. Each test name must be unique when concatenated with the framework name. Each test is run separately in our Rounds, so it is to your benefit to provide multiple variations in case one performs better under certain conditions.

    • dockerfile (optional): Specifies the name of the dockerfile this test will use. The default is test-name.dockerfile.
    • docker_cmd (optional): Allows you to override the CMD provided in the dockerfile for this test (see the sketch after this list).
    • json_url (optional): The URI to the JSON test, typically /json
    • db_url (optional): The URI to the database test, typically /db
    • query_url (optional): The URI to the variable query test. The URI must be set up so that an integer can be appended to it to specify the number of queries to run. For example, /query?queries= (to yield /query?queries=20) or /query/ (to yield /query/20)
    • fortune_url (optional): The URI to the fortunes test, typically /fortune
    • update_url (optional): The URI to the updates test, set up in a manner similar to query_url described above
    • plaintext_url (optional): The URI of the plaintext test, typically /plaintext
    • port: The port the server will be listening on, typically 8080
    • debug_port (optional): The port the server will be listening on if started with the --debug option. Will be opened in addition to port.
    • approach (metadata): Realistic or Stripped
    • classification (metadata): Fullstack, Micro, or Platform
    • database (metadata): MySQL, Postgres, MongoDB, Cassandra, Elasticsearch, Redis, SQLite, SQLServer, or None
    • framework (metadata): name of the framework (only used to display information on the results site)
    • language (metadata): name of the language
    • flavor (metadata): used to denote a language version or interpreter different from the standard (eg. pypy)
    • orm (metadata): Full, Micro, or Raw
    • platform (metadata): name of the platform
    • webserver (metadata): name of the web-server (also referred to as the "front-end server")
    • os (metadata): The application server's operating system, Linux or Windows
    • database_os (metadata): The database server's operating system, Linux or Windows
    • display_name (metadata): How to render this test permutation's name on the results web site. Some permutation names can be quite long, so display_name lets you provide something more succinct.
    • versus (optional): The name of another test (elsewhere in this project) that is a subset of this framework. This allows for the generation of the framework efficiency chart in the results web site. For example, Compojure is compared to "servlet" since Compojure is built on the Servlets platform.
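
Neither dockerfile nor docker_cmd appears in the Compojure example above. As a rough sketch only (the test name "some-test" and the command value are hypothetical placeholders, not part of the real Compojure configuration), an additional test entry that reuses the default dockerfile but overrides its CMD could look like this fragment, with the remaining metadata keys filled in as in the full example:

    "some-test": {
      "dockerfile": "compojure.dockerfile",
      "docker_cmd": "placeholder command that starts this variant",
      "db_url": "/compojure/some-test/db",
      "port": 8080
    }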

The requirements section explains the expected response for each URL as well as all of the metadata options available.

Test Docker File

In order to install the necessary components for each framework test, a dockerfile named after that test is required. Looking at the benchmark_config.json for Compojure above, the dockerfile for the default test would be called compojure.dockerfile and exist at the root level (in this case frameworks/Clojure/compojure/compojure.dockerfile). All frameworks must have a default test with a corresponding dockerfile. The raw entry requires a dockerfile named compojure-raw.dockerfile.
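
The exact contents depend entirely on the framework. As an illustration only (the base image tag, build step, and jar path below are assumptions, not taken from the actual Compojure dockerfile in the repo), a test dockerfile generally copies the source in, builds it, exposes the port declared in benchmark_config.json, and starts the server in the foreground:

# Hypothetical sketch; base image tag and build commands are placeholders.
FROM clojure:lein

WORKDIR /compojure
COPY . .

# Build the application (placeholder build step and artifact name).
RUN lein uberjar

# Must match the "port" value in benchmark_config.json.
EXPOSE 8080

# Run the server in the foreground so the container keeps running.
CMD ["java", "-jar", "target/compojure-standalone.jar"]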

Each test's dockerfile should be considered independently. The idea is that a single dockerfile should perform a complete install of the framework and run the server in the foreground. If the installation process is the same between two tests, it is encouraged that the two dockerfiles contain the same code in the same order; Docker's build cache will then make the second test build much faster. If the dockerfiles would be exactly the same apart from the command they run, consider the following:

The dockerfile option in benchmark_config.json lets you override the default dockerfile name, which allows two tests to share the same dockerfile, and the docker_cmd option lets you override the CMD provided in that dockerfile for a particular test (as in the fragment sketched earlier).

Provided Environment Variables

We provide the following ARGs to be consumed by your dockerfile if needed.

  • BENCHMARK_ENV: Its value is the current benchmarking environment, like Citrine or Azure. Useful for tuning your application.
  • TFB_TEST_NAME: Its value is the name of the current test. Useful if you'd like to use one dockerfile and alter things at runtime based on your test name. See the nodejs implementation for an example.
  • TFB_TEST_DATABASE: Its value is the name of the database that's being used for the current test. Useful to set drivers or database connection strings at runtime.
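
A dockerfile can pick these up by declaring them with ARG. As a rough sketch (the config file paths below are hypothetical placeholders, not from a real test), they can also be promoted to environment variables so the application sees them at runtime:

# Declare the build arguments supplied by the toolset.
ARG BENCHMARK_ENV
ARG TFB_TEST_NAME
ARG TFB_TEST_DATABASE

# Optionally expose them to the running application as environment variables.
ENV TFB_TEST_NAME=${TFB_TEST_NAME} \
    TFB_TEST_DATABASE=${TFB_TEST_DATABASE}

# Example use at build time: select a database-specific config file
# (the paths here are hypothetical placeholders).
RUN cp config/${TFB_TEST_DATABASE}.conf config/app.conf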

If you are unfamiliar with writing dockerfiles, check out Docker's documentation. Also look at how other frameworks build their dockerfiles in our repo.