From 04afbc5a7da8634f28abbad6a011516dd0b2058b Mon Sep 17 00:00:00 2001 From: Vasileios Karakasis Date: Tue, 31 Jan 2023 22:27:05 +0100 Subject: [PATCH 1/3] Update @performance_function description in tutorial --- docs/deferrable_functions_reference.rst | 2 +- docs/tutorial_basics.rst | 64 +++++++++++++------------ 2 files changed, 35 insertions(+), 31 deletions(-) diff --git a/docs/deferrable_functions_reference.rst b/docs/deferrable_functions_reference.rst index 19bbea88a0..1a173f04fd 100644 --- a/docs/deferrable_functions_reference.rst +++ b/docs/deferrable_functions_reference.rst @@ -58,7 +58,7 @@ Currently ReFrame provides three broad categories of deferrable functions: They include, but are not limited to, functions to iterate over regex matches in a file, extracting and converting values from regex matches, computing statistical information on series of data etc. - .. _deferrable-performance-functions: +.. _deferrable-performance-functions: -------------------------------- diff --git a/docs/tutorial_basics.rst b/docs/tutorial_basics.rst index 516442033a..8eee093e34 100644 --- a/docs/tutorial_basics.rst +++ b/docs/tutorial_basics.rst @@ -429,12 +429,13 @@ For running the benchmark, we need to set the OpenMP number of threads and pin t You can set environment variables in a ReFrame test through the :attr:`~reframe.core.pipeline.RegressionTest.env_vars` dictionary. What makes a ReFrame test a performance test is the definition of at least one :ref:`performance function`. -Similarly to a test's :func:`@sanity_function`, a performance function is a member function decorated with the :attr:`@performance_function` decorator, which binds the decorated function to a given unit. -These functions can be used by the regression test to extract, measure or compute a given quantity of interest; where in this context, the values returned by a performance function are referred to as performance variables. 
-Alternatively, performance functions can also be thought as `tools` available to the regression test for extracting performance variables.
-By default, ReFrame will attempt to execute all the available performance functions during the test's ``performance`` stage, producing a single performance variable out of each of the available performance functions.
-These default-generated performance variables are defined in the regression test's attribute :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` during class instantiation, and their default name matches the name of their associated performance function.
-However, one could customize the default-generated performance variable's name by passing the ``perf-key`` argument to the :attr:`@performance_function` decorator of the associated performance function.
+Similarly to a test's :func:`@sanity_function`, a performance function is a member function decorated with the :func:`@performance_function` decorator that merely extracts or computes a performance metric from the test's output and associates it with a unit.
+By default, every performance function defined in the test is assigned to a *performance variable* with the function's name.
+A performance variable is a named quantity representing a performance metric that ReFrame will report on it, log it and can also check it against a reference value.
+The performance variables of a test are stored in the :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` dictionary.
+The keys are the names of the metrics, whereas the values are :ref:`performance functions `.
+Apart from turning an ordinary method into a "performance function", the :func:`@performance_function` decorator also creates an entry in the :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` dictionary.
+The optional ``perf_key`` argument can be used to assign a different name to the newly created performance variable.
In this example, we extract four performance variables, namely the memory bandwidth values for each of the "Copy", "Scale", "Add" and "Triad" sub-benchmarks of STREAM, where each of the performance functions uses the :func:`~reframe.utility.sanity.extractsingle` utility function.
For each of the sub-benchmarks we extract the "Best Rate MB/s" column of the output (see below) and we convert that to a float.
@@ -468,9 +469,11 @@ This is especially useful if you run long suites of performance exploration test
Setting explicitly the test's performance variables
---------------------------------------------------
-In the above STREAM example, all four performance functions were almost identical except for a small part of the regex pattern, which led to some code repetition.
-Even though the performance functions were rather simple and the code repetition was not much in that case, this is still not a good practice and it is certainly an approach that would not scale when using more complex performance functions.
-Hence, in this example, we show how to collapse all these four performance functions into a single function and how to reuse this single performance function to create multiple performance variables.
+Users are allowed to manipulate the test's :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` dictionary directly.
+This is useful to avoid code repetition or in cases where relying on decorated methods to populate the :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` dictionary is impractical, e.g., creating multiple performance variables in a loop.
+
+You might have noticed that in our STREAM example above, all four performance functions are almost identical except for a small part of the regex pattern.
+In the following example, we define a single performance function, :func:`extract_bw`, that can extract any of the requested bandwidth metrics, and we populate the :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` ourselves in a pre-performance hook:
.. code-block:: console
@@ -480,28 +483,29 @@ Hence, in this example, we show how to collapse all these four performance funct
   :start-at: import reframe
   :emphasize-lines: 28-
-As shown in the highlighted lines, this example collapses the four performance functions from the previous example into the :func:`extract_bw` function, which is also decorated with the :attr:`@performance_function` decorator with the units set to ``'MB/s'``.
-However, the :func:`extract_bw` function now takes the optional argument ``kind`` which selects the STREAM benchmark to extract.
-By default, this argument is set to ``'Copy'`` because functions decorated with :attr:`@performance_function` are only allowed to have ``self`` as a non-default argument.
-Thus, from this performance function definition, ReFrame will default-generate a single performance variable during the test instantiation under the name ``extract_bw``, where this variable will report the performance results from the ``Copy`` benchmark.
-With no further action from our side, ReFrame would just report the performance of the test based on this default-generated performance variable, but that is not what we are after here.
-Therefore, we must modify these default performance variables so that this version of the STREAM test produces the same results as in the previous example.
-As mentioned before, the performance variables (also the default-generated ones) are stored in the :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` dictionary, so all we need to do is to redefine this mapping with our desired performance variables as done in the pre-performance pipeline hook :func:`set_perf_variables`.
+As mentioned in the previous section, the :func:`@performance_function` decorator performs two tasks:
-.. tip::
-   Performance functions may also be generated inline using the :func:`~reframe.utility.sanity.make_performance_function` utility as shown below.
-
-   .. code-block:: python
-
-      @run_before('performance')
-      def set_perf_vars(self):
-          self.perf_variables = {
-              'Copy': sn.make_performance_function(
-                  sn.extractsingle(r'Copy:\s+(\S+)\s+.*',
-                                   self.stdout, 1, float),
-                  'MB/s'
-              )
-          }
+1. It converts a test method to a *performance function*, i.e., a function that is suitable for extracting a performance metric.
+2. It updates the :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` dictionary with the newly created performance function.
+
+In this example, we are only interested in the first functionality, which is why we completely redefine the test's :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` using the :func:`extract_bw` performance function.
+If you are inheriting from a base test and you don't want to completely override its performance variables, you could instead call :py:func:`update` on :attr:`~reframe.core.pipeline.RegressionTest.perf_variables`.
+
+Finally, you can convert any arbitrary function or :doc:`deferred expression ` into a performance function by calling the :func:`~reframe.utility.sanity.make_performance_function` utility, as shown below:
+
+.. code-block:: python
+
+   @run_before('performance')
+   def set_perf_vars(self):
+       self.perf_variables = {
+           'Copy': sn.make_performance_function(
+               sn.extractsingle(r'Copy:\s+(\S+)\s+.*',
+                                self.stdout, 1, float),
+               'MB/s'
+           )
+       }
+
+Note that in this case, the newly created performance function is not assigned to a performance variable of the test; you will have to do this yourself.
----------------------- Adding reference values From 44243cb37f61ba1c0e090edfbdd457dfd87866db Mon Sep 17 00:00:00 2001 From: Vasileios Karakasis Date: Tue, 31 Jan 2023 22:51:38 +0100 Subject: [PATCH 2/3] Register performance functions even if `perf_variables` are defined in the class body --- reframe/core/pipeline.py | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/reframe/core/pipeline.py b/reframe/core/pipeline.py index 0deb82e6ff..c23c131894 100644 --- a/reframe/core/pipeline.py +++ b/reframe/core/pipeline.py @@ -984,9 +984,8 @@ def __pre_init__(self): self.__deferred_rfm_init.evaluate() # Build the default performance dict - if not self.perf_variables: - for fn in self._rfm_perf_fns.values(): - self.perf_variables[fn._rfm_perf_key] = fn(self) + for fn in self._rfm_perf_fns.values(): + self.perf_variables[fn._rfm_perf_key] = fn(self) def __init__(self): pass From f4a88b7160231986cd60a2ffbee5585c5515e2a4 Mon Sep 17 00:00:00 2001 From: Vasileios Karakasis Date: Wed, 15 Feb 2023 13:17:50 +0100 Subject: [PATCH 3/3] Address PR comments Co-authored-by: Eirini Koutsaniti --- docs/tutorial_basics.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/tutorial_basics.rst b/docs/tutorial_basics.rst index 8eee093e34..00643a6025 100644 --- a/docs/tutorial_basics.rst +++ b/docs/tutorial_basics.rst @@ -431,7 +431,7 @@ You can set environment variables in a ReFrame test through the :attr:`~reframe. What makes a ReFrame test a performance test is the definition of at least one :ref:`performance function`. Similarly to a test's :func:`@sanity_function`, a performance function is a member function decorated with the :func:`@performance_function` decorator that merely extracts or computes a performance metric from the test's output and associates it with a unit. By default, every performance function defined in the test is assigned to a *performance variable* with the function's name. 
-A performance variable is a named quantity representing a performance metric that ReFrame will report on it, log it and can also check it against a reference value.
+A performance variable is a named quantity representing a performance metric that ReFrame will report on, log and can also check against a reference value.
The performance variables of a test are stored in the :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` dictionary.
The keys are the names of the metrics, whereas the values are :ref:`performance functions `.
Apart from turning an ordinary method into a "performance function", the :func:`@performance_function` decorator also creates an entry in the :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` dictionary.
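The extraction pattern that the updated tutorial text describes (one parameterized function feeding several performance variables) can be sketched outside ReFrame as plain Python. The sample STREAM output, the `STREAM_OUTPUT` name and the standalone `extract_bw` helper below are illustrative assumptions, not part of the patch; in the real test the extraction goes through `sn.extractsingle` on the test's stdout:

```python
import re

# Illustrative STREAM output fragment (not taken from the patch).
STREAM_OUTPUT = """\
Function    Best Rate MB/s  Avg time     Min time     Max time
Copy:           24069.8     0.006678     0.006647     0.006730
Scale:          16220.2     0.009986     0.009862     0.010183
Add:            16856.6     0.014326     0.014237     0.014522
Triad:          16088.8     0.015077     0.014921     0.015377
"""


def extract_bw(output, kind='Copy'):
    """Extract the 'Best Rate MB/s' value for one STREAM sub-benchmark.

    Mirrors the single parameterized performance function that the
    tutorial builds all four performance variables from.
    """
    match = re.search(rf'{kind}:\s+(\S+)\s+.*', output)
    if match is None:
        raise ValueError(f'no result found for {kind!r}')
    return float(match.group(1))


# Analogous to populating ``self.perf_variables`` in a loop inside a
# pre-performance hook: one entry per sub-benchmark, all sharing the
# same extraction function.
perf_variables = {
    kind: extract_bw(STREAM_OUTPUT, kind)
    for kind in ('Copy', 'Scale', 'Add', 'Triad')
}
```

In the actual test the dictionary values would be performance functions (deferred expressions) rather than plain floats, but the loop structure is the same.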
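The behavioral change in the second patch (registering decorated performance functions even when `perf_variables` is already populated in the class body) can be modeled with a small ReFrame-independent sketch. The `build_perf_variables` helper and its `legacy` flag are hypothetical constructs for illustration only; the real logic is the one-line change in `RegressionTest.__pre_init__`:

```python
def build_perf_variables(class_body_vars, decorated_fns, legacy=False):
    """Model how the default performance variables are assembled.

    With legacy=True, decorated performance functions are skipped
    whenever perf_variables is already populated (the old
    ``if not self.perf_variables`` guard); with legacy=False they are
    always registered, merging with class-body entries (patched behavior).
    """
    perf_variables = dict(class_body_vars)
    if legacy and perf_variables:
        # Pre-patch: a non-empty dict suppressed all decorated functions.
        return perf_variables
    for key, fn in decorated_fns.items():
        # Post-patch: decorated functions are always registered.
        perf_variables[key] = fn
    return perf_variables
```

Under the old behavior a single class-body entry silently discarded every `@performance_function`-decorated method; under the new behavior both sources contribute entries.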