Merged
10 changes: 10 additions & 0 deletions docs/config_reference.rst
@@ -1111,6 +1111,16 @@ General Configuration
Purge any loaded environment modules before running any tests.


.. js:attribute:: .general[].report_file

:required: No
:default: ``"${HOME}/.reframe/reports/run-report-{sessionid}.json"``

The file where ReFrame will store its report.

.. versionadded:: 3.1

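By way of illustration, the default above could be overridden from a ReFrame settings file. This is a minimal sketch assuming the ReFrame 3.x Python settings-file format; the directory path is hypothetical:

```python
site_configuration = {
    'general': [
        {
            # Store one report per session under a custom directory
            # (hypothetical path); '{sessionid}' expands to a counter.
            'report_file': '${HOME}/rfm-reports/run-{sessionid}.json'
        }
    ]
}
```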

.. js:attribute:: .general[].save_log_files

:required: No
27 changes: 27 additions & 0 deletions docs/manpage.rst
@@ -225,6 +225,8 @@ Options controlling ReFrame output
Normally, if the stage directory of a test exists, ReFrame will remove it and recreate it.
This option disables this behavior.

This option can also be set using the :envvar:`RFM_CLEAN_STAGEDIR` environment variable or the :js:attr:`clean_stagedir` general configuration parameter.

.. versionadded:: 3.1

.. option:: --save-log-files
@@ -236,6 +238,16 @@ Options controlling ReFrame output
This option can also be set using the :envvar:`RFM_SAVE_LOG_FILES` environment variable or the :js:attr:`save_log_files` general configuration parameter.


.. option:: --report-file=FILE

The file where ReFrame will store its report.
The ``FILE`` argument may contain the special placeholder ``{sessionid}``, in which case ReFrame will generate a new report each time it is run by appending a counter to the report file.

This option can also be set using the :envvar:`RFM_REPORT_FILE` environment variable or the :js:attr:`report_file` general configuration parameter.

.. versionadded:: 3.1

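As a sketch, the following invocation (the test file and report path are hypothetical) stores a freshly numbered report per run; the pattern is single-quoted so that ReFrame, not the shell, expands ``${HOME}`` and ``{sessionid}``:

```shell
reframe -c tutorials/basics/hello/hello1.py -r \
    --report-file='${HOME}/rfm-reports/run-{sessionid}.json'
```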

-------------------------------------
Options controlling ReFrame execution
-------------------------------------
@@ -790,6 +802,21 @@ Here is an alphabetical list of the environment variables recognized by ReFrame:
================================== ==================


.. envvar:: RFM_REPORT_FILE

The file where ReFrame will store its report.

.. versionadded:: 3.1

.. table::
:align: left

================================== ==================
Associated command line option :option:`--report-file`
Associated configuration parameter :js:attr:`report_file` general configuration parameter
================================== ==================


.. envvar:: RFM_SAVE_LOG_FILES

Save ReFrame log files in the output directory before exiting.
91 changes: 80 additions & 11 deletions docs/tutorial_basics.rst
@@ -90,28 +90,28 @@ Now it's time to run our first test:
.. code-block:: none

[ReFrame Setup]
version: 3.1-dev0 (rev: 986c3505)
command: './bin/reframe -c tutorials/basics/hello/hello1.py -r'
launched by: user@tresa.local
working directory: '/Users/user/reframe'
version: 3.1-dev2 (rev: 272e1aae)
command: ./bin/reframe -c tutorials/basics/hello/hello1.py -r
launched by: user@dhcp-133-44.cscs.ch
working directory: '/Users/user/Repositories/reframe'
settings file: '<builtin>'
check search path: '/Users/user/reframe/tutorials/basics/hello/hello1.py'
stage directory: '/Users/user/reframe/stage'
output directory: '/Users/user/reframe/output'
check search path: '/Users/user/Repositories/reframe/tutorials/basics/hello/hello1.py'
stage directory: '/Users/user/Repositories/reframe/stage'
output directory: '/Users/user/Repositories/reframe/output'

[==========] Running 1 check(s)
[==========] Started on Sat Jun 20 09:44:52 2020
[==========] Started on Fri Jul 24 11:05:46 2020

[----------] started processing HelloTest (HelloTest)
[ RUN ] HelloTest on generic:default using builtin
[----------] finished processing HelloTest (HelloTest)

[----------] waiting for spawned checks to finish
[ OK ] (1/1) HelloTest on generic:default using builtin [compile: 0.735s run: 0.505s total: 1.272s]
[ OK ] (1/1) HelloTest on generic:default using builtin [compile: 0.378s run: 0.299s total: 0.712s]
[----------] all spawned checks have finished

[ PASSED ] Ran 1 test case(s) from 1 check(s) (0 failure(s))
[==========] Finished on Sat Jun 20 09:44:53 2020
[==========] Finished on Fri Jul 24 11:05:47 2020


Perfect! We have verified that we have a functioning C compiler in our system.
@@ -121,7 +121,7 @@ On successful outcome of the test, the stage directory is removed by default, bu
The prefixes of these directories are printed in the first section of the output.
Let's inspect what files ReFrame produced for this test:

.. code-block:: bash
.. code-block:: console

ls output/generic/default/builtin/HelloTest/

@@ -133,6 +133,75 @@ Let's inspect what files ReFrame produced for this test:
ReFrame stores in the output directory of the test the build and run scripts it generated for building and running the code along with their standard output and error.
All these files are prefixed with ``rfm_``.

ReFrame also generates a detailed JSON report for the whole regression testing session.
By default, this is stored inside the ``${HOME}/.reframe/reports`` directory and a new report file is generated every time ReFrame is run, but you can control this through the :option:`--report-file` command-line option.

Here are the contents of the report file for our first ReFrame run:


.. code-block:: console

cat ~/.reframe/reports/run-report-0.json

.. code-block:: javascript

{
"session_info": {
"cmdline": "./bin/reframe -c tutorials/basics/hello/hello1.py -r",
"config_file": "<builtin>",
"data_version": "1.0",
"hostname": "dhcp-133-44.cscs.ch",
"prefix_output": "/Users/user/Repositories/reframe/output",
"prefix_stage": "/Users/user/Repositories/reframe/stage",
"user": "user",
"version": "3.1-dev2 (rev: 272e1aae)",
"workdir": "/Users/user/Repositories/reframe",
"time_start": "2020-07-24T11:05:46+0200",
"time_end": "2020-07-24T11:05:47+0200",
"time_elapsed": 0.7293069362640381,
"num_cases": 1,
"num_failures": 0
},
"runs": [
{
"num_cases": 1,
"num_failures": 0,
"runid": 0,
"testcases": [
{
"build_stderr": "rfm_HelloTest_build.err",
"build_stdout": "rfm_HelloTest_build.out",
"description": "HelloTest",
"environment": "builtin",
"fail_reason": null,
"fail_phase": null,
"jobid": 85063,
"job_stderr": "rfm_HelloTest_job.err",
"job_stdout": "rfm_HelloTest_job.out",
"name": "HelloTest",
"maintainers": [],
"nodelist": [
"dhcp-133-44.cscs.ch"
],
"outputdir": "/Users/user/Repositories/reframe/output/generic/default/builtin/HelloTest",
"perfvars": null,
"result": "success",
"stagedir": null,
"scheduler": "local",
"system": "generic:default",
"tags": [],
"time_compile": 0.3776402473449707,
"time_performance": 4.506111145019531e-05,
"time_run": 0.2992382049560547,
"time_sanity": 0.0005609989166259766,
"time_setup": 0.0031709671020507812,
"time_total": 0.7213571071624756
}
]
}
]
}
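Since the report is plain JSON, it is easy to post-process with standard tooling. The snippet below is a small sketch (not part of ReFrame) that extracts the names of failed test cases from a report with the structure shown above; the embedded report is trimmed down for brevity:

```python
import json

# A trimmed-down report with the same shape as the file above.
report_text = '''
{
  "session_info": {"num_cases": 1, "num_failures": 0},
  "runs": [
    {
      "runid": 0,
      "testcases": [
        {"name": "HelloTest", "result": "success",
         "system": "generic:default"}
      ]
    }
  ]
}
'''
report = json.loads(report_text)

# Collect the names of all failed test cases across all runs
failed = [tc['name']
          for run in report['runs']
          for tc in run['testcases']
          if tc['result'] != 'success']
print(failed)  # prints [] for the clean session above
```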


More of "Hello, World!"
-----------------------
4 changes: 4 additions & 0 deletions reframe/core/pipeline.py
@@ -925,6 +925,10 @@ def stderr(self):
'''
return self._job.stderr

@property
def build_job(self):
return self._build_job

@property
@sn.sanity_function
def build_stdout(self):
4 changes: 2 additions & 2 deletions reframe/core/systems.py
@@ -189,12 +189,12 @@ def json(self):
'container_platforms': [
{
'type': ctype,
'modules': [m.name for m in cpenv.modules],
'modules': [m for m in cpenv.modules],
'variables': [[n, v] for n, v in cpenv.variables.items()]
}
for ctype, cpenv in self._container_environs.items()
],
'modules': [m.name for m in self._local_env.modules],
'modules': [m for m in self._local_env.modules],
'variables': [[n, v]
for n, v in self._local_env.variables.items()],
'environs': [e.name for e in self._environs],
91 changes: 83 additions & 8 deletions reframe/frontend/cli.py
@@ -9,6 +9,7 @@
import re
import socket
import sys
import time
import traceback

import reframe
@@ -69,6 +70,23 @@ def list_checks(checks, printer, detailed=False):
printer.info('\nFound %d check(s).' % len(checks))


def generate_report_filename(filepatt):
if '{sessionid}' not in filepatt:
return filepatt

search_patt = os.path.basename(filepatt).replace('{sessionid}', r'(\d+)')
new_id = -1
basedir = os.path.dirname(filepatt) or '.'
for filename in os.listdir(basedir):
match = re.match(search_patt, filename)
if match:
found_id = int(match.group(1))
new_id = max(found_id, new_id)

new_id += 1
return filepatt.format(sessionid=new_id)

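To see the counter logic in action, here is a self-contained re-implementation sketch of the function above, exercised in a temporary directory (the function name and paths here are illustrative, not ReFrame API):

```python
import os
import re
import tempfile

def next_report_name(filepatt):
    # Sketch of the '{sessionid}' counter logic: scan the report
    # directory for existing reports and pick the next free id.
    if '{sessionid}' not in filepatt:
        return filepatt
    search_patt = os.path.basename(filepatt).replace('{sessionid}', r'(\d+)')
    basedir = os.path.dirname(filepatt) or '.'
    new_id = -1
    for filename in os.listdir(basedir):
        match = re.match(search_patt, filename)
        if match:
            new_id = max(int(match.group(1)), new_id)

    return filepatt.format(sessionid=new_id + 1)

with tempfile.TemporaryDirectory() as d:
    patt = os.path.join(d, 'run-report-{sessionid}.json')
    first = next_report_name(patt)    # no reports yet -> counter is 0
    open(first, 'w').close()
    second = next_report_name(patt)   # counter advances to 1
    print(os.path.basename(first), os.path.basename(second))
```

Note that each run only reads the directory listing; the counter advances because the previous report file is actually created on disk.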

def main():
# Setup command line options
argparser = argparse.ArgumentParser()
@@ -136,6 +154,12 @@ def main():
help='Save ReFrame log files to the output directory',
envvar='RFM_SAVE_LOG_FILES', configvar='general/save_log_files'
)
output_options.add_argument(
'--report-file', action='store', metavar='FILE',
help="Store JSON run report in FILE",
envvar='RFM_REPORT_FILE',
configvar='general/report_file'
)

# Check discovery options
locate_options.add_argument(
@@ -509,19 +533,33 @@ def print_infoline(param, value):
param = param + ':'
printer.info(f" {param.ljust(18)} {value}")

session_info = {
'cmdline': ' '.join(sys.argv),
'config_file': rt.site_config.filename,
'data_version': '1.0',
'hostname': socket.gethostname(),
'prefix_output': rt.output_prefix,
'prefix_stage': rt.stage_prefix,
'user': os_ext.osuser(),
'version': os_ext.reframe_version(),
'workdir': os.getcwd(),
}

# Print command line
printer.info(f"[ReFrame Setup]")
print_infoline('version', os_ext.reframe_version())
print_infoline('command', repr(' '.join(sys.argv)))
print_infoline('launched by',
f"{os_ext.osuser() or '<unknown>'}@{socket.gethostname()}")
print_infoline('working directory', repr(os.getcwd()))
print_infoline('settings file', f'{site_config.filename!r}')
print_infoline('version', session_info['version'])
print_infoline('command', repr(session_info['cmdline']))
print_infoline(
f"launched by",
f"{session_info['user'] or '<unknown>'}@{session_info['hostname']}"
)
print_infoline('working directory', repr(session_info['workdir']))
print_infoline('settings file', f"{session_info['config_file']!r}")
print_infoline('check search path',
f"{'(R) ' if loader.recurse else ''}"
f"{':'.join(loader.load_path)!r}")
print_infoline('stage directory', repr(rt.stage_prefix))
print_infoline('output directory', repr(rt.output_prefix))
print_infoline('stage directory', repr(session_info['prefix_stage']))
print_infoline('output directory', repr(session_info['prefix_output']))
printer.info('')
try:
# Locate and load checks
@@ -696,8 +734,18 @@ def print_infoline(param, value):
max_retries) from None
runner = Runner(exec_policy, printer, max_retries)
try:
time_start = time.time()
session_info['time_start'] = time.strftime(
'%FT%T%z', time.localtime(time_start),
)
runner.runall(testcases)
finally:
time_end = time.time()
session_info['time_end'] = time.strftime(
'%FT%T%z', time.localtime(time_end)
)
session_info['time_elapsed'] = time_end - time_start

# Print a retry report if we did any retries
if runner.stats.failures(run=0):
printer.info(runner.stats.retry_report())
@@ -712,6 +760,33 @@ def print_infoline(param, value):
if options.performance_report:
printer.info(runner.stats.performance_report())

# Generate the report for this session
report_file = os.path.normpath(
os_ext.expandvars(rt.get_option('general/0/report_file'))
)
basedir = os.path.dirname(report_file)
if basedir:
os.makedirs(basedir, exist_ok=True)

# Build final JSON report
run_stats = runner.stats.json()
session_info.update({
'num_cases': run_stats[0]['num_cases'],
'num_failures': run_stats[-1]['num_failures']
})
json_report = {
'session_info': session_info,
'runs': run_stats
}
report_file = generate_report_filename(report_file)
try:
with open(report_file, 'w') as fp:
json.dump(json_report, fp, indent=2)
except OSError as e:
printer.warning(
f'failed to generate report in {report_file!r}: {e}'
)

else:
printer.error("No action specified. Please specify `-l'/`-L' for "
"listing or `-r' for running. "