
Conversation

@rsarm rsarm commented Jul 3, 2019

Fixes #644.

This is a simple test to try this feature.

import reframe as rfm
import reframe.utility.sanity as sn


@rfm.simple_test
class UbuntuContainerCheck(rfm.RunOnlyRegressionTest):
    def __init__(self):
        super().__init__()
        self.descr = 'Run some commands inside a container'
        self.valid_systems = ['*']
        self.valid_prog_environs = ['*']

        self.container_platform = 'Docker'
        self.container_platform.image = 'ubuntu:18.04'
        self.container_platform.workdir = '/workdir'
        self.container_platform.commands = [
            'pwd', 'ls', 'cat /etc/os-release']

        self.sanity_patterns = sn.all([
            sn.assert_found(r'^' + self.container_platform.workdir,
                self.stdout),
            sn.assert_found(r'^dummy_src.txt', self.stdout),
            sn.assert_found(r'18.04.2 LTS \(Bionic Beaver\)', self.stdout),
            ])

I also added a --rm flag to the docker run command after discussing with @teojgo. Without it, the container stays in the stopped state after the test ends; --rm takes care of completely cleaning up the container after the command exits.
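For illustration, here is a minimal sketch of how a docker run invocation with --rm might be assembled from the test's container settings. The function name and command layout here are hypothetical stand-ins, not ReFrame's actual implementation:

```python
def build_docker_cmd(image, workdir, commands):
    """Assemble an illustrative `docker run` command line.

    `--rm` removes the container as soon as the command exits,
    so no stopped container is left behind after the test ends.
    """
    inner = ' && '.join(commands)
    return (f"docker run --rm -v $PWD:{workdir} -w {workdir} "
            f"{image} bash -c '{inner}'")

cmd = build_docker_cmd('ubuntu:18.04', '/workdir',
                       ['pwd', 'ls', 'cat /etc/os-release'])
print(cmd)
```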

pep8speaks commented Jul 3, 2019

Hello @rsarm, Thank you for updating!

Cheers! There are no PEP8 issues in this Pull Request! Do see the ReFrame Coding Style Guide.

Comment last updated at 2019-10-14 10:57:58 UTC

@rsarm rsarm changed the title [feat] Enable running containers inside the test pipeline WIP: [feat] Enable running containers inside the test pipeline Jul 3, 2019
@rsarm rsarm changed the title WIP: [feat] Enable running containers inside the test pipeline [feat] Enable running containers inside the test pipeline Jul 4, 2019
@vkarak vkarak changed the title [feat] Enable running containers inside the test pipeline [wip] [feat] Enable running containers inside the test pipeline Jul 4, 2019
@vkarak vkarak left a comment


Well done @rsarm! Works fine. I have some minor comments regarding this PR. I will also mark it as WIP, because ideally I would like to have the configuration part and the documentation also part of this before merging. Also, we should find a way to better test the container-specific code inside the pipeline. Perhaps, install Docker in the Travis and have a unit test that will use the example test file you mention in this PR?

vkarak commented Jul 5, 2019

@rsarm Can you also merge with master to get the latest changes and resolve the conflicts?

codecov-io commented Jul 5, 2019

Codecov Report

Merging #853 into master will decrease coverage by 0.05%.
The diff coverage is 85.71%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #853      +/-   ##
==========================================
- Coverage   91.88%   91.82%   -0.06%     
==========================================
  Files          80       80              
  Lines       10481    10674     +193     
==========================================
+ Hits         9630     9801     +171     
- Misses        851      873      +22
Impacted Files Coverage Δ
reframe/core/containers.py 100% <100%> (+17.02%) ⬆️
reframe/core/pipeline.py 92.05% <57.89%> (-1.43%) ⬇️
reframe/core/config.py 82.9% <66.66%> (-2.81%) ⬇️
unittests/test_pipeline.py 94.25% <76.4%> (-2.47%) ⬇️
reframe/core/systems.py 87.96% <83.33%> (-0.22%) ⬇️
unittests/test_containers.py 96.33% <98.59%> (+3.14%) ⬆️
unittests/fixtures.py 83.33% <0%> (+2.77%) ⬆️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 2c4cffd...b099fb4. Read the comment docs.

@vkarak vkarak removed this from the ReFrame sprint 2019w23 milestone Jul 13, 2019
@victorusu victorusu added this to the ReFrame sprint 2019w29 milestone Jul 16, 2019
@vkarak vkarak left a comment


The unit tests need to be extended:

  1. You need to add a test for emit_prepare_cmds().
  2. You need to test the part of pipeline.py that handles the containers.
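A hedged sketch of the kind of unit test being requested: check that the commands emitted for a container platform contain the expected pieces. DockerPlatform and emit_prepare_cmds below are simplified stand-ins, not ReFrame's real classes:

```python
class DockerPlatform:
    """Illustrative stand-in for a container platform class."""

    def __init__(self, image):
        self.image = image

    def emit_prepare_cmds(self):
        # e.g., pull the image before the test runs
        return [f'docker pull {self.image}']


def test_emit_prepare_cmds():
    platform = DockerPlatform('ubuntu:18.04')
    assert platform.emit_prepare_cmds() == ['docker pull ubuntu:18.04']


test_emit_prepare_cmds()
```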

@vkarak vkarak left a comment


There are still a couple of things to fix:

  1. Unit tests, some comments are not resolved yet.
  2. I don't like that the emit_*_cmds() are inconsistent. One returns a string, while all the others return a list.
  3. Other minor stuff.

I will fix these issues.
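On point 2, a common way to resolve such an inconsistency is to normalize every emit_*_cmds() return value to a list of strings. This is a generic sketch of that idea, not the fix that was actually applied in the PR:

```python
def ensure_list(value):
    """Normalize a commands value to a list of strings.

    Accepts either a single command string or an iterable of
    command strings, so callers can treat both cases uniformly.
    """
    if isinstance(value, str):
        return [value]
    return list(value)


print(ensure_list('module load sarus'))  # ['module load sarus']
print(ensure_list(['pwd', 'ls']))        # ['pwd', 'ls']
```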

@vkarak vkarak left a comment


@rsarm This PR is now almost ready to go. The only remaining issues are a strange error I get with the Singularity unit tests on Daint, plus the missing Sarus module in the Dom config, which I have removed.


vkarak commented Oct 14, 2019

This PR is now ready. Thanks @rsarm.

I am disabling the Singularity unit test if running on Cray system with CLE6.

@vkarak vkarak merged commit 6285793 into reframe-hpc:master Oct 14, 2019
@rsarm rsarm deleted the containers-pipeline branch November 26, 2019 13:02

Successfully merging this pull request may close these issues.

Add support for containers
