Conversation

@teojgo teojgo commented May 22, 2018

  • Use the analytics module provided by Cray.

  • Run a simple pyspark script to calculate pi (a rough sketch of such a script is shown below).

Closes #107
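
For reference, a Monte Carlo pi calculation in pyspark looks roughly like the sketch below. This is a generic illustration, not necessarily the script added by this PR, and the file name `spark_pi.py` is made up.

```python
# spark_pi.py -- generic Monte Carlo estimate of pi with pyspark (sketch).
import random

from pyspark import SparkContext


def inside(_):
    # Sample a random point in the unit square and check whether it
    # falls inside the quarter circle of radius 1.
    x, y = random.random(), random.random()
    return x * x + y * y < 1.0


if __name__ == '__main__':
    sc = SparkContext(appName='CalculatePi')
    n = 1000000
    count = sc.parallelize(range(n)).filter(inside).count()
    # Area ratio of quarter circle to unit square is pi/4.
    print('Pi is roughly %f' % (4.0 * count / n))
    sc.stop()
```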

@teojgo teojgo added this to the ReFrame sprint 2018w20 milestone May 22, 2018
@teojgo teojgo self-assigned this May 22, 2018
@teojgo teojgo requested a review from vkarak May 22, 2018 11:28
from reframe.core.launchers.registry import getlauncher


class SparkAnalyticsCheck(RunOnlyRegressionTest):

Review comment:
I suggest using the new-style checks.
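
For illustration, a decorator-registered ("new-style") version of this check might look roughly as follows. The decorator syntax matches what later ReFrame releases use; the system list, executable line, and output regex are assumptions rather than code taken from this PR.

```python
import math

import reframe as rfm
import reframe.utility.sanity as sn


@rfm.simple_test
class SparkAnalyticsCheck(rfm.RunOnlyRegressionTest):
    def __init__(self):
        super().__init__()
        self.descr = 'Compute pi with a simple pyspark script'
        self.valid_systems = ['daint:gpu']             # assumed system
        self.valid_prog_environs = ['PrgEnv-cray']
        self.modules = ['analytics']
        self.executable = 'spark-submit spark_pi.py'   # assumed script name
        # Deferred extraction of the computed value from the job output;
        # the regex assumes the script prints 'Pi is roughly <value>'.
        pi_value = sn.extractsingle(r'Pi is roughly (?P<pi>\S+)',
                                    self.stdout, 'pi', float)
        self.sanity_patterns = sn.assert_lt(sn.abs(pi_value - math.pi), 0.01)
```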

self.valid_prog_environs = ['PrgEnv-cray']

self.modules = ['analytics']

Review comment:
Remove the blank lines here.

pi_diff = sn.abs(pi_value - pi_reference)

self.sanity_patterns = sn.assert_lt(pi_diff, 0.01)

Review comment:
Remove blank lines.

pi_reference = math.pi
pi_diff = sn.abs(pi_value - pi_reference)

self.sanity_patterns = sn.assert_lt(pi_diff, 0.01)

Review comment:
If it fits on a single line, you may replace `pi_diff` with the `sn.abs()` expression directly.
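
Applied to the snippet above, the suggestion amounts to something like the line below (assuming `pi_value` is the deferred value extracted from the job output, as in the surrounding code):

```python
# Assert directly on the absolute deviation from math.pi.
self.sanity_patterns = sn.assert_lt(sn.abs(pi_value - math.pi), 0.01)
```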

self.num_tasks = 72
self.num_tasks_per_node = 18
super().setup(partition, environ, **job_opts)
self.job.launcher = getlauncher('local')()

Review comment:
Put a small comment here explaining why you need to do that.
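
Such a comment might look like the sketch below; the stated reason is an assumption on my part, namely that Spark distributes its executors over the allocation itself, so the run command must not go through the parallel launcher:

```python
def setup(self, partition, environ, **job_opts):
    super().setup(partition, environ, **job_opts)
    # Spark manages the placement of its executors on the allocated
    # nodes itself, so the run command must not be wrapped in the
    # parallel launcher (e.g. srun); use the local launcher instead.
    self.job.launcher = getlauncher('local')()
```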

self.num_tasks_per_node = 12
else:
self.num_tasks = 72
self.num_tasks_per_node = 18

Review comment:
Leave a blank line here.

@vkarak vkarak merged commit f40bb48 into master May 24, 2018
@vkarak vkarak deleted the regression-test/spark_check branch May 24, 2018 07:36