
Make metrics deactivable for tests #905

Merged (5 commits into opendatateam:dev, Jun 7, 2017)

Conversation

l-vincent-l (Contributor):

We observe a gain of about 10% on the runtime of the tests.

@davidbgk (Member):

Nice 👍

[screenshot: capture d'écran 2017-05-15 à 16:26:49]

I hoped it would be more than that but it's better than nothing :)

@@ -14,6 +14,7 @@
 from udata.models import db  # noqa: need metrics refactoring
+from flask import current_app
Member:

Import this alongside the third-party dependencies above.

@@ -111,10 +112,12 @@ def iso(self, value):
             'Unsupported format: {0} ({1})'.format(value, type(value)))

     def trigger_update(self):
         self.need_update.send(self)
+        if not current_app.config['TESTING'] or current_app.config.get('FORCE_METRICS'):
Member:

Worth an entry in settings' documentation?

Contributor:

Yep, it should be documented, but I'd prefer a USE_METRICS or ENABLE_METRICS setting instead of FORCE_METRICS (see next comment).
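The condition being discussed can be sanity-checked in isolation. In this sketch a plain dict stands in for current_app.config (an assumption made only to keep the example self-contained):

```python
def metrics_enabled(config):
    # Mirrors the guard from the diff: metrics run outside of tests,
    # or inside tests when FORCE_METRICS is explicitly set.
    return not config['TESTING'] or bool(config.get('FORCE_METRICS'))

print(metrics_enabled({'TESTING': False}))                        # production: True
print(metrics_enabled({'TESTING': True}))                         # plain test: False
print(metrics_enabled({'TESTING': True, 'FORCE_METRICS': True}))  # opted-in test: True
```

The double negative here is part of what motivates the naming discussion: a single positive flag would make the call sites read more directly.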


     def notify_update(self):
         self.updated.send(self)
+        if not current_app.config['TESTING'] or current_app.config.get('FORCE_METRICS'):
Member:

I'm not sure we want to restrict that to TESTING; I'm thinking about local loads of backups, for instance, but that might be another issue. @noirbizarre, thoughts?

Contributor:

Instead of testing both TESTING and FORCE_METRICS, I think it would be cleaner to have a single USE_METRICS defined to True in udata.settings.Defaults and overridden to False in udata.settings.Testing, so that:

  • runtime code doesn't need any knowledge of whether we are testing
  • all testing parameters live in the same place
  • activating metrics in a test is simply setting config.USE_METRICS to True

@davidbgk I think the local loading of backups is not performed inside the application, so metrics are not triggered.
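A sketch of that single-flag layout; the class names follow the udata.settings modules mentioned above, but the exact attributes shown here are assumptions:

```python
class Defaults:
    """Sketch of udata.settings.Defaults: settings for normal runs."""
    TESTING = False
    USE_METRICS = True   # metrics are computed by default

class Testing(Defaults):
    """Sketch of udata.settings.Testing: overrides for the test suite."""
    TESTING = True
    USE_METRICS = False  # disabled by default; a test can flip it back on

# Model code then checks a single positive flag and needs no view on TESTING:
#   if current_app.config['USE_METRICS']:
#       ...compute metrics...
```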

@@ -37,6 +37,7 @@ def create_app(self):
         return app

     def test_new_discussion(self):
+        self.app.config['FORCE_METRICS'] = True
Member:

Why not in the setUp of that class if you have to set it everywhere?

Contributor:

Agreed 👍: you should factorize.
Even better, write a MetricsSettings class for this case; see https://github.com/opendatateam/udata/blob/master/udata/tests/frontend/test_markdown.py#L16-L21 for an example (it's the pattern we use when we need to activate some settings).

Contributor (Author):

This was done on purpose: I wanted to activate it as little as possible.
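For reference, the settings-class pattern suggested above could look roughly like this; BaseTestCase and its create_app_config hook are hypothetical stand-ins for the project's real TestCase/create_app machinery:

```python
class MetricsSettings:
    """Settings overlay enabling metrics for every test in a class."""
    FORCE_METRICS = True

class BaseTestCase:
    """Hypothetical stand-in for the project's TestCase: create_app is
    assumed to merge the `settings` class attribute into the app config."""
    settings = None

    def create_app_config(self):
        config = {'TESTING': True}
        if self.settings is not None:
            # Copy only UPPERCASE attributes, Flask-config style.
            config.update({k: v for k, v in vars(self.settings).items()
                           if k.isupper()})
        return config

class DiscussionMetricsTest(BaseTestCase):
    settings = MetricsSettings  # metrics on for the whole test class

print(DiscussionMetricsTest().create_app_config())
# {'TESTING': True, 'FORCE_METRICS': True}
```

Each test class then states its configuration once, instead of flipping config flags inside individual test methods.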

@@ -37,6 +37,7 @@ def create_app(self):
         return app

     def test_new_issue(self):
+        self.app.config['FORCE_METRICS'] = True
Member:

Same here.



@@ -102,6 +102,9 @@ def test_request_transfer_to_same_organization(self):

 class TransferAcceptTest(TestCase, DBTestMixin):
+    def setUp(self):
+        self.app.config['USE_METRICS'] = True
Contributor:

Don't forget the super().setUp() call.
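A minimal illustration of why that super() call matters (the base class body here is hypothetical): without it, the fixtures created by the parent setUp never exist.

```python
class TestCase:
    """Hypothetical base: its setUp builds the app/fixtures."""
    def setUp(self):
        self.app_config = {'TESTING': True}

class TransferAcceptTest(TestCase):
    def setUp(self):
        super().setUp()  # omit this and self.app_config is never created
        self.app_config['USE_METRICS'] = True

t = TransferAcceptTest()
t.setUp()
print(t.app_config)  # {'TESTING': True, 'USE_METRICS': True}
```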

@noirbizarre noirbizarre modified the milestone: 1.1 May 26, 2017
@noirbizarre noirbizarre merged commit 762701a into opendatateam:dev Jun 7, 2017