
Delayed metrics reroll config #760

Merged

Conversation

ChandraAddala
Contributor

Introducing a new configuration, DELAYED_METRICS_REROLL_GRANULARITY, which controls the granularity up to which we re-roll only the delayed metrics instead of the entire slot.
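
A minimal sketch of the setting's intent, assuming the value is one of the MIN_* granularity names in the config file; the class, enum, and method names below are illustrative stand-ins, not Blueflood's actual code:

import java.util.Arrays;

// Illustrative only: up to the configured granularity we re-roll just the delayed
// metrics; beyond it we fall back to re-rolling the entire slot.
public class DelayedRerollPolicy {

    // Stand-in for Blueflood's Granularity levels, ordered finest to coarsest.
    enum Gran {
        MIN_5(5), MIN_20(20), MIN_60(60), MIN_240(240), MIN_1440(1440);
        final int minutes;
        Gran(int minutes) { this.minutes = minutes; }
    }

    private final Gran rerollThreshold;

    DelayedRerollPolicy(String configuredValue) {
        // e.g. DELAYED_METRICS_REROLL_GRANULARITY=MIN_60 (assumed properties-style entry)
        this.rerollThreshold = Gran.valueOf(configuredValue);
    }

    // True if, at this rollup level, only the delayed metrics should be re-rolled.
    boolean rerollOnlyDelayed(Gran rollupLevel) {
        return rollupLevel.minutes <= rerollThreshold.minutes;
    }

    public static void main(String[] args) {
        DelayedRerollPolicy policy = new DelayedRerollPolicy("MIN_60");
        Arrays.stream(Gran.values()).forEach(g ->
                System.out.println(g + " -> re-roll only delayed? " + policy.rerollOnlyDelayed(g)));
    }
}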

@coveralls

Coverage Status

Coverage increased (+0.01%) to 76.046% when pulling 7effc50 on ChandraAddala:delayed-metris-reroll-config into 1faece3 on rackerlabs:master.

Contributor

@shintasmith left a comment


the rest looks good to me

@@ -2,6 +2,7 @@

## IN PROGRESS
* Upgraded datastax driver to 3.1.2
* Added new configuration option DELAYED_METRICS_REROLL_GRANULARITY
Contributor

YAY!!!!!

We probably need to document this (and the previous Rollup configs) more, but it doesn't have to be in the Release Notes. Maybe add something in the wiki later?

Contributor

Let's add it to the wiki as part of this story. I think mentioning it in the release notes is good, so people know which versions it's valid for.

Contributor Author

Once everything is merged to master, I plan on writing a wiki page just for delayed metrics explaining various configs and how to tweak them.

*
* For example, a slot key of granularity
* {@link Granularity#MIN_1440 MIN_1440}, for destination granularity of
* {@link Granularity#MIN_60 MIN_60}, will have 6*4=24 children.
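
The 6*4=24 in the example follows from the ratios between adjacent levels, assuming the standard granularity ladder: a MIN_1440 slot key spans 1440/240 = 6 MIN_240 keys, each of which spans 240/60 = 4 MIN_60 keys, so there are 6 × 4 = 24 descendant keys at the MIN_60 level.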
Contributor

Curious: is this a normal example? I mean, do we normally ask to go from 1-day (1440m) granularity to 1-hour (60m) granularity, skipping the 240m granularity?

Also, the term "children" is not as clear, though I realize it's probably been used in other places. At first I thought 1440m was a child of 240m. I don't know if it's possible to change to "finer" and "coarser".

Contributor Author

There is a getChildrenKeys above which lists children at all sub-levels. This method uses that and only returns the level we are interested in. I just followed the terminology used there.

public Collection<SlotKey> getChildrenKeys(Granularity destGranularity) {

    if (!getGranularity().isCoarser(destGranularity)) {
        throw new IllegalArgumentException("Current granularity must be coarser than the destination granularity");
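
A minimal sketch of the filtering pattern described in the reply above, assuming a no-argument getChildrenKeys() that enumerates descendant keys at every sub-level; this is illustrative and may not match the PR's actual implementation:

// Fragment sketch (assumes java.util imports and the surrounding SlotKey class).
// Reuse the broader enumeration and keep only the keys at the requested level.
public Collection<SlotKey> getChildrenKeysAtLevel(Granularity destGranularity) {
    if (!getGranularity().isCoarser(destGranularity)) {
        throw new IllegalArgumentException("Current granularity must be coarser than the destination granularity");
    }
    List<SlotKey> keysAtLevel = new ArrayList<SlotKey>();
    for (SlotKey child : getChildrenKeys()) {          // children at all sub-levels
        if (child.getGranularity().equals(destGranularity)) {
            keysAtLevel.add(child);
        }
    }
    return keysAtLevel;
}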
Contributor

Can you print the 2 granularities involved in the Exception msg?
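
One way the suggested message could look, sketched against the excerpt above (illustrative, not the committed change):

// Hypothetical variant of the guard above that names both granularities.
if (!getGranularity().isCoarser(destGranularity)) {
    throw new IllegalArgumentException(String.format(
            "Current granularity [%s] must be coarser than the destination granularity [%s]",
            getGranularity(), destGranularity));
}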

@coveralls

Coverage Status

Coverage increased (+0.04%) to 76.074% when pulling 03f3fb3 on ChandraAddala:delayed-metris-reroll-config into 1faece3 on rackerlabs:master.

Contributor

@shintasmith left a comment

lgtm

@ChandraAddala merged commit 98e273b into rax-maas:master on Dec 5, 2016