feat(metrics): add downloadIntelligenceModelsReqResp to latency metrics #3549
Conversation
public accumulateLatency(callback: () => Promise<unknown>, key: PreComputedLatencies) {
  const start = performance.now();

  return callback().finally(() => {
    const currentLatency = this.precomputedLatencies.get(key) ?? 0;
    this.saveLatency(key, currentLatency + (performance.now() - start), true);
  });
}
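The helper can be exercised outside the plugin. A minimal standalone sketch, assuming a plain Map in place of the plugin's precomputedLatencies store (the key string is illustrative, not the real metric key):

```typescript
// Standalone model of accumulateLatency: times an async callback and
// adds the elapsed duration to whatever is already stored under the key.
const precomputedLatencies = new Map<string, number>();

async function accumulateLatency(
  callback: () => Promise<unknown>,
  key: string
): Promise<unknown> {
  const start = performance.now();
  return callback().finally(() => {
    // Fall back to 0 so a missing key does not poison the sum.
    const current = precomputedLatencies.get(key) ?? 0;
    precomputedLatencies.set(key, current + (performance.now() - start));
  });
}

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));
const key = 'internal.client.downloadIntelligenceModels'; // illustrative key

await accumulateLatency(() => sleep(10), key);
const afterFirst = precomputedLatencies.get(key) ?? 0;

await accumulateLatency(() => sleep(10), key);
const afterSecond = precomputedLatencies.get(key) ?? 0;

// The second measurement is added on top of the first, not replacing it.
console.log(afterFirst > 0 && afterSecond > afterFirst);
```

Each call stacks its own elapsed time on top of the stored value, which is the accumulating behavior the new downloadIntelligenceModelsReqResp metric relies on.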
We already support this; see the saveLatency overwrite param.
This is a special case of overwrite where we want to add the current value to the new one before overwriting it. It is doable with measureLatency as well, but we would need access to the current value, and I really do not want to use precomputedLatencies inside cantina. Actually, precomputedLatencies should be private.
Never mind, I just checked again and saveLatency does what you described. The overwrite name is a bit misleading, because it is not just a simple overwrite; it also accumulates values.
measureLatency exposes the overwrite parameter too, so you don't have to access precomputedLatencies directly in cantina.
one issue
@@ -93,31 +94,31 @@ export default class CallDiagnosticLatencies extends WebexPlugin {
 * Store precomputed latency value
 * @param key - key
 * @param value - value
 * @param overwrite - overwrite existing value or add it
 * @param accumulate - when it true, it overwrite existing value with sum of the current value and the new measurement otherwise just store the new measurement
its
overwrites
 * @throws
 * @returns
 */
public saveLatency(key: PreComputedLatencies, value: number, overwrite = true) {
  const existingValue = overwrite ? 0 : this.precomputedLatencies.get(key) || 0;
public saveLatency(key: PreComputedLatencies, value: number, accumulate = true) {
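The renamed flag can be modeled in isolation. A sketch of the semantics the review converges on below (default overwrite; accumulate = true sums with the stored value); the Map and key are stand-ins for the plugin's internals, not the real implementation:

```typescript
// Minimal model of saveLatency after the overwrite -> accumulate rename.
const precomputedLatencies = new Map<string, number>();

function saveLatency(key: string, value: number, accumulate = false): void {
  // accumulate = true: add the new measurement to the stored value;
  // accumulate = false (default): the new measurement replaces it.
  const existing = accumulate ? (precomputedLatencies.get(key) ?? 0) : 0;
  precomputedLatencies.set(key, existing + value);
}

saveLatency('internal.client.pageJMT', 10);       // store 10
saveLatency('internal.client.pageJMT', 5, true);  // 10 + 5 = 15
saveLatency('internal.client.pageJMT', 7);        // default overwrites -> 7
console.log(precomputedLatencies.get('internal.client.pageJMT')); // 7
```

With the default inverted this way, plain calls behave like the old `overwrite = true`, and only explicit `accumulate = true` callers get the summing behavior.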
I've just realised we've changed the meaning here too. Should it accumulate?
- accumulate = true: it overwrites.
- accumulate = false: it accumulates.
Also, the default for this method should be to overwrite.
So maybe change the default value and swap the ternary statement outcomes on line 102.
good point, I fixed all of them
one small issue
e5f1347 to 41a1c80
it('should save latency correctly when accumulate is true', () => {
  assert.deepEqual(cdl.precomputedLatencies.size, 0);
  cdl.saveLatency('internal.client.pageJMT', 10, false);
  cdl.saveLatency('internal.client.pageJMT', 10, true);
  assert.deepEqual(cdl.precomputedLatencies.size, 1);
  assert.deepEqual(cdl.precomputedLatencies.get('internal.client.pageJMT'), 10);
});
This doesn't seem right? I mean, it doesn't make sense now, but I think it didn't make sense before either.
It looks good to me. It tests the case where there is no current value. We can end up with NaN here if the missing value is not handled correctly.
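The NaN case is worth spelling out: Map.get returns undefined for a missing key, and adding a number to undefined yields NaN, so the `?? 0` / `|| 0` fallback is exactly what this test guards. A minimal sketch:

```typescript
const latencies = new Map<string, number>();

// Without a fallback, a missing key poisons the sum: undefined + 10 is NaN.
// (The non-null assertion only silences the compiler, not the runtime.)
const unsafe = latencies.get('internal.client.pageJMT')! + 10;
console.log(Number.isNaN(unsafe)); // true

// With the `?? 0` guard the first measurement is stored cleanly.
const safe = (latencies.get('internal.client.pageJMT') ?? 0) + 10;
console.log(safe); // 10
```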
});

it('checks measureLatency when callBack rejects', async () => {
  const key = 'internal.client.pageJMT';
  const overwrite = true;
  const accumulate = true;
In order to keep the test the same as before, we should say accumulate = false, as before it was overwrite = true.
Some more comments, but none really blocking.
Also, there are a lot of linting changes; not sure if we should add those.
@torgeadelin I think the linting is coming from a recently merged PR. There are not too many, and all are related to the metrics plugin, so I think it is OK.
@@ -149,7 +150,8 @@ describe('internal-plugin-metrics', () => {
});

it('fails if preLoinId is not set', async () => {
  webex.internal.newMetrics.callDiagnosticMetrics.preLoginMetricsBatcher.preLoginId = undefined;
  webex.internal.newMetrics.callDiagnosticMetrics.preLoginMetricsBatcher.preLoginId =
    undefined;
Why this change?
// avoid setting .sent timestamp
webex.internal.newMetrics.callDiagnosticMetrics.preLoginMetricsBatcher.prepareRequest = (
  q
) => Promise.resolve(q);
Again, why has this changed?
Are the changes in indentation intentional? It seems like some are but some are not.
Please check latest comment
key: PreComputedLatencies,
overwrite = false
accumulate = false
Are you planning to change the definition in Cantina too? I think we should.
Yes, I already have the PR for that
https://sqbu-github.cisco.com/WebExSquared/webex-web-client/pull/5653
We don't have accumulateLatency anymore, right?
@shnaaz I just updated my branch to the latest beta and ran formatting; probably these lines were not correctly formatted. I can roll it back if you want to.
Just roll back the couple of linter changes we discussed; otherwise looks good.
2425295 to c730d70
COMPLETES SPARK-503933
This pull request addresses
Implement new downloadIntelligenceModelsReqResp latency for CA metrics
by making the following changes
Updated the types and added a new utility function called accumulateLatency
Change Type
The following scenarios were tested
< ENUMERATE TESTS PERFORMED, WHETHER MANUAL OR AUTOMATED >
I certify that
I have read and followed contributing guidelines
I discussed changes with code owners prior to submitting this pull request
I have not skipped any automated checks
All existing and new tests passed
I have updated the documentation accordingly
Make sure to have followed the contributing guidelines before submitting.