changes benchmarks in preparation of value caching #38
What type of PR is this?
/kind cleanup
What this PR does / why we need it:
The pod being scheduled is a parameter to all scheduler framework callbacks. Passing it on every callback has overhead, because the main communication channel in wasm is its memory: the host must marshal the pod to proto and copy it into the guest's memory, and unmarshaling on the guest side is more expensive still.
For example, a simple filter that unmarshals the pod takes roughly 65µs. Every callback that uses pod metadata would pay a similar cost, possibly 4 or more times per scheduling cycle.
This change makes overhead more visible by having benchmarks access all parameters.
Which issue(s) this PR fixes:
NONE
Special notes for your reviewer:
This change is missing tests and RATIONALE. I'm raising it for early review in case the approach isn't accepted.
Does this PR introduce a user-facing change?
NONE
What are the benchmark results of this change?
The results below show higher overhead than when only our examples are exercised, because more data is in use.
This makes any future optimization visible, since we now know the base cost when all parameters are evaluated.