
add check_opt_vs_noopt_jit option to check that results between baseline and optimized jitted versions match#249

Merged
Krovatkin merged 2 commits into master from krovatkin/check_results2
Mar 2, 2021

Conversation


@Krovatkin Krovatkin commented Feb 9, 2021

add check_opt_vs_noopt_jit option to check that results between baseline and optimized jitted versions match

model.train(train)


def check_results(self):

I'm wondering how this is intended to be used. If it's mainly run locally by JIT developers, it won't affect all users of the benchmark. Should we make it slightly less intrusive by not requiring a `check_results` override for models that don't use the JIT? Is it sufficient to try/except `NotImplementedError` inside this function to skip the models that aren't supported?
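The suggested try/except pattern could be sketched roughly as follows; the `Model`/`run_check` names here are illustrative assumptions, not the benchmark's actual API:

```python
import warnings


class Model:
    """Hypothetical base class standing in for a benchmark model."""

    def check_results(self):
        # Default: result checking is not supported for this model.
        raise NotImplementedError


def run_check(model):
    """Run check_results if implemented; warn and skip otherwise."""
    try:
        model.check_results()
        return True
    except NotImplementedError:
        warnings.warn(
            f"{type(model).__name__} doesn't support check_results; skipping")
        return False
```

With this shape, models that don't use the JIT simply inherit the default and get skipped, rather than every model author having to implement an override.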

@Krovatkin Krovatkin force-pushed the krovatkin/check_results2 branch from 220a1c9 to 4343589 Compare February 12, 2021 23:43
]

if model_name in model_blacklist:
warnings.warn(UserWarning(f"{model_name}.get_module() doesn't support `check_results` yet!"))
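The blacklist check quoted above can be sketched end-to-end as below; the model names are illustrative placeholders, not the PR's actual list:

```python
import warnings

# Illustrative placeholder names; the PR's real blacklist differs.
model_blacklist = [
    "model_a",
    "model_b",
]


def should_skip(model_name):
    """Warn and skip models whose modules can't be result-checked yet."""
    if model_name in model_blacklist:
        warnings.warn(UserWarning(
            f"{model_name}.get_module() doesn't support `check_results` yet!"))
        return True
    return False
```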

Is this just WIP (i.e., you have both a blacklist and the try/except), or do you need both for some reason?

@Krovatkin Krovatkin Feb 17, 2021


@wconstab Unfortunately, for some benchmarks the results don't match, and others are set up in a way where I can't get at their jitted module to reset the optimization setting and recompile without optimizations.
I added some comments.
Eventually, I'm hoping to enable result checking for as many models as I can.
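The "results match" comparison itself amounts to an elementwise closeness check between the baseline and optimized outputs. A minimal dependency-free sketch in the spirit of `torch.allclose` (the function name and tolerances here are assumptions, not the PR's code):

```python
import math


def results_match(baseline, optimized, rtol=1e-4, atol=1e-5):
    """Return True if the two result sequences agree within tolerance.

    Mirrors the usual rel/abs tolerance semantics of torch.allclose,
    but on plain Python floats so the sketch stays self-contained.
    """
    if len(baseline) != len(optimized):
        return False
    return all(
        math.isclose(b, o, rel_tol=rtol, abs_tol=atol)
        for b, o in zip(baseline, optimized)
    )
```

A tolerance-based comparison matters here because JIT optimizations (e.g. fusion) can legitimately reorder floating-point operations, so bit-exact equality is too strict a bar.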

@Krovatkin Krovatkin force-pushed the krovatkin/check_results2 branch from 4343589 to fbcca56 Compare February 17, 2021 22:04
@Krovatkin Krovatkin requested a review from wconstab February 18, 2021 17:02

@wconstab wconstab left a comment


LGTM, other than one nit: you could add a comment and/or rename the `check_results` function to explain that the idea is verifying that the optimized JIT produces the same results as the baseline JIT.

(This disambiguates it from the also-important but unaddressed case of making sure the model is producing correct results in the first place.)

@Krovatkin Krovatkin changed the title from "add check_results option for inference runs" to "add check_opt_vs_noopt_jit option to check that results between baseline and optimized jitted versions match" Mar 2, 2021
@Krovatkin Krovatkin merged commit ff0b662 into master Mar 2, 2021

4 participants