
Unreliable benchmarks. #1

Open
KaiserKarel opened this issue May 10, 2020 · 4 comments

Comments

@KaiserKarel

These benchmarks are not very reliable: the compiler may be optimizing away large pieces of the code under observation, and the way the benchmarks are run leaves room for measurement noise.

Using criterion should provide us with better statistics.
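For reference, a minimal sketch of what a Criterion benchmark could look like (the workload here is just a placeholder, not this crate's actual code). `black_box` hides values from the optimizer so the measured work is not optimized away, and Criterion handles warm-up, outlier detection, and the statistics:

```rust
use criterion::{black_box, criterion_group, criterion_main, Criterion};

// Placeholder workload standing in for the code under observation.
fn sum_of_squares(n: u64) -> u64 {
    (0..n).map(|i| i * i).sum()
}

fn bench_sum_of_squares(c: &mut Criterion) {
    c.bench_function("sum_of_squares 1000", |b| {
        // black_box prevents the compiler from constant-folding the call.
        b.iter(|| sum_of_squares(black_box(1000)))
    });
}

criterion_group!(benches, bench_sum_of_squares);
criterion_main!(benches);
```

This would live under `benches/`, with `criterion` added as a dev-dependency and `harness = false` set for the bench target in Cargo.toml.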

@KaiserKarel
Author

If you're accepting PRs, I can do some work on this after the 25th. I've got some experience writing benchmarks.

@sunli829
Collaborator

Great, PRs are welcome. 😁

@sunli829
Collaborator

That gives me enough time to optimize. 😂

@KaiserKarel
Author

Hehe; we'll see about that then ;)
