Inside the results folder there should be a README.md that explains quite explicitly what results we are looking for and under what circumstances they were gathered.
In BunnyMark, this should require:
Report the maximum number of bunnies you can achieve while maintaining 60 FPS.
Close ALL OTHER applications and windows while running the benchmark.
Disable all but one monitor/display.
Set your monitor's display resolution to 1920x1080 if possible; otherwise, note the deviation in your report.
All the bunnymarks should also display in the same-sized window, if possible. Details like this should be standardized across benchmarks if we're going to get meaningful results.
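To make the conditions above easy to standardize, they could live in one small shared settings file that every benchmark reads. Here's a minimal sketch; all names and the window size value are illustrative assumptions, not decisions from this issue:

```python
# Illustrative shared benchmark settings; values mirror the conditions
# listed above. The exact window size is a placeholder -- the issue only
# asks that it be identical across all frameworks.
BENCHMARK_SETTINGS = {
    "target_fps": 60,                      # maintain this while adding bunnies
    "display_resolution": (1920, 1080),    # note any deviation in your report
    "window_size": (800, 600),             # hypothetical; must match across frameworks
    "displays_enabled": 1,                 # disable all but one monitor
    "other_apps_closed": True,             # close everything else while running
}

# Each framework's harness would read these instead of hard-coding them.
width, height = BENCHMARK_SETTINGS["window_size"]
print(f"Running bunnymark in a {width}x{height} window "
      f"at {BENCHMARK_SETTINGS['target_fps']} FPS target")
```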
Each test should have a results folder
Frameworks_test\Bunnymark\results
and the same for any future benchmarks.
Each results folder should have a file for each framework:
Etc.
Then, people should run benchmarks and add their results to the relevant file in a standard format.
I'm thinking of something simple like this for bunnymark (the assumption being the greatest number of bunnies while maintaining 60 FPS), perhaps in JSON format?
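As a starting point for that standard format, one result entry could look like the sketch below. Every field name and value here is a hypothetical suggestion (the issue doesn't fix a schema), but it captures the number of bunnies at 60 FPS plus the system details that make results comparable:

```python
import json

# Hypothetical JSON schema for a single bunnymark result entry.
# Field names are illustrative assumptions, not defined by this issue.
result = {
    "framework": "ExampleFramework",     # which framework was tested
    "max_bunnies_at_60fps": 12000,       # the headline number
    "display_resolution": "1920x1080",   # note deviations per the rules above
    "window_size": "800x600",            # should match across frameworks
    "os": "Windows 10",
    "cpu": "Intel Core i5-4690K",
    "gpu": "NVIDIA GTX 970",
    "notes": "",                         # anything nonstandard about the run
}

# Serialize to the JSON that would be appended to the framework's results file.
print(json.dumps(result, indent=2))
```

Keeping one JSON object per run would let later tooling aggregate or chart the results without parsing free-form prose.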
My crashdumper library has system scanning utilities, btw:
https://github.com/larsiusprime/crashdumper