First draft of codespeedization of test/perf #3813
Conversation
Great stuff!
Certainly not necessary to backport to 0.1. The only thing we need to do is have a version of the perf tests that codespeed should run. It doesn't make sense to run all the sort, cat, and blas tests right now. How about only enabling the
Nice!
I've updated this with the
* Change webhook notification URLs to julia.mit.edu (Linux) and criid.ee.washington.edu (OSX)
* Add `codespeed` target to test/perf/Makefile, which will run perfsuite and upload the result to a hardcoded codespeed installation location
* Note that `make codespeed` will attempt to `Pkg.add()` the packages `JSON` and `Curl` if not already installed.
* The sort subtests run a limited subset of 28 tests instead of almost 800
* Annotate all @time macro calls with an extra "Description" field, used to populate the codespeed database for the first time (a sketch of what such a call could look like follows below)
* Until extra build-time information is provided, expect environment variables such as JULIA_FLAVOR, JULIA_BRANCH, and JULIA_COMMIT_DATE on the command line when `make codespeed` is invoked.
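Since the "Description" annotation is what populates the codespeed database, here is a minimal sketch of what such an annotated timing call could look like. It is an illustration only, not the macro shipped in this PR; the names `@timeit_desc` and `results` are hypothetical.

```julia
# Hypothetical sketch, not the PR's actual code: time an expression several
# times, keep the best run, and record the name plus a human-readable
# "Description" so a codespeed database could be populated from it later.
const results = Dict{String,Tuple{Float64,String}}()   # name => (seconds, description)

macro timeit_desc(name, desc, ex)
    quote
        let best = Inf
            for _ in 1:5                    # a few repetitions, keep the minimum
                t0 = time()
                $(esc(ex))
                best = min(best, time() - t0)
            end
            results[$(esc(name))] = (best, $(esc(desc)))
        end
    end
end

# Usage example with an illustrative benchmark body:
@timeit_desc "fib" "Recursive fibonacci of 20" begin
    fib(n) = n < 2 ? n : fib(n - 1) + fib(n - 2)
    fib(20)
end
```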
👍 Fantastic stuff!
I'm going to pull the trigger tonight unless anyone has any comments.
As far as I'm concerned, you can merge whenever. This is great stuff and only touches test/perf, so it's safe.
How do we see the codespeed dashboard? I can imagine that this will start generating interesting data in a few days.
For now, you go to http://128.52.160.154/, although once we have enough data for it not to be embarrassing, we might want to change that to something like http://speed.julialang.org. :P
Love the speedy-Julia logo!
Yup, love the logo!
@StefanKarpinski Could you set up speed.julialang.org for this?
We certainly need fewer benchmarks here, and carefully chosen. @staticfloat Can we restrict to
Sure thing. I also have the tests roughly grouped. (The default grouping is to group by the folder they're in.) I'm also going to hack in a method of disabling a benchmark so it doesn't show up on the outside, yet can still be tracked, so that once we DO decide to show it on the outside, we can. This will take a little time, however, so for now I'll just disable the tests we're not using and remove the already-submitted results that we don't want.
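For what it's worth, deriving that default group from the directory layout is straightforward; the snippet below is a hypothetical illustration of walking test/perf and mapping each perf file to the folder it lives in, not code from this PR.

```julia
# Hypothetical sketch: map each perf file under test/perf to a codespeed-style
# group taken from the folder it lives in (e.g. .../micro/perf.jl -> "micro").
perf_root = joinpath("test", "perf")
for (dir, _, files) in walkdir(perf_root)
    for f in filter(name -> endswith(name, ".jl"), files)
        group = basename(dir)               # folder name becomes the group
        println("$(joinpath(dir, f)) -> group \"$group\"")
    end
end
```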
Summary:
* Add `codespeed` target to test/perf/Makefile, which will run perfsuite and upload the result to a hardcoded codespeed installation location (a rough sketch of the upload step follows below)
* `make codespeed` will attempt to `Pkg.add()` the packages `JSON` and `Curl` if not already installed.
* Exclude `sort` subtests for now, see 453b4ef
* Again, exclude the `sort` tests for now, as this integration with codespeed will have to wait.
* For now, empty descriptions.
* Until extra build-time information is provided, expect environment variables such as JULIA_FLAVOR, JULIA_BRANCH, and JULIA_COMMIT_DATE on the command line when `make codespeed` is invoked.

This might be my biggest pull request yet, but I seem to have finally gotten things to the point where they're uploading to codespeed regularly. I know it doesn't look like it due to the lack of history, but that's because I recently purged the memory of the codespeed installation.
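As a rough sketch of what that upload step might look like (and only a sketch; this is not the code in this PR): the example below assumes codespeed's usual /result/add/ endpoint, uses placeholder host and project names, and shells out to the curl binary rather than using the Curl and JSON packages the Makefile target actually pulls in.

```julia
# Hypothetical sketch of submitting one benchmark result to a codespeed
# instance. Endpoint, host, and project names are assumptions, not the PR's.
const CODESPEED_HOST = "http://julia.mit.edu"          # placeholder URL

function submit_result(benchmark::AbstractString, seconds::Real)
    fields = [
        "commitid"      => Base.GIT_VERSION_INFO.commit,
        "branch"        => get(ENV, "JULIA_BRANCH", "master"),
        "revision_date" => get(ENV, "JULIA_COMMIT_DATE", ""),
        "executable"    => get(ENV, "JULIA_FLAVOR", "julia"),
        "project"       => "Julia",
        "environment"   => gethostname(),
        "benchmark"     => benchmark,
        "result_value"  => string(seconds),
    ]
    args = String[]
    for (k, v) in fields
        push!(args, "--data-urlencode", "$k=$v")
    end
    # --data-urlencode implies a POST; the PR itself used the Curl package.
    run(`curl -s $args $CODESPEED_HOST/result/add/`)
end

submit_result("fib", 0.042)
```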
Comments and critiques are welcome! Once this gets merged, `master` will start to automagically get tracked; it fails right now due to the lack of the `codespeed` target in `test/perf/Makefile`. I'm not sure if we ever want to backport this to `release-0.1`; perhaps this should be a 0.2+ thing. :)

Next on my TODO list: