
First draft of codespeedization of test/perf #3813

Merged: 1 commit into master on Jul 25, 2013

Conversation

@staticfloat (Member)

Summary:

* Change webhook notification URLs to julia.mit.edu (Linux) and criid.ee.washington.edu (OSX)
* Add a `codespeed` target to test/perf/Makefile, which runs the perf suite and uploads the results to a hardcoded codespeed installation
  * Note that `make codespeed` will attempt to `Pkg.add()` the packages `JSON` and `Curl` if they are not already installed.
  * Exclude the sort subtests for now; see 453b4ef
* Annotate all `@time` macro calls with an extra "Description" field, used to populate the codespeed database for the first time (a sketch follows this list)
  * Again, exclude the sort tests for now, as their codespeed integration will have to wait; for now, they get empty descriptions.
* Until extra build-time information is provided, expect environment variables such as JULIA_FLAVOR, JULIA_BRANCH, and JULIA_COMMIT_DATE on the command line when `make codespeed` is invoked.
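
For concreteness, here is a minimal sketch of the kind of annotation described above. The macro name `@timeit`, the `RESULTS` accumulator, and the recorded fields are illustrative assumptions, not the exact code in this PR:

```julia
# Hypothetical sketch: time an expression and record a short benchmark name
# plus a human-readable description, to be uploaded to codespeed afterwards.
const RESULTS = Any[]

macro timeit(name, desc, ex)
    quote
        local t = @elapsed $(esc(ex))
        push!(RESULTS, (name = $(esc(name)), desc = $(esc(desc)), seconds = t))
    end
end

# Example: a benchmark call annotated with its codespeed description.
@timeit "rand_sum" "Sum of 10^6 random Float64s" sum(rand(10^6))
```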

This might be my biggest pull request yet, but I seem to have finally gotten things to the point where they're uploading to codespeed regularly. I know it doesn't look like it due to the lack of history, but that's because I recently purged the memory of the codespeed installation.

Comments and critiques are welcome! Once this gets merged, master will start to automagically get tracked; it fails right now due to the lack of the codespeed target in test/perf/Makefile. I'm not sure if we ever want to backport this to release-0.1; perhaps this should be a 0.2+ thing. :)
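
For context, the upload step could look roughly like the following sketch, which assumes codespeed's standard `/result/add/` endpoint and uses HTTP.jl in place of the long-defunct Curl.jl package; the host, project name, environment, and the `JULIA_COMMIT` variable are placeholders (the Makefile target would be invoked as, e.g., `make codespeed JULIA_FLAVOR=... JULIA_BRANCH=... JULIA_COMMIT_DATE=...`):

```julia
using HTTP  # stand-in for the Curl.jl package mentioned above

# Hypothetical sketch: POST one timing result to a codespeed instance.
# Codespeed's /result/add/ endpoint accepts URL-encoded form fields.
function submit_result(host::AbstractString, benchmark::AbstractString, seconds::Real)
    fields = Dict(
        "commitid"     => get(ENV, "JULIA_COMMIT", "unknown"),  # assumed variable
        "branch"       => get(ENV, "JULIA_BRANCH", "master"),
        "project"      => "Julia",                              # placeholder
        "executable"   => get(ENV, "JULIA_FLAVOR", "julia"),
        "benchmark"    => benchmark,
        "environment"  => "Linux",                              # placeholder
        "result_value" => string(seconds),
    )
    return HTTP.post(string(host, "/result/add/"),
                     ["Content-Type" => "application/x-www-form-urlencoded"],
                     HTTP.URIs.escapeuri(fields))
end

# e.g. submit_result("http://speed.example.org", "rand_sum", 0.012)
```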

Next on my TODO list:

* Build scaffolding around C code to create a "baseline" in codespeed to compare against
* Enjoy the pretty graphs, and make sure the build process is all toughened up (only time will tell, now).
* Something I'm surely forgetting, but can't seem to put my finger on right now...

@timholy (Member) commented Jul 24, 2013

Great stuff!

@ViralBShah (Member)

Certainly not necessary to backport to 0.1. The only thing we need is a version of the perf tests that codespeed should run. It doesn't make sense to run all the sort, cat, and blas tests right now.

How about enabling only the kernel, micro, and shootout suites for now and starting to collect data?

@mlubin (Member) commented Jul 24, 2013

Nice!

@staticfloat (Member, Author)

I've updated this with the sort tests special-cased for codespeed as discussed here.

* Change webhook notification URLs to julia.mit.edu (Linux) and criid.ee.washington.edu (OSX)
* Add `codespeed` target to test/perf/Makefile, which will run perfsuite and upload result to hardcoded codespeed installation location
  * Note that `make codespeed` will attempt to `Pkg.add()` the packages `JSON` and `Curl` if not already installed.
  * The sort subtests run a limited subset of 28 tests instead of almost 800
* Annotate all @time macro calls with an extra "Description" field, used to populate the codespeed database for the first time
* Until extra build-time information is provided, expect environment variables such as JULIA_FLAVOR, JULIA_BRANCH, and JULIA_COMMIT_DATE on the command line when `make codespeed` is invoked.
@IainNZ (Member) commented Jul 24, 2013

👍 Fantastic stuff!

@staticfloat (Member, Author)

I'm going to pull the trigger tonight unless anyone has any comments.

@StefanKarpinski (Member)

As far as I'm concerned, you can merge whenever. This is great stuff and only touches test/perf, so it's safe.

staticfloat added a commit that referenced this pull request Jul 25, 2013
First draft of codespeedization of test/perf
staticfloat merged commit 522dca8 into master on Jul 25, 2013
@ViralBShah (Member)

How do we see the codespeed dashboard? I can imagine that this will start generating interesting data in a few days.

@staticfloat (Member, Author)

For now, you go to http://128.52.160.154/, although once we have enough data for it not to be embarrassing, we might want to change that to something like http://speed.julialang.org. :P

@IainNZ (Member) commented Jul 25, 2013

Love the speedy-Julia logo!

@ViralBShah (Member)

Yup love the logo!

@ViralBShah (Member)

@StefanKarpinski Could you set up speed.julialang.org for this?

@ViralBShah (Member)

We certainly need fewer, carefully chosen benchmarks here. @staticfloat Can we restrict to kernel, micro, and shootout for now to minimize the clutter?

@staticfloat (Member, Author)

Sure thing. I also have the tests roughly grouped (the default grouping is by the folder they're in, e.g. kernel, micro, etc., although we can change that up if we like), but that grouping doesn't show up on the "Timeline" page, only on the "Comparison" page. That's something I may have to fix on the codespeed side of things.

I'm also going to hack in a method of disabling a benchmark so that it doesn't show up on the outside yet can still be tracked; that way, once we DO decide to show it on the outside, we can. This will take a little time, however, so for now I'll just disable the tests we're not using and remove the already-submitted results that we don't want.

mkitti added a commit to mkitti/SymbolServer.jl that referenced this pull request Jan 15, 2021
See JuliaLang/julia#39239 (comment)

`Core.TypeofVararg` introduced in JuliaLang/julia#3813
`VERSION >= v"1.7.0-DEV.77"`