Add benchmark tools #777

Merged
merged 3 commits into denoland:master on Sep 21, 2018

Conversation

2 participants
@kt3k
Contributor

kt3k commented Sep 21, 2018

This PR tries to address #373.

  • This uses hyperfine as the benchmarking tool.
  • This solution uses gh-pages for static hosting of the website, and stores the benchmark data in a JSON file, //data.json, on the gh-pages branch.
  • Added //tools/benchmark.py, which runs the benchmarks, checks out the gh-pages branch, and appends data to //gh-pages/data.json.
  • Added the //website/ directory, which contains the frontend resources for charting the benchmark data. (The page uses d3 and c3 to render the chart.)
  • Added provider: pages to the Travis deployment config. This deploys the //gh-pages/ directory to the gh-pages branch.
    • This requires the $GITHUB_TOKEN env var to be set appropriately in Travis.

The demo is available here; I tested it in my fork.

(Screenshot: chart of the benchmark results from the demo, 2018-09-21.)


Notes:

  • This uses a single JSON file as the database. I think this scales to a few thousand data points, but it will probably need to be replaced with a more serious database (e.g. SQLite or a hosted DB solution) after thousands of commits. A minimal sketch of the append step is shown below.
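A minimal sketch of that append step, assuming a simple list-of-records layout for data.json (the field names here are illustrative, not the actual schema used by //tools/benchmark.py):

import json
import os
import time

def append_benchmark_record(data_file, sha1, results):
    # Load the existing data.json (if any), append one record, write it back.
    # The record fields (created_at, sha1, benchmark) are hypothetical.
    all_data = []
    if os.path.exists(data_file):
        with open(data_file, "r") as f:
            all_data = json.load(f)
    all_data.append({
        "created_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "sha1": sha1,
        "benchmark": results,  # e.g. {"hello": 0.032, "relative_import": 0.041}
    })
    with open(data_file, "w") as f:
        json.dump(all_data, f)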
@ry

Collaborator

ry commented Sep 21, 2018

Wow!! Thank you! Will review soon.

@ry
Collaborator

ry left a comment

Huge contribution! Simply done. Thank you!

A few comments...

github-token: $GITHUB_TOKEN
keep-history: true
on:
branch: master


@ry

ry Sep 21, 2018

Collaborator

Awesome

from util import run, run_output, root_path, build_path

benchmark_types = ["hello", "relative_import"]
benchmark_files = ["tests/002_hello.ts", "tests/003_relative_import.ts"]


@ry

ry Sep 21, 2018

Collaborator

Maybe use a map here:

benchmarks = {
  "hello": ["tests/002.ts", "--reload"],
  # ...
}
benchmark_files = ["tests/002_hello.ts", "tests/003_relative_import.ts"]

data_file = "gh-pages/data.json"
benchmark_file = "benchmark.json"


@ry

ry Sep 21, 2018

Collaborator

Store in the out/ dir? AKA util.build_path()
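A tiny sketch of what that could look like, assuming build_path() from //tools/util.py points at the out/ build directory (as suggested by the imports above):

import os
from util import build_path  # helper from //tools/util.py

# Hypothetical: keep the temporary hyperfine output in the build (out/) dir
# instead of the repository root.
benchmark_file = os.path.join(build_path(), "benchmark.json")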

    ])
except:
    os.mkdir("gh-pages")
with open("gh-pages/data.json", "w") as f:


@ry

ry Sep 21, 2018

Collaborator

Use data_file.

      categories: sha1List
    }
  }
});


@ry

ry Sep 21, 2018

Collaborator

🙌

run(["hyperfine", "--export-json", benchmark_file, "--warmup", "3"] + [
os.path.join(build_dir, "deno") + " " + file
for file in benchmark_files
])


@ry

ry Sep 21, 2018

Collaborator

It would be nice to be able to specify arguments. Like I want to run the same script with --reload and without.
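One way this could look with the map suggested earlier (a sketch only; the entry names, paths, and flags below are illustrative):

import os

# Hypothetical map: each benchmark name points at a script plus its args,
# so the same script can be measured with and without --reload.
benchmarks = {
    "hello": ["tests/002_hello.ts"],
    "hello_reload": ["tests/002_hello.ts", "--reload"],
    "relative_import": ["tests/003_relative_import.ts", "--reload"],
}

def hyperfine_commands(build_dir):
    # Build one hyperfine command string per benchmark entry.
    deno = os.path.join(build_dir, "deno")
    return [deno + " " + " ".join(args) for args in benchmarks.values()]

# run(["hyperfine", "--export-json", benchmark_file, "--warmup", "3"] +
#     hyperfine_commands(build_dir))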

@ry

Collaborator

ry commented Sep 21, 2018

Ideally add hyperfine to 3rd party and build it using GN, but I'm also ok if you add a TODO for now.

.gitignore Outdated
# RLS generated files
/target/
# benchmark temp files
/benchmark.json


@ry

ry Sep 21, 2018

Collaborator

It's no longer written to root.

("relative_import", ["tests/003_relative_import.ts",
"--reload"])]

data_file = "gh-pages/data.json"


@ry

ry Sep 21, 2018

Collaborator

Maybe it's easier if you write this to website/data.json, so that it's easy to view locally...

Add a comment to the top

# To view the results locally run ./tools/http_server.py and visit 
# http://localhost:4545/website


@kt3k

kt3k Sep 21, 2018

Author Contributor

Changed to store the data at //website/data.json and then export it to //gh-pages/.

Now the local benchmark result should be visible at http://localhost:4545/website after running the ./tools/http_server.py command!
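A rough sketch of the export step this describes (the directory handling below is an assumption, not the actual script):

import os
import shutil

WEBSITE_DIR = "website"    # frontend plus the freshly written data.json
GH_PAGES_DIR = "gh-pages"  # directory deployed to the gh-pages branch

def export_to_gh_pages():
    # Copy the frontend files and data.json from //website/ into //gh-pages/
    # so Travis can deploy the combined result.
    if not os.path.exists(GH_PAGES_DIR):
        os.makedirs(GH_PAGES_DIR)
    for name in os.listdir(WEBSITE_DIR):
        shutil.copy(os.path.join(WEBSITE_DIR, name), GH_PAGES_DIR)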

@ry

ry approved these changes Sep 21, 2018

Collaborator

ry left a comment

LGTM!

@ry ry merged commit 3ad48bd into denoland:master Sep 21, 2018

3 checks passed

continuous-integration/appveyor/pr AppVeyor build succeeded
continuous-integration/travis-ci/pr The Travis CI build passed
license/cla Contributor License Agreement is signed

piscisaureus added a commit that referenced this pull request Sep 21, 2018

Revert "Add benchmark tools (#777)"
This reverts commit 3ad48bd.
@ry

Collaborator

ry commented Sep 21, 2018

@kt3k This commit broke CI, so we had to revert it: 516e1da
See https://travis-ci.com/denoland/deno/builds/85580239

@kt3k

Contributor Author

kt3k commented Sep 22, 2018

@ry
It seems that GITHUB_TOKEN is not set in Travis, so the deployment failed. Please create an access token and set it in Travis' settings page.

The personal access token needs to have repo access (probably public_repo access is enough) and can be generated from GitHub's UI ([Settings] -> [Developer settings] -> [Personal access tokens] -> [Generate new token]).

Then copy the token and paste it into denoland/deno's settings page in Travis.

If GITHUB_TOKEN is set correctly, Travis logs lines about it at the start of the build.

(This is an example build from my fork, which successfully deployed to gh-pages; see the commit logs.)

piscisaureus added a commit that referenced this pull request Sep 22, 2018

piscisaureus added a commit that referenced this pull request Sep 22, 2018

piscisaureus added a commit to piscisaureus/deno that referenced this pull request Sep 22, 2018

@kt3k kt3k referenced this pull request Sep 24, 2018

Closed

Website scripts need tests #812

@ry ry referenced this pull request Sep 26, 2018

Closed

Continuous performance testing #373
