
Reproducibility of benchmark results #64

Closed
Tracked by #649
yecol opened this issue Jan 5, 2021 · 1 comment
Comments

yecol (Collaborator) commented Jan 5, 2021

We have provided the details of how we conducted the benchmarking experiments at:

In addition, to help users reproduce the results more easily, we are going to provide a snapshot on Aliyun with GraphScope pre-installed, plus a script that runs the benchmark suite.
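
For reference, a minimal sketch of what such a benchmark script might look like against GraphScope's Python API; the file paths, session settings, and algorithm parameters below are illustrative assumptions, not the actual script shipped with the snapshot:

```python
# A sketch only: paths, labels, and parameters are placeholders,
# not the configuration used for the published benchmark numbers.
import time

import graphscope

# Launch a session on local hosts; on the Aliyun snapshot, GraphScope
# would already be installed so this runs out of the box.
sess = graphscope.session(cluster_type="hosts")

# Load an undirected graph from CSV files (placeholder paths).
g = sess.g(directed=False)
g = g.add_vertices("/path/to/vertices.csv", label="v")
g = g.add_edges("/path/to/edges.csv", label="e")

# Time a few built-in analytical apps, as a benchmark script might.
for name, run in [
    ("pagerank", lambda: graphscope.pagerank(g, delta=0.85, max_round=10)),
    ("wcc", lambda: graphscope.wcc(g)),
    ("sssp", lambda: graphscope.sssp(g, src=6)),
]:
    start = time.perf_counter()
    run()
    elapsed = time.perf_counter() - start
    print(f"{name}: {elapsed:.2f}s")

sess.close()
```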

yecol (Collaborator, Author) commented Aug 6, 2021

@lidongze0629 Please add a folder `analytical_engine` and place the artifacts for the second one there: https://github.com/graphscope/benchmark

yecol mentioned this issue Aug 6, 2021
yecol added this to To do in v0.7 release on Aug 11, 2021
yecol moved this from To do to In progress in v0.7 release on Aug 18, 2021
acezen closed this as completed on Mar 31, 2022
Projects: v0.7 release (In progress)
Development: no branches or pull requests
3 participants