This repository has been archived by the owner on Jun 30, 2022. It is now read-only.

Upload the weights diff if it fails to push for some reason #54

joao-paulo-parity opened this issue Sep 9, 2021 · 10 comments

@joao-paulo-parity
Contributor

joao-paulo-parity commented Sep 9, 2021

At the moment, if some error occurs while the bot is trying to push the generated weights (e.g. paritytech/substrate#9686 (comment)), the bot simply errors out. It would be better to also upload the weights diff so that the effort isn't lost (some benchmarks take a long time to run).

GitHub limits the number of characters per comment, so that's also something to take into account before posting the diff. If the maximum length is exceeded, the diff could be uploaded as a Gist instead.
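A minimal sketch of that flow, assuming a Node/TypeScript bot using @octokit/rest; the owner/repo/issue number, the `reportWeightsDiff` helper, and the rounded-down comment size constant are illustrative assumptions rather than the bot's actual code:

```ts
import { Octokit } from "@octokit/rest";

// Hypothetical values; the real bot would already hold an authenticated
// client and know which PR it is reporting on.
const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });
const owner = "paritytech";
const repo = "substrate";
const issueNumber = 9686;

// GitHub rejects comment bodies above roughly 65,000 characters.
const COMMENT_LIMIT = 65_000;

async function reportWeightsDiff(diff: string): Promise<void> {
  const body = `The weights could not be pushed; here is the diff:\n\n${diff}`;

  if (body.length <= COMMENT_LIMIT) {
    // Small enough: post the diff directly as a comment on the PR.
    await octokit.rest.issues.createComment({
      owner,
      repo,
      issue_number: issueNumber,
      body,
    });
    return;
  }

  // Too large for a comment: upload the diff as a Gist and link to it.
  const gist = await octokit.rest.gists.create({
    description: `Weights diff for ${owner}/${repo}#${issueNumber}`,
    public: false,
    files: { "weights.diff": { content: diff } },
  });
  await octokit.rest.issues.createComment({
    owner,
    repo,
    issue_number: issueNumber,
    body: `The weights could not be pushed and the diff exceeds the comment size limit; it was uploaded to ${gist.data.html_url}`,
  });
}
```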

@athei
Member

athei commented Jan 24, 2022

You could use git format-patch to create a patch file, compress it, and use a file upload instead of posting it as text. This would circumvent the limitation and also make it easier to apply.
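A rough sketch of that suggestion, assuming the bot runs on Node and shells out to git; the `buildCompressedPatch` helper and the `baseRef` parameter are hypothetical:

```ts
import { execFileSync } from "node:child_process";
import { readFileSync, writeFileSync } from "node:fs";
import { join } from "node:path";
import { gzipSync } from "node:zlib";

// Produce patch files for every commit after `baseRef` (the commits the bot
// failed to push), concatenate them, and gzip the result so it can be
// attached or uploaded instead of being pasted as text. `git am` can apply
// it once it is decompressed on the other side.
function buildCompressedPatch(repoDir: string, baseRef: string): string {
  // `git format-patch <baseRef>` writes one .patch file per commit and
  // prints the generated file names, one per line.
  const output = execFileSync("git", ["format-patch", baseRef], {
    cwd: repoDir,
    encoding: "utf8",
  });
  const patchFiles = output.trim().split("\n").filter(Boolean);

  const combined = patchFiles
    .map((file) => readFileSync(join(repoDir, file), "utf8"))
    .join("\n");

  const archivePath = join(repoDir, "weights.patch.gz");
  writeFileSync(archivePath, gzipSync(combined));
  return archivePath;
}
```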

@shawntabrizi
Contributor

CC @ggwpez

We were already thinking of generating raw JSON output files from all the benchmarks so that we could generate visual representations of the benchmarks in case people want to look closer at the results.

From this, we should also be able to easily generate the weight files if they could not be committed.

@ggwpez
Member

ggwpez commented Jan 24, 2022

So where do we upload it to? I don't think creating Gists is optimal.
Maybe some S3-compatible server to throw the JSON files in?

@shawntabrizi
Contributor

Can we not just leave it on the benchmarking machine?

I guess we would be concerned about serving files while it was benchmarking?

If another server can be allocated, that works; otherwise, a Gist is probably not the worst idea.

@ggwpez
Member

ggwpez commented Jan 24, 2022

> Can we not just leave it on the benchmarking machine?
> I guess we would be concerned about serving files while it was benchmarking?

Yes, just running a webserver there could interfere with the results.
In the best case we would make that data publicly available, so that other people can also analyze Substrate+Polkadot weights.

> If another server can be allocated, that works; otherwise, a Gist is probably not the worst idea.

If we just throw them into individual Gist files, we will not be able to index/order or find them ever again.
To not go overboard for the first version of this, some kind of indexable file storage, e.g. S3-compatible bucket storage, would be fine I think.
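One possible shape for that, sketched with the AWS SDK v3 S3 client (which also works against other S3-compatible providers via a custom endpoint); the bucket name, key layout, endpoint and credential variables are assumptions, not an existing Parity setup:

```ts
import { readFileSync } from "node:fs";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

// Hypothetical client configuration for an S3-compatible bucket.
const s3 = new S3Client({
  region: "auto",
  endpoint: process.env.WEIGHTS_S3_ENDPOINT,
  credentials: {
    accessKeyId: process.env.WEIGHTS_S3_KEY ?? "",
    secretAccessKey: process.env.WEIGHTS_S3_SECRET ?? "",
  },
});

// Key the raw benchmark JSON by repo, PR number and timestamp so the files
// stay indexable and can be listed or fetched later for analysis.
async function uploadBenchmarkResults(
  repo: string,
  prNumber: number,
  resultsPath: string,
): Promise<string> {
  const key = `${repo}/${prNumber}/${new Date().toISOString()}/results.json`;
  await s3.send(
    new PutObjectCommand({
      Bucket: "benchmark-results",
      Key: key,
      Body: readFileSync(resultsPath),
      ContentType: "application/json",
    }),
  );
  return key;
}
```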

@joao-paulo-parity
Contributor Author

I'm assigning myself to this task since I plan to work on it sometime soon.

@athei
Member

athei commented Jan 24, 2022

> So where do we upload it to? I don't think creating Gists is optimal. Maybe some S3-compatible server to throw the JSON files in?

I attach files to the postings I make on GitHub by dragging them in here, and it will just upload them "somewhere". Can't the bot do the same thing?

@ggwpez
Member

ggwpez commented Jan 24, 2022

> I attach files to the postings I make on GitHub by dragging them in here, and it will just upload them "somewhere". Can't the bot do the same thing?

Yes, for the diff that is enough. Sorry for mixing up two issues; my concern was regarding the raw output.

@koenw

koenw commented Jan 24, 2022

> If we just throw them into individual Gist files, we will not be able to index/order or find them ever again.

We would need to keep an index of Gists, perhaps in a file (or files) that we can commit/push.

Is there any other downside to Gists? It seems simpler to me than using an additional service.

We would also easily be able to generate a JSON of all benchmarks, like @shawntabrizi mentioned (or the index could simply be that JSON).
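A sketch of such an index, assuming the bot keeps a JSON file in a repository and appends to it via the GitHub contents API with @octokit/rest; the repository name, file path and `GistIndexEntry` shape are hypothetical:

```ts
import { Octokit } from "@octokit/rest";

// Hypothetical location for the index file.
const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });
const owner = "paritytech";
const repo = "bench-results-index";
const path = "gists.json";

interface GistIndexEntry {
  pr: string; // e.g. "paritytech/substrate#9686"
  gistUrl: string;
  createdAt: string;
}

async function appendToGistIndex(entry: GistIndexEntry): Promise<void> {
  // Fetch the current index (if any) so we can append to it.
  let entries: GistIndexEntry[] = [];
  let sha: string | undefined;
  try {
    const current = await octokit.rest.repos.getContent({ owner, repo, path });
    if (!Array.isArray(current.data) && "content" in current.data) {
      entries = JSON.parse(
        Buffer.from(current.data.content, "base64").toString("utf8"),
      );
      sha = current.data.sha;
    }
  } catch {
    // File does not exist yet; start a fresh index.
  }

  entries.push(entry);

  // Commit the updated index back to the repository.
  await octokit.rest.repos.createOrUpdateFileContents({
    owner,
    repo,
    path,
    message: `Index Gist for ${entry.pr}`,
    content: Buffer.from(JSON.stringify(entries, null, 2)).toString("base64"),
    sha,
  });
}
```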

@Vovke added and then removed the "duplicate" label on Apr 12, 2022