
services page got much slower to load between cockpit 217 and 218 #14840

Closed
AdamWill opened this issue Oct 30, 2020 · 14 comments

Comments

@AdamWill
Contributor

Cockpit version: 220
OS: Fedora
Page: Services

Sorry this report is late, but I was looking into Fedora openQA test failures on aarch64 and realized that the Cockpit test is quite often failing because the Services page takes too long to load. Looking at the test history, this seems to have started back in April when cockpit 218 landed.

When the test enters the page, a Loading spinner shows for a long time. Even on x86_64 this sometimes takes 20 seconds or more, but on aarch64 it often keeps showing for around two minutes, at which point the test times out. I just tested on my own x86_64 desktop and saw the spinner for several seconds when loading the page; the JS console doesn't show anything useful, and the system logs don't show anything relevant during the time the spinner is showing.

@AdamWill
Contributor Author

I'm guessing this is related to the several changes @KKoukiou made to how services are parsed between 217 and 218.

@AdamWill
Contributor Author

@paulwhalen

@pcdubs

pcdubs commented Oct 30, 2020

FWIW, on hardware (Raspberry Pi 4, 4 GB), the delay is about 4 seconds.

@AdamWill
Contributor Author

Huh, odd that it takes so long in the VM. Did you test from a Server DVD install? That's how the openQA test runs.

@pcdubs

pcdubs commented Oct 30, 2020

This was a network installation.

@AdamWill
Contributor Author

What package set? I'm thinking the behaviour may depend on how many / which services are present...

@pcdubs

pcdubs commented Nov 2, 2020

Just the default "@^server-product-environment"

@KKoukiou
Contributor

KKoukiou commented Nov 3, 2020

@AdamWill thanks for reporting this. Previously we displayed the units as they were fetched from the API; now we wait until all of them are fetched, show a spinner in the meantime, and only then display the list.
That explains your observation that the page became slower after 217: you now see a spinner where the list used to fill in gradually.
However, I have never encountered loading times as long as you describe. I will keep this issue open as a reference and try to find a way to make this page faster again.
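Roughly, the behaviour change looks like this (a simplified sketch only, not the actual Services page code; all names here are illustrative):

```typescript
// Simplified sketch of the rendering change; fetchUnit, renderUnits and
// showSpinner are hypothetical stand-ins, not real Cockpit functions.

interface Unit { name: string; activeState: string }

// Up to 217 (roughly): the list grew as each unit arrived from the API.
async function loadIncrementally(
    names: string[],
    fetchUnit: (name: string) => Promise<Unit>,
    renderUnits: (units: Unit[]) => void,
) {
    const units: Unit[] = [];
    for (const name of names) {
        units.push(await fetchUnit(name));
        renderUnits(units); // a partial list is visible while loading
    }
}

// Since 218 (roughly): a spinner is shown until every unit has been fetched,
// and only then is the full list rendered.
async function loadInBatch(
    names: string[],
    fetchUnit: (name: string) => Promise<Unit>,
    renderUnits: (units: Unit[]) => void,
    showSpinner: (on: boolean) => void,
) {
    showSpinner(true);
    const units = await Promise.all(names.map(fetchUnit));
    showSpinner(false);
    renderUnits(units); // nothing is visible before this point
}
```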

@AdamWill
Contributor Author

AdamWill commented Nov 3, 2020

Some logging might help debug this - for instance, the page could log each service as it parses it, with a timestamp, so we could see whether one or a few services are taking much longer than the rest. I could not find anything along these lines when I looked. It would be ideal if this could be logged to the system journal (that makes it much easier to get out of an openQA test), but I could probably work with it being logged to the JavaScript console too.
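Something along these lines is what I have in mind (purely illustrative; parseUnit is a hypothetical stand-in for whatever per-unit parsing the page actually does):

```typescript
// Hypothetical wrapper: time each per-unit parse and log it with a timestamp,
// so one pathologically slow unit would stand out.
function timedParse<T>(name: string, parseUnit: (name: string) => T): T {
    const start = performance.now();
    const result = parseUnit(name);
    const elapsed = (performance.now() - start).toFixed(1);
    // This only reaches the JS console; getting it into the system journal
    // would need something extra on the Cockpit side.
    console.log(`${new Date().toISOString()} services: parsed ${name} in ${elapsed} ms`);
    return result;
}
```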

@martinpitt
Member

I guess this is still relevant (Services page has become slower), so keeping this open for now.

@AdamWill
Contributor Author

AdamWill commented Sep 22, 2021

We're seeing something similar with the Logs page lately, again on aarch64. The openQA test goes to the Logs page and changes 'priority:err' to 'priority:info'; at that point the page refreshes to load the info-level log entries. On aarch64 this takes a very long time - on most runs of the test at the moment, it fails after waiting 45 seconds with the refresh still not done.

I'll tweak openQA to wait even longer (with a soft failure), but it's obviously not ideal for it to take that long.
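For context on why 'priority:info' is so much heavier than 'priority:err': the journal query has to return every message of that severity or higher, which on a busy system is a vastly larger set of entries. A rough sketch of the kind of query involved (the exact arguments the Logs page uses are an assumption here, and the minimal cockpit typing below is hand-written for the example):

```typescript
// Minimal hand-written typing for the bit of cockpit.js used below;
// the real API is richer than this.
declare const cockpit: {
    spawn(args: string[], options?: { superuser?: string }): Promise<string>;
};

// journalctl -p err returns only err and more-severe messages, while
// -p info also includes warning, notice and info - usually a far larger set.
async function countJournalEntries(priority: "err" | "info"): Promise<number> {
    const out = await cockpit.spawn(
        ["journalctl", "--no-pager", "-o", "cat", "-p", priority],
        { superuser: "try" });
    return out.split("\n").filter(line => line.length > 0).length;
}
```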

@KKoukiou
Contributor

KKoukiou commented May 2, 2023

This was addressed by a series of PRs in release 288, such as #18502.

@KKoukiou KKoukiou closed this as completed May 2, 2023
@AdamWill
Contributor Author

AdamWill commented May 2, 2023

Awesome! I'll try to check in on the aarch64 results and see whether the test is passing more regularly now.

@martinpitt
Member

@AdamWill: Note that we run our upstream integration tests on aarch64 as well now (through Packit), and they seem happy enough with our normal timeout. So, crossing fingers! 🤞
