
Provide a more meaningful error message in case a tested site is too slow instead of "metric unavailable" #28

Closed
rpkoller opened this issue Jan 12, 2017 · 7 comments

@rpkoller

It is probably a naming issue. I've installed version 1.2.4 and afterwards ran into something I haven't seen before. I now occasionally get these kinds of errors on slower sites I test against:

(Screenshots: two "metric unavailable" errors, 2017-01-12)

But if I observe the Chrome window alongside, it is actually painting something. I guess the benchmark is going past a certain time threshold, so pwmetrics terminates it after a certain time? I wouldn't call the metric "unavailable" then, but rather something like "took too long and aborted". Or provide some override to run until the bitter end? ;) cheers r.

@denar90
Collaborator

denar90 commented Jan 12, 2017

@rpkoller thanks, I faced the same issue measuring https://www.cnet.com/special-reports/jony-ive-talks-about-putting-the-apple-touch-on-the-macbook-pro/ too.

@paulirish should we handle this case within the scope of Lighthouse?

@paulirish
Owner

yeah. for both, it would be helpful to run `lighthouse --save-assets --save-artifacts http://website --output=json > results.json`, zip up everything, and file a ticket. we can solve it, but it sounds like something is failing.

also @denar90, we probably should catch rejections coming from Lighthouse and handle those. it's possible both cases would have come in like that.

@rpkoller
Author

@paulirish The problem is that those "metric not available" outputs don't happen on every run. So I am uncertain whether, if I run the suggested Lighthouse command, the sampled data would actually contain and outline the case. Is there a way to visualize the JSON output data with pwmetrics beforehand? Or is there a way to see whether the "metric not available" issue applies to the sampled JSON case? Because one of the pages that had the issue yesterday, which I just sampled with Lighthouse and then with pwmetrics, shows no "metric not available" right now.

@denar90
Collaborator

denar90 commented Jan 13, 2017

I've created a ticket for Lighthouse. Will keep in touch with the folks there.


@rpkoller I reproduced this issue every time I tried with the cnet.com page I mentioned above. Hope we can figure something out about it.


@paulirish I found out where it crashes - https://github.com/GoogleChrome/lighthouse/blob/a7648e7eeef7dce5641147f2bf75725a084e8937/lighthouse-core/audits/first-meaningful-paint.js#L148.
Unfortunately we can't catch rejections, because Lighthouse doesn't reject in these cases. It just returns a debugString value - https://github.com/GoogleChrome/lighthouse/blob/a7648e7eeef7dce5641147f2bf75725a084e8937/lighthouse-core/audits/first-meaningful-paint.js#L79
So we should check that value and throw our own error with the debugString.
Thoughts?
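The check proposed above could look roughly like this (a sketch only, not pwmetrics' actual code; `checkAudit` is a hypothetical helper name, and the `results.audits[...].debugString` shape is assumed from the linked Lighthouse source):

```javascript
// Sketch: after a Lighthouse run, inspect an audit result and surface
// Lighthouse's own debugString instead of a bare "metric unavailable".
// `checkAudit` and the results shape are assumptions for illustration.
function checkAudit(results, auditName) {
  const audit = results.audits[auditName];
  if (audit && audit.debugString) {
    // Re-throw with Lighthouse's explanation so the user sees why the
    // metric could not be computed (e.g. the page was too slow).
    throw new Error(`${auditName}: ${audit.debugString}`);
  }
  return audit;
}
```

This way the error message would carry Lighthouse's reason (e.g. "Navigation and first paint timings not found") rather than a generic failure.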

@denar90 denar90 added the bug label Jan 20, 2017
@pedro93
Collaborator

pedro93 commented Jan 30, 2017

Any update on this issue? I've come across it a couple of times as well.

@denar90
Collaborator

denar90 commented Jan 30, 2017

@pedro93 we are waiting for the Lighthouse issue to be resolved. I think the folks there are busy with higher-priority stuff right now...

This was referenced Feb 11, 2017
@pedro93
Collaborator

pedro93 commented Mar 8, 2017

Seeing as this is merged, I will close it. Feel free to re-open it if something is missing.

@pedro93 pedro93 closed this as completed Mar 8, 2017