
GTMetrix is no longer usable without account #1366

Closed
garritfra opened this issue Dec 14, 2023 · 22 comments
Labels
bug Something isn't working

Comments

@garritfra
Collaborator

Just stumbled across this by reviewing a site:

[screenshot]

Looks like we have to switch. Pingdom maybe?

@garritfra garritfra added the bug Something isn't working label Dec 14, 2023
@garritfra
Collaborator Author

Pingdom unfortunately has different results for page sizes. Examples:

fanrongbin.com

GTMetrix: 184 kB
Pingdom: 97.5 kB

www.blitzw.in

GTMetrix: 21 kB
Pingdom: 23.5 kB

@kevquirk
Owner

Oh damn. Might need to do some research as to what we can use.

@garritfra
Collaborator Author

FWIW, I checked if https://pagespeed.web.dev/ might be an alternative, but they don't seem to expose page weight at all. 🤔

@kevquirk
Owner

kevquirk commented Dec 14, 2023

I don't think that will work as it doesn't mention page size.

1mb.club is using DebugBear. Doing a test and comparing it to GTMetrix, it comes out with different scores though. 🤷‍♂️

GTMetrix (90kb) - https://gtmetrix.com/reports/kevquirk.com/oMqg7SG2/
DebugBear (50.2kb) - https://www.debugbear.com/test/website-speed/5PV7nucN/overview?metric=pageWeight

@garritfra
Collaborator Author

DebugBear seems to be quite close to the compressed size reported by GTMetrix. If all else fails, that might be our only option. 🤷‍♂️

@kevquirk
Owner

Yeah, but it means we will need to re-do over 1000 sites to make sure it's accurate. 😳

@kevquirk
Owner

I'm wondering if we take this as serendipity and close the project down? I don't have the time to go through all these sites and re-test them.

What do you think, @garritfra ?

@garritfra
Collaborator Author

I wouldn't want to close the project down per se. I'll contact you to discuss some details.

@garritfra garritfra mentioned this issue Dec 16, 2023
@meduzen
Contributor

meduzen commented Dec 16, 2023

I'm wondering if we take this as serendipity and close the project down?

I was coming to update a size and stumbled upon the same issue. Closing the project could be a nice testimony from the (recent) past, but at the same time it’s also kinda useful to discover new things.

Without automation, maintenance efficiency is kinda limited.

@michaelnordmeyer

Some services include the response headers in the reported weight, some don't. You can use curl to find out, but you only get the data for the HTML page.

With compression:

curl --compressed -so /dev/null -w "Header: %{size_header} bytes, Download: %{size_download} bytes\n" https://kevquirk.com/
Header: 259 bytes, Download: 3814 bytes

Without compression:

curl -so /dev/null -w "Header: %{size_header} bytes, Download: %{size_download} bytes\n" https://kevquirk.com/ 
Header: 236 bytes, Download: 31883 bytes
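If we wanted to script this check, a rough Python equivalent of the curl calls above might look like this (a sketch, not the project's tooling; note the header count here omits the status line, so numbers won't match curl's %{size_header} exactly):

```python
import urllib.request

def page_sizes(url: str, compressed: bool = True) -> tuple[int, int]:
    """Return (header_bytes, body_bytes) for the HTML document only,
    roughly mirroring curl's %{size_header} and %{size_download}."""
    req = urllib.request.Request(url, headers={
        "Accept-Encoding": "gzip" if compressed else "identity",
    })
    with urllib.request.urlopen(req) as resp:
        body = resp.read()  # raw bytes off the wire; urllib does not gunzip for us
        header_bytes = len(str(resp.headers).encode())  # status line not counted
    return header_bytes, len(body)

def summarize(header_bytes: int, body_bytes: int) -> str:
    """Format the result like the curl -w template above."""
    return f"Header: {header_bytes} bytes, Download: {body_bytes} bytes"
```

Usage would be something like print(summarize(*page_sizes("https://kevquirk.com/"))), once with compressed=True and once with False to reproduce both measurements.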

wget, as someone mentioned in #749, would be an easy solution, but it doesn't work for CSS files importing other files.

@derspyy
Contributor

derspyy commented Dec 30, 2023

wow this whole stuff sucks : /
i wholeheartedly think the project should continue at least with compressed sizes.
it was always awesome to see what could be done!

@JLO64
Contributor

JLO64 commented Jan 2, 2024

I know this isn't productive towards the current conversation, but it gets worse with GTmetrix. Even if you sign up for a free account, there is a limit on how many reports you can generate. After you hit that limit (I'm not sure what it is), you are forced to pay for a plan in order to generate more.


Regarding shutting down the project, I'd argue against it. It's genuinely nice having a list of websites like these in a format like this. (I'm totally not in it for the badge on my site.) Maybe it can be restarted with a new list, built from scratch on whatever new metric is agreed upon? As nice as DebugBear is, I think it would be wise to avoid third-party services so we don't end up in another situation like the one we're going through now.

@garritfra
Collaborator Author

I think there's no way around a third-party service like DebugBear to get consistent results.

We won't shut down the project. Once we find a solution we're happy with, updating all sites shouldn't be an issue using scripts. Ideally we should try to collaborate with the third party service owner (whoever that might be) to lift any possible scan restrictions or speed up the process.

@bradleytaunt
Collaborator

Thought I would throw my two cents in:

For 1mb.club I have been toying with the idea of using a custom script that will:

  1. Download all the files of a given website (CSS, images, JS, main HTML page)
  2. Place all these files into a temporary folder
  3. Get the size of all those files together as the total
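The three steps above could be sketched in Python roughly like this (sizes summed in memory instead of a temp folder; all names here are made up for illustration, and like wget it won't follow @import inside CSS):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class AssetCollector(HTMLParser):
    """Collect CSS, JS and image URLs referenced by a page's HTML."""
    def __init__(self, base_url: str):
        super().__init__()
        self.base = base_url
        self.assets: list[str] = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and attrs.get("src"):
            self.assets.append(urljoin(self.base, attrs["src"]))
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.assets.append(urljoin(self.base, attrs["href"]))

def total_page_weight(url: str) -> int:
    """Download the HTML plus its referenced assets and sum their sizes."""
    import urllib.request
    html = urllib.request.urlopen(url).read()
    collector = AssetCollector(url)
    collector.feed(html.decode(errors="replace"))
    total = len(html)
    for asset in collector.assets:
        try:
            total += len(urllib.request.urlopen(asset).read())
        except OSError:
            pass  # skip unreachable assets rather than failing the whole run
    return total
```

It shares the CSS-import limitation mentioned for wget above, so it's only a starting point.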

I finally pushed out what I was working on a while back and called it "sizegrab": https://git.sr.ht/~bt/sizegrab (it's written in Ruby)

It's far from perfect and most likely doesn't cover every use case. But the idea is to try and avoid depending on third party companies / services. If anyone wants to help improve that ugly script of mine, please do so! (or even re-write everything in a language others prefer!)

/ end rant

@JLO64
Contributor

JLO64 commented Jan 3, 2024

It's far from perfect and most likely doesn't cover every use case. But the idea is to try and avoid depending on third party companies / services. If anyone wants to help improve that ugly script of mine, please do so! (or even re-write everything in a language others prefer!)

I am probably the least qualified person in this thread to be working on something like this, but I have a bunch of free time on my hands I'm willing to put towards this. For right now, I'm thinking of rewriting this as a Python script (I'm sorry I'm not more comfortable with Ruby!) and deploying it as an AWS Lambda function. After I get that set up, I can make a widget and/or website to query and display results. Since it'll be a Lambda function, maybe it can be used to better automate the project in the future?

@evtn

evtn commented Jan 4, 2024

FWIW, I checked if https://pagespeed.web.dev/ might be an alternative, but they don't seem to expose page weight at all. 🤔

Actually they do, under Performance -> Passed Audits -> Avoids enormous network payloads

Although I'm not sure if it's the size after compression

@michaelnordmeyer

Although I'm not sure if it's the size after compression

It's the size after compression. You can test with your browser's developer tools on the network tab. They display both sizes.

@JLO64
Contributor

JLO64 commented Jan 5, 2024

Lol. While working on a solution of my own I stumbled across Cloudflare's URL Scanner. It's pretty robust, free to use with no restrictions, and has an API (you have to sign up and generate a token to use the API tho)

They even show both compressed and uncompressed network transfers!
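For reference, submitting a scan through that API could look something like the sketch below. The endpoint path and payload reflect my reading of Cloudflare's URL Scanner docs and may have changed; the account ID and token are placeholders you'd get from the sign-up step mentioned above.

```python
import json
import urllib.request

API_BASE = "https://api.cloudflare.com/client/v4"

def build_scan_request(account_id: str, token: str, target_url: str) -> urllib.request.Request:
    """Build (but don't send) a URL Scanner submission request."""
    endpoint = f"{API_BASE}/accounts/{account_id}/urlscanner/scan"
    return urllib.request.Request(
        endpoint,
        data=json.dumps({"url": target_url}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Sending it with urllib.request.urlopen(...) should return a scan ID that can then be polled for the finished report, including the transfer sizes shown above.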

@garritfra
Collaborator Author

@JLO64 huh, nice! I'm not sure if our audience will consider using Cloudflare kosher, but I'd personally be fine with this. @kevquirk what do you think?

@kevquirk
Owner

kevquirk commented Jan 5, 2024

I just checked on my site and the results are very close. GTMetrix is reporting 89.4kb and Cloudflare is reporting 91.5kb.

While 2kb is quite a lot when it comes to the smaller sites, it's about the closest we've come out of everything we've tried. Being Cloudflare, it's unlikely to go paid anytime soon too. Also, having the API, we could potentially go back and re-test all sites with CF (although I'd have no idea how to do that haha).

TL;DR I think we have a winner. If you're happy @garritfra we can update the instructions on the site.

@garritfra
Collaborator Author

@kevquirk I'll take care of it.

Thanks @JLO64 for the suggestion!

@garritfra
Collaborator Author

#1383 is merged, so I'll close this issue. Thanks for all the input!

Any contributions regarding the automatic size checker or other topics discussed here are welcome. I don't think I'll have the time to rewrite that script any time soon.
