GTMetrix is no longer usable without an account #1366
Just stumbled across this by reviewing a site:
Looks like we have to switch. Pingdom maybe?

Comments
Oh damn. Might need to do some research as to what we can use.
FWIW, I checked if https://pagespeed.web.dev/ might be an alternative, but they don't seem to expose page weight at all. 🤔
I don't think that will work, as it doesn't mention page size. 1mb.club is using DebugBear; doing a test and comparing it to GTMetrix, it comes out with different scores though. 🤷‍♂️ GTMetrix (90kb) - https://gtmetrix.com/reports/kevquirk.com/oMqg7SG2/
DebugBear seems to be quite close to the compressed size reported by GTMetrix. If all else fails, that might be our only option. 🤷‍♂️
Yeah, but it means we will need to re-do over 1000 sites to make sure it's accurate. 😳
I'm wondering if we take this as serendipity and close the project down? I don't have the time to go through all these sites and re-test them. What do you think, @garritfra?
I wouldn't want to close the project down per se. I'll contact you to discuss some details.
I was coming to update a size and stumbled upon the same issue. Closing the project could be a nice testimony from the (recent) past, but at the same time it’s also kinda useful to discover new things. Without automation, maintenance efficiency is kinda limited.
Some services include the response headers in the reported weight, some don't. You can check both with curl.

With compression:

```
curl --compressed -so /dev/null -w "Header: %{size_header} bytes, Download: %{size_download} bytes\n" https://kevquirk.com/
```

Without compression:

```
curl -so /dev/null -w "Header: %{size_header} bytes, Download: %{size_download} bytes\n" https://kevquirk.com/
```
Wow, this whole thing sucks :/
I know this isn't productive towards the current conversation, but it gets worse with GTmetrix. Even if you sign up for a free account, there is a limit on how many reports you can generate; after you hit that limit (I'm not sure what it is), you are forced to pay for a plan in order to generate more. Regarding shutting down the project, I'd argue against it. It's genuinely nice having a list of websites like these in a format like this. (I'm totally not in it for the badge on my site.) Maybe it can be restarted instead with a new list, starting from scratch, based on whatever new metric is agreed upon? As nice as DebugBear is, I think it would be wise to avoid third-party services, so we don't end up in another situation like the one we're going through now.
I think there's no way around a third-party service like DebugBear to get consistent results. We won't shut down the project. Once we find a solution we're happy with, updating all sites shouldn't be an issue using scripts. Ideally we should try to collaborate with the third-party service owner (whoever that might be) to lift any possible scan restrictions or speed up the process.
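A batch re-test along those lines could be fairly small. Here is a minimal Python sketch built around the curl invocation shared earlier in the thread; the domain list and the CSV output are placeholders, not the project's actual data layout:

```python
# Minimal batch re-test sketch. The domain list and CSV output are
# placeholder assumptions, not the project's actual data layout.
import csv
import subprocess

domains = ["kevquirk.com", "example.com"]  # placeholder member list

with open("sizes.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["domain", "compressed_bytes"])
    for domain in domains:
        # Same idea as the curl command above: fetch with compression,
        # discard the body, and print only the downloaded byte count.
        result = subprocess.run(
            ["curl", "--compressed", "-s", "-o", "/dev/null",
             "-w", "%{size_download}", f"https://{domain}/"],
            capture_output=True, text=True, check=True,
        )
        writer.writerow([domain, result.stdout.strip()])
```

Note that, like the raw curl check, this only measures the HTML document itself; images, CSS, and JS would still need a crawl step to produce a full page-weight figure.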
Thought I would throw my two cents in: For 1mb.club I have been toying with the idea of using a custom script that will:
I finally pushed out what I was working on way back and called it "sizegrab": https://git.sr.ht/~bt/sizegrab (it's written in Ruby). It's far from perfect and most likely doesn't cover every use case, but the idea is to try and avoid depending on third-party companies/services. If anyone wants to help improve that ugly script of mine, please do so! (Or even rewrite everything in a language others prefer!) /end rant
I am probably the least qualified person in this thread to be working on something like this, but I have a bunch of free time on my hands that I'm willing to put towards it. For right now, I'm thinking of rewriting this as a Python script (I'm sorry I'm not more comfortable with Ruby!) and deploying it as an AWS Lambda function. After I get that set up, I can make a widget and/or website to query and display results. Since it'll be a Lambda function, maybe it can be used to better automate the project in the future?
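As a rough illustration of that idea, a single-page size check in Lambda could be very small. This is a sketch only; the {"url": ...} event shape and the response fields are assumptions, not an agreed-upon interface:

```python
# Hypothetical Lambda handler sketch. The event shape and response
# fields are assumptions, not an agreed-upon interface.
import json
import urllib.request

def handler(event, context):
    url = event["url"]
    # Ask for gzip and read the raw bytes, so len(body) approximates the
    # compressed transfer size of the HTML document. Servers are free to
    # ignore the header and respond uncompressed.
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        body = resp.read()
    return {
        "statusCode": 200,
        "body": json.dumps({"url": url, "compressed_bytes": len(body)}),
    }
```

Behind API Gateway or a Lambda function URL, something like this could also back the proposed widget, though the hard part (summing subresources, as sizegrab attempts) is deliberately left out here.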
Actually they do (under …), although I'm not sure if it's the size after compression.
It's the size after compression. You can test with your browser's developer tools on the Network tab; they display both sizes.
Lol. While working on a solution of my own, I stumbled across Cloudflare's URL Scanner. It's pretty robust, free to use with no restrictions, and has an API (you have to sign up and generate a token to use the API, tho). They even show both compressed and uncompressed network transfers!
I just checked on my site and the results are very close: GTMetrix is reporting 89.4kb and Cloudflare is reporting 91.5kb. While 2kb is quite a lot when it comes to the smaller sites, it's about the closest we've come out of everything we've tried. Being Cloudflare, it's unlikely to go paid anytime soon, too. Also, having the API, we could potentially go back and re-test all sites with CF (although I'd have no idea how to do that haha). TL;DR: I think we have a winner. If you're happy, @garritfra, we can update the instructions on the site.
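For scripting those re-tests, submitting a scan through the URL Scanner API should boil down to one authenticated POST. A minimal sketch, with the caveat that the endpoint path, payload, and response shape below are assumptions to verify against Cloudflare's current API reference:

```python
# Hedged sketch of submitting a Cloudflare URL Scanner scan. The endpoint
# path, payload, and response shape are assumptions; check Cloudflare's
# current API reference before relying on any of this.
import json
import urllib.request

ACCOUNT_ID = "YOUR_ACCOUNT_ID"  # placeholder
API_TOKEN = "YOUR_API_TOKEN"    # placeholder; needs URL Scanner permissions

def submit_scan(url: str) -> dict:
    endpoint = (
        "https://api.cloudflare.com/client/v4/accounts/"
        f"{ACCOUNT_ID}/urlscanner/scan"
    )
    req = urllib.request.Request(
        endpoint,
        data=json.dumps({"url": url}).encode(),
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

print(submit_scan("https://kevquirk.com/"))
```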
#1383 is merged, so I'll close this issue. Thanks for all the input! Any contributions regarding the automatic size checker or other topics discussed here are welcome. I don't think I'll have the time to rewrite that script any time soon.