Learn how to optimize your website for the popular page speed tests!
- Do you want the fastest site possible?
- Do you want the best website functionality and SEO?
- Are you overwhelmed and don’t know what you’re doing?
Read on as I go over the most common speed test recommendations and tell you which ones to optimize and how to do it!
Want to skip ahead?
The truth about speed scores
In all honesty…
There’s nothing wrong with using speed tests. They work fine and are helpful…to professionals. They’re made for professionals and use terminology that professionals understand. It’s when you have naive users/clients trying to make sense of them that all of a sudden they become a problem. The tools measure so many different things, yet they don’t fully explain every little thing or why this or why that. They just give a little warning message and an over-simplified suggestion to correct.
But generally, I disagree with many of their recommendations and the way the results are laid out. They’re helpful if you know what you’re doing but completely misleading and confusing if you’re a noob. There’s a reason why TONS of experts tell you to ignore page scores!
I wrote my own guide as well:
And don’t take my word for it…read from other respected experts:
- Chasing the Perfect Rainbow – WordPress Speed Up Facebook group
- Speed Performance Grades don’t matter – Freelancer tools
- 5 Speed Optimization Myths – WP Rocket
- Fuck Pagespeed – Glueckpress
- Do Not Use Google Pagespeed Insights – WPFixIt
Are you educated yet? If so, you may continue.
Common Google Pagespeed Insights optimizations (and how to read them)
By far, my least favorite of all the page speed test tools out there. I list it first because it’s the most common and most annoying one that naive clients reference. They think this tool matters the most because it’s made by Google, which also runs the #1 search engine in the world.
But the reality is, there’s very little correlation between scoring well on this tool and ranking high in Google search. Don’t believe me? Try it for yourself…search Google for any random keyword, then run the Google Pagespeed Insights test on the top 10 results. I randomly searched “speed addiction” just now and got the score for the top result….hahaha, it was 21 out of 100! SEE?! The test is absolute bullsh*t!
Just so I don’t mince my words: I NEVER use GPI! It’s crap. It doesn’t give me any helpful details or recommendations to look at. And even the information it does give me is either completely unhelpful or wildly off the mark. You might also notice that GPI doesn’t even let you pick a server location to test from, which means it isn’t even measuring concrete times at all. It’s just kinda blind-guessing at how your website loads and how long it takes. Let me say it again…GPI is totally useless at best, and confusing/misleading at its worst. Just ignore it!!!
First contentful paint:
- Inaccurate! Just ignore and use the eye test.
- In theory: first contentful paint around 500ms is great, around 1 second is ok, and longer should be optimized.
- The problem with GPI is that it often way overstates your FCP time, reporting 5 seconds when browsing the site yourself takes more like 1 to 2 seconds.
Speed Index, Time to Interactive, First Meaningful Paint, First CPU Idle, Max Potential First Input Delay:
- Ignore all this crap. They often overstate your actual times.
Properly size images:
- Some of the recommendations here are legit and you actually do have images that should be optimized better via proper sizing or compression (there’s a quick markup sketch after this list).
- But some of the images may be intentionally left at higher quality (for better clarity), or larger size (for retina-compatibility).
- And some of the images don’t even have much to optimize. Should they really be hassling you about a 3KB optimization on a 1MB image? C’mon!
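If you’re wondering what “properly sized but still retina-friendly” even looks like, this is roughly it: give the browser a couple of candidates and let it pick. The file names, widths, and sizes below are placeholders you’d swap for your own:

```html
<!-- The browser downloads only the smallest candidate that still looks sharp
     for the slot the image actually occupies. -->
<img
  src="/images/hero-800.jpg"
  srcset="/images/hero-800.jpg 800w,
          /images/hero-1600.jpg 1600w"
  sizes="(max-width: 800px) 100vw, 800px"
  width="800" height="450"
  alt="Hero image">
```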
Defer offscreen images:
- NO! And I hate that the tool is so opinionated about this. As mentioned before, I HATE LAZY LOADING. Read my guide before you argue with me.
- I do not recommend lazyloading images for many sites. Why? Because it hurts user experience, delaying content load just so you can get “faster times”. And I will forever hate tools that keep telling me to lazyload. (If you absolutely must lazy-load, at least do it like the sketch below and keep above-the-fold images eager.)
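So here’s the compromise I’d grudgingly accept if a client insists: native browser lazy loading, with anything above the fold explicitly left alone. File names are placeholders:

```html
<!-- Above-the-fold image: load it immediately so visitors aren't staring at an empty hero area. -->
<img src="/images/hero.jpg" width="1200" height="600" alt="Hero" loading="eager">

<!-- Image way down the page: the browser only fetches it when the visitor scrolls near it. -->
<img src="/images/footer-map.jpg" width="600" height="400" alt="Location map" loading="lazy">
```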
Serve images in next-gen formats:
- Can be a valid point in regards to improving your image compression. But it can also be ignored if you know why you’re using specific image formats.
- I also hate that they’re so eagerly pushing the WebP format, which really isn’t that widely adopted yet across all devices, browsers, and image software. (If you do want to dabble, there’s a fallback-friendly markup sketch below.)
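The usual way to dabble in WebP without breaking older browsers is the picture element with a plain JPEG fallback. Paths are made up:

```html
<!-- Browsers that understand WebP grab the smaller file; everything else quietly falls back to the JPEG. -->
<picture>
  <source srcset="/images/product.webp" type="image/webp">
  <img src="/images/product.jpg" width="800" height="800" alt="Product photo">
</picture>
```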
Eliminate render-blocking resources:
- I do appreciate the tool trying to point out items that delay rendering, but do you even know WHY some resources SHOULD be render-blocking? It’s so you don’t get FOUT/FOUC issues, which is where your content loads before the CSS stylesheet, looks ugly for a moment, and then re-renders once the CSS arrives. Some CSS absolutely should be render-blocking (there’s a sketch of a sensible split after this list).
- Render-blocking JS also exists for a reason as well. Some JS is absolutely needed to render visual parts of your site and without loading that JS first, the content spills in an unwanted manner. I’ll put it this way…imagine trying to bring your friend a glass of wine by carrying over the wine first (in your hands) and then the glass. You see, the glass is absolutely needed to control the way the content is delivered. We cannot have content spilling out randomly without its intended rendering effect.
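For the curious, here’s the shape of that split when people do optimize it by hand: critical CSS stays render-blocking on purpose, and only JS that doesn’t draw anything gets deferred. The file names are made up for illustration:

```html
<head>
  <!-- Critical CSS: intentionally render-blocking so the page never paints unstyled (no FOUC). -->
  <style>/* above-the-fold styles inlined here */</style>

  <!-- Main stylesheet: also render-blocking, and that's fine. -->
  <link rel="stylesheet" href="/css/main.css">

  <!-- JS that doesn't render anything visual can safely wait until the HTML is parsed. -->
  <script src="/js/analytics.js" defer></script>
</head>
```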
Efficiently encode images:
- More image-related optimization suggestions…annoying! Don’t worry about this if your images are properly-sized and compressed!
Remove unused CSS:
- HAHAHAHA! Sorry, guys. This isn’t possible for most of you. This is because you use many plugins that have overlapping CSS styles. Perhaps your theme styled buttons a certain way, then you added a shopping plugin which styled buttons differently, and then later added a custom CSS plugin with its own styling that overwrote the previous two. In this case, the button style from the theme and shopping plugin would be considered “unused CSS”.
- There are only 2 ways to get rid of unused CSS. The easier way for most folks is to dequeue unnecessary CSS from themes/plugins (a quick sketch below). The best (but most technical) way is to custom-code your site so that you have only the exact code needed and no unused stuff. This is why I love having everything hardcoded. Our sites are always super clean and lean with no fluff.
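If you go the dequeue route, it’s a few lines in your child theme’s functions.php. The handle below is completely made up; you’d have to dig up the real handle the plugin registers (check the plugin’s code or your page source):

```php
<?php
// Rough sketch: drop a stylesheet that a plugin enqueues but your site never actually uses.
// 'shop-plugin-buttons' is a hypothetical handle.
add_action( 'wp_enqueue_scripts', function () {
    wp_dequeue_style( 'shop-plugin-buttons' );
    wp_deregister_style( 'shop-plugin-buttons' );
}, 100 ); // late priority so it runs after the plugin has enqueued its styles
```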
Reduce server response times (TTFB):
- Generally, anything around 200ms (0.2 sec) is good. I’ve seen GPI complain about a 0.15s TTFB before; I just ignore it as being stupidly inaccurate.
Ensure text remains visible during webfont load:
- No, that’s a stupid idea. That’s exactly how FOUT issues happen: your text looks ugly for a split second before the font loads, and then the page quickly re-renders and gives a jarring user experience. NO NO NO! (The font-display property sketched below is the knob that controls this behavior.)
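For reference, here’s where that knob lives: in your @font-face rule. The font name and file path below are placeholders:

```css
/* Hypothetical @font-face; swap in your own font name and file. */
@font-face {
  font-family: "BrandFont";
  src: url("/fonts/brandfont.woff2") format("woff2");
  /* "swap" is what the tool wants: show fallback text immediately, then swap the webfont in
     (that swap is exactly the FOUT re-render described above).
     "block" hides text very briefly until the font arrives, which avoids the jarring swap. */
  font-display: block;
}
```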
Avoid enormous network payloads:
- Somewhat valid warning suggesting your site should be more lightweight. I halfway agree with GPI’s suggestion here but overall don’t take them seriously whatsoever.
- It doesn’t really matter how big your pagesize is. I’ve seen some sites with 8MB of data load faster than other sites with only 1MB of data. So while yes, I do agree that sites should always be as light as possible, I don’t agree that their total size correlates well with load times. Why? Because it has to do with processing time and render-weight.
- Code takes longer to process and render than simple static assets. For this reason, a 2MB site (with 500KB HTML, 500KB CSS, 500KB JS, and 500KB images) will load slower than a 5MB site (with only 100KB HTML, 100KB CSS, 100KB JS, and 4.7MB images). But does GPI account for this? Of course not. It simply throws out an automated warning when your site goes above a certain pagesize.
- The bottom line? Yes, you should try to load only the items you really need but overall, you don’t even need to worry about this if your site is loading fast enough!
Minimize main-thread work:
- Ooooh, seemingly helpful metrics about JS rendering times. All these are theoretically important and yes, you should try to load as little JS as possible. Only two ways to do this…either remove some of your bloated plugins or pick ones with lighter code, or recode your JS better (probably not the issue for legit developers).
- My gripe about GPI’s metric here is that it’s inaccurate. It’s saying one of my client sites takes 14 seconds when in reality, the entire page loads in a few seconds. Sure, there might be some background JS still lingering but they don’t affect page load or function! ARGH!
Serve static assets with an efficient cache policy:
- Another one of those canned recommendations that I absolutely hate. YES, static assets (which don’t change often) should generally be cached for a long time.
- HOWEVER, not all static assets should be cached. Some are related to your site design and functions and shouldn’t be cached so that your users can see the latest version of your site when you make changes!!!
- Also, some static assets are loading from a 3rd-party server (like Google Analytics, API scripts, or webfonts) and because they aren’t loading from your site, you have no control over how they are loaded! Do you really think 3rd-party services wouldn’t have cached these assets and reduced their server load if they didn’t have a reason to leave them uncached??? (For the assets you DO control, there’s a sample cache policy sketched below.)
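Assuming you’re on an Apache server, a long cache policy for your own static assets is usually just a few lines of .htaccess. The lifetimes below are starting points, not gospel:

```apacheconf
# Rough sketch using mod_expires; adjust lifetimes to how often you actually change these files.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg              "access plus 6 months"
  ExpiresByType image/png               "access plus 6 months"
  ExpiresByType text/css                "access plus 1 month"
  ExpiresByType application/javascript  "access plus 1 month"
</IfModule>
```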
Reduce javascript execution time:
- Really handy JS diagnostic to tell you how long each JS file takes to execute. The issue is that it overstates the execution times IMO, and also doesn’t tell newbies how to resolve the issue.
- If you didn’t code this JS yourself, then you only have one choice to optimize it: get rid of it. Yes, this might mean you lose whatever function it serves. If you still want that same function, you either load another theme/plugin that is coded more efficiently or custom-code it yourself.
Avoid excessive DOM size:
- Valid metric here.
- You can either have fewer things on the page or have better coded theme/plugins, or recode the page to be lighter on the DOM.
Minimize Critical Requests Depth:
- This basically means your theme and/or plugins are too bloated. And that you have too many things loading and too many things loading other extra things.
- Pick leaner themes and plugins, and/or custom-code some things yourself. Or just get rid of non-essential visual elements and functions.
Keep request counts low and transfer sizes small:
- Honestly, this doesn’t matter as long as you made your site as lightweight as possible. It really doesn’t matter if you have 100-200 requests and some of the file sizes are large.
- What matters is that your site loads quickly. And if your site isn’t loading quickly, then you can look at this list to see which types of resources you have loading.
Common GTmetrix recommendations (and how to read them)
GTmetrix is by far my favorite performance testing tool. It measures a ton of little things, actually gives concrete numbers and nice visual charts, and has many tabs and sub-tabs chock full of helpful details for developers to optimize their sites.
Signing up for a free account allows you to choose from many test locations, and also save 30 days of your past tests for easy comparison over time. If you only have time for one speed tool, use THIS one!
Serve scaled images:
- Valid metric. Make sure your images are properly cropped and sized to the dimensions at which they’re displayed. With that said, having an image slightly bigger than the space it’s given is not so bad!
- However, there are some exceptions where you want over-sized images for retina-compatibility purposes.
- You can fix this by resizing the images better or having a theme/plugin that correctly resizes the displayed images for you.
- My issue with this metric is that it too often shows an alarming “F” score when it’s not that big of a deal at all. In many cases, it doesn’t noticeably affect your page load time.
Serve resources from a consistent URL:
- Valid metric. Your site should definitely serve all resources from the same domain, and consistently over HTTP or HTTPS, with-WWW or without-WWW.
- EXCEPT this recommendation doesn’t apply when you’re using a CDN, since a CDN obviously needs to serve assets from its own URL.
Leverage browser caching:
- Another one of those canned recommendations that I absolutely hate. YES, static assets (which don’t change often) should generally be cached for a long time.
- HOWEVER, not all static assets should be cached. Some are related to your site design and functions and shouldn’t be cached so that your users can see the latest version of your site when you make changes!!!
- Also, some static assets are loading from a 3rd-party server (like Google Analytics, API scripts, or webfonts) and because they aren’t loading from your site, you have no control over how they are loaded! Do you really think 3rd-party services wouldn’t have cached these assets and reduced their server load if they didn’t have a reason to leave them uncached???
Defer parsing of Javascript:
- Render-blocking JS exists for a reason. Some JS is absolutely needed to render visual parts of your site and without loading that JS first (aka “critical JS”), the content spills in an unwanted manner. I’ll put it this way…imagine trying to bring your friend a glass of wine by carrying over the wine first (in your hands) and then the glass. You see, the glass is absolutely needed to control the way the content is delivered. We cannot have content spilling out randomly without its intended rendering effect.
- You can and should optimize this but it’s up to you to know which JS should be deferred and which should not (there’s a bare-bones example after this list). You should also be careful to defer it in a way that doesn’t break your page design or function.
- Most likely, the easiest optimization for most of you is to avoid as many nonessential design effects, features, and plugins as possible.
- I also hate that this tool doesn’t know which JS is critical and should be render-blocking and so it naturally recommends to defer all of it. HELL NO! Do you want the JS mobile menu to show up last for mobile-visitors? Do you want your ATF slider to render last? I think not!
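Here’s the bare-bones version of doing it selectively. The file names are hypothetical; the point is that critical, design-related JS loads normally while the non-visual stuff waits:

```html
<!-- Critical JS (draws the mobile menu): leave it alone, let it load right away. -->
<script src="/js/mobile-menu.js"></script>

<!-- Non-visual JS (tracking, chat widgets): safe to defer until the HTML is parsed. -->
<script src="/js/marketing-tracker.js" defer></script>
```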
Combine images using CSS sprites:
- This is a silly, cumbersome tactic that was popular back in the day, like 10-20 years ago.
- While it can still occasionally be useful, it doesn’t have much effect now that the HTTP/2 protocol is widespread. It’s also really tedious to build CSS sprites.
Minimize redirects:
- This metric is sometimes valid but other times points to things you have no control over.
- Any redirect chains caused by your site should absolutely be fixed. For example: all URLs on your site should use a consistent domain (HTTP or HTTPS, with-WWW or without-WWW); a one-hop redirect rule is sketched after this list. Sometimes, the issue is simply that you typed the HTTP version of your domain into the test instead of HTTPS.
- Any redirect chains caused by 3rd-party assets (loaded from 3rd-party servers) cannot be fixed or controlled by you. You have to ignore them, or stop using whatever plugin/service that’s making those redirected requests.
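If your own domain is the culprit and you’re on Apache, a single mod_rewrite rule can send everything to the canonical version in one hop instead of chaining http-to-https-to-non-WWW. The domain below is a placeholder, and whether you prefer WWW or non-WWW is up to you:

```apacheconf
# Sketch: force https + non-WWW in ONE redirect (assumes mod_rewrite is enabled).
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]
```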
Specify a cache validator:
- Technical metric that isn’t meant to be read by non-developers. Most of the time, they’re referencing 3rd-party assets you have no control over. Just ignore!
Avoid CSS @import:
- Valid metric pointing out something that does affect your page render times. Unfortunately, CSS @import is often used by themes and plugins for various reasons; I’d like to say it’s out of laziness. At this point, I recommend you either avoid using those themes/plugins or find other ways to include that CSS yourself.
- Want to manually optimize and include the CSS yourself? Try this guide (Gift of Speed) or this guide (Varvy), or see the quick before/after below.
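The before/after is about as simple as it gets: instead of one stylesheet pulling in another with @import (which forces a second, serialized download), link both sheets directly. File names are placeholders:

```html
<!-- Instead of main.css containing:  @import url("fonts.css");  ...link both directly: -->
<link rel="stylesheet" href="/css/main.css">
<link rel="stylesheet" href="/css/fonts.css">
```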
Optimize Images:
- Useful metric to let you know which images can be optimized more.
- Here’s my thing. Smaller or less important images should be optimized as much as possible.
- But for important images that need pristine quality (products, photography, etc), you may want to use higher quality than what this test recommends. This is why it’s important to know what fits best for your site instead of listening to some automated tool.
Specify Image Dimensions:
- I don’t think this is really necessary and I’m too lazy to explain why.
- Ok fine, I’ll try a short version. Basically, it might be more storage-efficient to reuse certain images in different places on your site even when they don’t match the dimensions perfectly. I also think that not setting the dimensions only marginally affects the initial paint/rendering, not the end result. (And for what it’s worth, “specifying dimensions” is just the width/height attributes shown below.)
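The path below is a placeholder; the attributes are the whole trick. With explicit dimensions the browser can reserve the space before the image downloads, so the layout doesn’t jump around during the initial paint:

```html
<!-- Width and height let the browser reserve the slot before the file arrives. -->
<img src="/images/team-photo.jpg" width="800" height="533" alt="Team photo">
```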
Minify Javascript:
- I hate how much of a big deal test scores make out of this. A lot of times, saving 50% of some JS file is only saving 1KB. Most of the time, minifying JavaScript doesn’t even make that big of a difference…and if it did, then your issue is that you have too much JS in the first place!
- Btw, if you really want to minify your Javascript, you can just do it from Cloudflare at the DNS level instead of wasting processing power on some PHP plugin.
Inline Small Javascript:
- Yes, it’s true. Inlining some small external JS is probably more efficient than making a separate HTTP request for it.
- Thing is, it’s usually called as an external request for a reason! And if you’re not a coder, you won’t know whether or not it should be inlined. Why go through this hassle for such a tiny gain?
Optimize the order of styles and scripts:
- Somewhat pointless metric. It’s not that your script load order doesn’t matter, but that the only reason it’s loaded in this un-optimal manner in the first place is your use of bloated themes and plugins.
- So again…either get rid of some features and functions, or manually hard-code yourself.
- And even then, the impact is not always as big of a deal as this warning makes it out to be. In some cases, the seemingly un-optimal load order was intended!
Minify HTML:
- Not really that big of a deal, IMO. Sure, it makes your site a tiny bit more lightweight but at what cost?
- Minifying your site for free at the DNS level with Cloudflare is my favorite option. Do it at the server level on-the-fly upon each initial page load request?…I think that’s absolutely silly.
- This is one of those recommendations that impact more when your site is bloated, and IF your site is bloated, then we know what you should really be focusing on is reducing the bloat and not wasting more server processing to minify stuff!
Minify CSS:
- Same explanation as above.
Specify a character set early:
- Not that big of a deal and not much impact. I will say that if you’re seeing this metric, your theme probably dropped the ball on that!
- You can manually specify a character set by following this guide, or just make sure your theme’s head opens like the snippet below.
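This is all the warning is really asking for: declare the charset before anything else in the head so the browser doesn’t have to guess the encoding while parsing.

```html
<head>
  <meta charset="utf-8">
  <title>Your page title</title>
  <!-- everything else comes after the charset declaration -->
</head>
```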
Enable gzip compression:
- Totally valid metric but somewhat annoying in its lack of explaining alternate scenarios.
- FIRST OFF: you should totally be using GZIP compression as it greatly reduces the size of your site assets, making them quicker to transfer and load (there’s a sample config after this list). HOWEVER, you can totally ignore GZIP and also this recommendation if you’re already using BROTLI compression on your server, which is even better than GZIP!
- SECONDLY: you should also ignore this recommendation if it’s referencing assets being loaded off 3rd-party servers (since you have no control over them!)
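If you’re on Apache and GZIP isn’t on yet, it’s typically a few lines of .htaccess (skip this entirely if your host already serves Brotli):

```apacheconf
# Rough sketch using mod_deflate: compress the text-based stuff; images are already compressed.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript image/svg+xml
</IfModule>
```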
Specify a Vary: Accept-Encoding header:
- Not a big deal in my opinion. But again, here’s my repeated pet peeve…it often references 3rd-party assets which you have no control over.
- More info on fixing this issue in this guide.
Avoid bad requests:
- Totally legit metric. You shouldn’t make any calls to assets that don’t exist. Maybe you’re referencing non-existent things, or maybe you’ve mis-spelled some links and image names. Fix them!
Avoid landing page redirects:
- Legitimate recommendation!
- But it could also be that you typed the wrong version of your domain into the test. Or maybe the redirect is intentional?
Enable Keep-Alive:
- This recommendation is generally a good thing in the page load performance world.
- But in the server world, there are varying opinions. Some say to enable it but set a proper time limit (the relevant Apache settings are sketched after this list); others say you might be better off disabling it. You can decide which is best by factoring in your website, traffic, and server capacity.
- This is one of those things where you have to be a server person to know if you should have it on or off, or to even enable it yourself.
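For reference, these are the Apache settings people are arguing about. The numbers are typical starting points, not recommendations for your particular server:

```apacheconf
# Keep connections open for reuse, but don't let idle ones hog server slots.
KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 5
```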
Inline small CSS:
- Sure, you can do it if you know how and if you know whether or not it’s more efficient loaded this way. How small is small? And does it really have to be inlined if those static assets are already cached? Does it really have to be inlined if it’s not even used for critical rendering? Ehhh, probably too much manual coding skill required and lots of little nuances to consider for something that probably won’t impact you much anyway!
Minimize request size:
- Highly ideal! Try to keep it small so your page load feels snappier!
Put CSS in the document head:
- Yes, it’s a valid general recommendation. But doesn’t have to be followed to a tee if you know what you’re doing.
Prefer asynchronous resources:
- Yes, this is generally recommended.
Avoid a character set in the meta-tag:
- Not a big deal either way. Here’s how to do it.
Avoid empty src or href:
- I think these issues are relatively negligible but can have some effect on your server if you have tons of them and/or tons of traffic. I hate that the tool won’t even tell you where the empty src or href are.
Put JavaScript at bottom:
- The conventional logic behind this suggestion makes total sense. HTML is the content, CSS is the visual styling, and JS is very often related to functions. So generally we think HTML and CSS should load first and JS should be loaded last. The only problem with this logic is that nowadays, JS is very often used for design purposes. It’s often used to load sliders, or theme elements, or many other visual elements. Delay that JS, and you would be delaying your page load.
- So my point is…the suggestion isn’t always relevant and definitely not for every JS. So it’s up to you to know which JS can be safely deferred and which JS should be left to load as quickly as possible. And once you know that, you can just ignore this suggestion completely.
Common Pingdom Speed Test suggestions (and how to read them)
Pingdom used to be one of my favorite speed tools (mostly because of their cute/friendly UI), but has since become a bit annoying to use. It’s STILL somewhat useful and relevant, as it does give helpful info and is also used by many clients and developers alike. I also think part of its popularity is because it records the fastest times compared to GPI and GTmetrix. This is because Pingdom doesn’t record things like favicon load time which often drags out the load time for GTmetrix.
The reason why I don’t like it now like I did before is because of all the limitations. Pingdom improved their UI but got stingy with their free service. The test doesn’t let you run multiple tests at once; it seems like you can only test a domain once every 5 minutes or so. If you try to run it again too soon, you either get a warning message or a “cached” repeat score that you already saw before. There are also times when it won’t run the test from the server location you choose…VERY ANNOYING! I do like that the Pingdom test score URLs seem to be saved for a lot longer. I feel like you could open the test score URLs several months later and still see the results, whereas GTmetrix only saves for 30 days. GPI doesn’t allow you to save them at all, I believe.
Reduce DNS lookups:
- This suggestion makes little sense to me. It’s obvious that you shouldn’t trigger multiple DNS lookups for variations of the same domain. But when you have resources loading from several different domains, as is the case with many sites nowadays, this suggestion no longer fits the bill. Even an “average” website nowadays will load from the origin domain, then a webfont domain, then a marketing tracker script, then a chat script, and some others.
- My point is, there isn’t much you can do about this suggestion. As long as you’re not calling resources from different versions of the same domain, you’re fine. It would be NICE if this tool would at least let you know all the hostnames (and their variants) being requested.
- Some more info on this suggestion if you want to “optimize” it.
Make fewer HTTP requests:
- I’ve got mixed feelings here. The obvious response is DUH!!! Fewer requests is better than more requests.
- But the question lies in HOW you reduce your requests. If you’re doing it by actually removing requests, that’s fantastic. But if you’re doing it only by combining CSS and JS, that’s not exactly helping. Sure, we can debate all day about whether or not combining files still matters now that HTTP/2 handles parallel requests just fine, but merging requests is not the same thing as actually getting rid of them.
Compress components with gzip:
- Totally valid metric but somewhat annoying in its lack of explaining alternate scenarios.
- FIRST OFF: you should totally be using GZIP compression as it greatly reduces the size of your site assets, making them quicker to transfer and load. HOWEVER, you can totally ignore GZIP and also this recommendation if you’re already using BROTLI compression on your server which is even better than GZIP!
- SECONDLY: you should also ignore this recommendation if it’s referencing assets being loaded off 3rd-party servers (since you have no control over them!)
Use cookie-free domains:
- A terribly-unexplained “junk” suggestion to me. It basically complains if you’re using cookies but doesn’t tell you which cookies you’re using or even explain why your site might be using cookies.
- In case you don’t know, there are many reasons for using cookies. They’re used for managing user sessions (logged-in users), remembering previous user choices (GDPR, newsletter pop-ups), or tracking purposes.
- Are you absolutely sure you don’t need or shouldn’t have cookies? Try following this guide or this one.
Add Expires headers:
- Useful suggestion of telling users’ browsers to cache static assets that don’t change often. The only problem is that this suggestion/score is often complaining about 3rd-party assets loading off external servers that you have no control over.
Avoid URL redirects:
- This one is a no-brainer. Ideally, you should not be redirecting your domain to another one. The reason some people see this is because they entered the wrong version of their domain into the test, or maybe the warning refers to some 3rd-party links redirecting themselves.
Configure entity tags (ETags):
- ETags help browsers save time by letting them know whether resources can be loaded from local cache rather than re-downloading from the origin server. With that said, they aren’t always the best option depending on your scenario. Either way, this metric isn’t a huge deal IMO since most visits are first-time visits anyway.
- Learn more about ETags here. (And if you decide you don’t need them, the usual one-liner “fix” is sketched below.)
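That “fix”, assuming an Apache server, usually just turns ETags off and leans on Expires/Cache-Control headers instead (common when you’re behind a CDN or multiple servers where the ETags never match anyway):

```apacheconf
# Sketch: stop emitting ETags and rely on Expires / Cache-Control headers instead.
FileETag None
<IfModule mod_headers.c>
  Header unset ETag
</IfModule>
```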
Raphael Bolius
Thank you for that article! I recommended it just now to a client that complained about the Google tool. 😉 Let me say some more words about pagespeed:
1) The main problem with pagespeed is that clients want everything. And they do not only want everything, they want everything at the same time. That is because there is a myth on the web that says: everything on the web is possible and it is for free. But not everything is possible and REALLY not many things are free. With Google and Facebook, e.g., you pay with your data. And you pay more than you can ever imagine.
2) Fast loading time helps the planet by reducing the amount of energy that the server needs to deliver the website. As the web is responsible for 2-4% of global energy consumption, it might be a good idea to build websites that use less energy and set less CO2 free. I offer an overview of 10 “green” websites – such as the website of the green party in Germany and the website of Greenpeace Germany – on my blog: https://gruenkraft.design/webdesignblog/energieverbrauch-und-co2-emissionen-bekannter-webseiten/ I wonder why nobody is interested in that.
Ah, maybe it is because the big theme producers and plugin producers do not want people to know too much. People who understand the web are bad clients. They want to have clients with the intelligence of an ape (no insult to chimpanzees intended! 😉 ) who just needs to know how to press a button, and then he or his WP theme uploads his 20MB image file directly from the camera, and then, in order to keep his site from crashing completely, he has to buy a plugin that crops his image. (I had a photographer as a client who used to have a loading time of 1 minute 😉 I coded him a simple theme and added a caching plugin. Now he has exactly the same site with a loading time of 1.5 sec. But no more gadgets in the backend. 😉 )
3) Oh yes, here we go: it is the ape. The ape who does not want to understand anything about the web. He just wants to have a wonderful website, with no coding knowledge, in 5 seconds. It is the same guy who sells his data to Facebook so he can talk with non-existing friends about crap that interests nobody, so he can receive as many “likes” as possible.
PUA! Sorry, I get into a rage, because that is my daily business:
1) The client wants a website.
2) No, it cannot be handcrafted, that is too expensive.
3) Yes, we have to use a crappy premium theme and 500 plugins.
—
4) Oh, my god! Yesterday I was in the countryside and used my smartphone. The website needed soooo long to show. What is the problem, mister webdesigner?
Johnny
Hahaha, Raphael. You illustrate the problem with many naive clients perfectly. It’s ok. We all learn eventually!
rupam krishna Bharali
An all in one article.
I actually like it because you have covered all the important stuff nicely.
And about lazyload: what I saw is that when it was activated my page bounce rate was 71%, which has now dropped to 56% after deactivating it. Thanks to the plugins, because they allow us to uncheck it.
I have one more thing to say regarding “cookie-free domains”: my old site rankwp.in is having some issues, so a better version of the article that you are linking to has been republished on wpblogging.in.
Link: https://wpblogging.in/site-optimization/4-ways-to-setup-cookie-free-domains/
You can update this link to the new domain so that people can read it without facing any problem.
….. have a great day.
Johnny
Ahhh, fixed it! Thanks for stopping by.
Siddhit
I don’t have the words to explain it, but you are actually a lifesaver for us. I am a fan of your articles from now on. You really explained each point pixel by pixel from a beginner’s point of view.
Johnny
Glad you like it, Siddhit!