The document discusses various techniques for improving website speed, organized into three main categories: transmission, rendering, and serving. Transmission-focused techniques include image compression, minification, HTTP compression, and expires headers. Rendering optimizations involve load order, lazy loading, and parallel downloads. Serving improvements involve using a CDN, disk caching, keep-alive headers, and pre-rendering. The document emphasizes testing techniques like Google PageSpeed Insights and HAR files to diagnose bottlenecks and measure the impact of changes.
This is going to be a super rapid-fire walk through site speed issues and how to deal with them. It’ll be a bit terrifying if you’re not a serious nerd.
The main goal here is to walk through it all and give you an idea of the relative impact and difficulty of each technique. With that, you can decide what to do next and research further if necessary.
Everything’s rated by difficulty and impact.
And, these aren’t every technique. They’re just the ones I’ve found to offer the most impact for the time and money spent.
Well, of course, faster is better.
You do need to test stuff. I’ll bring that up a few times.
If you put in the effort to create great content, don’t kill it w/ a slow site.
There is SOME correlation to organic search rankings. But it’s weak. Google says it’s a ‘tie breaker.’
The correlation to lead generation and sales, on the other hand, is VERY strong.
Getting under 2 seconds generates a very high return.
So the real question is: Why on earth would you not do this? Before we go through the techniques and tricks, though, I need to talk about bottlenecks and measurement.
There are three places that can slow sites down – three bottlenecks or chokepoints.
Serving the content – the web server generating web pages and content – is the first bottleneck. It’s hard to make a site faster if the server’s slow.
The bottleneck most people focus on is transmission: Basically, how your site uses bandwidth.
And the most-forgotten bottleneck is rendering. Your browser has to draw the page or content. If the page/content has problems, it can slow rendering, which creates a slower perceived load time.
There are tools that let you measure and diagnose issues at all three chokepoints.
If you want basic, easy-to-read site speed diagnosis, Google PageSpeed Insights is great. If you’re just starting out with a site performance audit, start here.
Google PageSpeed Insights mostly focuses on transmission – on bandwidth.
Don’t worry – it’s in the link
Yslow has more oomph. If you want to really dig in, Yslow looks at more metrics.
Yslow is very, very powerful.
It’s less intuitive. But it’s worth using if you’ve exhausted Google PageSpeed Insights.
WebPageTest.org – I’d love to love it. It’s just got some issues.
It captures such a nice set of metrics. But the diagnostics and the HTTP Archive are harder to get at.
When you’re ready to really nerd out – learn to read and use an HTTP Archive, or HAR.
You can generate it in Chrome w/ a few clicks.
Just open the console, refresh the page, and you’re good to go.
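Once you’ve saved a HAR from Chrome’s console (right-click in the Network tab and choose “Save all as HAR”), you can slice it however you like. Here’s a small sketch – `slowest_requests` is a hypothetical helper, and the structure it reads (`log.entries[].time`, in milliseconds) follows the HAR 1.2 format:

```python
import json

def slowest_requests(har_text, top=5):
    """Parse HAR JSON and return (url, ms) for the slowest requests.
    Per the HAR 1.2 format, log.entries[].time is total request time in ms."""
    har = json.loads(har_text)
    entries = har["log"]["entries"]
    ranked = sorted(entries, key=lambda e: e["time"], reverse=True)
    return [(e["request"]["url"], round(e["time"])) for e in ranked[:top]]

# A tiny two-entry HAR, just for illustration
sample = json.dumps({"log": {"entries": [
    {"time": 120.0, "request": {"url": "https://example.com/a.js"}},
    {"time": 480.0, "request": {"url": "https://example.com/hero.png"}},
]}})
print(slowest_requests(sample, top=1))  # [('https://example.com/hero.png', 480)]
```

Sorting by total time is the fastest way to spot the one image or script that’s dragging everything else down.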
Now we get to the heavy stuff: How to actually diagnose and fix/optimize
Starting w/ Transmission
In this HAR diagram, the blue line is download time. It’s an indicator of bandwidth used.
This is the single easiest win in all of site speed.
Google PageSpeed will provide the analysis
And, of course, the HAR will do it.
A few quick examples. This image totally uncompressed was 4 megabytes. Doing even a tiny bit of compression reduced it to 400kb.
Look at what happens – can you see a difference? No? But it’s half the size.
Even smaller
And yes, even smaller
PageSpeed isn’t bad, either, but…
It only uses lossless – PNG – compression.
You can use Photoshop, of course, but here are two other tools. Be very careful about using web-based compression tools. Make sure they’re reputable.
Always always always do this.
Using the right image format is another one
JPG is for photos, PNG is really for line art. So convert my friend to PNG and it balloons back to 1 megabyte.
Put our logo in JPG and it’s 60k
In PNG, it’s 13kb. And this isn’t even the vector version
Take out some colors and it’s down to 8kb.
Always always always do this.
This is an unminified file.
This is the same file, minified. The difference? Tabs and blank lines are all removed. Invisible characters are still characters!!! Removing them from a large file makes a huge difference. Obviously, keep an editable version of the files. How can you do this?
In Google PageSpeed, you can download the minified files with a click.
But many libraries already come w/ minified versions – Jquery, for example.
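To make “invisible characters are still characters” concrete, here’s a toy minifier sketch. This is deliberately crude – real minifiers (cssnano, UglifyJS and friends) parse the code properly – but it shows where the savings come from:

```python
import re

def minify_css(css):
    """Crude CSS minifier: strips comments and collapses whitespace.
    Real minifiers are much smarter -- this just shows the idea."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # drop spaces around punctuation
    return css.strip()

original = """
/* site styles */
body {
    margin: 0;
    font-family: sans-serif;
}
"""
print(minify_css(original))  # body{margin:0;font-family:sans-serif;}
```

Every tab, newline and comment you see in the original is a byte (or more) the visitor has to download.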
Again, there’s no reason not to.
Seriously. Doesn’t get much easier.
I’ve never seen this break anything. I’ve heard rumors it can cause issues, which is why I advocate testing. But in 20 years I’ve never seen an issue.
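HTTP compression (listed among the transmission techniques at the top) is another near-free win. It’s normally switched on in the web server config – e.g. Apache’s mod_deflate or nginx’s gzip directive – but a quick Python sketch shows why it’s worth it on repetitive text like HTML and CSS:

```python
import gzip

# Repetitive text -- which is what markup and stylesheets are -- compresses extremely well
css = ("body { margin: 0; padding: 0; }\n" * 200).encode()
compressed = gzip.compress(css)
print(f"{len(css)} bytes -> {len(compressed)} bytes")
```

The browser decompresses transparently; the visitor just sees a faster page.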
Set expires headers for static files
When you set far-future expires headers, the server tells visiting browsers that a particular file isn’t going to change any time soon.
Google PageSpeed will show you issues. So will Yslow, etc., but PageSpeed is easy.
But you can’t usually do this w/ third-party scripts.
Really, it’s any site. But expires headers work well for ‘static’ files that don’t change that often. So that’s when/where to use it.
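In practice you set expires headers in the server config (mod_expires in Apache, the expires directive in nginx). As an illustration only, here’s what the header values look like – `far_future_expires` is a hypothetical helper, and one year is just a common choice for assets that rarely change:

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def far_future_expires(days=365):
    """Build Expires / Cache-Control header values for a static file."""
    expires_at = datetime.now(timezone.utc) + timedelta(days=days)
    return {
        "Expires": format_datetime(expires_at, usegmt=True),  # HTTP date format
        "Cache-Control": f"public, max-age={days * 86400}",   # max-age is in seconds
    }

print(far_future_expires())
```

If a file does change within that window, the standard trick is to rename or version it (e.g. `style.v2.css`) so browsers fetch the new copy.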
Test all of these things. Again – I haven’t seen them break anything, but…
Some resources can block page load.
At its most basic: Load CSS first. Defer javascript as long as possible. That means all CSS includes should come before any javascript includes. But it’s more subtle than that.
You can load javascript in parallel.
Google will show render-blocking javascript. That’s the stuff that’s in the wrong load order – the page is stuck.
Use HAR to really drill down and get a look at what loads when.
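You can even script a rough load-order check yourself. This sketch (the helper names are mine, not from any tool) scans markup for stylesheets and scripts, skipping `defer`/`async` scripts since those don’t block rendering:

```python
from html.parser import HTMLParser

class HeadOrder(HTMLParser):
    """Records the order of stylesheet links and blocking script tags."""
    def __init__(self):
        super().__init__()
        self.sequence = []
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "stylesheet":
            self.sequence.append(("css", attrs.get("href")))
        elif tag == "script" and "defer" not in attrs and "async" not in attrs:
            self.sequence.append(("js", attrs.get("src", "(inline)")))

def css_loads_first(html):
    """True when no blocking script appears before a stylesheet."""
    p = HeadOrder()
    p.feed(html)
    kinds = [k for k, _ in p.sequence]
    return "js" not in kinds or "css" not in kinds[kinds.index("js"):]

good = '<link rel="stylesheet" href="a.css"><script src="b.js"></script>'
bad = '<script src="b.js"></script><link rel="stylesheet" href="a.css">'
print(css_loads_first(good), css_loads_first(bad))  # True False
```

It’s a blunt heuristic – Google’s render-blocking report is smarter – but it catches the obvious “script above the stylesheet” mistakes.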
Do the same for javascript. This ensures visiting browsers only load the javascript/css once per session. Then they cache it locally.
Yslow is my favorite tool here.
I don’t know why you’d write crappy code. It’s actually more work.
A CDN stores copies of your static files on servers spread around the world, closer to your visitors.
This not only means faster load times because the serving machine is closer – it also means your main server doesn’t have to retrieve and deliver the files.
Retrieving from a database is slower than retrieving from disk (you can also cache in memory – that’s a separate topic, really)
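Here’s the disk-caching idea in miniature. Everything here is a sketch with made-up names – `build` stands in for an expensive database query plus render step, and a real setup would use a fixed cache directory and some invalidation strategy:

```python
import hashlib
import os
import tempfile

CACHE_DIR = tempfile.mkdtemp()  # in production this would be a fixed path

def cached_fragment(key, build):
    """Serve a rendered fragment from disk; fall back to building it once."""
    path = os.path.join(CACHE_DIR, hashlib.sha1(key.encode()).hexdigest() + ".html")
    if os.path.exists(path):
        with open(path) as f:       # cache hit: read straight from disk
            return f.read()
    fragment = build()              # cache miss: do the expensive work once
    with open(path, "w") as f:
        f.write(fragment)
    return fragment

calls = []
def slow_build():
    calls.append(1)                 # count how often we hit the "database"
    return "<ul><li>post</li></ul>"

cached_fragment("sidebar", slow_build)
cached_fragment("sidebar", slow_build)
print(len(calls))  # 1 -- the second request came from disk
```

Most CMS caching plugins (WordPress page caches, for instance) are doing a more sophisticated version of exactly this.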
I won’t go into a lot of detail here – code acceleration usually caches executed code so that the server doesn’t have to re-run the code.
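Code acceleration itself happens at the server level (PHP’s OPcache is the classic example), but the closest in-application analogy is memoization – cache a computed result so repeat requests skip the work. A minimal sketch, with a made-up `render_page`:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def render_page(slug):
    # Pretend this runs templates and queries; the decorator caches
    # the result so repeat calls with the same slug skip the work.
    return f"<h1>{slug.replace('-', ' ').title()}</h1>"

render_page("site-speed")  # computed once
render_page("site-speed")  # served from cache
print(render_page.cache_info().hits)  # 1
```

Same principle as the opcode cache: do the expensive step once, then serve the saved result.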