The document discusses various techniques for improving website speed, organized into three main categories: transmission, rendering, and serving. Transmission-focused techniques include image compression, minification, HTTP compression, and expires headers. Rendering optimizations involve load order, lazy loading, and parallel downloads. Serving improvements involve using a CDN, disk caching, keep-alive headers, and pre-rendering. The document emphasizes testing techniques like Google PageSpeed Insights and HAR files to diagnose bottlenecks and measure the impact of changes.
This is going to be a super rapid-fire walk through site speed issues and how to deal with them. It'll be a bit terrifying if you're not a serious nerd.
The main goal here is to walk through it all and give you an idea of the relative impact/difficulty of each technique. With that, you can decide what to do next and research further if necessary.
Everything's rated by difficulty and impact.
And these aren't all the techniques. They're just the ones I've found to offer the most impact for the time and money spent.
Well, of course, faster is better.
You do need to test stuff. Iāll bring that up a few times.
If you put in the effort to create great content, don't kill it w/ a slow site.
There is SOME correlation to organic search rankings. But it's weak. Google says it's a "tie breaker."
The correlation to lead generation and sales, on the other hand, is VERY strong.
Getting under 2 seconds generates a very high return.
So the real question is: Why on earth would you not do this? Before we go through the techniques and tricks, though, I need to talk about bottlenecks and measurement.
There are three places that can slow sites down ā three bottlenecks or chokepoints.
Serving the content (the web server generating web pages and content) is the first bottleneck. It's hard to make a site faster if the server's slow.
The bottleneck most people focus on is transmission: Basically, how your site uses bandwidth.
And the most-forgotten bottleneck is rendering. Your browser has to draw the page or content. If the page/content has problems, it can slow rendering, which creates a slower perceived load time.
There are tools that let you measure and diagnose issues at all three chokepoints.
If you want basic, easy-to-read site speed diagnosis, Google PageSpeed Insights is great. If you're just starting out with a site performance audit, start here.
Google PageSpeed Insights mostly focuses on transmission ā on bandwidth.
Don't worry, it's in the link.
Yslow has more oomph. If you want to really dig in, Yslow looks at more metrics.
Yslow is very, very powerful.
It's less intuitive. But it's worth using if you've exhausted Google PageSpeed Insights.
WebPageTest.org: I'd love to love it. It's just got some issues.
It captures such a nice set of metrics. But the diagnostics and the HTTP Archive are harder to get at.
When you're ready to really nerd out, learn to read and use an HTTP Archive, or HAR.
You can generate one in Chrome w/ a few clicks. Just open the developer tools' Network panel, refresh the page, and you're good to go.
Now we get to the heavy stuff: how to actually diagnose and fix/optimize.
Starting w/ Transmission
In this HAR diagram, the blue line is download time. It's an indicator of bandwidth used.
This is the single easiest win in all of site speed.
Google PageSpeed will provide the analysis
And, of course, the HAR will do it.
A few quick examples. This image, totally uncompressed, was 4 megabytes. Even a tiny bit of compression reduced it to 400kb.
Look at what happens: can you see a difference? No? But it's half the size.
Even smaller
And yes, even smaller
PageSpeed isn't bad, either, but…
It only uses lossless (PNG) compression.
You can use Photoshop, of course, but here are two other tools. Be very careful about using web-based compression tools; make sure they're reputable.
Always always always do this.
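If you'd rather script the compression than click through a tool, here's a minimal sketch. It uses the Pillow imaging library, which is my own choice for illustration, not something mentioned above; the function name is made up too.

```python
from PIL import Image  # Pillow: pip install Pillow

def compress_jpeg(src_path: str, dst_path: str, quality: int = 75) -> None:
    """Re-save an image as a JPEG at a lower quality setting.

    quality=75 is a reasonable default; even quality=85 usually cuts
    file size dramatically with no visible difference.
    """
    img = Image.open(src_path)
    img = img.convert("RGB")  # JPEG has no alpha channel
    img.save(dst_path, "JPEG", quality=quality, optimize=True)
```

Run something like this over your images folder before deploying, and eyeball the output: the right quality setting depends on the image.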
Using the right image format is another one
JPG is for photos; PNG is really for line art. So convert my friend here to PNG and it balloons back to 1 megabyte.
Put our logo in JPG and it's 60kb.
In PNG, it's 13kb. And this isn't even the vector version.
Take out some colors and itās down to 8kb.
Always always always do this.
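You can see the format effect for yourself with a few lines of code. This sketch again assumes the Pillow library; the "logo" is synthetic, so real numbers will vary, but the ordering holds.

```python
import io
from PIL import Image, ImageDraw  # Pillow: pip install Pillow

def encoded_size(img, fmt, **opts):
    """Byte size of an image encoded in the given format."""
    buf = io.BytesIO()
    img.save(buf, fmt, **opts)
    return buf.tell()

# A flat-color "logo": exactly the kind of image PNG is built for.
logo = Image.new("RGB", (400, 200), "white")
ImageDraw.Draw(logo).rectangle([50, 50, 350, 150], fill="navy")

jpg_bytes = encoded_size(logo, "JPEG", quality=85)  # wrong format for line art
png_bytes = encoded_size(logo, "PNG")               # right format: much smaller
few_color_bytes = encoded_size(logo.quantize(colors=4), "PNG")  # "take out some colors"
```

The quantize step is the code version of "take out some colors": fewer palette entries, smaller file.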
This is an unminified file.
This is the same file, minified. The difference? Tabs and blank lines are all removed. Invisible characters are still characters! Removing them from a large file makes a huge difference. Obviously, keep an editable version of the files. How can you do this?
In Google PageSpeed, you can download the minified files with a click.
But many libraries already come w/ minified versions. jQuery, for example.
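The core idea is simple enough to sketch in a few lines. This is a toy CSS minifier using nothing beyond the standard library; real minifiers (and the files Google PageSpeed generates) handle many more edge cases, so treat this as illustration only.

```python
import re

def minify_css(css: str) -> str:
    """Naive CSS minifier: strip comments, collapse whitespace.

    A sketch only; production minifiers handle strings, calc(), etc.
    """
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # drop space around punctuation
    return css.strip()
```

For example, `minify_css("body {\n  color: red;  /* note */\n}\n")` collapses to `body{color:red;}`: same rules, far fewer bytes.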
Again, there's no reason not to.
Seriously. Doesn't get much easier.
I've never seen this break anything. I've heard rumors it can cause issues, which is why I advocate testing. But in 20 years I've never seen an issue.
Set expires headers for static files
When you set far-future expires headers, the server tells visiting browsers that a particular file isn't going to change any time soon.
Google PageSpeed will show you issues. So will Yslow, etc., but PageSpeed is easy.
But you can't usually do this w/ third-party scripts.
Really, it's any site. But expires headers work well for "static" files that don't change that often. So that's when/where to use it.
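On Apache or nginx this is a few lines of server config, and the effect boils down to two response headers. Here's a sketch of just the header math in Python; the header names are standard HTTP, but the function name is mine.

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def far_future_headers(days: int = 365) -> dict:
    """Response headers telling browsers a static file won't change soon."""
    expires = datetime.now(timezone.utc) + timedelta(days=days)
    return {
        # Modern browsers honor max-age; Expires is the older equivalent.
        "Cache-Control": f"public, max-age={days * 86400}",
        "Expires": format_datetime(expires, usegmt=True),
    }
```

Attach these to responses for images, CSS, and javascript, and repeat visitors stop re-downloading them entirely.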
Test all of these things. Again, I haven't seen them break anything, but…
Some resources can block page load.
At its most basic: load CSS first, and defer javascript as long as possible. That means all CSS includes should come before any javascript includes. But it's more subtle than that.
You can load javascript in parallel.
Google will show render-blocking javascript. That's the stuff that's in the wrong load order: the page is stuck.
Use HAR to really drill down and get a look at what loads when.
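The load-order rule above reduces to markup along these lines (file names here are placeholders, not anything from a real site):

```html
<head>
  <!-- CSS first: the browser can't paint anything without it -->
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <!-- page content ... -->

  <!-- javascript last and deferred, so it never blocks rendering -->
  <script src="app.js" defer></script>
</body>
```

The `defer` attribute tells the browser to download the script in parallel and run it only after the document is parsed.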
Do the same for javascript. This ensures visiting browsers only load the javascript/css once per session. Then they cache it locally.
Yslow is my favorite tool here.
I don't know why you'd write crappy code. It's actually more work.
Use a CDN to serve your static files from locations closer to the visitor.
This not only means a faster load time because the server is closer; it also means the main server doesn't have to retrieve and deliver the files.
Retrieving from a database is slower than retrieving from disk. (You can also cache in memory; that's a separate topic, really.)
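Here's a minimal sketch of the disk-caching idea: render a page once, write the HTML to disk, and serve the file on later requests. All names and paths are invented for illustration, and the "render" step stands in for real database and template work.

```python
import hashlib
import tempfile
from pathlib import Path

CACHE_DIR = Path(tempfile.mkdtemp(prefix="page-cache-"))  # hypothetical location
RENDERS = {"count": 0}  # tracks how often we do the slow work

def render_page(slug: str) -> str:
    """Stand-in for the slow path: database queries plus templating."""
    RENDERS["count"] += 1
    return f"<html><body>{slug}</body></html>"

def get_page(slug: str) -> str:
    """Serve from disk when cached; render and cache on a miss."""
    key = hashlib.sha256(slug.encode()).hexdigest()
    cached = CACHE_DIR / f"{key}.html"
    if cached.exists():
        return cached.read_text()  # disk hit: no database work at all
    html = render_page(slug)
    cached.write_text(html)
    return html
```

Calling `get_page("about")` twice renders only once; the second request never touches the "database." Real caches also need invalidation when content changes, which is where the complexity lives.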
I won't go into a lot of detail here. Code acceleration usually caches compiled code so that the server doesn't have to re-process it on every request.