On-page SEO really does matter. We've been hearing it from others for ages, and we've experienced its benefits ourselves. Without a solid on-page foundation, your site is handicapped, and chances are it will never reach its full potential.

But then,

Knowing this raises even more critical questions.

How are we supposed to know when we're done with on-page optimization, so we can finally move on to other aspects of SEO?

Is throwing a bunch of semantic keywords into body tags and URLs, writing long articles, and siloing them going to be enough?

What about more technical factors, such as image compression, browser caching, and minified resources?


Unfortunately, probably due to a lack of access to big, accurate data, we've never had the chance to read data-driven research that shows what it takes to build a truly outstanding on-page foundation, and how much effect those factors might have on our rankings.

Well, until now.


Before jumping into the research itself, I'd like to first tell you about:

Our data and research method

  • This study is backed by 1 million English SERPs and page speed data from nearly 10 million URLs.
  • We tried to exclude brand-dominated SERPs from our dataset as much as possible, to keep outliers from distorting the results and undermining their statistical significance. Why? If Google associates a domain with a brand name, then for the sake of showing the most relevant result it gives that domain an unfair advantage over the competition and tends to ignore the SEO factors we're aware of.

    For example:
    SERP we DID use: "iPhone 8 reviews"
    SERP we DIDN'T use: "iPhone 8"
  • Since Google shifted its approach from mobile-first to AI-first, things have gotten a bit more complicated. Most importantly, it's not so black and white anymore. Thus, instead of just labeling a page as optimized or not, we also checked rule impact values, which indicate the importance and priority of implementing each rule.
  • All the data we used comes from our competitive keyword research tool, Cognitive Intelligence, and our free Page Speed Analysis app. You can also use Google's PageSpeed Insights, but half of the data you need will be hidden.

That's enough babbling. Now let's get started with the fun part.

Text size on page

Let's start our list off by examining a widely held belief.

Do longer articles really have an advantage over the competition?

Indeed, the data we extracted from 10M URLs suggests so:

Pages with a higher text size (in bytes) significantly outperformed their competitors in the SERPs.
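
If you want to check where your own pages stand, you can approximate a page's text size by fetching it and measuring the visible text in bytes. Here's a minimal Python sketch (the URL is a placeholder; swap in your own page):

    import urllib.request
    from html.parser import HTMLParser

    class TextExtractor(HTMLParser):
        """Collects visible text, skipping script and style blocks."""
        def __init__(self):
            super().__init__()
            self.parts = []
            self.skip = 0

        def handle_starttag(self, tag, attrs):
            if tag in ("script", "style"):
                self.skip += 1

        def handle_endtag(self, tag):
            if tag in ("script", "style") and self.skip:
                self.skip -= 1

        def handle_data(self, data):
            if not self.skip:
                self.parts.append(data)

    url = "https://example.com/"  # placeholder: use your own URL
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    extractor = TextExtractor()
    extractor.feed(html)
    text = " ".join("".join(extractor.parts).split())
    print(f"Visible text size: {len(text.encode('utf-8'))} bytes")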

Server Response Time

Server response time simply indicates how long it takes for your server to respond to a request. Anything above 2 seconds is considered very poor in UX terms.

We analyzed this rule in two different ways:

The first graph below shows the number of pages with instantly responding (below 200ms) servers.

Although the correlation wasn't very strong, lower-ranking pages' servers responded a bit more slowly:

Main Takeaway

Your site's server response time contributes to your rankings, directly or indirectly, most likely by negatively impacting bounce rate.
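
If you'd like to spot-check your own server, you can measure an approximate time-to-first-byte with a few lines of Python (the URL is a placeholder, and the figure includes network latency, so treat it as a rough upper bound):

    import time
    import urllib.request

    url = "https://example.com/"  # placeholder: point this at your own page

    start = time.perf_counter()
    response = urllib.request.urlopen(url)
    response.read(1)  # block until the first byte of the body arrives
    ttfb = (time.perf_counter() - start) * 1000

    # PageSpeed rules flag server response times above roughly 200ms
    print(f"Approximate time to first byte: {ttfb:.0f} ms")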

GZIP Compression

GZIP (GNU zip) is a file compression algorithm supported by all modern browsers.

As you can see in the graph below, almost 60% of the first 3 results fully leveraged GZIP compression, and that number dropped to 50% as we analyzed lower-ranking URLs.

Main Takeaway

Fully leveraging GZIP compression, in other words using the GZIP algorithm to compress every compressible resource on your page, can help you reduce transferred file sizes by up to 90%. And it's certainly going to make your job on the SERPs easier.
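
To get a feel for those savings on your own pages, you can gzip a page's HTML by hand and compare sizes. A minimal Python sketch (the URL is a placeholder):

    import gzip
    import urllib.request

    url = "https://example.com/"  # placeholder page
    raw = urllib.request.urlopen(url).read()

    # compresslevel=6 matches the default most web servers use
    compressed = gzip.compress(raw, compresslevel=6)

    saved = 100 * (1 - len(compressed) / len(raw))
    print(f"Original:  {len(raw)} bytes")
    print(f"Gzipped:   {len(compressed)} bytes")
    print(f"Saved:     {saved:.1f}%")

In practice you wouldn't compress files by hand; you'd enable compression on the server instead (for example, the gzip on; directive in nginx, or mod_deflate in Apache). The sketch above is only to show the size of the win.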

Browser Caching

The benefits of browser caching are undeniable. Static files on your page get saved in the visitor's browser, so the next time they visit your site, or simply move on to another page, those cacheable files are delivered from the cache instead of being fetched from the server. The result is a much faster user experience and load time, and it also cuts data costs for the visitor.

All good, but how does it affect your performance on Google?

In our research, we were surprised to find that only 6% of the 10 million URLs fully and correctly implemented browser caching, and those that did clearly occupied higher places in Google search results.

Main Takeaway

Fully leveraging browser caching is probably one of the most important technical SEO factors you can ever implement.

Its ranking benefits in our big dataset were crystal clear and dramatic.
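
Browser caching is controlled by response headers such as Cache-Control and Expires. In production you'd set these in your web server or CDN configuration, but as a self-contained illustration, here's a tiny Python file server that marks static assets as cacheable for a year:

    from http.server import HTTPServer, SimpleHTTPRequestHandler

    class CachingHandler(SimpleHTTPRequestHandler):
        """Serves files from the current directory, adding cache headers to static assets."""

        CACHEABLE = (".css", ".js", ".png", ".jpg", ".gif", ".woff2")

        def end_headers(self):
            if self.path.endswith(self.CACHEABLE):
                # Tell the browser it may reuse this file for one year (in seconds)
                self.send_header("Cache-Control", "public, max-age=31536000")
            super().end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8000), CachingHandler).serve_forever()

The usual companion to a long max-age is a fingerprinted file name (style.a1b2c3.css), so you can still push updates out immediately by changing the name.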

Minified Resources

Minifying is the process of removing indents, spaces, and comments from resource files on your page, including JavaScript, HTML and CSS.

Implementing these changes reduces transferred data size and allows browsers, and even crawlers, to read and parse your page's source code a bit faster.
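
As a toy illustration of where the savings come from, here's a naive CSS minifier in Python. (Real tools such as cssnano or terser are far more careful; this hypothetical naive_minify_css is only a sketch.)

    import re

    def naive_minify_css(css: str) -> str:
        """Strips comments and collapses whitespace. Toy example only."""
        css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)  # drop comments
        css = re.sub(r"\s+", " ", css)                        # collapse whitespace
        css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)          # trim around punctuation
        return css.strip()

    css = """
    /* main button style */
    .button {
        color: #fff;
        background: #0077cc;
    }
    """
    minified = naive_minify_css(css)
    print(minified)  # .button{color:#fff;background:#0077cc;}
    print(f"{len(css)} -> {len(minified)} bytes")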

But how does Google's algorithm respond to it?

As much as I'd like to say that minifying your source code helps, unfortunately the data we have is quite neutral. In fact, the number of pages with fully minified JavaScript and CSS files increases as we go further down the SERPs.

Does this mean you should avoid minifying resource files? Certainly not. It's still an important step toward perfection. Then why can't we see its effect on the SERPs?

My theory is that, since those resource files are 100% cacheable, the need to minify them becomes somewhat irrelevant in Google's eyes. One way or another, minifying your resource files doesn't seem to have an effect on your rankings.

Number of hosts

This rule reveals the number of JavaScript and CSS resources loaded from other hosts.

As you might have already guessed, lower-ranking pages have slightly more external resources on average.

Main Takeaway

When you load your style and JS files externally, your site becomes dependent on that host's health. Besides that, your visitor's browser has to make extra HTTP requests and wait for those servers to respond. Instead of loading these libraries from arbitrary hosts, you could serve them from your own server, or at least switch to the CDN versions where applicable. Popular libraries used by millions of sites, such as jQuery, the official share/like buttons of social networks, Disqus comments, and the like, are generally hosted on CDNs. Nevertheless, paying extra attention to this rule might pay off big time.
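
If you're curious how many external hosts one of your own pages pulls scripts and styles from, here's a rough Python sketch using only the standard library (the page URL is a placeholder):

    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse

    class ResourceCollector(HTMLParser):
        """Collects script src and stylesheet href attributes."""
        def __init__(self):
            super().__init__()
            self.urls = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "script" and attrs.get("src"):
                self.urls.append(attrs["src"])
            elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
                self.urls.append(attrs["href"])

    page = "https://example.com/"  # placeholder: use your own page
    html = urllib.request.urlopen(page).read().decode("utf-8", errors="replace")
    collector = ResourceCollector()
    collector.feed(html)

    own_host = urlparse(page).netloc
    external = {urlparse(urljoin(page, u)).netloc for u in collector.urls} - {own_host}
    print(f"{len(external)} external host(s): {sorted(external)}")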

Average Image Response Size

Keeping your visitors interested in your content is important, and using plenty of images and other visual content is one of the best ways to accomplish that.

Fortunately, Google's algorithm seems to agree with us:

As can be seen in the graph above, the top 4 results had significantly more image content than the rest of the competition. Image size is calculated in bytes, and it covers all kinds of images on the page, including icons, image buttons, and all other JPG, PNG, and GIF visuals.
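
To measure the same number for your own page, you can total the Content-Length reported for each image. A rough sketch (the URL is a placeholder, and it assumes the image servers report Content-Length):

    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class ImageCollector(HTMLParser):
        """Collects img src attributes."""
        def __init__(self):
            super().__init__()
            self.srcs = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                src = dict(attrs).get("src")
                if src:
                    self.srcs.append(src)

    page = "https://example.com/"  # placeholder: use your own page
    html = urllib.request.urlopen(page).read().decode("utf-8", errors="replace")
    collector = ImageCollector()
    collector.feed(html)

    total = 0
    for src in collector.srcs:
        req = urllib.request.Request(urljoin(page, src), method="HEAD")
        with urllib.request.urlopen(req) as resp:
            total += int(resp.headers.get("Content-Length") or 0)

    print(f"{len(collector.srcs)} images, {total} bytes in total")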

What's next?

Call your developer, or head over to Page Speed Analysis and PageSpeed Insights, and start identifying weaknesses in your website.

Remember, these details might seem small at first, but they may well be the reason you can never break onto the first page or claim your spot in the top three results.

If you have any questions, or need help implementing the changes listed, feel free to drop a comment below.