How To Use A Google Custom Search Engine For SEO Analysis [Case Study]

13 replies • SEO


How To Use A Google Custom Search Engine For OnPage SEO Analysis

There's an interesting way to use a Google Programmable Search Engine (the successor to Custom Search) to figure out which of the Top 10 Google results has the best onpage SEO, compared to pages that rank higher primarily on offpage SEO or other ranking factors (like domain age, user engagement, etc.).

This can be helpful when you're trying to analyze or compare your content, so you know if you need to make it more relevant, or focus on other SEO ranking factors.

Google Programmable Search seems to "wash out" the effect of backlinks in its results, so you get a better picture of onpage SEO relevance.

Here's how you can test this:

Step 1: Do a search at Google.com for your keyword. In this example, we'll do a search for "how to catch an armadillo".

Step 2: Open a new tab and log in to Google Programmable Search.

Step 3: Create a new Custom Search Engine and add the Top 10 organic URL listings from Step 1. (Note: Do not add more than 10, or the results start acting weird.)

Step 4: Repeat the same search using the Google Custom Search.

Step 5: Compare the position differences.
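If you'd rather script Step 4 than click through the web interface, Google also exposes Programmable Search through its Custom Search JSON API. Here's a minimal Python sketch that only builds the request URL; `API_KEY` and `ENGINE_ID` are placeholders you'd copy from your own Programmable Search control panel, and the keyword is the one from Step 1.

```python
from urllib.parse import urlencode

# Placeholders: copy your real values from the Programmable Search control panel
API_KEY = "YOUR_API_KEY"
ENGINE_ID = "YOUR_ENGINE_ID"  # the "cx" value of your custom engine

def build_cse_query(keyword, num=10):
    """Build a Custom Search JSON API request URL for a keyword."""
    params = {
        "key": API_KEY,
        "cx": ENGINE_ID,
        "q": keyword,
        "num": num,  # the API returns at most 10 results per request
    }
    return "https://www.googleapis.com/customsearch/v1?" + urlencode(params)

print(build_cse_query("how to catch an armadillo"))
```

Fetching that URL with a real key returns JSON whose `items` array is the ranked result list, which you can then line up against the organic positions from Step 1.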

Now, in the image below, on the left side, you will see the ranking position for the Custom Search Results.

On the right side, you can see the normal Google.com search results. I have also added the number of backlinks pointing to each URL (data via SEMRush).

[Image: side-by-side comparison of the Custom Search rankings (blue numbers, left) and the Google.com organic rankings with backlink counts (red numbers, right)]
How to interpret these results...

The first result in Custom Search is also the first result at Google.com, so no changes there.

But next we see that the #6 organic result from Google.com is the #2 result in the Custom Search, even though it has 0 links.

It seems that if backlinks and the other ranking factors were stripped out of the final relevance score, this page would be more relevant than the others.

From here, you can keep looking down and comparing all the differences...
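If you want to automate that comparison instead of eyeballing it, a small helper can map each URL to its organic and custom positions and flag the movers. This is a generic sketch with made-up placeholder URLs, not data from the case study; "pageC" just mimics a #6-to-#2 jump.

```python
def position_deltas(organic, custom):
    """For each URL, return (organic rank, custom rank, delta).

    A positive delta means the page ranks better in the custom engine
    than in the organic results, hinting at strong onpage relevance."""
    custom_rank = {url: i + 1 for i, url in enumerate(custom)}
    deltas = {}
    for i, url in enumerate(organic):
        if url in custom_rank:
            deltas[url] = (i + 1, custom_rank[url], (i + 1) - custom_rank[url])
    return deltas

# Placeholder URL lists standing in for the two sets of Top 10 results
organic = ["pageA", "pageB", "pageD", "pageE", "pageF", "pageC"]
custom = ["pageA", "pageC", "pageB", "pageD", "pageE", "pageF"]

for url, (org, cus, delta) in position_deltas(organic, custom).items():
    print(f"{url}: organic #{org} -> custom #{cus} ({delta:+d})")
```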

How to use the analysis...

Well, you can use this method to study the most relevant pages based on content relevance, so that when you write your page, you can include the same important words, phrases, details, etc.
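As a rough, illustrative way to pull out those recurring words from the most relevant pages, you could run a simple term-frequency count over their text. The sample text and the tiny stop-word list below are invented for illustration; a real version would fetch each page and strip the HTML first.

```python
import re
from collections import Counter

# Tiny, made-up stop-word list; real lists are much longer
STOPWORDS = {"the", "a", "an", "to", "and", "of", "in", "is", "it", "you"}

def top_terms(text, n=5):
    """Return the n most frequent non-stop-word terms in a page's text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(n)]

# Invented sample text standing in for a fetched competitor page
page_text = ("Use a live trap to catch an armadillo. Bait the trap near the "
             "burrow, since armadillos dig burrows and follow the same trails.")
print(top_terms(page_text))
```

Running this over each of the top custom-engine pages and intersecting the results gives you a quick list of terms your own page probably ought to cover.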

Also, once you finish writing your content, publish it, and it gets indexed, you can come back, add your new page to the Custom Search, and run the search again to see if it outranks the other pages. If it does, your page is more relevant based on onpage factors. At that point, you can begin to focus on external ranking factors and user engagement signals to pull the rank up even higher.

Since it takes a little time, you probably only want to use this on keywords that you're really determined to rank #1 for. But, in those cases, this can be a helpful method to help you determine what your next steps should be, so you can get to the top.

All the best,
Jack Duncan
#analysis #case #custom #engine #google #search #seo #study
  • FableKeeper
    Looks like a method to check if your website ranks well based purely on keywords.

    Do you think it's better to focus on this first, and backlinks should be the next priority?

    Anyhow, thanks for sharing!
    • Jack Duncan
      @FableKeeper,

      Great question...

      The short answer is yes...I think you should focus on the keyword first.

      Now, here's some of the thinking behind this, if you'd like to go deeper.

      When the U.S. Department of Justice brought their antitrust lawsuit against Google, one of the things to come out of the discovery process in that trial was some internal Google presentation documents.

      These were not supposed to be seen by the public, but the DOJ released them.

      Even though they are from 2016, they explain quite a bit about how Google actually works, and how you might structure an SEO campaign to take best advantage of the algorithm.

      Take a look at page 5, 6, and 7 of this document: https://www.justice.gov/d9/2023-11/417516.pdf

      Pretty crazy...

      Ok, so this tells us that onpage SEO relevance is important to give your page a chance to rank on the first page, so that user signals can take over and tell Google whether the page should actually be in the Top 10.

      This has been true for some time, and going forward, I imagine that user signals will completely trump traditional onpage/offpage SEO, if they don't already for some queries.

      I think this is helpful to know, and actually makes a lot of sense, when you think about what people want from a search engine.

      If you use this custom search engine method to find the pages with the most onpage relevance, and see what is on these pages, you have a better chance of creating your own page that is relevant for the keyword. (You can use a Google Custom Search to check your own onpage relevance against the existing pages, to confirm your page is now more relevant than the current pages.)

      Once your page gets into the Top 10 results, user signals should take over. If people find your page useful, and stay on the page longer than the other pages in the Top 10, it will rise to the top.

      Another way to think of this: in the beginning, onpage SEO signals and backlinks help Google determine how to rank pages, but as more user signals come in for these pages, those signals trump backlinks and onpage SEO, because they are essentially "real votes" by "real users" for the content Google is presenting for a keyword.

      Pay close attention to your time on page and scroll stats for your pages. Bump those metrics up, and you should see a boost in rankings, as the algorithm factors in these positive signals over time.
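As a concrete (and entirely hypothetical) illustration of the kind of rollup you'd do on those metrics, here's a sketch that averages time on page and scroll depth per URL from raw event records. In practice this data would come from GA4 reports rather than a hand-built list; the paths and numbers below are invented.

```python
from statistics import mean

# Invented engagement events: (page path, seconds on page, max scroll percent)
events = [
    ("/catch-armadillo", 95, 90),
    ("/catch-armadillo", 140, 100),
    ("/old-post", 6, 15),
    ("/old-post", 9, 20),
]

def engagement_report(events):
    """Average time on page and scroll depth per page path."""
    by_page = {}
    for page, secs, scroll in events:
        by_page.setdefault(page, []).append((secs, scroll))
    return {
        page: {
            "avg_seconds": mean(s for s, _ in rows),
            "avg_scroll_pct": mean(sc for _, sc in rows),
        }
        for page, rows in by_page.items()
    }

for page, stats in engagement_report(events).items():
    print(page, stats)
```

Pages that come out like "/old-post" here (a few seconds, barely any scroll) are the ones to rewrite first.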

      Hope this helps,
      Jack Duncan
      • FableKeeper
        Thanks for expanding.

        It would be a fun situation to be in, when you get your page in top 10 on Google Custom Search and realize that users don't spend much time on your website. Probably due to sh*tty content or how it looks))

        What is clear to me is that, all of it must be tight:
        * keywords must match what people search for
        * your website must look good and have good written content to hook and retain the users on your page
        * your site should have good quality backlinks
        * your site should have good engagement metrics (high scroll stats, high amount of time spent on your site, etc.)

        I was wondering how Google gets these "user signals", and it looks like they don't have access to the stats on your website (the ones you can see through Google Search Console). All they can really monitor is how quickly users click the next link in the search results, or how users behave on the results page itself.
    • Jack Duncan
      @live4pk

      "It's also crucial to keep in mind that quality content goes beyond just keywords. It's about ensuring that readers find it enjoyable and valuable."

      Yes, you're absolutely correct.

      The user signals to watch in GA4 are time on page, scroll depth, and clicks. It makes a lot of sense if you think about it. Imagine clicking a Google search result, seeing right away that "this page isn't any good", and quickly leaving: A) you stayed for just a couple of seconds, B) you didn't scroll down to the bottom, just skimmed the top part and knew this page wasn't what you were looking for, C) you didn't click any links on the page to learn more or go deeper.

      So, Google takes these user signals and treats them like votes for the page.

      Ironically, you could (not suggesting it!) send a bunch of negative user signals to your competitor's pages, and at the same time, send positive signals to your pages, and watch them swap positions. (Rand Fishkin from MOZ actually tested this live from stage, about 9 years ago, and it did actually work.) I think there is such a long delay though in these metrics for a small scale test, that most people would simply give up and deduce that it didn't work, and move on to other things.

      I noticed you're into A.I., from your sig link. It would be really great if someone would test a poorly written A.I. page against a really well-rewritten A.I. page. Use time on page and scroll depth as the voting criteria, and see if this is the primary means of determining which A.I. content will rank well.

      In other words, even if A.I. can write content that is readable and factually correct, if people don't sense it has "life" in it and find it interesting, it's not going to rank long term.

      My wife was listening to a Radiolab podcast recently, about A.I. being used to replace humans on phone calls and in meetings.

      Every word that the A.I. clone said was correct, and it even sounded like the person it was intended to clone. At the same time, it was so obvious that it was a bot. Very subtle nuances gave it away, and it made you laugh, because you couldn't quite put your finger on it, but it was certainly a bot and not a real person.

      I suspect A.I. content suffers from the same problem, if it isn't done really well. Even when it is factually correct, and well written, it lacks "life" and people can sense it in a few seconds. This results in lower time on page, scroll depth, and clicks, and Google interprets that as "low quality for our users" and lowers the rank accordingly.
      • Gustaf
        Exactly. I completely agree. Many SEOs believe that Google can detect AI-generated content and penalize websites that use it.

        In reality, it's not about AI content itself - it's about how visitors react to it. If a visitor quickly leaves the page right after starting to read, Google will note that the content on that page isn't useful for that search query. And the key point is that Google doesn't care whether the content is AI-generated or written by a human - if it's so bad that the visitor chooses to leave, it's considered low-quality.

        In fact, when a visitor returns to search results after closing a page, it's the number one signal to Google that the page they left is not up to the mark.

        Originally Posted by Jack Duncan:

        "I suspect A.I. content suffers from the same problem, if it isn't done really well. Even when it is factually correct, and well written, it lacks 'life' and people can sense it in a few seconds. This results in lower time on page, scroll depth, and clicks, and Google interprets that as 'low quality for our users' and lowers the rank accordingly."
    • conquest99999
      Content is king. Backlinks come second.
  • live4pk
    Ah, good idea BTW!

    I really like how you utilize Google's custom search tool to identify which websites are well-written on a particular topic. This approach can definitely enhance the quality of the content on your site.

    You pointed out that this method can be quite time-consuming, and I completely agree. It can be challenging to apply it to multiple websites simultaneously.

    It's also crucial to keep in mind that quality content goes beyond just keywords. It's about ensuring that readers find it enjoyable and valuable.

    I agree this is an excellent tip for boosting your website's search engine ranking.

    I'm interested to know if you've explored any other techniques for evaluating your website's content.
  • Tony Jaa
    I have one doubt: instead of using a custom search engine to identify the content and keyword gaps, couldn't we do this directly with the Google search results?
    I can see comparing results for different locations or different keywords, but why do we need to compare the same query against itself?
  • Motivation Gyan
    Interesting case study! Using a Google Custom Search Engine for SEO analysis is a smart and creative approach -- thanks for sharing!
  • SEO Tech Burner
    I like your case study. Thanks for sharing this informative post. Keep it up!
  • godsepallavi17
    Hey, this is great! But yes, we can only use it for the specific target keywords we want to rank for; otherwise it is time-consuming. Instead, posting content on high-authority sites may give quicker results.
  • MarkBlackburn
    Thank you for this. I nearly got there, but my results don't show the rankings the way yours do, i.e. I don't get the blue #numbers on the left or the red #numbers down the right. Or is that something you added manually?
  • emiraa
    The post was quite informative, providing a good amount of detail and facts, and it was very helpful in expanding my understanding of the subject.
  • Wow, this is a really clever use of Google Programmable Search! I hadn't thought about using it to isolate on-page SEO factors like that.

    The idea that the Custom Search results downplay backlinks and off-page signals makes a lot of sense--since it's essentially only pulling from the limited set of URLs you feed it, it seems like Google relies more heavily on content relevance and on-page factors to rank those pages.

    This could definitely be useful when auditing a page that ranks lower than you'd expect based on its content. If it ranks higher in the Custom Search than in Google.com, that might be a sign that you've nailed the on-page SEO and just need to build more authority. On the flip side, if it's still low in the Custom results, it's probably time to rework the content.

    Appreciate you sharing this method--definitely adding it to my SEO toolbox!
