Google Natural Search Click Through Rates and Forecasting

Editor's Note: This article is an adapted version of a recent article written and posted by the same author, Tim Aldiss, over on the Econsultancy blog.

Search Engine Optimisation (SEO) is an important element of any company's digital media plan, and as with any good media plan, forecasting is a key element.

A recently published Google organic search Click Through Rate (CTR) study (by an agency called Slingshot) tries to help shed light on the process by demonstrating the likely click-through rate for any first-page result.

It's a great study, born of a wealth of accurate data, that takes into account the various other media that now appear in Google results, such as Paid Search, Places results, images, video and shopping listings.

What it doesn't do is help improve forecasting by shedding light on the methodology for anticipating where you are likely to rank in 3, 6 or 12 months' time.

Ever since Search Engine Optimisation (SEO) was coined as a term, brands employing the services of agencies have asked what they are likely to get as a result of deploying SEO. To answer this, agencies for the most part used to focus on the first and most logical impact of SEO – rankings.

In fact in some cases a commercial model was built around it, whereby the agency only got remunerated for positions gained (typically first, second & third page – first to tenth, eleventh to twentieth, etc… and in some cases top 3 results only).

In those days only a brave (and foolish) man would forecast. Back then it was the wild west of SEO! Search engine algorithms only had a couple of factors influencing them. Some even returned results alphabetically. One day you could rank top using white text on a white background; the next you'd be pipped by someone keyword-stuffing a cloaked & meta-refreshed landing page.

Obviously for a transactional site, or even just a brochureware site, ranking visibility was just the front line of measuring success, and it became logical to look beyond visibility to the visits that came from it.

However, very few providers shared any relevant data to make forecasting possible. Two notable exceptions were Wordtracker (still available today) and Overture – the latter returning paid search click data from its alliance with Yahoo.

It wasn't until Google launched its keyword tool and traffic estimator as part of its AdWords offering that we got anywhere near accurate data to use. And even now that we finally had data, the insight generated was suitably crude.

You could now pick keywords based on their 'search volume'. However, there was virtually no correlation between search volume and actual traffic as measured by the site's analytics. One word of warning, and a common misapprehension of Google's keyword tool data: it is collected from Google's entire search inventory. That means the returned counts include clicks on its adverts from all around the web – not just search alone.

Then along came Google-powered AOL, and in August 2006 they accidentally leaked millions of search records, giving those who queried the data a unique insight into click data by rank (original data revealed here; TechCrunch article from the time here).

Obviously, back in 2006 search results looked very different than they do now. It's also worth pointing out the following: 1) users were much less internet savvy, 2) connections were much slower, and 3) AOL users were cretins!

Multiplying the search volume from Google's keyword tool by the click-through rate from the leaked AOL data suddenly gave us the ability to estimate how much traffic you might get from any top ten position in Google.
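The arithmetic is trivial, but worth making explicit. A minimal sketch in Python, using hypothetical figures throughout (the volume and CTR below are illustrative, not values from the keyword tool or the AOL data):

```python
def estimated_monthly_visits(search_volume, ctr):
    """Forecast organic visits for a keyword: search volume multiplied
    by the click-through rate observed for a given ranking position."""
    return search_volume * ctr

# Hypothetical figures: 10,000 monthly searches and a 42% CTR at position 1
# (illustrative only -- not the actual AOL per-position values)
visits = estimated_monthly_visits(10_000, 0.42)
print(round(visits))  # roughly 4,200 visits a month
```

The crudeness the article describes lives entirely in the two inputs: the volume figure counts more than pure search queries, and the CTR figure came from a single leaked dataset.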

The trouble was that no one could determine whether these figures were still wildly out. The only way to get impression data – to work out the percentage click-through rate from the traffic you were actually getting – was through Pay Per Click, and that was like comparing apples and oranges, with numerous other factors as caveats. Google Webmaster Tools resolves this issue.

So what impact does this CTR study have on me?

Take a look at the data comparison table from the four different 'studies' below. The (conditional formatting) colour scheme allows you to easily see where the highest click-through rates were (green) and the lowest (red):

Quite a change between 2006 and 2011. What's most striking is the difference in total potential click-throughs. For the top 3 results, according to the Slingshot study, you can now only expect to receive 35.5% of the available traffic, as opposed to almost twice that – 62.5% – in 2006.

Equally impressive is the drop in total available searches for page one results (top ten). You can now only expect 52.4% of available searches on page one, as opposed to a whopping 89.6% in 2006!
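Plugging the two studies' aggregate figures quoted above back into the volume-times-CTR calculation shows how badly a 2006-era forecast would now overstate expected traffic. A sketch, assuming a hypothetical keyword volume of 50,000 monthly searches:

```python
# Cumulative CTR figures as quoted in this article
ctr_2006 = {"top_3": 0.625, "page_one": 0.896}  # 2006 AOL leak
ctr_2011 = {"top_3": 0.355, "page_one": 0.524}  # 2011 Slingshot study

monthly_searches = 50_000  # hypothetical keyword volume

for bucket in ("top_3", "page_one"):
    then = monthly_searches * ctr_2006[bucket]
    now = monthly_searches * ctr_2011[bucket]
    print(f"{bucket}: {then:,.0f} visits forecast on 2006 CTRs "
          f"vs {now:,.0f} on 2011 CTRs")
```

On these assumed numbers, a page-one forecast built on the AOL figures would overshoot by well over 18,000 visits a month – which is the whole argument for re-basing forecasts on current CTR data.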

Qualifying this Click Through Rate study data

Now that we are able to see impressions by keyword in Google Webmaster Tools, we can determine the actual click-through rate of our keywords. I've always been wisely sceptical, so to check this data out for ourselves I conducted three real case studies.

We took impression data from Google Webmaster Tools, visit data from Google Analytics, and ran our own ranking reports (using SEO Gadget's Keyword Tool, Trackpal and Raven Tools). The data is interesting, but first here's some relevant background on the sample set. All figures are from August. Client A is in the travel sector, Client B is an arts & crafts retailer, and Client C is in breakdown recovery.
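The calculation itself is straightforward once you have both numbers for the same keyword and date range. A minimal sketch, using hypothetical impression and visit counts (the actual client figures aren't reproduced here):

```python
def observed_ctr(impressions, visits):
    """Actual click-through rate for a keyword: organic visits (from
    analytics) divided by impressions (from Google Webmaster Tools)."""
    if impressions <= 0:
        raise ValueError("no impressions recorded for this keyword")
    return visits / impressions

# Hypothetical keyword-level figures, for illustration only
print(f"{observed_ctr(12_400, 5_300):.1%}")  # prints 42.7%
```

The caveat from the article applies to this calculation directly: with a small data set or unstable rankings, a single month's ratio like this can mislead.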

There are obviously a number of caveats to take into account when qualifying this data against your own clients – not least the size of the data set and the stability of positional rankings. However, for non-brand terms the click-through rates were much higher than expected.


Now comes the interesting bit: forecasting. Knowing this data from our clients, is it wise to anticipate such stonkingly good CTRs from existing visibility? I'd suggest not if you are embarking on a relationship with a client afresh, but if you are already engaged and through your work have achieved stable top 3 positions to gain these CTRs, then hell yeah – why not.

Forecasting just got easier, but it's still the acumen and experience to anticipate your own SEO ranking capabilities in 3, 6 and 12 months' time that delivers enough accuracy to create a successful SEO media plan.

Footnote: SEOMoz also published an article on this subject here, and Search Engine Land also have a comprehensive review of all studies here.


