Nick is the Founder and Chief Strategy Officer of From The Future. When he's not elbow deep in data, he's spending time with his wife, his dogs, or his cars.
Ready for something painfully obvious? Ranking for hard keywords is hard.
But there’s a strategy you can deploy to do a better job.
Today I want to show you how to systematically expand your website’s keyword footprint, one page at a time – and exponentially increase your website’s traffic from SEO.
This SEO strategy focuses on building out individual URLs to rank for lots and lots of keywords (or Bigfoot pages, as I like to call them).
This is often referred to as keyword spread: a simple measurement of an individual page’s organic reach and visibility, counted as the number of traffic-generating keywords the page ranks for.
The definition of a bigfoot page (or page with a large spread of rankings across many keywords) is going to vary for each niche, but for the purposes of this post I’m going to be looking at pages ranking for at least 1,000 different keywords.
To get started I want to share a few examples of bigfoot pages, across a few different niches.
For each of the above examples I want to look at the root keyword, content type, and word count, and then pick apart the topic usage via the LSI keywords they might be targeting, by running the head keyword for each page through lsigraph.com.
It seems you can still stuff in a crap load of target keywords, many times over, even in the more competitive SEO verticals. Having a word appear on a page 68 times within 1,600 words (as in our student loan example) is a bit much – but then you look at the credit cards page, with the word credit appearing over THREE HUNDRED times within 4,700 words… wow.
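If you want to run this kind of check on your own pages, a quick script will do it. Here’s a minimal sketch (the sample copy and term are made up for illustration – plug in your own page text):

```python
import re

def keyword_density(page_text, term):
    """Count how often `term` appears in a page's copy,
    and what share of the total word count that represents."""
    words = re.findall(r"[a-z']+", page_text.lower())
    count = sum(1 for w in words if w == term.lower())
    density = count / len(words) if words else 0.0
    return count, len(words), density

# Hypothetical page copy, just to show the output shape:
copy = "Compare credit cards: the best credit card offers and low-APR credit cards."
count, total, density = keyword_density(copy, "credit")
print(f"'credit' appears {count} times in {total} words ({density:.1%})")
```

Run that across your top pages and you can benchmark your own term frequency against the competitors ranking above you.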
It’s so valuable to run an LSI keyword analysis for your page’s head term – it brings so many additional topics to the surface that you should expand on within the page.
The Halloween page is my favorite example of this, with LSI terms bubbling up like: last minute, diy, ideas, and homemade. All of which make perfect sense when you consider them in the context of being related to “easy.”
You will need to run this report for each domain; then, once in the top pages report, sort descending by keywords:
Then you can expand these individual results to get the full list of keywords their pages are ranking for:
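If your SEO tool lets you export the top pages report, you can do the same sort-and-filter step in a few lines. A sketch, assuming an export with `url` and `keywords` columns (the column names and the sample rows are hypothetical – match them to whatever your tool actually exports):

```python
def top_pages_by_spread(rows, min_keywords=1000):
    """Sort pages descending by ranking-keyword count and keep only
    those at or above the 'bigfoot' threshold used in this post."""
    ranked = sorted(rows, key=lambda r: int(r["keywords"]), reverse=True)
    return [r for r in ranked if int(r["keywords"]) >= min_keywords]

# Hypothetical export rows:
pages = [
    {"url": "/easy-halloween-costumes", "keywords": "4200"},
    {"url": "/blog/some-post", "keywords": "35"},
    {"url": "/student-loans", "keywords": "1800"},
]
for page in top_pages_by_spread(pages):
    print(page["url"], page["keywords"])
```

The 1,000-keyword threshold is the one I’m using in this post – adjust it to whatever “bigfoot” means in your niche.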
You can also use this data to build your SEO content map and look for opportunities to extend existing page topics to cover even more terms.
A big piece of building Bigfoot pages is folding in weaker pages on your site that are not performing on their own.
A simple rule of thumb for identifying these pages is to look at each page’s individual organic traffic performance over the past 6 months and make a judgment call.
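That judgment call is easy to rough out in code. A sketch, assuming you have six months of monthly organic sessions per URL (the traffic numbers and the 50-visit cutoff below are hypothetical – pick a threshold that makes sense for your site):

```python
def flag_weak_pages(traffic_by_page, monthly_threshold=50):
    """Flag pages whose average monthly organic traffic over the
    trailing six months falls under an (assumed) threshold."""
    weak = []
    for url, monthly_visits in traffic_by_page.items():
        avg = sum(monthly_visits[-6:]) / 6
        if avg < monthly_threshold:
            weak.append((url, round(avg, 1)))
    return weak

# Hypothetical analytics export: URL -> last six months of organic sessions
traffic = {
    "/student-loans": [900, 950, 1010, 980, 1100, 1200],
    "/old-thin-post": [12, 8, 15, 9, 11, 7],
}
print(flag_weak_pages(traffic))
```

Anything this flags is a candidate to fold into a stronger, topically related page.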
If you are going to combine your thinner/weaker pages, make sure you not only move all of the source content to the destination page, but also add a proper 301 redirect and then regenerate and resubmit your XML sitemap.
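If you’re folding several weak pages into one destination, it helps to generate the redirect rules from a single mapping so nothing gets missed. A sketch that emits Apache-style `Redirect 301` lines (the paths are hypothetical; adapt the output syntax to your server):

```python
def redirect_rules(merges):
    """Emit one 301 rule per merged page, in Apache mod_alias form.
    `merges` maps old (source) paths to their destination page."""
    return [f"Redirect 301 {src} {dst}" for src, dst in merges.items()]

# Hypothetical merge map: weak pages -> the bigfoot page absorbing them
merges = {
    "/halloween-costume-ideas-2016": "/easy-halloween-costumes",
    "/cheap-halloween-costumes": "/easy-halloween-costumes",
}
for rule in redirect_rules(merges):
    print(rule)
```

Keep the mapping file around – it doubles as your record of which pages were consolidated and when.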
There are some specific details in terms of time periods and thresholds that I use to identify opportunities to refine/extend/combine pages – which I cover in much greater detail in Traffic Think Tank.
Bigfoot pages are money.
The more you can extend the relevancy of every individual URL, the more SERP real estate you can eat up and the more organic traffic you can acquire.
One critical mistake to beware of is keyword cannibalization: creating too many pages targeting terms that are too similar.
This is another reason LSI research is so important: even though you may think you need to create separate pages for keywords that look completely different (i.e. share no terms), Google may see those terms as synonyms (or very closely related).
In this case you’re actually hurting yourself by building multiple pages to target each version versus building just one bigfoot page.
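One generic way to spot this (not the exact process I use – more on that below) is to compare the sets of keywords two pages already rank for. A minimal sketch using Jaccard similarity, with made-up keyword sets:

```python
def keyword_overlap(kws_a, kws_b):
    """Jaccard similarity between two pages' ranking-keyword sets.
    A high score suggests the pages may be cannibalizing each other."""
    a, b = set(kws_a), set(kws_b)
    return len(a & b) / len(a | b) if (a or b) else 0.0

# Hypothetical ranking-keyword sets for two candidate pages:
page_a = {"easy halloween costumes", "diy halloween costumes", "last minute costumes"}
page_b = {"diy halloween costumes", "homemade halloween costumes", "last minute costumes"}
print(f"overlap: {keyword_overlap(page_a, page_b):.0%}")  # 2 shared of 4 total -> 50%
```

If two pages score high on a check like this, they’re usually better off merged into one bigfoot page.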
I have a simple process for checking keyword overlap that I’ll be emailing to my list – sign up if you would like to see it.