Nick is the Founder and Chief Strategy Officer of From The Future. When he's not elbow deep in data, he's spending time with his wife, his dogs, or his cars.
Everyone loves a good case study, especially in SEO.
The funny thing is, despite the sheer number of SEO case studies out there, there's still a gap in the available content: very few of them provide truly actionable strategies and takeaways.
I’m here to fill in that content gap with this post.
To prepare this post I reached out to thousands of people (almost 22,000, to be exact), showed them the sites I planned to report on for this case study, and asked them to reply with what they would want to see from a case study.
What I got back was a great list of what folks are most interested in seeing from a case study on organic search.
First, let me show you the results I’ll be diving into – and how I got to the number in the title.
For this case study I’ll be reviewing the SEO results from 3 different websites, in 3 totally different niches, that all have different conversion metrics for “success.”
Here are the screenshots of the organic traffic results from SEMRush:
Before I dive into the numbers, I wanted to share some of my favorite responses that I received via email. These will frame out most of what I go into in this post.
I compiled responses from over 1,000 subscribers on our email list into what would make this a truly valuable case study.
What that breaks down into is the following case study criteria:
So the above list is EXACTLY what I’ll be diving into for this post.
Before I dive into breaking down all the SEO components listed above, let’s first run through the numbers from the screenshots I cited at the beginning of the post.
For the purposes of this post I'm going to use the data from SEMRush and supplement with GA data as needed. The reason: GA data can be manipulated with filters to paint whatever picture you want it to, while relying solely on core organic views from a trusted 3rd party (like SEMRush) keeps this clean.
Work began on the Ecommerce website in Spring of 2015.
It was a new-ish site (only a few months old) and as you can see from the screenshot, results were slow at first – and in my experience this is common for Ecommerce SEO *IF* your main focus is to build rankings and traffic for commercial keywords.
If you *just* want to “build traffic” all it takes is some creative blog posts and big niche specific guides. Don’t get me wrong – I’m not knocking these, it’s different strokes for different folks is all I’m saying.
So from March 2015 (when we got started) until February 2017 (as I’m writing this post) we took organic traffic from approximately 35,000 organic visits per month to ~225,000 organic visits per month, for a total net increase of 542%.
This was a different project altogether.
This wasn't a pure-play SEO project; it was a BIG design and development undertaking. More specifically, it included the design and build of a new site for a core brand, plus meticulously folding 3 other brand websites (that had been acquired) into one seamless master site before re-launch.
These are honestly my favorite projects: they tend to be very complex in terms of content management, and aligning brand information architecture for products and solutions is one of my all-time favorite tasks.
The project kicked off in September 2016, with a target launch date for the new site in December.
None of the acquired brands had any organic rankings to speak of; they were bought mainly to acquire the technologies and fold their customers in under the core brand umbrella (a very smart growth strategy!).
So the important pieces here were to scrape all current rankings (if any) for each site, review all the content, and create a master content map for what needed to live where on the new site, which then informed the redirect map.
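At its simplest, a redirect map is just a table of old URLs mapped to their new homes. Here's a minimal Python sketch for one acquired brand site; the paths, destination domain, and the choice of Apache `Redirect` output are all illustrative placeholders, not the client's actual setup:

```python
# Hypothetical redirect map for ONE acquired brand site:
# old path on the acquired domain -> new URL on the master site.
redirect_map = {
    "/products/widget": "https://master-brand.example/solutions/widget",
    "/about":           "https://master-brand.example/company/brand-a",
    "/support":         "https://master-brand.example/support",
}

def to_htaccess_rules(mapping):
    """Render each old-path -> new-URL pair as an Apache 301 rule."""
    return [f"Redirect 301 {old} {new}" for old, new in mapping.items()]

for rule in to_htaccess_rules(redirect_map):
    print(rule)
```

The same mapping can be rendered into whatever format the site actually needs (nginx rewrites, a plugin import file, etc.); the point is keeping one master table per acquired domain.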
The site re-launched in December (as planned) and you’ll notice the immediate increase of organic traffic, initially in December but then really in January of 2017; going from approximately 800 organic visits per month to around 3,600 organic visits per month.
It's worth noting the slight dip in February as Google reshuffles all the new rankings and tests the waters to make sure these URLs have the rankings they should, taking into consideration qualitative metrics such as CTR, time to long click, and dwell time.
Overall the gain from SEO traffic was approximately 350%.
This website is special. For starters, it’s a giant international brand and the cherry on top is the website is beautifully designed – so on top of everything else, international SEO was a major consideration out of the gate.
What was surprising is how much of a rats nest the front-end code was from purely an SEO perspective – the good news is for any SEO worth their salt, all you see in this case is opportunity.
For this site, we went to work in August of 2016 and the initial focus was 100% in cleaning up the client-side layer of the site – we specifically audited things like the crawl budget and efficiency, index rate, and internal links.
It’s a pretty big site, so it took a couple months to get all our ducks in a row and then wait to have everything implemented by their development team.
Most of the work went live in November and then you can see the first movement of organic traffic in December 2016. When we started the site was getting approximately 210,000 organic visits per month which grew as of February 2017 to 306,000 organic visits per month, an increase of 46%.
So for the purposes of coming up with a number for the post title (because having 3 numbers was really confusing, I asked :)) I took the simple average: (542% + 350% + 46%) / 3 = 938% / 3 ≈ 313%.
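If you want to sanity-check that arithmetic yourself:

```python
# The headline number is just the simple mean of the three gains.
gains = [542, 350, 46]  # percent increases for the three sites
average_gain = sum(gains) / len(gains)
print(round(average_gain))  # -> 313
```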
So that’s where that comes from – ok onto the MEAT!
Coming back to the list of strategies and tactics that were asked about on how these results were achieved, I’m going to do a deep dive into each of the bullets I listed above, one at a time.
This is a great question and everyone wants to know how to make SEO easier, i.e. how to do less and get more.
While content is important, it’s still not the King, and there are ways to get more SEO juice out of your current site.
Per my post linked above, promotion is one of the easiest ways to crank more traffic out of existing content, and for specific details on promotion strategies I recommend checking out this smart post by Robbie Richards.
What I'm going to get into instead is action you can take on your site to get more love from Google.
With that said, one of the fastest ways to increase your organic traffic without creating any new pages is to optimize your crawl budget.
More times than not this is considered technical SEO, and while doing this to the full extent does get technical – there are elements of it that any intermediate SEO can implement and see results from.
The best place to start is to run a full crawl of your website and analyze your results.
What you’re looking for is the following:
Thin pages are URLs on your site with a low text-to-HTML ratio, and the best way to identify them is with a proper site crawler.
Here’s the thin pages report shown in DeepCrawl.
Generally speaking, you don't want pages with less than 500 words of original content being crawled. As with everything in life (and especially SEO), there are always exceptions to this, but it's a sound standard practice.
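If you want a quick way to eyeball a single page outside of a crawler, here's a rough standard-library Python sketch. The 500-word threshold is the guideline above, and the ratio math is a simplification of what tools like DeepCrawl report, not their actual formula:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the visible text content of an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        self.chunks.append(data)

def thin_page_stats(html):
    """Return (word_count, text_to_html_ratio) for a raw HTML string."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(parser.chunks)
    ratio = len(text) / max(len(html), 1)
    return len(text.split()), ratio

html = "<html><body><p>Just a short product blurb.</p></body></html>"
words, ratio = thin_page_stats(html)
is_thin = words < 500  # flag against the ~500-word guideline above
```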
For thin pages that you do not wish to "thicken up" with more content, consider blocking them via robots.txt.
Query parameters are those nasty-looking URLs you often see on Ecommerce sites; they usually start with a ? and can stack to the moon and back.
They tend to be generated by core shopping cart code on many platforms, but also by site-search-driven (like Endeca) and database-driven websites.
The big issue is that each of these URLs, by default, is crawled and considered a unique page. The fact of the matter is, more times than not, these pages don't have enough unique content or user value to justify a place in your crawl budget.
These can be attacked through a combination of blocking Googlebot from specific parameters in the URL Parameters report in Search Console, coupled with hard disallow directives in your robots.txt file.
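As a sketch of the robots.txt half of that one-two punch, the directives might look like the following. The parameter names here are examples only; use whichever parameters your own crawl surfaces:

```text
User-agent: *
# Block crawling of sorted/session parameter variations
Disallow: /*?sort=
Disallow: /*&sort=
Disallow: /*?sessionid=
Disallow: /*&sessionid=
```

Googlebot honors the * wildcard here; test new rules in Search Console's robots.txt tester before deploying, since an over-broad pattern can block real pages.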
Your index rate is simply the ratio of pages submitted to Google for indexing versus the pages actually included in Google's index.
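In other words, it's simple division; using the numbers from the healthy example shown below:

```python
# Index rate: pages indexed out of pages submitted.
submitted, indexed = 568, 564
index_rate = indexed / submitted
print(f"{index_rate:.1%}")  # -> 99.3%
```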
Here’s a screenshot of a healthy index rate, where 564 of the submitted 568 pages are indexed.
Another key component to consider when looking at your overall index efficiency is edge-case variations of URLs that shouldn't be accessible. For example, if your homepage lives at yoursite.com but duplicate variations live at yoursite.com/main/ and yoursite.com/index.html, those should be redirected to the canonical URL using a 301.
Taking the above steps is often the very first thing we do as part of our process; think of it as cleaning up your site's foundation before you embark on building new content on top of it.
I found it a bit funny this was a question, since I’ve written on this before, but I do think I can expand on this more specifically in the context of this case study.
The practical application of this is the ability to increase the semantic keywords and contextual relevancy of your pages for additional keywords, without having to screw up conversion rates by stuffing walls of text into your layout that no one reads.
To show you a real world example, here’s how we implemented this on TrafficSafetyStore.com:
And if you do a search in Google.com for the target category keyword here “roll up construction signs” you will be greeted with this beautiful SERP:
Now I’m not saying the results we’re seeing are because of putting this content behind a jQuery window shade, but I will share with you that this also improved our conversion rate by a considerable percentage, likely due to improved UX from a shopping perspective.
The other consideration based on these recent changes from Google is that content that is hidden on Desktop due to it being Mobile content, may still be crawled and scored, and in fact influence Desktop rankings.
This circles back to the point of cleaning up your foundation before starting to build on top of it.
The critical issues that were analyzed, identified and immediately addressed with each of the 3 sites in this study were:
To identify these issues you need to become familiar with running a crawl report and analysis, and then be able to sketch out and understand a site's information architecture.
Short answer here: more.
But in all honesty, this is such a hard question to answer – which is why I’m sure it has not yet been answered to the level that satisfies most SEO readers.
So with that said, I will take you through roughly how many links we’ve built to each of these sites, and then in the next section I’ll go into details on how.
Velocity is key for retail sites – what this meant was building up a steady base of new inbound linking root domains coming in day after day and week after week.
This particular retailer is in an extremely competitive space, so we built up a steady base of acquiring 30-40 new linking root domains each month, and continue with this campaign running in the background as we build out new pages and add new products.
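To make "velocity" concrete, here's a small Python sketch that counts distinct new linking root domains per month from a dated link export (the kind most link indexes can produce). The domains and dates are hypothetical:

```python
from collections import Counter
from datetime import date

# Hypothetical (date_acquired, linking_root_domain) pairs from a
# link index export; domains here are placeholders.
new_links = [
    (date(2016, 5, 3),  "blog-one.example"),
    (date(2016, 5, 18), "blog-two.example"),
    (date(2016, 6, 2),  "forum.example"),
]

def monthly_lrd_velocity(links):
    """Count distinct new linking root domains per calendar month."""
    seen = set()
    per_month = Counter()
    for day, domain in sorted(links):
        if domain not in seen:  # only count a root domain once
            seen.add(domain)
            per_month[(day.year, day.month)] += 1
    return dict(per_month)
```

Tracking this number month over month is what tells you whether you're actually holding the 30-40 new LRD pace described above.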
In the B2B space, velocity is not as important as securing higher authority placements with much higher trust metrics. When building links for businesses selling to other businesses credibility is key and works like a flywheel; the more high trust links you drive the more linked mentions you tend to pick up organically.
To put this bluntly, link building in the travel and hospitality space is weird. Working with name brands is great in some ways but very difficult in other capacities, and what I’ve learned is that the link turnover rate in the Hospitality niche is incredibly high.
What this means is links get placed and then re-placed frequently – I think a lot of this has to do with how incentivized many of the big linking sources are. To paint a better picture of the churn I’m speaking about look at this graph:
The mix of links that are being lost are from a mix of spam sites (woohoo!), affiliates, and images that are being replaced or updated by the site managers.
I have some ideas on how to build more targeted assets to combat this in the future, but at this time we're just getting to the content part of this campaign, so the links we've gained so far have come from a handful of specific strategies that I'll go into in more detail in the next section.
I’m going to combine this section with another since it’s *so* closely related. The other question was “what are the sources of links for each niche and how were they identified?”
I would never call myself a link building person, so the strategies we use to source links for these sites are not going to be new information to many of you, but with that said – they still work.
I’m going to include how prospective link targets are identified within each section, and then show real world examples in the wild of companies putting them to work to acquire shiny white hat links.
The way most of these campaigns are run is good old fashioned outreach.
Below are the primary link building strategies we used for each of our 3 case study niches. One key consideration: these aren't directory links, social profile links, or other manufactured link acquisition techniques:
Link building in Ecommerce takes attention to detail and almost always requires incentivizing your link targets; they understand you're a commercial entity and they expect to be rewarded for the value they're providing to you with their link.
This is a very scalable strategy for getting links to an Ecommerce site if you have the budget and margin on your products to make it worthwhile. For our client in the fashion space, our process for identifying sites to approach for reviews is as follows:
Same as with reviews, you’ll need a budget – but it doesn’t have to be cash. Many times, depending on your products you can find groups that need what you sell and are willing to show off your logo (with a link of course) in exchange for some free products.
A personal example I'll share with you: at TrafficSafetyStore.com we donate 12″ traffic cones to driving clubs around the country; in exchange, they put our logo on their club sites with a link.
The only difference between donations vs. sponsorships is this tactic does usually come in the form of cash.
Think beyond just sponsoring articles though, and instead look for opportunities to support causes that are important to you or that exist in your local area.
A good example of these could be youth sports leagues, local chapters of the ACLU, or pretty much any local non-profit or independent organization that has a website.
Contests work incredibly well when done right. The hardest part about pulling off a contest is giving away something compelling enough that people REALLY want to win. This usually means it’s a pretty big expense for you or your client, but I’m serious when I say when the idea is solid and the execution is done right – these things can crush link acquisition, like to the tune of 100+ new LRD’s.
Here are a few examples of some of my favorite contests from Ecommerce sites, all of which landed them over 100 new root domain links:
^Notice the size / amount of product each site is giving away to drive entries, engagement, and interest.
Scalability is a challenge in any vertical, but it seems to get exponentially more difficult in the B2B space. The tactics I’m going to review in this section have varying degrees of scalability, and are more so focused on outcomes: high quality links that can be stacked to move the needle on organic rankings.
Broken Link Building
Far from new information, this is a near timeless link building strategy that can still get results and be scaled quickly.
The biggest challenge when running a BLB campaign is finding a piece of content that is a critical reference point in terms of making the argument or legitimizing the pages that are linking to it.
From there the caveats are:
The last consideration I'll leave you with for BLB is who the pitch is coming from; it better be 1) an email address on the client's domain or 2) an "independent 3rd party" that is advocating for the content being pitched because it aligns with their core mission or values.
Evergreen Resource Development
Buying guides still work really well for this, but it’s becoming incredibly competitive with many affiliate players starting to invest serious resources (and dollars) into their content, and new affiliates popping up every day.
The trick here is to bridge the gap and provide something of true value, beyond a buying guide, but to do so in a way where your content offers something new and unique that does not yet exist within your vertical.
Here’s a great example from Buffer on how to manage daily social media updates for millions of followers:
That has earned them 310 links from 44 domains:
Problem Solving & Process Content
If we weren't talking about link acquisition, I would probably address the creation of problem solving content by speaking to the traffic potential of larger platforms with installed, high-intent audiences such as LinkedIn or Quora. But since this is about link building, let's look at some native examples.
KISSmetrics is known for their content marketing prowess, but what’s not often reflected upon is their level of SEO in the B2B software space.
These guys crush lead generation from SEO, and a big reason behind their ranking success is their ability to nail process and problem solving content that attracts gobs of links.
Take for example this post on how to boost conversions, which earned them over 400 links from over 60 linking root domains.
Interviews & Influencer Marketing
Ego-baiting is still alive and well, especially in all of the verticals where it hasn’t been played to death (like in SEO for example).
As such, landing an interview with an authority in your space, or leveraging the audience and trust of influencers in your space, is a great way to build visibility for your content that can ultimately lead to a nice inflow of links.
The Harvard Business Review uses interviews, like this one with the Founder of Starbucks, to rake in links to the tune of ~180 from ~80 domains on just this single post.
One of the smartest ways this can be done is actually through using properly messaged case studies with a high level of production value.
Take, for example, the case study Bitly did on Omnichannel Ecommerce, which resulted in over 30 links from 19 LRDs; I'd be willing to bet these are all organic links earned without outreach.
This becomes even more believable when you look at their case study landing page which has racked up over 700 links from well over 200 linking domains:
Unlike the other 2 SEO niches, hospitality tends to be a bit easier in terms of scaling link building, since you usually have more assets at your fingertips to leverage for results. All of the below tactics, while useful in other verticals, have driven the most effective link acquisition results specifically for our Hotel clients.
Link reclamation can be incredibly scalable and effective if you get the pitch right, and put the energy into efficiency (i.e. scale).
The 2 most common approaches we use to reclaim links to support Hotel SEO are:
Discounts & Affiliate
This one is pretty straightforward; the nuance here is approaching it from a second tier, meaning you offer your link targets something to offer to their audience.
This is also commonly used in Ecommerce to help build relationships, but it makes the offer more attractive to more established bloggers and niche publishers: not only do they get something for themselves, they can now also offer something of value to their audience, reflecting additional value on them as a trusted advisor.
Where this becomes less straightforward is when you're out proactively building an affiliate network with travel bloggers. The way to leverage this from an SEO perspective, as opposed to just using it to increase brand awareness and stand on the shoulders of other networks to drive more bookings, is by setting up your affiliate links in an SEO-friendly way.
There are several ways to do this but the 2 that are the most common are:
This is one of those questions I get asked on a weekly basis.
And the answer is only going to upset you – but here it is: it depends.
Let me caveat that this is based EXCLUSIVELY on my experience and may vary wildly from what others experience, but here goes:
On top of all of this I’ve heard stories about link velocity being built up over a few month period, no movements at all in terms of rankings, and then all of a sudden a big pop (not related to an algorithm update), which makes it really hard to attribute to individual links.
Here’s an example of what this looks like:
The best advice/insight I can provide on this question as a whole is this:
Don’t try to attribute individual activities in SEO to any one strategy or tactic
SEO works best when it's executed and managed holistically as a system, not in a vacuum.
I took a stab at sort of answering this question in detail when I created my guide on how to implement a keyword strategy, but I’m going to try to dive back in for the purposes of this post.
This is a really difficult activity for a lot of SEOs, and the reason (IMHO) is that most SEOs are very analytical, which in most areas of SEO serves you very well.
However, this particular element of implementing an SEO strategy is as much an art as it is a science.
For parts of this process it's very straightforward: say the keyword list includes 12 keywords that are all variations on a single, identifiable topic. Create a new URL, select the root term for the URL, write the page title to focus on the root term + value prop + 2-3 modifiers, and create the page content to lend context to the additional term variations.
Boom, easy (relatively speaking).
Where this gets difficult is when the keyword lists get big and the variations begin to blur; more times than not, SEOs overcomplicate their targeting.
More times than not, when we’re brought into a new project, mass cannibalization is afoot.
The previous SEO or marketing manager thought that the best way to build relevancy for their target keywords was to place their head term in the page title and URL of 3-4 blog posts per month and then crank out thin, non-value creating posts as fast as they could.
What a waste.
It's true that topic relevance can be built in concert, and also that it often takes time… but amassing a trove of posts targeting the keyword "best yeezy boost 350" never worked for anyone. I promise.
Instead, categorize your keywords into themes based on topic overlap.
Find connections between terms that don’t share any of the same words, but speak to the purpose of the page, support a process where multiple terms are included, or can be woven into sections on the same URL to lend context to the focus topic.
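As a rough illustration of that grouping step, here's a naive Python sketch that buckets keywords into themes by shared (non-stopword) tokens. It's a first pass to speed up the manual work, not a replacement for the judgment described above, and it will miss connections between terms that share no words at all:

```python
def group_keywords(keywords, stopwords=frozenset({"best", "for", "the", "a"})):
    """Greedy grouping: keywords sharing any non-stopword token
    land in the same theme."""
    themes = []  # list of (token_set, member_keywords)
    for kw in keywords:
        tokens = {t for t in kw.lower().split() if t not in stopwords}
        for theme_tokens, members in themes:
            if tokens & theme_tokens:       # shares a token with this theme
                theme_tokens |= tokens      # grow the theme's vocabulary
                members.append(kw)
                break
        else:
            themes.append((tokens, [kw]))   # start a new theme
    return [members for _, members in themes]
```

Running it on a toy list like `["roll up construction signs", "construction signs cheap", "traffic cones"]` puts the first two in one theme and the cones in another.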
Here’s how we do this for the 3 niches in this post:
Mapping keywords to content for Ecommerce is all about intent, and supporting intent at each level of the conversion funnel and site architecture.
Intent is of course important here too, but for Lead-Gen it's not as straightforward, since you don't have cut-and-dry "category" and "sub-category" pages; instead I prefer to approach B2B sites with the following content groups:
If you've learned nothing so far in this post, I hope it's at least become apparent how different SEO strategy is across different verticals, and that SEO for Hotels and Resorts is a different beast altogether.
Hospitality sites tend to have an extreme focus on 2 core functions for content:
Because of this, grouping keywords into topical buckets to map them to pages is relatively easy.
Geo-focused keywords with modifiers such as [country], [city], [state], [zip code], and [near] should all be attacked in sections where the architecture is centralized under the geo-targeting.
What does that mean?
It means Google uses geographic targeting as a leading indicator for ranking search results, and hence it should be the parent of all the rest of your content in your document tree.
The practical application of this means if you’re targeting terms for hotels near Six Flags in New Jersey, New Jersey needs to be the parent directory with child pages for the activities and attractions, so /new-jersey/six-flags-great-adventure/ vs. /six-flags-great-adventure/new-jersey/.
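That directory logic is simple enough to express in a couple of lines; this sketch just slugifies the pieces and puts the geography first:

```python
def slugify(name):
    """Lowercase a name and swap spaces for hyphens."""
    return name.lower().replace(" ", "-")

def geo_path(geo, attraction):
    """Build a geo-first URL path: the geography is the parent directory."""
    return f"/{slugify(geo)}/{slugify(attraction)}/"

print(geo_path("New Jersey", "Six Flags Great Adventure"))
# -> /new-jersey/six-flags-great-adventure/
```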
For content that lives outside of geography (which in reality is almost nothing for location-based businesses), where we're going for big SEO wins, i.e. the likes of "best places to [activity name]," blog posts are a solid content vehicle for achieving this.
Here’s a great example:
Funny enough, this post ranks for Geo terms (keywords including the "Caribbean" geo-modifier) in addition to a slew of "best diving spot" keywords. Good for them.
But this is exactly the kind of content I would be focusing blog posts on for Hotels.
The beautiful part about refining an SEO process that works, is that the process itself is the same across all verticals – so for this section I only need to write it out once 🙂
Most SEO Companies start their process with an exhaustive audit, more times than not I see this as a bastardized derivation of Annie Cushing’s Audit Template (which is stellar if you’re looking for one).
We do this a bit differently.
We DO an audit, but we're only looking for critical elements or super-fast wins. We DON'T blow tens of hours on a comprehensive audit where 80% of the recommended changes would yield a minimal increase for a maximum amount of effort.
The 80/20 rule is alive and well in SEO.
We run a full site crawl (with permission from the site owner) using DeepCrawl and a ScreamingFrog instance we host on AWS.
We then review:
I mentioned most of this in the earlier section. We start on-site (before moving onto traffic sources, rankings, etc.) first by gathering our findings from the crawl, but then also include a manual review of:
Herein lies our core differentiator; besides me being an owner/operator of an 8 figure Ecommerce website (which lends hands-on experience to growing revenue for transactional websites), FTF employs more developers than we do analysts.
This is because I never wanted to be in a position where we made specific, technical recommendations but the Client was unable to implement them due to lack of technical resources.
This is our proprietary take on keyword research. What makes it proprietary is simply our process, the extreme amount of data we pull in, and the takeaways this enables us to deliver in terms of keyword prioritization.
This has been written about in-depth before here, here, and here. What we do a bit differently is not the process (which is sound and, again, pretty standard) but the data we find and bake into our keyword matrix.
This is a breakdown of all the content that we recommend building, moving, enhancing, deprecating, or redirecting based on:
1) new keyword opportunities identified in the keyword matrix and
2) content gaps identified in organic search.
This includes all individual configuration details at the URL level which means meta attributes (title, description, slug, and header structure), content requirements (keyword use, content length, topics to cover), information architecture (where the page should live in the document tree, how it should be linked in navigational elements, and internal links to and from the page), and lastly which design patterns should be used to support the user experience (mobile first of course).
If new content is needed to support specific campaign elements (increasing the overall contextual relevancy of pages, creating linkable assets, or content required for partner sites), it's run contemporaneously with the next task set.
Content that doesn’t get seen earns no links and shares and is exponentially harder to rank. While the depth and breadth of your content does matter, it doesn’t matter nearly as much as your efforts to promote that content.
Please refer back to the previous section for how we built links for each of the 3 case study sites.
Before I jump into the specific KPI’s we used for these campaigns, I thought it would be worthwhile to share some insights from a friend and fellow SEO, HubSpot’s Global Director of Growth, Matt Barby:
“Each SEO campaign is different and requires an element of customization when it comes to reporting on its success. That said, there are a few KPIs that come up time and time again that I rely on to get specific insight(s).
The first is the obvious one – overall organic traffic.
Depending on the scale of the project I’ll look at this daily/weekly/monthly. Moving further down the funnel I’d be looking for conversions from organic search.
This could be in the form of direct sales, leads generated or even data capture (all depends on the project).
To support these two, I’d keep an eye on branded search volume growth (tracking general awareness) and monthly backlink acquisition – these KPIs are way more skewed towards the top of the funnel but are really important to track in order to identify trends in your overall data.”
– Matt Barby
Global Head of Growth, HubSpot
For more SEO insights from Matt, check out MatthewBarby.com
The following set of data points are the key performance indicators we track during the life of an SEO engagement:
Number of Pages Indexed + Index Ratio – There are some other metrics we look at alongside this, like average crawl rate, but overall the indexed page count over time and the index ratio are like the pulse of a website. This lets us know how valuable Google is finding the site's content and pages, and if we're chopping out loads of thin or duplicate pages (the usual culprit is parameterized URLs) we want to watch and make sure Google is picking up that those URLs have been deprecated.
Query Impression Volume – If you're running an SEO campaign, the literal goal is to increase the visibility of your website in organic search; the volume of impressions where your website's URLs are shown to people in organic search results is the literal measure of this.
Number of Page 1 Rankings – This is a report we have set up to run automatically daily and weekly. For weekly reporting we use AWR Cloud and for daily tracking, especially of SERP beta (or the rate of flux among rankings for a particular keyword), we use SERPwoo. This report is grouped in number of rankings on page one (overall), top 5, top 3, and those sweet #1’s.
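The grouping in that report is just position bucketing; here's a minimal sketch of how you might compute it from a list of tracked keyword positions:

```python
def page_one_report(positions):
    """Bucket keyword positions the way the report above is grouped:
    page one overall, top 5, top 3, and #1 rankings."""
    return {
        "page_1": sum(1 for p in positions if p <= 10),
        "top_5":  sum(1 for p in positions if p <= 5),
        "top_3":  sum(1 for p in positions if p <= 3),
        "no_1":   sum(1 for p in positions if p == 1),
    }
```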
Organic Traffic (Brand and Non-Brand) – Another literal measure of pure SEO success, the total number of visitors from organic search. This is best looked at month over month, but there are certain campaign elements that when they’re launched will cause us to keep an eye on the realtime traffic report in GA. The important part about this KPI is the split between brand versus non-brand traffic. This is extremely difficult to dial in with a high level of accuracy, thanks to “not provided,” but there’s still value in trying to split this out the best you can. This shows the impact of brand awareness campaigns vs. the correlation against % of total traffic from new visitors which is driven mostly from non-brand keywords.
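For the brand vs. non-brand split, a crude first cut is keyword matching on whatever query data you still have (Search Console queries, for instance). The brand terms below are placeholders; this is a sketch, not a substitute for proper segmentation:

```python
import re

def split_brand_traffic(keywords, brand_terms=("acme", "acme store")):
    """Split query keywords into brand and non-brand buckets by
    case-insensitive substring match against known brand terms."""
    pattern = re.compile("|".join(map(re.escape, brand_terms)), re.I)
    brand = [k for k in keywords if pattern.search(k)]
    non_brand = [k for k in keywords if not pattern.search(k)]
    return brand, non_brand
```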
Number of Keywords Ranked – This is your measurement metric for your site's overall keyword footprint. Step one here is to grow your overall footprint by building rankings for more keywords, which will start out looking like this:
where most of your rankings are not on page 1 (i.e. positions 1 through 10), and then using the strategies I’ve outlined in this post to move from the above scenario, to one that ideally looks more like this:
Revenue from Organic Search – All the traffic in the world doesn't matter if you're not able to monetize it, so this is a critical KPI that we track and report on to make sure the ROI for SEO is always positive.
For starters, it's easiest to answer this question in reverse: when the rankings come, so do the results.
The time to rankings has everything to do with the size of the opportunity relative to how established your site is and how big of an impact fixes might have.
This question has a ton of overlap with the previous one on “time to see results from links” – and the answer is honestly the same – it depends.
It’s going to be different in every vertical and to a large extent, every keyword.
The best way to approach answering this is on a keyword by keyword basis, looking at individual difficulty scores.
Here's my extremely generalized rundown using one of my favorite keyword difficulty scores to this day: TermExplorer's.
Based on TermExplorer’s Keyword Analyzer results and provided keyword difficulty scores:
Difficulty score: 1-3
What's needed to rank: a dedicated page targeting the keyword at the meta attribute level, with minimum content equal to the average word count of the top 10 ranking pages. Trust signals from a minimal number of external sources, i.e. links, social signals, and click-throughs from the SERP for the target keyword.
Time to rank: if your site's DA/TF is above the average of all sites currently on SERP1, likely 3-4x your site's average total crawl rate, but give it 2-3 weeks to be safe. If your site's DA/TF is below the average of all sites currently on SERP1, double the time.
Difficulty score: 4-5
What's needed to rank: a dedicated page, not more than one directory off the root, targeting the keyword at the meta attribute level, with minimum content equal to the average word count of the top 10 ranking pages. Minimum linking root domains equivalent to the average of the URLs ranking in the top 5 positions.
Time to rank: if your site's DA/TF is above the average of all sites currently on SERP1, likely 4-6x your site's average total crawl rate, but give it 4-6 weeks to be safe. If your site's DA/TF is below the average of all sites currently on SERP1, double the time.
Difficulty score: 6-7
What's needed to rank: a dedicated page in the root directory, targeting the keyword at the meta attribute level, with content equal to 150% of the average word count of the top 10 ranking pages. Minimum linking root domains equal to 150% of the average of the URLs ranking in the top 3 positions.
Time to rank: if your site's DA/TF is above the average of all sites currently on SERP1, likely 4-6x your site's average total crawl rate, but give it 6-8 weeks to be safe. If your site's DA/TF is below the average of all sites currently on SERP1, triple the time and the signals needed.
Difficulty score: 8+
What's needed to rank: DA65+, or likely 8+ months of established link velocity, earning 25+ new linking root domains (each DA25/TF15+) per month.
Time to rank: Extremely variable based on competitors, net link acquisition of top 10 ranked sites, and overall audience engineering capabilities of sites currently in top 10.
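The rundown above can be encoded as a rough lookup. The week ranges and multipliers come straight from the heuristics I just listed; treat every output as an estimate, not a guarantee:

```python
# Rough encoding of the generalized time-to-rank rundown above.
def time_to_rank_weeks(difficulty: int, above_serp1_avg: bool) -> str:
    """Estimate time to rank from a TermExplorer-style difficulty score
    and whether your DA/TF is above the SERP-1 average."""
    if difficulty <= 3:
        weeks = (2, 3)
    elif difficulty <= 5:
        weeks = (4, 6)
    elif difficulty <= 7:
        weeks = (6, 8)
    else:
        return "highly variable; depends on competitors' link velocity"
    if not above_serp1_avg:
        # double the time for 4-5, triple it for 6-7 (and 1-3 doubles too)
        factor = 3 if difficulty >= 6 else 2
        weeks = (weeks[0] * factor, weeks[1] * factor)
    return f"{weeks[0]}-{weeks[1]} weeks"

print(time_to_rank_weeks(2, above_serp1_avg=True))   # 2-3 weeks
print(time_to_rank_weeks(5, above_serp1_avg=False))  # 8-12 weeks
print(time_to_rank_weeks(7, above_serp1_avg=False))  # 18-24 weeks
```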
I love this question.
What we're seeing as a result of this (as of January 2017) is that content hidden from mobile users via media queries is still being used to score the relevancy of those pages for their keywords.
This is YUGE, and it means you can have a big, beefy desktop experience where you select design patterns so as not to overwhelm users with too much text (e.g. accordions, tabs, jQuery window shades), but Google will still crawl and score all of that juicy content towards your SEO efforts.
In terms of focusing on CRO for Mobile, the biggest mistake I see sites making on their responsive page versions is not being “finger friendly” enough.
I’m not saying make your buttons enormous, but you do really need to think in terms of how to best utilize this real estate.
One interesting trend is how meticulous we as mobile power users have become at inspecting what’s on the screen, even when it seems very small.
What I've been doing to keep tabs on this, and to leverage CRO as best we can to drive conversions from the increasing share of mobile traffic we're seeing across all websites, is to stay abreast of shifts in design patterns.
I do this by following pattern libraries that track mobile design trends, like this one.
Simple answer here: follow the search volume.
In some mercantile verticals, all the search volume is for the products (Amazon tends to kill it in these verticals), whereas in others there's virtually no search volume for product-specific keywords and instead all the volume is for broader 2-4 word terms.
In these verticals, it's best to run with a siloed category or sub-category architecture and build large stores of relevant child pages to bubble their relevancy up to a parent directory.
We're in a time where this can be done using itemprop tags and fully blown-out structured data markup, and in many instances it can even be done using internal links, but there are edge cases to everything in SEO.
For keyword niches where there is heavy competition for both the category terms and the product terms, i.e. if you're a distributor for very established brands (think Nike, North Face, BBS, Thule, Burton, etc.), you likely need to approach your SEO via a flat architecture.
In those instances, I would build large content hub pages targeting the brand terms, extend those hub pages incrementally by adding keywords on the sub-category URLs (but not as child directories), and then also place the product pages in the root.
So for example, if I was selling winter sportswear, and carried Burton snowboard bags, my architecture may look like this:
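(The original diagram didn't survive here, so as a purely hypothetical illustration, with the store name and slugs invented, a flat architecture for that example might look like:)

```
example-store.com/burton/                    <- brand content hub page
example-store.com/burton-snowboard-bags/     <- sub-category page, in the root, not nested under /burton/
example-store.com/burton-wheelie-gig-bag/    <- product page, also in the root
example-store.com/burton-space-sack/         <- product page, also in the root
```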
The impact here simply forces innovation and creativity, which is awesome!
It means you need to keep a tighter experience to drive conversions and cater to the intent of users who may be searching, reading, and experiencing your B2B content on their phone.
In my experience, someone reading B2B content on their phone is either at the very beginning of the customer journey, having just heard about your business from a friend or seen an ad, or at the very bottom of the buyer journey, having just been told to FIND A SOLUTION NOW!
In either case, if you're going to have a chance at the sale, it's critical that you accommodate this lead by providing useful solution content to answer their most common questions, offering options to quickly email pages to colleagues, offering downloadable resources, and, if possible, using a design pattern such as a sticky footer or gesture-based element to show the phone number.
These are more CRO-focused tips than SEO-focused, but I figured at this point in the post I’ve sort of maxed out all the SEO questions in this context 🙂
I hope you found something of value in this now-7,000-plus-word post that I've spent weeks writing.
If not, that’s cool too.
What I would ask in return, whether you found it useful, are going to send it along to someone you know, or are seething in disappointment: please take a moment and drop me some feedback below in the comments.