How to Optimize for Keywords in 2015
In advertising you can get away with a passing reference to your product. It’s a stimulus. It reminds people that they need something. But mass market media has to be all things to all people. You can’t meet every individual’s needs with one 30-second spot, but don’t worry – that’s what your website is for.
As an industry, our previous efforts to optimize a website for every keyword have been misguided. We've been going after synonyms: finding different ways to say the same big-money keyword and squeezing them into whatever content we were going to write anyway. We don't find keywords relevant to more than one stage of the purchase cycle, and we often don't write content that's relevant to any stage of it.
Google’s Freshness algorithm used to mean that you got big rewards for putting up “fresh content” every day. It didn’t matter what that content was – Google just wanted to see that your site was updated. Obviously this incentivized the SEO industry in completely the wrong way: we’ve been more than guilty of contributing to “content shock.”
The problem for search marketers who have been creating content for more than a few years is simple: Google assumed we would find it obvious that content had to be good, despite the fact that bad content worked just fine.
So last year Google clarified the "thin and duplicate content" that the Panda algorithm originally targeted. It's now "thin and duplicate content with little or no added value."
For example, as an industry we knew that forex trading was complicated. We knew people wanted to know what it was and how to do it. So in true SEO fashion we set about writing that article with slight variations, hundreds of times, and spreading it across the Internet.
Obviously that doesn't work anymore. Which is great, because we only have to write that article once. All that matters is that our page explaining what forex trading is beats everyone else's, including Wikipedia's – and Google is giving us more and more opportunities to optimize our pages to demonstrate our expertise. Adding entries to Wikidata (and previously Freebase) can help us appear at the top of search results, as long as our content is expert enough. Broadly speaking, there are a large number of ranking factors that can push your page up or down the rankings, but the Knowledge Graph is a huge opportunity to get there without quite the same level of investment required to compete in popular search results.
This is the fundamental truth of SEO in 2015: Googlebot comes in at the top of your funnel. If a segment of your funnel is missing, you don't just lose customers…you lose Google. You don't get rankings for the keywords at the bottom of the funnel just by answering people who are asking "where do I trade forex?".
Brands need to understand that writing comprehensive content that answers all of the questions their audience is searching for could get them ranking first without building another link.
We do see links as part of the marketing mix, but if you have enough links to rank in the top five, you have enough links to rank first. People don’t do all of their research on one single website, so it’s important to get mentioned in other publications where your audience is – that’s not about links, or keywords – but about answering a question, or “keyword intent,” that perhaps you can’t answer on your own site. It’s so worthwhile getting your products reviewed by influencers, for example.
Optimizing for keyword intent is much more important than optimizing for keywords in 2015. As an example, if 100,000 people search for "car insurance" each month, and 1,000 people search for "car insurance for classic cars," it's a good bet that Google expects at least 1,000 of the people searching for "car insurance" are also looking to insure a classic car. The proof is there that people want a result for that term, but it's very difficult to guess what the 100,000 people who are just searching for the most lucrative keyword are actually looking for.
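As a back-of-the-envelope illustration (hypothetical numbers and my own simplifying assumption, not a Google formula), the reasoning above treats the long-tail volume as a floor on how many head-term searchers share that intent:

```python
# Sketch of the keyword-intent argument above.
# Assumption (hypothetical): people who explicitly search a qualified
# long-tail term ("car insurance for classic cars") are a lower bound on
# how many head-term searchers ("car insurance") share that same intent.

def min_latent_intent(head_volume: int, longtail_volume: int) -> float:
    """Return the minimum share of head-term searchers we can
    reasonably assume share the long-tail intent."""
    return longtail_volume / head_volume

share = min_latent_intent(100_000, 1_000)
print(f"At least {share:.1%} of 'car insurance' searchers "
      f"may want classic-car cover")
# prints: At least 1.0% of 'car insurance' searchers may want classic-car cover
```

The real figure is almost certainly higher – this only counts the searchers who spelled their intent out.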
This is why Google’s search results will rarely include 10 service providers or retailers on the first page. Searchers might want some news – so The New York Times or The Guardian might be listed. People might want to see local providers because they don’t trust the big sites – hence the Venice algorithm. Some results might be reviews. Google wants to cater for as many intents as possible as it tries to satisfy searchers and provide a good experience. It makes complete sense for SEOs to try and satisfy as many different searchers as possible with their own websites, and in fact it’s completely necessary if they want to rank for big terms in 2015.
The Zero Moment of Truth
The Zero Moment of Truth (ZMOT) – the decision-making process – involved an average of 10 “resources” in 2011, according to Google. Google also says that the number of resources used the year before that was half that amount. Only five.
Smartphone ownership has increased significantly since 2011. The number of people signed up to a social network has more than doubled. People are researching more now than they were back then. I’d be surprised if the number of touch points before a purchase right now is less than 20. Maybe more.
So answering as many searcher intents as possible is necessary in order to win the ZMOT. And winning the ZMOT is necessary in order to rank in Google. TV advertising can provide that stimulus…but the brand that wins the ZMOT takes the money.