Search engines constantly update their algorithms, but search engine optimisation changed dramatically in 2012 as Google made major changes. We even saw some reputable companies that have always applied white hat, best-practice approaches to SEO get hit by Google penalties. So what has changed, why, and where does this leave website owners?
Google has always said that you are not allowed to create links in order to manipulate the search results, and that any links built purely for the purpose of improving rankings are considered web spam and black hat. Most people interpreted web spam as meaning automatically generated links and believed that if they built links manually, they were safe from any penalty. The release of Google Penguin proved just how wrong this belief was. In actual fact, Google and the other search engines can't tell how links were built, as there is no difference between a manually created link and one built with automation tools like Winautomation or Ubot; they can only analyse link profiles and identify certain trends.
Pre-2012 and How Times Have Changed
Previously, Google would ignore poor links and take only the good links into account; if you had some poor links pointing to your site, Google's advice was not to worry about it. The best-practice approach to link building was to put your keywords in the anchor text, as it made the links more relevant and therefore carried more link juice. However, you can only choose the anchor text if you built the link yourself, so sites with low anchor text diversity tended to be sites with a high number of manually created links. Remember, Google says you are not allowed to manually create links for the purpose of improving search rankings. If people link to your site naturally, the chances are they will use different anchor texts, giving you high anchor text diversity.
The internet is becoming filled with poor quality content designed purely to boost websites' search rankings. In an attempt to reduce this, Google introduced Google Penguin and started to penalise sites with low anchor text diversity. Poor quality links now have a negative effect on sites. This changed the SEO world as we knew it; the trend was already moving towards effective content marketing and blogger outreach, and this transition was sped up.
What are poor quality links?
These are links that weren't built naturally. Google believes links should be created naturally, by people writing good quality content and others sharing it. Any links built purely for the purpose of improving your site's rankings are considered web spam. This doesn't mean you can't manually create links any more, just that any links you build should look as though they were naturally created.
Google Penguin is an algorithmic penalty that runs roughly once a month at random times, penalising sites with unnatural link profiles, e.g. link profiles that do not appear to have been naturally created because they have low anchor text diversity. Google Penguin caused a huge shake-up in the SEO world and marked a change in how links are built.
Protecting against Google penalties is simply a case of building natural links, focusing on content marketing and blogger outreach, and knowing your link profile. You should run regular live link reports detailing, across all of your links, your exact match anchors (e.g. "keyword"), your broad match anchors (e.g. "keyword" plus extra text), and the anchor text that doesn't include any keywords. There are no exact figures to aim for, as every niche is different, but if any of your keywords accounts for 30% or more of your anchor text, you should look to reduce it by building links with generic anchors. A natural link profile will have a high percentage of branded anchor text (your company's name) and a high level of naked URLs (www.yourdomain.com).
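This kind of link report check is easy to automate. The sketch below is a minimal, hypothetical example, assuming you have already exported your links' anchor texts into a list; the sample anchors and the 30% threshold are illustrative, not prescriptive.

```python
# Hypothetical sketch: measure what share of a link profile's anchors
# exactly match a target keyword, and flag keywords at or above a
# chosen threshold (30% here, purely as an illustrative figure).
from collections import Counter

def anchor_diversity(anchors, keywords, threshold=0.30):
    """Return {keyword: (share_of_anchors, over_threshold)}."""
    counts = Counter(a.lower().strip() for a in anchors)
    total = sum(counts.values())
    report = {}
    for kw in keywords:
        share = counts[kw.lower()] / total if total else 0.0
        report[kw] = (share, share >= threshold)
    return report

# Made-up export of anchor texts from a link report:
anchors = [
    "cheap widgets", "cheap widgets", "cheap widgets",
    "Acme Ltd", "Acme Ltd", "www.acme.example", "click here",
]
report = anchor_diversity(anchors, ["cheap widgets"])
```

Here "cheap widgets" makes up 3 of 7 anchors (about 43%), so under this assumed threshold it would be flagged as a candidate for dilution with branded or generic anchors.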
The difficulty with Google Penguin is that people started trying to game the system and make competitors' sites look like they had unnatural link profiles by building lots of spammy links to them, a practice known as negative SEO. Some people deliberately launched negative SEO attacks on sites in an attempt to lower their anchor text diversity and trigger a penalty, as explained by Matt Cutts in his negative SEO video.
Google launched a disavow links tool in October 2012 to protect against negative SEO: users can submit links from their link profile that they wish Google to ignore. The disavow tool is essentially a crowdsourcing tool where webmasters do a lot of the spam filtering on behalf of Google. The difficulty, though, is that a lot of SEOs don't know what they are doing and will simply disavow all of their links and start again. For example, imagine you put together a really good link building campaign and saw good movement in the search engine results pages, but someone else built links on the same sites, didn't diversify their anchors, got hit by Google Penguin, and then reported those links to Google, with the result that the links get discounted. There is nothing wrong with the sites providing those links; in the second case, the person simply didn't know what they were doing. Is this a useful tool or a recipe for disaster? It will no doubt be improved soon enough.
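For readers who haven't used it, the tool accepts a plain text file with one entry per line: an individual URL, or a whole domain prefixed with domain:, with lines starting # treated as comments. A sketch of what a submission might look like (the sites below are made up):

```text
# Hypothetical disavow file: links we believe are spammy
# and did not build ourselves.

# Disavow a single page:
http://spam-blog.example.com/bad-links.html

# Disavow every link from an entire domain:
domain:link-farm.example.net
```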
For more digital marketing advice, sign up to our newsletter.