In our previous post on Google penalties we looked at the fundamental differences between manual and algorithmic actions and how to begin dealing with both. In this post we’ll look in detail at how to identify bad links after receiving a manual penalty, and at the best resources and tools to help you start looking.
Telling a bad link from a good link is far from straightforward, and with a large backlink portfolio I’d highly recommend getting a reputable SEO consultant to undertake a thorough link audit of your site. This is akin to having your backlink profile deep-cleaned, helping to purge all those nasty links and blog comments you had no idea even existed.
If your backlink profile is more modest in size or you have the tools, time and inclination to undertake this exercise yourself then read on.
If you’ve received a manual action from Google then the notification will tell you that your links are to blame, but with an algorithmic penalty it is much harder to establish exactly what you have done wrong. We’ll address this problem in a future post, but for now there’s no harm in purging your backlink portfolio of harmful links anyway. In fact, if you’ve been hit by an algorithmic penalty then there’s every chance that clearing out your bad links will help your site recover the rankings it previously held.
Types of Unnatural Link Penalties
There are three categories of links that can cause Google to issue a link related manual penalty notice on your site:
Unnatural Links to Your Site (Impacts Links)

This is the least serious manual action you can receive regarding unnatural links, as your site hasn’t been penalised directly but is instead being indirectly affected by web pages or sites containing “unnatural, artificial, deceptive, or manipulative links” that point to your site. Because bad links can pass on negative authority in the same way good links pass on positive authority, you would be well advised to try to get these links taken down.
Unnatural Links to Your Site

Like the above category, this indicates that Google’s Webspam team has detected “unnatural, artificial, deceptive, or manipulative links” pointing to your site, but unlike the above penalty, Google has taken action directly against your site as a result. This may be because of the number of unnatural-looking links or the nature of the sites they sit on. Google may believe that you have acquired these links in exchange for money or some other reciprocity, which breaches the Webmaster Guidelines. You definitely need to take action if you see this penalty.
Unnatural Links from Your Site

The final category concerns “unnatural, artificial, deceptive, or manipulative links” that have been found on your own site. Identifying and taking down these links should therefore be a fairly straightforward process.
Google has a huge amount of resource available online for tackling manual penalties and it’s worth familiarising yourself with some of it, starting with Google’s documentation for the three categories above. If you’re really stuck then Google’s forum pages are a good source of information and somewhere you can get direct help from other webmasters.
Tools and Sources
Your first port of call for dealing with any manual penalty should always be Google Webmaster Tools (GWT). It’s here that you will receive confirmation from Google of the type of manual action that you have received and GWT is your primary means for communicating with Google when trying to get a penalty removed. Within GWT you can download a report which, in theory (see below), identifies enough of the offending links to get the penalty removed.
To do this, log into your Google Webmaster Tools account and navigate to ‘Search Traffic’ and then to ‘Links to Your Site’. Finally click on the ‘More’ link beneath the ‘Who Links the Most’ tab. Now click ‘Download More Sample Links’.
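If you prefer to work through the exported file programmatically, the snippet below shows one way to read it in. This is a minimal sketch in Python: it assumes the export is a simple CSV or text file with an optional header row and one URL per line, which may differ from the exact format Google provides.

```python
import csv
import io

def load_sample_links(text):
    """Parse an exported links file (assumed: optional header row,
    one URL in the first column of each line) into a list of URLs."""
    links = []
    for row in csv.reader(io.StringIO(text)):
        if not row:
            continue
        value = row[0].strip()
        # Skip header rows and anything that isn't a URL.
        if value.lower().startswith("http"):
            links.append(value)
    return links

sample = "Links\nhttp://example-directory.com/page1\nhttp://spam-seo-links.example.net/post\n"
print(load_sample_links(sample))
```

From here the list of URLs can be fed into whatever checks you are running by hand or by script.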
There has been a lot of uncertainty circulating around this assertion recently, and many people believe that you need a more comprehensive list of links to your site than GWT provides if you are to successfully remove a manual links penalty. You can find more details in this article by Barry Schwartz, as well as in Matt Cutts’ published response to his query.
Whatever the truth, Google has claimed that clearing up the links identified in the GWT report will be enough to get the manual penalty lifted; meaning that you shouldn’t require third party link data from the likes of Moz, Ahrefs and Majestic SEO. Personally I’d always recommend you take the most thorough approach possible and get access to link data from one or more of these sources. If a job’s worth doing, then it’s worth doing properly, as the saying goes.
It’s worth bearing in mind that getting a full link report from any of the above providers will often involve subscribing to their service (although Majestic now gives you access to this for free, but only for verified sites). The data provided by these third party tools is extremely useful and, as mentioned before, will help you identify potentially harmful links that just won’t appear in the GWT report.
For larger backlink portfolios, I’d always recommend getting a professional link audit done as the data can be overwhelming and the process of link removal time consuming.
Rules of Thumb
Once you have pulled out all the necessary link data from GWT and any third party sources you are using, it’s time to go through and identify the bad links and try to take them down. We will look at how to record and present this data before you send it back to Google in a future post.
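Once you have exports from several sources, it helps to merge them into a single de-duplicated list before you start reviewing. The sketch below assumes each source is simply a list of URLs (real exports carry extra columns) and treats URLs differing only by a trailing slash or letter case as duplicates; adjust the normalisation to suit your data.

```python
def merge_link_sources(*sources):
    """Merge link URLs from several exports (GWT, Moz, Ahrefs, ...)
    into one de-duplicated list, preserving first-seen order."""
    seen = set()
    merged = []
    for source in sources:
        for url in source:
            # Normalise lightly so trivial variants collapse together.
            key = url.rstrip("/").lower()
            if key not in seen:
                seen.add(key)
                merged.append(url)
    return merged

gwt = ["http://blog.example.com/post", "http://dir.example.net/listing"]
ahrefs = ["http://dir.example.net/listing/", "http://forum.example.org/thread"]
print(merge_link_sources(gwt, ahrefs))
```

Working from one merged list also makes it easier to record, for each link, whether you contacted the webmaster and what happened, which you will need later for Google.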
Before we go through and look at some common offenders, there are some basic indicators to look out for when identifying bad links.
- The site hasn’t been indexed by Google
- The site contains a lot of obviously spammy internal links
- The site contains a lot of obviously spammy external links (i.e. pills, porn or poker)
- The site has a lot of obviously spammy links pointing to it
- The site is about no particular niche or subject, indicating it may have been set up purely for SEO purposes
- The site contains keywords like ‘SEO’ or ‘links’ throughout
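As a first pass over a long link list, some of the indicators above can be roughly automated. The sketch below implements only the last two bullets (keyword matches in the linking URL); the term list is illustrative, and a match is a prompt for manual review, never a verdict.

```python
# Terms drawn from the indicators above -- extend to suit your niche.
SPAM_TERMS = ("seo", "links", "pills", "porn", "poker")

def flag_suspicious(url):
    """Return the spam indicator terms found in a linking URL.
    An empty list means nothing obvious was matched, not that the
    link is safe -- every link still needs a human look."""
    lowered = url.lower()
    return [term for term in SPAM_TERMS if term in lowered]

print(flag_suspicious("http://cheap-seo-links.example.com/poker-tips"))
print(flag_suspicious("http://news.example.org/article"))
```

Anything flagged goes to the top of your manual review pile; anything clean still gets checked, just later.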
The Usual Suspects
Paid-for links are the first thing to get rid of and should be the easiest to identify because, presumably, you paid for them in the first place. Google’s Webmaster Guidelines are pretty clear about paying for links, as is Matt Cutts (see the above link), and there is undoubtedly a concerted move from Google to penalise the blog networks that offer paid-for links in guest posts as well as the sites that use them.
Site-wide links are easy to spot on any link report as they will occur multiple times across an entire domain. This is easy to check in GWT as you can download a ‘Most Linked From’ report: 10 links or more from one site would normally suggest a site-wide link. This can be, and often is, seen as totally natural by Google’s algorithm. Site-wide links usually occur on websites you have relationships with and can happen for a number of reasons (such as content management systems putting your link into the HTML of every webpage). These links are often picked up as spam by Google’s Webspam team, however, so make sure to get the webmaster to either “nofollow” them or take them down entirely.
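The “10 links or more” rule of thumb is easy to apply to a raw list of linking URLs. This sketch groups URLs by domain and flags any domain at or above the threshold; the domains in the example are made up.

```python
from collections import Counter
from urllib.parse import urlparse

def sitewide_domains(urls, threshold=10):
    """Count linking URLs per domain and return domains at or above
    the threshold -- a rough proxy for a site-wide link."""
    counts = Counter(urlparse(u).netloc for u in urls)
    return {domain: n for domain, n in counts.items() if n >= threshold}

links = ["http://partner.example.com/page%d" % i for i in range(12)]
links += ["http://blog.example.org/post"]
print(sitewide_domains(links))  # only partner.example.com crosses the threshold
```

Each flagged domain is then a single conversation with one webmaster rather than dozens of individual link removals.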
There was a time when directories on the internet were used in much the same way as a telephone directory, providing information and contact details for businesses and service providers within a certain vertical or geographical location. Whilst niche-specific and geographically focused link directories should be OK, it’s wise to take down any links from large generic link directories. Again, paying for links in directories is a big no-no. The same goes for article directories, which have recently been highlighted as something to avoid by Matt Cutts.
Guest blogging is undoubtedly one of the most contentious issues in SEO circles today (just look at the comments on Matt Cutts’ blog article to get an impression of the panic his announcement caused). The factors that can differentiate a spammy blog from a genuine one are too numerous to list here. As a rule though, ask yourself: does the blog look genuine, and does it have a readership with a genuine interest in the subject? Is the article in question high quality, or generic and badly written? Is the article unique? If the answer to any of these questions is no, then get the link taken down.
Anyone that’s ever run a blog will know what it’s like to get spammy blog comments. Commenting on other people’s blogs and articles is an extremely powerful way of engaging with your industry sector. Indeed, it’s something Matt Cutts does a lot of himself (see above link), but as he points out, there is always a danger that this could be picked up as spam. A lot of blogs automatically publish in-comment links with rel=”nofollow”, but this isn’t always the case. If it looks spammy then take it down.
Social bookmarks were one of the many casualties of Google’s Penguin update but have long been considered a fairly spammy way of acquiring links. There is always a good chance that excessive social bookmarks will get picked up as spam by a member of Google’s Webspam team. As a rule, I’d take these links down where possible as most of them carry little, if any, authority anyway.
Over-Optimised Anchor Text
The final point for discussion in this article is anchor text distribution, specifically over-optimised anchor text. With every update, Google’s algorithm is becoming more and more sensitive to patterns in anchor text across your entire backlink profile. If you’re using the same keyword-rich anchor text over and over again, then sooner or later your site is going to get flagged by Google’s algorithm or incur a manual penalty.
If you notice a lot of links in the GWT report that seem to contain the same or similar anchor text then you would be well advised to get these taken down. If these links are contained on sites, or within articles, which are particularly high quality then it could be worth trying to amend them to incorporate greater variety or to target a brand related key phrase.
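A quick way to spot over-optimised anchor text is to look at each anchor’s share of your whole profile. The sketch below computes that distribution from a plain list of anchor strings; any threshold for “too high” is a judgment call on your part, not a number Google publishes.

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor text's share of the total profile, so that
    heavily repeated keyword-rich anchors stand out at a glance."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: count / total for text, count in counts.items()}

# Eight identical keyword anchors against two brand/URL anchors.
anchors = ["cheap widgets"] * 8 + ["Example Ltd", "www.example.com"]
dist = anchor_distribution(anchors)
print(dist["cheap widgets"])  # 0.8 -- a share this high is worth investigating
```

A healthy profile is usually dominated by brand and naked-URL anchors, with commercial phrases making up a small minority.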
We’ll be discussing anchor text and good linking practice in more detail in a future post on the site.
The best defence against a manual link related penalty from Google is to make sure you steer clear of any bad practices or black hat techniques and just not get hit with one in the first place. Your website’s backlink profile is a critical digital asset and as such should be regularly maintained and cleaned up. You can do this yourself but for larger backlink portfolios a proper link audit conducted by an SEO expert may be your best option.
If you do get hit by a manual penalty then it’s important to remember to be as thorough as you can when dealing with it. There’s absolutely no point submitting a reconsideration request to Google if it’s going to get rejected because you haven’t managed to identify and take down all the bad links. Go over your link profile with a fine-toothed comb, looking for links that fall into any of the above categories.
Tools such as Link Detox can help you identify overused anchor text and potentially spammy links, but it’s always important to go through your link report with human eyes and not rely solely on third party tools. If you can, pull backlink data from a third party source such as Moz, Ahrefs or Majestic SEO. The more information you have, the more thorough your clean-up operation will be.
If you have any comments or questions about this post, or would like to discuss a specific issue with your site, please get in touch using the form below.
And connect with us on social media to stay up to date with our latest news: