Google released a new version of the Penguin algorithm about a month ago. Websites that were penalized by a previous Penguin update should have recovered with the new update if they removed spammy links in the meantime.
This did not happen for all websites. What’s the reason for that and what can you do to make sure that your own website will recover?
What is Google’s Penguin update?
Google Penguin is a filter in Google’s ranking algorithm that targets websites that use spam methods to get backlinks. For example, Google does not like paid backlinks or automatically created backlinks from forums, blog comments, and similar sources.
If you used low-quality links like these to promote your website, chances are that Google penalized your website with one of the Penguin updates.
Why should websites have recovered?
According to Google, websites that cleaned up their bad backlinks should recover with the new Penguin update. In an online discussion, a webmaster said that this did not work for him:
“After disavowing and removing A LOT of domains and putting right our previous problems, although our traffic is marginally up from this latest penguin data refresh, we’re not back to previous traffic numbers AND have earned some great links in the mean time, pushing out great content daily.
Is penguin still unhappy with our site, or do we lack good/natural/high quality links pointing to our site and this is due to our ranking issues?
We didn’t disavow all scraped content as Google is supposedly good at detecting and ignoring this, but this may have contributed to our link profile. Our disavow file is updated weekly and has been since Oct 2013.”
What does Google say about this?
Google’s John Mueller answered this webmaster in the online discussion:
“I think sharing the disavow file (as well as its history) would make it a lot easier for folks here to notice some of the things that may be worth mentioning.
For example, sometimes it shows that changes were made recently, perhaps after the most recent refresh. These kinds of changes can take time to be reprocessed (recrawling the URLs alone can take several months, and then it would require an update of the algorithm’s data), and you wouldn’t expect to see changes based on those submissions to be visible until the next refresh takes place.
Especially because these things take so much time to be recrawled & reprocessed, I think it really helps to have as many critical eyes on those submissions as possible. Maybe you have everything covered already, or maybe there are still some issues that you didn’t realize, which others could help you resolve as early as possible.
The people active here in the forums are not out to get you for things done in the past, they want to give pointers based on things they’ve seen over the years, so while I can’t guarantee that they’ll be able to flag ‘that one thing that’s holding your site back’ (usually it’s not just one thing anyway), they usually have a really good eye for things that are commonly forgotten (purposely or not).”
What does this mean exactly?
Unfortunately, the answer isn’t very clear and John Mueller does not go into detail. The following are possible reasons why the website hasn’t recovered:
- the webmaster missed important bad links in the disavow links file
- the webmaster submitted the disavow links file after the new Penguin algorithm was released
- Google hasn’t processed the disavow file yet
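To double-check the first point, it helps to know what a disavow file looks like: it is a plain UTF-8 text file with one entry per line, where a `domain:` prefix disavows every link from that domain, a full URL disavows a single page, and lines starting with `#` are comments. A minimal sketch (the domain and URL below are placeholders, not real spam sources):

```
# Spammy directory that refused to remove our links
domain:spam-directory-example.com

# Single page with a paid link we could not get taken down
http://blog-example.net/paid-links-page.html
```

If bad links come from many pages of the same site, the `domain:` form is usually the safer choice, because it covers URLs you may not have discovered yet.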
If you haven’t done so yet, create your SEOprofiler account now to use the link disinfection tool and all the other website promotion tools in SEOprofiler: