Negative SEO has become a real problem. While most webmasters know that competitors can damage a website by pointing low-quality links at it, there are other things that your competitors can do to hurt your Google rankings.
What is negative SEO?
Negative SEO describes the things that other people can do to lower the position of your web pages on Google.
Negative SEO started to occur after Google’s Penguin update, when it became common knowledge that Google applies penalties for manipulative links. Some people began building artificial backlinks that pointed to competitors’ websites in order to trigger Google penalties for those sites.
Fortunately, webmasters can counteract these negative SEO links by disavowing them.
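A disavow file is a plain text file that you upload through Google's disavow links tool. A minimal sketch of the format, with hypothetical domain names standing in for real spam sources:

```text
# Links from these domains were built by a third party
# and should be ignored by Google.
domain:spammy-link-network.example
domain:paid-links.example

# Individual URLs can also be disavowed.
http://blog.example/bad-page.html
```

Lines starting with # are comments, the domain: prefix disavows every link from a domain, and plain URLs disavow a single page.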
The next level of negative SEO: faking bad user experience
It seems that the next level of negative SEO has started. Bartosz Góralewicz from Poland discovered more things that your competitors can do to damage your web page rankings.
By faking a bad user experience, he managed to damage the rankings of one of his web pages that previously ranked very well (positions 1 and 2).
He created a bot that searched Google for the keyword and clicked on a random website in the results. The bot never clicked on his own website, which massively lowered the click-through rate of his page.
Within a few days, the rankings of his web page plummeted. During the test, he monitored other factors closely to ensure that the lost visibility wasn’t caused by something else.
What do you have to do now?
To make sure that your website does not become the victim of negative SEO, it is essential to do the following:
- Regularly monitor the positions of your web pages.
Otherwise, it might be too late when you notice negative changes.
- Monitor the backlinks that point to your website.
The link disinfection tool will show you bad links that point to your site.
- Make sure that the onsite optimization of your web pages is spotless. The website audit tool and the Top 10 Optimizer can help you optimize the pages of your website.
- Avoid index bloat. Many sites have issues with pagination, or with filters for the various sizes or colors of their products. If you do not use a correct robots.txt file, search engines might index several variations of the same page. The website audit tool can help you identify these duplicate content problems.
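For example, a robots.txt file can keep crawlers away from filtered URL variations. The parameter names below (color, size) are hypothetical; adjust them to your own URL structure:

```text
User-agent: *
# Block crawling of filtered product-list variations
# such as /shoes?color=red or /shoes?size=42&color=red.
Disallow: /*?color=
Disallow: /*?size=
Disallow: /*&color=
Disallow: /*&size=
```

Note that robots.txt only prevents crawling; for variations that are already indexed, canonical tags or a noindex directive may be the better tool.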
Tracking your web pages and links is important. Fortunately, the tools in SEOprofiler help you do this as efficiently as possible. If you haven’t done it yet, try SEOprofiler now and find out how it can help you protect your rankings.