“For the most part we don’t have any type of special whitelist where we can say, well this website is actually okay, therefore we will take it out of this algorithm.
In some individual cases we do that, so it depends on the algorithm. For a lot of the general search algorithms we don’t have that ability, but for some individual algorithms we do need to be able to take manual actions.
For example, the SafeSearch algorithm might pick up the words on a website as being adult, similar to an adult website, but actually they’re talking about, I don’t know, animals or something completely unrelated.
And in those kinds of cases the SafeSearch algorithm would have kind of a whitelist which would say, well, this is a problem that we’re picking up incorrectly with the algorithm, and we will add them to the whitelist for the moment and work to improve the algorithm so that it doesn’t take this into account in the long run. But in the meantime we can use that as a stopgap measure to help. That is something that sometimes makes sense.
We don’t have that for a lot of the other algorithms like Penguin and Panda.”
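The stopgap pattern Mueller describes, exempting known false positives from a classifier while the underlying algorithm is improved, can be sketched as follows. This is purely illustrative: the keywords, domain names, and function are hypothetical and have nothing to do with Google’s actual SafeSearch implementation.

```python
# Hypothetical sketch of a whitelist override for a naive keyword filter.
# Everything here (keywords, domains, function name) is made up for illustration.

ADULT_KEYWORDS = {"naked", "xxx", "explicit"}      # hypothetical trigger words
WHITELIST = {"wildlife-documentary.example"}       # known false positives

def is_filtered(domain: str, page_text: str) -> bool:
    """Return True if the naive filter would hide this page."""
    if domain in WHITELIST:
        # Stopgap: exempt sites the filter flags incorrectly
        # while the underlying algorithm is being improved.
        return False
    words = set(page_text.lower().split())
    return bool(words & ADULT_KEYWORDS)

# A page about naked mole rats trips the keyword filter,
# but the whitelist entry keeps it from being hidden.
print(is_filtered("wildlife-documentary.example", "naked mole rats live in burrows"))  # False
print(is_filtered("random.example", "naked mole rats live in burrows"))                # True
```

The whitelist check runs before the keyword match, so exempted sites skip the classifier entirely; the long-term fix is to improve the classifier so the list can shrink.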
Google’s Penguin update targets websites with many low-quality links. You can find the low-quality links that point to your site with the link disinfection tool in SEOprofiler.