Google: Spammy Link Networks Share Vulnerable Sources With Each Other
Google's John Mueller said that when a link spammer (or really any type of spammer) finds a vulnerability on a website, they might share that vulnerability with their network of spammers. So it is important to close any vulnerabilities on your site, not just against a specific spammer, but against future spammers who may have been told about your site.
John said this in a Reddit thread, writing, "before you allow pages like that to be indexed (often low-effort link-builders share their sources, so they'll be back)."
This came up when a site owner noticed link farms creating content on his site and pointing tons of links at those pages. He figured out the pattern and 404ed those pieces of content. John said, "If you 404/410 the pages that receive the links, those links won't have any effect. Good job on spotting those and neutralizing them."
John did note that Google is already good at ignoring those links, saying, "While Google ignores those links for the most part anyway, getting rid of spammers who leech off of your site (those stupid "web 2.0 links") is always good - make your site for real users, not for spammers."
Here is what John posted on Reddit, in its entirety:
If you 404/410 the pages that receive the links, those links won't have any effect. Good job on spotting those and neutralizing them. While Google ignores those links for the most part anyway, getting rid of spammers who leech off of your site (those stupid "web 2.0 links") is always good - make your site for real users, not for spammers. You might even go a step further and increase the threshold for quality for all new users on your site, before you allow pages like that to be indexed (often low-effort link-builders share their sources, so they'll be back).
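If you want to act on the 404/410 advice programmatically, here is a minimal sketch, assuming a Python/Flask application and a hypothetical list of spam-created URL paths; on most sites you would do the equivalent in your CMS or web server configuration.

```python
# Minimal sketch: serve 410 Gone for pages that spammers created on your site,
# so any links pointing at them stop resolving. Flask is assumed; the spam
# paths and catch-all route below are hypothetical examples.
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical list of spam-created paths identified from your logs or CMS.
SPAM_PATHS = {
    "user-pages/cheap-widgets",
    "profiles/spammy-casino-links",
}

@app.route("/<path:page>")
def serve_page(page):
    if page in SPAM_PATHS:
        # 410 tells crawlers the page was removed deliberately; 404 works too.
        abort(410)
    return f"Real content for {page}"

if __name__ == "__main__":
    app.run()
```

A 410 (Gone) is slightly stronger than a 404 here, since it signals to crawlers that the page was removed on purpose and is not coming back.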
So once there is an opening on your site that spammers can exploit, they will exploit it, and others will either find it on their own or learn about it through the networks where spammers share this information.