First, no article complaining about Google cutting off the traffic to Site X or Site Y is complete unless it fills in X and Y with specific examples. I want to know exactly what we're talking about. Otherwise I tend to surmise, perhaps unfairly, that articles like this are astroturf: They're written by spammers or spammer apologists who are angry that their highly profitable link farms have been effectively cut off. After all, if I find myself reading a random article on this topic, it's probably because a) it's ubiquitous or b) my subconscious has been lured to it by a well-crafted linkbaity headline. And who is good at making posts ubiquitous and tempting? Spammers!
So knock it off with abstractions like "sheep" and show me some real examples, so I can empathize properly.
Second, the reason why Google doesn't take a "rule of law" approach is easy to see, isn't it? Nobody on earth is better at gaming complex sets of rules than programmers. Create a clear specification that defines "spam" and the spammers will promptly craft a ton of elegant and technically legal ads and then flood Google with them.
Why doesn't this happen with the "rule of law" in a legal setting? Well, it does: People play fast and loose with laws all the time. But the real secret to the rule of law is that the ultimate arbiter of that law is... groups of actual humans called "juries" and "courts", who are empowered to use common sense to throw the book at those who get too creative with the edge cases. Real-world law has intentionally fuzzy edges. (Of course, when the fuzz spreads into the center, you've got a problem.)
Proposing to turn Google's ranking system into something resembling a legal system -- which is perhaps equivalent to taking our existing legal system and applying more of it to Google, i.e. legislating certain elements of Google's design -- is an interesting idea, and it may eventually be tried, but it's not obvious that it will improve anything, or even change anything. And it's going to be hard to discuss how well it might work unless we use... real examples!
I had a similar thought to your first point when reading this. The author is claiming that webmasters have to build two sites rather than one, thinning resources and reducing quality. This is because if Google determines one is spam, they have to fall back to the other site.
But if the sites are similar and created purely for redundancy, I would assume the second site is going to get nailed shortly as well. I mean, the guy says to build two of the same site in different places so you have a backup. You're building two mediocre targets, either hoping that the other one isn't noticed by Google (which probably means users aren't finding it either) or that the two are somehow significantly different.
I agree; given its lack of detail, it sounds like someone doing something sketchy and complaining that it no longer works. I'm not aware of any cases in which non-specious websites have gotten nailed and had no recourse. The author didn't provide any. Has anyone seen this in the wild?
Well, that's very interesting. Although this article is merely a pro-journalism example of the very same "consider the case of sites X and Y, where X and Y cannot be named" phenomenon, so it still isn't clear to me how large the risk is.
And I'm with justindz, who doesn't understand the proposed defense. If a competitor decides to cross the ethical line and take your site down by "generously" having spambots put up links to your site, thereby causing Google to conclude that you're a spammer and shut down your PageRank... can't they do this to all of your sites at once, not just one? My understanding -- correct me if I'm wrong -- is that the cost or supply of spambots is not the limiting factor here.
I own www.42topics.com. I am investing six months to build this site. It is hosted at webfaction.com. Now suppose there are a few spam sites hosted at webfaction.com, on my IP. Tomorrow Google decides to do IP-based filtering on spam sites, and my site is toast. I lose significant traffic, and my only source of income.
Compare this scenario to Joe Spammer, who has a hundred borderline spam sites. Tomorrow Google decides that 10 are spam and torches them. He creates 10 new sites, and is back on track. Guess who is hurt more by constantly changing guidelines?
Tomorrow Google decides to do IP-based filtering on spam sites...
... and you switch web hosts? In, like, a day?
... or buy a static IP?
... and/or you and every other customer of webfaction.com threatens to switch, whereupon webfaction.com, faced with wholesale erosion of their business, gets medieval on the "spam sites" and Google forgives them?
I appreciate that this would be a major expense and hassle if it happened very often. But (a) this is still a hypothetical example: does Google actually do this? And (b) if this keeps on happening to you, you need to re-evaluate your methodology for choosing web hosts. :)
I had the same feeling when reading the article. Then I noticed the other two tales on the site: about nofollow and captchas both being bad for the web. That's when I knew for sure this was written by a spammer.
My definition of spammer might be a bit broad. I'm just really tired of the crap pages in a lot of Google search results, so I don't have any sympathy for people complaining when their site with nothing other than ads and SEO tricks gets booted from Google Search.
I think this article is a bad joke. It's laughable how some webmasters think that Google simply has to include their spammy sites in its rankings, and if it doesn't, then Google is bad, the internet is bad, and they now just have to build 100 other spammy sites and try to push them into search engines.