Surprise, surprise - a Google backlash is going on. At GavinsBlog.com some really stupid paranoid thinking about Google can be found.
Is the problem, quite simply, that Google works - and that technology that works so well at information gathering is scary? Efficient search clearly has privacy implications, but surely the real concern here is the publication of data, not the fact that the public data is utilized.
Some people are shifting to AllTheWeb in response. That's just great - AllTheWeb was recently purchased by the most evil company in search, Overture - the paid listing company, whose business is to make ads and content indistinguishable.
Gavin makes reference to Google Watch, a watchdog site. But the criticism there is inept and largely unrelated to Google, which makes it hard to see why Google especially should suffer it:
Again, webserver tracking is brought up. A legitimate issue, but hardly Google-specific. The use of search terms in referral URLs means I know what you're looking for when you reach my site - again, hardly a Google problem. Supposedly some guy who used to work for the NSA works for Google (Google Watch is here employing the fine tactic of guilt by association that I'm quite sure they would like Google and the government to refrain from). Did it occur to the whistleblowers that Google may need engineers with clearance to sell Google search to secured government intranets?
The most ridiculous criticism is that of PageRank as a monopolizing feature harming the openness of the internet. It's not that there isn't a problem; it's just that the problem is not the one being discussed.
Here then is the real problem: if you're looking for something, some text has to come first. That's how attention works, not how Google works. That means some ranking algorithm will always apply.
The mode of your search then plays an important role in whether or not the dominance of PageRank is a good thing. If your mode is 'search for something specific' - i.e. the increasingly popular 'Google as DNS' mode of search, where you know the content of your destination but not the exact address - then it doesn't matter what the ranking algorithm is, as long as it works. Then there is what you might call 'auction search' - where you are looking for something which has a large number of equally qualified providers, or to be more precise: a large number of providers you have no information about.
For that kind of search you might say that you want to stratify the usable results into equally qualified strata and then choose randomly among the results within each stratum.
You could accomplish this by adding a small random number to the PageRank and ordering by this new rank. You would still get a usable overall ranking, but it would be 'fair'. It remains to be seen whether a real rank difference of, say, 0.10 has qualitative meaning or not. Whether this adds any value for the user of a search is doubtful.
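The jittered ranking above is easy to sketch. The snippet below is a minimal illustration, not anything Google does: the site names and rank values are made up, and the 0.10 jitter width is the assumed threshold below which rank differences carry no qualitative meaning. Sites whose true ranks differ by less than the jitter can swap places from query to query, while clearly better sites still come out on top.

```python
import random

# Hypothetical (site, pagerank) pairs - illustrative data only.
results = [
    ("site-a.example", 8.0),
    ("site-b.example", 7.0),
    ("site-c.example", 7.0),
    ("site-d.example", 6.9),
]

JITTER = 0.10  # assumed width of a 'qualitatively equal' stratum


def fair_order(results, jitter=JITTER, rng=random):
    # Add a small random offset to each rank, then sort descending.
    # Within a jitter-sized stratum the order is effectively random,
    # so equally qualified sites share the traffic over many queries.
    return sorted(
        results,
        key=lambda pair: pair[1] + rng.uniform(0, jitter),
        reverse=True,
    )


ranking = fair_order(results)
```

Note that the rank-8 site always stays on top here (8.0 beats 7.0 even with maximal jitter), while the two rank-7 sites - and the 6.9 site at the stratum boundary - can trade places between runs, which is exactly the 'fair within a stratum' behaviour described above.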
Generally speaking, attention monopolizes. The complaints against Google are almost always 'supply-side'. The criticism leveled at Google is of the 'I have a rank of 7 - and these 50 sites with a better rank are in the way' kind. Well, a rank of 8 will - for most searchers - translate to a more relevant site. And what's more: having a site with a rank of 7 puts you in a much larger group than the sites with rank 8. So a fair distribution of hits among the rank-7 sites would not necessarily generate a lot of traffic for you. Your contribution would be drowned out by an overwhelming supply of information.

Posted by Claus at April 15, 2003 04:47 PM