There's a new UserAgent in my web server logs:
"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
This is not the regular Google crawl - that one announces itself with a plain Googlebot UserAgent, without the "Mozilla/5.0 (compatible; ...)" wrapper.
The requesting IP for the Mozilla-faking UA is assigned to Google, so it's not just somebody else borrowing Google's name. Is this a test of whether more servers are willing to serve content when the UserAgent looks browser-like, or could it be (conspiracy theory drumroll building) the Google browser?
The former is more likely, but since I don't discriminate against bots in any way, why am I being crawled by both?
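Checking that the requesting IP really belongs to Google is worth automating. A minimal sketch of the usual reverse-plus-forward DNS check (the function name and the injectable lookup parameters are my own invention, not anything from Google's documentation): resolve the IP to a hostname, require a googlebot.com or google.com name, then resolve that name back and demand the original IP.

```python
import socket


def is_real_googlebot(ip, reverse_lookup=None, forward_lookup=None):
    """Verify a claimed Googlebot IP via reverse + forward DNS.

    The idea: the PTR record for a genuine Googlebot IP points at a
    googlebot.com or google.com hostname, and that hostname resolves
    forward to the same IP. The lookups are injectable so the logic
    can be exercised without touching the network.
    """
    reverse_lookup = reverse_lookup or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward_lookup = forward_lookup or socket.gethostbyname
    try:
        host = reverse_lookup(ip)
    except OSError:
        return False  # no PTR record at all - definitely not Google
    if not host.endswith((".googlebot.com", ".google.com")):
        return False  # somebody else's machine faking the UserAgent
    try:
        return forward_lookup(host) == ip
    except OSError:
        return False
```

The double lookup matters: reverse DNS alone can be spoofed by whoever controls the IP block's PTR records, but they can't also make googlebot.com resolve forward to their address.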
Incidentally, the Mozilla-impersonating bot is not as smart as its older brother - along with msnbot it has fallen into the Internet.
Even more incidentally, the big brother Googlebot was either fixed so it no longer crawls accessibility world, or accessibility world closed its doors on the Googlebot. Oh, wait - it was accessibility world that got fixed, by adding a robots.txt file.
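For the record, closing the door on one specific crawler while leaving it open for everyone else takes only a few lines of robots.txt. A hypothetical example (I don't know what accessibility world actually put in theirs):

```
# Shut out Googlebot entirely (hypothetical rules)
User-agent: Googlebot
Disallow: /

# Everyone else may crawl everything
User-agent: *
Disallow:
```

Well-behaved bots fetch /robots.txt before anything else and obey the most specific User-agent section that matches them.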