Much ado about Delicious robots
Bots arriving from Google, Ask.com, and MSN to sample pages on the bookmarking site Delicious hit a robots.txt block, as do Yahoo’s Slurp bots. It’s no big deal.
There is a little confusion about Delicious and its handling of visiting robots. Search Engine Journal cited a blogger who claimed Delicious blocks bots from the big search engines:
Colin Cochrane found this out the other day, saying that ‘This isn’t a simple robots.txt exclusion, but rather a 404 response that is now being served based on the requesting User-Agent.’
Look a little closer at the robots.txt file, and you see something different happening. The bots from the four big search sites have been disallowed from certain subdirectories at Delicious, not the bookmark pages themselves. The top line of the robots.txt file is a broad go-away to all bots, but from what we can tell from the robots.txt standard, the lines aimed at the four specific bots allow them to go anywhere on Delicious that has not been expressly disallowed to them.
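That precedence rule, where a group naming a specific bot overrides the catch-all `User-agent: *` group, can be checked with Python’s standard `urllib.robotparser`. The robots.txt content below is a simplified illustration of the pattern described above, not Delicious’s actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt in the shape the article describes: a blanket
# disallow for all bots, plus a specific group for one named bot that
# only blocks a single subdirectory.
robots_txt = """\
User-agent: *
Disallow: /

User-agent: Googlebot
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot matches its own group, so only /private/ is off-limits to it;
# the catch-all Disallow: / does not apply.
print(parser.can_fetch("Googlebot", "http://example.com/some/bookmark"))  # True
print(parser.can_fetch("Googlebot", "http://example.com/private/page"))   # False

# A bot with no specific group falls back to the catch-all and is blocked.
print(parser.can_fetch("RandomBot", "http://example.com/some/bookmark"))  # False
```

In other words, a named group is not additive to the `*` rules; it replaces them entirely for that bot, which is why the specifically listed crawlers can still reach the bookmark pages.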
Nothing to see here folks, move along, move along.