
Googlebot Wields New Crawl Tools


Getting your website into the voluminous BigTable of Google’s index starts with Google sending out pieces of software, known as spiders or crawlers and affectionately named “Googlebot,” to fetch your pages. Google Webmaster Central has some tips showing webmasters how they can learn more about Googlebot.

Google Improves Crawl Capability

October is a perfect month for discussing spiders and other things that crawl silently in the dark, their passage not disturbing the air, undetectable by the normal senses, yet their presence felt by webmasters everywhere, finding its way into their worried, dream-filled sleep…

But we digress. Vanessa Fox has posted some news about updates to Google Webmaster Central. That website contains several useful tools for making the indexing process a more rewarding one for site publishers.

While Fox has not yet attained the devoted following that Matt Cutts has, the continued updates at Webmaster Central should make her as sought after as her rockstar fellow Googler. Anything that helps webmasters make their sites easier for people to find on Google will get their attention.

The Googlebot represents the gateway to Google’s index. Fox explained some new features related to the Googlebot that should be helpful, the first one being Googlebot activity reports:

We show you the number of pages Googlebot’s crawled from your site per day, the number of kilobytes of data Googlebot’s downloaded per day, and the average time it took Googlebot to download pages. Webmaster tools show each of these for the last 90 days.


Some webmasters have had concerns about the speed with which their sites are indexed. Fox noted how Google’s goal “is to crawl as many pages from your site as we can on each visit without overwhelming your server’s bandwidth.”
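Webmasters who want a rough cross-check of the activity report’s numbers can approximate them from their own server logs. A minimal sketch, assuming an Apache-style “combined” log format and that Googlebot identifies itself in the user-agent string (the log layout and field positions here are assumptions, not anything Google publishes):

```python
import re
from collections import defaultdict

# One Apache "combined" log line looks roughly like:
# 66.249.66.1 - - [12/Oct/2006:06:25:24 +0000] "GET /a.html HTTP/1.1" 200 1024 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+):[^\]]+\] "[^"]*" \d{3} (?P<bytes>\d+|-)'
    r' "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_daily_stats(lines):
    """Return {day: (page_count, kilobytes)} for requests whose
    user-agent string mentions Googlebot."""
    stats = defaultdict(lambda: [0, 0])
    for line in lines:
        m = LOG_RE.match(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue
        stats[m.group("day")][0] += 1
        if m.group("bytes") != "-":
            stats[m.group("day")][1] += int(m.group("bytes"))
    # Convert bytes to kilobytes to match the report's units.
    return {day: (count, size / 1024) for day, (count, size) in stats.items()}
```

This only approximates what Webmaster Tools shows (logs record bytes sent, not Googlebot’s download timing), but it is a quick sanity check against the 90-day graphs.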

Webmasters can set the crawling speed Googlebot uses. If a webmaster feels a site can handle more visits and greater bandwidth demands, the speed can be set to Faster. To take the crawling back a notch, there is a Slower option.

“If you request a changed crawl rate, this change will last for 90 days,” Fox wrote. After 90 days the crawl rate returns to a setting of Normal, and the webmaster can move it back to Faster or Slower again as desired.

Google can also perform enhanced image search on a site. Opting in means tools like Google’s Image Labeler will associate a site’s images with relevant labels, making them easier to find in image search. Webmasters who change their minds after opting in can opt back out.

Fox also mentioned how a suggestion she received at SES San Jose has made it into Webmaster Tools:

He said that he generates his Sitemaps automatically and he’d like confirmation that the number he thinks he generated is the same number we received. We thought this was a great idea. Simply access the Sitemaps tab to see the number of URLs we found in each Sitemap you’ve submitted.
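A webmaster who generates Sitemaps automatically can count the URLs on their end before comparing against the number Webmaster Tools reports. A small sketch using Python’s standard library to count the <url> entries in a Sitemap file (the namespace is the standard sitemaps.org one; everything else here is illustrative):

```python
import xml.etree.ElementTree as ET

# The standard Sitemap protocol namespace from sitemaps.org.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def count_sitemap_urls(xml_text):
    """Count the <url> entries in a Sitemap document, so the total
    can be compared with the count shown on the Sitemaps tab."""
    root = ET.fromstring(xml_text)
    return len(root.findall("{%s}url" % SITEMAP_NS))
```

If the local count and the count on the Sitemaps tab disagree, that is a hint the generator and Google are not seeing the same file.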


Feedback and commentary on the new goodies can be posted at the Google Group for Webmaster Help.


David Utter is a staff writer for WebProNews covering technology and business.
