
Making The Googlebot Fetch With URL Submission

Help Google find your site quickly.


Is there some new content and/or a new site you want Google to notice sooner rather than later? Well, there’s an official Google utility for that.

With the Fetch as Googlebot URL submission tool, site owners can now directly request that Google send its web-crawling, indexing Googlebot to a submitted URL. While Google will usually find new pages and sites on its own, especially if they are backlinked, this method speeds up the process. In fact, according to the Google Webmaster Central Blog, URLs submitted with Fetch as Googlebot are usually crawled within a day.

There are obvious benefits to this technique, especially if Google's index contains out-of-date pages for your site and you'd like to see the content updated. Clearly, the same is true for new site launches as well: the sooner a site is in the Google index, the better. Quality backlinks and good content are still the key to gaining rank, but knowing your site will be indexed almost as soon as it's launched is a boon for site owners and SEOs alike.

The blog post details the steps for submitting a URL with Fetch as Googlebot, and the process is really quite simple. So much so, in fact, that it would be foolish not to take advantage of the option.

How to submit a URL
First, use Diagnostics > Fetch As Googlebot to fetch the URL you want to submit to Google. If the URL is successfully fetched you’ll see a new “Submit to index” link appear next to the fetched URL.

Once you click “Submit to index” you’ll see a dialog box that allows you to choose whether you want to submit only the one URL, or that URL and all its linked pages.

When submitting individual URLs, we have a maximum limit of 50 submissions per week; when submitting URLs with all linked pages, the limit is 10 submissions per month.

[Image: Fetch as Googlebot]

In other words, you can submit 50 individual pages per week, or 10 URLs with all their linked pages per month. The post goes on to say that if you want to submit content like images or video, you should use a Sitemap instead; Fetch as Googlebot is intended for content that appears in web search results, which is to say text.
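
As a rough sketch of how that Sitemap route can be automated, here's a short Python example. It assumes Google's documented Sitemap "ping" endpoint (http://www.google.com/ping?sitemap=...) and uses a placeholder sitemap URL; pinging only tells Google where an updated Sitemap lives, it does not guarantee crawling or indexing.

    # Rough sketch: tell Google that a Sitemap has been updated by requesting
    # the Sitemap "ping" endpoint. The sitemap URL below is a placeholder;
    # point it at the real location of your own Sitemap file.
    import urllib.parse
    import urllib.request

    sitemap_url = "http://www.example.com/sitemap.xml"  # placeholder
    ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(
        sitemap_url, safe="")

    with urllib.request.urlopen(ping_url) as response:
        # A 200 response only means Google received the ping; it does not
        # guarantee that every URL in the Sitemap will be crawled or indexed.
        print(response.status)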

Another update allows users to submit unverified URLs to the Googlebot as well. The difference is that, with verified submissions, the person submitting must confirm ownership of the site or URL being submitted; unverified submissions do not require the same proof. There’s even a link provided for these kinds of unverified submissions, which takes you to the Crawl URL page.

[Image: Fetch as Googlebot]

If you’re a committed site owner and you’re not taking advantage of these capabilities, you are only cheating yourself and your business.

The video that leads this post features Google’s Matt Cutts discussing how long it takes Googlebot to recrawl a page; it was posted in May 2010 on the Google Webmaster Help YouTube channel. While not specific, the answer for sites that frequently update content was “a few days.” Now, with Fetch as Googlebot, if it’s that important that your new content be indexed even more rapidly, well, there’s a utility for that.
