Get Your Sitemap Discovered Automatically With Your Robots.txt File

    April 12, 2007

At Search Engine Strategies New York it was announced that you can now have your sitemap discovered automatically by listing it in your Robots.txt file. It's simple to do; you just need to know the URL (web address) of your sitemap.

First, open your Robots.txt file on your server for editing, then add the following line to the end of the file (it can go anywhere, but the end is a good place), substituting your own sitemap's URL:

Sitemap: http://www.example.com/sitemap.xml

Save the Robots.txt file with the new sitemap line. There you go! Your whole file may look something like this:

User-agent: *
Disallow: /somefolder/
Disallow: /somethingelse/
Sitemap: http://www.example.com/sitemap.xml
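If you prefer to script the change rather than edit the file by hand, a minimal sketch along these lines would append the directive; the local path robots.txt and the example.com sitemap URL are stand-ins for your own:

# A minimal sketch: append the Sitemap directive to a local robots.txt file.
# Both the path and the sitemap URL below are placeholders for your own.
SITEMAP_LINE = "Sitemap: http://www.example.com/sitemap.xml\n"

with open("robots.txt", "a", encoding="utf-8") as robots_file:
    robots_file.write(SITEMAP_LINE)  # adds the line at the end of the file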

Search engines already fetch your Robots.txt file when they visit your domain, so on their next crawl they will automatically find your sitemap file.
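To double-check that crawlers will see it, you can fetch your own Robots.txt the same way they do. Here is a small sketch using Python's standard robots.txt parser (its site_maps() helper needs Python 3.8 or newer); the example.com address is a placeholder for your domain:

from urllib.robotparser import RobotFileParser

# Point the parser at your own robots.txt; example.com is a placeholder.
parser = RobotFileParser("http://www.example.com/robots.txt")
parser.read()  # fetches and parses the file, much as a crawler would

# site_maps() returns the Sitemap URLs it found, or None if there are none.
print(parser.site_maps())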

If you have a new site or domain, you will probably still want to submit the sitemap URL to the search engines directly. You can either submit it through their interfaces or use a ping (there is a rough code sketch of a ping after the list below).

Submit Sitemap to Google or Ping Google with your Sitemap

Submit Sitemap to Yahoo or Ping Yahoo with your Sitemap

Submit Sitemap to MSN (see their Info for Sitemaps) or Ping by hitting this address:
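For the ping option, here is a rough sketch of what the request looks like in code. The Google ping address shown was the one documented around this time; treat it and the example.com sitemap URL as assumptions to swap for the engine's current ping address and your own sitemap:

import urllib.parse
import urllib.request

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # placeholder for your sitemap

# Assumed ping endpoint; check the engine's documentation for its current address.
PING_ENDPOINT = "http://www.google.com/webmasters/sitemaps/ping?sitemap="

def ping_sitemap(sitemap_url):
    """Send a GET request telling the engine the sitemap has been updated."""
    request_url = PING_ENDPOINT + urllib.parse.quote(sitemap_url, safe="")
    with urllib.request.urlopen(request_url) as response:
        return response.getcode()  # 200 means the ping was received

print("Ping returned HTTP", ping_sitemap(SITEMAP_URL))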