Should SEOs Avoid Sitemaps?

    February 22, 2007

Rand Fishkin recently posted an interesting concept on his blog about sitemaps. For those of you keeping score at home, a sitemap is a document (typically XML) that sits on your server and helps search engine spiders crawl and index your site. Sounds great, right? Maybe… maybe not. Rand theorizes that these sitemaps may actually be bad for your SEO efforts.
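For anyone who hasn't seen one, here's what a minimal sitemap looks like under the sitemaps.org protocol (the URL and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- <loc> is the only required child element -->
    <loc>http://www.example.com/</loc>
    <!-- the rest are optional hints to the crawler -->
    <lastmod>2007-02-22</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

You save this on your server (commonly as sitemap.xml) and submit it through Google's Webmaster Central; only the `<loc>` element is required, while `<lastmod>`, `<changefreq>`, and `<priority>` are just hints.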


Consider this: what if content that would otherwise go unindexed is now indexed, but still suffers from whatever problems kept it out of the index in the first place? Is it a good idea for content with index-preventing problems to be indexed just because it was listed in the sitemap? And if it is indexed, how is the webmaster to know it had a problem?

So I suppose the question then becomes: once it's indexed, does it matter? Whether by natural crawl or sitemap-assisted crawl, the content is now indexed, so where is the downside? In other words, if your problem is no longer a problem… is that a problem?

Now, Rand isn't someone I'd categorize as a tin-foil-hat-wearing conspiracy theorist loon (well, not all of those anyway), and he has had his fair share of supportive comments on this issue from SEO luminaries like RustyBrick and David Naylor. Nonetheless, I still have a problem thinking of a sitemap as a negative thing. Unless there is some sort of value differentiation between naturally indexed and sitemap-assisted indexed content (which I doubt), the argument falls a bit flat, I think.

As such, I figured I'd just go right to the source and ask a Googler. So, I fired off a quick email to Vanessa Fox, Product Manager for Google's Webmaster Central. Though she's still overseas in Dublin after attending SES: London, she was able to weigh in on the issue, stating in no uncertain terms: "there's no difference in indexing based on how we found a page (either by links or by a Sitemap)."

Unless… these SEO folks are thinking about optimization beyond search engines. In that scenario, perhaps the motivation would be hunting down and addressing problem content and pages that were, for whatever reason, less than ideal or flawed. That is certainly noble and good, of course… but at the expense of having content indexed in the search engines? I really just can't get my mind around taking the SE out of SEO. Sure, some page may not be perfect, but can't you just tweak it however it needs tweaking after it's been indexed?

As for the issue of a sitemap possibly preventing webmasters from identifying problems, or making it more difficult, Vanessa didn't see that as a problem. She maintains that sitemaps "won't mask anything for a webmaster who is looking at potential site problems."

Further to this point she added the following:

"If the pages aren’t well-linked and have low PageRank, they are unlikely to rank well for queries, so that problem should also be obvious to the webmaster, even if the pages are indexed. I always encourage webmasters to continue doing all of the things we recommend in our guidelines (ensuring the site is crawlable, well linked, has quality, unique content, etc.). A Sitemap doesn’t replace all of those things. It simply is an additional tool for webmasters."

Sitemaps are supposed to make things easier for everybody – search engines and webmasters. As Vanessa observes: “Most site owners aren’t experts on optimization – they simply want their pages indexed. Sitemaps help all site owners — from the very small mom and pop to the very large company — tell us about the pages of their site and provide input to us.”

Vanessa added that a sitemap might actually help a webmaster find problems with their site in some cases. She points out that: “If the site has pages with errors that prevent us from crawling, the pages won’t appear in the index and those pages will be listed in the Crawl Errors section of webmaster tools. We may not have attempted to crawl some of these pages if they weren’t in the Sitemap, so in this case, a webmaster might be alerted to problems not otherwise known.”

I asked Vanessa if future enhancements to sitemaps might include additional means of identifying and reporting issues to webmasters. She said she liked the idea and they were “always looking to provide as much information as possible to webmasters about potential issues”.

In the end, Vanessa said: “I don’t see any reason for webmasters to avoid submitting Sitemaps. It enables them to give us a comprehensive view of the site and provide input to us about the site. And more information is always better.”

I can see the logic, to some extent, behind Rand’s post (namely, wanting to be aware of problems), and I am certainly not an expert SEO (disclaimer for Diggers). But I don’t think the sitemap is going to somehow keep you from identifying those problems. Maybe you have to look for problems beyond “is the page indexed (yes/no),” but I just don’t buy into the concept of the sitemap as a negative for SEO in all but the wildest of exceptions.
