Help Google Crawl Your Site More Effectively, But Use Caution

Changes to Webmaster Tools, Plus Matt Cutts on Selling From Multiple Domains


Google has introduced some changes to Webmaster Tools – in particular, handling of URLs with parameters.

“URL Parameters helps you control which URLs on your site should be crawled by Googlebot, depending on the parameters that appear in these URLs,” explains Kamila Primke, Software Engineer with the Google Webmaster Tools Team. “This functionality provides a simple way to prevent crawling duplicate content on your site. Now, your site can be crawled more effectively, reducing your bandwidth usage and likely allowing more unique content from your site to be indexed. If you suspect that Googlebot’s crawl coverage of the content on your site could be improved, using this feature can be a good idea. But with great power comes great responsibility! You should only use this feature if you’re sure about the behavior of URL parameters on your site. Otherwise you might mistakenly prevent some URLs from being crawled, making their content no longer accessible to Googlebot.”

Do you use URL parameters in Webmaster Tools? What do you think of the changes? Comment here.

Google Webmaster Tools - URL Parameter page

Google is now letting users describe the behavior of parameters. For example, you can let Google know if a parameter changes the actual content of the page.

“If the parameter doesn’t affect the page’s content then your work is done; Googlebot will choose URLs with a representative value of this parameter and will crawl the URLs with this value,” says Primke. “Since the parameter doesn’t change the content, any value chosen is equally good. However, if the parameter does change the content of a page, you can now assign one of four possible ways for Google to crawl URLs with this parameter.”
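The "representative value" idea Primke describes can be pictured with a short sketch: if a parameter (say, a session ID) doesn't change the page content, every variant of a URL can be collapsed to one canonical form. This is a minimal illustration in Python, not Google's actual implementation, and the parameter name `sessionid` is an assumption for the example:

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

def canonicalize(url, ignored_params):
    """Drop query parameters that don't affect page content,
    so all variants collapse to one representative URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in ignored_params]
    return urlunparse(parts._replace(query=urlencode(kept)))

# Both variants reduce to the same representative URL:
a = canonicalize("http://example.com/item?id=7&sessionid=abc", {"sessionid"})
b = canonicalize("http://example.com/item?id=7&sessionid=xyz", {"sessionid"})
# a == b == "http://example.com/item?id=7"
```

Crawling only the representative URL is what saves bandwidth and frees crawl budget for unique pages.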

Those would be: “Let Googlebot decide,” “Every URL,” “Only URLs with value=x,” or “No URLs.”

Users can tell Google if a parameter sorts, paginates, determines content, or other things that it might do. For each parameter, Google will also “try” to show you a sample of example URLs from your site that it has already crawled that contain a given parameter.

To bring up the use of caution again, Primke warns about the responsibilities that come with using the No URLs option. “This option is the most restrictive and, for any given URL, takes precedence over settings of other parameters in that URL. This means that if the URL contains a parameter that is set to the ‘No URLs’ option, this URL will never be crawled, even if other parameters in the URL are set to ‘Every URL.’ You should be careful when using this option. The second most restrictive setting is ‘Only URLs with value=x.'”
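The precedence rule she describes can be modeled directly: if any parameter in a URL is set to "No URLs," the URL is never crawled, no matter what the other parameters are set to. Below is a hedged sketch of that decision logic in Python; the setting names mirror the article, but the exact way Google combines them internally is an assumption for illustration:

```python
from urllib.parse import urlparse, parse_qsl

def should_crawl(url, settings):
    """Decide whether to crawl a URL from per-parameter settings.
    settings maps a parameter name to one of:
      "every", "no", or ("only", value).
    'No URLs' is most restrictive and takes precedence over everything;
    'Only URLs with value=x' is checked next."""
    params = parse_qsl(urlparse(url).query)
    # "No URLs" on any single parameter blocks the whole URL.
    if any(settings.get(k) == "no" for k, _ in params):
        return False
    # "Only URLs with value=x": crawl only when the value matches.
    for k, v in params:
        rule = settings.get(k)
        if isinstance(rule, tuple) and rule[0] == "only" and v != rule[1]:
            return False
    return True

settings = {"page": "every", "sessionid": "no", "sort": ("only", "price")}
# should_crawl("http://example.com/list?page=2&sessionid=abc", settings) -> False
# should_crawl("http://example.com/list?page=2&sort=price", settings) -> True
```

Note how `page` being set to "Every URL" does not rescue a URL that also carries a "No URLs" parameter, which is exactly the caution Primke raises.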

She runs through some examples in this blog post, and there is more related information in Google’s Webmaster Help forum.

Webmasters & SEOs: here’s *tons* of great info on our improved tool to handle url parameters better: http://t.co/TtBs8tp

Be Careful About Selling the Same Stuff From Multiple Domains

As long as we’re discussing webmaster issues for Google, I’ll also point to the latest Webmaster Help video from Matt Cutts, who discusses selling products on multiple domains. The user question he sought to answer was:

“I manage 3 websites that sell the same products across 3 domains. Each site has a different selling approach, price structure, target audience, etc. Does Google see this as spammy or black hat?”

Cutts says, “On one hand, if the domains are radically different – layout, different selling approach, different structure – like, essentially completely different, and especially the fact that you said it’s only 3 domains, that might not be so bad. Clearly if it were 300 domains or 3,000 domains – you can quickly get to a fairly large number of domains that can be crowding up the search results and creating a bad user experience…by the time you get to a relatively medium-sized number of sites.”

“The thing that was interesting about the question is that you said it’s the same products, as in identical. So it’s a little weird if you’re selling identical products across 3 domains. If you were selling like men’s sweaters on one, and women’s sweaters on another, and shoes on a third….I’ve said before, there’s no problem with having different domains for each product, and a small number of domains (2, 3, or 4) for very normally separable reasons can make perfect sense, but it is a little strange to sell the same products, so if they’re really identical, that starts to look a little bit strange – especially if you start to get more than 3 domains.”

“Definitely, I have found that if you have one domain, you’ve got the time to build it up – to build the reputation for that domain…in my experience, when someone has 50 or 100 domains, they tend not to put as much work – as much love into each individual domain, and whether they intend to or not, that tends to show after a while. People have the temptation to auto-generate content or they just try to syndicate a bunch of feeds, and then you land on one domain vs. another domain, and it really looks incredibly cookie cutter – comparing the two domains, and that’s when users start to complain.”

Do you think Google takes the right approach to sites selling products from multiple domains? Share your thoughts here.

  • http://twitter.com/digideth B. Moore

So would this be where I need to put SIDs, or should that be handled on the server with .htaccess?

    • http://neotericuk.co.uk Neotericuk

This is a really good article… I am doing the same thing for my client, and after reading this article I am happy that I am doing the right thing, as Matt Cutts says we can create different website URLs for different products.

  • http://www.solidamerica.com Gino

Can you ask the gentleman how to use the crawl tools concerning the FrontPage ” /_vti_cnf/ ” parameters? There have been reports of missing titles on many of these pages, which shouldn’t be crawled because they are generated by the editor program and are not webpages good for users anyway.
How do you tell the blocker not to consider those pages a valid element on a website? Let me know if you can. Thank you.

  • http://aplawrence.com Anthony Lawrence

    I do use url parameters and find that Google does what Google will do anyway.

At one time I had “?ref=foo” on some links I wanted to track. Those were removed years ago AND I told Google WMT to ignore “ref=”, but they still show up as “Duplicate Meta Descriptions”.

For a bunch of supposedly bright people, WMT is amazingly flawed.

    So, yes, I tell Google what to ignore. I just doubt that it really helps.

  • http://www.ajwilliamssolutioninc.com AJ Williams Solution, Inc.

    Do you think Google takes the right approach to sites selling products from multiple domains?

I agree with Mr. Lawrence, Google does what Google wants to do, period.

I think there are bigger problems than selling products from multiple domains – one of the biggest problems to me is kids accessing porn websites.

  • http://www.wsdbiz.co.il/ Aviran

Will use it to make Google crawl my site more and update the cache and inside webpages daily, instead of roughly weekly updates that cause the info/news to become irrelevant.

  • http://www.the-system.org The System

Wow, so mini sites are OK with Google, the caveat being niche focus.

We run a lot of insurance websites on niche-specific domains, and concentrate solely on the subject at hand – one for each product, with great success.
We realised a long time ago that it wasn’t good enough to have a siloised site structure to rank top – you had to have a compartmentalised product structure as well if you want to be the voice and authority in a niche. Even more so now that social signals are starting to count.
This means multiple domains for different keywords, and since Panda we have seen our own sites, our competitors’ and many others who’ve adopted this approach be highly favoured in the Google search results.

    Good to hear Cutts being honest for once.

  • http://ukrbiz.info Viktor

I was surprised to see this option on my web site, Ukrainian Business Directory, but once I looked at it closely, I think this is a great feature.

  • Brad

It is sometimes impossible not to have websites with similar material.
I updated a site from GoLive to Dreamweaver with different domains.
The unfortunate thing is, the old site still brings in most of my business and I cannot afford to bring it down; the older site also has many important links from online directories etc., which have been grandfathered, and I cannot afford to enlist the new site with them.

Too bad Google does not have a tool to map all of the links and pages to my new site so I do not lose any goodwill, so to speak. Or maybe Google does and I am not aware of it.

  • http://all.at/brofarops/ dr. Robert

Yes, you should define what you want shown … Many back offices get crawled – and specialized customer services. Put enough to get you found and to show that you offer more … Let the customer click, not the bot – with a “nofollow” you can easily protect or define what a bot or spider does. From a search engine standpoint, they are phone directories … And nobody likes to carry around a Big Fat Yellow Book, and theirs is HUGE. If ACE Hardware put in every little plumbing : electrical : nuts & bolts : hammer : appliance – how many listings would they have? Each of these represents another URL … Let social mediums like Facebook, Twitter, and so on advertise our specials URLs, and search engines advertise our primary URLs … We as webmasters / web owners blog and use social media to promote. For years we simply put every URL on the engines, but now we have other media forms. We need to promote our primary URLs as far as OUR BRANDS, and the primary URLs ( sitemaps & index services ) on SEO …

  • http://www.aktivtek.no Piotr

Let’s hope those tools will be used as the description says. Google likes to check whether websites are trying to get good rankings on certain keywords. This is the next portion of data Google can analyse.

  • http://www.get-free-facebook-credits.info/ Melody@Free Facebook Credits

    thanks for the great post! :)

  • John

    So sick of the Google ass-kissing I could absolutely scream. Just let me search for the information I want and let it go at that. What really makes me sick is big G doesn’t know the difference between a site using www or not, so counts it as duplicate content and penalizes the same freaking site for being duplicate. That is so anal. How much of this crap are people going to take?

  • http://www.accenttablesonline.com Steve Sweet

    Google does a great job. But I have no idea how I went from a page rank of 2 to nothing over night. I always am link building.

  • http://aidyspoetry.net AIDY

    I am nervous messing around with these new settings. Google has us all hopping from one foot to the next! Google+ now this? #sos

  • http://www.ucontext.com/ Gary

    I find it “weird” and “a little strange” that Mr. Cutts thinks selling the same products on multiple, audience-targeted sites is somehow abnormal. Knowing your audience and selling to that audience in a manner that specifically speaks to their state of mind is a pillar of marketing theory 101. Take for instance something simple like a flashlight. Off the top of my head I can think of several different audiences… law enforcement, campers/hikers, home owners, hunting/fishing, truck drivers, boaters, plumbers, EMT/paramedics, home inspectors…

  • Voltara The Great

    Oh yeah, the Ol’ Ukrainian Business Directory. What a resource it is. Perfect for finding prostitutes and z grade spammers

  • http://www.seonorthamerica.com Tom Aikins

    Good explanation by Matt in regard to multiple domains. After reading about the URL tool it seems like I’m best off by leaving well enough alone at this point.

  • http://www.greytip.com Meera

Thanks for the info from Matt. It would certainly clear up webmasters’ doubts on multiple domains, and I hope each domain is created with a purpose.

  • http://masterofmister.blogspot.com tanra

thanks, very helpful

  • http://www.hedgehogdigital.co.uk/ SEO Bedford

Well, I think it is a bit stupid to sell the same products through different domains; why not put the effort of managing 3 domains into only one and sell more products that could earn you more money?

  • http://www.weldons.ie Garret

    thank you, very helpful tips

  • http://www.rentsoon.com Andy Weiser

    thanks for the great post……….

  • http://www.alejandroshotelcusco.com/ hotelsincusco8

    We are a family dedicated to serving the family Who has traveled all around the world, knows that, far from home what we need to find is the warm

  • http://www.ekhichdi.com Tina

Wonderful article, but not sure if the information was enough. Guess we need more help on this.

  • http://www.leprechaun-software.com/ Leprechaun Salon Software

Hey,

Thanks for providing such valuable information. It helped me a lot with my SEO job.

    Gabrielle Smyth

  • http://www.expertmagentodevelopers.com magento developers US

Has this been launched??? I can’t see these options in my webmaster account :(

    • http://sgbizness.com junnydc

      Hi Magento,
      Yup this is already implemented in the webmaster toolkit.
