Google Panda Update: The Solution for Recovery?

HubPages may be on to something



Many sites are still wondering how they can come back from being hit by the Google Panda update. Google has certainly stressed quality, and victims of the update have been striving to improve it, but for the most part they have had little luck boosting their rankings.

Have you been able to recover any search traffic after being hit by the Panda update? Let us know.

When we talked to Dani Horowitz of DaniWeb, she told us about some other things she was doing that seemed to be helping content rank better, but it was hardly a full recovery in search referrals.

An article at WSJ.com covers HubPages, one of the victims we’ve written about a handful of times. CEO Paul Edmondson claims that the use of subdomains is helping its content work its way back up in Google – something he stumbled upon by accident, but also something Google has talked about in the past.

The article quotes him as saying that he’s seen “early evidence” that dividing the site into thousands of subdomains may help it “lift the Google Panda death grip.” Amir Efrati reports:

In June, a top Google search engineer, Matt Cutts, wrote to Edmondson that he might want to try subdomains, among other things.

The HubPages subdomain testing began in late June and already has shown positive results. Edmondson’s own articles on HubPages, which saw a 50% drop in page views after Google’s Panda updates, have returned to pre-Panda levels in the first three weeks since he activated subdomains for himself and several other authors. The other authors saw significant, if not full, recoveries of Web traffic.

The piece also points to a blog post Cutts wrote all the way back in 2007 about subdomains. In it, Cutts wrote, “A subdomain can be useful to separate out content that is completely different. Google uses subdomains for distinct products such as news.google.com or maps.google.com, for example.”

HubPages is rolling out subdomains for all authors, which, in theory, should tie the site’s performance to the quality of each author’s output. This is also interesting given that Google recently launched authorship markup, putting more emphasis on authors in search results.
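HubPages hasn’t published the mechanics, but the general idea of per-author subdomains can be sketched in a few lines. This is an illustrative assumption, not the site’s actual implementation; the slugify rule and the URL shape are guesses:

```python
import re

def slugify(author_name):
    """Reduce an author name to a DNS-safe label: lowercase it,
    then collapse runs of non-alphanumeric characters into hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", author_name.lower()).strip("-")

def author_url(author_name, path, root_domain="hubpages.com"):
    """Build a per-author subdomain URL for a piece of content.
    The root_domain default is only for illustration."""
    return "http://{0}.{1}{2}".format(slugify(author_name), root_domain, path)

# e.g. author_url("Paul Edmondson", "/hub/example-article")
# → "http://paul-edmondson.hubpages.com/hub/example-article"
```

The point of a scheme like this is that each author’s content lives on its own host, so quality signals, good or bad, can accrue per author rather than site-wide.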

When that was launched, Google said in the Webmaster Central Help Center, “When Google has information about who wrote a piece of content on the web, we may look at it as a signal to help us determine the relevance of that page to a user’s query. This is just one of many signals Google may use to determine a page’s relevance and ranking, though, and we’re constantly tweaking and improving our algorithm to improve overall search quality.”

It may be a little early to jump to the conclusion that subdomains are the silver bullet leading to a full Panda recovery, but for those sites with a mix of great quality and poor quality content, this could very well help at least the great stuff rise. It will be interesting to see how HubPages performs over time, once the new structure has been live for a while.

Google’s statement on the matter (as reported by Barry Schwartz) is: “Subdomains can be useful to separate out content that is completely different from the rest of a site — for example, on domains such as wordpress.com. However, site owners should not expect that simply adding a new subdomain on a site will trigger a boost in ranking.”

To me, it sounds like this: if your entire site was hit by the Panda update because some of its content wasn’t up to snuff in the eyes of Google, but other content is, you may want to consider subdomains, at least for the content Google doesn’t like – to “separate it out”. You’ll have to do some content evaluation.
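A minimal sketch of that separation, assuming the content evaluation has already been done and each page carries a quality flag (the “archive” subdomain name and the flag itself are hypothetical, not anything Google or HubPages prescribes):

```python
def canonical_host(path, is_flagged, root_domain="example.com"):
    """Decide which host a page should live on: content flagged as
    thin or low-quality moves to a dedicated subdomain, and everything
    else stays on www."""
    subdomain = "archive" if is_flagged else "www"
    return "http://{0}.{1}{2}".format(subdomain, root_domain, path)

# canonical_host("/widgets", is_flagged=True)
# → "http://archive.example.com/widgets"
```

In practice a move like this would also need permanent (301) redirects from the old URLs so existing links and bookmarks keep working.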

Edmondson’s concept of doing it by author actually makes a great deal of sense. It makes the authors accountable for their own content, without dragging down those who have provided quality content (again, in theory). Not everybody hit by Panda is a “content farm” (or whatever name you want to use) though. For many, it won’t be so much about who’s writing content.

Content creators will still do well to consider Google’s list of questions and focus on creating content that is actually good. In case you need a recap on those questions, they are as follows:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Would you be comfortable giving your credit card information to this site?
  • Does this article have spelling, stylistic, or factual errors?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?
  • Does the article describe both sides of a story?
  • Is the site a recognized authority on its topic?
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
  • Was the article edited well, or does it appear sloppy or hastily produced?
  • For a health related query, would you trust information from this site?
  • Would you recognize this site as an authoritative source when mentioned by name?
  • Does this article provide a complete or comprehensive description of the topic?
  • Does this article contain insightful analysis or interesting information that is beyond obvious?
  • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
  • Does this article have an excessive amount of ads that distract from or interfere with the main content?
  • Would you expect to see this article in a printed magazine, encyclopedia or book?
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
  • Are the pages produced with great care and attention to detail vs. less attention to detail?
  • Would users complain when they see pages from this site?

Those are, by the way, “questions that one could use to assess the ‘quality’ of a page or an article,” according to the company.
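Google’s questions don’t come with a scoring rubric, but they lend themselves to a simple self-audit. The sketch below is one arbitrary way to tally them; the question subset and the equal weighting are my assumptions, not Google’s:

```python
# Each entry pairs a question (paraphrased from Google's list) with
# whether a "yes" answer points toward quality. Some questions are
# phrased so that "yes" is bad (duplicate articles, excessive ads).
CHECKLIST = [
    ("Would you trust the information presented in this article?", True),
    ("Does the site have duplicate, overlapping, or redundant articles?", False),
    ("Does this article have spelling, stylistic, or factual errors?", False),
    ("Does the article provide original content, reporting, or analysis?", True),
    ("Does this article have an excessive amount of ads?", False),
]

def audit_score(answers):
    """answers: one boolean per checklist question, in order.
    Returns the fraction of answers that point toward quality."""
    good = sum(1 for ans, (_, yes_is_good) in zip(answers, CHECKLIST)
               if ans == yes_is_good)
    return good / float(len(CHECKLIST))

# A trustworthy, original page with no duplication, errors or ad overload:
# audit_score([True, False, False, True, False]) → 1.0
```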

What do you think of the subdomain theory? Tell us in the comments.

  • Nelson

    I have run a number of community level domains that are all .net and have information about the communities that the sites serve. Just a week ago our site AUPARK.net was in the top 5 in searches.

    Now Google will not even bring the page up. Type in AUPARK. Google now goes out of its way to not find our domain. Heck, it even finds a domain that is no longer in service yet doesn’t find a TLD. The even stupider part is that I am a Google AdSense publisher, and we offer their advertisers a chance to get their ad displayed in the community they actually intend, and we give the community relevant ads and a platform for local links.

    So, Google, this is a total failure on the search results and on the ability to serve your AdSense customers and publishers; the only way you could fail more is to just not return a result.

    • Biff

      I’m assuming google dropped your rank because it looks like a keyword link spam page rather than…you know, a website.

      I admit…I’m amused by your hipster design choice to make it look like a geocities page. Nice touch.

      • http://www.webpronews.com/ Chris Crum

        It seems that the design factor is very much an important consideration, if for nothing else than the trust factor (see Google’s list of questions).

  • http://www.pandacode.com/ Stefan

    What I found is that after the Panda update Google now looks closely at how real visitors behave around your site.
    They can track this data through all Google Analytics accounts (shared data), the Google toolbar, Chrome and many other tracking mechanisms.

    I have found the bounce rate and the number of pages visited to be a very strong indicator of quality in the eyes of Google.
    When I improved the usability on my sites and my web stats showed visitors staying longer – my rankings recovered.

    • Mullah Omar

      Is that part of the $67 book you are selling on the link?

    • http://www.hub-uk.com David Jenkins

      Pay $67 and still remain in the dark . . . but $67 out of pocket.

      Better to spend the $67 on a good bottle of wine and cheer yourself up.

    • http://www.patantconsult.com/ Carla Lendor

      So what happens if you have no Google Analytics? I’m amazed at the kinds of rabbits people pull out of the hat.

  • Allen Graves

    So what type of subdomains are they going to roll out? Subdomains by author?


    …or by category?


    Doing it by author wouldn’t necessarily help because a lot of authors write for multiple niches. Even if there are only two or three different niches for an author, it may be more confusing to the poor little Panda than anything else.

    If you do it by category, then you are not really following Google’s latest author related suggestions.

    I wonder what ole Paul is gonna do….by category, but adding all the microdata and href tags at the same time?

    Any ideas?

    • http://www.webpronews.com/ Chris Crum

      From my understanding, it is by author. Not sure how this will affect those writing for multiple categories. Will definitely be an interesting experiment to keep an eye on.

      • Allen Graves

        True – plus, it really boils down to the content itself. Is it good enough to be in the top search results? That’s the end-goal factor.

  • http://www.dry-air.com Chris Leonetti

    I feel we’ve been hit hard, as my new site with all new content went up after this change. I barely get the leads I did with my older site. I am also being charged $500 a month for Google AdWords, which is beginning to raise a few serious questions as well – I doubt that being charged equally every month for the same number of clicks is real or factual any longer, especially with the lack of leads. I am having our site analytics looked at, as well as the new content. I’ve also read that using all of these other domains as a “back door” into your site may not work either… there is too much convoluted information out there – it is difficult to “trust” any company or anything you read when you have the statistics in your hands!

  • http://www.hub-uk.com David Jenkins

    If sub-domains are the way forward it seems a very backward step to me. There should be one domain which everyone can recognise and that should be it. Engineering a site onto good and bad domains does not seem like a logical solution when looking at improving the quality of sites . . . here’s our good stuff and over there is our rubbish.

    If it is necessary to have a sub-domain for rubbish content why have the content at all?

    Just another red herring for everyone to go out and chase their tails?

    How can anyone come up with this as the “repair” if they still can’t be certain what is “broken”?

    • http://www.webpronews.com/ Chris Crum

      Nobody said it wouldn’t be complicated. Evaluation of content is necessary. Like you said, why have the content at all if it’s “rubbish,” but some sites may also feel that some of their content is getting flagged by Google, even though they might still have some use for it.

  • http://www.platinumlynx.net Chas

    After you’ve filled the coffers of your SEO ‘expert’ to move-up in the Google Rankings, they will release Adnap and you will have to start all over again.

    • http://www.hub-uk.com David Jenkins

      Do you think Google has secretly been buying up SEO businesses ready to make a killing?

      • http://www.platinumlynx.net Chas

        Actually, I think that is too small a fish to fry, as far as Google is concerned. That might be something smaller search engines actively pursue. Google is more interested in tying all the knots in its web, such as integrating +1 with AdSense and AdWords, putting a dent in social media marketing, and possibly cornering streaming video by going after Vimeo or other such entities. They are more interested in crushing the competition and catching a few whales in their net than sifting through all the minnows.

  • http://www.tden.com King Ralph

    Google’s algorithm can’t tell good content from bad. I have done Google searches where some first-page listings linked to websites that were obviously written by an article spinner, as they were incomprehensible to read. Some others came up with phony security alerts that undoubtedly lead to the download of a virus. All Google can do is count links. Google talks a good game, but so does a fool. It’s unfortunate that no real competition has arrived to start putting the screws to Google and their obnoxious search monopoly.

  • http://www.catalog-homes.com.ua Volodymyr

    On my first project, the site’s Google rank is not displayed. I need help increasing at least the Yandex TIC. Thank you in advance!

  • http://www.hub-uk.com David Jenkins

    Google search results are currently flawed. As long as that is the case no-one can come to any conclusions as to what might or might not work.

    Change something now and it might get to page 1 of results but for bad reasons and then if “normal service” ever resumes it will just disappear. All that work for nothing.

    I can quote a search example to prove results are flawed if anyone is interested.

  • http://www.thewordbay.com Mark – The Word Bay Guy

    Oh no! Another Panda theory :) Well, no less crazy than some of the theories out there!

    Actually, it sort of makes sense – I doubt it is the “solution” to Panda, but as you point out, on a site like HubPages it may serve to separate poor authors (spammers) from the better-quality stuff.

    So wait, (dons black hat) that means if domain-level “penalties” exist, then sub-domains might be a way of avoiding these…

    Actually, sub-domains have been used in very spammy ways in the past, so I am not sure how that fits in with this idea…

  • http://www.lintang.kilu.de sodrunlintang

    the best

  • http://www.ibizdaily.com Anthony

    I think this stinks because for a long time Google said to not use sub-domains. I never did and I have seen competitor sites pass me with sub-domains… the bottom line is that they can do whatever they want, which makes it really hard for small sites and small webmasters to keep up.

  • http://how-to-internet-marketing-articles-vi.blogspot.com/ Ramiro Rodriguez

    Hey Chris,

    First, I have to say that I usually delete emails that are as long as yours, but it’s an important topic.

    Subdomains are an excellent idea, as it not only weeds out the bad stuff, it’ll also be easier to get rid of people who consistently violate HubPages’ policy.

    I am a HubPages author and I make sure that I don’t tweak or otherwise re-print an article that’s already on HubPages as that is an important article marketing tool for me.

    Great piece. Keep up the hard work. :-)

    • http://www.webpronews.com/ Chris Crum

      Thanks Ramiro. Nice to hear the perspective of a HubPages author.

  • http://nicheblogpro.com Gerry L.

    Hi Chris,

    This article gave a lot of insight not only into the value of subdomains but, more importantly, into the list of questions one should consider when creating content. Often sites are built around keywords with rehashed and recycled content to game the search engines. With the Panda update, people are going to give spammy content a second look before publishing it on their website.


  • http://twitter.com/#!/mikegracen MIke G

    Gerbils running on a wheel is what comes to mind here. Even if this subdomain ‘solution’ works, G will only let it work for as long as they want and then shift the algorithm again and the house of cards comes down again. G holds all the cards and could care less about any of us unless we make them TONS of cash VERY easily. Game over if you can’t do that for them.

  • http://www.bluelightit.com it worked for my customer

    A lot of the ‘geek’ people don’t realize that most users DO NOT look at or care about the domain name. So whether it’s subdomain1.mydomain.com, subdomain2.mydomain.com or www.mydomain.com, users don’t see it.
    All they see is the page they landed on and the branding on it.
    We’ve set up one of our customers with an automated script that dynamically creates sub-domains based on keywords. From having SE rankings of 150 and above for only 33% of their 400+ keywords, after implementation they have 98% of their keywords ranked, and most have caused other pages on the site (under the www sub-domain) to increase in ranking. The average ranking is now in the first 20 for the majority of the keywords, with more than 50% on the first page.

    Bing/Yahoo/and the rest still haven’t picked up the sub-domains.

    So yes – sub-domains do work, if implemented correctly.

  • http://www.onearmedseo.com Oliver Bodnar

    I think too many people are concerned with the quality of content alone, while this is very important, they forget that Google wants the whole site to be quality…that means paying attention to the little things like keeping your server clean, having a proper structure, naming pages and files properly and doing all the other basic SEO things.

    None of my clients have been “hit” by Panda… and it’s funny, when I’ve tried to help friends, it’s like they don’t want to hear that their site is Fu@(ed up; they just want a quick fix, or they’re hoping someone will outline an easy fix for them on sites like this!!

    There’s no quick fix…just don’t be lazy, clean up your site and you should be fine!

  • http://www.howtobecome.info/forum/ Hando

    Using subdomains does sound like an interesting technique, and I was actually considering doing it a few years back when we started our site, but nobody really saw any benefit in it, so we used subdirectories as usual. I myself actually like the idea, as it allows using keywords in a different way, and I like the way About.com has done it. They must have been well ahead of their time, as I remember them using subdomains on pretty much every page since they started. It would be nice if someone would do a case study on them and see if Panda has affected them in any way.

  • Jesse

    I think if Google wants to be the policeman of content, they need to be a lot better at giving people their day in court. I also think they need to be way more responsive to complaints. As it is, they can shut anyone down, and there is no customer service, court of appeal, or any way to find out from the company what the offending website did wrong.

  • http://www.whiteoutpress.com Whiteout Press

    No, we’ve leveled off but haven’t recaptured the 30 percent or so in traffic we lost. Two posts caught my eye – the ‘quality matters’ comment and the ‘clean-up your site’ comment.

    Until now, I thought the quality argument was correct. But as a user, my Google search results are horrible and practically useless now. I’ve actually gone back to Yahoo for half my searches because the results are more ‘real’. Searching mostly for current events, breaking news, and the like, I used to get dozens of links, half to the mainstream media and the other half to lesser-known, often start-up or single-issue sites. Now, all I get are the same results that only include the big-timers like the LA Times, Chicago Sun Times, etc. And worse than that, they’re all the exact same article from Reuters or AP. For that alone, I think their newest idea will soon be changed. Google just plain sucks now for search, and that’s what they do.

    I think the ‘clean up your site’ guy might have something. A clean, professional site implies quality, which is all we get on Google now, ‘implied quality’ – but very little actual quality. It’s also one of the only things that the ten news sites with the Google monopoly on search results have in common. They have the money to maintain a clean site, exactly the way Google wants it.

    Personally, I wish I knew how to clean up my site more. I’m a rookie at this and feel like I’m ahead of the game when my little site occasionally shows up on page 2 or 3 of a search. And the internet is so full of scams, I wouldn’t trust anyone or any company without a referral from someone. As small and broke as we are, we’ll just keep picking up bits and pieces of knowledge and advice from terrific folks like you guys :)

  • http://romancatholicinfo.com Roman Catholics

    Just two days ago, I listened to an internet guru claim that his sites which were set-up using subdomains were doing well and were not swiped by the Panda bear.

    I’ll definitely give this a try because I was hit by Panda 2.0

  • http://www.softpaws4u.com/ Warren

    No luck yet, still trying

  • http://www.tipsinablog.com Daniel

    Nice article, Chris.

    I read that list a little while back. Some very helpful points to consider (and, of course, apply).
    I read a post recently that stated there have been hundreds of updates in recent times as far as Google’s ways of determining its search rankings are concerned (algorithm changes).

    I am not too sure how dividing content into sub-domains would go.

    I will say I think it is possible that many larger sites (thousands of pages) may be weighed down in the rankings due to many of their older articles (no longer relevant or current, or in need of a good polish) draining the site’s overall performance.
    The search ranking issue can become quite frustrating when you see so many contradictions which go totally against many of the “must do’s” for getting good search results.

    There is not a day that passes where I almost fall off my chair, upon seeing some sites with almost zero content(very poor or at best average quality) are pulling in high PR, are near the top(Strata) of site ranking(Not just Google search ranking / actual sites ranking per the site) and on occasion, have a huge price tag(Site value) attached.

    When I mention this on some Authority(More established) sites(In a comment) there seems to be an attitude of ” Oh, yeah that’s terrible”! Yet, I have a gut feeling(Which is incredibly accurate) that many(so called) Authority sites, may also be using similar methods, in one way or another.

    It’s like ” Yeah, we know. Just don’t go there”!

    Aside from this, I will hazard a guess and say Google puts the highest level of importance on Quality, Relevance(content per site/niche) and the absence of duplicate content.


  • http://www.cravingtech.com Michael Aulia @CravingTech.com

    Using a WordPress plug-in to improve my SEO, my traffic is now back and better than ever!
    I was on 3,500 unique visitors a day – after Panda, it went down to 1,800 a day :( Now I’m back to 3,800 ish a day (Story and stats on my blog post if you are interested)

  • http://get-business-online.com/ Gal

    I’m thinking Google wants to consider site-level indicators as well as page-level ones and splitting things up sure makes that easier. With things like Wonder Wheel and site shortcuts, being able to tell what a site is about is important.

    Actually, true to its human imitation form, this is what I would like to see as a person when I search – the SITES that fit my query, not just pages in web directories or article sites that don’t specialize.

    Even using the author as a search indicator makes sense to me.

    Funny how we sometimes miss innocuous statements (like the one about Google’s regard of subdomains from 2007) until they come back to bite us.

  • http://www.furpetsonly.com R. B. Jeffrey

    Yes. After the Panda update I was unable to find my company with the search term “Dog Collars” or “Dog Collar”. Presently I am on page 22 for “dog collars” and on page 14 for “dog collar”. Before the Panda update, the search term “Dog Collars” found me on page 21. I have done a lot of work on content and on SEO to get back as far as I have.

  • http://www.boholwebdesign.com/ Rey

    Panda has done nothing but boost our websites… all 65 of them offer unique content and correct spelling and punctuation. I would assume a lot of websites from non-English-speaking countries will surely see their sites’ rankings diminish with the new Panda protocols. I believe Panda is a good thing, due to the fact that I have spent many years of hard work studying English. And others who have not? They will surely perish until they “wake up and smell the Panda” and learn to write and spell properly. This is the solution for recovery, in my opinion. Great article Chris!

  • http://www.glenwoodfin.com Glen Woodfin

    Chris, I really appreciate this article. Cracking the Panda algo has been a consternation to many.

    I also applaud your question about has anyone recovered from the Panda Update. Hopefully, some of them will share tips.

  • http://www.seakayaksforsale.org Alex

    Interesting article. Trying to separate the poor quality content from the good quality content by creating a subdomain on a primary domain would, in my opinion, still hurt the site. It is essentially still content from the same site. Do I make any sense?

  • http://www.copy-e-writing.in/blog Ron’s SEO Copywriting Blog

    Interesting. I totally believe that when there is a problem, there is a solution as well.

    Have you seen lately that the blogspot blogs are rising up in the SERPs. Why is so? Is that because of the usage of sub-domains?

    Excellent topic here. I would just wait and watch at the moment. But one thing I supremely believe. These mammoth businesses of Ezinearticles, Hubpages and so forth are NOT going down in a day. They will climb out of the perils of Farmer update soon.

  • http://www.seoresults.co.za Travis

    I think everybody should read this excellent post by Conversation Marketing’s Ian Lurie:

    WSJ, WTF?! Panda & Subdomains: http://www.conversationmarketing.com/2011/07/wsj-wtf-google-panda-subdomains.htm

  • http://www.automatedsocialnetworking.com Nicole

    Wow, really straightforward. Really good one.
    Thanks to the person or team who came up with it.

  • http://www.sebastyne.com Sebastyn

    I was gob-smacked by this update, because several of my sites got an improved ranking after the latest update. I am not big on article marketing (too much trouble) and I haven’t done much link building at all, so most of my back links are naturally generated. That must have been the whole point of the update. Rewards the lazy. XD

  • http://antex.hexat.com Haeckel

    Thanks for this info.


  • http://editorial.equities.com dennis

    I’ve noticed recently ezine and hubpages have modified their criteria for article authors and have reduced the amount of links allowed. This may have something to do with it too!

  • http://cellhow2.com cornea503

    Another shot in the dark by the SEO community. Cutting a lemon into lemon wedges doesn’t turn it into an orange.

  • Don

    All this Google CRAP boils down to one word–censorship. They are so big, high and mighty now that they can tell people what to do. Just like the damn United States government thinks they can do the same thing. All of them are a bunch of Assholes.

  • http://www.chennaipackersandmovers.com/ John Michal

    My site, http://www.chennaipackersandmovers.com/, was top in the rankings but is now down. Please tell me the reason.

  • http://www.webimax.com search engine optimization

    Google specifically mentions the use of subdomains to separate out unique content – although it is not the entire solution, it is a great strategy to implement.


  • http://www.webdesignghana.org Michael

    Frankly speaking, though I was hit by the Panda update on one of my websites, it’s not a bad thing. The problem is that although Google can try all that it knows, since it’s the bot that does the work and not a human, their search results will still be manipulated by SEO experts.

    What about websites that are just an online identity of a company? They don’t need to write long articles or accept credit card to be authoritative. Most of the questions are useless for real business. I think google did that to get rid of vague article sites.

  • http://www.kubodo.com/ Kubodo

    Clear information. I will try Google’s list of questions, though it’s too hard for me.
