Google Panda Update: DaniWeb Recovers AGAIN

More Panda tweaking? Google Correcting Mistakes?


It’s funny how things can change with Google and the Panda update. DaniWeb has been a prime example of this. At first, the IT discussion forum was hit by the update, but was able to make a 110% recovery. Then last week, DaniWeb was hit once again, which founder Dani Horowitz let us know about in an email. DaniWeb lost over half of its traffic overnight.

Today, we received another email from Dani with much better news. “The Google saga continues. We have just recovered. Google Analytics is very delayed, but it is already reporting that we have received as much traffic today as we received all day yesterday, and it is not even 2 pm yet,” she tells us. “Clearly Google admitted they screwed up with us.”

The timing of this is quite interesting. We posted an article this morning about how Dani’s team discovered that DaniWeb’s “time on site” stats decreased by 75% at 1 PM on August 11 and held steady at the reduced number, which she said was the result of “Google Analytics rolling out their new session management feature.”

“There have been MANY reports across the web of the bounce rate and time on site being inaccurate ever since August 11th, especially when multiple 301 redirects are involved (which we use heavily),” she said at the time. “As a result, we have been hit by Panda. Or so I gather.”
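To illustrate the scenario she describes: a site relying on chains of 301 redirects can send a visitor through several hops before the final page loads, which is the setup where analytics session tracking was reportedly misbehaving. A minimal sketch of counting those hops from a redirect map (the URLs and the map below are hypothetical examples, not DaniWeb’s actual configuration):

```python
# Sketch: follow a URL through a {source: target} map of 301 redirects
# and report the full chain of hops. Hypothetical URLs for illustration.

def redirect_chain(url, redirects, max_hops=10):
    """Return the list of URLs visited, starting from `url`,
    following the redirect map until no redirect applies
    (or max_hops is reached, to guard against loops)."""
    chain = [url]
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        chain.append(url)
    return chain

# Hypothetical map: an old forum URL 301-redirects in two steps.
redirects = {
    "/forums/thread123.html": "/forums/thread123",
    "/forums/thread123": "/programming/thread123",
}

chain = redirect_chain("/forums/thread123.html", redirects)
print(chain)           # every URL visited, in order
print(len(chain) - 1)  # number of redirect hops
```

A visitor here passes through two 301s before reaching the final page, and it is multi-hop chains like this that commenters were blaming for split sessions and distorted bounce-rate figures.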

In the latest email, Horowitz points us to a recent Q&A with Google’s Matt Cutts.

With that (before the recovery), Dani wrote:

For those who don’t want to watch the full 45 minutes, fast forward directly to 18:30 in the video. It’s essentially Matt Cutts answering a question of when the next versions of Panda are going to run. His response was that, “We’ve made many, many changes over the last few months, even within Panda, trying to iterate, find new signals. You see a site like DaniWeb complain and then we find signals and say, okay, here’s a way we can differentiate between this site and the sites that might be a little bit lower quality.”


This video was recorded on September 22nd, after our recovery and before we were hit again. So I don’t know what to take from that. It could be one of two things:

(1) Matt and the rest of the Google webspam team not only know about my situation, but realize we were penalized unintentionally. They have been working towards making Panda not hurt us; it was a mistake that we were hit again, and that will soon be fixed.

(2) Matt and the rest of the Google webspam team are aware of our situation, and think we should have been penalized. I misunderstood Matt’s quote and he meant that DaniWeb is a complainer and IS one of the low quality sites they will continue to work on filtering out of the SERPs.

Then came the recovery, so it seems like number one is more likely. We’ve reached out to Google for confirmation that a tweak has been made to Panda. We’ll see what kind of response we get on that. This is fairly reminiscent of when Cult of Mac was hit by Panda and recovered shortly thereafter.

Let’s refer back to a quote from a Wired interview with Google’s Amit Singhal, who said, “Any time a good site gets a lower ranking or falsely gets caught by our algorithm — and that does happen once in a while even though all of our testing shows this change was very accurate — we make a note of it and go back the next day to work harder to bring it closer to 100 percent…That’s exactly what we are going to do, and our engineers are working as we speak building a new layer on top of this algorithm to make it even more accurate than it is.”

As you know, Google makes “roughly 500” algorithm adjustments per year.

  • Its a Miracle

    Yay! Now the rest of you suckers, starve. Google got rid of a bad public relations problem.

    Her bounce rate is in the 80s, Chris, according to Alexa, and has been that way for ages before Panda: http://www.alexa.com/siteinfo/daniweb.com# . It’s impossible that it was doubled by Analytics, as 80+80=60+100 :).

    In short, it was a manual exception that very few average sites get.

  • Its a Miracle

    Cross-posting it: Chris, many other sites that came back in July were hit by this Panda too. Most likely this shows that they were actually manually exempted, but that code must have been left out of this Panda run. In other words, it appears that they would have never escaped Panda on their own, without some cooking.

    • http://www.idesthost.com Edwin

      Google says it does not white- or blacklist, and there were situations where it seemed to be desirable. Those situations had much more impact on the internet community than just a tech forum. Therefore, I do not believe that there is any manual action by Google toward DaniWeb. Instead, more likely they rolled back the latest update.

      • Its a Miracle

        No, they didn’t roll it back. They cooked it so certain sites are no longer in Panda’s sight. They did that by looking at how DaniWeb is different from other Panda sites, or by differentiating it so it escapes. If that isn’t manual, what is it? CultOfMac?

  • http://liliputing.com Brad Linder

    Liliputing was also hit on the 28th, and we had pretty much recovered by October 3rd/4th.

    I don’t know if it will hold. I don’t know if it has something to do with the fact that I publicly pointed out that Liliputing had been hit, or if Google would have tweaked things anyway.

    What I do know is that this time our traffic dipped for a few days instead of half a year.


  • http://www.askthetrainer.com Mike Behnken Personal Trainer

    This whole thing has stunk to high heaven from the moment it was introduced. Essentially penalizing entire sites due to some bad content is not right.

    You could probably mathematically determine that the claimed “success” of Google Panda is likely due to pandalising a small number of sites.

    For example, if you took the top 100 sites in traffic that got hit, like answers.com, eHow, Mahalo, etc., it would make a difference due to the sheer number of pages they have…. This is good, but that small sites run by an individual or a few people get the same treatment is plainly reprehensible.

  • http://www.siamhomesource.com Thai Petchaburi

    This google panda is ripe for someone to come up with a class-action lawsuit and recoup the lost earnings of honest webmasters.

    What gives them the right to single out one site and say, “oh yeah X site was wrongfully hit”

    What’s next, Google leaving out a candidate they don’t like in their results during an election???

    They abuse their power and there needs to be intervention.

  • Steve

    I’ve been reading up on the DaniWeb drama over at WMT; she got whitelisted just like Cult of Mac. She has recently started a new thread denying any special treatment on the part of the Google webspam team, but more than once Matt Cutts has referred to big brands, suggesting that those are what Google considers to be high quality. But last winter JCPenney got blasted over black hat SEO tactics, and other big name brands have not always produced content that benefits their consumers.

    The Panda update is a complete failure: lots of scraper sites still manage to make their way into the results, and the overall quality of Google search is horrible. Cutts should just come out and admit that he favors big brands – including Google’s own stuff.

  • Joe Youngblood

    never thought my simple question would cause such a controversy. new question for #askmatt

    if panda can tell you which pages are perceived as quality and which are not, why can you not tell me in webmaster tools? you tell us rounded-up page rank, you tell us quality scores in adwords, why not tell us panda quality scores?

  • http://www.affordableinterpreters.com Cate Yan

    I know it’s a basic question… but is there a way to see exactly what a page is being penalized for by Panda?
