SEM Strategy: Don’t Look Like a Duck?
You might escape sanctions from the Google indexing gods if you build pages that merely "look like" other, spammier pages. But the reality is, if you go into the forest dressed up like a duck, it may not matter whether you actually quack like one: your danger rating goes up.
Basically: my personal philosophy on the SEO side is to dial back on excessive "on-page tactics" intended to give rankings that "extra boost." There are other ways to rank.
A particular SEO bugaboo for me is the "text way below the fold" technique. It's fine if the text sits somewhat below the fold and is navigational in nature, but not if it just looks cheesy and spammy. What respectable site would do that?
Search marketing is marketing first, and that involves a consistent, professional process for communicating with readers and customers. A comprehensive, analytical, patient approach *does* work. Creating more useful content *does* work. And above all, off-page stuff does the heavy lifting of enhancing your reputation and standing in the engines.
So back to the question: why would you use hidden text in the first place? Oh, I'm sure we can dream up all kinds of "legitimate" scenarios. I don't pretend to play in this particular sandbox, but the "illegitimate" scenarios involve throwing low-quality content at the search index while showing users something else. Whether it's gibberish pages users actually see, or gibberish hidden from users, and from there gathering data on which of the two ranks in spite of Google's vigilance, and which leads to conversions to sales of porn or hot tubs... this is the daily existence of the professional index spammer and the amateur spammer-dabbler. If you're a real company, isn't it nice not to have to worry about those kinds of calculations? So if you are real, don't hire the amateur index spammer/dabbler! A little knowledge residing in the brain of the business owner's nephew who built the site and knows "a lot about SEO" can be a dangerous thing.
The bottom line? Quibbling about whether Google does or does not allow some specific sub-technique is not the way to go. It’s not like they can give you "license" to work some "loophole". They use automated methods on both the paid and unpaid sides to flag violations. This in turn may trigger some human review, which can and will exercise editorial judgment as to intent. And as we’ve seen of late on the paid side, Google even makes official comments on "business models to avoid."
Google has been talking about intent for years. The spamsters don’t want to hear it.
The webmaster forums may be loaded with folks trying to find out how best to spam Google with hidden text tricks Google doesn't mind, or can't catch. But this misses the entire point. A human rater can look at your site and decide, based on rating criteria, that it falls into some category that is low quality in users' eyes, such as "thin affiliate." That can lead to low rankings, penalties, and banning. Even this system is highly imperfect, because it still gives too much advantage to serial spammers and sophisticated cheaters. Something new is needed to rebalance things in favor of quality sites, even more so than today.
Creators of quality content will increasingly be rewarded through new ranking methods, in my opinion.
Site developers pointing out several legitimate uses of hidden text techniques (see the comments in Eric's seomoz post) only underscore the point: certain sites might fall into an *automated* net that flags deceptive techniques, even though they don't deserve to. That just increases the load of human judgment on Google, or the importance of other (off-page) signals of quality and relevance. Spammers *will* find ways of hiding text that Google simply doesn't want to work too hard to find algorithmically, because doing so would create too many false positives in any case.
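To see why that automated net sweeps up innocent sites, consider what a naive hidden-text filter has to look for. This is a hypothetical sketch (not Google's actual method, and the pattern names are my own invention): every CSS pattern it flags also appears in perfectly legitimate markup, such as screen-reader-only text, tabbed interfaces, or image-replacement headings, which is exactly where the false positives come from.

```python
import re

# Hypothetical patterns a naive hidden-text detector might flag.
# Each also has common legitimate uses (accessibility text, tab
# panels, image replacement), hence the false-positive problem.
HIDDEN_TEXT_PATTERNS = {
    "display_none": re.compile(r"display\s*:\s*none", re.I),
    "visibility_hidden": re.compile(r"visibility\s*:\s*hidden", re.I),
    "offscreen_indent": re.compile(r"text-indent\s*:\s*-\d{3,}px", re.I),
    "tiny_font": re.compile(r"font-size\s*:\s*[01]px", re.I),
}

def flag_hidden_text(markup: str) -> list[str]:
    """Return the names of suspicious patterns found in the markup."""
    return [name for name, pattern in HIDDEN_TEXT_PATTERNS.items()
            if pattern.search(markup)]
```

A filter like this would flag a keyword-stuffed `display:none` div, but it would flag an accessible "skip to content" link styled the same way just as readily. Telling the two apart requires judging intent, which is the human reviewer's job, not the regex's.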
Funnily enough, then, after looking at it from all angles, the presence or absence of all but the most one-sidedly spammy hidden text techniques appears to be a very weak signal of quality, one that Google cannot realistically weight very heavily for ranking purposes.