Did Google Penalize A Site For A Natural Link From Moz?

By: Chris Crum - July 23, 2014

Update: We’ve updated the post with some additional comments Fishkin gave us via email. See the end of the article.

Google has been on a warpath against what it thinks are unnatural links, but many think it’s off the mark with some of them. Meanwhile, the search giant scares people away from using even natural links in some cases, whether it intends to or not.

Have Google’s warnings to webmasters had an impact on your linking practices? Let us know in the comments.

When one thinks about reputable companies and websites in the SEO industry, Moz (formerly SEOmoz) is likely to be somewhere near the top of the list. YouMoz is a section of the site that gives a voice to people in the industry who don’t work for the company. It’s essentially a place for guest blog posts.

YouMoz, while described as a “user generated search industry blog,” isn’t user-generated content in the same way that something like Google’s YouTube is. YouMoz content must be accepted by the Moz staff, which aims to publish only the highest-quality submissions it receives. This is the way a site is supposed to handle guest blog posts. In fact, Google’s Matt Cutts seems to agree.

If you’ll recall, Google started cracking down on guest blogging earlier this year. It made big waves in the SEO industry when it penalized the guest blogging network MyBlogGuest.


A lot of people thought Google went too far with that one, and many who either hosted guest blog posts or contributed them to other sites were put on edge. Reputable sites became afraid to link naturally, when the whole point is for links to be natural (isn’t it?).

Understandably concerned about Google’s view of guest blogging, Moz reached out to Cutts to get a feel for whether its own content was in any danger, despite its clear quality standards. In a nutshell, the verdict was no. It was not in danger. Moz co-founder Rand Fishkin shares what Cutts told them back then:

Hey, the short answer is that if a site A links to spammy sites, that can affect site A’s reputation. That shouldn’t be a shock–I think we’ve talked about the hazards of linking to bad neighborhoods for a decade or so.

That said, with the specific instance of Moz.com, for the most part it’s an example of a site that does good due diligence, so on average Moz.com is linking to non-problematic sites. If Moz were to lower its quality standards then that could eventually affect Moz’s reputation.

The factors that make things safer are the commonsense things you’d expect, e.g. adding a nofollow will eliminate the linking issue completely. Short of that, keyword rich anchortext is higher risk than navigational anchortext like a person or site’s name, and so on.
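
For context, the markup difference Cutts is describing is small. Here is a minimal, hypothetical illustration (the URL and anchor text below are placeholders, not taken from any of the posts involved):

    <!-- Followed link with keyword-rich anchor text: the riskier pattern Cutts mentions -->
    <a href="http://example.com/photography-seo/">photography SEO tips</a>

    <!-- Followed link with navigational anchor text (a person's or site's name): lower risk -->
    <a href="http://example.com/photography-seo/">Example Photography Community</a>

    <!-- Adding rel="nofollow" tells search engines not to pass ranking credit through the link -->
    <a href="http://example.com/photography-seo/" rel="nofollow">photography SEO tips</a>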

It sounded like YouMoz was pretty safe. Until now. Contributor Scott Wyden got a warning from Google about links violating its guidelines, which included his YouMoz article as well as a scraper post (that’s a whole other issue Google should work out).

“Please correct or remove all inorganic links, not limited to the samples provided above,” Google’s message said. “This may involve contacting webmasters of the sites with the inorganic links on them. If there are links to your site that cannot be removed, you can use the disavow links tool…”
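
For anyone unfamiliar with it, the disavow links tool Google mentions accepts a plain text file uploaded through Webmaster Tools listing the links you want Google to ignore. A minimal, hypothetical example (the domains are placeholders, not sites involved in this story):

    # Lines starting with # are comments and are ignored
    # Disavow every link from an entire (scraper) domain
    domain:scraper-example.com
    # Disavow one specific linking page
    http://spam-example.org/copied-post.html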

The problem is that, at least according to Moz, the links were not inorganic.

“As founder, board member, and majority shareholder of Moz, which owns Moz.com (of which YouMoz is a part), I’m here to tell Google that Scott’s link from the YouMoz post was absolutely editorial,” says Fishkin in a blog post. “Our content team reviews every YouMoz submission. We reject the vast majority of them. We publish only those that are of value and interest to our community. And we check every frickin’ link.”

“Scott’s link, ironically, came from this post about Building Relationships, Not Links,” he continues. “It’s a good post with helpful information, good examples, and a message which I strongly support. I also, absolutely, support Scott’s earning of a link back to his Photography SEO community and to his page listing business books for photographers (this link was recently removed from the post at Scott’s request). Note that “Photography SEO community” isn’t just a descriptive name, it’s also the official brand name of the site Scott built. Scott linked the way I believe content creators should on the web: with descriptive anchor text that helps inform a reader what they’re going to find on that page. In this case, it may overlap with keywords Scott’s targeting for SEO, but I find it ridiculous to hurt usability in the name of tiptoeing around Google’s potential overenforcement. That’s a one-way ticket to a truly inorganic, Google-shaped web.”

“If Google doesn’t want to count those links, that’s their business (though I’d argue they’re losing out on a helpful link that improves the link graph and the web overall). What’s not OK is Google’s misrepresentation of Moz’s link as ‘inorganic’ and ‘in violation of our quality guidelines’ in their Webmaster Tools. I really wish YouMoz was an outlier. Sadly, I’ve been seeing more and more of these frustratingly misleading warnings from Google Webmaster Tools.”

Has Moz lowered its standards in the time that has passed since Cutts’ email? Fishkin certainly doesn’t think so.

“I can promise that our quality standards are only going up,” he writes, also pointing to an article and a conference talk from the site’s director of community Jen Lopez on this very subject.

“We’d love if Google’s webmaster review team used the same care when reviewing and calling out links in Webmaster Tools,” Fishkin writes.

Burn.

Cutts would most likely have something to say about all of this, but he happens to be on leave and isn’t getting involved with work until he comes back. He has been on Twitter talking about other things, though. It will be interesting to see if he gets sucked back in.

The whole ordeal will likely only scare more people away from natural linking, something Google has already been doing. If Google is penalizing a site for links from a site like Moz, what’s safe?

We’ve reached out to Fishkin for further comment, and will update accordingly.

Update: Fishkin tells us via email that he doesn’t think Google’s targeting of guest blogging in general is off base, but that their reviewers “need to be more discerning in marking problematic links.”

He goes on to say: “When they select editorial links to highlight as problematic ones, they’re creating a serious problem for site owners on both sides. Correctly identifying non-editorial links really does help site owners improve their behavior, and I know there’s plenty of folks still being manipulative out there.”

“In terms of Google ruining natural linking, I suspect that’s an unintended side effect of their efforts here. They’re trying to do a good thing – to show which links are causing them not to trust websites. But when they mark editorial links as inorganic, they inadvertently scare site owners away from making positive contributions to the web with the accordingly correct citation of their work. That’s how you get a Google-shaped web, rather than a web-shaped Google.”

Image via Moz

Do you think Google is going overboard here? Share your thoughts in the comments.

About the Author

Chris Crum has been a part of the WebProNews team and the iEntry Network of B2B Publications since 2003. Follow Chris on Twitter, on StumbleUpon, on Pinterest and/or on Google: +Chris Crum.

View all posts by Chris Crum
  • http://www.hiswebmarketing.com/ Marie Haynes

    As I mentioned on the Moz post, I think it’s important to note that Moz was not the source of this penalty. The site had a pattern of unnatural links. If guest posting was used on a large scale to build links then the Moz link was just one example from a large list. It’s the patterns that are important, not where the individual links came from.

    • Chris Crum

      This may very well be a valid point (and I encourage readers to read your comment on the Moz post), but either way, Google is presenting the Moz link as one of the few examples it actually provides. The other one is apparently a scraper, which is out of the webmaster’s control. That’s also a problem. If there are better representations of the problem, then these messages should reflect those.

      • http://www.hiswebmarketing.com/ Marie Haynes

        Fair point Chris. In regards to scraper sites, if the link on the site that was originally scraped was an unnatural one, then the scraper sites need to be addressed too. It might seem unfair, but look at it this way… If, years ago, I created an ezine article so I could boost my rankings, I would *want* scrapers to pick it up so that I’d get more links. So, it kind of makes sense that I need to address those now.

        I agree with you though that it would be better to give better examples. I think that perhaps this is a problem of scale. A year or so ago the webspam team was getting 5000 reconsideration requests a week and I bet that number has increased. So, perhaps the system they have now is the best they can do and it’s not worth their time to go looking for better examples? I’m not saying that’s fair, but that’s probably what’s happening.

    • Don Ricks

      Marie – If you’re talking about Google’s tolerances, then I agree with you. But how is Moz’s own practice of rewarding guest posters with dofollow backlinks not clearly the kind of non-editorial linking Google warns against? It would seem like it’s payment for volunteering a good article, and no different from any other guest blogging site.

      • http://www.hiswebmarketing.com/ Marie Haynes

        Matt Cutts in his blog post on guest posting (http://www.mattcutts.com/blog/guest-blogging/) did say that there could be some instances where this is ok. He cited high authority multi-author blogs like Boing Boing. I think it’s important to note here that the penalty did not come because of a single link from Moz. It was because of the pattern of unnatural links pointing to the site and there were lots of them.

        • Don Ricks

          Marie, the bolded portion in Matt’s blog post refers to “a dofollow link or two in the article body.” With respect, I think the distinction is that the dofollow links weren’t even in the article body – they were in the author bio. Reading the Moz article, it seems to be clearly saying that it’s OK to reward guest authors of “quality content” with dofollow backlinks in their bio (unspoken “if they couldn’t fit the self-serving text into the article.”) IMHO, Matt Cutts’ post is just saying “we’ll give you guys some liberties but please don’t pretend as if all of us at Google are oblivious to what’s going on.”

          • http://www.hiswebmarketing.com/ Marie Haynes

            There are lots of guest posts on Google owned sites that have followed links in the author bio. Check out this one for example: http://analytics.blogspot.ca/2014/01/how-analysis-exchange-is-helping-non.html.

            I’m not saying that we should all run out and get guest post links from high quality sites now but I do think that in some instances it can be ok.

          • Don Ricks

            Do you mean that “it can be ok” or that “it’s not ok but Google will allow some tolerance limits?” There is a big difference between the two.

            In many corporations, one division may be clueless about another’s policies. Sometimes an intern may be responsible for these types of menial editing gaffes, or there could be a legitimate concern about why some of these guest post articles have dofollow backlinks. And for all we know, Google has decided that dofollow backlinks from its own official blogs are not calculated for purposes of PR or search results. None of what happened on Google’s sites changes what Matt Cutts & Co. have actually been saying. And before acting upon what we see happen occasionally on Google sites, it’s probably prudent to ask.

            Regardless, I still haven’t heard yet how dofollow backlinks in a bio are “editorial” in nature. Popular PR passing guest blogging sites are just fearful that if they can’t reward free content with dofollow backlinks, they might have to compensate with some actual value.

          • http://www.hiswebmarketing.com/ Marie Haynes

            So many people are wrongly focused on whether or not individual links are going to get them penalized…

            Will this guest blog link get me penalized? What if I have a link in the author bio vs the text?

            Will this directory link be penalized?

            Will a link from an industry partner’s resource page be bad?

            But, what’s more important is the pattern of linking that is there. Both manual penalties and Penguin were created to demote people who have been overtly cheating to try to boost their Google rankings.

            So, if an authoritative site like Moz approves my guest post and allows me to link back to my site, then that’s probably not going to cause problems and possibly could even help me. But, if I have a pattern that shows that I’m getting hundreds of links from guest blog links then this is more of a concern.

          • Don Ricks

            Marie… with respect, who is confused? I think that so many people are wrongly focused on whether or not they will “get away with it” and not whether an individual act is actually wrongful and against the rules.

            Penguin was created as a way to police wrongful acts automatically (in bulk). The *fault* tolerance limits are in place because it’s an algorithm and Google is erring on the side of caution when there isn’t a human to confirm. The fact that it’s not caught or isn’t specifically punished by the machine doesn’t make it OK. Let’s say your spam detection is set to catch 3 accounts created per minute. That logic seems to say that bulk is the issue and that if you create 2 spam accounts, there isn’t a problem and the spam isn’t spam.

            You imply a good question – why does an “authoritative” site like Moz have dofollow links in author bios? You speak of patterns – doesn’t this create a pattern of outbound dofollow links from Moz? And why should such a pattern be only permissible from Moz? Because the authority says that a bio link which is a non-editorial link on my site is suddenly an editorial link on said authority site?

          • http://www.hiswebmarketing.com/ Marie Haynes

            All good points. I think that Moz generally does a good job of vetting the outbound links in their articles. The “new jersey photographer” link was in the author’s bio, which actually can be changed by the author after the article is published. I’m betting that Moz is looking into this now so that they are not granting keyword-rich, self-made links.

          • Don Ricks

            I hear you Marie… but Rand explicitly states “I also, absolutely, support Scott’s earning of a link back to his Photography SEO community.” So his opinion seems to be that if you write a quality blog post for YouMOZ, they can reward contributors with a high PR dofollow link to your own website or one of choice. He seems to imply this was and is no mistake.

          • http://www.hiswebmarketing.com/ Marie Haynes

            It’s a bit of a circular discussion we’re having now but I’d refer you back to the Google Analytics blog. Look at all of the guest posts that they have published and quite happily allowed at least one link back to the guest poster’s site: https://www.google.com/search?sourceid=chrome-psyapi2&ion=1&espv=&ie=UTF-8&q=site%3Aanalytics.blogspot.com%20%22guest%20post%22

            I am not 100% sure on this but I think that in Rand’s argument he was thinking that the unnatural link was the one that was anchored with “business books for photographers”. It’s possible that Rand did not see that the author bio previously was linking with “New Jersey photographers”. It’s a fine line, I know, but, I do think that it is ok for some sites to allow links in their guest posts provided that the guest posts are actually bringing value.

            This has been a good discussion. I’ll likely be writing an article on everything that Google has said about guest blogging to try to bring some clarity to the situation.

          • http://www.discoverafricagroup.com/ Andre Van Kets

            @Marie – I’m with you 100%. Link penalties are all about patterns.

            Not individual links.

            I’ve had the privilege (no sarcasm intended) of going through the necessarily painful and tedious task of recovering from a Google unnatural links penalty for three websites. In all cases, the penalties were issued because of patterns of unnatural links, not because of individual links.

            At the end of the day, link penalties are all about INTENT. And Google uses its ability to identify patterns — which are statistically outside of the norm — as a mechanism to identify intent.

          • Don Ricks

            @AndreVanKets:disqus – Many SEOs suffer from a serious issue with perspective. It’s always about “what will get me caught” not about “is what I am doing wrong?” At the end of the day, if you’re not within the Guidelines, your intent of creating patterns really doesn’t matter. There is justification for the penalty. In most cases the intent is a byproduct of enough abuse to attract attention.

  • https://restore.solutions/ Numus Software

    We have to actively block so many bad bots, scrapers, referral spam engines and pic scrapers that we ended up using ISAPI Rewrite 3 to completely block anything that looked suspicious. This was down to warnings we received from Google, based on pages of our DISQUS comments that had been scraped and republished. So even commenting can be scraped, rehashed and used against you. The code we used and built on is below.

    #Block referral SPAM
    #Add the referrer keywords to match between the () below and separate them with |
    RewriteCond %{HTTP_REFERER} (?:keywords|go|here) [NC]
    #Send matching requests to a placeholder page, or use "RewriteRule .* - [F]" to return 403 Forbidden instead
    RewriteRule .* /path/to/page.htm [L]

    and

    Block annoying robots

    Here is a useful example to block a number of known robots and content extractors by
    their user agents. Please note this rule is long and we have divided it into lines for
    readability; in order to work correctly no spaces can be added to the end or beginning
    of the lines when they are joined back together:

    #Block spambots
    RewriteCond %{HTTP:User-Agent} (?:Alexibot|Art-Online|asterias|BackDoorbot|Black.Hole|
    BlackWidow|BlowFish|botALot|BuiltbotTough|Bullseye|BunnySlippers|Cegbfeieh|Cheesebot|
    CherryPicker|ChinaClaw|CopyRightCheck|cosmos|Crescent|Custo|DISCo|DittoSpyder|Download\sDemon|
    eCatch|EirGrabber|EmailCollector|EmailSiphon|EmailWolf|EroCrawler|Express\sWebPictures|ExtractorPro|
    EyeNetIE|FlashGet|Foobot|FrontPage|GetRight|GetWeb!|Go-Ahead-Got-It|Go!Zilla|GrabNet|Grafula|
    Harvest|hloader|HMView|httplib|HTTrack|humanlinks|Image\sStripper|Image\sSucker|Indy\sLibrary|
    InfonaviRobot|InterGET|Internet\sNinja|Jennybot|JetCar|JOC\sWeb\sSpider|Kenjin.Spider|Keyword.Density|
    larbin|LeechFTP|Lexibot|libWeb/clsHTTP|LinkextractorPro|LinkScan/8.1a.Unix|LinkWalker|lwp-trivial|
    Mass\sDownloader|Mata.Hari|Microsoft.URL|MIDown\stool|MIIxpc|Mister.PiX|Mister\sPiX|moget|
    Mozilla/3.Mozilla/2.01|Mozilla.*NEWT|Navroad|NearSite|NetAnts|NetMechanic|NetSpider|Net\sVampire|
    NetZIP|NICErsPRO|NPbot|Octopus|Offline.Explorer|Offline\sExplorer|Offline\sNavigator|Openfind|
    Pagerabber|Papa\sFoto|pavuk|pcBrowser|Program\sShareware\s1|ProPowerbot/2.14|ProWebWalker|
    psbot/0.1|QueryN.Metasearch|ReGet|RepoMonkey|RMA|SiteSnagger|SlySearch|SmartDownload|Spankbot|spanner|
    Superbot|SuperHTTP|Surfbot|suzuran|Szukacz/1.4|tAkeOut|Teleport|Teleport\sPro|Telesoft|The.Intraformant|
    TheNomad|TightTwatbot|Titan|toCrawl/UrlDispatcher|True_Robot|turingos|
    Turnitinbot/1.5|URLy.Warning|VCI|VoidEYE|WebAuto|WebBandit|WebCopier|WebEMailExtrac.*|WebEnhancer|
    WebFetch|WebGo\sIS|Web.Image.Collector|Web\sImage\sCollector|WebLeacher|WebmasterWorldForumbot|
    WebReaper|WebSauger|Website\seXtractor|Website.Quester|Website\sQuester|Webster.Pro|WebStripper|
    Web\sSucker|WebWhacker|WebZip|Wget|Widow|[Ww]eb[Bb]andit|WWW-Collector-E|WWWOFFLE|
    Xaldon\sWebSpider|Xenu's|Zeus) [NC]
    RewriteRule .? - [F]

  • Don Ricks

    Am I missing something here? It looks like an author placed a followed link in his own bio to a site of his own choosing. This seems to be a self-serving endorsement, not a link made in an editorial context that confers some type of validation or endorsement of the value of the linked destination site. So what is Rand trying to say? That since MOZ is of such great quality, it reserves that right above other guest blogging sites?

    • Jack

      The editorial link in question was removed, I believe. The fact that there are editorial guidelines and actual people reviewing the articles is what makes it exempt, I think.

      • Don Ricks

        No, only one was. I read this article and it seems as though Rand is saying that if you submit good, high quality articles to Moz, then Moz has the right to give you at least two dofollow backlinks to your site and another site in your bio as a reward for your donation. Somehow if it passes the Moz quality review test, a bio becomes editorial. It strikes me as exactly the type of guest blogging dofollow link compensation that Google is trying to avoid.

  • http://www.2bubbleblog.com 2BubbleBlog

    The whole point is to get everyone to rely only on AdWords, nothing else… Last time, on their homepage, I saw an ad that made me laugh. It was showing a web entrepreneur who was proud to have AdWords as his primary source of traffic… How could somebody be proud to spend thousands of dollars to get traffic from AdWords?!!!! If your main traffic is from them, you’re screwed, aren’t you?!!!

    • Anon

      Destroy the WWW, dismantle it using the power you have stolen as an unrestricted and unaccountable private corporation, then make the entire WWW totally reliant on you alone. That is Google’s ambition.

      Revolution is not far away.

    • http://www.frontlineweb.biz Michael Andrews

      Not if you’re making the dollar, you’re not

  • Anon

    Listen up, Google is a corporation, it’s not a fluffy and cuddly company working for the betterment of the web. Forget the BS “don’t be evil”. Matt Cutts is the equivalent of the PR goon for an evil despot living in a cave and stroking a cat while his minions build a death ray.

    Everything Google does is managed, manipulated and planned to increase either profits or control. This “accidental” threat to natural linking is not an accident, they have thought this through and they realize that the best way to gain complete and utter control is to dismantle everything the WWW is built on, to replace it with Google.

    This is a domination game, a destruction of the WWW piece by piece until the only thing left is Google. Organic linking will continue to be attacked by Google until every single site out there is FORCED to pay it for advertising.

    Watch what happens in about a year or two, when Google announces that it’s attacking social media linking and accusing people of spamming to excuse their domination of that too. Google is looking to make every single site absolutely reliant on it and its ad network. In ten years we will all be forced to spend thousands a month on Google ads if we ever want to get good traffic.

    • anon

      I think that more and more informed people are turning against Google.

      Governments (praise Germany) are obviously looking to take Google down a peg or two.

      The kind of domination you refer to is undoubtedly what Google have planned but I do not think they will achieve it – as they get bigger and bigger more people will hate them.

      There is already a lot of resentment outside of the US from people who question why they should pay a US corporation to advertise to people in their own country – there are many, many rats just looking to jump ship, and I am one of them. I think it sucks that here in N. Europe I pay a US giant to advertise my business – I desperately want to pay a European company, and I will do so at the very first opportunity. This is not an anti-American message – I read and hear Americans saying all the time “buy American”; well, I feel the same way, only my version is “buy European”. I am saying nothing more than the average American.

      Specialist websites offer advertising – they have the advantage of having a specific reach. The only thing that stops me using them is that they do not offer pay per click (I have to pay upfront) and do not offer sufficient tracking mechanisms to ensure I can analyse what I am getting for my money. The instant these specialists (and we are talking of major corporations) start offering me pay per click and sufficient analysis of results, I will jump ship. I do not want to pay Google a penny more than I have to – I hate paying them and will need absolutely zero persuasion to drop them as soon as it becomes possible.

      The large corporations I am talking of are starting to take note and I do expect to see adwords style options in the next few years.

      The thing we need to see here is more and more specialist websites providing independent advertising options in a standard AdWords style – then we will see where loyalties really lie.

    • djemir

      Similar to what Facebook has done by decreasing the number of people who can view your business page posts, then adding the “Boost visibility” button you have to pay for. Basically trying to force you to advertise to get the same visibility you used to get free from all your fans and subscribers.

  • http://www.couchespouradultes.fr/ Patricia

    I have had examples of real links shown as spammy. I have a customer whose site is dedicated to Alzheimer’s disease in France, and it had a link with the anchor text “Alzheimer” from a big pharma site. It was a totally legitimate link, not bought, and it was pointed out by Webmaster Tools as fraudulent + the customer’s site vanished far, far down in the results. It took us numerous reconsideration requests to get someone at Google to accept that this link was totally legitimate, and I had to make lots of noise on many forums to bring attention to this case. I wrote an article about it: http://www.sitepenalise.fr/quand-google-se-trompe-liens-frauduleux-qui-sont-legitimes/ (sorry, it’s in French). Of course the link profile of this customer was not very clean (a previous SEO attempt in the past generated many bad links), but come on, it was so obvious that this link was legitimate; there is no way you could fake that one…

  • http://www.givemedeals.com Shabu Anower

    This is getting worse day by day, nobody wins :(

  • John Hogan

    Scrapers, spammers, link farms, and googledegoop for content are all bad. Some are good to go in Google’s eyes while others are not. No logic, just true. With the ‘Net Neutrality’ bill going nowhere and the Fed wanting full control over the web, the problems we have today have just begun!

    Sometime sooner than you may wish to believe, you will not even be listed in Google unless you are on their A team of approved sites OR have opted for an expensive Google ad campaign. Think that will NEVER HAPPEN? Better think again and read Google’s policies a bit more closely, as NO ONE FOLLOWS THEM at this point, which will likely put you in the back of the pack at whatever point THEY decide is the right time to push it.

    The web is quickly becoming the ‘Corporate Web’ and that is exactly where the fed and Google would like to see it. Not much money in it for them for helping the small business people (never was).

    Google is NOT the only search engine ya know. There are better ones and more on the horizon every day.

  • djemir

    I’m done with Google; their search results are actually getting worse, in my opinion, because of all their antics to make them better. I use them more for image search now than anything else, and once other search engines’ image searches get better I may drop them altogether. I still link to sites and have links to my sites, and I could care less about their warnings. They are bipolar anyway: one day it’s more links, the next day they penalize links, then they’ll change their minds again. The whole point of the internet is to connect. If they discourage this then they need to relook at what they are doing. I have a feeling people will soon just get tired of chasing Google and just say F.U., I’ll get my traffic another way. I would rather have a ton of links pointing at my site and pick up small traffic from each one that amounts to large targeted traffic than put all my proverbial eggs in one basket and expect to always maintain a ton of traffic from just Google. I used to get 120,000 unique visitors per month and am down to just 30,000, partly because I haven’t been able to update the site in a while and partly because of algorithm changes that have dropped my search results, which is why I’d rather not count on a volatile source of traffic from now on and will instead focus on building multiple streams of traffic and income. – http://www.djemir.com

    • Shunyata

      I suggest that we start using multiple search engines, and only use Google when there’s no way around it. I’ve set Google as the default search engine for Chrome, Bing for Firefox and Duck Duck Go for Safari.

  • http://zionwp.com/ Chris Bunting

    Well, as I’ve seen in the last 15 years of SEO, sometimes when you jump on the “hype” bandwagon, you will eventually get bounced off!

  • Mike C

    I get many, many emails from people saying they want me to remove links from my sites as a result of Google telling them these are bad links. ALL MY LINKS have come from my own spidering of the web, the same thing that Google does. NONE OF THEM are paid, spammed or otherwise created. I have NEVER bought or sold a link. Strikes me as predatory monopolistic practices in restraint of trade… but try getting anywhere with that argument…

  • Shunyata

    Come on, spam is a cheap shot here!

  • http://www.tallerlaguarderia.com/ Taller La Guardería

    Google needs to rethink its services and meet the demands of users. SEO is also affected by these irregularities; in the end, Google will only be used for the simplest things, like image search. Sometimes it’s hard to find information, when it should be more immediate.

    Greetings! Parking Atocha

  • http://www.biready.com.au iannicholson2000

    Google has become the self-appointed police, judge, jury and executioner of the internet – and all because we allowed it to be this way. Don’t get me wrong, Google is a great company that has done much for the internet, but I grow increasingly disturbed by these stories about how it wields its power.

    But there is a solution, and it is the same solution for any consumer that has grown tired of its supplier – leave and find another.

    Impossible? Not at all.

  • Tony Caravan

    The aforementioned Google “penalty” actually ruined one of our online businesses. We built a site called Blarchive (Blarchive.com) that was an indexed and categorized listing of reviewed websites; but because it provided URLs, it was/is considered a “link-back” site by Google, even though it is not reciprocal.

  • http://mikecurleymusic.com/ Michael David Curley

    I fail to see how so many people can fail to expand in 140 words, and not provide some final accreditations to exactly who they are. http://mikecurleymusic.com . Tags are business cards, as are, moving forward, the signing of signatures on letterheads that provide us information above and beyond what we already have. If a person wishes to inquire further there’s a finite fifty-fifty chance. Saves endless days of wasted mail, use of plastics, and getting stuck at airports for absolutely no reason at all.

  • justinef

    I’m sorry, but I’m gonna go with: Rand got what he deserved. Honestly, when will people, no matter how successful, understand that Google is a cut-throat business FiRst -and a web advocate, like, 30th?
    How could he email the spam director -or anYOne at Google, about possible spam issues; and think that’d be >>anything<< but a red flag?!!?#?
    I Love me some Google products. But, then again, I also realize that I am Google’s product -and so are we all.

    There is no “In Google We Trust.” -Frankly, there never was.