Google Algorithm Updates: The Latest Things To Consider

Google algorithm “transparency” continues

Google has been making a big deal about wanting to be more transparent about its search algorithm lately (without revealing the secret sauce too much of course). And so far, I have to say they’re making good on that promise fairly well.

Is Google being transparent enough for your liking? Let us know in the comments.

We’ve seen plenty of algorithmic announcements from the company over the course of the year. In November, they discussed ten recent changes they had made. Here’s a recap of those:

  • Cross-language information retrieval updates: For queries in languages where limited web content is available (Afrikaans, Malay, Slovak, Swahili, Hindi, Norwegian, Serbian, Catalan, Maltese, Macedonian, Albanian, Slovenian, Welsh, Icelandic), we will now translate relevant English web pages and display the translated titles directly below the English titles in the search results. This feature was available previously in Korean, but only at the bottom of the page. Clicking on the translated titles will take you to pages translated from English into the query language.
  • Snippets with more page content and less header/menu content: This change helps us choose more relevant text to use in snippets. As we improve our understanding of web page structure, we are now more likely to pick text from the actual page content, and less likely to use text that is part of a header or menu.
  • Better page titles in search results by de-duplicating boilerplate anchors: We look at a number of signals when generating a page’s title. One signal is the anchor text in links pointing to the page. We found that boilerplate links with duplicated anchor text are not as relevant, so we are putting less emphasis on these. The result is more relevant titles that are specific to the page’s content.
  • Length-based autocomplete predictions in Russian: This improvement reduces the number of long, sometimes arbitrary query predictions in Russian. We will not make predictions that are very long in comparison either to the partial query or to the other predictions for that partial query. This is already our practice in English.
  • Extending application rich snippets: We recently announced rich snippets for applications. This enables people who are searching for software applications to see details, like cost and user reviews, within their search results. This change extends the coverage of application rich snippets, so they will be available more often.
  • Retiring a signal in Image search: As the web evolves, we often revisit signals that we launched in the past that no longer appear to have a significant impact. In this case, we decided to retire a signal in Image Search related to images that had references from multiple documents on the web.
  • Fresher, more recent results: As we announced just over a week ago, we’ve made a significant improvement to how we rank fresh content. This change impacts roughly 35 percent of total searches (around 6-10% of search results to a noticeable degree) and better determines the appropriate level of freshness for a given query.
  • Refining official page detection: We try hard to give our users the most relevant and authoritative results. With this change, we adjusted how we attempt to determine which pages are official. This will tend to rank official websites even higher in our ranking.
  • Improvements to date-restricted queries: We changed how we handle result freshness for queries where a user has chosen a specific date range. This helps ensure that users get the results that are most relevant for the date range that they specify.
  • Prediction fix for IME queries: This change improves how Autocomplete handles IME queries (queries which contain non-Latin characters). Autocomplete was previously storing the intermediate keystrokes needed to type each character, which would sometimes result in gibberish predictions for Hebrew, Russian and Arabic.
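
One of the more concrete items in that list is the change to page titles based on de-duplicating boilerplate anchors. As a rough, purely illustrative sketch (the function below and its counting scheme are my own assumptions, not anything Google has published), the idea is that each distinct anchor text pointing at a page counts once as a title signal, so a sitewide link repeated hundreds of times no longer drowns out the handful of anchors that actually describe the page:

```python
from collections import Counter

def title_candidates(anchor_texts, dedupe_boilerplate=True):
    """Toy sketch: rank anchor texts as candidate title signals.

    With dedupe_boilerplate=True each distinct anchor counts once, so
    boilerplate links ("Home", "click here") repeated across a site
    stop outweighing the few anchors that actually describe the page.
    """
    normalized = [t.strip().lower() for t in anchor_texts]
    if dedupe_boilerplate:
        counts = Counter(set(normalized))  # every distinct anchor counts once
    else:
        counts = Counter(normalized)       # raw counts: repetition dominates
    return counts.most_common()

anchors = ["Home"] * 200 + ["google algorithm updates roundup",
                            "list of recent search changes"]
print(title_candidates(anchors, dedupe_boilerplate=False)[0])  # ('home', 200)
print(title_candidates(anchors, dedupe_boilerplate=True))      # each counts once
```

The real pipeline obviously weighs many more signals; the sketch only shows the de-duplication step that the announcement describes.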

Now, they’ve put out a similar post on the Inside Search Blog, revealing ten more changes that have been made since that post.

We just announced another ten algorithmic changes we’ve made! Read more here: http://t.co/VYIow0z8 (via Twitter)

Google lists them as follows:

  • Related query results refinements: Sometimes we fetch results for queries that are similar to the actual search you type. This change makes it less likely that these results will rank highly if the original query had a rare word that was dropped in the alternate query. For example, if you are searching for [rare red widgets], you might not be as interested in a page that only mentions “red widgets.”
  • More comprehensive indexing: This change makes more long-tail documents available in our index, so they are more likely to rank for relevant queries.
  • New “parked domain” classifier: This is a new algorithm for automatically detecting parked domains. Parked domains are placeholder sites that are seldom useful and often filled with ads. They typically don’t have valuable content for our users, so in most cases we prefer not to show them.
  • More autocomplete predictions: With autocomplete, we try to strike a balance between coming up with flexible predictions and remaining true to your intentions. This change makes our prediction algorithm a little more flexible for certain queries, without losing your original intention.
  • Fresher and more complete blog search results: We made a change to our blog search index to get coverage that is both fresher and more comprehensive.
  • Original content: We added new signals to help us make better predictions about which of two similar web pages is the original one.
  • Live results for Major League Soccer and the Canadian Football League: This change displays the latest scores & schedules from these leagues along with quick access to game recaps and box scores.
  • Image result freshness: We made a change to how we determine image freshness for news queries. This will help us find the freshest images more often.
  • Layout on tablets: We made some minor color and layout changes to improve usability on tablet devices.
  • Top result selection code rewrite: This code handles extra processing on the top set of results. For example, it ensures that we don’t show too many results from one site (“host crowding”). We rewrote the code to make it easier to understand, simpler to maintain and more flexible for future extensions.
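
The last item mentions “host crowding”, which is one of the few pieces of this machinery simple enough to sketch. The snippet below is a hypothetical illustration rather than Google’s implementation (the cap of two results per host is an assumed value for the example): it walks a ranked result list and keeps at most a fixed number of results from any single host.

```python
from collections import defaultdict
from urllib.parse import urlparse

def limit_per_host(ranked_urls, max_per_host=2):
    """Toy 'host crowding' filter: keep results in their ranked order,
    but allow at most max_per_host results from any one host."""
    seen = defaultdict(int)
    kept = []
    for url in ranked_urls:
        host = urlparse(url).netloc
        if seen[host] < max_per_host:
            kept.append(url)
            seen[host] += 1
    return kept

results = [
    "https://example.com/a", "https://example.com/b",
    "https://example.com/c", "https://other.org/x",
]
print(limit_per_host(results))
# ['https://example.com/a', 'https://example.com/b', 'https://other.org/x']
```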

Seeing these 20 tweaks listed together, all made within the past month or so, really puts into perspective just how much Google is adjusting the algorithm. That doesn’t even include the integration of Flight Search results, announced after these updates.

Google also points to the recently launched Verbatim tool, the updated search app for the iPad and the new Google bar as other recent changes to be aware of regarding Google search.

Google says all the time that it makes over 500 changes to its algorithm each year, and that it has over 200 signals it uses to rank results. There is always a possibility that one of these changes or signals can have a major impact on your site, as many have found out this past year with the Panda update.

Even a huge company like Yahoo is at the mercy of Google’s algorithm when it comes to search visibility, and it just made some big adjustments with Associated Content, not unlike what Demand Media has done this year.

Last month, Google also indicated that it is testing algorithm changes that will look more at what appears above the fold of a webpage.

We’re getting close to a new year, and there’s no reason to expect Google’s changes to slow down. Google has been clear, however, that it aims to be more transparent about when these changes occur and what those changes are. Granted, this transparency will only go so far, because Google will not make all of its signals known and leave its results too open to gaming. That wouldn’t be good for anybody (except maybe Google’s competitors).

Google does say that these lists of algorithm changes are now a monthly series.

What do you think about the latest changes? Good or bad? Let us know in the comments.

  • http://www.ticobooking.com Luca

    I’m really happy about the first, second and especially the last changes in the list.
    Related query results refinements: It was about time Google understood that while searches might look similar, people might actually be searching for completely different things.
    More comprehensive indexing: Great content commonly goes undiscovered or doesn’t rank simply because it isn’t part of a popular site. Hopefully this will change things and smaller sites with great content will have a chance in the SERPs.
    Top result selection code rewrite: Big brands are monopolizing the top 10 SERPs. Hopefully this will change things, especially in the holiday season, where smaller stores didn’t have a chance!

    I sincerely hope these changes will affect the entire Google index and not just a small percentage of the SERPs.

    • http://www.PlacesToEatOkay.com Steven

      I agree, I see that related query refinement to be a big plus, at least in terms of what I think it means for the rankings of my site. Also with more comprehensive indexing it should be interesting to see if that will actually push Google back in the good graces of the world again. I mean honestly we all know why Google is pushing this transparency thing right now. They’re being questioned by the US Government over their business practices. I think if there is a company that should be pressured into being more transparent it’s Google. Finally they can actually be more transparent and not just talk about it, like they used to do all the time and do nothing. Remember?

    • http://manlyelectronics.com.au DimitriAu

      It is like updating an American hamburger – Hurray! We improved the meat-processing algorithm and made it transparent. We stopped mixing pork and bull intestines. Now you can taste whether it is bull balls or a tail in real time!
      But it is still a hamburger. Google has no brains and balls to make anything else.

      • http://www.LAokay.com Steven

        Actually they have made improvements over the years, so it’s not the same hamburger anymore. It’s evolved into something you’d find in a 3-star restaurant, not your average 1-star burger shack, though not quite a 4- or 5-star restaurant where you get the best dining experience you can get. But I think they’ve become arrogant in the changes they want to make. I see them trying to mix giving the visitor what they want with monetizing it better, as well as giving the publishers something too. They’re in the middle of this ongoing fight for traffic and visitors and money.

        The problem is some of their products do very well, such as search, but others such as place pages aren’t showing much growth, and instead of Google playing by its own rules, it has been caught favoring its own sites over anybody who is competing. In many cases they have stolen information and used it in deceptive manners (like what happened with Yelp.com and their fight with Google over using their members’ reviews on Google place pages). If Google were to actually level the playing field, as they seem to say they are doing with their algorithm refinements, they would also need to force their own sites to do the same.

        The problem is Google is not creating this content, it’s mostly the public, so there is not any good SEO being done to 99.99% of these pages, and therefore they would not rank higher than a page that is more relevant for the same query. Let’s face it, they’re not Wikipedia. They have an inherent problem with rankings on their own sites because they do not create the content themselves. Who would know SEO better than Google, since they write the rules (at least for their search engine they do)? Google spends most of its resources on search, then next in line is advertising, and then you have everything else sharing what’s left over. They simply cannot compete with a site that puts everything into a smaller niche than all of the internet, like Yelp does with just businesses.

  • http://advancedmarkettraining.com Ken Nadreau

    I think the most important changes for me are the first and second ones, the related query refinement and indexing. They should make it easier to rank for long-tail phrases without having to compete with the base keyword.

    But I think the last one is going to hurt a lot of people who have blogs with a lot of content based only on a few keywords.

  • http://www.11b.me/ neil

    The first, second and last updates are interesting, and it’s about time to update the auto-complete; there have already been many jokes about it.

  • http://www.e-zu.co.uk/mimecast/ Lisa

    These Google updates are a major change in search. They need to be studied, and in the coming year it will be very interesting to see how SEOs are going to react. Google has given importance to branded, authority sites and has targeted scraper and duplicate sites. It has given more importance to content.

    What about those sites that don’t have content but have great backlinks pointing to their website? I mean the link building is where Google needs to keep an eye out.

    If Google can figure out the backlinks, then much of the spam can be eliminated.

  • http://www.mojomediagroup.com.au Damian

    The evolution continues….goodness

  • http://www.webtrooperz.com Internet Marketing

    Not a major change… the first two points are really good modifications. The possibilities for similar keywords have been increased.

    The last one really hurts me…

  • http://www.mariadelao.net/ Arturo | Semillas autoflorecientes

    The almighty Google switches back and forth; we’ll never hold a position with such frequent changes.

  • http://www.michaelrurupandersen.dk Michael Rurup Andersen

    It’s definitely a good thing that they upped the frequency and the transparency regarding the updates. Yet it doesn’t answer all of one’s questions.

  • http://edgehypno.com/ Graham Howes

    As a busy hypnotherapist I get sick and tired of having to constantly tweak my page, add content and pay shedloads for SEO or backlinks or whatever. I wish we would go back to the basics: someone is looking for a weight loss specialist using hypnosis in the Ipswich area and can just simply find me. AdWords is a joke – it started out as an affordable £30 a month and rapidly went to 300 a month for no real return. Personally I am getting sick and tired of it – I think it is all a gigantic con – why we ever allowed it to happen is beyond me… your competitors spend out on “being on page one of Google” so you feel obliged to do the same, or slip off page one to oblivion and see other people with not just ONE but multiple entries on page one, almost determined to quash all opposition. At least Google is starting to focus on CONTENT – that is one big step forward!

    • http://www.brightermedia.co.uk Brighter Media

      Graham, I couldn’t agree more. These constant changes severely disadvantage small businesses, with the end result that Google isn’t listing the best companies, or even the most relevant ones, but those who have the time and money to play the game. This is particularly true when it comes to local search. As you say, at least there is now a focus on content, but even there the little guy is disadvantaged, because the emphasis on fresh content puts even more demands on website owners to invest even more time in their website. I can’t help thinking that Google could find better ways to penalise companies that are trying to manipulate its results.

      • http://www.huntervalleyhampers.com.au Gourmet Hampers

        I agree completely – it is a complete distraction for an SME to get caught up in the google roundabout

  • http://www.mycashforums.com matt

    The first, second and last updates are interesting, and it’s about time to update the auto-complete.

  • http://www.electric-reviews.org Mark Demers

    I’m all for better search results, so if Google thinks they’re doing better with the new changes, all I can say is good, because I search for things all the time, like an official page result, and I have found they were sometimes really hard to find.

  • http://www.wildflowersaromatherapy.com/en/index.html Peter

    With the new Google updates and the emphasis on freshness of the site:
    Would republishing my sitemap on a regular basis count as “fresh content”, since the last modified date would show up as the current date?

    • http://marketsecrets.biz Caleb

      Sounds like you may just have found a small loophole here ;)

      • http://www.bigdogstuff.com BigDog

        Loophole, yes, at least a small one, but it’s not such a new idea, is it? Sitemap freshness is always a good idea; hopefully the site has changed enough to warrant a new sitemap too.


  • http://rs-geo.orgfree.com/ Sergey

    Google, being the market leader, wishes to divide sites into: 1. sites with the rich content the user needs, and 2. sites that are only a design product and have no great importance for the user. The change in Google’s algorithms limits and lowers the role of the artistic, visual component of a site. The role of content quality rises, and the role of a site’s semantic substance rises. That is, of course, correct. After all, if a person wishes to look at beautiful pictures, it is better for them to go to a museum or an exhibition of paintings. Google is, in effect, saying “keep it simpler…”. By the way, as someone who does off-site optimisation of sites in RuNet, I like this approach. It becomes easier to promote sites with good, exclusive content. Sergey Rodionov

  • Guest

    Personally, since the changes in the summer I don’t give a Rat’s A$$ anymore! I’m tired of worrying over rankings, page position and associated BS due to changes. There are only so many hours in a day to monitor their great-idea improvements and waste money on SEO that won’t apply after the next great idea.

    One could spend a fortune on SEO and never get desired results because Google will make new improved changes because that’s what “they think” customers want. Thinking isn’t knowing.

    And I agree with Graham below Adwords is a joke, a bad joke.

    If google wants to do something constructive how about fixing Google Chrome aka Google Crash, it’s worse than Internet Exploder !

    OK, I’m done ranting; guess I’ll go make a few changes on my site.

  • Tracey

    I’m concerned about their new parked domains classifier. I have 5 domains under my primary hosting account, and would like to add two more domains for a new business I am starting this month. In my Cpanel it shows all of these domains as parked and also add-on domains. Now I’m concerned that my two new business websites will not get listed in Google if I set them up this way. Any ideas on the best economically feasible way to do this? Do I need to buy separate hosting for each website I own now? Thanks for any information anyone may have.

    • http://www.tipsinablog.com Daniel

      Tracey,

      I would not think that could even be possible.

      If, due to this recent Google algorithm change, webmasters (site owners) were required to host each individual site under separate hosting, Google would become a social pariah (outcast).

      So, I would not think this is what Google is aiming for.
      Why would Google jeopardize their entire business model, through overextending their wish to clean out the low quality spam sites?

      For example: Many people have up to 50 sites(IM, Niche Marketing, etc), all under the one Hosting company.

      What I think they are targeting, is what Caleb mentions below.

      There are many sites (even within IM and niche marketing) that are almost like mass-produced, cheap-quality one-pagers with links shooting everywhere. They usually aim back to the flagship site and, I imagine, also link to many of those cheap sites.

      Before I started my own website(Blog) I kept bumping into these sites, all over the place. Many have a strikingly similar set up, filled with lots of links and sales pitches.
      When I visited these sites, I was redirected all over the place.

      Many sites are now including content links (within the content body) that have nothing to do with the post topic, where you are then sent to some silly product sales page (a product that has nothing to do with the topic).

      I am seeing this on some mid to medium high level sites.

      Even outside of IM(Internet Marketing) and Niche Marketing, there are also a number of websites and Blogs who are doing exactly the same thing.

      They are following the “make 1-to-50 new sites a month” type of system.

      Some of these sites slip up near the top of the Google search rankings, though, I think Google will soon catch on to them in time.

      I will add one other thing.

      If you do have multiple sites, and those other sites (parked or not) are quite poor quality and have not been worked on for some time (possibly one or two pages), they may get the Google algorithm pinch, also.

  • Mel

    If Google wanted transparency it would provide a written script for the non-advertisers to follow. The knowledge gained would level the playing field and cause chaos within the listings. They obviously are aware of this, purposely created fog that constantly changes in order to create confusion, thus allowing them to do as they wish. Yet another tribute to their genius.

  • http://affordableweddingminister.com Susan

    For a beginner like myself, coming up in the search engine is a matter of being able to pay my bills. I have forced myself to learn about SEO and other skills for my own business. We can’t afford to pay for much advertising or professional help.
    Google is simply a money-making machine. It started out as a tool and now seems driven by greed like the rest of the world. Because a low-budget advertiser like me can’t afford to buy the first page, the BIG DOGS get the step up. And the shareholders get a big fat wallet.
    As much as I hate to admit it… my business still needs Google – so my ears will be open to any free advice they can give. NO, THEY ARE NOT TRANSPARENT ENOUGH!

  • Milo99

    My site has been online since 1998 and is a high-level .com domain. I have been number one in search results for my site’s title for almost 9 years straight. After Google made these changes I fell off the map, and they replaced my number one spot with the same-titled site, except the .net version. This site is a blank white page and has zero content. Number 2 is now a Facebook page that mentioned MY site, but my site is nowhere to be found? This makes no sense. They say they are about fresh content, yet they are giving the top spot to a site with none. My site updates every day. Most of the other sites in the rankings are there because they reference MY site in their descriptions and titles. This will put me out of business in less than 3 months if it doesn’t change.

  • http://manlyelectronics.com.au DimitriAu

    I like that the new algorithm gets rid of duplicate pages from my competitor. But Google still cannot tell the difference between genuine links and links from link farms and hacked, spammed pages.

  • http://marketsecrets.biz Caleb

    Three things immediately jumped out at me from this update: (1) it will be more advantageous to target longer-tail keywords, (2) the more current your content is the better, and (3) the end is coming for duplicate or cloned sites, or those auto-blog type sites that merely post a snippet of your content that links back to your main site ;)

    What I would like a better understanding of is just what makes a site more “official” in Google’s eyes, because according to the official page detection update that would be the ideal thing to optimize for.

  • DougC

    I have now stopped using Google completely. I dropped advertising and I no longer use it to search with. Enough is enough. No more hoops for this guy.

  • http://www.CaptainCyberzone.com CaptainCyberzone

    I’m liking! So far this month my hits have ticked-up as well as my Adsense revenue.

    P.S. To Google: I was testing “+1” on one of my other sites (a weather site) and “no one” clicked it! I removed it. I think that you were a little late to the table with this one.

  • http://www.tipsinablog.com Daniel

    During the period in which these recent Google algorithm changes have been put into motion, I have been doing a fair amount of on-page optimization (editing, sometimes re-writing the entire post), amongst other tweaks, with the aim of better search results.

    I have seen some improvements for some of my pages, which are targeting low to mid level competitive keywords.

    Other pages aimed at much more competitive keyword terms, are still a fair way back from page one.

    So doing some polishing up on our sites definitely does make a difference.

    Also, I noticed a site I visit and comment on regularly, a site that has been doing incredibly well targeting some really high-level (traffic and cost) keywords, is now doing even better since the recent Google algorithm changes.
    This site literally has multiple (sometimes three) of its pages on Google page one for some ultra big-time keywords.
    This particular site is always improving through the recent algorithm changes, as far as killing it in Google search rankings is concerned.

    Now I need to go and study this site, and discover their secret formula…….

  • http://welcometoblackfriday.com Kewell

    Thank you for the information about the Google algorithm updates.

  • J Pub

    As Google implements more “lexical algorithms”, one thing is becoming glaringly “obvious”: the Google search quality team’s poor English skills and vocabulary. This is making poorly written articles and websites with spun content rank higher.

    Also, the Google search quality team should understand that when we search for “architects in city”, search results should show the real websites of architects, not the cluttered yellow-pages sites listing architects.

  • http://nyayesh.blogsky.com/ ya-ke

    Specialized mobile phone repair training

  • http://www.huntervalleyhampers.com.au Gourmet Hampers

    The idea that some parked domain is still sitting on the first page is a real anomaly that should have been fixed a while ago.
    It will be a definite improvement.

    All algorithm updates are helpful at refining and decluttering the less relevant sites.

  • Gert

    It all makes no sense as long as Google allows websites to do keyword substitution in its search results. Whatever keyword you are looking for, tens of irrelevant sites suggest you buy your keyword, sometimes even including a fake price. As soon as a search string contains a purchasable item, Google’s results are predominantly spam!

  • http://www.song4world.com/ omer faruk

    I agree, I see that related query refinement to be a big plus, at least in terms of what I think it means for the rankings of my site

  • http://theyachtowner.net Daniel Mihai Popescu

    More than interesting, :). The “Panda Update” decreased my traffic fourfold. Can you imagine? Fourfold?

  • http://thailandnightlifecentral.com/ max meier

    Everyone who works in the IT business knows that you should not change a functioning system continuously and too often. What Google is doing is exactly the contrary of what they should do. As a result I have seen many problems: irrelevant results, continuously different outcomes for the same query, etc. They should change their stuff no more than every 6 months; anything else leads to disaster. The thing is, Google doesn’t care much about the people on the other end, e.g. AdSense publishers. It sometimes looks to me as if they have lost contact with the outside world, just working within a closed circuit.

  • http://www.studioartistx.nl Alexander

    Not sure what to think about these changes and how much they will benefit average pages with good content. Time will tell.

  • http://www.needtube.blogspot.com Roxen Web

    Great Google Algorithm Updates. I like it.
    Global SEO Services

  • http://www.hedgehogdigital.co.uk/ SEO Bedford

    “Google does say that these lists of algorithm changes are now a monthly series.”

    As if I wasn’t busy enough, now I have to worry about algo changes on a monthly basis. How nice. Thanks Google. LOL

  • http://www.mathspeople.com Sunil Sharma

    I don’t know how it will affect new websites, but it will surely help net users. We can hope it will work like the Panda effect, when big sites had to give up market share to small but valuable websites; before that, whenever we wanted to search for anything, it was impossible to find useful sites and we were served the same big names again. Hopefully this time Google will repeat the same benefit, for both content-worthy websites and people looking for answers to their queries.

  • http://www.travellerbase.net John

    Why not? It is Google.

  • http://www.bluemonkeyweb.co.uk Andy

    I think it’s great news for everyone involved with the web industry. Knowing more about how Google determines its results will help SEOs and web designers build better, more relevant sites in the future.

  • http://the-best-denver.com Leonard

    I see a few updates that SHOULD help my sites and others but the complaints seem to point to the opposite of what Google intended. My newest site allows free full page directory listings but the SERPs still bring up large directories that use snippets and no detailed information for searchers.

    And they seem to dominate the top positions even when people are not looking for directory listings for certain topics.

  • Guest

    What Google needs to do right before an update is flush its index of all the cr… well, you know, then re-index everything anew with their great idea, so us regular folks can see where we really stand.



  • leeu

    Wow, that will be a good upgrade. A lot of features will be strengthened. It will definitely give an edge over the others.

    Leeu

  • http://tarak-mehta-ka-oolta-chashma.blogspot.com tarak mehta

    thanks for sharing valuable information

  • Duhh

    This is not true, Google is BS. See this article: http://www.popherald.com/news/googles-new-tricky-algorithm/14016/
