10 Ways To Increase Pages Indexed

Or how to make Google pay more attention

For a while now webmasters have fretted over why all of the pages of their website are not indexed. As usual there doesn’t seem to be any definite answer. But some things are definite, if not automatic, and some things seem like pretty darn good guesses.

So, we scoured the forums, blogs, and Google’s own guidelines for increasing the number of pages Google indexes, and came up with our (and the community’s) best guesses. The running consensus is that a webmaster shouldn’t expect to get all of their pages crawled and indexed, but there are ways to increase the number.


PageRank

It depends a lot on PageRank. The higher your PageRank, the more pages will be indexed. PageRank isn’t a blanket number for your whole site; each page has its own PageRank. A high PageRank gives the Googlebot more of a reason to return. Matt Cutts confirms, too, that a higher PageRank means a deeper crawl.


Links

Give the Googlebot something to follow. Links (especially deep links) from a high-PageRank site are golden, as the trust is already established.

Internal links can help, too. Link to important pages from your homepage. On content pages link to relevant content on other pages.


Sitemaps

A lot of buzz around this one. Some report that a clear, well-structured Sitemap helped get all of their pages indexed. Google’s Webmaster guidelines recommend submitting a Sitemap file, too:

· Tell us all about your pages by submitting a Sitemap file; help us learn which pages are most important to you and how often those pages change.

That page has other advice for improving crawlability, like fixing violations and validating robots.txt.

Some recommend having a Sitemap for every category or section of a site.
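
A minimal Sitemap file is just an XML list of URLs. Something along these lines (the URLs, dates, and change frequencies below are placeholder examples) is enough for Google to work with:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-01-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/articles/</loc>
    <lastmod>2008-01-10</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

You can also point crawlers at the file by adding a `Sitemap: http://www.example.com/sitemap.xml` line to your robots.txt, in addition to submitting it through Webmaster Tools.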


Speed and crawlability

A recent O’Reilly report indicated that page load time and the ease with which the Googlebot can crawl a page may affect how many pages are indexed. The logic is that the faster the Googlebot can crawl, the greater the number of pages that can be indexed.

This could involve simplifying the structures and/or navigation of the site. The spiders have difficulty with Flash and Ajax. A text version should be added in those instances.

Google’s crawl caching proxy

Matt Cutts provides diagrams on his blog of how Google’s crawl caching proxy works. It was part of the Big Daddy update, meant to make the engine faster. Any one of Google’s crawlers may fetch a site and send the content to a shared remote cache; the remaining crawlers (like the blog crawler or the AdSense crawler) then read from that cache instead of physically visiting your site. They all use the mirror instead.


Verification

Verify the site with Google using the Webmaster tools.

Content, content, content

Make sure content is original. If a page is a verbatim copy of another page, the Googlebot may skip it. Update frequently to keep the content fresh. Pages with an older timestamp might be viewed as static, outdated, or already indexed.

Staggered launch

Launching a huge number of pages at once could send off spam signals. In one forum, it is suggested that a webmaster launch a maximum of 5,000 pages per week.

Size matters

If you want tens of millions of pages indexed, your site will probably have to be on an Amazon.com or Microsoft.com level.

Know how your site is found, and tell Google

Find the top queries that lead to your site, and remember that anchor text helps in links. Use Google’s tools to see which of your pages are indexed and whether there are violations of some kind. Specify your preferred domain so Google knows what to index.
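
In practice, many webmasters back up the preferred-domain setting with a 301 redirect, so only one version of each URL gets crawled. A sketch for Apache’s mod_rewrite, assuming www.example.com is the preferred host:

```apache
# .htaccess – permanently redirect bare-domain requests to the www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Googlebot follows the 301 and consolidates indexing on the preferred host.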

  • http://www.gatesixhospitality.com Gatesix Hospitality

    Thanks, Jason, for providing such nice information. I would also like to give some information about Google’s indexing.

    Today nearly 50% of traffic is generated by Google. This is a great revolution in the internet scenario, brought about by two PhD researchers, Larry Page & Sergey Brin, who made Google a company on the international scene.

    Google became popular due to its algorithm, worked out by the two founders, and by keeping its presentation very simple, because "the simplest things are sometimes most effective". Google has a very simple interface, without doing any advertising, concentrating only on search services and nothing else.

    In addition to these search results, Google also succeeded in indexing a large number of webpages, approximately 2 billion. Recently it also included new types of documents: Word, Excel, PDF files, PowerPoint slides, and WordPad.

    The algorithm is based on two systems:

    1. a precise analysis of the contents of the indexed pages (keywords, occurrences, positions in the document, type of HTML tag, etc.)
    2. a classification of the pages according to their popularity (PageRank), calculated from the topology of the Web (i.e. the whole structure of the documents and the links between them).
  • http://www.agentur-eckert.de Heinz Eckert

    I’ll study your comment

    "10 Ways To Increase Pages Indexed "

    Best Regards, H. Eckert


  • http://www.huntsvillepr.com Huntsville Pr

    One way to get all your pages indexed by Google is to create an RSS feed for all your menu items. Second, display new articles on your home page via RSS. Third, create a sitemap with all of your pages. Break the sitemap down into categories with no more than 100 links each.
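
    A bare-bones RSS 2.0 feed along those lines might look like this (the titles and URLs are placeholders):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <rss version="2.0">
      <channel>
        <title>Example Site – New Articles</title>
        <link>http://www.example.com/</link>
        <description>Latest articles from Example Site</description>
        <item>
          <title>A new article</title>
          <link>http://www.example.com/articles/a-new-article</link>
          <description>Short summary of the article.</description>
        </item>
      </channel>
    </rss>
    ```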

  • http://www.designaces.co.uk Liam

    I’ve always thought the menu on your site itself is very important. There is no point having a Javascript menu that the Google crawler can’t read and no other way for a crawler to get from page to page on your site. But if you do have a Javascript or even a Flash menu, have a plain HTML version along the bottom of each page. This solves the problem, and of course makes your site that bit more accessible.

    Also, if you are into SEO and do a lot of linking, try getting links pointing back to all your important pages, not just the homepage. That will ensure that PageRank is spread more evenly throughout your website.

  • http://www.designz.org.uk UK Web Design

    Great article! Worth noting that it is a good idea to provide several "paths" for spiders accessing your site. I always favour two sitemaps – both HTML and XML – to be certain that all bots can get around the site.

    And don’t forget to check for broken links occasionally. If the pages of your site are being crawled via maps, this can often mask problems with your site navigation; conversely, a page with no visitors in your stats could point to the same issue.

  • http://www.digitalnasties.com Guest

    So is it true that Google will only index half your site if your PageRank is below 3?

  • http://www.acquiel.com Thomas

    Don’t overlook the basics of good page and site structure — building your pages so they’re not only user-friendly, but search engine friendly too…

    First of all – here’s the most important thing(s) to remember:

    The search engines want your content. Period. Your content is their primary resource for business and revenues. …But if (a.) your website cannot be traversed by automated, text-reading spiders, and/or (b.) your pages have no distinguishing (descriptive) features, then you’re putting up barriers that impact the spiders’ ability (or in some cases, "desire") to index your site and content.

    Here are some tips on page and site structure:

    1. Make sure EVERY PAGE has a unique HTML TITLE — Use a descriptive title that actually relates to the content of that page, and which also provides proper context to readers who first see this title on Google and other search results – NOT on your website… For example, if you’re a political commentary site, writing another political article about President Bush, DON’T assume this title is viewed by your existing readers (such as by using a less formal title like "About Bush and his <whatever topic>"). Instead, make sure every title (and Description, noted below) provides a stand-alone and accurate summary of your content (like "President George Bush and his <whatever topic>" (or "George W. Bush", to avoid confusion with his father)), so that (a.) the title will make more sense to people scanning Google search results, and (b.) Google et al can more accurately index AND catalog your content, and find better matches between your pages and their users’ search phrases.

    2. Make sure EVERY PAGE has a unique, stand-alone summary of the page content in the form of a META description tag.
    If your page is missing a description, then the search engines will try and figure out a summary for you. Read that last sentence again… Yes, it’s true – and this is a significant indicator of how much the search engines want your content. They will do this two ways:

    First, if you don’t provide any META description tag, but your page content is displayed in a fairly straightforward manner, the search engines will make a "guess" as to the topic of the page (using complex keyword-weighting algorithms), and then they’ll extract a bit of text from around the sentence(s) or paragraph that is most weighted towards the topic they *think* you’ve written about. That’s bad. Computers are dumb, and rarely get the topic correct.

    Alternately, if there’s no description and the topic of the page is too hard to figure out, search engines will see if your site has a DMOZ listing and will extract the summary of your website provided by the human editors there. Again, this is bad… Why force them through all this trouble, which will only result in a description that is not entirely accurate, when you could just write a few sentences yourself and put them into the META DESCRIPTION tag?
    Remember, your META descriptions become the "teaser" text you see under the links to your articles or pages on Google and other engines’ search results pages… so be sure that description is adequate to describe the page and your business to someone who has NO idea who you are, and who is NOT already on your website. The description must provide a stand-alone summary for potential visitors who are NOT on your website, and who have no other context with which to associate one article or page from another.
    When a link to one of your pages appears on the search results of one of the major search engines, you had better make sure the title and description are interesting AND ACCURATE enough to make that link look better than the other 20 on that results page they’re looking at.

    Note: Never EVER try and "game" search engines by using descriptions or titles that don’t actually relate to the content on your pages. They’re smart enough to check that now, and it only takes one or two mistakes like this to get your entire website banned from all major search engines.

    3. Use good old HTML hierarchical conventions. The H1 tag must be the first, main visible title, followed by your normal paragraph text, and then also use H2, H3, etc as needed for any subheaders.

    4. Remove (never use) META KEYWORDS tags. The search engines have pretty much ignored them for many years now. Or, to be more accurate: KEYWORDS tags do nothing to add to your favorable scores with the search engines, but they are used as a factor in your negative ratings. It’s best to just remove them and be safe. No reason to tempt fate and mess with something that has clearly been abused since the very start of the tag.

    5. Dynamic drop-down menus, fancy Flash animations and javascript or form-based navigation are not spiderable, and therefore none of that content and links would be found or read by the search engines…

    Search engines generally only follow text (or standard HREF) links, and don’t read inside of javascripts or DHTML menu scripts. You can use CSS visible/hidden based menus, which load all text and links into the source code where it can be read by spiders, or at the very least, add a "Site Map" link into your page header and footer, which links to a page where you have plain, simple HTML HREF links to every single page on your site.

    6. Yes, provide a "Site Map" (either in your own format or Google’s XML format), so Google and others can more easily find all pages on your website (but even this is no guarantee of indexing, because again, if Google and others can’t figure out the topic or any descriptions for your pages, they will not index or properly catalog them).

    7. Dynamic URLs are NOT A PROBLEM for any search engine — **UNLESS** (a.) your URLs and site are designed such that following links might create "spider-traps" for the crawlers (where search engine spiders get caught in an infinite looping of links within your site – such as with calendar links that go to infinite number of future and past months – at which point they will simply abandon your site), or (b.) when dynamic variables are appended to URLs (such as datestamps or session IDs) which would create the problem of duplicate content (where the same page is reachable through different URLs).

    And that’s all… Hope this helps!
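
    As a sketch, tips 1–3 and 5 above might come together in a page skeleton like this (the title, description, and URLs are made-up examples):

    ```html
    <html>
    <head>
      <!-- Tip 1: unique, descriptive title for every page -->
      <title>President George W. Bush and His Energy Policy | Example Politics</title>
      <!-- Tip 2: unique, stand-alone META description -->
      <meta name="description"
            content="An analysis of President George W. Bush's energy policy and its impact on domestic oil production.">
    </head>
    <body>
      <!-- Tip 3: H1 first, then paragraph text, then H2/H3 subheads as needed -->
      <h1>President George W. Bush and His Energy Policy</h1>
      <p>Opening paragraph of the article…</p>
      <h2>Background</h2>
      <p>…</p>
      <!-- Tip 5: plain HREF links as a spiderable fallback to any script-based menus -->
      <p><a href="/sitemap.html">Site Map</a> | <a href="/">Home</a></p>
    </body>
    </html>
    ```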


  • http://newscritique.blogspot.com/ Mohan

    I am no expert, but I found from my experience that the following methods work in combination:

    1. There have to be at least 8 (eight) content-rich pages – not contact, link-to-us, disclaimer, etc.

    2. Each content page should be submitted to Google for inclusion of its URL.

    I found my 3rd or 6th page appearing on top in search results – maybe because it had a better keyword density, etc.

  • http://www.rainbowsherbetboutique.com Mommyprenuer

    I submitted my sitemap to Google but received an error message stating the format was not valid. What next? My site is pretty simple in format, so I wasn’t expecting any problems…

    Great article!  The comments are as informative as the article itself!

  • http://www.colasoft.com/im_monitor Guest

    I think the speed of website updates is very important.

  • http://www.ivolution-seo.com/ Ivolution SEO

    Yeah, everything that’s been written above is spot on and it all counts…

    What it boils down to is the 3 easy steps to get indexed "FACTOR" hehe, sounds eery!

    1. Good hosting company

    2. Google Sitemap

    3. Fresh new content daily (at least for the first week or two)

    Do that, and you’ll not only have all your pages indexed, but you’ll have Google coming back on a regular basis, which means new content will get indexed a lot quicker.

    Hope that helps.

  • thebloke


    I’ve recently added a lot of pages to my site. Once I did this I generated a new Google Sitemap and resubmitted it. Within a couple of days, my indexed pages increased from 43 to 86.

    A couple of days ago, I added another lot of pages to the site. Once again, I generated a new Google Sitemap and resubmitted it. Webmaster Tools now shows under Sitemap Statistics that there are a total of 180 URLs and 86 indexed URLs. I’m expecting all 180 pages to be indexed within the next few days. I’ll let you know what happens!


  • http://seotechnique.blogspot.com Arnab

    In my case I continue with my practice of having two sitemaps. But the main focus remains on the page content, quality inbound links to the pages and, last but not least, the size of the page with a proper title and description.

    This helps me a lot to get my pages indexed in no time.

  • http://www.polarisproperty.co.uk Polaris World Property

    Thanks for the great tips; the page-load thing is something that I had never heard of. I might have to look into that.

  • http://www.vistaphotos.net Guest

    One thing that I’ve seen: I’m a photographer, and I help others with their websites. I know a videographer who has a PageRank of 1, and yet, through the number of links she has from industry web listings, she shows up on the first page when you Google her town and "videographer".

    I was trying to help her out, and now I’m listing myself on these same sites. This works for photographers, videographers, and DJs.

    I also link to strategic partners on my WordPress blog, Vistaphotography, where I chat about the clients I have and the weddings I shoot, and I try to offer useful tutorials on how to use photo-editing software.

    These are all good ideas to think about.


  • http://cozumelmexico.net/ Bob Rodriguez

    I have a very high ranking web site under the search term Cozumel Mexico which is very targeted for my purposes.

    I write web sites that are content driven and revolve around Cozumel Mexico, activities, tours, points of interest, etc.

    When I want a web site to be indexed quickly, I put a link on it with relevant anchor text, and of course it has to be optimized for, and with, similar content that applies to Cozumel Mexico. Car rentals, fly fishing, snorkeling, or whatever is appropriate gets a link on my index page for a short time, and in no time at all I find that it is indexed. It is just that simple.

    I hope that this helps!



  • http://www.peninsulapm.com Rub

    Google is changing the rules so frequently that I suspect nobody can guess how to definitively optimize his/her site.

    I’m always trying to follow the advice in this newsletter; you always publish great tips!!






  • http://www.direito2.com.br Ruben Zevallos Jr.

    I think everything is OK… but to me, what really matters is how many external links your web site has…

  • Pete

    One way that I found seems to have helped is getting indexed by other search engines, especially the older search engines such as AltaVista.
    I’m not sure how it works, but ranking with Google might be connected with some form of competition: if a website is popular on one search engine, this might trigger Google to also index it.
    I can’t see Google sitting on its laurels while other search engines index websites.
    If they did, they might not be worth the advertising revenue, especially if the website attracts thousands or even millions of visits.

  • http://www.fruitsmoothierecipe.bravehost.com/ rx_fruitsmoothies

    I’m still struggling to have my site climb up the search engines, and of course get all my pages indexed, but my site is still in the last spot. How can I improve this?

  • http://www.leatherbull.com AM

    I do agree. Using Google Webmaster Tools helps. I have been using them for about 2 months, and I have seen an improvement in rank since I used Google’s sitemap. (If anybody wants help with Google’s sitemap, let me know. Email: leatherbull@live.com) Please also visit my websites, especially bikers all over the world! http://www.leatherbull.com & http://www.frontiercycle.com Thanks, AM

  • http://www.houseplanet.dj/ houseplanet

    Thanx for the tips! Very useful!


  • Jack Gopher

    Thanks a lot, Jason, for this high-quality article!


    free online Excel tutorial training

  • http://www.webtacs.net/page-rank/ Page Rank Info

    I like these tips; they make pretty good sense. I like this because it is not very common knowledge. In fact, I think I only knew about two-thirds of the topics covered here.

  • http://www.travellinesexpress.com Home Based Travel Agent

    Thanks for the tips. Backlinks are important indeed. A bit of patience and all will be well!

    All the best,


    Self-Help Ebooks

  • http://www.rankbetterseo.com/ Better Search Engine Rank

    I have a client that insists on releasing thousands and thousands of pages at a time, and I have told them that it’s not a good idea. So they launched over 40,000 at once, and it took a while for them to be indexed. The best thing to do in this case is a lot of internal linking from other pages to these pages, to show they are not as spammy as they may look.

  • http://www.MosaicWebsite.com Web Design Pennsylvania

    Will having two sitemaps confuse Google?

  • http://www.riversagency.com website design chapel hill


    I think one of the best ways is to get links back. Getting websites indexed is all fine and good, but there is this notion of the supplemental index, which is index purgatory. The only way out is by getting quality links back to those pages to lift them out of the supplemental index and into the primary index, so they have a better chance of being found.

  • http://www.rankbetterseo.com/ Bill Ross

    If these sitemaps are different, then no. You can use a sitemap index file to reference the two sitemaps. Take a look at sitemaps.org and it will explain this more.
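
    For the two-sitemap case being discussed, a sitemap index file is a small XML wrapper along these lines (the filenames are placeholders):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap-articles.xml</loc>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-products.xml</loc>
      </sitemap>
    </sitemapindex>
    ```

    You submit the index file once, and the crawler discovers both sitemaps through it.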

  • http://www.gradesgrowtutors.com Online Tutor

    Great learning content!  Definitely helps me.  Thank You!

  • http://www.wsop2008.org WSOP 2008

    Excellent article – even more things to consider in optimisation as the process refines itself.

  • http://www.dr-sandy.net BLOGGING GUIDES by Dr. Sandy

    I use three tactics to get my blog indexed:

    1) Register with Webmaster Tools and verify my site. Register with Analytics and insert the tracking code.

    2) Submit a sitemap in Google Webmaster Tools.

    3) Get some quality links.

    I have seen some blogs get indexed within a week using all these tactics.

    • http://www.linkchannels.com/ linkchannels

      I agree with you. You’ve got to build quality links for your site, and Googlebot will love to visit your site regularly.

  • http://www.tmdesigner.it Giuseppe tmd

    Nice work… I use "a bit of everything"… continuous monitoring is extremely important!

  • http://samsmartguy.50webs.com/ Sam

    In my opinion it depends on PR: if the PR of the homepage is high, more links will be followed and more pages will be indexed.

  • http://www.cool-designs.org/mekadem-atarim.php ???? ?????

    If someone has a problem with deep pages being indexed, the best thing he can do is build links to those deep pages.

    A common mistake is to get inbound links only to your site’s homepage.

    Try to get inbound links to deep pages on your site.

  • http://3magin8.com/1 3magin8

    Thanks for sharing these tips!

    Hopefully I can get more pages indexed!

  • http://leveltensolutions.com M M Rahman Maqsood

    It’s really a very helpful article. I like this concept.

  • http://xoomer.alice.it/francesco-forte fraweb

    This article is very interesting

  • http://www.exposedacnereviews.com Exposed Acne Review

    I found that when I launched 10,000 pages at once, a couple thousand pages get indexed initially, but Google later on starts to deindex them by the hundreds. What was weird was my traffic didn’t decrease from long-tail keywords, so I guess they just took out the pages not ranking anywhere.

  • http://www.moovinonup.com/ SEO

    some good advice here thanks

  • http://www.swankigifts.com/article/online online articles

    Another great article jason, I have enjoyed the reading.

  • http://earth4energymanual.net earth 4 energy

    It’s definitely all about unique content.

  • http://www.answerblip.com/articles/entertainment/celebrity celebrity news

    Thanks for the great info about increasing pages indexed… Looking forward to reading more of your articles.

  • http://www.lowongankerjabanks.com Lowongan Kerja Bank

    Thanks for your article. It helps me a lot.

  • http://www.hostingtangguh.com hosting murah

    Very useful article.


  • http://www.8vertise.com Do You 8vertise?

    Thank you for your info. I had just submitted my sitemap.

  • http://www.m4s73r.com/ Internet Marketing Indonesia

    Thanks for your article. It helps me a lot. I will visit the WebProNews site more often. :) Fantastic!

  • http://www.articlesdb.co.cc Alex

    I think internal linking is important to guide crawlers on a site.
