Basics Of Site Architecture

    December 7, 2005

The week chugs on at SES: Chicago, and WebProNews’ own Chris Richardson picked up on some excellent comments from the speakers in one of the final sessions from Tuesday, which focused on “Successful Site Architecture” and how to use architecture to your best advantage with the search engines.

Editor’s Note: Architecture is all about building and design. Discuss your favorite site building tips at WebProWorld’s SES dedicated forum.

Barbara C. Coll, CEO of WebMama.com Inc., moderated this informative session. Three speakers shared their thoughts with the audience. One speaker, Derrick Wheeler, Director of Search Optimization for Digital Impact Inc., provided a lot of great tips.

Wheeler suggested that what not to include is as important as what to include, because search engines read different items differently. For example, session IDs cause different links to be issued for the same page. Search engines can construe these different links to the same page as duplicate content. The same rule applies to sites requiring cookies.
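Wheeler’s session-ID point can be sketched in code. Here’s a minimal, hypothetical example (the parameter names in SESSION_PARAMS are assumptions — check what your own platform actually appends) that collapses session-stamped links back to one canonical URL:

```python
# Hypothetical sketch: strip session-style query parameters so that
# one page maps to one canonical URL. Parameter names are assumptions.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}

def canonicalize(url):
    parts = urlsplit(url)
    # Keep only query parameters that are not session identifiers.
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

# Two "different" links for the same page collapse to one URL:
a = canonicalize("http://example.com/page?sessionid=abc123&cat=shoes")
b = canonicalize("http://example.com/page?sessionid=xyz789&cat=shoes")
```

With the session parameter stripped, both links become the same URL — which is exactly why engines would otherwise see them as duplicate pages.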

He also mentioned that time stamps can create the same problem. Because the time stamps appear in the URLs, you get a different URL each time a page is accessed, which again looks like duplicate content. He also distinguished link types: absolute links have the full path, including the domain, while relative links have just the directories, not the domains.

Weird redirects (root level redirects) can create issues too. If you are taking the root of your domain and redirecting to a longer URL, you’re telling the search engines the content resides at the redirected link and not the root. This can mean the spiders won’t discover your content.

He emphasized making sure the URLs on your secure pages have the full path in them. He also said search engines can follow text links easily because they are consistent in the page code. JavaScript-based navigation can and will confuse the search engines, because the path in a JavaScript URL is not the same as in HTML.

Wheeler talked about the structure of the URL as well. He recommended reducing the number of parameters in the URL because one parameter is good for indexing but more than that can be iffy. Also consider minimizing the directory levels in URLs. Shorter URLs are easier for both surfers and engine spiders.
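As an illustration of Wheeler’s URL guidelines, here’s a small, hypothetical checker. The thresholds (one query parameter, three directory levels) reflect the session’s rules of thumb, not any published search engine limit:

```python
# Hypothetical sketch: flag URLs with too many query parameters or
# directory levels. Thresholds are rules of thumb, not engine limits.
from urllib.parse import urlsplit, parse_qsl

def url_report(url, max_params=1, max_depth=3):
    parts = urlsplit(url)
    n_params = len(parse_qsl(parts.query, keep_blank_values=True))
    # Count directory levels, ignoring the final file name.
    depth = max(len([seg for seg in parts.path.split("/") if seg]) - 1, 0)
    return {
        "params": n_params,
        "depth": depth,
        "ok": n_params <= max_params and depth <= max_depth,
    }

report = url_report("http://example.com/shop/shoes/item.html?id=7")
# One parameter, two directory levels: passes the rule-of-thumb check.
```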

Do a lemon check on potential domain names. Make sure the previous owner didn’t get penalized for bad search engine tactics. Don’t use duplicate content either. Every domain and every page should have unique content.

Stay away from link farms because the links won’t get counted. Every page should have unique attributes, including a page name (don’t have two pages called “links” or “contact us,” for instance), a title, content, meta data, etc. If you’re going to redirect pages to newer ones, use redirects the search engines can follow rather than custom error pages.
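One common way to redirect an old page to a newer one, for example, is a server-side permanent (301) redirect, which tells engines the move is permanent. A hypothetical Apache .htaccess sketch (the file names here are made up for illustration):

```
# Hypothetical sketch: permanently (301) redirect an old page to its
# replacement so engines index the new URL instead of the old one.
Redirect 301 /old-links.html http://www.example.com/resources.html
```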

James Jeude, senior product manager at Ask Jeeves, also threw in some words of wisdom. He said visual relevance is one of the keys: the site’s look and feel should match your target audience. He also suggested page titles and descriptions are important.

He also said to name the media (images, sounds, movies) that appear on your site correctly (file names are important here), and to use good spelling and grammar. All of these are basic, but they’re absolutely integral if you want to be taken seriously.

The final speaker was the director of product management at Yahoo, Rajat Mukherjee. He focused primarily on getting into his employer’s index, but since it’s Yahoo, it’s quite useful.

One important suggestion he made was to optimize for both spiders and indexing: spiders don’t evaluate content; the indexing algorithm does. Also make sure your robots.txt allows Slurp (Yahoo’s spider).
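Mukherjee’s robots.txt point can be shown with a short example. This minimal robots.txt gives Slurp full access while blocking every other crawler from a /private/ directory — the directory name is an assumption for illustration:

```
# Give Yahoo's Slurp full access; keep other crawlers out of /private/.
User-agent: Slurp
Disallow:

User-agent: *
Disallow: /private/
```

A crawler obeys the most specific User-agent group that matches it, so Slurp uses the first group here and everyone else falls through to the second.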

His final recommendations were to always link back to your homepage, and that quality content is essential. If you use GUI-based navigation, it’s important to have a text-based navigation system to accompany it. He also got in a plug for Yahoo’s Site Explorer, although any of the site explorers out there would probably work to delve into your site and return pertinent info.

This session provided a lot of good information for improving site architecture. By cleaning up a site, keeping URLs relatively simple, writing strong unique content, and keeping that content topical, you can easily see noticeable improvement.

John Stith is a staff writer for WebProNews covering technology and business.