InfoQuest! Information Services

Table of Contents

Search Engines Overview | Optimization | Search Techniques | Search References and Resources | InfoQuest!
General Search Resources | Optimization Resources | Submission Resources | Maintenance Resources | SEO Firms

End Table of Contents

Search Engine Resources and References

General Search Engine Resources and References

Search Engine Watch
This site, by longtime search engine guru Danny Sullivan, is the bible for people doing search engine optimization, submission, and maintenance. The site has search engine submission tips; web searching tips; search engine listings; reviews, ratings & tests; and search engine resources. Sullivan puts out a free Search Engine Newsletter with monthly information about what is going on with the various search engines. He also has a subscription-based service that provides more in-depth information than is on the free website.

Chris Sherman's Search Day
Chris Sherman switched from running a websearch site to becoming associate editor of Search Engine Watch. In that role, he produces a daily e-newsletter that combines brief issue articles with news about searching, search engines and directories, optimization, and submission.

Search Engine Showdown
This site has a chart comparing search engine features, a page that ranks search engines by size and other criteria, a page that compares metacrawlers, and more.

About Guide to Web Search
Jennifer Laycock runs the About Websearch site. The site contains information about searching the web, optimizing sites for search engines, and how to submit your site. Laycock also produces Searchlight, an e-newsletter covering search engine and directory news and issues.

Web Search Engines FAQS: Questions, Answers, and Issues
by Gary Price, Searcher, v9#9, October 2001. This article reviews the top search engines, describes their features, discusses the Invisible Web, and provides tips for searchers on how best to use and stay current with the various search tools. Available free on the website for a limited time.

The Art of Business Web Site Promotion
Jim Rhodes of deadlock Design provides information about software to help you optimize and submit, how search engines work, optimization, and promotion. He also has a newsletter and message board.

Search Engines: II. How Software Agents and Search Engines Work
This is WebReference's page on how search engines work, with tips on how to search and on optimization.

I-Search Discussion List: Understanding Internet search technology.
This is a fee-based discussion group for people who optimize websites for search engine success. Members discuss problems, tips, how search engines work (or don't), and other topics about getting a website to rank high in search engine rankings. The fee for joining the discussion group is $19.50 a year, which gives you access to all of the excellent Adventive discussion groups.

Search Engine Optimization Resources

General Optimization

RankWrite Roundtable
Now defunct, this newsletter by search engine optimization experts Jill Whalen and Heather Lloyd-Martin was superb. Each issue covered a couple of questions, submitted by subscribers, about optimizing sites for search findability. The site still contains the archives.

High Rankings Advisor
Jill Whalen followed up RankWrite with the excellent High Rankings Advisor which continues the question and answer format that RankWrite used. The newsletter also features special in-depth articles by search engine experts on topics of interest to the community.

Super FAQ by Danny Sullivan, Search Engine Watch, ongoing updates.
This FAQ answers common questions, including: length of description and keyword metatags; repetition in metatags; location of title; how to get your descriptions picked up by search engines that don't use metatags; stemming and plural use in metatag keywords; frames and metatags; what search engines consider spamming; how often to resubmit; submitting other people's pages; how search engines view subdomains; why websites on free ISPs don't get listed; buying keywords; how much traffic should come from search engines; whether search engines read Flash; redirection; moving domains to different servers; using header tags; and more.
This information is on the fee-based subscribers' part of the SearchEngineWatch site, so you will need to join to see the full article.

Ten Commandments for SEO by Jill Whalen, RankWrite, #49, May 30, 2001
Search engine optimization expert Whalen presents ten commandments for learning SEO from the ground up. She covers issues such as learning basic HTML, writing copy for the web, key phrases, titles, submission and maintenance.

Designing for Search Engines and Stars by Shirley E. Kaiser, Digital Web Magazine, April 2001.
Shirley Kaiser moderates the Adventive i-Design discussion list. In this tutorial she explains the basics of good web design, in the process describing how to design sites that work for both users and search engines.

Keywords and Content

Seeding the Engines: Part 1. by Paul J. Bruemmer, ClickZ, April 11, 2001.
This two-part series breaks down the search engine positioning process into four steps: 1) Analyze, develop, or improve the key phrases used to find your website; 2) optimize your content with these key phrases for search engine compatibility; 3) submit and register your site correctly to ensure the focus is on all your pages, specific to your key-phrase list; and 4) monitor and audit the results. Part 1 discusses the first two steps. Bruemmer recommends first conducting a keyword analysis and describes some of the tools available to do that. He also recommends taking care of the hidden HTML issues such as the title tag, keyword and description meta tags, alt tags, and so on. Part 2 is listed in the Maintenance section.

New Directions in Optimizing Page Content by Paul J. Bruemmer, ClickZ, June 13, 2001.
Web pages need to be optimized on two levels: 1) to gain high rankings in the spider-type search engines such as Google and AltaVista, and 2) to persuade customers to use your products and services. This means that writing page content requires both excellent copywriting skills and current knowledge of search engine algorithm trends. He briefly discusses keyword selection, keyword placement and coding, quality copywriting, site navigation design, and link popularity.

How to Receive a Top 20 Position on The Search Engines! by Terry Dean, webmaster at Bizpromo.
Dean echoes most of the other experts in recommending strong content and identifying the best keywords to use in your page, title, metatags, and domain name and path. His eight tips for high ranking pages also agree with most of the other search engine experts.

Content Strategies for Top Search Engine Rankings by Paul J. Bruemmer, ClickZ, November 22, 2000.
Bruemmer's strategies for optimizing the content of pages include:

  • Use a different title for each web page that begins with one targeted keyword or keyphrase with no more than five to seven words total. Bruemmer recommends not putting the business name in the title tag.
  • Include the title keyword/keyphrase in the meta name description tag. The description should be 25 words or less and be so compelling that the reader wants to visit the page. Use the targeted key phrase only once in the description, with one or two other keywords not included in that page's title tag.
  • Use your best key phrase for the page in the meta name keywords tag. Bruemmer recommends using only one key phrase in the keywords tag.
  • Include your key phrase in the first sentence of content on the page. It should be the same content as your meta name description.
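
Taken together, these recommendations might look like this in a page's head section. This is a sketch only; the business, key phrase, and copy below are invented for illustration:

```html
<html>
<head>
  <!-- Title begins with the targeted key phrase; five to seven words total,
       no business name -->
  <title>Handmade Oak Furniture for Small Spaces</title>
  <!-- Description repeats the title key phrase once, stays under 25 words -->
  <meta name="description" content="Handmade oak furniture built to order,
    including compact desks and shelving designed for apartments and home offices.">
  <!-- Only one key phrase in the keywords tag -->
  <meta name="keywords" content="handmade oak furniture">
</head>
<body>
  <!-- The key phrase appears in the first sentence of visible copy,
       matching the meta description -->
  <p>Our handmade oak furniture is built to order for small spaces...</p>
</body>
</html>
```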

How to Sabotage Your SEO Campaign by Jill Whalen and Heather Lloyd-Martin, Clickz, April 25, 2001
This slightly tongue-in-cheek article presents four ways to ensure that your search engine optimization campaign will fail: 1) don't include key phrases in your copy, 2) optimize key phrases that are never searched for, 3) write copy for search engines that is unintelligible or dull, and 4) redesign your site to get rid of all of the old files and addresses.

Tackling the Frames Dilemma by Paul J. Bruemmer, ClickZ, July 12, 2000.
Bruemmer describes how search engines work in order to explain why correct usage of the NOFRAMES tag is so important when a site is developed using frames. In short, for search engines to properly index your site, you need content and links within the NOFRAMES tag. Bruemmer recommends putting titles and metatags on every frame page, even those that don't appear when the pages are viewed. Internal links should be within the NOFRAMES tag. Make sure a link to your home page is at the bottom of every frame page so users don't get trapped.
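
A minimal sketch of this advice follows; the file names and copy are hypothetical. The point is that the NOFRAMES block carries real, indexable content and links rather than a bare "your browser does not support frames" notice:

```html
<html>
<head>
  <title>Acme Widgets -- Product Catalog</title>
  <meta name="description" content="Acme Widgets catalog of industrial widgets.">
</head>
<frameset cols="20%,80%">
  <frame src="nav.html" name="nav">
  <frame src="content.html" name="content">
  <noframes>
    <body>
      <!-- This copy and these links are what frame-blind spiders index -->
      <h1>Acme Widgets Product Catalog</h1>
      <p>Browse our full catalog of industrial widgets and fittings.</p>
      <a href="content.html">Product listings</a>
      <a href="index.html">Home</a>
    </body>
  </noframes>
</frameset>
</html>
```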

Doorway Pages, Cloaking, and Other Issues

More About Search Engine Spamming by Danny Sullivan, Search Engine Watch, August 28, 2000.
This article describes what you should not do when optimizing your website. Don't use: keyword stuffing, invisible text, tiny text, page refresh or redirects, metatag stuffing, duplicate pages, multiple title tags, cloaking, or mirror sites for domain spamming.
This information is on the fee-based subscribers' part of the SearchEngineWatch site, so you will need to join to see the full article.

Page Cloaking by Danny Sullivan, Search Engine Watch, August 2, 2000.
Marketers who are afraid of their HTML code being stolen may want to try cloaking -- letting search engines see one page and users another. Sullivan warns against cloaking in this article. He says it takes a lot of time to constantly tweak doorway pages or code. The aggressive search engine optimization efforts that cloaking is often a part of can seem like spam to many search engines. The article goes on to describe the agent name delivery and IP delivery types of cloaking. He also provides information about cloaking packages and doorway providers.
This information is on the fee-based subscribers' part of the SearchEngineWatch site, so you will need to join to see the full article.

What Is A Bridge or Doorway Page? by Danny Sullivan, Search Engine Watch, June 2, 2000.
Doorway pages, AKA portal pages, jump pages, gateway pages, and entry pages, are created to do well on search engines for particular phrases. They are easy to identify since they are designed for search engines, not for people. This article describes the problems and issues in using doorway pages to optimize your website, including code swapping, agent delivery, and IP delivery/page cloaking. A significant problem with doorway pages is that when users click on a doorway search result, they don't arrive at the page that will really give them the information they want. They are either automatically redirected to the real page or must click again to reach the page.

More About Doorway Pages by Danny Sullivan, Search Engine Watch, various dates.
This article has a table that shows how several search engines view doorways that have gibberish, cloaking, redirection, or an excessive number of doorway pages.
This information is on the fee-based subscribers' part of the SearchEngineWatch site, so you will need to join to see the full article.

Hiding JavaScript by Danny Sullivan, Search Engine Watch, April 4, 2000.
Putting JavaScript high up in the HEAD part of your website, even if you place it between COMMENT tags, may result in having search engines index it, reducing the relevancy of your other descriptions. The JavaScript may also end up being chosen as the description for your site if the search engine doesn't use metatags. Sullivan recommends moving JavaScript to the bottom of your page where possible. A better solution is to make use of external .js files that contain your JavaScript code and will be loaded only by browsers that understand JavaScript. The NOSCRIPT tag won't work as a way to hide your code -- it will show content to the search engines.
This information is on the fee-based subscribers' part of the SearchEngineWatch site, so you will need to join to see the full article.
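
Sullivan's external-file suggestion is simple to apply. A sketch (the file name is illustrative): the inline script that once sat in the HEAD moves into a separate .js file, so spiders that ignore script files never see the code as page content:

```html
<html>
<head>
  <title>Acme Widgets</title>
  <!-- The JavaScript now lives in an external file; browsers that
       understand JavaScript load it, while spiders skip it entirely -->
  <script src="rollovers.js" type="text/javascript"></script>
</head>
<body>
  <p>Visible, indexable page copy begins here...</p>
</body>
</html>
```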

Dynamic Pages

Search Engines and Dynamic Pages by Danny Sullivan, Search Engine Watch, October 25, 2000.
Sullivan describes the problems that search engines face with dynamic pages. Most search engine spiders won't read past a '?' in a URL because they could get fed an almost endless number of URLs, which would negatively impact both the spider and the host server. The article describes some workarounds, including XQASP for Active Server Pages; an Apache server rewrite module that will remove the offending '?'; reconfiguring Cold Fusion .cfm files to get rid of the '?'; using server-side includes, which usually don't bother search engines unless they have the cgi-bin path in their URLs; and using static pages -- don't use dynamically generated pages unless your site really does change a lot.
This information is on the fee-based subscribers' part of the SearchEngineWatch site, so you will need to join to see the full article.

Dynamic Web Pages: Optimization for Dynamic Web Sites by J.K. Bowman.
This article explains why dynamic websites are difficult for search engines to spider, describes ways to optimize a dynamic website, and shows how to get dynamic pages indexed. Bowman also covers optimizing your site with layers and CSS (cascading style sheets).

Solutions for Dynamic Page Registration by Paul J. Bruemmer, ClickZ, September 20, 2000.
Dynamically generated pages can be a problem for search engines because of the non-alphanumeric characters in their URLs. This article describes several approaches, with examples, for keeping your dynamic pages while still getting picked up by the search engines.

Apache URL Rewriting Engine
This Apache HTTP Server (Version 1.3) module provides a rule-based rewriting engine to rewrite requested URLs on the fly. It includes rewriting URLs to eliminate the question marks that keep many search engines from indexing a dynamically generated web page.
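
As a sketch of how such a rule works (the script name and URL layout below are invented for illustration), mod_rewrite can map a static-looking URL onto the real dynamic one, so spiders only ever request URLs with no question mark:

```apache
# Enable the rewriting engine (Apache 1.3, mod_rewrite)
RewriteEngine On

# Publish links as /products/123.html; internally rewrite them to the
# real dynamic URL, so no '?' ever appears in the URLs spiders crawl.
RewriteRule ^/products/([0-9]+)\.html$ /catalog.cgi?item=$1 [PT,L]
```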

Using Doorway Pages to Register .asp Web Sites by Paul J. Bruemmer, ClickZ, September 27, 2000.
Bruemmer defines a doorway page as any page modified for the purpose of search engine optimization, regardless of whether it is a subpage or additional page of your website or whether it is hosted elsewhere. Other names for doorway pages are gateway or splash pages. Nonviewable doorways hosted offsite are usually managed with an IP delivery system known as cloaking, or with redirect pages. Doorway pages can be very effective in positioning Microsoft Active Server Pages (ASP) websites. A tool called XQASP can convert the parameters of dynamic ASPs into a search-engine-compatible format. Bruemmer shows an example of how XQASP works.

Layout Tables

The typical table layout for web pages, with left-side and/or top navigation bars and content in the middle, usually means that search engines are reading your navigation tables before your carefully constructed, key-phrase-rich copy. Here are some resources to deal with that issue.

Table Trick offers one option for getting your content where the search engines will use it, without destroying the look and feel of your site.

Table Trick Sample Code
The US Department of Energy, Office of Industrial Technologies offers this version of the table trick.

Invisible Table Trick
Creating Killer Web Sites offers a number of tips for using tables to control your design.

Table Trick
Words in a Row offers the standard table trick, along with some other design tips.
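
The trick these resources describe generally works the same way: a tiny empty first cell lets the content cell come before the navigation cell in the HTML source, while the rendered page still shows navigation on the left. A minimal sketch, with invented links and copy:

```html
<table>
  <tr>
    <!-- Tiny empty spacer cell occupies the top-left corner -->
    <td width="1"></td>
    <!-- Main content spans both rows, so it appears first in the source
         and spiders read the key-phrase copy before the navigation -->
    <td rowspan="2">
      <p>Keyword-rich page copy that search engines read first...</p>
    </td>
  </tr>
  <tr>
    <!-- Navigation renders on the left but comes last in the source -->
    <td valign="top">
      <a href="products.html">Products</a><br>
      <a href="about.html">About</a>
    </td>
  </tr>
</table>
```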

Other Search Engine Optimization Issues

More Search Engine Optimization by Danny Sullivan, Search Engine Watch, ongoing updates.
This section of Search Engine Watch lists articles and websites that focus on search engine optimization. Only a few will duplicate what is in this bibliography. Use it to further expand your knowledge of search engine optimization tactics.
This information is on the fee-based subscribers' part of the SearchEngineWatch site, so you will need to join to see the full article.

A Quick Guide to Search Engine Positioning by Danny Sofer
This guide describes the basis for planning an online marketing campaign. It covers defining your business, setting targets for your campaign, translating your business definitions into search phrases, designing your pages to attract search engine traffic, submitting your site to directories and search engines, getting your site linked from other sites, and monitoring the results.

Pointers for Search-Engine Optimization Services by Bridget Leach, IdeaByte, Giga Information Group, April 30, 1999.
Giga analyst Bridget Leach says that search engine optimization is most useful in the beginning stages of a site launch. About 80 percent of web users use search engines to find information, so when you first launch a site, search engines will be how most people find it. After your site has been established, you should want only about 10 percent or less of your traffic to be search engine referred, and a much larger percentage to be repeat visitors. Successful sites have multiple lead generation methods covering both offline and online channels.
Giga publications are generally available only to subscribers. Some publications are available for sale at the Giga eShop.

Top 10 B2B Internet Marketing Tactics That Worked Best in 2000, January 02, 2001.
Every B2B marketer interviewed for this article agreed that search engine optimization is always worth the investment. Smaller companies optimize their sites in-house, using software such as Web Position Gold to help. Medium to large companies prefer to outsource optimization to expert consultants.

A Practical Guide to Marketing Your Site Using Web Search Services by Bridget Leach, IdeaByte, Giga Information Group, February 25, 1999.
This Giga article agrees with others that well designed pages, with descriptive titles and descriptions, are the best thing for getting good search engine rankings. Other recommendations include submitting your site to all major search engines; checking the background of a submission service for legitimacy; don't spam; don't use another company's trademarks; don't rely on keywords or metatags; and consider bidding on keywords.
Giga publications are generally available only to subscribers. Some publications are available for sale at the Giga eShop.

Get Found: Strategies for Working with Web Search Engines and Directories by Kathleen Hall, Planning Assumption, Giga Information Services, December 4, 2000.
This article stresses the need for website managers to keep up with search engine technologies and strategies and to optimize sites for search engine spiders. Giga agrees with other experts in saying that the best way to improve and maintain results on search engines is to keep them in mind when designing the website. Sites that rank high usually use descriptive page titles and metatags and have summaries that clearly describe the organization or page content. Giga stresses that manual submission is important for both search engines and directories.
Giga publications are generally available only to subscribers. Some publications are available for sale at the Giga eShop.

Search Engine Submission Resources

The Four Search Engines to be Listed in and Why by Paul J. Bruemmer, November 8, 2000.
Bruemmer says that the top four search engines you need to be listed in are Yahoo!, AltaVista, Excite, and LookSmart, because together they reach more than 78 percent of all search engine traffic.
The reasons are:

  1. Media Metrix's ratings give Yahoo!'s audience reach (how many people visit a search engine during a month) as 69.1%.
  2. A Nielsen/NetRatings study found that Yahoo!'s websites were visited by more than half of the active Internet users in eight countries, making it the leading global property.
  3. AltaVista gets 50 million searches per day.
  4. StatMarket's HitBox counters, that show what percentage of all search engine-related traffic comes from each of the search engines, rated Yahoo! as first with 53.4% of traffic, AltaVista with 18%, and Excite with 6.7%, totalling 78.1% of all search engine traffic.
  5. LookSmart has two million links and powers results at MSN, AltaVista, Excite, and iWon. It reaches more than 52 million people a month, or about 74% of all Internet users in the United States.

Search Engine Submission and Resubmission Guidelines by Robin Nobles, Academy of Web Specialists, July 6, 2001.
This is an interview with Jim Stob of PositionPro, a search engine submission service. Stob has working relationships with the search engines, which lets him learn what they want to see in submissions and stay out of trouble. Stob describes safe search engine submission guidelines and provides some engine-specific tips.

Search Engine Maintenance Resources

Getting Listed and Staying Listed by Paul J. Bruemmer, ClickZ, August 23, 2000.
Bruemmer explains how to make search engine maintenance a systematic task. He says to make sure you don't submit a domain name more often than the various engines specify. Also, do not blindly resubmit your site without first checking to see if you are already in the database -- you may lose a favorable rating if you do. For directories, make sure you have the proper category. Fit your description to meet the search engine and directory's requirements. Bruemmer recommends creating a text template with all of your submission information for each site; making a set of bookmarks for the 'search page' of each site; and prescheduling designated time intervals to verify and resubmit your web pages. His checklist for performing monitoring tasks includes:
  1. Perform position reports on all of your pages on a weekly or monthly basis.
  2. Search for your website in the database by title, name and URL.
  3. Verify whether your website is in the database or if it isn't found and document your findings.
  4. If the page isn't found, resubmit your website. Document your resubmission and each engine response.

Seeding the Search Engines: Part 2 by Paul J. Bruemmer, ClickZ, April 18, 2001.
The second part of this series looks at submission and maintenance. Bruemmer describes the different ways to do search engine submission and some of the rules for submitting. He also discusses how to monitor your position and audit your results, recommending that these tasks not be performed more than once or twice a month.

The Road to Link Popularity by Paul J. Bruemmer, Clickz, September 26, 2001.
The article describes how link popularity works and provides some tips for improving popularity.

Link Popularity and Link Building for Search Engines -- A Primer by Eric Ward, Search Engine Strategies 2001, August 16 & 17, 2001, San Francisco, CA.
Linking strategies specialist Eric Ward emphasizes that good content, not products, is what earns free links to your site. People need a compelling reason to link to your site. He describes link awareness, link depth origination, and other reasons to build a links network.

What Your Link Request Should Contain and Why by Eric Ward, Clickz, November 9, 2000.
Eric Ward provides both good and bad examples of letters requesting that someone put your link on their site, along with 12 things that your link request should contain and the reasons why.

Search Engine Optimization and Submission Firms

Selecting a Search Engine Optimization (SEO) Agency by Paul J. Bruemmer, ClickZ, November 15, 2000.
Bruemmer recommends using SEO firms that focus on optimization to increase the ranking of your pages, rather than automated software that submits to hundreds of sites or companies that use external hosting and doorways or cloaking. He says that core competency is the first thing to check when selecting an SEO agency. Verify the following:
  • Will you have direct contact with an SEO account manager?
  • Will your SEO firm's campaign position both your primary domain name and the outside links they host on their site?
  • Are your expectations about length of time for results, and types of payment being met?

What to Look for in a Search Engine Optimization Specialist by Detlev Johnson and Shari Thurow, ClickZ.
The moderators of the i-Search and i-Design discussion lists state that search engine optimization specialists should be brought into the design process early, not after the site has been built. The article lists five key points to consider when looking for an SEO specialist:

  1. Does the specialist understand the difference between a search engine and a directory, and the different strategies that each requires?
  2. The specialist should conduct keyword research on what your audience is using.
  3. No one should guarantee a search engine ranking.
  4. Programmers who make software that steals other sites' content to generate gateway pages are not search engine experts.
  5. Don't use firms that want to use cloaking or doorway pages that provide different pages to spiders than the users see.


Copyright 2001-2003 InfoQuest! Information Services
Last updated: January 30, 2003
Please send any comments to or 1-266-4159.
Terry Brainerd Chadwick
InfoQuest! Information Services
435 Springs Ave.
Birmingham, AL 35242-4848 US