Thursday, January 31, 2008

Search Engines: Different Types, Different Strategies

Author: Terry Nicholls

There are four basic types of Search Engines:

Free Search Engines

Pay-For-Inclusion Search Engines

Pay-Per-Click (PPC) Search Engines

Directories

Because each type does things a little differently, you need to adapt your strategy to take advantage of their differences.

Free Search Engines

You can submit your pages to these engines free, but be careful. You must make sure not to over-submit (submit too often) or you'll be banned and never get listed.

Always check to see if your site is listed before submitting it.

Pay-For-Inclusion Search Engines

With this type of Search Engine, you pay to have your web site listed in their database. Pay-for-inclusion Search Engines (and the paid section of free engines) are a quick way to get listed in some major databases -- for a price, literally. The cost varies from engine to engine.

The advantages are threefold:

Faster inclusion into the Search Engine's index.

Repeated, regular spiderings.

Guaranteed continuous inclusion.

Pay-Per-Click (PPC) Search Engines

Pay-per-click Search Engines allow you to bid for keyword placement. For example, if one of your pages focuses on the topic of "fashion models," you can bid for the #1 (or any other number) placement on the first page of search results. You only pay when someone actually clicks on your ad.
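To make the cost model concrete (the figures below are purely hypothetical, not from this article): suppose you bid $0.40 per click for the top spot on "fashion models" and your listing receives 500 clicks in a month. You pay 500 x $0.40 = $200 for that month, whether or not any of those clicks turn into customers, so the bid you can afford depends entirely on how well your page converts visitors into sales.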

There's a catch, of course. The most popular keywords have become quite expensive at Overture.com (the first and biggest PPC engine) and are rising at the others.

Directories

Directories are different from Search Engines in that they do not spider pages. Humans review each submission, visit each site, and decide what gets in.

Search engines and directories provide search results for each other. If a search turns up nothing in the directory's database of sites, it will show the search results from one of the spidered engines. All the directories use one of the major engines.

The reverse is also true. Most Search Engines also provide directory results, in addition to their own search results. All of them use one of the "Big 3" -- Yahoo!, Open Directory, or LookSmart.

Adapt Or Disappear

The difference between the types of Search Engines requires that you adapt your strategy to take maximum advantage of each engine. We'll help you with that.

For a more detailed explanation of these Search Engines, along with specific strategies and mistakes to avoid, please visit

My Home-Based Business Advisor.

Terry Nicholls My Home-Based Business Advisor www.my-home-based-business-advisor.com

Copyright © by Terry Nicholls. All Rights Reserved.

About the author: Terry Nicholls writes from his own experiences in trying to start his own home-based business. To benefit from his success, visit

My Home-Based Business Advisor - Helping YOUR Home Business Start and Succeed for free help for YOUR home business, including ideas, startup, and expansion advice.

Wednesday, January 30, 2008

Creating Robots.txt File and its Importance

Author: San Christopher

If you are thinking you have developed a truly great keyword-rich, unique-content, fully optimized website for the search engines and an attractive site for visitors - that's fine, but do you know you are missing something? A robots.txt file. Did you include it? By the way, do you know the importance of a robots.txt file?

The success of big companies lies in keeping their confidential data secret, hidden from all. They tell the world one thing and do another. This enables them to execute their future course of action easily and change plans according to the situation. The job of a robots.txt file is the same. It can allow or disallow a search engine to visit some or all of your web pages. Of course, a human visitor is free to visit these pages. That being the case, for the search engines your website may be different from what a visitor is seeing. If you think one or more of your pages/files aren't suitable to be visited by a particular search engine or engines, you can block them. This is not recommended - your website should be made in such a way that it does not need to shy away from the search engines. Nevertheless, it's always better to know the basics of writing a robots.txt file. It will help you. We will discuss further down why the robots.txt file is important. I repeat - don't make pages you think should be hidden from the search engines. If any search engine thinks you are up to some tricks, it may penalize your site, causing a no-rank - in the worst case, forever!

Every search engine has a "robot" (a software program) that does the job of visiting a website. Its purpose is to "know" the website, what it is all about, and to gather all information about it. Search engine robots gather this information and bring it back to their databases to show in their search results. So, if your site is not in their database, it never shows up in the search results.

Web robots are sometimes referred to as web crawlers, or spiders. Therefore the process of a robot visiting your website is called "spidering" or "crawling". When somebody says "the search engines have spidered my website," it means the search engine robots have visited their website. Each robot is known by a name and has an independent IP address. The IP address is of no importance to us, but knowing the names will help, since these names are used when we create a robots.txt file. This is why the file is called "robots.txt". Given below is a list of the robots of some very popular search engines:

Search Engine - Robot
Alexa.com - ia_archiver
Altavista.com - Scooter (Bought by Yahoo)
UK.Altavista.com - AltaVista-Intranet (Bought by Yahoo)
Alltheweb.com - FAST-WebCrawler (Bought by Yahoo)
Excite.com - ArchitextSpider
Euroseek.net - Arachnoidea
Gendoor.com (Genealogical Search Engine) - GenCrawler
Google.com - Googlebot (http://www.google.com/bot.html)
Hotbot.com (uses Inktomi's robot) - Slurp
Inktomi.com - Slurp (slurp@inktomi.com) (Bought by Yahoo)
Infoseek.com - UltraSeek
Looksmart.com - MantraAgent
Lycos.com - Lycos_Spider_(T-Rex)
Northernlight.com - Gulliver
Nationaldirectory.com - NationalDirectory-SuperSpider
UKSearcher.co.uk - UK Searcher Spider

Writing Robots.txt:

Let's learn to write robots commands. Note that there are two ways to write them. One is to include all the commands in a text file called "robots.txt" and the other is to write the robots command in a meta tag.

We will learn both ways of writing robots command.

Writing robots command in Meta tag:

There are 4 things you can tell a search engine robot when it requests (visits) your page:

1) Do not index this page - the search engines will not index the page.
2) Do not follow any links on this page - the search engines will not follow the links included in the page, i.e. they will not index any page that this page links to.
3) Do index this page - the search engines will index the page.
4) Do follow the links - the search engines will index the pages that this page links to.

Note that "index" is different from "spider". A search engine first spiders a page and then indexes it. Indexing means giving a certain importance to the page on the basis of its content, information, meta tags, and link popularity with respect to the searched keyword. All this is decided at run time. When you tell search engines not to index a page, it means they know that the page exists but do not rank it. That is, a no-index page will never be shown in their search results. This does not mean a no-index page will not get visitors; it might get visitors indirectly from a page which links to it. Just no direct visitors from the search engines.

Suppose you want the search engines to index a page and also follow (and index) its linked pages; then include the following command in the meta tag:

<meta name="robots" content="index,follow">

Suppose you want the search engines to index a page but not follow its links; then include the following command in the meta tag:

<meta name="robots" content="index,nofollow">

Suppose you do not want the search engines to index a page but do want them to follow its links; then include the following command in the meta tag:

<meta name="robots" content="noindex,follow">

Suppose you do not want the search engines to either index the page or follow its links; then include the following command in the meta tag:

<meta name="robots" content="noindex,nofollow">

Note: Google makes a "cached" copy of every file it spiders. It's a small snapshot of the page. Want to stop Google from doing so? Include the following meta tag:

<meta name="googlebot" content="noarchive">

Like any meta tag, the tags written above should be placed in the HEAD section of an HTML page, for example:

<html>
<head>
<title>your title</title>
<meta name="robots" content="noindex,nofollow">
</head>
<body>
...
</body>
</html>

Creating robots.txt file:

A robots.txt file is an independent file and should be written in a plain text editor like Notepad. Do not use MS Word or any other word processor to create robots.txt. The bottom line is that this file must have the extension ".txt", else it will be useless.

Let's begin. Open Notepad (it comes free with Microsoft Windows) and save the file with the name "robots.txt". Make sure that the extension is .txt.

By the way, did you notice we did not use the name of any robot in the meta tag? What does that indicate? Simple - by using the meta tag you direct all the search engines to do, or not do, something on a page. You do not have control over any one search engine. The solution is robots.txt.

It can happen that you do not want a particular search engine to index a page for certain reasons. In that case a robots.txt file will help, even though I do not recommend such a thing. The search engines get you traffic, so why hate them? Stop them from doing their job and they will hate you. I repeat: keep your pages smart for the search engines and welcome them. Fine, then why take the trouble to learn robots.txt? Why should you include a robots.txt file at all?

Let's suppose yours is a dynamic database site containing information on your newsletter subscribers and customers - their addresses, phone numbers, etc. All this confidential information is kept in a separate directory called "admin". (It is recommended to keep such information in a separate directory. Handling the data will be easier for you, and it will be easy to keep the search engines away. We will see how in a moment.) I am sure you would never want any unauthorized person to visit this area, let alone the search engines. It does not help the search engines either, since they have nothing to do with the data or files there. Here comes the role of a robots.txt file. Write the following in the robots.txt file: (Ignore the horizontal rows - they are included only to separate the commands from the rest of the text.)

----------------------------------------------------------------- ---------------

User-agent: *
Disallow: /admin/

----------------------------------------------------------------- ---------------

This does not allow the spiders to index anything in the admin directory, including any sub-directories.

The asterisk (*) mark indicates all the search engines. How do you stop a particular search engine from spidering your files or directory?

Suppose you want to stop Excite from spidering this directory:

----------------------------------------------------------------- ---------------

User-agent: ArchitextSpider
Disallow: /admin/

----------------------------------------------------------------- ---------------

Suppose you want to stop Excite and Google from spidering this directory:

----------------------------------------------------------------- ---------------

User-agent: ArchitextSpider
Disallow: /admin/

User-agent: Googlebot
Disallow: /admin/

----------------------------------------------------------------- ---------------

Files are no different. Suppose you want a file datafile.html not to be spidered by Excite:

----------------------------------------------------------------- ---------------

User-Agent: ArchitextSpider
Disallow: /datafile.html

----------------------------------------------------------------- ---------------

Similarly, you do not want it to be spidered by Google too:

----------------------------------------------------------------- ---------------

User-agent: ArchitextSpider
Disallow: /datafile.html

User-agent: Googlebot
Disallow: /datafile.html

----------------------------------------------------------------- ---------------

Suppose you want two files datafile1.html and datafile2.html not to be spidered by Excite:

----------------------------------------------------------------- ---------------

User-Agent: ArchitextSpider
Disallow: /datafile1.html
Disallow: /datafile2.html

----------------------------------------------------------------- ---------------

Can you guess what the following means?

----------------------------------------------------------------- ---------------

User-agent: ArchitextSpider
Disallow: /datafile1.html
Disallow: /datafile2.html

User-agent: Googlebot
Disallow: /datafile1.html

----------------------------------------------------------------- ---------------

Excite will not spider datafile1.html or datafile2.html, but Google will skip only datafile1.html; it will still spider datafile2.html and the rest of the files in the directory.

Imagine you have a file kept in a sub-directory that you would not like to be spidered. What do you do? Let's suppose the sub-directory is "official" and the file is "confidential.html".

----------------------------------------------------------------- ---------------

User-agent: *
Disallow: /official/confidential.html

----------------------------------------------------------------- ---------------

I hope that's enough. A little practice is of course required. If the syntax of your robots.txt file is not written correctly, the search engines will ignore that particular command. Before uploading the robots.txt file, double-check for any possible errors. You should upload the robots.txt file to the ROOT directory of your server. The search engines look for the robots.txt file only in the root directory; otherwise they totally ignore it. Usually the root directory is the directory where the index page is kept, so keep the robots.txt file in the same directory as the index file.

I know of a user-friendly program that will write robots commands for you (the software is introduced at the beginning of this article). It can make an error-free robots.txt file very easily. This software, RoboGen, is a great tool. Never bother again to check the syntax of your robots.txt file or even write a robots.txt file yourself. RoboGen is a visual editor for Robot Exclusion Files and is easy to use. Just select the files you want, or do not want, to be visited by the search engines, and it creates the robots.txt file. You can also select the search engines of your choice. RoboGen maintains a database of over 180 search engine user-agents, which are selectable from a drop-down menu. It is the BEST and ONLY software on the Internet to write a robots.txt file correctly and effectively. This great tool is cheaper than you expect. CLICK HERE NOW to know more!

Note: You should be able to see robots.txt file if you type the following in the address bar of your Internet browser.

http://www.your-domain.com/robots.txt

(Where your-domain is the domain name of your website. If yours is not a .com site, replace .com with the respective extension of your website, e.g. .net, .us, .org, etc.)

You must be wondering whether to use Meta tag or Robots.txt or which of these is more effective!

A correctly written robots.txt file is more effective than the meta tag. All search engines support robots.txt, but not all search engines support the robots command written in meta tags. I recommend that you use both so that your site is covered in both scenarios. RoboGen will help you to write both!

One last thing - you can look in your web server log files to see which search engine robots have visited. They all leave signatures that can be detected. These signatures are simply the names of their robots: for instance, if Google has spidered your site, its requests will appear in your logs under the user-agent name Googlebot. This is how you know which search engine has spidered your pages, and when!
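As a rough illustration (the exact layout depends entirely on how your server is configured, and the IP address, date, and sizes here are made up), a single entry left by Google's robot in an Apache-style access log might look like this:

66.249.66.1 - - [28/Jan/2008:10:15:32 +0000] "GET /robots.txt HTTP/1.1" 200 124 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

The last quoted field is the user-agent string, and that is where the robot's name shows up.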

About the author: Senior Manager - Internet Promotions http://www.searchengineoptimizationpromotion.com

Tuesday, January 29, 2008

KEI Concerns and CID Alternative

Author: Serge M Botans

KEI Concerns and CID Alternative

Like many folks, I have been using KEI for some time now to determine what keywords I should target with my web site. And this has led me to become concerned with the results KEI provides and the keywords it suggests. I need to say here that my concern is very subjective, as many folks are happily using KEI and don't seem to have a problem with it.

My main concern with KEI is that, by the way it works, it strongly favours demand numbers without, I feel, sufficiently taking into account the corresponding supply numbers.

I need to say here that I interpret supply numbers as a representation of how competitive a keyword is. For example, if keyword 1 has a supply of 200,000 while keyword 2 has a supply of 5,000,000, then I would consider keyword 2 as being more competitive than keyword 1.

And all things being equal, I would prefer to target a keyword that is less competitive and with less demand, rather than a highly competitive keyword that has a higher demand. The reason for this is that I feel that I have a better chance of cornering a section of a less competitive market than I do that of a highly competitive one.

Based on my concern with KEI, I have decided to create an alternative. I have called this alternative "Competition Indexed Demand" (CID). Now, CID works out the marketing potential of keywords in a similar way to KEI but it uses a different formula, one that takes more into account the supply numbers of keywords (or their competitiveness).

For example, using "ranking" as the starting keyword with Overture, KEI suggests the following top 3 keywords,

Keyword | Demand | Supply | KEI
nfl quarterback ranking | 43,474 | 75,800 | 24,934
nfl power ranking | 43,171 | 122,000 | 15,277
college basketball ranking | 71,149 | 541,000 | 9,357

while CID suggests the following top 3 keywords,

Keyword | Demand | Supply | CID
dick vitale college basketball ranking | 16,983 | 33,400 | 640
nfl quarterback ranking | 43,474 | 75,800 | 427
vote nba power ranking | 3,129 | 30,200 | 394

Comparing the two sets of results, you can see how CID favours lower competition compared to KEI. I have now used CID for quite a number of keyword research projects and have found that not only does it favour lower competition, but it also suggests keywords that, I feel, have a better demand-supply balance.
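For readers who want to check the KEI figures above: the author does not state which KEI variant he uses, but the numbers are consistent with the commonly cited formula KEI = demand^2 / supply. For example, 43,474^2 / 75,800 = 24,934 (rounded) and 71,149^2 / 541,000 = 9,357 (rounded), matching the KEI column. The CID formula itself has not been released (see the note at the end of this article), so no equivalent check is possible for the second table.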

Given that CID is an alternative to KEI, you now have to make a decision when doing your keyword research in order to determine the marketing potential of the best keywords to use. The decision is: shall I use KEI or CID? The answer to this question is straightforward: if you want to focus on high demand then use KEI, and if you want to focus on lower competition, then use CID.

Furthermore, based on my observations of KEI and CID results, I have felt the need to come up with 2 rules to avoid both KEI and CID generating what I feel are inappropriately high numbers. My observation has been that these high numbers are generally generated because the demand and/or supply numbers are too high.

These 2 rules are:

- ""the 100 thousand demand rule"" which states that ""any keywords whose demand numbers are above 100 thousands should be ignored"",

- ""the 10 million supply rule"" which states that ""any keywords whose supply numbers are above 10 million should be ignored"".

Applying these 2 rules to KEI and/or CID will help you determine more realistically the marketing potential of keywords.

In conclusion, CID should be seen as an alternative to KEI and not as a replacement for it. The reason for this is that CID focusses on the competitiveness of keywords while KEI focusses on the demand.

Serge M Botans

Contact: author@cattle-ramp-seo.com Phone: 61-03-9478-7088 or 61-0415-642424 Web Site: www.cattle-ramp-seo.com

PS. I have not currently released the CID formula. However, you can download my program Keywords Analysis to research your keywords using KEI and/or CID: www.cattle-ramp-seo.com/KeywordsAnalysis.zip


You may reproduce this article as long as it is in its complete form and the resource box is included.

About the author: Serge M Botans is the CEO of the web site Cattle Ramp SEO Self-Help Resources.

Monday, January 28, 2008

Think Before You Ink - SEO Contract Advice

Author: Michael J. Gallagher

Understand the Service

In just about any industry in the world, you understand the product or service you will be receiving.  If you were going to have a contractor build a deck in your backyard, you have a basic idea of what's involved.  The contractor has to provide design services, labor, materials, and make a profit.  After a few quotes, you have a fair idea of where the market is in your neighborhood.  You also understand the deliverable, a new deck in your backyard.

Many SEO companies operate by a different set of rules. There seems to be a big secret around how some companies will provide a customer with the results they promise. Now that doesn't sound very fair. How can you determine what you should pay for a service when you have no idea what services will be performed? How can you compare two different SEO companies when they both have "secret" formulas they cannot disclose to you? The answer is simple. You don't do business with either company and you look for one that will be a little more honest about the services they will be providing. There are no secrets in SEO and anyone telling you differently should not be trusted.

There are a couple of reasons an SEO company would not tell a customer how they intend to increase their search engine visibility, and neither is good. The first is that the service they will be providing is basic and overpriced. This is actually the better of the two. The second reason the process may be kept secret is because the SEO intends to use unethical or black hat techniques such as doorway pages, cloaking, or disposable websites, which can end up doing the customer more harm than good. There have been many websites banned from major search engines due to practices performed by questionable SEO companies.

Understand the Agreement

Contracts can be very misleading and deserve a great deal of attention before signing. When it comes to SEO contracts, it is imperative that the verbiage reads exactly as the oral commitment. I have seen many SEO contracts that state the SEO company will "guarantee X number of top ten listings in any of the major search engines." Most of the time, the oral agreement is based on specific pages and keyword phrases within the customer's website. The issue with this or a similar statement is that it is too vague. The door is left wide open for the SEO company to define a second tier search engine as a "major search engine." They can also use any successful keyword phrase from the customer's website rather than the keyword phrases they agreed to optimize for verbally. If you agree to optimize for the term "lion king tickets" with such a contract, the SEO could come back and say you received a top ten for the keyword phrase "new lion tickets king york" so, therefore, their end of the contract was met. I think it's safe to say no one in their right mind would search for Lion King tickets in New York in this manner.

The key to avoiding such trickery is to add every specific commitment into the contract.  That means keyword phrases, individual web page URLs, and any other details need to be defined within the contract.  If the SEO company is not willing to amend their contract for any reason, they are not worth doing business with.
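As an illustration only (the phrases, URLs, and numbers below are hypothetical, not taken from any real agreement), a commitment written with that level of specificity is much harder to wriggle out of:

"Provider will achieve top-ten organic placement on Google.com and Yahoo.com for the exact phrases "lion king tickets" and "lion king broadway tickets", applied to the pages www.example.com/tickets.html and www.example.com/broadway.html, within six months of the signing date; otherwise the fee for the affected phrases is refunded."

Compare that with "guaranteed top ten listings in major search engines" and the difference in enforceability is obvious.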

Money Back Guarantee

If an SEO company is as good as they say they are, they will put their money where their mouth is.  There are many SEO companies that will promise a money back guarantee if they fail to live up to their commitment.  Sketchy SEO companies will claim they have no way of controlling search engine changes so there is no way they can make that commitment.  A trustworthy SEO will be able to adjust their optimization model to meet any changes that come along from the search engines.  They may need a little more time to achieve the committed results, but they will succeed in the end.  Managing change is part of the game.  Think about it.  How would any SEO company stay in business for any period of time if they were not able to adjust to search engine algorithm changes?

About the author:

Michael J. Gallagher is the author of SEO In A Can, a do-it-yourself guide to search engine optimization.  Some of the subjects covered in SEO In A Can are keyword usage and placement, back link distribution, SEO pitfalls, and advice on hiring an SEO.  Please come visit my website @ http://www.seoinacan.com.

Sunday, January 27, 2008

You are only as good as your PR!!

Author: Mark Thrope

You are only as good as your PR!!

Imagine a situation where you have started a great site with exceptional design and high quality content. Your PageRank, of course, is zero and your Alexa traffic rank is somewhere past the 18-lakh mark (around 1.8 million), or perhaps worse. You want to bring visitors to your site and hence want it included in the search results, but the search bots are taking their own time caching your site. Finally the Googlebot caches your site and you say, what the heck, I am through. But to your dismay you find your site buried way down even for a non-competitive keyword.

You feel a bit down but believe that getting some quality inbound links will solve the issue. You start reaching out to sites with your link request, but alas, no one seems to be answering. Of course they want a link from a site with a good PR, not from a site with a PR of zero, irrespective of quality. In fact, quality just doesn't seem to matter on today's internet, which seems to care about nothing but PageRank. If your site has a good page rank you are the king; otherwise no one is ever going to bother about you.

So the only option left to you now is to get links from directories, blogs, forums, etc. and hope for the best. You do your best and wait for a PR update. After months of waiting the PR update does happen, but your pages get no more than PR3. You feel great about it though, thinking 'something is better than nothing'. Now again you approach sites for a link exchange, but to your sheer disappointment find that good sites never respond to you. All you seem to be getting are links from relatively unknown and low quality sites. You develop hundreds of low quality links and wait for the next PR update, which seems to take an eternity to arrive. You have strong faith that for all those links you established your site will surely get a higher PR, but nothing of the sort happens. Your PR increases, but not beyond PR4. Then you realize that PR is exponential in nature. God forbid, you feel like cursing the PR system, but you are left with no choice but to continue racing against the odds to get links. You become a link maniac, getting links from all corners of the internet; after some time you realize that you are spending a considerable amount of time getting inbound links and an almost negligible amount of time on your website. You have not updated your content for months now and your visitors are starting to turn away from your site.

One fine day you read an article that Google has started penalizing sites that resort to getting unnatural links with an objective to cheat its ranking system. A few drops of sweat run down your face as you realize you have been spamming guestbooks and giving links to link farms. You also find you have cross-linked with many sites actively taking part in link farms. You also find that you have got links from free-for-all sites. All these can lead you into trouble, so in a frenzy you start removing links to those sites after mailing them and explaining the reason. Finally the PR update happens and you get a PR of 5. You have entered the land of the lords now and you escaped getting penalized. You think, 'what a heap load of hard work getting a quality site to rank, I wonder how these junk sites make it to the top?'. And you ask yourself a question: 'Is a site only as good as its PR?' or 'Is the internet only meant for big shots who can invest a considerable amount of time and money on optimization?'. The answer to this is probably yes, which you gulp down with a pinch of salt.

About the author:

The author is the webmaster of http://www.infodizz.com which offers quality information and resources related to the internet and networking technologies including CCNA tutorials .

Saturday, January 26, 2008

Search Engine Optimization Research Findings: A Client Perspective

Author: Jimsun Lui

This research attempts to obtain business firms' views on search engine marketing. It gives valuable implications to both business and search engine marketing firms.

The research collects findings from 121 Hong Kong trading firms involved in foreign trade, gathered through telephone interviews. Also, a separate focus group was held with 8 marketing managers and 1 search engine optimization (SEO) specialist. We believe the implications also apply to other geographical areas.

We have extracted part of the results to present in this article.

1. Employed SEO service or Exercised SEO Before?

103 respondents (85%) reported that they had neither employed any search engine optimization service nor exercised any search engine optimization themselves. However, 18 respondents (15%) had pursued some sort of search engine marketing, ranging from search engine submission to search engine optimization.

2. Importance and Concerns of Search Engine Marketing/Search Engine Optimization

A rating scale of 1 to 5 was employed, with 5 meaning very important and 1 meaning very unimportant. Respondents reported that they perceived that search engine marketing could be a cost-effective method to help them get new overseas customers (rating 4.7).

Respondents also considered the following factors when deciding whether they would use a search engine marketing firm: 1. Pricing (4.6), 2. Deliverables (4.8), 3. Guarantee clause (4.0), 4. Ranking against rival firms (4.7), 5. Possibility of getting a Top 10 ranking (4.1), 6. Chance of increasing response rate (4.8)

3. Paid for Performance Search Engine Marketing

This question included an explanation of the meaning of paid-for-performance search engine marketing; it covers both pay-per-click and paid-for-inclusion plans.

Again, a rating scale of 1 to 5 was employed, with 5 meaning very important and 1 meaning very unimportant.

The respondents expressed concern about the possible high cost (4.8), potential fraudulent clicks (4.9), delivery of the message to the right persons (4.7), and the time required to manage the campaign (4.2).

4. Understanding of Search Engine Optimization

This question asked about their knowledge of search engine optimization techniques. Respondents were given 3 choices: 1. Yes, 2. No, 3. Don't know.

31 respondents perceived that the major task of search engine optimization was adding keywords in Title and Meta tags, while another 62 of them were not sure about the answer. 108 respondents did not know that link building was an important factor that could help them improve search engine ranking. Interestingly, 72 respondents believed that "submit to 1000 search engines" programs could increase their website traffic. 101 respondents did not realize that content analysis and improvement measures, such as keyword density ratio and keyword proximity, played a role in search engine optimization.

In general, most respondents believed that they did not possess sufficient knowledge on search engine optimization.

5. Focus Group

Another 8 marketing managers were invited to a focus group interview. 4 of them had experience employing search engine optimization firms, and the other 4 had considered using search engine optimization services but ultimately had not taken action. An SEO specialist acted as moderator for the interview.

In the interview, they generally agreed that search engine marketing could help them to generate business leads. Those who did not use an SEO service explained that they abandoned the idea because they could not reach a decision. First, they were not familiar with how to achieve good search engine rankings. Second, the pricing schemes offered by different SEO firms varied greatly, from US$19.99 to US$5,000 in up-front fees, and some also required US$500 to US$5,000 in monthly service fees. However, the SEO firms could not express clearly the deliverables that clients would obtain, and they also refused to clearly explain the methodologies they would use. These facts made it very difficult for the interviewees to persuade their bosses to use an SEO service.

In addition, the interviewees expressed that they feared SEO firms would require many website changes, and that some changes, like keyword stuffing in Alt tags or putting too much text and too few graphics on web pages, would make the site look unprofessional.

Another point made by the interviewees was very interesting. Of those who had employed SEO firms, some perceived that they were being kept in the dark, as they did not know what the SEO firms were actually doing! When they scrutinized the changes made by the SEO firms, they found that the firms had just written some keyword phrases in the Alt tag and Title tag and repeated them in the Meta tags. Then, the SEO firms only ran reports for them in the subsequent months and asked them to wait for results. Finally, they found that they had not got any extra traffic from the keyword phrases suggested by the SEO firms, while the SEO firms claimed that they had achieved more than what was written in the guarantee clause (e.g. 20 Top 20 rankings in major search engines). Only 1 interviewee felt satisfied with the chosen SEO firm. This was because they knew the deliverables, got the desired number of top rankings, and received a free piece of link exchange software called Automate Link Exchange, so they could continue to manage link building themselves after the project ended. The SEO firm also provided suggestions for maintaining rankings afterwards.

Also, participants voiced that they wanted to know their search engine rankings against those of their rival firms.

6. Implications to Business Firms

In general, it was a consensus that search engine marketing could help to generate new business leads. However, it is not a simple activity. Most business firms lacked the knowledge and were tempted to use an SEO firm. However, the quality of SEO firms varies. In order to select a good SEO service, business firms should define their objectives clearly. E.g. some firms might simply define that they want a better ranking than their rivals, or that they want a better conversion rate. Then, it was suggested that the business firms talk with the SEO candidates and try to work out which one is more knowledgeable. Moreover, business firms should discuss with the selected SEO firm in order to get an agreement on the number of deliverables and the project schedule.

From the research findings, many business firms have yet to optimize their websites. That means that if you optimize your website first, you could have an advantage over your rivals from a search engine exposure perspective.

7. Implications to SEO firms

SEO firms should spend time and clearly explain to clients the SEO techniques and methodologies that they will use. Some SEO firms might fear exposing their "top secret" techniques. However, you cannot get business if your clients cannot understand the SEO process and fail to persuade their bosses. In addition, SEO firms should apply optimization techniques with a consideration for the usability of websites. Do not make the site look silly or unprofessional. Also, clients will not mind your use of optimization software such as WebPosition Gold, and they do want to know how you will serve them.

SEO firms should assist clients to devise a measurement matrix for the campaign. This helps the decision maker to make judgments and explain them to his/her colleagues and boss. Clients would feel more comfortable doing business with you.

Moreover, SEO firms should define clearly what will be delivered to clients and the delivery schedule, e.g. how many ranking reports, keyword research results, etc. In addition, SEO firms should contact clients frequently and report progress. Clients will then understand that you are indeed working for them, and you can help to set or adjust their expectations.

Finally, SEO firms should offer assistance to clients who want to terminate the relationship and maintain rankings themselves. The idea may look ridiculous, but clients will be satisfied by such an offer and will probably recommend you to other firms. After all, a client who ceases to work with you might do so for many reasons other than your performance. Why not give them another "candy" and maintain a good relationship?

About the author: Written by Jimsun from Agog Digital Marketing Strategy Limited, which provides a professional Search Engine Optimization Positioning Service with the aim of increasing website traffic and revenue. Visit http://www.agogdigital.com to learn more.

Friday, January 25, 2008

Search Engine Optimization Tips (part 1)

Author: Jimmy Whisenhunt

Search engine optimization (SEO) can be a frightening and daunting experience. We have put together some tips to make the task much more pleasant and to help your search engine rankings. Here is what we are going to cover in this article.

1) Design and Setup Problems
2) Selecting the correct keywords
3) The Title Tag
4) Your Page Copy
5) Meta Tags
6) Images "alt" tags
7) What you should not do

1) Design and Setup Problems

Unfortunately, some webmasters have lost the ballgame before they even get started with search engine optimization because of design problems. We are going to go over five of the most common design problems and their workarounds or solutions.

a) Sites that use Frames: Search engines do not index frames well. In fact, search engines do such a poor job that we recommend not using frames at all. Here is the problem: a frame page is, as the name conveys, a page inside another page in a frame. The HTML code looks something like this (the frame file names here are only examples):

<html>
<head>
<title>Framed Site</title>
</head>
<frameset cols="150,*">
<frame src="menu.html">
<frame src="content.html">
</frameset>
</html>

As you can see, there is no real content in the page, so search engines do not have anything to index. The workaround is to use the <noframes> tag to add content to the page manually. You would place the <noframes> tag inside the frameset section, then add your optimized content between <noframes> and </noframes>. An easy way to do this is to create a new regular (not framed) home page for your site that describes your site, products and services. Then copy everything from between the <body> tags and insert it inside the <noframes> tag. Example:

<noframes>
<body>
Optimized page content goes here for best results.
</body>
</noframes>

Keep it simple - search engines like it that way and no one else will ever see it. Do not forget about the Title and Meta tags; more on that later in the article.

b) Dynamic URLs: Most search engines cannot or will not index dynamic URLs. A dynamic URL is a URL that contains any of the following characters: ?, &, %, +, =, $, cgi-bin, .cgi. Dynamic URLs are most common on database-driven sites. If your URL contains any of the above characters, it is very unlikely that you will get listed with the major search engines. The solution is to make a static page with a static URL that does not contain any of the above characters, as in the example below.
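A hypothetical illustration (example.com and the file names are made up for this sketch):

Dynamic URL: http://www.example.com/cgi-bin/catalog.cgi?category=clubs&item=42
Static URL: http://www.example.com/golf-clubs/titanium-driver.html

The second form contains none of the characters listed above, so a spider can request and index it like any ordinary page.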

c) Sites That Use Flash: Spiders cannot index sites that use Flash as a splash screen, so a general rule is not to use Flash as a home page splash screen. If you simply must, include as much of your page copy as you can and remember to use your title and meta tags. Another note about Flash concerns menu items: spiders follow HTML links, and they cannot follow Flash menus. The workaround is to use a site map that links all of your pages together so the spiders have HTML links to follow.

d) Image Maps: Image maps are similar to Flash - most spiders cannot follow links in image maps. To be on the safe side, build a site map of your entire site so the spiders have HTML links to crawl.

e) Javascript for Navigation: Search engines cannot follow links written in Javascript. If Javascript navigation is all you have on your site, you are in trouble. The solution is to add an HTML navigation menu somewhere on the home page and a site map of the entire site to make sure the spiders index your site completely.

2) Selecting the Correct Keywords

The first step in the optimization process is selecting the correct keywords for your targeted market. The keywords step is the most important step in the optimization process because you will use the keywords throughout the entire process, and they are how other people will find you on the search engines. Optimize for the wrong keywords and all your efforts will be wasted.

The first thing you want to do is write down on a piece of paper what the theme of your site is. For instance, say your site is about golf. You would write down 10 keywords or phrases about golf. Think specific keyword phrases, not single keywords. Why? Because keyword competition is extreme for general terms such as "golf". You would need to be more specific - golf clubs, golf shoes, golf courses, golf equipment, etc. - to rank well in the search engines. Then use a tool like Word Tracker to see if each one is a well-searched-for phrase. Get outside opinions: ask family members and friends what they would search for, making a note of their suggestions. Then go back to Word Tracker and check the new key phrases, adding the best ones to your keyword list. Also check out your competition for ideas: do a search on a search engine and look at some of the sites, viewing the source HTML of each page and looking at the meta tags. This should give you some more ideas. Only use keywords that relate to your site. You should develop a list of keyword phrases for each page that you optimize for the search engines.

3) The Title Tag

Without question, the title tag is the most important part of optimizing your web page. I can't stress enough how important this tag is, because most search engines and directories place a high level of importance on keywords that are found in your title tag. We recommend using 1-2 of your keywords in the title, near the front. Do not use just keywords in the title - search engines may blacklist your site. Make your title enticing! The title tag is what most search engines show in their search engine results page (SERP) as the clickable link. Each page in your site should have its own title with its own keywords.

The format: <title>Your optimized title tag goes here</title>. Where to place it: the correct place for the title tag is between the <head> and </head> tags; it should be the first tag after <head>. Tag length: we recommend that your title tag be between 50-80 characters long - including spaces! Staying at about 60 characters will be optimum.
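Continuing the golf example used earlier (a hypothetical page selling golf shoes, not a real site), a title built along these lines might look like:

<title>Golf Shoes - Waterproof Golf Shoes for Men and Women</title>

The key phrase sits at the front, the tag stays within the recommended length, and it still reads like something a searcher would want to click.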

In the next article we will go over Your Page Copy, Meta Tags, Images "alt" tags and what you should not do.

About the author: Jimmy Whisenhunt is the webmaster at Article Zone. Article Zone is a new web site for free articles that you can use on your web site or in your newsletter. http://articlezone.vipenterprises.org

Thursday, January 24, 2008

The Changing Face of Search Engine Optimization

Author: Bobby Heard

With the ever evolving internet market for just about anything you can imagine and Google's index growing to almost 9 billion pages, and counting, there is little dispute amongst search engine optimizers that our job is getting much harder. From linking to articles, and density to ontology, our industry changes as fast as any other. The search engines, especially Google, seem to be on a daily diet of change and their algorithm seems to be growing at the rate of their index.

The word 'related' plays a much bigger part in SEO today than it ever has in the past. Instead of targeting an exact keyword phrase, it now makes a lot more sense to keep your site within context and to have related words to your keywords, compared to having density of one keyword phrase. Linking has also turned into a frenzy for relevancy. Unrelated links seem to no longer carry much weight at all. The theme through all of Google's recent changes seems to come down to one cliché: quality over quantity.

Just like with any other update at Google, optimizers must search and research their profession; however, it seems to be happening more often than ever. You can't walk through our office without hearing Google's name a thousand times. We have unofficial R&D (research and development) meetings almost every hour, as there constantly seem to be new ideas and theories popping into our heads. In the past there were always changes to the way we've done our work, but the pace of this change is accelerating rapidly, as is the competition for online searchers.

MSN seems to be creating a buzz and they've just recently started a national television ad campaign. Their search results resemble Google's of 6 months ago, a time that will go down in SEO history as "the good old days", and also a time that Google's SERPs seemed a lot more relevant than they do today. New search engines seem to be popping up all over the place, and after all, wasn't Google a virtual unknown 5 years ago?

One of the best points made to me over the past month came at the preview of the new become.com search engine in California, where founder Michael Yang decreed this to be only the very beginning in the history of online search. Whatever happens in Google's future, and the future of online search, there is one thing for certain: only the most intelligent and innovative of SEO companies are going to stay above the bar and continue to find ways to get their clients, and themselves, to the top.

About the author: Bobby Heard (bheard@abalone.ca) is the Vice-President of Abalone Designs (http://www.abalone.ca), which offers great SEO results at affordable prices.

Wednesday, January 23, 2008

The 7 Points of Do-It-Yourself SEO

Author: Gordon Goodfellow

Ever felt intimidated by the convoluted, jargon-ridden information about Internet marketing for small businesses available on the Net? Ever been horrified by the huge fees the experts charge, putting search engine optimization beyond your own means? Ever thought: What exactly is search engine optimization anyway, and can I do it myself?

The answer is: Yes, you can! The basics of search engine optimisation in applied web marketing are simple. It's all to do with the keyword content of your text copy, and can be summarised in seven points.

1. Register a good domain name which reflects what your site is about. Even if you are an established business, don't register www.FredJones.com if you make widgets. Rather, you want to register something like www.BestWidgets.com because that would inspire confidence in people looking for quality widgets who would not necessarily have heard of Fred Jones the widget-maker.

2. Name your page URLs, for your web promotion, based on reasons similar to the above - except now you can be more specific. Search engines like to know what your page is about. Name a page after a product (BigYellowWidgets.htm) or a service or action (Buy-Widgets-by-Post.htm) on one of the sales pages.

3. The text in the title tag is crucial in letting search engines know what each page is about. Put your important keywords in your title tags, using both the singular and plural versions (people will search for both) and make these tags different and specific for each page. For example, "Widgets and After Sales Widget Services". Whatever you do, don't call the home page "Index", but treat it almost as a mini-description.
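Written out as an actual tag (this is simply the article's own example phrase dropped into HTML, not a tag from any real site), that would be:

<title>Widgets and After Sales Widget Services</title>

Note that both "Widget" and "Widgets" appear, covering the singular and plural searches the point mentions.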

4. The other tags (at the top of the html page) between the two "HEAD" tags are not as important as the title tag, but the description tag is still used by some search engines in displaying what you would like web users to see when they scroll down a page of search results. Some search engines don't use the description tag at all; others, like Google, sometimes use part of it together with part of the main body text surrounding prominent keywords on your page. So you may as well treat the description tag seriously; make it brief (about 25 to 30 words) and as comprehensive as possible in the short space allowed. Make sure you have your popular keywords included within your description tag. The ALT tag is used for a very short description of an image or graphic file, and is what is displayed if you allow your mouse pointer to hover above a graphic. These days it is not considered important for search engines. The COMMENT tag is never displayed on the body page, and is used by coders and designers as an instruction or reminder to themselves about what that section of html coding should be doing; in the past, some webmasters in their quest for website promotion and search engine ranking used to stuff keywords in the comments tags, but now it is generally acknowledged that the main search engines pay little or no attention to these.
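As a small sketch of the two tags discussed above (the wording and the image file name are invented for illustration):

<meta name="description" content="BestWidgets supplies big yellow widgets and after sales widget services, shipped by post across the UK.">
<img src="yellow-widget.jpg" alt="Big yellow widget">

The description stays brief, works the main keywords in naturally, and reads like a sentence a searcher might see; the alt text is just a short, honest description of the image.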

5. Keyword density. Each search engine has its own preference as to how many times a keyword phrase appears on the page in order to signify the relevance of that keyword phrase (in other words, in order to help the search engine understand what the page is about). Around 5 to 8 per cent is a rough guide to the optimal level. Don't overdo it, otherwise it will be seen as spam or keyword-stuffing. Also use your keywords in the heading tags H1 and H2. There is an H3 tag as well, but it is doubtful whether search engines bother with that, as it is perceived as less prominent on the page, and therefore less relevant to what the page is about.
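A quick worked example of one common way that percentage is calculated (conventions differ, and the numbers here are hypothetical): if a page contains 300 words of body text and the two-word phrase "yellow widgets" appears 9 times, that phrase accounts for 9 x 2 = 18 of the 300 words, a density of 18 / 300 = 6%, which sits inside the rough 5 to 8 per cent band mentioned above. The same phrase used 25 times would push the density past 16% and start to look like keyword-stuffing.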

6. Don't forget good linking in your website marketing. Search engines will judge the importance of your web pages to some extent on the number and quality of incoming links from other sites. Ask other webmasters with sites on similar themes to yours for a link, in exchange for a link back. These sites should not be in competition with yours, but should be similarly themed. You may occasionally be asked by other webmasters if they can link to your site. If this is so then have a look at their site; make sure that their site is relevant, that it has at least some Page Rank, and that it just "feels" good, and has no nasty surprises like redirects or unexpected popups. You don't want to be associated with a "bad neighborhood"!

7. Make sure that important keywords are included in the anchor text within inbound links from other sites. This is crucial to search engines when they try to figure out the relevance and importance of your pages. The inbound link from the other site should take the form of something like this (I'm using normal brackets instead of angle brackets so as not to use compromising html): (A HREF="http://www.Yoursite.com")your important keywords included here(/A). You should definitely avoid something like (A HREF="http://www.Yoursite.com")click here(/A), which tells search engines nothing except that your site is about "click here". Be careful!

About the author: Gordon Goodfellow is an Internet marketing consultant and practitioner. He lives and works in London, UK, and has helped companies in many industry sectors, with clients worldwide. His main site, http://www.applied-web-marketing.com, is an introductory resource on Internet marketing for small businesses.

Tuesday, January 22, 2008

Keep Your Web Site Content Relevant

Author: John Metzler

Visitors and search engines love content-rich web sites, but just having a lot of content on your web site is not enough. It all has to be relevant to a main topic with each page or section of the web site having a specific theme (And yes, this includes any resource or links pages the site may have). Each page should have its own topic and content should not stray to a different topic.

If you are promoting your graphic design business and have a page on business card design, stay on the topic and refrain from using a page title such as "Graphic Design Company in Vancouver, Canada - business cards, logos, letterheads". You want business card design to be the most important key phrase.

There are two main reasons for content relevancy. The first is so that visitors have an easy time understanding the flow of your web site. Visitors who have to search through multiple pages to find the information they're looking for won't be visitors much longer. The average web site user takes about three seconds to decide whether or not to stay on a site. A clear idea of what your site is about should be apparent immediately, followed by easy navigation to other pages that display further topics in more detail.

The second reason for keeping content relevant throughout your web site is for search engine algorithms. Keyword relevancy is an important part of search engine optimization. The more relevant your web site's content is for a specific term, the more likely the site is to show up near the top of search results for the term.

Keyword density is another big deal with search engines. There is an optimal ratio of key terms to the overall amount of text that should be aimed for, for search engine optimization purposes. Unrelated terms used consistently throughout the content will bring down the percentage of the more important keywords. Keyword density matters throughout an entire web site, not just on certain pages.

Other areas to keep an eye on are the contact page, about us page, and any other pages that you may not think are important to have optimized for search engines such as advertising info, privacy policy, etc. For instance, some web sites have pages devoted to reciprocal links. There's nothing wrong with them unless you link out to a lot of unrelated web sites. The keywords that are used in the anchor text and surrounding description text will detract from your overall site content if they are not related. Incoming links from unrelated sites are fine, but keep in mind that the links page counts as part of your web site as a whole.

Consider using a reciprocal links page as more of a resource for visitors instead of a long list of irrelevant sites. This not only appeases search engines but your visitors as well. And as mentioned before, both visitors and search engines should be kept in mind when creating web site content.

About the author: John Metzler is the co-creator of Abalone Designs, Inc. - http://www.abalone.ca, a Search Engine Optimization company in Vancouver, Canada. He has been involved in web design and web marketing since 1999 and has helped turn Abalone Designs into one of the top SEO companies in the world.

Monday, January 21, 2008

Why valid HTML code is crucial to SEO

Author: Roseanne van Langenberg

Valid HTML code is crucial to search engine optimization

Copyright 2005 Marketing Defined. All Rights Reserved

Why is valid HTML code crucial to your web site's search engine optimization efforts and subsequent high rankings? Many webmasters and newcomers to web page design overlook a crucial aspect of web site promotion: the validity of the HTML code.

What is valid HTML code? Most web pages are written in HTML. As with every language, HTML has its own grammar, vocabulary and syntax, and each document written in HTML is supposed to follow these rules. Like any language, HTML constantly changes. As HTML has become a relatively complex language, it's very easy to make mistakes ... and we do know by now the favorable weight the new msn.com beta search engine places on proper coding practise ... see a recent article on msn.com coding requirements.

HTML code that does not follow the official rules is called invalid HTML code. Why is valid HTML code important to search engine optimization and your whole marketing effort? Search engines have to parse the HTML code of your web site to find the relevant content. If your HTML code contains errors, search engines might not be able to find the content on the page, and there end your SEO efforts and quest for high rankings for that page. Search engine crawler programs obey HTML standards; they can only index your web site if it is compliant with the HTML standard. If there's a mistake in your web page code, they might stop crawling your web site, and they might lose what they've collected so far because of the error. Although most major search engines can deal with minor errors in HTML code, a single missing bracket in your HTML code can be the reason your web page cannot be found in search engines. If you don't close some tags properly, or if some important tags are missing, search engines might ignore the complete content of that page.

How can you check the validity of your HTML code? Fortunately, there are free services that allow you to check the validity of your HTML code. The search engine optimization community's guideline HTML validator is the free W3C HTML Validator. It is the service that checks HTML documents for conformance to W3C HTML and XHTML recommendations and other HTML standards.
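To make "valid" concrete, here is a minimal sketch of the kind of page the W3C validator will accept without complaint (assuming an HTML 4.01 Strict doctype; the title and body text are placeholders):

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<title>Example page</title>
</head>
<body>
<h1>Example heading</h1>
<p>Every tag that is opened is closed, attribute values are quoted, and the doctype tells the validator which rules to check against.</p>
</body>
</html>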

Netmechanic in-valid HTML repair page where you have the option of automatically fixing the errors on that page. This Netmechanic resource does have a demo evaluation mode which enables self-application of the referenced alterations. Although not all HTML errors will cause problems for your search engine rankings, some of them can keep web spiders from indexing your web pages and spoil your search engine optimization efforts. Valid HTML code makes it easier for search engine spiders to index your site so you should make sure that at least the biggest mistakes in your HTML code are corrected. .. run your web pages through the W3C validator, make the recommended alterations and the new MSN.com beta search will love you .. the MSN search engine places a high value on proper coding practise. Entire article available at:
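
As a rough illustration (this example is ours, not from the original article), a single unclosed tag is exactly the kind of error the validator flags. In the invalid fragment below, the bold tag opened inside the heading is never closed, so a strict parser may mis-handle everything that follows:

    <!-- Invalid: the <b> tag opened inside the heading is never closed -->
    <h1>Valid HTML code is <b>crucial</h1>
    <p>Search engines parse this markup to find your content.</p>

    <!-- Valid: every opened tag is closed, in the right order -->
    <h1>Valid HTML code is <b>crucial</b></h1>
    <p>Search engines parse this markup to find your content.</p>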

Entire article available at: Marketing Defined: Valid HTML code crucial to SEO. This article may be reproduced in its entirety, with no alterations. The resource boxes, live URLs and author bio must be included.

About the author: Roseanne van Langenberg is a Marketing Consultant and Publisher from Melbourne, Australia. Her articles on legitimate SEO and Web Marketing techniques are at the

http://MarketingDefined.com/blog/

Sunday, January 20, 2008

KEYWORDS IMPORTANCE AND USES

Author: Vikas Malhotra

The first step in the area of search engine marketing (SEM) is to understand the relevance & importance of keywords. Keywords are the starting blocks of a successful SEM campaign.

So let's dig into this section of SEM (or, specifically, SEO, search engine optimization) by defining keywords and keyphrases.

KEYWORDS and KEYPHRASES Definition: the words or phrases on a site that match the search words or phrases entered into search engines by the target audience are known as keywords and keyphrases.

Importance of KEYWORDS and KEYPHRASES: Keywords and keyphrases are used to promote and market a website by improving its search engine rankings.

Where to use KEYWORDS and KEYPHRASES for Offpage Optimization: Keywords and keyphrases are used as the anchor text of links pointing to the site. Search engines, especially Google, attach a lot of importance to anchor text, so if there is a keyword in it, the site gets promoted for that keyword.

Keywords and keyphrases are also used to advertise, promote and market a site with PPC (pay-per-click) and PPI (pay-per-impression) campaigns.

Where to use KEYWORDS and KEYPHRASES for Onpage Optimization: Keywords and keyphrases are generally used within HTML tags (the title tag, meta description tag, meta keywords tag, body tag, anchor tags, comment tags, heading tags, alt tags, table tags, form tags, list tags, formatting tags, frame tags, map tags, applet tags, etc.) and in the URL itself. Apart from this, the keywords and keyphrases are used in the text of the page.

These are also used to create KRPs (Keyword Rich Pages), where a specific keyword or keyphrase is repeated so that the page ranks at the top for that particular keyword.

The strategy is to target one keyword, or at most two, per page. Once these KRPs are produced, it is easier to get them ranked in the search engines. These pages then link to other pages in the site and so serve as a passageway that invites the surfer to explore the site more deeply. These keyword-rich optimized pages are also called 'Doorway Pages.'

So let's start introducing the keywords where they belong:

KEYWORDS and KEYPHRASES usage in THE PAGE TITLE: --------------------------------------------------

The page title is the first element that search spiders find on a web page, so it stands to reason that the title of each page should be carefully crafted.

The page title text is placed between the <title> and </title> tags and is displayed in the title bar of the web browser. As it is one of the most important elements of the web page, the most important and essential keyword(s) and keyphrase(s) must be included in it. The keywords and keyphrases used in the page title also appear in the search engines' result listings, where they form the hyperlink the target audience clicks to reach the website from the search engine results page (SERP).

A title should generally be 60 characters or less, including spaces; Google cuts off title text after 60 characters when it is displayed in the SERPs. To achieve a high ranking, the main keyword or phrase for the web page should appear at the beginning of the title. The page title tag carries significant weight in most search engine algorithms.

Example:

The page title " keyword & kyphrases tutorial for search engine marketing " uses keywords ( keyword , keyphrases, tutorial, search engine marketing)

These keywords and keyphrases depend upon the domain, nature and objective of the web site.
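
As a minimal illustrative sketch (the markup is ours, not the author's), the example title above would sit in the head of the page like this, with the main keyphrase kept near the front and the whole string within roughly 60 characters:

    <head>
    <title>Keyword &amp; keyphrases tutorial for search engine marketing</title>
    </head>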

KEYWORDS and KEYPHRASES usage in THE META TAGS: -------------------------------------------------

Meta Tags: Meta tags are HTML text that describes information about the web page; in other words, they provide data about data (the web page content). Meta tags are invisible on the web page and do not appear in the browser window. To see them, select the View menu and then the Source option from the browser's menu bar. Each meta tag has both a 'name' and a 'content' attribute.

In fact, meta tags are one of the first elements I look at while assessing a competitor's web site.

Meta tags are placed in the head of the page, after the title tag (their exact position is not fixed; if you put meta tags elsewhere the page will not suffer).

The purpose of using the meta tag is to add relevant keywords that could be used by search engines while indexing a page.

Example:
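
An illustrative sketch of keyword and description meta tags follows; the tag values are our own examples, not taken from any particular site. Note that the keywords are listed without commas, as discussed below:

    <head>
    <title>Search engine optimization techniques tutorial</title>
    <meta name="keywords" content="search engine optimization techniques keyword research tutorial">
    <meta name="description" content="A tutorial on search engine optimization techniques and keyword research.">
    </head>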

Keyword and description meta tags can hold up to 1000 characters. They should not repeat the same keyword more than seven times, otherwise some search engines will regard this as spam. It is better to avoid keywords and phrases that have nothing to do with the page, although search engines do not punish pages that include keywords that cannot be found elsewhere on the page.

As you can see, we have chosen not to include commas. The search engines don't mind, and it opens up new keyword combinations. "search engine optimization techniques" includes no less than eight keywords or keyword phrases, including "search engine" and "optimization techniques".

About the author: I'm an eBiz consultant and owner of Mosaic Services- an SEO Company. I regularly write and submit articles on various SE specific topics.

To find more articles, please visit my site http://sem.mosaic-service.com

Keywords Finalization Methodology

Author: Vikas Malhotra

To arrive at the set of keywords that:

Describe the business correctly (are relevant)
Attract traffic (are popular and searched for)
Have less competition (are relatively un-optimized for)

Steps

Step I: Let's start by saying that for the keyword finalization of a web site, the first step is to devise the theme of the web site. Keywords should then be generated in sync with the theming structure of the site. The home page and the other higher-level pages should target more general (main theme) keywords. The deeper pages (embedded in subdirectories or subdomains) should target more specific and qualified keywords.

Once the site's themes and sub-themes are done, let's start looking for the keywords.

Step II:

The finalization of the keywords for any given site can be done in the following way:

Generation of the seed keywords for the site (theme keywords).

Expansion of the seed keywords into key-phrases by adding qualifiers (sub theme keywords)

Generating a larger set of keywords by word play on the key-phrases generated above (sub-theme targeting).

Let's take them one by one:

SEED Keywords/Primary keywords:

The seed keywords can be generated by either of the ways mentioned below:

The client provides the terms he feels are relevant to his business.

The SEO firm generates the seed words by understanding the business domain & the business model of the client.

Some outside domain consultant provides them.

Another way of generating seed keywords is to look at the meta tags of competing web sites.

WARNING: do not place any unnecessary emphasis on these tags. Use them just to generate your seed keyword list.

If one already has a certain set of keywords, then tools like WT and Overture can also be used to arrive at other relevant seed keywords.

Typically, seed keywords are single words. A good number of seed keywords is between 10 and 12.

SUB theme Keywords (add Qualifiers)

Now to these seed keywords add qualifiers.

These qualifiers can be anything location/sub-product/color/part no/activity/singular etc.

By using these qualifiers one can expand the list of seed keywords; a good number would be anywhere between 20 and 30.

Typically, a sub-theme key phrase is two to four words long.

One recent study suggests that

The typical searcher often uses longer queries, and many contain more than three words. Within three different search engines, keyword distribution data tells a compelling story:

Words in Query    LookSmart (%)    Ask.com (%)    Teoma (%)
1                 27.00            12.76          38.04
2                 33.00            22.46          29.59
3                 23.00            19.34          18.13
4                 10.00            11.89          8.00
5                 7.00*            7.86           3.51
6                 -                6.19           1.39
7                 -                5.47           0.63

* LookSmart does not report beyond 5 search terms; queries of five or more terms are grouped into one category.

Approximately 40 percent of queries in LookSmart have three or more words. About 32 percent in Teoma have three or more. Ask Jeeves has an even higher skew, nearly 62 percent, because of its natural language focus. Within FAST, the database that powers Lycos and others, the average is 2.5 terms. That suggests a similar frequency distribution to LookSmart and Teoma.

Hence we can keep the average length of sub-theme keywords at around three words.

About the author: I'm an eBiz consultant and owner of Mosaic Services- an SEO Company. I regularly write and submit articles on various SE specific topics.

To find more articles, please visit my site http://sem.mosaic-service.com

Saturday, January 19, 2008

Types Of Keywords

Author: Vikas Malhotra

Keywords can be classified into three categories:

- Single-word keywords
- Multiple-word keywords
- Theme-based keywords

Keywords are the basic raw material used in Search Engine Optimization (SEO). Keyword selection must be done through keyword research, where we use special tools to find out which keywords (search terms) the targeted audience has been searching for recently. A keyword can be a single word, two or three words, multiple words, or theme based.

Let's detail each of these one by one. Understanding these categories of keywords also helps one decide how to target them on specific pages.

Single-word keywords ----------------------

Single-word keywords are used to target a large volume of traffic, but they put you in a highly competitive category of sites. Single-word keywords are known as generic keywords, with which we target a general audience. A single-word keyword does not help target a specific page at a specific audience.

Although single-word keywords can bring huge traffic, these terms are mostly not that relevant these days, as searchers mostly use two or three keywords to find the information they need.

Instead, single-word keywords work well as theme keywords.

We can use such a primary keyword 5 to 7 times on a web page for good theming of the site. The inside pages of the site can then qualify these themes into product or service categories by adding qualifiers to the theming keywords.

Example:

Keyword "Services" will produce result of all the web sites related to Services which can be United States Department of Health and Human Services, The Substance Abuse and Mental Health Services, direct Services, online Services, offline Services, Indian Services, American Services, food Services, agricultural Services, business Services, free Services, paid Services, etc.

Multiple-word keywords -------------------------

Multiple-word keywords are used to target specific traffic, which leads to higher sales and top positions in search result listings, and improves your ability to compete on page rank. Multiple-word keywords are known as specific keywords, with which we target a specific audience rather than the general searcher. Multiple-word keywords help target a specific page at a specific audience.

Multiple-word keywords are, more often than not, location specific and related to the geographic area of your intended services or products.

Theme-based keywords -----------------------

Theme-based keywords are used to target a highly targeted audience, which leads to quick, high sales, top positions in search result listings, and a boost in page rank. Theme-based keywords are known as conceptual keywords.

Theme-based keywords cover all the possible primary keywords related to the web site and therefore attract high traffic from the targeted audience. Theme-based keywords add quality by providing keyword-rich text to search engines, which improves results and provides targeted search result listings.

Theme-based keywords are used on each page of the web site to focus on a specific topic correlated to a targeted topic or idea (theme). Theme-based keywords effectively contribute to the growth of informative pages for the site, each focusing on a different related topic.

Keywords Targeting Strategy ---------------------------

Keywords Targeting Strategy is the tactic that guides the placement of primary keywords, whether single-word, multiple-word or theme-based, according to their importance, for successful optimization of the site.

Vikas Malhotra.

This article can be reprinted as long as the resource box stays intact.

About the author: I'm an eBiz consultant and owner of Mosaic Services- an SEO Company. I regularly write and submit articles on various SE specific topics.

To find more articles, please visit my site http://sem.mosaic-service.com

Friday, January 18, 2008

Holistic Search Engine Optimization

Author: Biana Babinsky

We all have heard that search engines are the number one resource for consumers to research and buy products or services online. In fact, marketing studies have shown that over 70% of web surfers use search engines to locate online merchants. The ramification of this statistic is simple: if you want to reach the members of your target market you need to make sure that your business web site appears in popular search engines.

Search engine optimization (SEO) can help you achieve this. SEO is a crucial online marketing technique. It helps you get your web site to appear in search engine result pages (SERPs) for keywords that are relevant to your business. For example, if you are a business coach, and someone is searching for "business coach", it would be great for your business if your web site appears in the first 10 results.

So what exactly is search engine optimization? SEO includes selecting the best keywords for the body of the page as well as for other page elements (the title tag, keyword and description meta tags, etc.). Carefully chosen keywords provide "hints" that search engines use to classify your site and to determine whether your site should appear for certain search keywords. On-page keyword optimization by itself will help your rankings. However, holistic search engine optimization is required to achieve optimal results. For the best placement, focus on SEO as a whole rather than concentrating on a particular technique.

Here are a few important elements of holistic search engine optimization:

* Have good content. After all, content is king (or queen, whichever the case may be). Without content there isn't much of a site. The content must walk a fine line between providing useful content and including sufficient and relevant keywords for search engines.

Often it is difficult to optimize your product or service pages for search engines, since the main goal of these items is to showcase the benefits of your offer. On the other hand, it is easier to optimize blog posts and articles. In fact, you can create entire articles and blog posts around keywords that are popular with your target market.

* Have consistent navigation. Both your web site visitors and search engine spiders navigate your web site through links that connect different pages. Consistent navigation ensures that your visitors can reach every page on your web site from every other page. On top of that, you will get the most out of your search engine optimization efforts.

* Get links to your site. Links to your web site play a very important role in holistic search engine optimization. Many popular search engines evaluate your web site's popularity by counting the number of links that point to your site. The more popular your web site is, the better rankings it gets on search engines. The best links are links from reputable, quality web sites that have complementary or related content.

You can get links to your web site by using a variety of techniques, such as writing articles, participating in link exchanges, and participating in online networking.

Take a holistic approach to web site optimization. Apply SEO to your web site as a whole, and always evaluate the overall impact of your marketing efforts.

About the author: Biana Babinsky shows you how to drive thousands of targeted visitors to your web site in her Step by Step Search Engine Optimization Special Report. Learn more about the report at http://avocadoconsulting.com/rlinks/zseo

Thursday, January 17, 2008

Would You Let A Dog Or A Butler Market Your Website?

Author: Michael Cheney

The latest news to hit the Internet's 'water cooler' is that Lycos and Ask Jeeves are to begin offering their own SEO services.

Search engine optimisation (or SEO) is any practice related to the end goal of improving your website's positioning in the search engines.

The brief history of SEO is that it first started out solely as the remit of the developers themselves. This was back in the day when one person designed, built, maintained, hosted and marketed a website.

The Ever-Changing Internet Landscape

During the past few years, however, the entire Internet industry has divided up into a myriad of divisions, to such an extent that SEO is now a booming sector in its own right. Entering the term "seo" on Google returns nearly 20 million results!

Now businesses recognise that it is no longer possible to expect one individual or company to possess all the necessary skills 'under one roof' to be able to achieve great results on the graphic design, technical construction AND marketing of their website.

This has led to the trend of bringing in specialist search engine marketing consultants or companies to assist.

As Google and the like are spending millions every year on developing their tools to accurately sift through the billions of web pages that exist, it really is a full-time occupation just to keep track of what has an effect on search engine ranking and which strategies work best.

Everyone Wants 'Organic' Nowadays!

Ask Jeeves and Lycos are now offering their SEO service to help companies get found in the 'organic' results of search engines.

'Organic' results are those results in a search engine listing that are non-paid. Almost all search engines now offer businesses the shortcut of simply paying to get an ad listed, whereas a recent study indicates that only 28% of searchers will click on an advertisement as opposed to one of the 'organic' results (on Google).

This is the reason we're all fighting for high rankings in the 'organic' results. This demand from businesses is also presumably the main driving factor behind Ask Jeeves and Lycos starting to offer their own Search Engine Optimisation services.

Money Well Spent?

In short - I'm not sure.

If you have a 50-page website, getting Lycos to provide you with SEO services will cost you a tidy $10,000. The service also includes the questionable tactic of re-submitting your website over and over to the search engines, which is both pointless and risky, as some engines will act against sites that are submitted more than once.

$10,000 sounds like a lot to me for a service that is only dealing with search engines and which talks of 'recommendations' that you then have to implement yourself.

Really though - it comes down to what you are trying to achieve with your website.

My two cents on this is that the Lycos service doesn't devote enough attention to links - your money may be better spent on getting more high quality in-bound links or on a well-managed Pay Per Click Campaign.

Ask Jeeves http://www.askjeeves.com

Lycos' SEO Service http://ranking.lycos.com/sitesidemore.html

Search Engine Users Survey http://www.clickz.com/stats/sectors/software/article.php/3348071

About the author: Michael Cheney is Author of The Website Marketing Bible™. Take the Free 7-Part Course "Internet Marketing Made Easy" and get your free sampler of 'The Bible' here: http://www.websitemarketingbible.com/marketing/

Wednesday, January 16, 2008

Google's New SEO Rules

Author: John Metzler

Google has recently made some pretty significant changes in its ranking algorithm. The latest update, dubbed by Google forum users as "Allegra", has left some web sites in the dust and catapulted others to top positions. Major updates like this can happen a few times a year at Google, which is why picking the right search engine optimization company can be the difference between online success and failure. However, it becomes an increasingly difficult decision when SEO firms themselves are suffering from the Allegra update.

Over-optimization may have played the biggest part in the dropping of seo-guy.com from the top 50 Google results. Filtering out web sites that have had readability sacrificed for optimization is a growing trend at Google. It started with the Sandbox Effect in late 2004, where relatively new sites were not being seen at all in the Google results even with good keyword placement in content and incoming links. Many thought it was a deliberate effort by Google to penalize sites that had SEO work done. It's a few months later and we see many of the 'sandboxed' web sites finally appearing well for their targeted keywords.

With 44 occurrences of 'SEO' on the relatively short home page of seo-guy.com, and many of them in close proximity to each other, the content reads like a page designed for search engine robots, not the visitor. This ranking shift should come as no surprise to SEO professionals as people have been saying it for years now: Sites should be designed for visitors, not search engine robots. Alas, some of us don't listen and this is what happens when search engines finally make their move.

One aspect of search engine optimization that is also affected, in a roundabout way, is link popularity development. After observing the effects of strictly relevant link exchanges on many of our clients' sites recently, we've noticed incredibly fast #1 rankings on Google. It seems Google may be looking out for links pages designed for the sole purpose of raising link popularity, and devaluing the relevance of such sites. After all, if a links page on a real estate site has 100 outgoing links to pharmacy sites, there has to be a lot of content on that page completely unrelated to real estate. Not until now has that been so detrimental to a site's overall relevance for search terms. It goes back to the old rule of thumb: make your visitors the top priority. Create a resources page that actually contains useful links for your site users. If you need to do reciprocal linking, then keep it relevant and work those sites in with other good resources.

Keeping up with the online search world can be overwhelming for the average small business owner or corporate marketing department. Constant Google changes, MSN coming on the scene in a big way, and all the hype around the new Become.com shopping search function can make heads spin. But just keep things simple and follow the main rules that have been around for years. Google, as well as other search engines, won't ever be able to ignore informative, well written content along with good quality votes from other web sites.

About the author: John Metzler is the co-creator of Abalone Designs, Inc. www.abalone.ca, a Search Engine Optimization company in Vancouver, Canada. He has been involved in web design and web marketing since 1999 and has helped turn Abalone Designs into one of the top SEO companies in the world.

Tuesday, January 15, 2008

Keywords usage in the Body Text

Author: Vikas Malhotra

Once the site's keywords have been finalized, it is time to start using them to optimize the site. Once the title and meta tags have been optimized, it is time to start optimizing the content of the site with keywords.

KEYWORDS and KEYPHRASES usage in Body text:

The body text is the gist of a web page: it presents the overall information of the page in a precise way. We should use keywords and keyphrases in the body text in a standard, natural way, since it is read by the targeted audience. This practice is also rewarded by search engines.

The body may consist of headings, pictures, tables, frames, forms and text paragraphs. Therefore, we can include keywords and keyphrases within the corresponding HTML tags: heading tags, paragraph text, bold and other formatting tags, list tags, and so on.

In the case of tables, frames and forms, we can include keywords and keyphrases in their respective headings.

One thing to consider is that keywords and keyphrases should be included in descending order of their importance. As a rule of thumb, at least four or five informative paragraphs (including the important keywords and keyphrases) should be included on each page, totaling at least 100 to 250 words or more.

Remember to include targeted keywords and keyphrases in the first paragraph of the body text, since some search engines take the first few lines of the web page to show as the description in the search result listing.

Therefore, it is normally recommended that at least the first 25 words of the first paragraph be keyword rich.

Starting the body of a page with an image is also not a good idea, since most search engines can't read images, although some engines can read the Alt text of an image.

Putting keywords and keyphrases in bold or in a large font may also help improve the search result listing.

Example: We can write a paragraph giving the gist of the web page that includes the targeted keywords and keyphrases.

First Paragraph :

"Keywords and keyphrases should be used in the page title tag, the meta tags (meta keywords tag, meta description tag, meta robots tag), the header tags (H1 to H6), the body text (in paragraphs and in the headings of forms, frames, lists, etc.), alt tags, anchor tags, the domain name, the URLs of directory pages, comment tags, etc."

"Keywords and keyphrases used in the different parts of the web page body text are a very important method of Search Engine Optimization (SEO). Search engines boost the search result listing on the basis of the keywords and keyphrases included in the domain name, the URLs of directory pages, the page title, the body text, the meta tags, header tags, anchor tags, alt tags, etc. Therefore, we can safely say that keywords and keyphrases are one of the main components of Search Engine Optimization (SEO)."

KEYWORDS and KEYPHRASES usage in Alt tag:

The use of keywords and keyphrases in the alt tag is also an important factor in boosting the page rank assigned by spiders. The main reason for using keywords in the alt attribute of an image tag is that search engines usually can't read images. To overcome this problem we use the alt tag, which describes the image (.gif) or graphic (.jpg). The text in alt tags is indexed by the search engines.

The alt text is also used to describe the contents of a picture or graphic while that image is downloading, or while the visitor is moving on to another page. Sometimes the alt tag is also used to describe the contents of the image when the browser is running in graphics-off mode.

The alt tag should start with a keyword or keyphrase. Alt text can be used with an image, a clickable image, or text. Alt tag text should be a simple, short, meaningful sentence containing the target keyword or keyphrase, because it is read by the audience.

In the case of linked images, we can use a set of keywords with a short description of the page being linked to; the border attribute should be set to zero (0), and the names of the images should be relevant, which helps improve the validity of the web page.

Example: Alt tag text for an image without a link.
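
A minimal sketch (the file name and alt text are placeholders of our own):

    <img src="seo-tutorial-diagram.gif" alt="Search engine optimization tutorial diagram">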

Example: Alt tag text for an image with a link.
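
And a similar sketch for a linked image, with the border attribute set to zero as recommended above (the URL and file name are again placeholders):

    <a href="http://www.example.com/seo-tutorial.html">
    <img src="seo-tutorial-button.gif" alt="Search engine optimization tutorial" border="0">
    </a>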

KEYWORDS and KEYPHRASES usage in Anchor Tag :

Keywords and keyphrases used in the anchor tag contribute to raising the ranking of the web page. The anchor tag is used to move from one page to another; we use the href attribute to specify the URL of the target web page. An anchor link can be created on text as well as on an image or picture.

Example :

Page Rank in Search Engine Optimization (SEO)
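
A hedged reconstruction of what such an anchor might look like in HTML; the URL is a placeholder, and the anchor text is the example phrase above:

    <a href="http://www.example.com/page-rank-seo.html">Page Rank in Search Engine Optimization (SEO)</a>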

Google has started to attach a lot of importance to the anchor text in anchor tags. In fact, the use of keywords in anchor tags carries such a high place in Google's ranking algorithm that a phenomenon called Google bombing started.

In a Google bomb, a huge number of people link to one particular page using a particular text, even though that text has no relevance to the content of the page. However, when the text used as the linking text is entered as a search query, the page all these links point to appears at the top.

As an example, if everyone links to this page using classical music as the hyperlink text, then this page has a chance of appearing at a top rank for the keyword classical music in Google.

KEYWORDS and KEYPHRASES usage in Comment Tag :

Keywords and keyphrases used in comment tags also contribute to raising page rank by increasing the keyword frequency of the web page. We can use targeted keywords and keyphrases in the comment text, instead of plain text, to describe the page content.

The comment tag can be used after the title, description or robots tag in the header section. We can also use a comment tag after the header if the opening of the body does not contain keyword-rich text, and before the start of important content such as Flash content, JavaScript, ASP or applet code, images, and any sections we need to identify for easy access.

Although only Inktomi uses this factor to boost page rank, it is advisable to use comment tags with keywords to organize the page content, which also helps with easy editing and review.

Example:
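
An illustrative comment tag containing keywords (the wording and file name are ours), placed just before content that spiders cannot read, such as a script:

    <!-- search engine optimization keywords and keyphrases: navigation script below -->
    <script type="text/javascript" src="menu.js"></script>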

Monday, January 14, 2008

DigitalPoint COOP Ad network

Author: David A. Saharkhiz

Link popularity is important for any online business, and a powerful new text-link network has emerged on the radar. Best of all, it's free.

One of the internet leaders in Search Engine Optimization and webmaster tools, DigitalPoint (PR7), has unveiled a powerful free opportunity for webmasters: a free advertising network. You can display text links or banner ads in return for having your website link shown across many of the thousands of websites in the network.

The service is non-profit and the servers for the project have been funded by donation only. This service benefits all webmasters who join, but those with bigger and better websites will naturally get a larger slice of the pie. Basically, you share advertisements with fellow webmasters and gain additional traffic.

But the real beauty of this program isn't the free traffic!

The COOP Advertising Network uses standard text links, not JavaScript links, so when search engine bots visit a site showing your advertisement in someone else's coop ads, you'll receive a valid backlink for it! The larger the site, the more you will be displayed across the entire network, and the backlink power of the DigitalPoint COOP ad network has skyrocketed rankings in the SERPs for many participants.
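
The distinction matters because crawlers read plain anchor tags but generally do not execute JavaScript, so a link written out by a script is invisible to them. A rough illustration; both snippets are hypothetical and are not the network's actual ad code:

    <!-- A standard text link: crawlers can follow this and count it as a backlink -->
    <a href="http://www.example.com/">Example anchor text</a>

    <!-- A JavaScript-written link: most crawlers will not see the anchor it produces -->
    <script type="text/javascript">
    document.write('<a href="http://www.example.com/">Example anchor text<\/a>');
    </script>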

It takes a few minutes to set up, it's not spam, and it's provided as a free service by Digitalpoint (presumably to increase the popularity of the site). It's really worth it, and within weeks you'll be seeing quality backlinks and traffic.

Sign Up Today!

I know some of you are skeptical, and almost all of you will have questions. I don't work for DigitalPoint, but I am a firm supporter of the network, as it has greatly benefited some of my endeavours. If you would like to field questions to me about the network, feel free to e-mail me.

Originally posted by David Saharkhiz at GoArticles .

About the author: David Saharkhiz is a computer science major and National Merit Semifinalist at America's Clemson University. He provides comprehensive web help and HTML help, codes free HTML scripts, and works to help novice webmasters set up new websites.

Sunday, January 13, 2008

Search Engine Optimization that Works in the Long-Term

Author: Hristo Hristov

Search engines are constantly tweaking their ranking algorithms, and when that happens some pages lose their top ranking positions. One such event was the infamous Florida Update, when many pages were practically kicked out of the top 1000 results for competitive keywords.

With recent updates, webmasters have been thinking that Google does not use PageRank, because low-PR pages can get very good rankings. Before that, everyone was saying that PageRank was THE factor for top positions. Now everyone is saying that keyword-rich anchor text links from many different sites are the key to the top ranks.

All these recent events seem to indicate that search engine algorithms are totally unpredictable, right? Wrong!

All search engines are going in the very same direction. The scientific literature related to information retrieval and recent search engine patents reveal the not-so-distant future of search engine ranking algorithms.

Introducing Topic Specific Link Popularity

For the last few years search engines relied on General Link Popularity to assess the importance of every page. Relevancy was based on a combination of General Link Popularity (importance) and keyword matches on page and off page (anchor text of links for specificity).

General Link Popularity is measured by summing the weight of ALL incoming links to a page. With General Link Popularity ANY link improved the importance of a page. Webmasters started to buy high-PR links from totally unrelated sites. Pages were getting unrelated votes.

To combat this problem, Google implemented a Topic Specific Link Popularity algorithm. When a user submits a query, Google determines the importance of a page by the link popularity it gets from pages related to the query's keywords.

A link from a page will give you considerable Topic Specific Link Popularity when:

1) the page itself is optimized for your keywords

2) the page has a high General Link Popularity (PageRank)

3) the page is from a site owned by someone else (you can't vote for yourself)

From a search engine's point of view, implementing a Topic Specific Link Popularity algorithm is a very tough task when the queries need to be answered in less than a second.

All you need to know is this: the top ranked pages for competitive keywords are the ones with the highest Topic Specific Link Popularity.

You need links from pages that have high PageRank, are optimized for YOUR keywords and are owned by someone else.

How do you get these links?

1. Search for your keywords on Google and look at all pages that rank for your keywords. Seek links from these pages.

2. Reciprocal Links. Swap links with sites that can give you a link on a page optimized for your keywords. Look for pages with high PageRank that have your keywords in their title and in their incoming links. Reciprocal links work provided they come from pages optimized for your keywords (related pages).

3. Buy links from some of the top-ranked pages for your keywords.

4. DMOZ and Yahoo's directory usually have pages that are very well ranked for your keywords. You absolutely must get links from these pages. If you have a commercial site, don't hesitate and buy a link from Yahoo immediately. It is well worth the $299.

5. Find out who links to the top ranked pages for your keywords. Many of their links will not be topic specific, but many WILL be. Try to get links from the related ones. A page is related when it has your keywords in its title, text etc.

6. Form a link exchange ring with some of your competitors. That's a brutally effective strategy. Basically, you link to your competitors from your main optimized page (usually the home page) and they link to you from their most optimized page! Such rings can dominate the top positions and will be very difficult to outrank (it is difficult to get that amount of topic specific links). The caveat here is that the link exchange is on the main page and is not buried somewhere deep.

One more very important tip.

Increase the relevancy of the page that links to you by using your keywords in the anchor text and in the description of your site! Yes, having keywords in the links pointing to your page increases your rankings not only by associating the keywords with your page but also by increasing the relevancy of the page that gives you the link. That's the reason SEOs think anchor text is the most important factor. It is NOT. You can get a monstrous ranking boost from a link that does not use your keywords in the anchor text, provided that the linking page has high PageRank and is optimized for your keywords (an example would be a DMOZ listing).
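
As a small illustration (the markup, URL and wording are ours), a link whose anchor text and surrounding description both use the target keywords adds those keywords to the linking page itself, which is the effect described above:

    <p>
    <a href="http://www.example.com/">Search engine optimization guide</a>
    - a step-by-step guide to search engine optimization and topic specific link building.
    </p>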

What about getting unrelated links?

Let's say you buy a high-PR unrelated link. The page that links to you does not have your keywords in its title and text. The only thing that makes the link relevant to your keywords is the anchor text of the link and your description. You'll still get some benefit, but that's nothing compared to a link from a page optimized for your keywords.

Your site can't get into Google's top 1000 results?

If your site lacks Topic Specific Links, it may get filtered out from the results even if it has a good amount of PageRank (from non-related or affiliated sites). You need some threshold amount of Topic Specific Link Popularity to get into the top 1000 pages for very competitive keywords.

Two Final Points

1. Only one link per site can give you a Topic Specific ranking boost. Look for a link from the page on that site that is most optimized for your keywords.

2. If you find a page that ranks well for your keywords, go for the link EVEN if that page has a lot of links on it.

To recap: the more optimized a page is for your keywords (as measured by its PageRank and by the keywords found on-page and off-page), the bigger the Topic Specific Link Popularity boost you will get from a link on it.

Topic Specific Link Popularity is and will be the key for top rankings. Anchor text plays a major role but it is not THE factor. PageRank is still very important especially the PageRank of the pages that link to you.

About the author:

Hristo Hristov, owner of the Search Engine Optimization Guide

Saturday, January 12, 2008

Do the Search Engines Know Your Website?

Author: Chet Childers

Are you considering a search engine promotion campaign to improve your website's search engine visibility? To aid in your decision, have you checked your website to determine its search engine awareness?

Perhaps you may be thinking why do I need to check my website? Do you remember going to the doctor for an illness? Hopefully, your doctor performed some tests to diagnose your illness before prescribing your medication. If not, you may have gotten some very undesirable results. In the case of your website, you need to diagnose the patient and determine the extent of your website's search engine visibility. Based on your findings, you can decide to focus on standard search results or paid search results.

Your checks should be done in the Google and Yahoo search engines, since they are the major search engines in today's marketplace. You should also consider MSN, since they have recently released their own search engine; it is only a matter of time before they are considered a player, if not already. Your first check determines whether your website is indexed in Google or Yahoo. Think about the card catalog in your library: each book has a card in the card catalog signifying that the book is indexed in the library. In this exercise you'll check whether your website has a "card" in the Google or Yahoo card catalog.

The first exercise is to open your web browser to the Google (www.google.com), Yahoo (www.yahoo.com) or MSN (www.msn.com) website. Enter your website's URL, www.yourdomainname.com, in the search box and review the search results. Your website's domain name or URL should appear in the search results if your site is indexed.

If just the domain name or URL is listed, the website has been indexed but not crawled by the search engine spider. A domain name plus several lines of descriptive text indicates the website has been crawled by the search engine spider to determine the website's content. Also look for the link "Cached Page" or a similar phrase; this is another indication that the website has been crawled for content by the search engine. Now you should know whether your website is indexed and has a "card" in the search engine card catalog.

Your second check evaluates the extent of your website's indexing by determining which of its pages are indexed by the respective search engine.

The second exercise is to open your web browser to the Google, Yahoo or MSN website. Enter the "site" command and your website's URL in the search box in the format "site:www.yourdomainname.com" (quotation marks not required) and review the search results. All of your website's pages indexed by the queried search engine should be listed. At the top of the search results screen look for "Results X - Y of about Z from www.yourdomainname.com" or a similar phrase. The value of Z indicates the total number of your website's pages indexed by the queried search engine.

In addition, you might see the comment "repeat the search with the omitted results included" in the results listing. Since the search engine does not always show the entire listing, you can click that link to view the complete list of your website's indexed pages.

Your third check determines the number of inbound links from other websites to your website. The number of inbound links is very important in the ranking algorithms of many search engines. The commands for this exercise differ slightly between the search engines; we'll review the commands for the three major search engines. The other search engines have similar commands.

Google & MSN

Open your web browser to the Google or MSN website. Enter the "link" command and your website's URL in the search box in the format "link:www.yourdomainname.com" (quotation marks not required) and review the search results.

All the external website pages that link to your website and are indexed by the queried search engine should be listed. At the top of the search results screen look for "Results X - Y of Z linking to www.yourdomainname.com" or a similar phrase. The value of Z indicates the total number of external pages linking to your website.

Yahoo

Open your web browser and go to the Yahoo website. Enter the "linkdomain" and "-site" commands and your website's URL in the search box in the format "linkdomain:www.yourdomainname.com -site:www.yourdomainname.com" (quotation marks not required). The "-site" command is used to exclude internally linked website pages from the results.

Review the search results. All the external website pages that link to your website and are indexed by Yahoo should be listed. At the top right of the search results screen look for "Results X - Y of Z linking to www.yourdomainname.com." The value of Z indicates the total number of external pages linking to your website.

Do not be alarmed if the displayed results from Yahoo and MSN are significantly different from Google's. Recently, Google began displaying limited or no results for the "link" command. The general conclusion in the search engine optimization community was that Google felt disclosing the link information might share too much about its ranking algorithm and that algorithm's impact on a specific website.

From these three simple checks you should have determined whether the search engines know your website exists, and you should be prepared to select the proper search engine promotion campaign. A pay-per-click search engine campaign is a good choice if your website's search engine visibility is limited and the deadline for achieving your desired search engine rankings is short. If the time to achieve the desired rankings is not critical to your website's marketing plans, then a search engine optimization campaign to improve the standard search results could be the logical decision.

About the author: Chet Childers is a successful Internet marketer utilizing both pay-per-click marketing and search engine optimization to increase website visibility and profitability. Click http://www.ThePayPerClickMarketer.com and enroll in our e-course, "Discover Tips and Secrets for Pay-Per-Click Marketing Success," or visit http://www.ChetChilders.com.