Monday, March 31, 2008

MSN Beta Search Phenomenon

Author: Mark White

I don't fully understand the algorithms involved with the new beta search from MSN, but I have to say, as a site owner, I love it.

After releasing my new site to the World Wide Web, I am already really impressed with the speed with which it was indexed, and even more impressed with the ranking it has been given.

I let my newest site ( www.buy-dvds-online.com ) out of its box less than two weeks ago, and it has already attained a position of 3 out of 1,589,310 for the keyword search "buy dvd's online".

How is this happening?

Well, I have looked at all sorts of forums relating to MSN Beta, and it seems that it genuinely searches for the most keyword-relevant site it can find.

The name of the site in relation to the keyword or words searched for has proved as important as ever: when I searched for the keywords "free traffic tips" on MSN, I got www.freetraffictip.com as the number 1 result.

When I searched for the same keywords on Google, the same site came in at number 41.

Further searching for some more obscure phrases seems to show that MSN Beta places a high priority on keyword density. This will reward those of us whose sites contain relevant keywords and who do not rely on heavy backlinks (although I believe backlinks are vital to getting a decent rank with other engines).

The fact that I am ranked at all means that I have escaped the Google "sandbox"; I don't expect to receive a PR for at least two months, or a decent listing from Google for the same period.

Let's hope that MSN Beta search quickly replaces the current MSN search. My opinion is that we should all be able to get the placement we deserve rather than one we have effectively bought or traded; it gives us little guys a chance.

I still love Google; it provides us with a challenge and is still the number one search engine, but I still want the traffic I get from the smaller engines.

About the author: Mark White has built and run 3 sites over the last 4 years, has worked in I.T. for 15 years, and is actively involved in traffic/search engine related sites. His sites are $10 dollar downloads.com, sunspeks.com and Buy-dvds-online.com

Sunday, March 30, 2008

Search Engine Optimization Is For Fools?

Author: Kevin Emswiler

Do you know how to get your website on the first page of Google's search results? If you don't, you soon will.

Regardless of how you market your website, you should take the time to learn search engine optimization. There's no other marketing method I know of that can bring a constant flow of free, quality traffic to your website every day.

Before I show you how to optimize your web pages, I'd like to make a quick point. You don't need a large site to get great rankings on Google. You could have a one-page website, optimize it for one keyword, and be on the first page of Google's search results for that keyword. In practice this is hard to achieve, because Google's search engine values link popularity, and most websites will not link to yours if it's only one page; your site wouldn't be providing any value to their readers. A website of just 5 pages, however, can be of value as long as it provides good quality information.

Now let's optimize your web pages. Optimize each of your web pages for one keyword. Put the keyword in your title, description, and keyword tags. Here's an example. Let's say you're creating a website about eBay and one of your web pages is about eBay auctions. This is how the tags will look.

Ebay Auctions: How to properly setup your auctions
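
Written out as HTML, those three tags might look something like the sketch below (the description and keyword text are illustrative placeholders, not taken from the original example):

<title>Ebay Auctions: How to properly setup your auctions</title>
<meta name="description" content="Learn how to properly set up your eBay auctions step by step.">
<meta name="keywords" content="ebay auctions">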

Next, write an article about eBay auctions and use the keyword throughout the article. I like to use my keywords once per paragraph. The longer your article, the more times you can use your keyword without sounding corny or abusing Google's search engine.

Then ask other websites in your industry to place a link on their website to yours. This will help Google index your site faster and rank you higher in their search engine.

About the author: Kevin Emswiler's goal is to help beginning internet marketers establish their own home based business. For more marketing advice, visit http://www.chit-chat-cash.net

Saturday, March 29, 2008

Google's Suggestion Beta Search - New SEO & Webmaster Wonder Tool or Overrated Popularity Meter?

Author: James R. Sanders

With Google's release of their new Beta Google Suggest site, many SEOs are jumping on the bandwagon to preach praises about Google Labs' latest breakthrough. However, in a recent article published by Site Pro News, entitled "A is for Amazon, B is for Best Buy Google Suggest Feature", Mike Banks Valentine, noted SEO of SEOptimism.com, would have us believe that the "results" shown in Google Suggest are the number of SEARCHES conducted at Google for a given search term. I would have to disagree with him, given the research I've done on the groundbreaking new tool. However, before I put the cart before the horse, maybe I should explain this new "tool" just in case you haven't heard of it yet.

Google Suggestion Beta – Salvation for SEOs & Webmasters, or a Marginal Tool to Gauge Search Popularity?

Google has finally rolled out a new tool touted as their latest breakthrough in logical search suggestion. In a nutshell, this tool follows your typing, letter by letter, and as you continue to formulate your search query, Google Suggest opens a drop-down box of suggestions based on what it thinks you might be searching for. As you type each letter in your query window, Google Suggest continues to update the list based on the information you type. The Beta site is located at http://www.google.com/webhp?hl=en&complete=1. After reading Mr. Valentine's article, I was quick to drop by the Beta site to check this new tool out. I thought to myself, "If this actually reports the number of searches being done at Google for a particular search term or phrase, then boy is it going to be easy to evaluate the best search terms to use when optimizing a page for Google." I felt like a kid on Christmas morning running downstairs to see what Santa brought me.

The Investigation Begins – My Hopes Become Fears.

My first idea was to type web site design into the search box query area. No sooner had I finished typing "web si" than up popped the box listing "web site design" as the top suggested pick, with the results showing "22,600,000 results". My first thought was "Holy crap, there's THAT many searches being done at Google on web site design?" My next question was "Now I wonder if that is per month, to date since Google has been in operation, or what?" After some additional thought, my mind began to clear and the horror struck me. The next question was "What if these are just the number of matching results in the Google database for the search term web site design?" My hopes and dreams of a new tool to demystify Google search popularity began to disintegrate as my mind started to rationalize the situation and ponder the question further.

Comparing Suggestion Beta "Results" with Google SERP "Results".

I decided to click the top suggestion "web site design" and see what happened. As I looked at the SERP, the top line stated "Results 1 - 10 of about 32,700,000 for web site design. (0.22 seconds)". At first, my hopes began to soar again as I pondered the wonderful SEO opportunities: 32,700,000 does not match the 22,600,000 results reported in the suggestion tool. There seemed to be hope, but then my mind started to wonder again. The next question I asked was "What if the beta tool is using an older database than the present database used to serve Google's main site results?" The horror set in again as I sat there pondering my thoughts and the possibilities. Logical reasoning set in, and my hopes and dreams were dashed like a ship tossed onto a rocky coast during a hurricane. 22,600,000 searches a month would be an extremely odd amount for the term web site design, even given the number of webmasters and SEOs who regularly check that search term to see competition rankings. In addition, given the closeness to the regular SERP results for the same search term, it just stands to reason that the results come from an older database snapshot of the web that the lab is using for testing the beta release.

Further Investigation – My Hopes Continue to Diminish.

To investigate my thoughts as thoroughly as possible, I spent some time reading the FAQ located at http://labs.google.com/suggest/faq.html. Nowhere in this FAQ does it say anything about what the "results" indicate, other than that the ranking of suggested terms is based on the popularity of searches done at Google. That does NOT come right out and say that the number reported for "results" in any way indicates the actual number of searches for the particular search term, but yes, one could infer that meaning, especially SEOs and webmasters so desperately looking for another way to help "properly" optimize their pages for Google. I can understand this thought process, and would have fallen prey to it had I not thought about it in a little more detail, but the facts speak for themselves, and common sense wins out.

Comparing Google's Usage of the Term "Results" Throughout Their Site.

The other proof I offer to substantiate my claims stems from the context in which Google uses the word "results" everywhere else it appears on their site. They tend to use that word EXCLUSIVELY for the number of matches returned for a particular search term from their database. Add the fact that Google has never been one to openly give webmasters or SEOs ANYTHING that can be used to manipulate their SERPs, and it goes further to prove that the experts are wrong in the assumption that the "results" are an indication of the search term's search popularity at Google. The way the terms are ranked to provide the best possible suggestion IS based on search popularity at Google, but the number has nothing to do with the actual number of searches conducted. That cinched it. I have no choice but to believe that the article written by Mr. Valentine is misleading, and that many other SEOs are jumping the gun to tout the "new salvation tool" to help us demystify Google search term popularity, or help us improve our page optimization. This "tool" isn't going to give SEOs and webmasters anything but a preview of the number of SERP results they will have to wade through to find their information.

Taking the Final Plunge – Getting the Information Straight from the Horse's Mouth.

Nevertheless, since I'd rather hold on to some shred of hope that I am wrong, I have taken the liberty of questioning Google and am still awaiting their reply. Although the order of the suggestions can give some insight into search term popularity, there is still no way to know just how much of a difference exists between the first and second terms suggested. In all reality, there could be thousands more searches for the first term than the second, or there could be just a few more, but either way it really doesn't give us the useful information I'd like to see for SEO work, the way Overture's suggestion tool does. When Google responds to my question, I'll be more than happy to edit this article or post the results to the forum. Until then, hopefully I am wrong, but I just can't hold out hope that Google would make things that much easier for SEOs and webmasters looking for ways to "optimize" their pages for the Great Google Bot.

About the author: James R. Sanders is the owner of Sanders Consultation Group Plus. He has been a webmaster and web site designer since 1997, and involved in self-employment ventures since 1992. He is presently a contributing author of NewbieHangout, and has been published through WebProNews. You can email him at webmaster@sanders-consultation-group-plus.com.

Friday, March 28, 2008

How Web Design Can Affect Search Engine Rankings

Author: John Metzler

Uniquely built web sites can create unique issues when being promoted on the search engines. From a basic 3 page brochure site to a corporate site with hundreds of dynamically generated pages, every web site needs to have certain design aspects in order to achieve the full effects of an SEO campaign. Below are a few points to take into consideration when building or updating your web site.

1. Size Matters. The size of a web site can have a huge impact on search engine rankings. Search engines love content, so if you have only a few pages to your site and your competitors have dozens, it's difficult to see a top page ranking for your site. In some cases it may be difficult to present several pages of information about your business or products, so you may need to think about adding free resources for visitors. It will help in broadening the scope of your web site (which search engines like) as well as keep visitors on your site longer, possibly resulting in more sales.

2. Graphics-Based Web Sites. While web sites that offer the visitor a more aesthetically pleasing experience may seem like the best choice for someone searching for your product, they are the most difficult to optimize. Since search engine robots cannot read text within graphics or animation, what they see may be just a small amount of text. And if we learned anything from point #1, small amounts of content will not result in top rankings. If you really must offer the visitor a graphics-heavy or Flash web site, consider creating an HTML-based side of your site that is also available to visitors. This side will be much easier to promote on the search engines, and your newfound visitors will also have the option to jump over to the nicer looking part of your site.

3. Dynamic Web Pages. If most of your web site is generated by a large database (such as a large book dealer with stock that is changing by the minute) you may find that some of your pages do not get indexed by major search engines. If you look at the URL of these pages they can be extremely long and have characters such as ?, #, &, %, or = along with huge amounts of seemingly random numbers or letters. Since these pages are automatically generated by the database as needed, the search engines have a tough time keeping them up to date and relevant for search engine users.

One way to combat this problem is to offer a search engine friendly site map listing all your static pages just to let them know that you do have permanent content on your site. If search engines see links going to and from these dynamic pages within a good internal linking system, this may also lead to the pages getting indexed. The link popularity of your site may carry more weight in this case as well, so if you can't offer as much static content as your competition, make sure you have an aggressive link campaign on the go.

4. Proper Use of HTML. There is quite a bit of sub-par web design software out there. Word processors usually have a way to create HTML documents which can be easily uploaded to a site via FTP. However, in many cases the code that the search engine robots see is mostly lines and lines of font and position formatting, not relevant content. More efficiently written web sites usually achieve higher rankings. Our choice for web design software is Macromedia Dreamweaver, as it is an industry standard. It also makes using CSS (Cascading Style Sheets) a breeze, which can drastically cut down on the amount of text formatting in HTML code, as the example below illustrates. Hand-coding HTML to design sites is also a good method if you are proficient enough.
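
As a rough illustration (a minimal sketch of mine, not from the original article), here is the same paragraph marked up with inline font tags and then with a single CSS rule:

<!-- Formatting repeated inline around every block of text -->
<p><font face="Arial" size="2" color="#333333">Welcome to our widget store.</font></p>

<!-- The same formatting declared once in a style sheet -->
<style type="text/css">
p { font-family: Arial, sans-serif; font-size: 12px; color: #333; }
</style>
<p>Welcome to our widget store.</p>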

There are some no-brainers too: web sites with abnormal numbers of hyperlinks, excessive bold or italicized text, or improper use of heading, alt, or comment tags can also expect to see low rankings.

5. Choosing a Domain Name. The golden rule of web development of any kind is to keep your visitors in mind above all else - even above search engine optimization. When choosing a domain name, pick either your business name (if you have a high-profile business name such as Chapters or Coca-Cola) or a brief description of your products. Domain names can always help with search engine optimization, as the domain is another place where important keywords can appear. Forget about long-winded domains such as www.number-one-best-books-on-earth.com: no one will ever remember it, and it will be hard to print on business cards or in ads.

If you need to change your domain name for any reason you obviously don't want to lose existing rankings. An easy way to do this, and one that is currently supported by most search engines, is the 301 redirect. It allows you to keep your existing rankings for your old domain name, while forwarding visitors to your new web site instantly.
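
On an Apache server, for instance, a site-wide 301 redirect can be set up with a single line in the old domain's .htaccess file (a minimal sketch; the domain name is a placeholder):

# Permanently redirect every request on the old domain to the new one
Redirect 301 / http://www.your-new-domain.com/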

6. Using Frames. Don't use frames. Frames are a thing of the 90's (and in the Internet world, that is eons ago) and are not even supported by some search engines. The search engines that are able to index your site through frames will most likely frown upon them. Whatever you are trying to accomplish with frames can usually be done with the help of PHP includes or CSS (Cascading Style Sheets). Some browsers are not frames-compatible, so there is the danger of some visitors not being able to see your site at all. And bookmarking individual pages within a frame becomes difficult without lengthy scripts being written.

7. Update Your Information. Not only does information published two or three years ago reflect badly on your organization when a visitor reads it, it is also looked down upon by search engines. Sites that are continuously updated and growing usually achieve higher rankings than stagnant sites. When the trick to SEO is offering visitors the most relevant information, you can bet that the age of web pages is taken into consideration by search engines. Consider creating a section of your site devoted to news within your organization, or have a constantly updated resources area.

Many shortfalls of web sites can easily be attributed to designers who just don't keep the user or search engines in mind. Search engine algorithms are quickly improving to try and list the most user-friendly sites higher, given that the content and link popularity are there to back it up. So first and foremost, know your target market and make your web site work for them before focusing on search engine optimization. If you build it (properly), they will come.

About the author: Copyright John Metzler of Abalone Designs, November 2004. This article may be freely distributed if credit is given to the author. Abalone Designs is a family-run Search Engine Optimization firm in Vancouver, BC, Canada. Visit www.abalone.ca for a free personalized analysis of your web site.

Thursday, March 27, 2008

The Resubmission Myth

Author: Bobby Heard

So the thinking goes: the more times you resubmit your site to the search engines, the better chance you have of getting to the top for your keywords. I can't tell you how much it aggravates me when I hear about another search engine optimization company selling resubmission packages to struggling small-business owners who just want to find one "professional" in this industry who isn't charging them ridiculous prices for NO results. Unfortunately, not only do their clients not understand that they are totally wasting their money and not improving their site's rankings at all, they are actually paying a company to potentially crush their rankings.

The search engines are a touchy group. As you can imagine, they get hassled constantly, as they hold the key to most businesses' success. They have systems in place to handle the incredible amount of information that they must organize and database. The submission process is simply one of those systems, used to inform them of your web site's presence if they were previously unaware of its existence. That's all it is. Once they are aware of the URL of your web site, they will continue to visit it, update their copy of it, and most importantly, keep it in their database. So what's the harm in resubmission, you ask? Well, it's like sitting in class when your teacher is taking attendance. She calls out your name and you say that you are present. Then, EVERY 5 minutes for the rest of the day, you put up your hand and remind her that you are still present. This may sound ridiculously stupid, but in essence, that is what you are doing to the search engines when you resubmit to them, and search engines can sometimes react by throwing your listing out of their database, just like your teacher is likely to react by throwing you out of her class.

By continually submitting your site, you are creating more work for a major company that is trying (as every company surely is) to create a cost-efficient operation. Your continual submission means that they have more information to process and thus, more money to spend employing someone to deal with that information. The search engines, just like all of us and any successful business, hate wasting money where they don't need to. So if you continually resubmit your site, they get progressively irritated, and will sometimes take drastic measures such as banning your site from their database.

These search engine optimization companies that tell their consumers that resubmissions work also present other ridiculous claims:

- "We will submit you to 1000's of search engines" - This list always ends up including Google, Google Canada, Google Germany and Google for every country they have a site for as well as Yahoo, Yahoo Canada, Yahoo Australia, etc. The problem here is that every Google site uses the same Google database and every Yahoo sites uses the same Yahoo database. Don't believe me? Go to google.com and look at the very bottom of the page beside their copyright. It says searching 8 billion + pages. Now go to google.de, google.ca, google.com.au, google.co.uk and look in the same place, do you notice something? It's the same number! It's the same database!

- "Search Engines only index the pages that you submit to them" - What they're saying here is that any other pages of your site won't be indexed unless you submit each of them individually. Wrong again! Don't believe me? Go to yahoo.com and type in "www.abalone.ca" We only submitted our index page when we first started our business. Just one page (index.html), once. Look how many pages they have put in their database.

About the author: Bobby Heard is the VP at Abalone Designs and is an active writer of SEO articles. More articles he has written are available at www.abalone.ca/resources/

Tuesday, March 25, 2008

Improving the ROI of Web Directory Submissions

Author: Will Spencer

Submitting your web site to web directories is a great way to build high-quality inbound links, increase your PageRank, and improve your SERPs.

The Internet now features a large number of both general purpose and specialized directories. The specialized directories focus on a specific topic or region.

Unfortunately, submitting your web site to the plethora of available directories requires a significant investment of time and effort. The key to achieving a positive ROI (Return On Investment) from directory submissions is to focus on those directories which will provide you with high-quality inbound links.

It is important to note that many web directories have issues which prevent them from providing useful links. Taking the time to understand these issues and submit only to the directories which will provide value will yield significant time savings in promoting your site.

A Taxonomy of Web Directory Issues

Web Directories Which Are No Longer Managed

A survey of 79 general-purpose web directories revealed that 62% of those web directories do not appear to be processing submissions.

In many cases, this is because the operators of the web directory launched their site with great enthusiasm, became overwhelmed by the amount of effort required to maintain a web directory, and stopped maintaining their directory. These operators leave their directories online to take advantage of the residual advertising revenue, but no longer invest the time required to process directory submissions.

Web Directories which Block Search Engine Robots

Some SEOs believe that they can preserve Google PageRank by not giving outbound links from their web sites.

Web directories which operate in this manner will sometimes block Googlebot and other search engine robots from their directory pages using robots.txt or meta tags.

Before submitting to a web directory, check the PageRank of the directory page upon which you expect your listing to appear. If that page has a PageRank of zero, it could mean that the site operator is blocking Googlebot.

However, this same symptom can also mean that the site or the page is new and has not been assigned Google PageRank yet. To research this further, examine the site's robots.txt file and the source code of the directory page upon which you expect your listing to appear.
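
For reference, a directory that shuts search engine robots out entirely would carry a robots.txt along these lines (a simple sketch, not taken from any particular directory):

# Tells Googlebot and all other robots to stay out of the whole site
User-agent: *
Disallow: /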

Web Directories which use Scripted Links

Many web directories use scripted links to avoid leaking PageRank. Scripted links use PHP or JavaScript to redirect the user to the target page. Although these links work correctly for most site visitors, they do not work for most search engine robots.

To determine if scripted links are in use, place your mouse cursor over an outbound link on the directory page upon which you expect to appear. Examine the link text which appears in the lower left corner of your browser.

A JavaScript link will look something like this:

javascript:var handle=window.open('http://www.internet-search-engines-faq.com')

A proper HTML link will look something like this:

http://www.internet-search-engines-faq.com

Free Directory Scammers

Several web directories pretend to accept free submissions, but in reality these free submissions are never processed.

These directories claim to accept free submissions as part of a bait-and-switch tactic to upsell webmasters into paid submissions.

Often these web directories will claim to provide free service for web sites which meet certain defined criteria. It just happens that no web site ever manages to meet those criteria!

Web Directories which Require Reciprocal Links

Some web directories require you to link back to them in order to be listed. This helps build the PageRank of the directory, which benefits all web sites in the directory.

Some people view these sites as link-farms.

If your web site is unable to provide a link back to these directories, do not invest your time submitting your web site to these directories.

Reciprocal linking is not an appropriate strategy for all web sites.

Paid Directories

Some web directories charge for submissions. These web directories charge either a subscription fee, a one-time fee, or a Cost-Per-Click (CPC) fee.

Some of these web directories charge exorbitant fees -- and a few of these web directories charge this fee whether your site is accepted or not!

Although there are exceptions, the general rule is that paid directories do not provide a positive ROI.

Better Directories

A few web directories stand out by offering good user interfaces, speedy inclusion, and plain HTML links which pass PageRank:

All The Web Sites
Ezilon
illumirate
Jayde
JoeAnt
Massive Links
Mavica Search City
The Super Ultra Mega Web Directory
Tygo
World Site Index
Wow Directory

Summary

Submitting your web sites to web directories is an excellent method of building quality inbound links.

Submitting your web site to only the best web directories can net you 90% of the results with 20% of the effort.

About the author: Will Spencer is the webmaster of The Internet Search Engines FAQ.

Monday, March 24, 2008

SEO Promises from GMR

Author: Pradeep

What are the GMR Promises for developing a Search Engine Optimized website?

* We are not going to have any throwaway web pages; we shall link all the web pages in a logical sequence.
* We shall not link your web page with any harmful websites or websites which are considered spam, but we shall place links to your web page on certain businesses' sites which we feel will add value to your business.
* We shall not have any hidden text/links written anywhere in the web page, but we do say that we will embed the text and keywords so that your web page is recognized by search engines quickly.
* We use our own algorithms to find the best keyword match once you give us the phrases with which you want to appear in the search engines. We are not using any ready-made tools for this; the tools and techniques we use vary with the latest news items posted by Yahoo, Google and MSN on search engine optimization.
* We do not promise you the first rank in any search engine, but we do promise that, by applying some home-made SEO algorithms, we increase your visibility in the search engines to a very great extent.
* We will give you a flow map of your website that shows all the web pages and their navigation within the website.
* We promise that any confidential matter you share with us will not be revealed to any third party apart from the members of the GMR web team.
* We do a lot of research in order to show you on top; we shall require your cooperation and ask that you keep up continuous maintenance with the GMR web team so that the latest trends in SEO can be implemented. We also want to say at this point that SEO frameworks change time and again. Big giants like Yahoo, Google and MSN continuously work on their algorithms to find the best websites, so there will be no constant top position in any search engine if you stop maintenance.
* Search engine optimization is a science of logic, and nobody can claim that their algorithm is right or that another's algorithm is wrong. It depends only on the perceiving capabilities of the search engine.

About the author: I am a postgraduate in Enterprise Management from the University of Dundee, U.K. I love to understand the new trends happening in the ever-changing web world. GMRwebteam is the web company with which I am associated in implementing my new ideas.

Sunday, March 23, 2008

The SEO Journey:

Author: Christine Stander

Working in the fast-paced search engine marketing industry, I have often been faced with the question: "Where exactly does SEO fit into web site development?"

The value of SEO cannot be overstated. Increasingly, webmasters and site owners alike are beginning to realize the importance of optimizing a site to increase rankings on search engine results pages (SERPs). However, they are not always sure how to go about it.

The Growing Importance of SEO:

A few years ago, web sites were created with the goal of bringing companies closer to their clients, i.e. their target market. As the popularity of search engines grew, the number of web sites being created increased alongside it. During this process, webmasters and site owners began to realize how important it was to get that all-important competitive edge by reaching the ideal top 20 positions on SERPs. Thus the battle for the top 10 positions ensued.

With this was born the need for people to find and understand a method of constantly reaching those positions. In the case of SEO, the techniques which evolved can be divided into "good guys" and "bad guys" (the proverbial "white hat" SEO and "black hat SEO", respectively.) Of course, some SEOs evolved into more shades of black than others, finding techniques to "outsmart" the spiders, while others remained true to developing content-relevant sites aimed at users rather than the spiders, while still attaining those sought-after top positions. (But that is a topic for another day.)

Although the black-hats may differ from my opinion here, SEO is no quick fix to propel a site into high-ranking top positions. SEO and usability go hand in hand. Target search engines and searchers at the same time by writing good, relevant, converting content. Map your site well with internal links to related sections. In so doing, you will provide your browsers with easy-to-follow navigation and simultaneously help spiders to index your site. Achieving a well-balanced site is a tedious journey, but when implemented correctly, may produce long-term stable results that will drive your site to top positions and increase your conversions.

Know your territory.

If you want to experience something fully, you have to be prepared for it. You have to research what you need, from the most basic needs through to the finest details. I have been privileged enough to experience the wonders of an African safari. Before the journey, one knows to pack comfortable bush clothes. One also knows that you're not going to need high heels or suits in the bush. It's the basics that count: jeans, sneakers and t-shirts.

When you go out on a game drive, what keeps you at ease amongst lion, leopard, elephant and buffalo is the fact that firstly, you know you're in a Land Rover, driven by an experienced safari guide, and secondly, that if it comes to it, there's a trusty rifle in the back seat.

The lessons illustrated by this analogy are: research your territory, be prepared, dispense with what is unnecessary and remember that the basics count most of all.

To take the analogy further, launching a web site is not unlike going on safari. Before sending it on its journey to the World Wide Web, you need to pack it full of the things you know it's going to need. Start with the basics.

• Know the territory.
  o What is the industry which you are entering?
• Know the climate.
  o How competitive are your competitors?
  o Does it call for additional optimization methods such as PPC?
• Know the predators.
  o Research your competition. Who's going to be out there with you?
  o Are they well prepared? Scrutinize the optimization fluctuation of their web sites.
• Choose a suitcase.
  o Ensure the SEO strategy is developed around the technology you choose to develop the site. Flash, Javascript and dynamic sites' strategies may vary from conventional "straightforward" HTML web sites.
• Pack accordingly.
  o The site's clothes are the copy and design elements. Ensure both are enticing, convincing and relevant, and that they flow with the general theme of the site.
  o Ensure your key-phrase fluctuation is relevant to the page content and does not detract from the natural flow of the copy.
• Be vigilant.
  o Regularly check your site's key-phrase ranking and follow up on your competitors to see if they have made any changes to their strategy.
• Adapt.
  o In an ever-changing, fast-paced environment, it is crucial that SEOs are able to adapt their style to the most relevant optimization requirement for the intended market.

SEO is a crucial element throughout the life cycle of a web site. Its roots are firmly planted from the site conception to the launch, and tenderly keep it stable throughout its life.

The only question remaining is: "If you choose to call in the experts, should you do this before or after the fact?"

Choose your guide:

Search Engine Optimization (SEO) consultants can be called upon at any time during the production of a site. It is preferable, however, to include them in the process as early as possible to ensure that your site is as prepared as possible for the journey ahead.

When you go out on a game drive, you have one of two options – either you drive yourself, or you choose to drive with a guide. You may end up driving along the same path, but the guide knows the territory best. They know where to look for the footprints and may often find the leopard hidden in the tree that you may not have found.

SEO is much the same. You may choose to self-optimize your site or to consult a professional. You may achieve a similar result, but the expert will know which potholes to look out for.

The journey ahead:

Whether you choose to self-optimize your web site or call on an expert, the sooner you begin planning your SEO strategy, the better. Whichever option you choose, send your site to the World Wide Web prepared. By keeping the basics in mind and commencing your SEO strategy early, you'll ensure that the end result is a web site that is easy to find by clients and search engine spiders alike. It will not only speak to your target market, but be loved by search engine spiders. Enjoy your journey. Preparation will only improve the ride.

About the author: Christine Stander is a professional search engine optimisation and online marketing strategist with experience in many facets of search marketing, user behaviour analysis and brand management. For more information please refer to: http://www.altersage.com

Saturday, March 22, 2008

The Meta Tag Myth

Author: Bobby Heard

The more the better, right? Wrong. At least when it comes to meta tags. The meta tag started off as a nice tool that web sites could use to show the search engines what the site was about without the words showing up on the actual page. It seemed like a great idea until people started to abuse it. They would add highly searched-for keywords that were unrelated to their site in their meta tags in hopes of attracting additional traffic. The search engines caught on and lowered the importance of meta tags - they figured out that if they put more emphasis on the visible content of a site, people would have a much more difficult time "cheating". Turns out that they were right.

Now, don't get me wrong, meta tags still do carry some significance. They need to be consistent with the content of your site, but most importantly they're somewhat of a measure of the legitimacy of your site. The most common myth when it comes to search engine optimization is that the best meta tag is the one packed with the most information. This couldn't be further from the truth.

The keyword meta tag has been abused more than any other meta tag, and for this reason it does not carry as much importance as most of the others. Most search engines only read the first few characters of the tag, if they read it at all, because they know that most keyword meta tags are filled with spam - just the same words repeated over and over. That is why it's important to put your most important keywords at the front of your keyword meta tag.

The meta tag that still carries the most importance is the description meta tag. This is because it serves as a description for the particular page of your site that it is included in. Description meta tags should be unique to each page of your site, as search engines frequently use it as the description under your page title that appears in the search results. Obviously, you want your description to be representative of the page being displayed.

So in conclusion, don't use meta tags the way we all have a compulsive urge to - by packing them with everything we can think of. It seems like a good idea, but it will only keep you from your goal, the holy grail: higher rankings.

About the author: Bobby Heard is the VP at Abalone Designs and is an active writer of SEO articles. More articles he has written are available at www.abalone.ca/resources/

Friday, March 21, 2008

All About Google

Author: Courtney Heard

If you read The Search Engine Showdown at www.abalone.ca/resources/searchenginetest.html, you know Google is my favourite search engine. Why? Google always offers the most results for any given search (they currently have over 8 billion pages indexed), it's faster than the Audi Quattro we test drove this morning, and 9 times out of ten, in my experience, all the front page results are relevant to my search. In fact, I usually find what I'm looking for within the first few sites listed. I also really respect the fact that two college kids started it (kinda like Abalone!) and that those two college kids seem not to have forgotten where they came from. If you check out the images at Google's press center (http://www.google.com/press/images.html) and scroll down to the Everyday Life Inside Google section, you'll get a feeling that life at Google is fun.

Google is the most used search engine on the web. In May, 2004, 36.8% of all searches on the web were done using Google. Also during that month, Google powered 54% of all searches done on the web.

Google owes its success to its mind-boggling algorithm. This intricate formula sucks in a web page; considers its keyword density, its link popularity, its domain name, how often it is updated, the amount of content on the site, and a myriad of other things that few know; and spits out a number called PageRank. There is absolutely no way to be sure how to get your site to number one on Google, but there are a few things that we know can help:

1. Make sure your site is well organized, visitor friendly and useful. Google seems to like sites that are listed in the Open Directory Project, and my theory is that it's because the Open Directory Project is human edited. This means real human eyes have looked at each site that is included in the directory and deemed it useful in some way or another. If your site has a link on The Open Directory Project, you're on the right track.

2. Avoid ""spamming"". Spamming refers to many different things. If you add keywords to your site that are out of context or hidden from plain view, it's considered spamming. Resubmitting your site to Google can be looked upon as spamming. The basic principal is to make sure every page on your site is professional looking, clean, organized and has its own unique information to offer.

3. Try to trade links with good quality sites that you like, because if you like them, chances are they have something to offer and Google will recognize that. The more good quality sites around the web that have links pointing to your site, the higher your link popularity will be.

4. Stay away from hi-tech sites unless you offer an alternative, i.e. if your site has been designed in Flash, try to offer a plain HTML version of the site. Google can index Flash, but it's not likely that it will be indexed well, and your ranking will suffer. Frames are also a no-no. Although Google can index framed sites as well, again, the ranking can suffer and, more importantly, frames are universally recognized in our industry as hideous!

5. Keep the content on each page to a decent level. You don't want too much content, but you definitely do not want too little. A good way to judge a good content level is to search for the #1 ranking site for the keywords you wish to target and see how much content it has. Make your content keyword-rich, but don't make it so full of keywords that it sounds ridiculous to visitors. You want to keep the visitors you get from Google, right?

Google almost always offers you the best resource for your query due to the fact that all of these things matter to them. They are also always trying to find ways to improve on the Google Algorithm so they can continue to offer us the best service. Every once in a while we hear about this new search engine and that new search engine, but no one seems to have been able to catch up. As long as this remains true, these simple tips will be applicable.

About the author: Courtney Heard is the founder of Abalone Designs, an Internet Marketing and SEO company in Vancouver, Canada. She has been involved in web development and marketing since 1995 and has helped start several businesses since then in the Vancouver area. More of Courtney's articles are available at www.abalone.ca/resources/

Thursday, March 20, 2008

Lead Them On - Getting the Spiders to Notice Your Site

Author: John Calder

© 2004, John Calder http://www.TheEzine.net

You've found your market, developed a marketing and business strategy, identified or created your products, and finally finished your site. Everything else is in place, and you're ready to get listed in the search engines. In order to get indexed, you will need to let the search engine spiders know about your site in the first place.

Are there ways to lure the spiders to your site without manual submission? Thankfully yes, and in fact, many professional SEOs recommend that you follow these methods rather than submit your pages directly.

The very best way, if possible, is to get a link to your site from another site that is already indexed, spidered frequently, and is related to your site. This may include mentions in blogs, news releases, and so on. This technique can get you spidered very quickly, sometimes within a few days or less!

You can also stick to tried and true methods such as posting on forums and writing articles for any of the various article directories. If you do this, don't spam. Follow their rules, offer helpful answers, and don't overdo your "sig file". Private membership forums may not be indexed, so be sure that the forums where you post are listed in search results, and that relatively recent posts are displayed.

Whatever method you use to get the spiders to visit, you'll need to provide them with a good site map. Whether created manually or with software, a site map is a listing of and a link to all the pages on your site. If you have more than about 100 pages, consider having a multi-page site map, and make sure each site map page links to every other site map page. Put a link to your site map on your home page at the very least, and preferably on every page. Try to make your site map pages as simple and clutter-free as possible.
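
A site map page does not need to be anything fancy; a plain list of text links is enough. A minimal sketch (the page names here are placeholders):

<h1>Site Map</h1>
<ul>
<li><a href="index.html">Home</a></li>
<li><a href="articles.html">Articles</a></li>
<li><a href="products.html">Products</a></li>
<li><a href="contact.html">Contact Us</a></li>
</ul>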

About the author: John Calder is the owner/editor of The Ezine Dot Net. Subscribe Today and get real information YOU can use to help build your online business today! http://www.TheEzine.Net

The Ezine DOT Net RSS feeds are available.

(You may reprint this article in full in your newsletter and/or web site)

Wednesday, March 19, 2008

7 Powerful Ways to Get One-Way Inbound Links

Author: Trent Brownrigg

Copyright© Trent Brownrigg www.work-at-home-jobs-iowa.com

I grant permission to publish this article, electronically or in print, as long as the bylines are included, with a live link, and the article is not changed in any way (grammatical corrections accepted).


It's no secret that having links pointing to your website is very good for search engine rankings, and gives more ways for human visitors to find your website. So, how do you get one-way inbound links to your website? Well, there are numerous ways, and I am going to focus on seven of the most effective techniques.

You probably already know about link exchanges as a great way to get links to your website. However, with link exchanges, you have to give a reciprocal link back to the website that is linking to you. In this article I am going to show you how to get links to your website without giving a link back.

Ok, here are seven powerful ways for getting one-way inbound links:

1) Content – Packing your website full of useful content is one of the best ways to get inbound links. When you have a lot of content other webmasters will naturally link to you because your site will be valuable to them and their visitors.

Some examples of good content to add to your website would be: free articles, tools, tips, and resources related to your website theme. You could also have a blog that you update regularly with fresh content.

2) Testimonials – A big trend on the internet right now is adding testimonials to websites. Site owners typically get the testimonials from their site visitors and customers. In exchange for the testimonial, they usually give a link back to your website.

Every time you come across a great website, or have a good experience with a purchase, send a testimonial to the website owner. If they decide to publish your testimonial you will get a link to your website. After a while these links will add up to a significant amount.

3) Online Directories – There are hundreds of online directories that accept website submissions. This is a very good technique for getting one way inbound links. Many of the top directories require you to pay for inclusion or provide a link back to their directory. You can give the link back if you want, or pay if you have the extra money. However, there are also numerous directories that will accept your website without payment or a link back.

All you have to do is find the ones that accept free submissions and you will get your one way inbound links. You can find quite a few directories, and some other great information about submitting to directories at http://www.directory-pages.com/

4) Articles – Writing articles is one of the best techniques for getting one-way inbound links to your website. Many webmasters are looking for more content for their websites and Ezines. You can provide them with that content by writing articles and submitting them to websites, Ezines, and article directories.

In return for your effort you can put a little blurb about yourself and/or your business, with a link to your website, at the end of the article. You will rack up a very high amount of inbound links to your website by utilizing this technique. I have done it many times with a lot of success.

5) Free E-book – You can write an e-book and offer it for free in exchange for a link to your website. You can also put links to your website and other affiliate links in the e-book. Putting the links in the e-book will increase your website traffic and affiliate profits.

You are probably thinking that you can't write an e-book. Well, yes you can! It doesn't have to be some killer piece of work that you spend months writing. Instead, you can just make it a collection of your favorite articles, resources, and tools related to your website theme. As long as it is useful to your visitors they will gladly link to you in order to get it.

6) Blogs – There are actually a few ways to get one-way inbound links with blogs. The first way is simply to host a blog on your website and update it frequently. People tend to link to blogs because they provide content that is constantly being updated.

The second way you can get one-way inbound links with your blog is to submit it to blog directories. Just like website directories, there are tons of blog directories on the internet that you can submit to for free. Just do a search on Google for "blog directory" and you will find many of them to submit to.

The third way of getting inbound links with a blog is to host it somewhere other than on your website. Then, provide a link from your blog to your website. After you have done that, you can submit your blog to directories for the extra page rank and added exposure. Use this third technique with as many blogs as you have time to update regularly with quality content.

7) Free Article Directory – Create an article directory on your website and allow your visitors to submit articles to it. Ask for a link back to your website in exchange for the article submission. This will build your inbound links and create free content for your website. I have successfully used this technique for quite a while. You can see one of my article directories at http://www.work-at-home-jobs-iowa.com/article

Alright, there you go, you now have seven very effective techniques for getting one-way inbound links to your website. Get started on them right away and you should see your search engine rankings rise, your website traffic increase, and your profits explode. If you have any questions about any of these linking techniques just let me know, I will be happy to help. You can find my "contact me" link at the bottom of my website.

About the author: Trent Brownrigg is a successful internet marketer and home business mentor. Visit his website at http://www.work-at-home-jobs-iowa.com and he will personally help you succeed with your own home business. You can subscribe to his "Biz Tips Newsletter" by sending a blank email to: workathomebiz@aweber.com

Tuesday, March 18, 2008

Are You Getting The Most From Your Meta Tags?

Author: Francisco Aloy

As any Web Business startup knows, creating a Website is a bunch of work! You have to bother with content, layout, graphics and HTML links, just to name a few. What about your Meta Tags?

Meta Tags are words that are placed in your web page to provide a title, description and keywords to the Search Engine spiders or crawlers. They are not visible to visitors. To see a sample, open any website and click "View" on the menu bar, then choose "Source." You'll see the Meta Tags up top, within the Head section of the web page.
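
As a rough sketch of what you would see there (the wording below is only an example of mine, not from any real site), the tags for a page about red and blue widgets sit in the head section like this:

<head>
<title>Red and Blue Widgets</title>
<meta name="description" content="A guide to choosing red and blue widgets.">
<meta name="keywords" content="red widgets, blue widgets">
</head>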

Most Search Engines will rank your site on how well your Meta Tags describe the content of your web pages. Google is the exception; it also takes into account the number and quality of backward links to your website. Lately, there have been differing opinions on the relevancy of Google's PageRank system. Some experts don't consider it accurate or necessary.

Yet and still, there are many other Search Engines, so a further study of Meta Tags is in order. We are going to base our Meta Tags revision on a tool known as a word Frequency Counter. It counts and separates the words in a web page or document, giving you an idea of which words are used most.

Here are some Web based Frequency Counters:

http://www.mytranslate.com/wordfrequency.htm

http://rainbow.arch.scriptmania.com/tools/word_counter.html

http://www.writewords.org.uk/word_count.asp

http://www.writewords.org.uk/phrase_count.asp

http://web4future.com/free/wordcount.htm

http://www.keywordcount.com/

Each one of the Frequency Counters above has different uses and qualifications. The Writewords counter has two flavors: word and phrase count. The Web4Future counter is Web based or a standalone download.

How to use them:

Copy and paste the content of the web page in question and see what the top word frequencies are. You can eliminate most articles and modifiers and concentrate on pure words. As an example: if your web page is about red and blue widgets, the words "red," "blue," and "widgets" should have the highest frequency.
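
If you prefer not to copy and paste at all, the same quick check can be run in the browser on the page itself. The snippet below is a bare-bones sketch (it is not one of the counters listed above): pasted into the browser's script console, it tallies the visible words on the current page and prints the twenty most frequent ones, skipping very short words such as articles and modifiers.

var text = document.body.innerText.toLowerCase();
var counts = {};
text.split(/[^a-z0-9']+/).forEach(function (word) {
  if (word.length > 3) {                       // skip "a", "the", "and", etc.
    counts[word] = (counts[word] || 0) + 1;
  }
});
Object.keys(counts)
  .sort(function (a, b) { return counts[b] - counts[a]; })
  .slice(0, 20)
  .forEach(function (word) { console.log(word + ': ' + counts[word]); });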

A Word of Caution About Writing Content:

Use the Frequency Counters AFTER you write the web page content. Don't allow word frequency to get in the way of your natural writing style. Do NOT attempt to hit a certain frequency percentage; you'll be penalized by many Search Engines for too much keyword usage.

In Closing

Though the word Frequency Counters are good tools for a quick check, nothing compares to laser-focused writing for good Meta Tags information. Let your writing flow naturally and stay concentrated on your subject keywords. If you do that, you'll find it easy to weave your keywords into your content and Meta Tags.

by Francisco Aloy

(C)2004 Francisco Aloy

About the author: Francisco Aloy is the Author of "Yes, I Want to Start My Internet Business Without Being SCAMMED!" For more power packed info about website building AND a Business Opportunity, visit our Kiosk.ws section, here: http://www.newbie-business-guide.com/website_building.html

Monday, March 17, 2008

The Search Engine Showdown

Author: Courtney Heard

If you're anything like me, you have a favourite search engine and you're loyal to it. You never use any others (which made this research difficult for yours truly), you insist yours is the be-all and end-all, and you even go so far as to deny any shortcomings it may have. But is your search engine truly the best? Inspired by a great article at BBC News (BBC News: Search Wars), we decided to compare the major search engines. Here's what we found.

We chose the search term "real estate fiji" because it's a competitive industry and geographically specific. We also searched with the same phrase misspelled, to see if the search engine would suggest the correct spelling.

Google

• Initially loading the search page for Google is lightning fast.
• The look is clean and easy to understand.
• Search time was 0.15 seconds.
• The search yielded 1,190,000 results.
• All results on the first page were relevant.
• Spellcheck was available.

Yahoo!

• Initially loading the search page for Yahoo! is a little bit slower than Google, but still fast.
• Search time was 0.18 seconds.
• The search yielded 711,000 results.
• It is difficult to tell the sponsored links from the actual web results.
• All results on the first page were relevant, however one of them directed you to another set of results for your search at DMOZ.org.
• Spellcheck was available.

AskJeeves

• Initially loading the search page for AskJeeves is fast.
• Search time was not posted and was much slower than Google & Yahoo!.
• The search yielded 63,100 results.
• Sponsored links take up the whole screen. You have to scroll down to see the web results.
• All first page results were relevant.
• Spellcheck was available.

A9.com

• Initially loading the search page for A9 is fast.
• Search time was not posted but was average.
• The search yielded 209,000 results.
• All the results on the first page were relevant.
• There were image results alongside the text results. This could be helpful.
• Spellcheck was available.

MSN

• Initially loading the search page for MSN is fast.
• Search time was not posted but was average.
• The search yielded 112,607 results.
• All except one of the results were relevant; that result pointed to Philippines real estate. Also, one of the results directed you to DMOZ, where a second search for your keywords is performed.
• Spellcheck was available.

Alexa

• Initially loading the search page for Alexa was fast.
• Search time was not posted but was somewhat slow.
• The search yielded 208,000 results.
• It was difficult to tell the sponsored results from the web results.
• Some results included screen shots.
• The look was kind of disorganized.
• Spellcheck was available.

AltaVista

• Initially loading the search page for AltaVista is fast.
• Search time was not posted but was a sliver slower than Google.
• The search yielded 736,000 results.
• The sponsored results take up almost the entire screen. You have to scroll to get to the good stuff.
• The results are all relevant, though one redirects you to DMOZ, where a second search for your keywords is performed.
• Spellcheck was available.

Lycos

• Initially loading the search page for Lycos is fast.
• Search time was not posted but was rather slow.
• The search yielded 114,356 results.
• The sponsored results take up almost the entire screen. Once again, you have to scroll to get to the good stuff.
• All the results are relevant, although 2 of them redirect you to DMOZ.org.
• Spellcheck was available.

Excite

• Initially loading the search page for Excite is slow.
• Search time was not posted but was rather slow.
• The search yielded 114,356 results.
• All the results on the first page were relevant.
• The look of the site was clean.
• Spellcheck was available.

HotBot

• Initially loading the search page for HotBot is fast.
• Search time was not posted but was fast.
• The search yielded 114,389 results.
• The sponsored results take up almost the entire screen. Once again, you have to scroll to get to the good stuff.
• All the results are relevant, although 1 of them redirects you to DMOZ.org.
• Spellcheck was available.

AllTheWeb

• Initially loading the search page for AllTheWeb is fast.
• Search time was not posted, but was fast.
• The search yielded 679,000 results.
• It is difficult to tell the sponsored results from the web results.
• An offensive content filter was available.
• All the results are relevant, although 1 of them redirects you to DMOZ.org.
• Spellcheck was not available.

Looksmart

• Initially loading the search page for Looksmart is fast.
• Search time was not posted but was rather slow.
• The search yielded 300 results.
• There were 3 completely irrelevant results on the first page.
• The look of the site was clean.
• Spellcheck was not available.

Jayde

• Initially loading the search page for Jayde is somewhat slow.
• Search time was not posted but was average.
• The search yielded 60,424 results.
• There were quite a few irrelevant results.
• The look of the site was clean.
• Spellcheck was not available.

So, what's the conclusion? My favorite search engine is the best. All hail Google! ... Alright, alright, some of the others are pretty cool, too.

About the author: Courtney Heard is the founder of Abalone Designs, an Internet Marketing and SEO company in Vancouver, Canada. She has been involved in web development and marketing since 1995 and has helped start several businesses since then in the Vancouver area. More of Courtney's articles are available at www.abalone.ca/resources/

Sunday, March 16, 2008

Organic Search Engine Optimization

Author: Jeff Palmer

Organic Search Engine Optimization: What it is and why it's so important.

By Jeff Palmer

Search engine optimization can be broken down into two separate yet intertwined categories: non-organic, or paid, search optimization and organic, or unpaid, search optimization. Paid search advertising relies on purchased search phrases to drive visitors to a website, while organic search optimization focuses on developing web sites that are naturally search engine friendly and appear in the unpaid or "organic" search engine results pages (SERPs).

Successful organic optimization combines technical know-how with persuasive marketing.

Organically optimized web sites contain content that visitors find informative and relevant to their searches. Content is further optimized for search engines by incorporating relevant key phrases or words into the site's literature.

Organic optimization is holistic in approach. Every aspect of a web site is analysed for its level of search friendliness. Aspects like the site's title, meta tags, editorial copy, structure and design, usability and function are all taken into consideration when optimizing a web site. There are, however, four main points of interest:

1. Key Phrases: One of the first steps in organic optimization is determining which key phrases are to be targeted. This is determined by researching which words or phrases a target audience is most likely to search for. These target keywords are then incorporated into the title, description and content of a web site.

It is important to note that the overuse of keywords in a web site can lead a search engine's indexing software to consider the site to be abusing or spamming the search engine, which can result in that site being removed from the search index.

Generally, a keyword should appear five to eight times within a page's editorial content, and that content should consist of between 200 and 400 words, which works out to a keyword density of roughly one to four percent.
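As a rough illustration (the page topic and every name below are hypothetical), a page targeting the phrase "antique oak furniture" might carry that phrase once each in its title, description and keywords tags, and then a handful of times in the visible copy:

<!-- Hypothetical sketch: the target phrase placed in the title and meta tags -->
<head>
  <title>Antique Oak Furniture: Restored Tables and Chairs</title>
  <meta name="description"
        content="Hand-restored antique oak furniture, including tables, chairs and dressers.">
  <meta name="keywords" content="antique oak furniture, oak tables, oak chairs">
</head>

The same phrase would then appear naturally a few more times in the page's 200 to 400 words of body text, never forced.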

2. Site Structure: Search engines are somewhat limited in the way they can index a web site. Search engine robots, or spiders, are chiefly concerned with determining two things: what is this web site about, and where should it show up in the search engine results? Often the way in which a web site is constructed undermines how effective these search spiders can be.

It is important to understand how search engine indexing works in order to create web sites that are fine-tuned for optimal search results.

Search spiders look primarily for text content when judging how a site is to be indexed. Sites built entirely of graphic elements or Flash are not search friendly, and neither are sites that overuse JavaScript and other dynamic content (a short sketch of the difference follows at the end of this section).

A multitude of factors are considered when optimizing for the search engines. Since different search engines follow different rules, and the rules often change, it is unlikely that every aspect of a web site will be perfectly matched to every search engine. The important thing to strive for is eliminating the elements of a site that are known to cause problems, and emphasizing as many search friendly aspects as possible.
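As a small illustration of the text-content point above (the file names and copy are made up), a page that pairs its graphics with descriptive alt text and real HTML copy gives a spider something to read, while a page built from one large image or a Flash movie gives it almost nothing:

<!-- Hypothetical sketch: indexable text alongside the graphics -->
<h1>Handmade Leather Wallets</h1>
<img src="wallet-photo.jpg" alt="Brown handmade leather wallet, front view">
<p>Each wallet is cut and stitched by hand from full-grain leather and ships within two days.</p>

<!-- By contrast, a page consisting only of <img src="entire-homepage.jpg"> or an
     embedded Flash movie leaves the spider with no text by which to judge the page. -->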

3. Usability: A site which is difficult to navigate, slow to load, or leaves a user wondering exactly what the site is all about is a site that is not going to perform very well. The flow of information within a web site must be logical and intuitive. Optimizing a site for performance is critical to its overall level of success. After all, what is the point of having a web site that ranks well in search engines if nobody can use it?

4. Inbound Links: The number of inbound links to a web site has a direct effect on the site's search engine ranking. Virtually all of the popular search engines have methods of calculating the link popularity of any given site, which makes inbound links an important area of organic optimization. The quality of the inbound links can matter more than the quantity: search engines place more importance on relevant links from sites they consider to be authorities on a given key phrase. Search engines strive to provide the most relevant results possible by filtering out meaningless or "junk" links to a website. In some cases, large numbers of irrelevant inbound links are seen by the search engines as abuse or spamming. Practices such as "link farming" and "free-for-all" link pages are frowned upon.

As the popularity of paid search advertising grows, so does the need for organic optimization. While paid search campaigns can offer short-term exposure, organic optimization delivers steady, long-term results. And as the cost of paid search advertising continues to climb, organic optimization offers the assurance of appearing in the results pages for natural searches.

By combining attention to search engine technicalities, site structure and usability, organic optimization not only focuses on search engine results but also offers the end user of a site a quality experience.

About the author: Jeff Palmer is a Search Engine Optimization specialist and Senior Interactive Designer for Openvision, an Internet marketing company located on Hilton Head Island, South Carolina.

http://www.openvision.com succeed@openvision.com

Saturday, March 15, 2008

How I reached #1 in Yahoo!

Author: Michael Rock

How I reached #1 in Yahoo!®

By accident? No, all that reading and studying of statistics finally paid off. I started out by reading articles from people with opposing views and trying to figure out who was wrong and who was right. To find the right person to listen to, I didn't only look at his educational background; I was more interested in his results. Good thing too, because results are what I found. And I hope people look at my results to help them reach #1.

In the past I had a simple traffic tracker on my site that showed someone had visited after typing 'online editor flash sites' into Yahoo!. I went to Yahoo! and typed in the keyword phrase myself and found I was at #1 out of 1,040,000 hits! Two weeks later I dropped to #2 and found out that my other site had taken #1. Now I held the #1 and #2 positions in Yahoo!. My world changed at that moment and it will never go back. Today I have more sophisticated software, charts, data, articles, and trends for my sites posted on the wall. I keep up with forum discussions and receive up-to-date news in my email telling me of changes taking place. I have reached the top 10 in the top 3 search engines (Google, Yahoo!, and MSN) and want to show you how to do the same.

Part One: Keywords and Keyword Phrases

SEO - search engine optimization: the process of designing your site to achieve high rankings in the search engines.

Picking the right keywords is essential. Step into your customers' shoes, think of the keywords they would use to find the information on your site, and write them down in a list. Ask friends and others with different opinions what they would type to find that information, and add their suggestions to your list. Then check your competitors' sites that have reached #1 in the search engines. (Be careful not to pick the sponsored sites at the top or side; they got there with a PPC (pay-per-click) ad campaign. You want the non-sponsored sites.) Right-click anywhere on the page and select "view source" to see the page in HTML format. In the header, the tag <META name="keywords" content=" keywords are here "> holds the keywords that the site used to get to the top. Add these to your list.

You should now have about 40 to 50 keywords and keyword phrases to research. (By the way, 2- and 3-word keyword phrases are used the most today, whereas only a few months ago single-word keywords were the norm.) Following just one of these three avenues (your list, your friends' list, and view source) will not get you to the top; you must use all three and research the 40-50 words.

Next, take these 40-50 keywords to the search engines and type them in. Do the results pertain to the information on your site? If not, cross them off. You should have about 40 keywords left.

Here is a good SEO secret. Visit these sites:

http://excite.com

http://teoma.com/index.asp

http://www.lycos.com/

http://www.webcrawler.com/info.wbcrwl/

http://www.bos2.alltheweb.com/

Reenter your 40 keywords at each of them and you'll find lists of suggestions under headings such as "Did you mean . . .", "Other users typed in . . ." and "Related links . . .". Using these search engines is an untapped resource that should not be overlooked. The keywords listed there are the ones Joe Q. Public actually used to look for the information you want him to find on your site, the averaged opinion of all the people browsing the internet for it. Adding the relevant keywords from these lists will improve your chances of getting to the top immensely.

There are also free keyword generators on the internet to help you add to your list. The top two are Google Keywords ( https://adwords.google.com/select/main?cmd=KeywordSandbox ) and the Overture keyword selector tool ( http://www.content.overture.com/d/USm/ac/index.jhtml ). Overture will tell you how many times a keyword was typed in during the last month. Run your keywords through these tools to see how they rank, and cross off the ones that don't generate much traffic. And guess what? You get yet another list of keywords to help you reach #1, shown in order from most used to least used. Add the relevant ones to your list.

You should now have a list of about 75 or more keywords. Remember: the more research you do, the better your chances of getting to #1. What do you do with this large list now? Determine how often each keyword is used by Joe Q. Public and relate that to how many of your competitors use it in optimizing their sites. This gives you a KEI (Keyword Effectiveness Index) rating, which compares the daily world searches for a keyword with the number of competing web pages to pinpoint exactly which keywords are good enough to use when optimizing your site.

Part two of this article will show you how to narrow your list down to the 10 most powerful keywords in your market to raise your ranking in the search engines. Be sure to look for part two, coming soon.

Copyright © Michael Rock

Web development contractor (Web Design and Hosting), Internet Presence: www.TheInternetPresence.com

About the author: The owner of this registered company has twenty years of experience with DOS, Windows business applications, numerous programming languages, artistic development, and web design. Other areas of interest include web marketing, web promotion, and business marketing and development. After persuasion from those praising his work, he decided to go into business for himself and highly recommends that everyone else do the same.

Friday, March 14, 2008

Along Came a Spider (Part One)

Author: Julia Hyde

So, your Web site is up and running. It looks great and on its first day you're excited about getting your first order. But your excitement soon turns to weariness as that one order is the only one that comes in for a whole month. And worse, your Web site statistics show a disappointing hit rate. So much for the perception that "if you build it, they'll come."

Every Web site owner wants people to visit their site, but very few understand the role search engines play in getting those people there. And fewer still understand how relevant content can not only attract the search engines, but convert your visitors into paying customers.

Have no fear. This guide will help you understand the relatively simple steps you can take to make sure the search engines send targeted traffic to your site, and increase your sales.

But before we begin it's important to understand how search engines work, and make the distinction between crawler-based search engines like Google and a directory like DMOZ.

Part One - Understanding the difference between search engines and directories

Crawler-based search engines.

Crawler-based search engines, or spiders, literally "crawl" the Web looking for content. They're able to do this because of the way pages on the Internet link to other pages by way of hyperlinks. Anyone who's sat down at the computer "for five minutes" to find information has experienced this linking system—hours later you're still there, completely off track, clicking away from one page to another to another.

The search engines use this linking system in much the same way as human users. For example, when Google sends its "spider" (fondly known as GoogleBot) to "crawl" the Web it follows links from page to page indexing the content it finds along the way. The information is then stored in a huge database somewhere at Google. Later, when someone enters a particular word or phrase into the search box, Google scans its database for possible matches. It then displays pages that contain, or relate to the word or phrase in an order it considers most relevant.

There are really only two major crawler-based search engines, Google and Yahoo. The others, with the exception of several smaller engines such as Ask Jeeves/Teoma and engines based outside the United States, get their results from these two. See below to find out which engine supplies and which engines receive.

While Google and Yahoo crawl the Web in much the same way, the results you receive from each can vary greatly. You can see an example of this by searching for "direct mail packages" on both Google and Yahoo. As of today, (and this is certain to fluctuate on a daily basis) a test page from my site (www.juliahyde.com) with the title "Sales Letters and Direct Mail Packages" hovers around number 12 on Google's results. Perform the same search on Yahoo and the page ranks number one. It also ranks number one on MSN, but that's because, until MSN officially launches its own search engine, Yahoo supplies its results.

Contrary to popular belief, there's no need to submit your Web site pages to the crawler-based search engines; if your site is built with the search engines' and your visitors' best interests in mind, the crawler-based engines will find it on their own. I'll talk about this in more detail in a future chapter.

The two major crawler-based search engines supply results for:

Google - AOL, Netscape and iWon

Yahoo (synonymous with Inktomi) - MSN, Alta Vista and AllTheWeb

Online Directories

Directories are like giant yellow pages that compile, rank and organize listings into different categories and sub-categories. They do not crawl the Web looking for content but rely on submissions from web site owners. Professional, human editors generally edit directories. Most of them work something like this:

• You want to buy a pair of jeans, so you go to a directory like DMOZ and click on the main shopping category.
• Then you click through the sub-categories ("apparel", "retail", "jeans" and so on) until you find exactly what you're looking for.

Top Directories include:

• The Open Directory (supplies directory services to Google)
• Yahoo
• Looksmart
• Gimpsy
• Zeal
• JoeAnt

Although submitting your site for inclusion in directories will drive some visitors to your site, you should not necessarily base your decision to submit on how much traffic you think you'll receive from the directory; rather, view the submission as an opportunity to obtain a link to your Web site. Why? Because a directory listing will allow the crawler-based engines to follow a link to your site and help get your site indexed in their database quickly. It will also give your site a good quality incoming link (more about this in later chapters).

Next month: Words. Words. Words.

About the author: Julia is an independent copywriter and consultant specializing in advertising, search engine optimization and search engine marketing services. To learn more about how Julia can help boost your company's profits visit her site at www.juliahyde.com. You may also like to sign up for Marketing Works! Julia's monthly ezine. Visit www.juliahyde.com/form.html to sign up or email Julia at info@juliahyde.com for details.

Thursday, March 13, 2008

Web Design Optimization

Author: Melih Oztalay

Is it possible to have an attractive website that is still optimized for search engines? The answer is absolutely yes! Of course, we tend to be misguided about what will dazzle the visitor. It is not necessary to overwhelm your website with graphics, which only cause the page to download more slowly. Our intention is to provide tips that keep a site attractive while focusing on what we call Easy Spider. Easy Spider is a term we use at our firm for optimizing the HTML code without touching the aesthetics of the website.

There is no question that most webmasters are developing websites that are complex and full of graphic design elements. Even though we have faster connection speeds today than 10 years ago, we can still go overboard, and the page will still download slowly. Of course, the user will never wait that long for a website to load, and will move on to the next website in their search results.

One does have to ask why webmasters are still developing slow-loading websites. It is primarily a lack of knowledge: knowledge of simple graphic optimization techniques that would allow them to maintain an attractive website while keeping the page size smaller.

Did you know that a box with rounded corners can be achieved using CSS code only, without the need for any graphic image? It is possible, and although support is not yet consistent across all web browsers, we are heading in a direction where browsers will accept more CSS alternatives. The point we are trying to make here is not CSS versus graphic images, but that we tend to be stagnant in our knowledge and do not keep up with the fast-changing trends in web development.
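As a minimal sketch of that rounded-corner idea (the class name and colours are invented, and support for the border-radius property varies by browser), the box can be drawn with no corner graphics at all:

<!-- Hypothetical sketch: a rounded box drawn entirely with CSS -->
<style>
  .rounded-box {
    border: 1px solid #999999;
    border-radius: 8px;        /* rounds the corners in browsers that support it */
    background-color: #f4f4f4;
    padding: 10px;
  }
</style>
<div class="rounded-box">No corner images were downloaded to draw this box.</div>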

For instance, Flash-only websites, while attractive and interactive, are not appropriate for search engine optimization. Consider keeping your web page 90% regular HTML text and inserting Flash elements to provide the interactive attractiveness.

Granted, one should not limit the web designer by placing restrictions that impact the final outcome. However, do consider implementing code over graphic elements wherever possible. Equally, consider the complexity of the graphics as the website's structure, layout and functionality are being planned.

Another example is when the design is finished and you are ready to slice it into small images to be used in the html code. Everything you do at this stage will affect the total page size. If your design contains rounded shapes that overlap each other or areas with color gradients, then you must slice it carefully so the outcome is a small file size.

Let's look at what efficient slicing means:

1. Do not make large slices that contain lots of different colors. Use a small number of slices where each slice contains a limited number of colors.

2. Do not make a large slice that contains a repeating graphic structure. Slice a small portion of it and repeat it in your code. This is a very common mistake that webmasters/programmers make when dealing with gradient color backgrounds.

3. Do not use the JPEG file format all the time. In some cases a GIF will be much smaller in size. A rule of thumb: a slice with a high number of colors will be smaller using the JPEG format rather than the GIF format, and the opposite is also true. Check each option separately. Every 1KB that you shave off an image file will eventually add up to a significant reduction in page size.

4. Consider using PNG graphic file formats that will provide you the balance between transparent backgrounds and larger color possibilities.

5. If you have text on a solid color background, do not slice it at all. Use code to create the background instead. Remember that you can define both the font style and the background color of the area using CSS (a sketch follows this list).
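To illustrate point 5 (the class name, colours and wording are made up), the banner below is plain text styled with CSS, so no image slice is needed for it at all:

<!-- Hypothetical sketch: styled text on a solid colour instead of a sliced image -->
<style>
  .banner {
    background-color: #003366;  /* solid background defined in code, not in a graphic */
    color: #ffffff;
    font: bold 24px Arial, sans-serif;
    padding: 12px;
  }
</style>
<div class="banner">Spring Sale on Blue Widgets</div>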

Advanced Techniques

Graphically optimizing a website is more than just knowing how to do image optimization. There are some advanced techniques that require a higher level of programming. CSS2 has much more to offer than CSS1 does. Although not all browsers have adopted this standard yet, you should be ready for when they do.

PHP scripts became more prevalent during 2005 and can actually feed information that search engine spiders can use. PHP is preferable to JavaScript.

JavaScript is still a good possibility to consider. It also gives you a set of options for creating some cool effects without needing to overload the page with Flash. Using a limited tool like JavaScript rather than an advanced application like Flash to create the desired effects can be difficult. However, think about the outcome: for a one-time effort you can differentiate your website from the others. You will have an attractive, professional-looking website that loads quickly.

Get used to writing well optimized web pages because search engines are still the number one method of finding information on the web. Search Engine spiders have to be considered in this cat and mouse game that will never end.

About the author: Melih Oztalay is the CEO of SmartFinds Internet Marketing. Internet marketing is not only about knowledge and experience, but also about imagination. Visit SmartFinds Internet Marketing.

Wednesday, March 12, 2008

12 proven steps to Get number one ranks in search engines.

Author: sumit

12 proven steps to Get number one ranks in search engines.

You will have heard that search engine ranks are crucial, and it's true: most of the traffic you see will come from search engines. My websites get a good number of visitors because they show up at number 1 in Google, MSN and Yahoo!. Together these search engines drive over 90% of the search market, and about 1 million people search every day for what they want to buy. If your site comes up in the number 1 position in these search engines, imagine the boost to your bank account. But how do you get this rank? Let me show you very clearly how to rank #1 in search engines in 12 proven steps.

First, create a text file on your computer and name it "Analysis".

Step 1: Go to the search engine on which you want high ranks.

Step 2: Search for the term you are targeting. For example, if you want to rank high for "SEO", then search for it.

Step 3: Look at the title of the number one site that the search engine shows you. Count the number of times your search words appear in it. Add this number to your "Analysis" file.

Step 4: Count the number of times they appear in the description provided by the search engine. Add this number to your "Analysis" file as well.

Step 5: Visit the site and count the number of times the term appears there, and add this number to your "Analysis" file. Don't forget to count the occurrences in the META description tag too!

Step 6: Type the URL of that site into the search box with the link attribute. For example, if www.mydomain.com is number one, then search for "link:http://www.mydomain.com". The search engine will show you the number of other websites linking to that site. Add this number to your "Analysis" file.

Step 7: You can also record the same data for the second and third results in the search.

Step 8: Open the page you want to get ranked in your favourite HTML editor. I personally use the one at http://www.website-design-software-india.com . Most of the HTML commands are built in and I don't have to bang my head against HTML code when I am trying to concentrate on page optimisation. After all, time is money.

Step 9: Modify your title and meta tags so that the number of times the keywords appear is just one higher than the number in your Analysis file. Do the same for the BODY of your page. Enclose the keywords in BOLD tags and sometimes in both BOLD and ITALIC tags (a short sketch follows). I just have to choose an option for this in Easy Html Editor SSS12 at http://www.website-design-software-india.com . Didn't I tell you it is my favourite?
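Here is a rough sketch of what Step 9 can produce (the keyword "blue widgets" and every value below are only examples; match the actual counts to whatever your Analysis file says):

<!-- Hypothetical sketch for Step 9: keyword in the title and meta tags,
     and emphasised in the body with BOLD and ITALIC tags -->
<title>Blue Widgets - Buy Blue Widgets Online</title>
<meta name="description" content="Blue widgets in every size, shipped worldwide.">
<meta name="keywords" content="blue widgets, buy blue widgets">

<p>Our <b>blue widgets</b> are machined to order, and every
<b><i>blue widget</i></b> ships with a two-year guarantee.</p>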

Step 10: If you have a lot of text on the page, divide it into paragraphs. Before every 2 to 4 paragraphs add a HEADING tag, and don't forget to squeeze your keywords into this tag! You can use a style sheet to make the heading look better on your page, for example by reducing the heading text to about 20px instead of displaying an ugly, oversized heading. A sketch of such a heading follows below.
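A minimal sketch of the kind of styled heading Step 10 describes (the class name, size and keyword are invented):

<!-- Hypothetical sketch for Step 10: a keyword-bearing heading kept to a modest size -->
<style>
  h2.keyword-heading { font-size: 20px; }  /* about 20px instead of the default large heading */
</style>

<h2 class="keyword-heading">Blue Widgets: Care and Maintenance</h2>
<p>The first paragraphs of this section follow here...</p>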

Save and upload this page to your website.

Step 11: Get your site indexed by the search engines. You can submit a sitemap to Google directly at http://www.google.com/webmasters/sitemaps/login . Google wants an XML map, which is also read by other search engines. Generate this map for your site and submit it to Google; for the other search engines, link the map from your website's pages. The XML version is not for the human visitors to your site, though, so create an HTML version for them and link it from all your pages (a bare-bones sketch follows below). A free tool for this is at http://www.ad4business.com . It will use your XML map to create an HTML map, count the pages on your site, link to all of them, divide them into categories, and also link to your XML map for the search engines. Best of all, it lets you apply your own website template so your visitors know they are still on the same site. Have a look at their sitemaps to get an idea of what yours will look like: http://www.ad4business.com/sitemap.html (HTML version) and http://www.ad4business.com/sitemap.xml (XML version). Providing a sitemap will help your site get indexed faster. Google says you can list up to 50,000 pages in a single sitemap.
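A bare-bones sketch of the human-readable HTML sitemap described in Step 11 (all page names are hypothetical); it lists the site's pages by category and also links to the XML map that the search engines read:

<!-- Hypothetical sketch: an HTML sitemap page for human visitors -->
<h1>Site Map</h1>
<h2>Products</h2>
<ul>
  <li><a href="/widgets/blue.html">Blue Widgets</a></li>
  <li><a href="/widgets/red.html">Red Widgets</a></li>
</ul>
<h2>Help</h2>
<ul>
  <li><a href="/contact.html">Contact Us</a></li>
</ul>
<p><a href="/sitemap.xml">XML sitemap for search engines</a></p>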

Step 12: Start building links to your site. You can do this by submitting your site to web directories. The more sites that link to you, the better; remember, you have to beat the LINK number in your Analysis file. Once again the HTML editor SSS12 helped me here as well: a list of more than 200 free web directories is provided with the download of the editor and, better still, a directory submission manager is built in, which prevents me from submitting to a directory twice. Most directory owners will cancel all your links if you submit to their directory twice. The built-in directory manager keeps a record of the number of directories you have submitted to and how many remain, and you can keep track of multiple sites. And they don't even count this as a feature on the feature list at http://www.website-design-software-india.com/sss12-html-editor.html .

Now keep checking your back links frequently in the search engines. MSN gives you the best picture; search at http://search.msn.com .

Some would suggest that you need to submit your site to the search engines, but that is totally unnecessary, because the back links will keep bringing the search engines to your site again and again. With each visit, they will index more pages of your site.

You must also link to quality sites, as this tells the search engines that you are providing your visitors with a useful resource and that your site is important enough to refer to a quality site.

Best of luck with your site. Always remember: "Write for human visitors, not for search engines."

Find over 3000 free similar tactics about site promotion at http://www.ad4business.com/sss/ and see your site win the internet marathon. You can reprint this article on your site as long as you keep all the links intact and display the URLs as links.

About the author: Sumit maintains a collection of articles about site promotion, advertising and marketing from the last 5 years at http://www.ad4business.com/sss/ . Over 3000 tactics from authoritative people.

Tuesday, March 11, 2008

Relevance of Description Tags

Author: Brian Basson

Although it is not an important factor in the search engine rank of a web page, it is still crucial to give careful consideration to having a good description tag on every single page of your website. As with the title tag, it is important to plan and decide on an "eye-catching" description tag. The description tag is ultimately what makes a person click on a specific link in a search engine result. Give special attention to this fact and you will find that your site's visits increase by a huge margin! People are naturally inquisitive, so make them WANT to click on your link to find out more, and not on the link of the competition. Put yourself in the shoes of potential visitors to your website and ask yourself: what would make me click on a specific link, and what information must the description contain? The end result: more visits and sales.

As with the title tag, do not make the description tag too long; rather, make it short and to the point, but still very attractive to click on. Google does not place any value on keywords included in the description tag, but it does display it in the results, so plan this tag well!
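A short, hypothetical example of the kind of description tag being described here: concise, matched to the page's title, and written to invite the click:

<!-- Hypothetical sketch: a short, click-worthy description tag -->
<title>Hand-Roasted Coffee Beans Delivered Weekly</title>
<meta name="description"
      content="Small-batch coffee beans roasted to order and delivered to your door every week. See this month's single-origin picks.">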

Keep the description tag relevant: make it an "extension" of the page's title. Do not lose track of this fact. It sounds obvious, but we see websites all the time whose webmasters are oblivious to it! At the same time, do not try to mislead potential visitors to your site; offer them exactly what they will find on the web page. You will find that visitors become repeat visitors, as content remains king, and all of this thanks to a well-planned description tag.

Conclusion: the three tags of a web page (title, description and keywords) go hand in hand and supplement the contents of the page. See them as intertwined and dependent on one another. If one of the components doesn't fit perfectly into the bigger picture, your web page is not optimized for the search engines. Period.

About the author: Brian is a freelance writer, website marketing expert & webmaster of 3 websites, including Rank Advance at http://www.rankadvance.com

Monday, March 10, 2008

Website Content vs. your Search Engine Rankings

Author: Brian Basson

This is one of the most important questions any webmaster should ask: which keywords, how many, and how do you include them naturally in the site's content? Many people refer to this simply as keyword density. Getting back to basics: the search engines spider websites all over the internet. Many factors give a site a certain rank, including links pointing towards the site and the title and description tags, but in the end one of the most important factors is site content. Isn't that exactly why we search the internet in the first place? We search for specific information, and relevant information is what we hope to get from a search engine. The better the content, the higher your website's rankings should be.

It is therefore extremely important to look at your site's content: what, where, when and how. The trick lies in giving the content a natural flow, like reading a book. Content with an unnatural sound to it will most certainly put off any reader. Keyword stuffing has become a huge problem, as many searches end up on a web page with irrelevant information. Luckily the search engines have started to pay attention to this and are penalising websites that adopt this crude method. At the same time, for any web page to draw the search engines' attention, it is very important that the page contain a fair quantity of keywords and keyword phrases related to the topic or theme of the page. It is not that easy, but spend some time on this and the search engines will most certainly reward you for your efforts!

Your site will also become known for its relevant, high-quality content, and more and more people will visit it regularly and link to your website. If in doubt, print out the page and ask a friend or relative to read it. They will surely be able to comment on whether or not it sounds natural.

About the author: Brian is a freelance writer, website marketing expert & webmaster of 3 websites including Rank Advance at http://www.rankadvance.com

Sunday, March 09, 2008

How to get your site a top ranking in Google

Author: Paul Bliss

It's the new American dream. Your website appears in a top spot on Google for your chosen keyword. Next thing you know, orders start coming in faster than you can handle, and you are rolling in the money. If only it were so easy, right?

Well,

It can be done. I've done it many times in many different industries. There is no secret; rather, it's just knowing what to do. I've made just about every mistake one can make with a website, but I learned from every setback. If you were only allowed to do one thing to get your site ranked in Google, without a doubt, all you'd need to do is get links to your site.

Yes, there are many other factors involved in getting your site to a top position. But as of this writing, this is the most powerful way to get a top spot in Google. It's not enough just to have links pointing to your site; you need to have your keyword "anchor linked" to your site. Anchor linking is when you use your keyword phrase as the clickable text for a link. So, instead of saying "Click Here", you would use "Widgets" as the link text, as sketched below.
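A quick sketch of the difference (the URL is hypothetical): the first link tells the engines nothing about the target page, while the second passes the keyword "Widgets" as the anchor text:

<!-- Generic anchor text: says nothing about the page being linked to -->
<a href="http://www.example.com/widgets.html">Click Here</a>

<!-- Keyword anchor text: the clickable text is the phrase you want to rank for -->
<a href="http://www.example.com/widgets.html">Widgets</a>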

Now, another point of consideration is determining what keyword or phrase you want to use to get your site found. Most times, people impulsively choose a one-word phrase. While this would be a great way to bring traffic to your site, would it bring targeted traffic, with people looking specifically for your product or service? Most times when people type in a one-word keyphrase, they are at the beginning of their search.

They may type in "Shoes" but really be looking for "Running Shoes". So, if you have a top ranking for shoes, do you serve that user's needs? Maybe, but they may also be looking for dress, casual, women's, men's, children's, athletic, girls', boys', etc. This is why, when you begin to optimize your site, you should focus on more targeted keyword phrases.

Suppose you sell a certain brand of dress shoes. For this example, we'll call the famous brand XYZ. By getting anchor links that read "XYZ Dress Shoes", you are already eliminating those users who are looking for another brand or line of shoe. Next, you need to make sure that the page being linked to contains on-page content about "XYZ Dress Shoes". If you link to a page without relevant content, Google may view the link as possible spam or, more accurately, as irrelevant.

Now, once you have compiled your list of keywords, you need to see which ones are searched on the most. The best tool for this is WordTracker, and it is worth the small fee you pay for one day's access. There are also free tools online that you can use, but WordTracker will give you the most accurate results.

Once you have run through your list of keywords, the obvious choice is to pick the ones with the highest number of searches (and content relevant to your site!). The next step is to begin the process of a link campaign. Now, I can already hear you complaining about doing a link exchange. That is only one third of your campaign: the ideal method is not only to engage in reciprocal link exchanges but also in strategic linking.

Strategic linking is when you get a link to your site without having to return the favor. What's the best way to do this? Write an article just like this one. If I get one website to use this article and have it point to my site, I've just created another link to my site. Pretty easy, eh?

Since you have now engaged in a linking campaign, you should expect to see results in Google in as little as 4 days and as long as 6 months. All of this is determined by where your links are coming from and the popularity of the site each link came from. Next, get as many links as you can pointing to your site, with your popular keyword phrase anchor linked to your site.

As I mentioned before, there are many other factors that will further enhance your rankings in Google, but the implementation of a link campaign is the strongest method to get your site to a top ranking!

-To your online success!

Paul Bliss

About the author: Paul Bliss is a leading authority in the emerging field of search engine optimization. Paul has more than four years experience in the field and has produced some truly amazing results. Email Paul at pbliss@seoforgoogle.com or visit his site at: http://www.seoforgoogle.com