Sunday, August 31, 2008

Search Engines 101 - Search Engines Explained

Author: Kristy Meghreblian


What Are Search Engines?

A search engine is a database system designed to index and categorize internet addresses, otherwise known as URLs (for example, http://www.submittoday.com).

There are four basic types of search engines:

Automatic: These search engines are based on information that is collected, sorted and analyzed by software programs, commonly referred to as "robots", "spiders", or "crawlers". These spiders crawl through web pages collecting information which is then analyzed and categorized into an "index". When you conduct a search using one of these search engines, you are really searching the index. The results of the search will depend on the contents of that index and its relevancy to your query.

Directories: A directory is a searchable subject guide of Web sites that have been reviewed and compiled by human editors. These editors decide which sites to list and in which categories.

Meta: Meta search engines do not crawl the web themselves. Instead, they use automated technology to send your query to several other search engines at once and then deliver a combined summary of those results to the end user.

Pay-per-click (PPC): A search engine that determines ranking according to the dollar amount you pay for each click from that search engine to your site. Examples of PPC search engines are Overture.com and FindWhat.com. The highest ranking goes to the highest bidder.

There are a few downfalls you should know about using PPCs:

1. The use of PPC search engines as part of your search engine optimization process will not improve your positioning in the regular editorial search results. Instead, paid listings almost always appear in a "Sponsored" or "Featured" area located at the top or side of the regular search results page. Even though your paid listing will appear at the top of the page, many users will not click on it because they see it as an advertisement. Banner ads, which people once clicked freely, are now widely regarded as a nuisance, and the same thing is happening with PPC listings. Also, PPC listings are not always as relevant to a query as the editorial search results.

2. If your site is not effectively search engine optimized before you begin to submit it to a PPC, it will remain poorly optimized afterwards. The optimization of your Web site is critical to the success of your rankings.

3. When you stop paying for a PPC submission, your listing disappears and so does the traffic.

PPC listings can be an effective short-term solution for gaining exposure and driving immediate traffic to your Web site while you wait for full indexing, but they can become expensive if used as a long-term solution.

How Do Search Engines Work?

Search engines compile their databases with the aid of spiders (a.k.a. robots). These search engine spiders crawl the Internet from link to link, identifying Web pages. Once search engine spiders find a Web site, they index the content on those pages, making the URLs available to Internet users. In turn, owners of Web sites submit their URLs to search engines for crawling and, ultimately, inclusion in their databases. This is known as search engine submission.

When you use search engines to find something on the Internet, you're basically asking the search engine to scan its database and match your keywords and phrases with the content of the URLs they have on file at that time. Spiders regularly return to the URLs they index to look for changes. When changes occur, the index is updated to reflect the new information.

What Are The Pros And Cons Of Search Engines?

Pro: With the vast wealth of information available on the Internet, search engines are the most effective and efficient way to find information based on your specific search requests.

Con: Because search engines index mass quantities of data, you are likely to get irrelevant responses to your search requests.

Are Search Engines All The Same?

Search results vary from search engine to search engine in terms of size, speed and content. The results will also vary based on the ranking criteria the search engines use. If you aren't getting the results you need, try a different search engine. While the results may not be wildly different, you may get a few search results from one search engine that you didn't from another.

How Do Search Engines Rank Web Pages?

When ranking Web pages, search engines follow specific criteria, which may vary from one search engine to another. Naturally, they want to generate the most popular (or relevant) pages at the top of their list. Search engines will look at keywords and phrases, content, HTML meta tags and link popularity -- just to name a few -- to determine the value of the Web page.

About the author: As Submit Today's copywriter and editor, Kristy Meghreblian has written online content for many successful companies, including Monster.com. She has successfully combined her excellence in journalism with the delicate art of keyword density as it relates to search engine optimization. As a result, she has helped many Submit Today clients achieve top ranking. Submit Today is a leading search engine optimization, submission and ranking company located in Naples, Florida.

Saturday, August 30, 2008

Dispelling the Myths - Will WebPosition Get My Site Banned from Google?

Author: Matt Paolini


In mid November of 2003, Google seriously revamped their ranking algorithm. As a result, many sites were dropped from their index, or fell dramatically in rank. This infuriated many Web site owners at the height of the holiday buying season.

Since that time, many accusations have been thrown at Google as to the reasons why this happened. Some say it's a plot to encourage people to buy Adwords listings. Others have even theorized WebPosition is somehow to blame. Still others cite more traditional causes.

As soon as Google changed their algorithm, many WebPosition Gold customers whose sites had dropped contacted me demanding an explanation. They wanted to make sure their sites were not dropped because they had used WebPosition Gold. I reassured them that this was not the case. I went on to explain that many thousands of sites were dropped that don't even use WebPosition Gold.

Many of our customers even saw their rank increase. In addition, most of the time the site had not actually been banned from the index. It had simply dropped in rank.

In this article, I will attempt to dispel many of the pervasive myths regarding WebPosition Gold and Google. I've used WebPosition for years on my own site and for clients. I've also helped provide technical support to others using the product. Therefore, I've been on both sides of the fence, and thereby feel uniquely qualified to address the most common questions that tend to come up:

1. Will running automated Reporter Missions on Google get my site banned?

No. Despite repeated rumors, when running a Reporter Mission, WebPosition Gold does not pass personal information, such as your name, address, email, Web site URL or domain name to Google. Instead, it conducts queries as a normal browser would, and then examines the results offline. With that in mind, Google cannot determine if you're running a query relating to a specific domain. The only information that is passed to Google is your "IP" address. In most cases, your Web site's IP address is different than the IP address of your ISP (Internet Service Provider). So, how can Google connect the two? Simply put, it can't.

Google states on their FAQ page that they do not recommend running automated queries on their service because it uses server resources. Yet most businesses find it impractical not to measure their search engine rankings at least occasionally. It's also hardly reasonable to check rankings by hand in Internet Explorer, which, for the same keyword list, would send the same number of queries to Google anyway. Therefore, most businesses optimizing their Web sites find it impractical not to use some kind of automated tool to monitor their progress and measure their visibility. Having worked as a search engine marketer for many years, I've found that the best policy is simply to be sensitive to the needs of the search engines. Avoid being "abusive" in your practices, whether in your optimization strategies, your submissions, or your rank management. With that in mind, when using WebPosition I often recommend the following strategies:

1. Avoid excessive numbers of queries if you choose to check your rankings on Google. Most people do not have time to improve their rankings on hundreds of keywords, so there's no need to rank-check hundreds of keywords if you don't have time to act on that many rankings anyway. While your site won't be banned for excessive queries, Google could block the IP address you use to connect to Google if it found your query volume excessive. This is true regardless of what tool you use, even if it's a browser.

It has been my experience that a blocked IP is extremely rare, even among consultants conducting rank checks for dozens of clients. Presumably, Google would not want to accidentally block an IP that generates a large volume of queries simply because it's shared by many different users. Even so, it's always a good idea to practice a little common sense.

2. If you choose to run queries, try to run most of them at night or during off-peak periods, which is something Google has suggested in the past. This is when many of their servers are presumably standing idle, waiting to handle the increased volume of peak periods. The WebPosition Scheduler makes this easy to do.

3. Do not run your queries more often than is really necessary. Since Google normally doesn't update their entire index more than once a month, there's limited benefit to checking your rankings more often than that.

4. As an alternative to Google, consider checking your Google rankings using Yahoo Web Matches or another Google "clone" engine in the Reporter. Although these rankings can vary slightly from Google.com, they're normally close enough to give you a very good idea of your actual Google rankings without checking Google directly.

5. With WebPosition Gold 2, you can also use the "Be courteous to the search engines" feature on the Options tab of the Reporter so you don't query their service so quickly. This gives you added peace of mind not found in many other automated tools, assuming you don't mind your missions taking longer to run. The Submitter has a similar feature to submit randomly at various time intervals.

2. Can I use WebPosition Gold to get my competitors banned from Google?

No. If running automated queries on Google with WebPosition Gold could get a site banned, you could use it to get your competitors banned from Google. However, this is not the case.

Google even verifies this on their web site. They don't specifically name WebPosition Gold in this section; however, they do mention that there is nothing you can do to get your competitors banned from Google. For more information on this, please see the "Google Facts and Fiction" document at Google's site: http://www.google.com/webmasters/facts.html

3. Will over submitting my site get me banned?

No. Many people think that Google will ban your site if your submissions exceed the recommended daily limits. If that were the case, we could over-submit our competitors' sites and easily get them banned from Google. Google is very clear on this and even states that over-submitting will not get you banned. Even so, some of your submissions might still be ignored or discarded if they break the rules. Therefore, I recommend using the "Slow Submit" option in WebPosition Gold's Submitter and staying within WebPosition's recommended daily limits. Some people argue that manual submissions are best. However, manual submission can't warn you if you inadvertently over-submit, make a typo in your submission, or forget what you submitted and when.

For achieving top rankings, and staying indexed long-term, the best submission technique may be to not submit at all. Instead, try to establish third party links to your Web site and wait for Google's spider to find you on its own. WebPosition's Page Critic offers numerous strategies for doing this.

4. Will Doorway or Entrance pages get me banned from Google?

That depends on whether these pages contain spam. If your definition of a doorway page is a page full of irrelevant or duplicate content, and excessive keyword use, then yes, you could find your site banned. That's how Google often defines a doorway page. Consequently, the term doorway has developed a negative connotation over the years.

If your optimized page is nothing more than an extension of your main web site that happens to contain search engine friendly content, then you'll be fine. In fact, you'll be rewarded for the effort through top rankings. The key is not whether you label a page a doorway, entrance, optimized, informational, or "whatever" page. The key is whether the page contains quality, relevant content that provides the search engine with what it wants to see.

Google mentions that they discourage the use of "doorway" pages because they fear that webmasters will optimize for keywords that are not relevant to the page's content. This is a legitimate fear as they are in the business to provide relevant results to their visitors. However, if you create pages that contain what Google is looking for, then obviously Google will not penalize this page, or view it differently from any other page on your site.

With this in mind, here are a few of my tips on creating Google-friendly pages:

1. Always Include Relevant Content - Make sure that the content on each of your pages is relevant to your site. Many sites have various resources on a number of different topics. This is fine, as long as the overall theme for your Web site is solid. I would also suggest that you organize your related content into individual directories. Some businesses find it beneficial to organize each sub-theme of their site into a separate domain so they can cross-link the domains. If you do this, make sure you have links from other sites as well.

2. Avoid Duplicate Content - Create each page with unique content. If you are targeting different search engines for the same keyword, you may find that you have some very similar content on certain pages. If this is the case, you can create a robots.txt file to tell each search engine crawler not to index a page or directory that was created for another search engine (a sketch appears after this list). See the October 2000 issue (http://www.marketposition.com/mp-1000.htm#THREE) of MarketPosition for more information on creating a robots.txt file.

3. Avoid Keyword Stuffing - Creating pages that excessively repeat your keyword phrase is definitely not a good idea. This almost always will throw up a red flag to the search engine and is one of the most common forms of "spamming." How many keywords is too many? See WebPosition's Page Critic for up-to-date, specific recommendations regarding how many words and keywords are recommended in each area of your page.

4. Design Good Looking Pages - Although Google cannot tell if your page is aesthetically pleasing, it is recommended that you create pages that look good and fit the theme of your Web site. This will definitely increase the click through rate from the arrival page to the rest of your Web site.

5. Avoid Using Hidden Image Links - Many site owners think they can fool Google by including transparent 1x1 pixel image links on their home page that point to their optimized pages. These are very small images contained in a hyperlink that are not visible to the naked eye. This can get your page dropped from Google's index.

6. Avoid using links that have the same color as the background on your page - Many site owners try to hide the links on their home page by making the text color the same as the background color of the page. As with the scenario above, this can also get your page banned from Google.

7. Avoid Using JavaScript Redirection Techniques - Many Web site owners have used JavaScript to redirect a user to another page while letting Google crawl the page that contains the JavaScript code. This worked for a while, but Google eventually caught on (see the sketches after this list). Other forms of redirection, like IP cloaking, are also frowned upon by Google.
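To make tips 2 and 5 through 7 concrete, here are two small sketches. They are illustrations only: the directory and file names are hypothetical, and the HTML patterns are shown so you can recognize and avoid them, not copy them.

First, a minimal robots.txt (the file must sit in your site's root directory) that keeps each engine's crawler away from pages written for the other engine:

    # Keep Google's crawler out of the Inktomi-targeted pages
    User-agent: Googlebot
    Disallow: /inktomi-pages/

    # Keep Inktomi/Yahoo's crawler ("Slurp") out of the Google-targeted pages
    User-agent: Slurp
    Disallow: /google-pages/

Second, the three patterns from tips 5 through 7 that can get a page dropped or banned:

    <!-- Tip 5: a transparent 1x1-pixel image wrapped in a link (avoid) -->
    <a href="doorway.html"><img src="clear.gif" width="1" height="1" alt=""></a>

    <!-- Tip 6: link text colored to match the page background (avoid) -->
    <body bgcolor="#FFFFFF">
    <a href="doorway.html"><font color="#FFFFFF">hidden keywords</font></a>
    </body>

    <!-- Tip 7: a JavaScript redirect that sends visitors away from the page Google crawls (avoid) -->
    <script type="text/javascript">
    window.location.href = "real-page.html";
    </script>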

In Summary:

The rules regarding each search engine change routinely. That's why WebPosition's Page Critic is updated monthly to keep pace. As a search engine marketer, it's critical that you keep informed as to the latest search engine rules and strategies. It's also important to understand that WebPosition Gold is only a tool. When used properly, it will not get you banned or blocked, and will in fact improve your rankings dramatically. However, as with any tool, you can choose to ignore its recommendations and to go your own way. For example, you can use a hammer to build a fine house, or you can take that same hammer to knock a bunch of holes in someone's wall. Ultimately, this call is up to you, the user of the tool.

This article is copyrighted and has been reprinted with permission from Matt Paolini. Matt Paolini is a Webmaster and support specialist for FirstPlace Software, the makers of WebPosition Gold (http://www.webposition.com). He's also an experienced freelance Search Engine Optimization Specialist and Cold Fusion/ASP.NET/SQL Server Developer/Designer. For more information on his services, please visit http://www.webtemplatestore.net/ or send him an email at webmaster@webtemplatestore.net.

Interested in reprinting the above article?

When reprinting the above article, you must include the final credit paragraph which physically links to my Web site. If you'd like to rephrase the credit line or change the wording of the article for your audience, then I will try to accommodate you. I ask that you email me at webmaster@webtemplatestore.net with a copy of your proposed revisions for approval before reprinting it.

We simply require two things:

1. You must maintain the accuracy and general intent of the content.

2. We need to obtain an appropriate credit and link in exchange for your use of the article.

Thank you and we wish you the best of luck in your business!

About the author: Matt Paolini is a Webmaster and support specialist for FirstPlace Software, the makers of WebPosition Gold. He's also an experienced freelance Search Engine Optimization Specialist and Cold Fusion/ASP.NET/SQL Server Developer/Designer.

Friday, August 29, 2008

Optimize At Design Stages

Author: Anthony Parsons

Are you now wondering what this means? Let me just say that most of my customers could have had their websites rectified whilst still in the design stage instead of paying for costly repairs once finished. This applies not only to those who pay a designer but, more than ever, to those who learn web design for themselves to build and manage their own website. The costly cases are generally those who pay a web designer: having no knowledge of web design, they must pay an SEO to make all the repairs, whereas someone with a little skill in design can simply make most of the modifications themselves, at a much lower cost.

If you're designing a website, or having one made, whatever the case, seriously stop now. What you want to do is this: have one page made and pay a professional SEO to find the faults, identify weaknesses, and give you the advice and solutions to strengthen every aspect of that one page first. The faults and characteristics that an SEO discovers in one page will generally follow through all pages of a website. The only other necessary changes are keywords and copywriting text. An SEO will generally advise which keywords to utilise, and you simply send these to your web designer, or utilise them yourself within each separate page, to cover all avenues. An SEO will give you all the information you require, such as advice on Flash, navigation, page size, etc. (for a cost). You will not get this for $20. If you do, guess what? You have probably found one of those unprofessional SEOs that I mention in other articles, because that sort of money would not cover my time to properly evaluate a webpage manually and report on it.

This can save you literally thousands of dollars compared with having an SEO fix something you have either already paid for or spent a lot of time on yourself. It is much easier to have a professional SEO evaluate the one page and pay them for the extra information you require, such as all relevant keywords, copywriting advice for each page you have planned, and so on. It is money in your pocket: think smart about how you do business and get it right from the start. I am sure this is how most successful businesses work, or else they would go broke constantly correcting faults. SEO is no different. A couple of hundred or so well-spent dollars in the design phase of a website can save thousands. Trust me on this; I do it and see it often.

About the author: Anthony Parsons has been performing search engine optimization since 1998. In late 2003 he decided to fly solo and opened his own SEO business to service the global community. With his wife being an acknowledged copywriter, anthonyparsons.com as a business will continue stepping forward, breaking the boundaries of conventional SEO techniques. Making a winning husband-and-wife team, they make SEO affordable for all budgets.

Thursday, August 28, 2008

Absolute & Relative Links: How Do They Rank?

Author: Martin Lemieux


The question for this article is whether you should use "absolute URLs" or "relative URLs". This article also investigates whether Google ranks these two methods differently.

Absolute: You use the entire URL pointing to the designated page, e.g. http://www.yoursite.com/page1/index.html

Relative: You use only the path to the file, e.g. /page1/index.html

Relative gives a path that is "assumed": your browser automatically assumes www.yoursite.com belongs in front of the link.
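To make the two styles concrete, here is a small sketch; yoursite.com and the page names are placeholders:

    <!-- Absolute link: the full URL, protocol and domain included -->
    <a href="http://www.yoursite.com/page1/index.html">Page 1</a>

    <!-- Relative link: the path alone; the browser fills in the current site -->
    <a href="/page1/index.html">Page 1</a>

Both links take the visitor to the same page when they appear on www.yoursite.com; they differ only in how the address is written.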

When researching these two methods, I considered four factors:

1) 20 different popular search terms
2) The top 20 listings for each term
3) The top 10 "inbound links" for pages within each site
4) Relative/absolute URLs only, NOT images

So here's the results of this study:

1- The average results within the 20 search terms had a ratio of: Absolute 40% / Relative 60%

2- The average inbound links for each site I researched had a ratio of: Absolute 50% / Relative 50%

So it seems safe to say that Google doesn't necessarily rank "absolute/relative" paths differently.

Google may recognize the fact that neither method is wrong; the choice only reflects the designer's preference.

There is only one type of absolute or relative path that earns a bad rank: web sites that use "tracking URLs" or database-driven URLs (for example, a URL carrying a changing session ID such as ?sessionid=12345) see a significant reduction in page rank immediately.

The easiest way to see this in action is to go to www.pogo.com (online games). You would think Pogo would have a great rank, but in fact their main page rank is 0/10. This happens because every time Google crawls their URL, the site is different.

So if you care about page rank, keep your URLs the same as the day your site was born!

About The Author:

Martin Lemieux is leading the field online for web design and online advertising. Visit his company right away for many marketing tips & strategies: http://www.smartads.info


Tuesday, August 26, 2008

Do you submit your website to all the Search Engines?

Author: Jeremy Gossman


There is really only one search engine to worry about submitting to, and that is Google. You should submit only one page, your main page, once, and perhaps a sitemap a week later, but never more than once a week. If you have a link coming to your new site from a site that is already listed in Google, then Google will find you quicker.

Most importantly, the search engines that have any bearing will find you if Google has you listed. That is not to say there are no other worthwhile search engines; there are just many that look to Google. If you have a non-commercial website, you will want to look into DMOZ for a separate submission.

So don't pay for directory submission; many directories will bring you no traffic at all.

Here is the #1 tip for getting into Google: get people to link to your site, in particular links from sites that are already in Google.

Important Note: The more links you have going out from your site, the more negatively your page rank with Google is affected; the more links coming in, the more your page rank is helped. If you have the highest page rank for a particular keyword, you win.

Copyright 2003, WellnessWealthSystem.com, All Rights Reserved

About the author: Jeremy's Wellness Business

Monday, August 25, 2008

Organic Search Engine Optimization

Author: Anthony Parsons

Organic Search Engine Optimization (SEO) is simply a marketing term for the natural development of a website. Organic SEO allows a website to gain free listings within the major search engines without regular ongoing marketing costs. SEO comprises many aspects that together form a total marketing package. The difference lies in whether the SEO package is natural in form, complying with the search engines, or artificial, attempting to cheat them. It's no secret that every website owner wants the top position within the Search Engine Results Pages (SERPs). What is unknown to many is that artificially manipulating a website or other external elements will not let a website hold the top position for very long. Artificial manipulation is what people mean when they say "hit or miss" optimization.

Let me quickly define professional and unprofessional. A professional is "a person having impressive competence in a particular activity", and unprofessional means "below or contrary to the standards expected in a particular profession". I have explained this to highlight the meanings of non-organic, artificial, unethical or, more commonly, unprofessional SEO. Unprofessional SEO means utilising door-pages, entry-pages, cloaking, spamming (constant submissions), hidden text, tiny text, pages made for no purpose other than search engines, landing pages and several other methods. A professional SEO would never need to adopt these methods to achieve high, stable rankings.

There are several key advantages and disadvantages to organic SEO, as with most things. Organic SEO may be costly, and it will be costly when performed by a professional. Once performed, however, your website will achieve long-term, stable, high rankings for that one outlay, and then your costs are finished. The disadvantage of organic SEO is that, because it is designed to obtain completely free high rankings, it takes some time to see results. Results average between 1 and 6 months for stable performance, depending on your actual keyword phrase. You also need to allow for minor tweaking and ongoing link-analysis work. A problem that can occur, and will occur for competitive keyword phrases (for example: internet, computer, marketing), is that other websites are constantly tweaking to outperform yours. This can lead to a monthly monitoring and updating cost as required. If your website is that competitive, though, the cost is obviously well worth it.

About the author: Anthony Parsons has been performing search engine optimization since 1998. In late 2003 he decided to fly solo and opened his own SEO business to service the global community. With his wife being an acknowledged copywriter, anthonyparsons.com as a business will continue stepping forward, breaking the boundaries of conventional SEO techniques. Making a winning husband-and-wife team, they make SEO affordable for all budgets.

Sunday, August 24, 2008

Link Manipulation

Author: Anthony Parsons

Link popularity is a winning factor in many campaigns, able to deliver a substantial boost in your rankings; however, with all that good can come over-inflated, out-of-control manipulation (bad).

The search engines will not be defeated for long. Those who think they can get away with something in the short term will generally come unstuck in the long term. Google recently changed their algorithm (end of 2003), which upset many website owners who relied upon Google results to provide them business. To me, that is ineffective marketing, whichever way you look at it.

Websites that dominated the rankings because of link popularity suddenly came unstuck and are now nowhere to be seen. Websites with minimal popularity but great content, and even those with little content and medium popularity, have begun to dominate the rankings for their given terms. Many people blame Google, blame SEOs, and anyone else they can. But these are free rankings that you're contending within, and they are constantly being manipulated by those wringing every inch from the system's structure.

Users are not going to tolerate this for long if they cannot find quality content and products when they search. With this sudden upset and some minor tweaking by Google, the results will eventually steady, and those who once dominated will still be nowhere to be seen.

Many people, especially SEOs and link marketers, get carried away about which websites their link will be displayed upon. Yes, a website with wrong or illegal content, or one that blatantly abuses search engines' editorial policies, is a definite no-no, but everything else is OK.

Another thing that I have found is people getting too wrapped up in what Google wants! It is well documented that Google is currently the major search engine on the Internet. Google is also the major search engine that keeps changing its rules, because so many attempt to manipulate it. Remember this: Yahoo was the leader of the pack in 2001. Who is next?

What about the other MAJOR search engines that deliver the other 50% or so of traffic? Worrying only about Google leads people to become too choosy about who they will and won't link to. How about this: link to whomever you please within the above standards, and advertise your website as much as possible.

Whether your website is on a page with no PR or a page with PR10, pages change, link importance changes, and every search engine evaluates different aspects to rank websites within its engine. Maintaining a high ranking is like changing your underwear: a daily requirement!

About the author: Anthony Parsons has been performing search engine optimization since 1998. In late 2003 he decided to fly solo and opened his own SEO business to service the global community. With his wife being an acknowledged copywriter, anthonyparsons.com as a business will continue stepping forward, breaking the boundaries of conventional SEO techniques. Making a winning husband-and-wife team, they make SEO affordable for all budgets. http://search-engine-optimisation.anthonyparsons.com

Saturday, August 23, 2008

Newbie in SEO? Read It

Author: Shankey Joshi

Here are some basic points of SEO that are very useful for anyone who is new to it.

The domain name: The domain name should entirely reflect your website. A hyphen can be used in the domain name; it doesn't look good, but in an SEO sense it is useful for search engine ranking.

Web hosting companies and some other things: a) Ideally, the site URL is tied to a static IP address, not a dynamic one. A static IP address is related to only one site, while a dynamic IP address is assigned dynamically to any site on the main server. Around 98% of sites have dynamic IP addresses, so try for a static IP address.

b) mod_rewrite: This technique turns dynamic (database-driven) pages into HTML-style pages. It is done because search engine spiders skip pages whose URLs contain %, =, ? or + signs; by converting them to .html pages with mod_rewrite, the spiders will list these dynamic pages (a sketch appears after this list).

c) If going from one page to another requires filling out a form, the second page is not listed by the search engine spiders: spiders do not fill in forms, so they skip that page and the pages beyond it. Make a site map that links to those pages and describes the information related to them. A site map is important for SEO.
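As a sketch of how the mod_rewrite technique from point b) works, assuming an Apache server with mod_rewrite enabled (the URL pattern and script name are hypothetical), a couple of lines in an .htaccess file can map a clean .html address onto the real dynamic page:

    # .htaccess - serve a crawler-friendly .html URL from a dynamic script
    RewriteEngine On
    RewriteRule ^products/([0-9]+)\.html$ /products.php?id=$1 [L]

A spider can then request www.yoursite.com/products/42.html and never see the ?id=42 query string that might otherwise make it skip the page.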

Selecting the keyword phrases: Selecting keywords is a very, very important part of SEO, so give it plenty of time and never ignore it. Keyword selection basically depends on the information on the site and the field it deals in. Never use very common keywords like date, automobiles, medical, chat, etc.; with common keywords you will rank very low, because there are lots of sites using them. Use keywords that are different but have the same meaning, and think about the information you are providing on the site: if you were looking for that information, what would you type into Google (or another search engine)? Don't make single-word keywords, and don't make long phrases either; a single word is useless and a long phrase is ineffective, so make keyword phrases of at most 3 words, and put only 4 keyword phrases on one page. For selecting keywords you can use "http://inventory.overture.com/d/searchinventory/suggestion/".

Meta tags (meta tags go between the <head> and </head> tags):

A. The <title> tag: It depends entirely on your site description and page description, or the information stored in it. Use your main keyword in this tag once. The title should be no more than 60 characters or 7 words; it shows in the browser's top title bar.

B. The meta keywords tag: it is written as <meta name="keywords" content="...">. The content should be no more than 250 characters, and no single keyword should be used more than three times; using it more is just like spamming or abusing in the eyes of the search engine.

C. The description tag: In the description tag, give information only for the particular page, not the whole site. It should be fairly descriptive, use your keyword phrases too, and be no more than 200 characters long.
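Putting the three tags together, a page's head section along these lines might look like the following; the text values are hypothetical examples only:

    <head>
    <!-- Title: main keyword once, at most about 60 characters / 7 words -->
    <title>SEO Help for Beginners</title>

    <!-- Keywords: at most about 250 characters, no word more than three times -->
    <meta name="keywords" content="seo help, seo basics, search engine optimization">

    <!-- Description: describes this page only, at most about 200 characters -->
    <meta name="description" content="Basic SEO help for beginners covering domain names, meta tags, keywords and headings.">
    </head>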

Keywords in headings: Put your keywords in headings; this increases your keyword density too, for example <h1>SEO help</h1>. It also increases the size of the text. The search engines concentrate on these headings, and the ranking of the page improves if you work your keyword phrases into the headings, within limits. You can also add a heading on the same line as the following text; this makes a small heading with no space after it.

Use keywords in the text: The spiders read the text and see how many times the keyword and keyword phrases are used in it. A spider may only weigh roughly the first 80-100 lines of the page, so place your text within those lines; if text sits at the very end, the spider can skip it. That's why testimonials are often put in the left bar, which comes at the beginning of the page, right at the start of the table. The text should be at least 500 words, or more if possible.

Put keywords in the alt text: Alt text is used for the images on a page; when you hold the cursor over an image, the text is shown after a few seconds. It is a very little thing to the spiders, but these little things carry real importance, so think about it too. The text is also useful for blind visitors, who can't see the image but can hear the text through special software. Putting your keyword phrases in the alt text is good.

Anchor text: Anchor text means the text of a hyperlink. The title inside it is indexed by the spiders, so put your keywords in the title of the anchor text. Every small thing works in SEO, so concentrate on all of them. An anchor looks something like this (the href and title values are placeholders): <a href="page.html" title="keyword phrase">Submit</a>

Some neglected things: a) The heading tags are important in relation to the search engines, which give them higher treatment, from <h1> (big headings) down to <h6> (small headings). Likewise, alt text is important both for the search engines and for users; alt text on graphics is given higher treatment by the search engines, and when a user's browser is set to block graphics, the alt text is shown in the browser instead. So put your main keywords and keyword phrases in the alt text and heading tags.

b) Text is very important to the search engines, which give it very high weight, so use it with your keywords and keyword phrases. But don't put those keywords in too many times, because that is treated as spam indexing, and the search engines may bar the site from their results for good.

c) Alt text is also used for buttons and for divider bars. On a sign-in button, for instance, put alt text such as "sign in to [the keyword phrase of the site]", and so on.
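A small sketch of keyword-bearing alt text on an ordinary image and on a sign-in button, as described above; the file names and key phrase are placeholders:

    <!-- Alt text on a normal image -->
    <img src="logo.gif" alt="SEO help logo">

    <!-- Alt text on an image button -->
    <input type="image" src="signin.gif" alt="Sign in to SEO Help">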

So these are the basics of SEO. Just read more and more articles on SEO, and keep yourself updated all the time.

About the author: Shankey Joshi writes for ArticleBunch.com

Friday, August 22, 2008

Search Engines & Optimization

Author: Anthony Parsons

Search engines, search engines, search engines... Who knows which one to optimize for? Why does everyone behave so foolishly when it comes to search engines and optimization for a particular engine, or even a particular keyword? As most know, Google is the flavour of the month. It appears that everyone is excited about it. What about the other engines? Did someone forget that they exist?

Because Google is the leading engine, this year's "in" engine, word of mouth spreads, more people use it, website owners begin optimizing for it, and SEO goes nuts. How easily we forget that only a year or so ago Yahoo was the leading search engine. Which one is next, I wonder?

I think it is quite humorous that people have become strung up about one particular engine and one method of advertisement because they do well for a short period. By a short period, I mean a year or two, which is short when you're running a business over decades. Google changed their algorithm at the end of 2003 and sent the world a shockwave. Someone forgot to tell these many upset businesses and website do-it-yourselfers that relying upon one method of advertisement is not good business practice. If that's the extent of their business knowledge, then some businesses are in lots of trouble.

"Florida", the name given to Google's end-of-2003 algorithm update, changed the system. As a result, many website owners who had paid thousands of dollars, or spent thousands of man-hours, building a website came unstuck. Why? Because all of their efforts were built around the characteristics of one system. Who did they blame? SEOs, experts, copywriters, link-building companies, Google, or anyone who would listen - often the media. Someone forgot to tell them that Google is a business, and if you're manipulating a website within their search engine rankings, you're messing with their business. Keeping this in perspective, only those negatively affected are crying to the press. Those businesses or individuals who gained rankings as a result of this change, strangely enough, have nothing negative to say.

I think the moral of the story is that people need to stop attempting to over-optimize a website and never concentrate on only one source of rankings. Website owners need to focus on reality and market their websites across as many engines as possible. It is also necessary to keep a focus on what you are trying to achieve: essentially, a website that attracts visitors and stimulates their attention for your information and products.

About the author: Anthony Parsons has been performing search engine optimization since 1998. In late 2003 he decided to fly solo and opened his own SEO business to service the global community. With his wife being an acknowledged copywriter, anthonyparsons.com as a business will continue stepping forward, breaking the boundaries of conventional SEO techniques. Making a winning husband-and-wife team, they make SEO affordable for all budgets.

Thursday, August 21, 2008

The Marketing Secret To Page Rank (PR)

Author: Anthony Parsons

Since Google implemented PR, the link popularity consortiums have gone crazy. PR, PR, PR: that's all anyone seems to be focused upon. Did someone forget what makes the WWW? Freedom of information, not how big my PR is. So, what is the big secret to PR, and just how do you obtain that high six and above? It's called a great website! That's it, no more, no less. Anyone who has a great website, with information that others find useful, exciting and refreshing, will generally have a high PR. Why? Because it's an informative website that people like and will happily spend their time browsing. Often, browsing leads to spending.

If people like your website, simply because the information and content appeal, users will link to your website just because they like it, not because it is made with Flash or has some unbelievable design to it. That should be your focus with any website. So, what is marketing then? Stay with me for a second. The definition of marketing is: the promotion and selling of products and services. The definition of a marketeer is: a person who sells goods or services in a market. Now, with that in hand, your website is a MARKETPLACE. The definition of marketplace is: a competitive or commercial arena. How does that grab you? A competitive or commercial arena is why you need to have a website that stands out from the crowd.

Everyone is providing something, whether for free or with some cost involved. All are exchanging information or selling products or services. All are aiming to appeal to your senses and keep you browsing or, finally, buying. Well, some anyway. If your competition has a very informative website, then you need to have more information and something new that enhances the product, service or information. The free thing is well overdone on the Internet and doesn't appeal to many anymore. People are not stupid. They want quality information and value for money. That is the bottom line of any successful business focus these days, not just dollars and cents.

If you can make a website that attracts interest, then you will gain a good PR. You can reciprocate links and pay for inclusion until the cows come home, but if nobody is visiting, then all of your efforts are for nothing. You can have the best PR and still not rank well. On the other hand, you can have very little PR and still rank like a champion within the search engines, because people are staying on your website and visiting often for more information. It must be said that a high PR will not happen overnight; with lots of work, exposure and time, it will happen. Good information and good marketing will bring many visitors, converting into sales and/or a quality website sustaining a great Page Rank.

About the author: Anthony Parsons has been performing search engine optimization since 1998. In late 2003 he decided to fly solo and opened his own SEO business to service the global community. With his wife being an acknowledged copywriter, anthonyparsons.com as a business will continue stepping forward, breaking the boundaries of conventional SEO techniques. Making a winning husband-and-wife team, they make SEO affordable for all budgets. http://search-engine-optimisation.anthonyparsons.com

Wednesday, August 20, 2008

Why Purchase Search Engine Optimization?

Author: Anthony Parsons

Well, that's easy: so your website can actually be seen by users within the search engines when key terms relevant to your website, business or product are being searched. If your website is not ranking in the top 20 for the actual keywords being searched on the engines, then you definitely require search engine optimisation immediately.

It is a fact that 85% of visitors to a website come from search engines. It is also a fact that over 90% of users rarely go past the first twenty results, the first two pages, of search engine results. It has been statistically shown that users will generally change search engines rather than sift past the first two pages.

From this you can start to imagine the lost revenue and exposure that your business and website are suffering. Search engine optimisation is like a well-marketed television advertisement: your business is placed in front of the most appropriate audience at a given time to achieve the best return on investment. You would not place your television advertisement about "house renovations" in the morning during the cartoons, for example. That type of advertisement would be marketed during programs relevant to the subject. The same is achieved through professional website optimisation. A professional SEO will ensure your website appears where it can be seen at the most appropriate times. For example, when a search for "home renovation" or similar is run, your website would then appear on the front page of a search engine. Your website will not appear when "cartoon" is entered into a search engine, for example.

You have to look at Internet advertising like this: with billions of websites floating aimlessly, thousands or hundreds of which are in direct competition with yours, all are fighting for the front two pages of a search engine. Only a professional SEO will know how to gain that extra advantage to ensure your website can maintain a constant high ranking. Ensure you utilise a professional SEO, and I mean shop around, as a hit-and-miss job is no good when all your competition is continually attempting to rank over the top of your business.

I achieve steady top-20 rankings for my clients, as most professional SEOs do. How many visitors you will see always depends on the market you're targeting, but try not to look at optimisation as just improving your throughput; the actual aim is to achieve targeted throughput that will buy your products or services. The numbers game is not really for any website on the Internet, even though many play that angle, as each website is unique in content and only needs to be viewed when that type of information, product or service is searched for.

About the author: Anthony Parsons has been performing search engine optimization since 1998. In late 2003 he decided to fly solo and opened his own SEO business to service the global community. With his wife being an acknowledged copywriter, anthonyparsons.com as a business will continue stepping forward, breaking the boundaries of conventional SEO techniques. Making a winning husband-and-wife team, they make SEO affordable for all budgets. http://search-engine-optimisation.anthonyparsons.com

Tuesday, August 19, 2008

Keyword Density

Author: Kristy Meghreblian

We can't emphasize enough the importance of including keyword-rich content on your site to increase your ranking potential. Simply put, keywords are the words and/or word phrases people use when searching. As we've mentioned throughout the site, search engine spiders love content. Therefore, the more keyword-rich content you have, the better. When a search engine spider crawls your site, it won't recognize pictures or images. So, if you have limited amounts of text (or none at all) and you've got a lot of beautiful pictures or Flash animation, the spider may deem your site unworthy of listing.

What Is Keyword Density?

Keyword density is the ratio of a keyword or key phrase to the total number of words on that page. It is one of the most critical aspects of successful search engine optimization. To improve your search engine ranking potential, your keyword density must be just right. To calculate your keyword density, divide the number of times your primary keyword or key phrase appears by the total number of words on the page. For example, a key phrase that appears 10 times on a 500-word page has a keyword density of 10/500, or 2%. Keyword density is critical when outlining the keyword portion of your search engine optimization strategy.

Naturally, there is a fine line between strategically scattering these keywords throughout your content versus grouping them all together, separated by commas. The latter is known as spamming, and you will get penalized for doing it. Don't think you can fool the search engines -- they have the technology to figure out these little tricks.

Using Keyword Density To Improve Your Search Engine Ranking

The best way to increase your search engine ranking potential is to develop your keyword strategy by researching the most relevant (and most searched for) keywords or keyword phrases before you even begin building your site. So, you've already built your site? No worries -- you should still consider reviewing the keywords you have selected and make any necessary changes to your meta tags and site content. No matter how nice your site looks, you won't get high search engine rankings without the right keywords. And remember, if your site has a lot of graphics or Flash animation with little content, we encourage you to consider a redesign. We understand that most site owners who fit into this category have spent a lot of money on these beautiful sites, but what is the purpose if they aren't getting the high rankings?

That being said, here are a few tips on using keyword density to maximize your search engine ranking potential:

1. Use our Search Term Suggestion Tool (powered by Overture) to research your keywords. This powerful tool will direct you to the most popular keywords for your specific business based on how many times that keyword or keyword phrase is searched for each month. You can then take that information and develop your keyword strategy based on those results.

2. Incorporate these keywords or keyword phrases in your meta tags as well as your site content. People often forget that search engines will spider the heading meta tags first because they precede and stand out from your main site content.

3. Write keyword-rich content that not only satisfies the search engine algorithms but is equally informative for customers visiting your site. This is the most difficult part of writing your content -- but also the most critical.

4. Try to write at least 300 words for each page on your site. Again, the more content you have, the better chance you will have to include those all-important keywords you diligently researched and ultimately selected.

5. Too often we see content saturated with too many keywords that, as popular as they may be, just don't relate to the site itself. Avoid doing this -- it will only irritate potential customers.

6. Web sites should be updated on a regular basis -- don't let them go stale. Add new products/services, update users with new information and tools, do what you can to change your content (keyword-rich content, that is!) and keep users coming back for more.

About the author: As Submit Today's copywriter and editor, Kristy Meghreblian has written online content for many successful companies, including Monster.com. She has successfully combined her excellence in journalism with the delicate art of keyword density as it relates to search engine optimization. As a result, she has helped many Submit Today clients achieve top ranking. Submit Today is a leading search engine optimization, submission and ranking company located in Naples, Florida.

Monday, August 18, 2008

How to Optimize Your Website for Both Google & Inktomi

Author: Jim Hedger

The search engine environment continues to evolve rapidly, easily outpacing the ability of consumers and SEO practitioners to adapt to the new landscape. With the ascension of Inktomi to the level of importance that until recently was held solely by Google, SEO practitioners need to rethink several strategies, tactics and perhaps even the ethics of technique. Assuming this debate will unfold over the coming months, how does an "ethical SEO firm" work to optimize websites for two remarkably different search engines without falling back on old-fashioned spammy tactics of leader-pages or portal-sites? Recently, another SEO unrelated to StepForth told me that he was starting to re-optimize his websites to meet what he thought were Inktomi's standards as a way of beating his competition to what looks to be the new main driver. That shouldn't be necessary if you are careful and follow all the "best practices" developed over the years.

The answer to our puzzle is less than obvious, but it lies in the typical behaviors of the two search tools. While there are a number of similarities between the two engines, most notably in the behaviors of their spiders, there are also significant differences in the way each engine treats websites. For the most part, Google and Inktomi place the greatest weight on radically different site elements when determining eventual site placement. For Google, strong and relevant link popularity is still one of the most important factors in achieving strong placements. For Inktomi, titles, meta tags and text are the most important factors in getting good rankings. Both engines consider the number and arrangement of keywords, incoming links, and the anchor text used in links (though Google puts far more weight on anchor text than Inktomi tends to). That seems to be where the similarities end, and the point where SEO tactics need revision. Once Inktomi is adopted as Yahoo's main listing provider, both Google and Inktomi will drive relatively similar levels of search engine traffic. Each will be as important as the other, with the caveat that Inktomi powers two of the big three while Google will only power itself.

2004 - The Year of the Spider-Monkey

The first important factor to think about is: how does each spider work?

Entry to Inktomi Does Not Mean Full Indexing

Getting your site spidered by Inktomi's bot "Slurp" is essential. Like "Google-bot", "Slurp" will follow every link it comes across, reading and recording all information. A major difference between Google and Inktomi is that, when Google spiders a new site, there is a good chance of getting placements for an internal page without paying for that specific page to appear in the index. As far as we can tell, that inexpensive rule of thumb does not apply to Inktomi. While it is entirely possible to get entire sites indexed by Inktomi, we have yet to determine whether Inktomi will allow all pages within a site to achieve placements without paying for those pages to appear in the search engine results pages (SERPs). Remember, Inktomi is a paid-inclusion service which charges webmasters an admission fee based on the number of pages in a site they wish to have spidered. From the information we have gathered, Slurp will follow each link in a site and, if provided a clear path, will spider every page in the site; but pages within that site that are paid for during the submission will be spidered far more frequently and will appear in the indexes months before non-paid pages. We noted this when examining how many pages Inktomi lists from newer clients versus older ones: the older the site, the more pages appear in Inktomi's database and on SERPs of search engines using the Inktomi database. (This assumes the webmaster only paid for inclusion of their INDEX page.) Based on Inktomi's pricing, an average-sized site of 50 pages could cost up to $1289 per year to have each page added to the paid-inclusion database, so it is safe to assume that most small-business webmasters won't want to pay that much.

Google's Gonna Get You

Google-bot is like the Borg in Star Trek: if you exist on the web and have a link coming to your site from another site in Google's index, Google-bot will find you and assimilate all your information. As the best-known and most prolific spider on the web, Google-bot and its cousin Fresh-bot visit sites extremely frequently. This means that most websites with effective links will get into Google's database without needing to manually submit the site. As Google currently does not have a paid-inclusion model, every page in a site can be expected to appear somewhere on Google-produced SERPs. By providing a way of finding each page in the site (effective internal links), website designers should see their sites appearing in Google's database within two months of publishing.

We Now Serve Two Masters: Google and Inktomi

OK, that said, how do we optimize for both without risking placements at one over the other? The basic answer is to give each of them what they want. For almost a year, much of the SEO industry focused on linking strategies in order to please Google's PageRank. Such heavy reliance on linking is likely one of the reasons Google re-ordered its algorithm in November. Relevant incoming links are still extremely important but can no longer be considered the "clincher" strategy for our clients. Getting back to the basics of site optimization and remembering the lessons learned over the past 12 months should produce Top 10 placements. SEOs and webmasters should spend a lot of time thinking about titles, tags and text, as well as about linking strategies (both internal and external). Keyword arrangement and densities are back on the table and need to be examined by SEOs and their clients as the new backbone of effective site optimization. While the addition of a text-based sitemap has always been considered an SEO best practice, it should now be considered an essential practice. The same goes for unique titles and tags on each page of a site. Another essential practice SEOs will have to start harping on is to work only with sites that have unique, original content. I am willing to bet that within 12 months, Inktomi introduces a rule against duplicate content as a means of controlling both the SEO industry and the affiliate-marketing industry. Sites with duplicate content are either mirrors, portals or affiliates, none of which should be necessary for the hard-working SEO. While there are exceptional circumstances where duplicate content is needed, more often than not dupe-content is a waste of bandwidth and will impede an SEO campaign more than it will help.

The last tip for this article: don't be afraid to pass higher costs on to your clients, because if a client wants those placements soon, paid inclusion of internal pages will be expected. When one really examines the costs of paid inclusion, it is not terribly different from other advertising costs, with one major exception: most paid advertising is regionally based (or is prohibitively expensive for smaller businesses). Search engine advertising is, by nature, international exposure, and that is worth paying for.

About the author: Jim Hedger is the SEO Manager of StepForth Search Engine Placement Inc. Based in Victoria, BC, Canada, StepForth is the result of the consolidation of BraveArt Website Management, Promotion Experts, and Phoenix Creative Works, and has provided professional search engine placement and management services since 1997. http://www.stepforth.com/ Tel - 250-385-1190 Toll Free - 877-385-5526 Fax - 250-385-1198

Sunday, August 17, 2008

PPC For Dummies - Part Two Of Two

Author: Scott Van Achte

Two of the most important factors of any Pay Per Click (PPC) campaign are creating successful ads and deciding how much to pay per click. There are many PPC options out there to choose from; I am going to focus on the two most popular: Google AdWords and Overture.

Creating your ads for AdWords

Creating your ad copy is the single most important part of any ad campaign. You want your ad to stand out amongst the others and scream 'click me!' If your ad looks like and says the same thing as everyone else's, users will simply pass it by.

Before creating your ads you need to determine your target market and keyword selections. If your company focuses on a specific market niche, try to target your ads to that niche. Properly targeted ads will almost always out-perform those directed at a general audience.

When creating your first ad, be sure to fit in your main keywords either in the title or near the beginning of the body text. Say something to draw attention by using call-to-action phrases and words that provoke enthusiasm and response. Things like "Save on DVDs," "Get cheap stereos," or "Join now for a 20% discount," etc. Just be cautious: if you advertise something that you don't offer, Google will pull your ad. If your ad says you have something for free, you had better have something for free listed on your landing page! Always be sure to follow Google's Guidelines.

Once you are happy with your first ad, create 3 more ads that are radically different from the first. After 3 or 4 days, take a look at how your ads are doing. (If you are using less frequently searched terms, you may have to wait 1-2 weeks for better results.) Check the click-through rate (CTR) of each ad. In most cases one of the four will be outperforming the rest. If this is the case, delete the poorly performing ads and create 3 new ads that closely resemble the successful one, each with subtle differences in the title and body text.

Again, wait 3 or 4 days to see which of the ads is outperforming the rest. If you again notice that one stands out, repeat the process. Eventually you will end up with 4 quality ads that are performing equally. Once the ads have leveled out, continue to keep an eye on them; I recommend checking daily. If one begins to slip, slightly tweak the wording. You must always keep an eye on your ads if you wish them to continue performing well.
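The review step is mechanical enough to sketch in code. A minimal illustration of picking the winner by CTR, assuming you have exported each ad's clicks and impressions (the ad names and numbers below are made up):

```python
def ctr(clicks, impressions):
    # Click-through rate: the share of impressions that produced a click.
    return clicks / impressions if impressions else 0.0

def review_ads(ads):
    """ads: list of (name, clicks, impressions) tuples.
    Returns the best performer and the ads to delete and rewrite
    as variations of it."""
    ranked = sorted(ads, key=lambda a: ctr(a[1], a[2]), reverse=True)
    return ranked[0], ranked[1:]

ads = [("Ad A", 12, 400), ("Ad B", 3, 380), ("Ad C", 5, 410), ("Ad D", 4, 395)]
best, to_replace = review_ads(ads)
print("Keep:", best[0])                          # Ad A (3.0% CTR)
print("Rewrite:", [name for name, _, _ in to_replace])
```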

Determining your Max Cost Per Click with AdWords

With AdWords, when you enter your max CPC it will show you what Google estimates your average position will be for each keyword. (The position predictions provided by Google are based on historical data from previous advertisers and are not 100% accurate, but they will give you an idea of what to expect.)

Unfortunately there is no way to see what the competition is paying, so in most cases it's a bit of a duck hunt in the beginning. I suggest starting out with a max CPC slightly higher than you normally would; this will give you a slightly higher ranking and increase your chances of accumulating clicks. If your ad performs really well, your rank will increase. As you begin to establish a good click-through rate (CTR), you can adjust your max CPC to reflect the position you wish to obtain. (See part one of this article to find out how Google ranks ads.)
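Part one of this article (not reproduced here) covers Google's ranking formula; at the time it was commonly described as max CPC multiplied by CTR. Under that assumption, a quick sketch shows why a strong CTR lets you lower your bid without losing position:

```python
def ad_rank_score(max_cpc, ctr):
    # Simplified model of the era's AdWords ranking: bid x click-through rate.
    return max_cpc * ctr

# A $0.30 bid with a 4% CTR outranks a $0.50 bid with a 2% CTR:
print(ad_rank_score(0.30, 0.04) > ad_rank_score(0.50, 0.02))  # True (0.012 vs 0.010)
```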

Creating your ads for Overture

With Overture, writing the perfect ad is slightly different than with AdWords. Overture only allows you to create one ad per keyword, so this takes away the option of trying out various ads and going with the obvious winner; however, the basis for creating your initial ad remains virtually the same. After you have selected your target market and main keywords, write a specific ad targeting each individual keyword, and be sure to include the keyword in the title or the beginning of the main body text, along with a call-to-action phrase or something that is sure to draw attention. Remember to check the status of your ads on a weekly basis and tweak as needed. Keep an eye on your click-through rate and regularly tweak poorly performing ads.

Determining your Max Cost Per Click with Overture

Deciding how much to spend on Overture is simple: take a look at what the competition is spending, and outbid them. With Overture you should always try to be in the top 3 if you wish to have your ad dispersed among partner sites (Yahoo, Lycos, MSN, etc.). If the number one spot is currently paying 25 cents per click, you need only bid 26 cents to grab it. If you want the number one spot but are also willing to pay more, you can bid 40 cents and will still be charged only 26 cents: one penny above the competition. Keep in mind, though, that if someone else increases their bid, your actual cost will also increase, up to the max CPC you have entered.
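That one-penny rule is easy to capture in code. A minimal sketch of the bid logic described above (the function and variable names are mine, not Overture's):

```python
def actual_cpc(my_max_bid, next_highest_bid, increment=0.01):
    """You pay one increment above the next-highest bid, never more
    than your own maximum; if your maximum is too low, you lose the spot."""
    if my_max_bid <= next_highest_bid:
        return None  # outbid
    return min(my_max_bid, round(next_highest_bid + increment, 2))

# Competitor pays $0.25; a $0.40 maximum is charged only $0.26 per click:
print(actual_cpc(0.40, 0.25))  # 0.26
# If the competitor later raises their bid to $0.35, your cost follows:
print(actual_cpc(0.40, 0.35))  # 0.36
```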

Managing an AdWords or Overture PPC campaign can be confusing at first, but it doesn't take long to get a handle on what works. Creating a highly successful ad the first time around with either AdWords or Overture is a rare occurrence, but with a bit of regular maintenance and a well targeted campaign it won't take long to start seeing results.

About the author: Scott Van Achte is a Search Engine Optimization Professional and PPC Manager at StepForth Search Engine Placement Inc. Based in Victoria, BC, Canada. You can read more of Scott's articles and those of the StepForth team at http://news.stepforth.com or contact us at http://www.stepforth.com/ Tel - 250-385-1190 Toll Free - 877-385-5526 Fax - 250-385-1198

Saturday, August 16, 2008

Google AdWords Tips & Tricks

Author: Nakul Goyal

Synopsis: Google AdWords is a wonderful and cost-effective way to advertise. It offers great flexibility to the advertiser: you can have a campaign running in minutes, and you can change your ads whenever you feel like it after tracking visitor response and the conversion rate. However, testing keywords and ads to improve the conversion rate is a never-ending process, and one has to be very selective and alert in choosing the right keywords. To make your AdWords campaign successful and worth the spend, I present some key facts and suggestions to consider while developing AdWords advertisements.

Always be very specific: You should be very specific when choosing the keywords for your ads, because being general will bring traffic to your site but not many actual buyers; in other words, the conversion rate will be very low, and so will your return on investment.

Use square brackets: Use square brackets [ ] around the keywords you want to target the most. By doing this, your ads will be shown only when searches exactly match your bracketed keywords or phrases. For example: [search engine optimization], [SEO Masters].

In this case the ad will be displayed only when searches are made for the exact words "search engine optimization" or "SEO Masters", and you will get only the desired traffic to your site.
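To see the difference this makes, here is a toy model of exact matching versus an (overly simple) broad match; real engines also fold in plurals and synonyms, which this sketch ignores:

```python
def matches_exact(keyword, query):
    # [bracketed] keywords: the ad shows only when the query is
    # identical to the keyword, ignoring case and surrounding spaces.
    return query.strip().lower() == keyword.strip().lower()

def matches_broad(keyword, query):
    # Simplified broad match: every keyword term appears in the query.
    query_terms = query.lower().split()
    return all(term in query_terms for term in keyword.lower().split())

print(matches_exact("search engine optimization", "search engine optimization"))       # True
print(matches_exact("search engine optimization", "free search engine optimization"))  # False
print(matches_broad("search engine optimization", "optimization of a search engine"))  # True
```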

Link to exact pages: Always link to the correct page. For example, if you are advertising a particular product, link to the specific page selling that product, not to the site's homepage. Users don't like excessive browsing and will ultimately leave for your competitor if they can't find the desired information. So link to specific pages.

Offer immediate benefits: State the benefits your product will give the consumer, because users are often looking for exactly those benefits and will be drawn to your site. For example: learn free, earn at home, become an expert, lose weight, etc. Try to include such benefits in your ads and watch the results soar.

Make multiple ads: Always make two or more versions of each ad, since you can't know in advance which will take off. Constantly monitor your ads, see which one is gaining pace, and remove or change the less responsive ad. This is an ongoing process of reaching the maximum number of potential customers, and it will go on and on.

Record the ROI: Record the return on investment (ROI) of each ad to keep your expenses within your pre-decided budget, and update or change low-ROI ads promptly. ROI is the basis on which you plan your ads, and it should not be overlooked. Google offers free statistics for AdWords, so you can use them to calculate the ROI and act accordingly.
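The calculation itself is one line. A minimal sketch with made-up numbers, since the article quotes no rates:

```python
def roi(revenue, ad_spend):
    # Return on investment: profit relative to what the ad cost you.
    return (revenue - ad_spend) / ad_spend

# Hypothetical ad: 200 clicks at $0.26 each, producing $150 in sales.
spend = 200 * 0.26
print(f"Spend ${spend:.2f}, ROI {roi(150.00, spend):.0%}")  # Spend $52.00, ROI 188%
```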

Use provoking words: Use provocative words that prompt the user to act, like free, free shipping, special offer, limited-time offer, tips, tricks, etc. These kinds of words genuinely attract users, but make sure they are specific to your business; otherwise your ads can be removed by Google.

Include price quotes: A wise and proven selling technique is to quote a hot price, and that works in AdWords too. If you are featuring a particular product on your site, mention its price in your ad, and I assure you that you will get far more targeted customers to your site. This benefits you in another way, too: freebie hunters will be kept away from your ad, saving you valuable clicks and keeping you within your budget. Your main target is potential customers, not everybody, and this way you can head towards your goal.

Avoid double-meaning words: There are words and phrases that mean completely different things depending on the target audience you're trying to reach. These words should be avoided. Be sure to examine your keyword list for any words that could take on a different meaning than what makes sense for your business. The best way to figure this out on your own is simply to do searches at the major search engines for each of your keywords and see who else is in the result set with you. Finally, select the keywords that fit the purpose and relevance of your campaign.

Your goal in using AdWords is to increase ROI, not to increase your traffic. It's more valuable to you as an advertiser to receive only 10 new visitors in a month, if they're all qualified and likely buyers, than to pay for 1,000 visitors a month who found you by mistake. You can achieve this by targeting the right audience, selecting very specific keywords, and monitoring constantly.

Good Luck! Contact me if you need to learn or know more about SEO [Search Engine Optimization], SEP [Search Engine Promotion], SEM [Search Engine Marketing], PPC [Pay Per Click] Search Engines, Link Building, Google Page Rank, Alexa Page Rank, Reciprocal Links etc.

About the author: Nakul Goyal holds a Master of Sciences in Information Technology from Panjab University, Chandigarh, and a Bachelor of Computer Applications from Punjab Technical University, Jalandhar, India. He's a Microsoft Certified Professional, a Brainbench Certified 'MVP' (Most Valuable Professional), a Brainbench Certified Internet Professional and a CIW (Certified Internet Webmaster) Associate.

Website: http://www.nakulgoyal.com

Friday, August 15, 2008

Site Maps: A Force to be Reckoned With

Author: Kristy Meghreblian

Another important component of search engine optimization is the use of site maps. If you want visitors -- and search engine spiders -- to find every page on your Web site, a site map can be your biggest ally, especially if you have a lot of content on your site (and if you've been reading all the advice on our site, you should know by now that the more content you have, the better your chances are for a top ranking).

So, what is a site map? Basically, it is a navigation tool. It lets visitors know what information you have, how it is organized, where it is located with respect to other information, and how to get to that information with the fewest clicks possible. A good site map is more than a hyperlinked index, which only provides the user with a list of alphabetically arranged terms.

Site maps also provide lots of nutritious spider food for search engine robots that crawl your site and eventually index it. Once the robot gets to the site map, it can visit every page on your entire site because all the information is clearly indicated on that one page. However, in order for your site map to work most effectively, you must include a link to your site map in the navigation on every page of your site.

To make your site map most appealing to both search engine robots and human visitors, include descriptive text along with the page URLs and links, and use your targeted keywords in that text. Remember not to be too repetitive with your keyword phrases, though, or you may be penalized.
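Because a site map is just a structured list of links, it can be generated from your page inventory. A minimal sketch that renders the kind of descriptive, keyword-bearing entries described above (the URLs, titles, and descriptions are invented for illustration):

```python
# Each entry: (URL, link text, keyword-rich description) -- all hypothetical.
pages = [
    ("http://www.example.com/widgets.html", "Blue Widgets",
     "Our full catalogue of blue widgets, with prices and photos."),
    ("http://www.example.com/contact.html", "Contact Us",
     "How to reach our blue widget specialists by phone or email."),
]

def render_site_map(pages):
    # One descriptive list item per page: a link plus the text that
    # feeds both human visitors and search engine spiders.
    items = "\n".join(
        f'  <li><a href="{url}">{text}</a> - {description}</li>'
        for url, text, description in pages
    )
    return f"<h1>Site Map</h1>\n<ul>\n{items}\n</ul>"

print(render_site_map(pages))
```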

When you make it easy for people to navigate your site, they will find what they are looking for and will most likely be a repeat visitor. Likewise, when your site is easily navigable by search engine spiders, you increase your chances of being favorably listed in their search results.

So, if creating a site map isn't part of your current search engine optimization strategy, maybe it's time you thought about adding this beneficial -- and fairly simple -- tool to your repertoire.

For an example of Submit Today's site map, go to: http://www.submittoday.com/site_map.htm.

About the author: As Submit Today's copywriter and editor, Kristy Meghreblian has written online content for many successful companies, including Monster.com. She has successfully combined her excellence in journalism with the delicate art of keyword density as it relates to search engine optimization. As a result, she has helped many Submit Today clients achieve top ranking. Submit Today is a leading search engine optimization, submission and ranking company located in Naples, Florida.

Thursday, August 14, 2008

Search Engine Optimization and Web Site Usability

Author: Kristy Meghreblian

Build a Web site and the people will come. Ha! If it were only that easy! The Web is the one sales environment where the customer has total empowerment. They have all the resources (i.e., your competitors) just a mouse-click away.

Not only are you in competition with the millions of other Web sites owners who sell the same product/service as you, but you are also competing for users' time and attention. While search engine optimization and submission can bring you the traffic you need, only you can ensure that visitors will stay on your site by giving them a reason to want to stay. That is where Web site usability comes in.

What is Web site usability? The International Organization for Standardization (ISO) defines usability as the "effectiveness, efficiency and satisfaction with which a specified set of users can achieve a specified set of tasks in a particular environment." In simpler terms, usability is how efficiently and effectively users can accomplish what they are trying to do when they visit your Web site.

Now that you have an understanding of usability, we'll explain the basics of what a Web site should include to make the most of the user experience:

Content is king

Let's face it, people visit Web sites for content -- they want information. Sure, it helps if your site is visibly appealing as well. But, without the right content, the results of the user experience can be fatal to your business. They simply won't come back.

Here are a few tips to remember in regards to content:

1. Be concise. Research shows that reading from a computer screen is about 25% slower than reading from paper or other print media. To that end, you will want to edit your writing to say the same thing in half the words it would take on paper. Also, think back to the last time you came to one of those really long-winded Web sites where the content may have been great, but you still had to scroll and scroll and scroll to get to the end. It can be a nuisance. So, keep your pages short.

2. Make your content scannable. When people use the Internet, they are looking at mass amounts of information. Help them get to the core of what they want by using bulleted items, short paragraphs, and subheadings to make it easier for them to find what they are looking for.

3. Write without error. There is no excuse -- absolutely none -- for poor grammar, typographical errors, and misspellings. If you own a computer, you have access to spell-checking and grammar-checking technologies. Use them. These small details reflect upon your site. If you don't convey professionalism in your own business, how can potential clients trust you with theirs? Before uploading any new content, proofread it. Then, turn it over to someone else for their input.

4. Write as if you were a Public Relations pro. Granted, many of us aren't PR execs, but you should know how to market your business. Use the lingo that is most appropriate for your business. While you want to provide information, your main goal is still one thing: to sell. So, write to sell.

5. Maximize your keywords. As part of the search engine optimization process, you went to great lengths to select keywords and phrases that are most appropriate for your business. Be sure to use them whenever possible (without being overtly redundant) in your content.

6. Refresh, refresh, refresh. Web sites should be updated on a regular basis -- don't let them go stale. Add new products/services, update users with new information and tools, do what you can to change your content and keep users coming back for more.

7. Know your audience. Since most audiences vary in experience level, both with your product/service and with the internet, you will want to simplify things more than ever. You don't want to talk to yourself - make sure potential clients understand your product/service. The best way to do this is to create content that is informative, yet easy to understand for even the newest of the newbies.

Web site design

Secondary to content is the actual design of your Web site. While the user comes to your site specifically for information, they also will want to enter an area that is easy to use and visually appealing. Here are some usability tips regarding Web site design:

1. Avoid long load times. While the latest technology for Web sites is incredibly interesting and fun, lots of graphics, Flash images, and audio can create long load times that make the user wait. And, if customers have to wait too long, they may leave -- and never come back. As a guide, users will generally wait for a site to load for ten seconds before vacating.

2. Make your pages easy to read. A common error in Web usability is the incessant need to create the prettiest Web site that ever existed. We've all seen them - every color from the Crayola box of 64 has made its mark on these pages. And, with a little bit of color usually comes a lot of cute little images that dance across your screen. In all seriousness, resist the urge to do this. Not only will it hog a lot of memory, but it will drive your users crazy. Black text on a white background is the easiest to read. If you really want a colored background, stick with a lighter shade, but remember to use black text.

3. Create a well-organized site. Maintaining a consistent look and feel throughout your site is critical. The navigation you use on the home page should be carried through the rest of your Web site. Clear navigation can either make or break your site. You are basically providing your users with a road map to your products and services. Don't let them get lost along the way.

4. Consider your space. Content should amount to 50-80% of your page design, with navigation taking up approximately 20% of the space.

5. Stay consistent with design elements. Select one or two (maximum) fonts and stick with them throughout your site.

6. Have a secure and automated server. Amazingly, only 20% of current Web sites are secure.

7. What can you do differently? This is probably the most important thing to remember when designing your site. Think about your business and your competition. What are you doing differently that will make users visit your site? Once you find out what that is -- whether you offer the lowest prices, have a special widget that no one else sells, or have reputable customer service -- capitalize on that one thing by incorporating it in your design elements.

Conclusion

There are good sites on the Internet and there are an equal number of bad sites (if not more!) out there. The good sites provide for a smooth user experience - easy navigation and easy-to-find information. The bad sites are slow to load, difficult to navigate and leave the users frustrated before they can even get to the information they initially needed. If you've already invested the time and effort into developing a Web site, you should take a serious look at the usability of your site. Here's an easy homework assignment: Some day, when you've got a few hours to spare, surf the Internet and make note of sites you think are good and which ones drove you absolutely crazy. Investigate the qualities of those sites and what made them good or bad. Pretty soon, you'll start to see some patterns that you can learn from and implement into your own usability strategy. Remember, usability is all about creating a unique and enlightening user experience. Usability is the name of the game -- isn't it time you started playing?

About the author: As Submit Today's copywriter and editor, Kristy Meghreblian has written online content for many successful companies, including Monster.com. She has successfully combined her excellence in journalism with the delicate art of keyword density as it relates to search engine optimization. As a result, she has helped many Submit Today clients achieve top ranking. Submit Today is a leading search engine optimization, submission and ranking company located in Naples, Florida.

Wednesday, August 13, 2008

How Can Search Engines Help You with Your Business?

Author: Dmitry Antonoff, Irina Ponomareva

What Are Search Engines?

Most of us often face the problem of searching the web. Nowadays, the global network is one of the most important sources of information there is, its main goal being to make information easily accessible. That's where the main problem arises: how to find what you need among all those innumerable terabytes of data. The World Wide Web is overloaded with material related to the diverse interests and activities of the people who inhabit the globe. How can you tell what a site is devoted to without visiting it? Besides, the number of resources grew as quickly as the Internet itself, and many of them closely resembled each other (and still do). This situation necessitated finding a reliable (and at the same time fast) way to simplify the search process; otherwise there would be absolutely no point to the World Wide Web. So, the development and deployment of the first search engines closely followed the birth of the World Wide Web.*

How It All Began

At the start, search engines developed quite rapidly. The "grandfather" of all modern search engines was Archie, launched in 1990, the creation of Alan Emtage, a student at McGill University, Montreal. Three years later, the University of Nevada System Computing Services deployed Veronica. These search engines created databases and collected information on the files existing in the global network. But they were soon overwhelmed by the fast growth of the net, and others stepped forward. World Wide Web Wanderer was the first automated Internet robot, whereas ALIWEB, launched in the autumn of 1993, was the first rough model of a modern web directory, filled up by site owners or editors. At about the same time, the first 'spiders' appeared: JumpStation, World Wide Web Worm, and Repository-Based Software Engineering,** starting the new era of World Wide Web search. Google and Yahoo are two of their better-known descendants. http://galaxy.com/info/history2.html

Search Engines Today

Modern web searchers are divided into two main groups:

• search engines, and
• directories.

Search engines automatically 'crawl' web pages (by following hyperlinks) and store copies of them in an index, so that they can generate a list of resources according to users' requests (see 'How Search Engines Work', below). Directories are compiled by site owners or directory editors (in other words, humans) according to categories. In truth, most modern web searchers combine the two systems to produce their results.

How Search Engines Work

All search engines consist of three main parts:

• the spider (or worm);
• the index; and
• the search algorithm.

The first of these, the spider (or worm), continuously 'crawls' web space, following links that lead both within the limits of a website and to completely different websites. A spider 'reads' all pages' content and passes the data to the index.

The index is the second part of a search engine. It is a storage area for spidered web pages and can be of a huge magnitude (Google's index, for example, is said to consist of three billion pages).

The third part of a search engine system is the most sophisticated: the search algorithm, a very complicated mechanism that sorts an immense database within a few seconds and produces the results list. Looking like a web page (or, most often, lots of pages), it contains links to resources that match users' queries (i.e., relevant resources). The most relevant ones (as the search engine sees it) are nearer the top of the list. They are the ones most likely to be clicked by the user of the search engine. A site owner should therefore take heed of the site's relevancy to the keywords it is expected will be used to find it. http://www.searchenginewatch.com/webmasters/article.php/2168031
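The spider-and-index machinery just described can be illustrated in a few dozen lines. A minimal sketch in Python; the breadth-first strategy, the page cap, and the bare-bones parsing are illustrative simplifications, not how any production engine works:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects href targets from <a> tags: the links a spider follows."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def spider(start_url, max_pages=10):
    """Breadth-first crawl: fetch a page, store it in the 'index',
    queue its links, repeat. Real spiders also respect robots.txt,
    deduplicate aggressively, and parse far more robustly."""
    index, queue, seen = {}, deque([start_url]), {start_url}
    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # unreachable page: skip it
        index[url] = html  # the stored copy the search algorithm will query
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return index
```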
A relevancy calculation algorithm is unique to every search engine, and is a trade secret, kept hidden from the public. However, there are some common principles, which are discussed below. http://www.searchenginewatch.com/webmasters/article.php/2167961

What to Do to Have Your Web Site Found through Search Engines

There are some simple rules to make your resource relevant enough to be ranked in the top 10 by the majority of search engines.

Rule 1: Work on the body copy

A search engine determines the topic of your site judging by the textual information (or content) of every page. Of course, it cannot comprehend the content the way humans do, but this is not critical. It is much more important to include keywords, which are found and compared with users' queries by the programme. The more often you use targeted keywords, the better your page will be ranked when a search on those keywords is made. You can increase the relevancy of your targeted keywords still more if you include them in the HTML title of your page (the <title> tag), in subheaders (the <h1>-<h6> tags), in hyperlinks (the <a> tag), or just emphasize them with bold font (the <b> or <strong> tags). The 'keywords' and 'description' meta tags were introduced specifically to help search engines. Unfortunately, they are rapidly losing their significance because they are too easy to abuse. Webmasters should therefore concentrate mainly on body copy, which is the textual content placed between the <body> and </body> tags.

One should also take into account that search engines' algorithms are constantly improving and that index databases are updated. When you have acquired the desired position in the listings, do not rest on your laurels. Site optimisation should become a permanent job for all site owners who regard web presence as an important part of their business. http://searchenginewatch.com/webmasters/article.php/2168021

Rule 2: Build links to your site

As we have mentioned before, a spider scans the web following the links placed by site owners onto their pages in order to inform their visitors of where to find something that might be of interest. So, the greater the number of website owners agreeing to link to your site, the less time will pass before all existing search engines find out about you. What's more, pages that are linked from multiple sites are considered by crawlers to be more important. Google (http://www.google.com/) implements this concept via its so-called PageRank; other engines analyse your site's popularity in different ways. Remember that a link from a site that itself ranks well is much more valuable than just any link. Also note that content relevancy of the site linking to you further increases the importance of the link.

Rule 3: Play fair

Do not get involved in unfair games with search engines. If you feel that your method is obviously deceptive, do not use it. Here are just some of the widespread methods used by unscrupulous webmasters.

Spam. Let's assume that a site owner wishes to make a page very relevant to a certain key phrase. The most obvious course to take is to include this phrase in the page copy as many times as possible. When it starts looking unnatural (that is, when the keyword density value becomes excessive), it will be regarded as a kind of spam (so-called keyword stuffing). Such a page looks odd both to human visitors and to search engines. Consequently, a user will hardly wish to return to the page after having visited it just once, and search engines are likely to penalise spam by reducing the page's ranking.
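The keyword density measure just mentioned is simple to compute. A rough sketch (the sample copy is invented, and search engines publish no exact threshold, so the 'reads as spam' judgement is illustrative):

```python
import re

def keyword_density(text, phrase):
    """Occurrences of the phrase, counted in words, as a share of all words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    hits = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    return hits * len(phrase_words) / len(words) if words else 0.0

copy = ("Cheap widgets from the widget experts. Our cheap widgets "
        "ship free, because cheap widgets are what we do.")
print(f"{keyword_density(copy, 'cheap widgets'):.0%}")  # 33% -- unnatural, reads as spam
```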
Using colours to hide multiple keywords. Some webmasters, in their vain hope of deceiving search engines, go a step further. They make the part of the body copy that is intended only for search engines invisible (that is, in a colour identical to, or just a shade different from, the background colour), or tiny enough to be indistinguishable (i.e., 1 or 2 pixels high). Modern search engines have become smart enough to detect such tricks, so we wouldn't advise you to use these methods. You might even win for a short time, but you will lose afterwards, because some search engines penalise spammers by excluding their web sites from their databases.

Link farms. Many site owners unite in so-called link farms in order to artificially increase their link popularity value. These are nothing but networks where everyone links to everyone else, concentrating on the quantity of links and disregarding their quality. Their efficiency is very low. First, a page can deliver just a small part of its value to every page it links to within the farm; if it contains too many links, this part will be worthless. Second, a page that contains links, just links, and nothing but links cannot be very authoritative, for quite natural reasons. Besides, modern search engines analyse link quality in terms of web site relevancy, ranking a link highly if it leads to a site devoted to similar issues. So, when you are looking for link exchange partners, choose those whose business is similar to yours. The sites of your partners, or web portals devoted to your business issues, are ideal for this.

Cloaking. This is a widespread technology that aims to deceive search engines. The point is, all known spidering robots, recognised by their IP addresses or host names, are redirected to a page that is specially polished to meet search engines' requirements but is unreadable to a human being. In order to detect cloakers, spiders often come from fake IP addresses and under fictitious names. Also, users' feedback is collected, and if people too often find that a page's real content doesn't match its declared description, the page is reviewed by the search engine's staff and runs the risk of being penalised.

Rule 4: Your site must be interesting

Increasing the number of pages on your site, and the quality of information you place on those pages, increases the probability of getting good links to your pages. Interesting articles, and actual news concerning your business, will attract visitors' attention, and your site will be well known and spoken of. If you gain a good reputation on the Internet, your commercial success will be almost certain, and the site will promote itself.

Good site structure is also very important. If your site is created with basic usability requirements in mind, and is categorised well, users will enjoy visiting it. Every page should be easily accessible from the home page, and plain text links are preferred. Thus, a search engine robot will experience no difficulties whilst spidering your site content, following the links that lead from one page to another.

As you can see, merely having a website or running a company does not guarantee success. Promotion, care for the condition of your web site, brand recognition, popularity and attracting still more clients must be of prime importance. We have introduced you to the majority of tools used for the promotion of your business on the Internet. These tools apply the technologies that facilitate searching for desired resources. Evidently, website owners can be discouraged by the multiplicity of web searching algorithms, as this demands search engine optimisation and comprehensive spadework. So if you don't think you can cope with this job, it is probably worth engaging a qualified Internet promoter or an Internet promotion company in order to gain good results at affordable cost. You will then stand high in directories and search engine results, and therefore increase traffic and the number of potential clients your business has access to.

________________________________________ * Before the HTTP protocol was invented (around 1989-1991), the Internet was just a huge network of FTP servers, used as a means of file exchange. There were no websites at all. The first search engines mentioned in this article ran via FTP and similar protocols. Only after Tim Berners-Lee created the HTTP protocol did we get the World Wide Web, and the Internet acquired its actual shape. http://gsdtc.org/labhist.htm http://ieee.cincinnati.fuse.net/reiman/03_2000.html ** The first search robots that supported HTTP.

About the author: Dmitry Antonoff, 28. I've been with Magic Web Solutions Ltd. (UK), Dartford, Kent, as a marketer and an SEO consultant since May 2003. I specialise in website promotion and copywriting. I'm eager to share my experience with the Internet community. Irina Ponomareva, 32. I joined Magic Web Solutions Ltd. (UK), Dartford, Kent, in March 2003. I've been working as a web master, a developer, and an SEO specialist ever since.

Tuesday, August 12, 2008

SEO the Secret Weapon in the E-Commerce Wars

Author: Dylan Downhill

The Opportunity

It has been well documented that consumers use search engines to research and buy products online. They read reviews and descriptions, analyze ratings, and research pricing as they compare products and vendors. A successful e-commerce site must offer a positive customer experience and build trust. However, in the nitty-gritty world of online retail sales, probably the most important success factors are the accessibility of the product and price.

It follows then that high rankings in the search engines are essential to the success of an e-commerce site, because high rankings make the products more accessible to the online shoppers. The site should rank highly not just for the general keywords that describe the business, but, ideally, all of the words that describe the products sold on the site.

The Challenge

There are two primary methods of achieving high rankings in search engines. One is to optimize your website naturally, i.e. organically, so that search engines rank it highly for your important terms on their search results pages. The second is to buy Pay-Per-Click (PPC) ads that appear on search results pages. Which method drives the best traffic for online retailers? According to a recent study conducted by Jupiter Research and presented at the Search Engine Strategies New York Conference in March 2005, searchers give "6 out of 7 clicks" to the organic listings as opposed to Pay-Per-Click listings.

Clearly, high natural listings in the search engines are key for a successful e-commerce site.

In spite of this fact, many online retailers still focus exclusively on PPC advertising to drive traffic to their sites. The explosion in the popularity of PPC advertising for e-commerce sites is likely due to a variety of factors, including the ease of setting up campaigns, competitive peer-pressure, and the ease of budget approvals based on the lure of quicker returns on ad spend. The rewards of organic site optimization, although greater than PPC, appear over a longer period of time.

For an e-commerce business, a PPC-only strategy is a short-sighted solution that has known pitfalls, too. As competition increases, PPC ad prices also increase, which erodes profit margins and forces the e-commerce business owner to increase their PPC budgets just to keep up.

Many experienced SEO practitioners have problems optimizing for online retail sites because they lack the necessary programming skills to optimize dynamic pages from shopping cart software. Un-optimized dynamic pages are often invisible to search engines. To compound the problem there are hundreds of different types of shopping cart software, each of which has different optimization requirements.

The Solution

To successfully optimize an E-Commerce site for search engines requires specialized techniques and processes that are above and beyond regular optimizations performed for non-commerce sites. These include:

* Development of web server output filters that enable title and meta tag reconfiguration even when the shopping cart application doesn't allow it. These filters can be applied to PHP and IIS with amazing results (a sketch of the idea follows this list).

* Implementing sitemaps that allow search engines to index the entire site, even when cookies are required to access certain parts of the site.

* Optimizing product and page titles and descriptions to match the phrases that consumers actually search on. For example, an online store in Thailand was selling Kaw Kwy statues. Research found that this phrase was not a popular search term; further investigation turned up a more common search term for Kaw Kwy: 'Fake Ivory'. After changing the product names, titles, and descriptions, the client started being found for his products and selling them.

* Develop product feeds for the major shopping engines such as Froogle, Yahoo Shopping, and BizRate to ensure maximum exposure for the products at prices far below current PPC ad prices (a sketch of a simple feed follows this list).

* Undertake usability research and analysis so you know how people are really using the site and where they are abandoning the site or your shopping cart.

* Implement analytics to understand traffic, reporting and visitor conversion patterns.

* Implementation of other SEO tools such as paid inclusion (including Yahoo's SiteMatch) to increase the frequency of spidering if there is a constant stream of new products to your E-Commerce site.
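The output-filter idea in the first bullet can be sketched in any server stack. The article's examples targeted PHP and IIS; the same pattern reads naturally as Python WSGI middleware, which is a swapped-in illustration, and everything below, from the class name to the title map, is an assumption:

```python
import re

class TitleRewriteMiddleware:
    """Buffers the application's HTML output and swaps in an optimized
    <title> before it reaches the client -- useful when the shopping
    cart itself offers no way to set titles per page."""

    def __init__(self, app, titles):
        self.app = app
        self.titles = titles  # maps request path -> optimized title text

    def __call__(self, environ, start_response):
        captured = {}

        def capture(status, headers, exc_info=None):
            captured["status"], captured["headers"] = status, headers

        # Collect the wrapped application's full response body.
        body = b"".join(self.app(environ, capture))
        new_title = self.titles.get(environ.get("PATH_INFO", "/"))
        if new_title:
            body = re.sub(
                rb"<title>.*?</title>",
                b"<title>" + new_title.encode("utf-8") + b"</title>",
                body, count=1, flags=re.IGNORECASE | re.DOTALL,
            )
        # Recompute Content-Length since the body may have changed size.
        headers = [(k, v) for k, v in captured["headers"]
                   if k.lower() != "content-length"]
        headers.append(("Content-Length", str(len(body))))
        start_response(captured["status"], headers)
        return [body]
```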
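The product feeds in the fourth bullet were, in that era, typically plain tab-delimited files. A minimal sketch of building one; the field names and the sample product are assumptions, since each engine published its own required columns:

```python
import csv
import io

# Hypothetical catalogue rows.
products = [
    {"product_url": "http://www.example.com/fake-ivory-elephant.html",
     "name": "Fake Ivory Elephant Statue",
     "description": "Hand-carved fake ivory elephant, 20 cm tall.",
     "price": "49.95"},
]

def build_feed(products):
    # Tab-delimited text with a header row, one product per line.
    out = io.StringIO()
    writer = csv.DictWriter(
        out,
        fieldnames=["product_url", "name", "description", "price"],
        delimiter="\t",
    )
    writer.writeheader()
    writer.writerows(products)
    return out.getvalue()

print(build_feed(products))
```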

The Results

Natural optimization for an E-Commerce site can have a truly profound impact on the performance of the website and the profitability of the business.

A major supplier of car and truck accessories increased visitors and profits while reducing their PPC ad spending by $6,000 per month. They achieved these results in less than three months by applying web and e-commerce analytics to an existing paid placement campaign, and then applying what they learned from the analytics to their e-commerce search engine optimization strategy. After six months, their PPC-sourced revenue and their organic-sourced revenue were equal, and their overall advertising spend had decreased by 30%.

A distributor of rugs and carpets, who had previously worked with two other "regular" SEO firms with only minimal success, implemented these e-commerce optimization techniques and increased revenues 200% after only three months!

Conclusion

As PPC ad prices continue to rise, savvy online retailers and E-Commerce site owners need to take a fresh look at how much true success they actually gain from their PPC campaigns, and evaluate (or re-evaluate) the possible gains from integrating organic search optimization, too.

Optimizing an E-commerce website is different from optimizing a content-only site. While there are many SEO consulting firms all over the world that can work wonders with regular sites, there are only a few SEO practitioners who have expertise in e-commerce optimization. Be aware of the difference when selecting a firm to work on your e-commerce site.

Taking steps now to optimize your E-commerce site correctly for natural search engine rankings can increase your profits and help you gain a competitive advantage over the many online retailers who are still using only PPC campaigns.

About the author: Dylan Downhill is the CIO and Technical Director and James Peggie is the Director of Marketing for Elixir Systems - a search marketing agency located in Scottsdale, Arizona. www.elixirsystems.com