The Rules of SEO

There are four major problems with using a fixed set of search engine optimization rules.

Change - The rules are always changing. Search engine technology is constantly evolving, so what works today might be a complete no-no tomorrow. Equally, optimization techniques that are a complete waste of time this week might get fabulous results this time next year.

The Rules are a Secret that Only the Search Engine Staff Know - Outside of the search engines, nobody really knows exactly what the rules are, and nobody at the search engines will share this classified information.

The search engines guard this information so closely because, if everyone knew what the rules were, everyone would make sure that their sites followed the rules perfectly. That would, of course, mean the rules no longer worked, as there would be no way of differentiating between the best and worst sites.

Rules Vary from Search Engine to Search Engine - For example, one search engine might have a rule that says the search term must appear at least once in the title of the Web page, whereas another might give priority to a Web page that has the search term twice in the page title.

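To illustrate the kind of rule involved, the 'title' in question is the HTML <title> element that sits in the head of every Web page. The wording below is invented purely for the example:

    <head>
        <!-- Example only: the keyword phrase "Caribbean holiday" appears once in the title -->
        <title>Caribbean Holiday Guide - Beaches, Resorts and Travel Tips</title>
    </head>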

Other Factors - Lastly, it is not simply a matter of mentioning your keyword exactly the right number of times in your page title and body copy. Today's search engines are very sophisticated and rely on a whole host of factors to decide which site ranks best and which ends up at the bottom of the pile.

For example, over recent years there has been a growth in the importance of links. For a while, the more links you had coming into your site the better. Then things became more sophisticated still: it wasn't just the link that mattered, but also where the link came from (i.e. which site) and what text was used on the link itself.

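As an illustration, the 'text used on the link itself' (usually called the anchor text) is the visible, clickable wording between the link tags. The Web address below is a made-up placeholder:

    <!-- The anchor text is "Caribbean holiday specialists", not the Web address itself -->
    <a href="http://www.example.com/">Caribbean holiday specialists</a>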

Something else that has gained prominence in recent years is the proximity of the keyword to the incoming link, and new techniques for improving the quality of search engine results are being introduced constantly. An incoming link that sits an appropriate distance from the keyword in question will improve ranking more than one that is nowhere near it. Here's an example of good proximity; in the sentence below, the word 'Sandals' is the link.

Keyword searched for: Caribbean holiday

For the ultimate romantic Caribbean holiday, make sure that the only name on your lips is Sandals. With the very best beaches and everything from gourmet dining to unlimited scuba diving it's perfect for a man and a woman in love.

The reason this would be a good link to have coming into your site (assuming your site was to do with Caribbean holidays) is that the link (i.e. Sandals) is in close proximity to the keyword phrase you are trying to optimize for, i.e. Caribbean holiday.

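In HTML terms, the Sandals sentence above would look something like this (the Web address is a made-up placeholder):

    <!-- The link ("Sandals") appears in the same sentence as the keyword phrase "Caribbean holiday" -->
    <p>For the ultimate romantic Caribbean holiday, make sure that the only
    name on your lips is <a href="http://www.example.com/">Sandals</a>.</p>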

Is there any point in optimizing my pages for the search engines?

So, where does this leave us when it comes to optimizing Web sites? Are these four problems insurmountable? Should we just accept that optimizing for the search engines is too difficult and focus on getting on with building our Web sites instead?

Well, the answer is yes and no.

Yes, you should definitely make an effort to maximize your chances of getting a good search engine ranking, but you shouldn't become so obsessed with optimizing your site that you start to suffer from 'paralysis through analysis'.

It is quite common to find webmasters who have spent so much time trying to optimize their existing pages for the search engines that they haven't actually created any new pages or improved the site in any way for weeks, months or even years. Some webmasters will even mess up an existing design in the vain hope that it will help with page ranking, to the point where the page looks so bad that any visitor would quickly move on, even if they did stumble across the site on one of the search engines.

The problem is that trying to optimize Web pages in the short term with clever tricks is a futile exercise, destined to end in failure for the simple reason that even if you achieve success tomorrow or next week, there is no guarantee that your top ranking will still be there the following week.

Keep it natural

As the years of SEO have ticked by, search engines like Google have continually refined their algorithms to ensure that they return the best possible results to their users. In doing so, their assessment of what constitutes an appropriate match of content to search has come closer and closer to a "real person" appraisal. In practical terms, this means that as Google (and the other major search engines) get better and better at filtering out over-optimized content in favor of more naturally matched content, those sites that are 'genuinely' a close fit to the search will feature more regularly at the top of the rankings.

This is of course a fairly simplistic view (and doesn't take account of practices such as link-building, etc.) but it does give an indication of the sort of approach you may want to adopt. Keep your optimization sensible and 'natural'. Focus more on creating a great quality site that is wholly relevant to the visitor and you won't go far wrong.

More information about taking a 'long-term approach' is contained in the following section.
