Introduction – in simple terms, SEO is the process of improving the number of visitors to a website via search engines. By optimising your website around the specific key phrases used by your target customers, it is possible for search engines to rank your website more highly than similar competitive sites (that are not optimised). SEO should be viewed as a component of your overall professional internet marketing strategy and used ethically to improve the quality of your visitors' experience, in accordance with search engine guidelines and standards. The first step is to understand how search engines work….
Search Engine Basics – a search engine is a website that allows anybody to enter a search query and retrieve information from billions of web pages, files, videos, images and music files. Most people have heard of Google, Yahoo and MSN, but there are also literally hundreds of less well known, specialist search engines providing similar services. When you visit a search engine, search results are traditionally displayed as blue links with a short description of each website, and relate directly to the user's search query. Search engines evolved from large directory projects such as DMOZ and the Yahoo Business directory. In the early to mid 1990s, search engines started using crawling technology to trawl the ever increasing number of websites being developed. Today, search engine results from Google, Yahoo and MSN also appear in minor search engines such as AOL. Around 80% of people find information on the Internet via a search engine, because search engines are easy to use, flexible and provide highly relevant links to the Internet.
How Do Search Engines Work? – search engines use automated mathematical algorithms to rank and compare web pages of similar content. The algorithms are highly complex and rely on search bots continually trawling the Internet to copy, or 'cache', every webpage they visit. Search bots automatically look for specific information when visiting a web site, such as the robots.txt file, the sitemap.xml file and WHOIS data. They do this to find new content quickly and ensure the listings they present to users are up to date and relevant. The data is stored by the search engine company in huge server data centres. The exact mathematical formulae of each search algorithm are jealously guarded by the search engines, so only analysis of historical data can be used to make some general assumptions about how their rankings work. In addition, each engine publishes webmaster guidelines giving general guidance about how to create a quality site and avoid techniques that may get a website banned from its listings by its moderators.
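The robots.txt lookup described above can be illustrated with Python's standard library, which ships a parser for exactly this file. The file contents and the example.com URLs below are hypothetical; a real search bot would fetch /robots.txt from the live site before crawling:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt for a hypothetical site: it blocks one
# directory and advertises the sitemap location to search bots.
robots_lines = [
    "User-agent: *",
    "Disallow: /private/",
    "Sitemap: https://www.example.com/sitemap.xml",
]

parser = RobotFileParser()
parser.parse(robots_lines)  # a real bot would fetch and parse /robots.txt

# The bot may crawl public pages but must skip the disallowed directory.
print(parser.can_fetch("*", "https://www.example.com/index.html"))        # True
print(parser.can_fetch("*", "https://www.example.com/private/page.html")) # False
```

This is also why a well-formed robots.txt and sitemap.xml are worth setting up early: they are the first things a visiting bot asks for.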
How Do Search Engines Present Relevant Results? – historically, the primary factor search engines used to rank web sites was the number of links a website had from other websites. These are known as inbound links. As search engines grew more popular, link farms developed to try and manipulate the results; to combat this, the algorithms became more sophisticated. Today, links alone are less important and instead the textual relevancy of the words, paragraphs, pages and the entire theme of a website is critical to achieving high search engine rankings. Search engines employ advanced anti-spam factors to ensure that users are presented with the most relevant, quality results possible for their search. More recently, search engines have been diversifying into different means of search, such as image, video, universal and local search, and product and price comparison, as well as developing free online applications such as calendars, spreadsheets and word processors.
Key Phrase Analysis & Selection – the next step is to identify the keywords related to your product or service that your target prospects are typing into search engines. Only then can you begin to effectively strategise, design and optimise a website around your buyers' needs and wants. Key phrase selection is the first and most important step in internet marketing. Why is this relevant?… Search engines use mathematical algorithms to compare web pages in order to rank those pages (based on a user's search query). If you make incorrect assumptions (without researching) and target key phrases that don't interest buyers, your website will fail. Conversely, if you target the right combination of keywords and phrases (before you even design your website), you will maximise your chances of higher search rankings and create an opportunity to sell. The bigger the market for a particular product or service, the more competitive the online marketplace is for the related search terms. For instance, a quick check in Google reveals there are approximately 3.44 million search engine results for the term 'mortgage'… yet only 0.217 million results for the phrase 'discounted commercial mortgage quote'. In other words, the former is roughly 16 times more competitive in achieving a top search engine position than the latter. By using keyword selection tools, advertisers can identify which search terms are not only popular but also how competitive they are. For instance, there are approximately 37.2 million people typing 'mortgage' into all global search engines per year, yet only 0.8 million people typing in 'commercial mortgage'. Keyword tools are invaluable in identifying a range of niche search terms that can be used to help optimise a website to achieve higher search engine rankings and more website visitors.
These tools can also produce derivatives and synonyms, common spelling mistakes, and comparative competitiveness indices to show whether top search listings for a particular phrase are hard or easy to achieve.
Once you have used your market knowledge and keyword tools to validate the search volumes of phrases, make a list, ranked by search volume, of your top 10 phrases. Invariably there are derivatives of your top ten target phrases. For instance, if your primary target term is 'mortgage quote uk', you might also identify 'uk fixed rate mortgages' and 'mortgage broker uk' as secondary phrases. Your prospects will type in hundreds of similar search queries to find a particular product or service. By creating optimised web pages with relevant content (that include these phrases), you maximise your chances of achieving increased search engine traffic. You will need to continually update the list, check your site logs and statistics, and test using keyword tools. From this feedback, by checking which search phrases and entry pages were used to enter your site over time, you can easily see how your optimisation efforts are going. Key phrase selection should literally dictate the choice of domain name for new sites, website design, navigational structure, unique selling points and linking strategy. Always add a couple of optional fields to your Contact Form to get feedback from your website visitors: 'which search engine did you use to find our site?' and 'what search term did you use?'
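The ranking exercise above can be sketched in Python. The figures below are partly taken from the mortgage example earlier and partly invented for illustration; you would substitute your own keyword-tool data. A crude searches-per-result index helps flag niche opportunities:

```python
# Hypothetical keyword research data: (phrase, annual searches, competing results).
# The 'mortgage' figures echo the example above; the rest are illustrative only.
phrases = [
    ("commercial mortgage", 800_000, 520_000),
    ("mortgage", 37_200_000, 3_440_000),
    ("discounted commercial mortgage quote", 12_000, 217_000),
]

# Rank by search volume, highest first, to build the 'top phrases' list.
ranked = sorted(phrases, key=lambda p: p[1], reverse=True)

for phrase, searches, results in ranked:
    # A crude opportunity index: more searches per competing page is better.
    index = searches / results
    print(f"{phrase}: {searches:,} searches, {results:,} results, index {index:.2f}")
```

The same table, re-sorted by the index rather than raw volume, surfaces the less competitive derivative phrases worth building pages around.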
Competitor Analysis – assuming you have a product or service with some unique selling points, you need to analyse your online competition so you can present your USPs effectively. From an internet point of view, you need to analyse what other websites have achieved relative to your own. The first step is to identify who they are and make a list. This is simple enough: enter all of your primary and secondary key phrases into the major search engines and build up a list. Similar key phrases usually bring up the same websites, and you will very quickly understand who you need to knock off the top of the search engine results to succeed. Updating this list on a regular basis is as important as the initial analysis. By obtaining feedback from search results you can constantly re-analyse competitors in terms of additional links or content they have added, or analyse how they have restructured and reorganised their websites to make them more search engine friendly. Review each site carefully to analyse your competitors on the following basis:-
Keyword / Keyphrase Density Analysis – make a list of your competitors' keywords (as per above) to validate your own analysis.
Pagerank Checkers – seochat provides a useful tool to look up the PageRank of multiple sites, which helps if you have a large number of competitors.
Supplemental Page Checker – use tools to check the proportion of 'less important, less highly ranked' pages of a website in Google's 'Supplemental Index', versus its main index.
Search Engine Exposure – rather than visiting each search engine individually, you can visit websites like netconcepts, which provide tools to measure how many internal pages a site has been able to get cached in each major search engine.
Whois & Contact Forms – the WHOIS databases will allow you to match website owners with real world businesses that are (perhaps) using multiple sites to boost sales possibilities.
Site Age – by using archive.org's Wayback Machine you can see copies of your competitors' old pages, to see how they originally started years ago and then changed and improved their sites over time.
Quote Checker & Mystery Shopper – if your competitors run an online quote system it is very easy to compare your product price level against theirs. If they only use a contact form and telephone call back, you could pretend to be a prospect requiring a quote.
Links Analysis – link building is a never ending process (because your competitors won't stop). As the number of pages on your site increases you will find that sites (and in particular directories) will begin adding links to your site without your knowledge. To help achieve top rankings, you can estimate how many quality links you need by looking at the number and type of links and 'backlinks' your top 5 competitor sites have. You can also use link checkers to see how many inbound links competitors have amassed since their inception, by using tools such as xxx or xxxx. This gives you a general target for link volume. Alternatively, you can find this type of information out manually, direct from search engines like Google, by entering a search query such as: link:your-competitor.com
Choosing a Domain Name – this section summarises the issues faced when deciding what to call your new website and how to manage the domain name….
Where to Buy a Domain Name? – if you haven't already chosen the domain name for your new website, a good place to start is NetworkSolutions. This is the largest registry of domain names on the Internet and provides a service to check domain name availability. It is a reputable company that allows you to change an existing domain's name server (DNS) settings freely and ensures your privacy is protected. Some domain name resellers have used sharp practice to make it difficult for you to move your domain name between different hosting companies in the future. They have achieved this using financial penalties, technical over-complexity, call centre queues and, in some cases, contractual small print (under which you could even lose your legal right to own the domain name). In most situations, resellers of domain names are also hosting companies whose primary interest is to sell you more hosting space linked to the domain name itself.
Choosing The Right Domain Name? – it is preferable to choose a name which incorporates the primary search term or phrase you are trying to achieve high rankings for. Your new domain name has to reflect what you are selling or your company vision; there should be no confusion in the mind of the user. In particular, if it is the type of site where you want repeat business, the site name has to be easy to remember and spell – so the shorter the better. However, virtually all single word and popular commercial key phrase expressions have already been registered by other people and domain name squatters, so choosing a domain name is always difficult and may involve bidding for a domain on the second user market. If this is the case (just like buying a second hand car), it is important to check the history of the domain name in terms of what it was used for in the past and whether it has been banned by search engines like Google. The last thing you want is to buy a domain name from the second user market, only to find Google banned it in the past for not conforming to search engine guidelines. It is also sensible to look at the country extensions of domains to check whether the name you are seeking has already been registered and made live in other countries. People sometimes assume that all websites are a '.com'; therefore, if you can find a name where you can register the '.com' as well as your local country extension, you will avoid confusion. Search engines also use the country extension of a domain name in their ranking algorithms to generate search results. As search engines have various country orientated search websites, you must ensure your domain name reflects your country (particularly if your target market lives in one geography only).
Domain Names & Search Engines – there is not usually a valid reason why you need dozens of domain names in order to sell the same product or service, so don't register them… these are known as splash sites or doorway pages, and search engines dislike them and view them as a form of spamming. If the content of each website is unique and completely different, then separating sites by topic or localised geographic content makes perfect sense; just remember it takes twice as long to promote, build and manage twice as many sites. Search engines' 'search bots' also automatically interrogate the WHOIS database to identify the owner of a website. Therefore, when you register your domain name, it is sensible to reveal your contact details in the WHOIS database if you have nothing to hide.
Hosting Decisions – hosting of a web site is almost always overlooked by beginners. There are thousands of hosting companies offering very cheap shared hosting space, and the temptation is to go for the easiest, quickest and cheapest option. Sometimes this attitude can destroy a web business. The general rule for choosing your hosting company is that you get what you pay for. Quality hosting is critical to ensure your target market can actually access your site without it falling over every other day. Before 'pointing' your DNS settings to a new shared hosting package, you must think through your technical requirements, such as operating system support for databases, online applications and scripts, statistics, email access and the level of telephone support offered by the hosting company. To learn more about the technical issues surrounding hosting, please visit our hosting section.
Writing Compelling Website Content – one of the most important ranking factors search engines employ is an analysis of the quality and relevance (and, to a lesser extent, the sheer quantity) of the text on any given webpage and across the website as a whole. We would recommend not over-analysing a web page with software tools. Instead, put yourself in the mind of your prospective user and invest time in creating interesting, informative, exciting and, most importantly, original content. Be obvious: you must tell users what you think they want to hear (based on your market research), as a user's attention span when they hit your home page lasts for about 4 seconds before they click back in their browser and go on to the next site in the list. Therefore, use bullet points and simple sales messages or offers to summarise your unique selling points. Remember, most web users skim websites until they decide they have landed on the site they want to learn more from. Make sure you are also original – write in such a way that you are not copying (or rewording) other web pages. Duplicating content from other websites is a huge no-no, as it attracts duplication penalties from search engines, or may even risk having your site banned altogether. Writing content is the most difficult area of search engine optimisation and web design; to find out more, visit our section on writing compelling content. Quantity of word count is also relevant; there is no right or wrong answer for the optimum number of words on a webpage. By analysing existing competitor web pages you'll probably find that the average ranges between 600 and 1,000 words per page – any more and the search spider may view the page as too large, any fewer and it may view the page as having too little content to bother with.
Keyword Density Analysis – when targeting a certain keyword or keyphrase, it is essential that it appears within the text of the page at least once. If you insert the keyword or phrase too many times, you run the risk that search engines view it as 'keyword stuffing' (which is a form of spamming). As a rule of thumb, no more than three mentions per page is a safe amount, although in practice this depends on the volume of text on the page itself – the more text (words) on a page, the more times you can justifiably make reference to your key phrases. By using any number of free online tools to measure the exact percentage of the page's words that a keyphrase accounts for, you can make simple judgements about whether or not the page is correctly optimised. This simple principle is known as keyword density. By analysing the keyword density of the home pages of your top 10 competitors you can establish quite quickly what percentage of each home page their keyphrase accounts for. The lower and upper limits of this test give you a safe range for your own home page in terms of the number of times your primary keyphrase appears relative to the rest of the page. There is no right or wrong answer; the simplest solution is to write compelling, informative and interesting information for your web site visitors. It is important to place your keyphrase as close as you can to the top of the page. Robots read text from left to right, starting at the top left-hand corner of the screen and working along the rows. By including your keyphrase near the top (as opposed to near the bottom of the page) you attach a higher keyword prominence to that phrase (relative to words that appear near the bottom of the page). To find out more, visit our section on keyword density.
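A density check like the online tools mentioned above can be sketched in a few lines of Python. The sample page text is invented for illustration; the function counts how often a multi-word phrase occurs and expresses those words as a percentage of the page's total word count:

```python
import re

def keyword_density(text, phrase):
    """Percentage of the page's words accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count occurrences of the phrase as a sliding window over the word list.
    hits = sum(
        words[i:i + n] == phrase_words
        for i in range(len(words) - n + 1)
    )
    return 100.0 * hits * n / len(words) if words else 0.0

# An invented page snippet; run the same check over competitors' home pages.
page = (
    "Compare commercial mortgage quotes online. "
    "Our commercial mortgage brokers find competitive rates fast."
)
print(round(keyword_density(page, "commercial mortgage"), 1))  # 30.8
```

A figure that high on a real page would look like keyword stuffing; the point of the exercise is to compare your page's figure against the range your top competitors fall into.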
Website Design & Usability Good Practices – you will need to consider a range of web design issues in the context of implementing your online marketing strategy. To find out more about good design, visit our section on website design. Highlights:-
Understanding of HTML – HTML is a computer language that facilitates the creation of internet web pages; an HTML page is essentially a text file with a series of short defining codes around the text.
Designing a Compelling Homepage – this is the most important page within the website, as most users will land on it rather than on the internal pages of your website.
Simple Navigation – the most important aspect of usability is a clean and simple navigational structure. Your structure must be consistent across all pages so users have a clear understanding of how to find their way between sections and back again.
Simple Page Structure – try and split the areas of your page into a top border, side border, bottom border and main body.
Accessibility – the Internet is developing standards, such as those of the W3C, for browser accessibility.
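By way of illustration only, the borders and main body described in the highlights above map onto a simple HTML skeleton such as the following (the section names and contents are placeholders):

```html
<body>
  <div id="top-border">Logo, strapline and top navigation</div>
  <div id="side-border">Section links and special offers</div>
  <div id="main-body">Main page content and sales messages</div>
  <div id="bottom-border">Contact details and footer links</div>
</body>
```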
Implement a Linking Strategy – having an effective linking strategy is a critical part of search engine optimisation. Search engines measure 'inbound links' by sheer volume (link popularity), link quality (based on the webpage and website your link is posted from), link type and the frequency with which links are added over time. A website with no inbound links will never rank higher than one with lots of links from other good quality sites. Implementing a reciprocal linking programme requires patience, belief, tenacity and a very good understanding of the factors search engines use to rank web site links. The existing 'quality' of your own website will dictate how you implement a linking strategy, which is especially important from the outset when launching new websites.
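Part of running a reciprocal linking programme is auditing whether partners are actually linking back. As a minimal sketch using Python's standard library HTML parser, you can extract the anchors from a saved copy of a partner's links page (the page snippet and URLs here are hypothetical):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags, e.g. to verify a reciprocal link."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A saved snippet of a hypothetical partner's links page.
html = (
    '<p>Resources: <a href="https://www.example.com/">Example Mortgages</a> '
    'and <a href="https://www.other-site.co.uk/">Another Site</a></p>'
)

extractor = LinkExtractor()
extractor.feed(html)
print("https://www.example.com/" in extractor.links)  # True: the link is in place
```

Run over a list of saved partner pages, this quickly shows which reciprocal agreements are still being honoured.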
The Role of Meta Tags – a metatag is a hidden HTML element which helps search engines describe the contents of a web page; search engines use these snippets of information to present a listing in the results to users. Writing metatags is a mandatory first step for webmasters when designing a web page: without them, your webpage will not get cached properly. Historically, metatags were an important element of search engine optimisation; however, today's search engine algorithms concentrate more on the content and popularity of a site across the net, and metatags are only really useful in helping search bots create snippets of information for their listings. To find out more, visit our section on metatags, which explains what each tag means and its impact on page ranking. Different search engines apply different importance to metatags; for instance, Google has stated it ignores the keywords metatag as a ranking factor. Never 'stuff' your targeted key words into the metatag section, as search engines view this as a form of spamming and will act appropriately by banning your site. The golden rule is that your metatags should help the user understand your webpage.
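For illustration, the title and the description and keywords metatags sit inside the page's head element; the site name and phrases below are hypothetical, reusing the mortgage example from earlier:

```html
<head>
  <title>Commercial Mortgage Quotes | Example Mortgages</title>
  <meta name="description"
        content="Compare discounted commercial mortgage quotes from UK brokers.">
  <meta name="keywords"
        content="commercial mortgage, mortgage quote uk, mortgage broker uk">
</head>
```

The description here is the text a search engine may show under your blue link, which is why it should read as a sentence aimed at the user, not a list of keywords.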
Statistics, Logs, Web Analytics – feedback on your page optimisation is critical, and statistics are the key way to analyse your success. To find out more about the statistical options, please visit our section on website statistics. Website statistics can provide you with a huge range of information, such as the behaviour of users, lead and sale conversion ratios, which search engine customers used and what search terms they typed into it. Without feedback, your online marketing strategy will fail. Most shared hosting comes with statistical options such as Smartstats, AwStats and Webalizer. Alternatively, there are also online services that provide similar information, such as Google Analytics, Onestat and Statcounter.
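The 'what search term did they use' report in these packages has historically come from the referrer URL, in which search engines passed the query as a URL parameter (commonly 'q'). A minimal sketch with Python's standard library; the referrer string below is invented, and the parameter names are an assumption based on historical engine behaviour:

```python
from urllib.parse import urlparse, parse_qs

def search_term_from_referrer(referrer):
    """Pull the search query out of a search engine referrer URL, if present.

    Major engines have historically used the 'q' parameter, though some have
    used other names such as 'p'; this sketch checks both.
    """
    query = parse_qs(urlparse(referrer).query)
    for param in ("q", "p"):
        if param in query:
            return query[param][0]
    return None

# An illustrative referrer field from a web server log entry.
referrer = "https://www.google.com/search?q=commercial+mortgage+quote&hl=en"
print(search_term_from_referrer(referrer))  # commercial mortgage quote
```

Aggregating these terms across your logs shows which of your target phrases are actually bringing visitors in, closing the feedback loop described above.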