Understanding how search engines index Web sites is crucial to succeeding in search engine optimization (SEO), one of the most effective web site promotion techniques. SEO refers to improvements and changes made to web pages so that they conform to the criteria search engines use to rank and position listings.
Search engines use 'spiders' to locate, index, rank, and list pages. Most indexing spiders can only read text, and they generally have trouble with dynamic pages (ASP, ColdFusion), Flash, multiple redirects, complex tables, and code such as JavaScript.
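As a rough illustration of why script-generated content is invisible to a text-only spider, the sketch below strips a page down to the plain text such a spider would actually see. The class name and sample page are illustrative, not part of any real crawler.

```python
from html.parser import HTMLParser

class TextOnlySpider(HTMLParser):
    """Collects only the plain text a simple indexing spider can read,
    skipping <script>/<style> content that text-only spiders cannot parse."""
    def __init__(self):
        super().__init__()
        self.skipping = False
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skipping = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skipping = False

    def handle_data(self, data):
        if not self.skipping and data.strip():
            self.text.append(data.strip())

page = """<html><head><title>Widgets</title>
<script>document.write('Dynamic offer!');</script></head>
<body><p>Quality widgets for sale.</p></body></html>"""

spider = TextOnlySpider()
spider.feed(page)
print(spider.text)  # the script-generated 'Dynamic offer!' never appears
```

Everything written by JavaScript is absent from the extracted text, which is why keyword-bearing copy needs to live in plain HTML.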
In addition, spiders take into consideration criteria such as keyword frequency, prominence, weight, and proximity, plus keyword placement within the HTML. They favor "content-rich" or thematic text (sufficient to support the primary keyword phrase) about your products/services and use it to determine how relevant the page is to a searcher's request.
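Keyword frequency and density can be approximated with a few lines of code. This is a simplified sketch of the kind of on-page measurement described above; the function name and the sample copy are invented for illustration, and real engines weigh many more signals.

```python
import re

def keyword_stats(text, phrase):
    """Rough keyword frequency and density for a target phrase.
    Density here = words belonging to phrase hits / total words."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = text.lower().count(phrase.lower())
    density = hits * len(phrase.split()) / len(words) if words else 0.0
    return {"frequency": hits, "density": round(density, 3)}

body = ("Our garden tools are durable. Garden tools from Acme "
        "help any gardener keep garden tools sharp.")
print(keyword_stats(body, "garden tools"))  # -> {'frequency': 3, 'density': 0.375}
```

A figure like this only indicates whether a page's copy actually supports its target phrase; chasing a particular density number is not the goal.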
There are many factors to consider when optimizing a web site. Some of them are:
- Head, title, and meta tags (include descriptions, keywords, and copyright info)
- Incorporating body text that supports targeted keywords
- Converting dynamic web pages (ASP, JSP) so that search engine robots/spiders can read them
- Reducing source code without compromising Web site functionality
- Improving/fixing HTML code, broken links, and tags
- Ensuring sufficient keyword concentration and placement on a page
- Rewriting or copywriting to ensure the content supports the keyword phrases
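Several of the items above (title and meta tags, fixing tags) can be checked automatically. The sketch below audits a page for a title and a meta description using only the standard library; the class and function names are illustrative, and a real audit would check far more.

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Minimal check for <title> and meta description,
    two of the on-page elements listed above."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit(html):
    auditor = HeadAudit()
    auditor.feed(html)
    issues = []
    if not auditor.title.strip():
        issues.append("missing <title>")
    if not auditor.description:
        issues.append("missing meta description")
    return issues

print(audit("<html><head><title>Acme Widgets</title></head>"
            "<body></body></html>"))  # flags the missing meta description
```

Running a check like this across a site is one way to catch the tag-level problems the list above describes before a spider does.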