Friday, June 1, 2007

Server is Often Inaccessible to Bots

Ever wonder why your fancy, jazzy, snazzy website doesn’t generate as many hits as you want it to? When you specifically search for your site in a search engine (SE), don’t you wonder why it ends up at the bottom of the rankings, or worse, why the search engine results page (SERP) drops your site altogether? Although your server carries excellent data, don’t you wonder why the SERP still ranks less useful sites higher, or sites that have no connection to the keywords you supplied in the search query?

Stop wondering. When you created your site, you probably forgot to consider a few things. One of these could be your server’s “crawlability”, or its accessibility to bots.

One of the basic aspects of search engine optimization (SEO) is making sure that pages from your server are accessible to SEs. Your homepage and other pertinent pages of your site should be easy to index, especially those which showcase whatever data, product, or service you are promoting. Homepages are especially vital, as these (should) carry the top-level documents and (should) attract the most inbound links.

The navigational structure of your site deserves serious consideration. Important data on separate pages should be only one click away from your homepage; the second most important, two clicks away… and so on. Linking these pages for optimization is best done with generic HTML text links, since the text contained in the anchor can describe the destination page.
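A minimal sketch of such a homepage, assuming a hypothetical widget shop (all file names and anchor text here are invented examples): every important page sits one click away, linked with plain text links whose anchor text describes the destination.

```html
<!-- Homepage links: one click to each important page.
     Descriptive anchor text tells bots what each page is about.
     All page names below are hypothetical. -->
<a href="/products.html">Handmade Widget Catalog</a>
<a href="/ordering.html">How to Order Widgets</a>
<a href="/contact.html">Contact Our Widget Shop</a>
```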

SE bots can only follow generic ‘href’ links. Your fancy dropdown menus, Flash links, JavaScript links, and submit buttons are all inaccessible to bots. Query strings with more than one parameter are also often ignored by bots. Image links are okay, but their ability to describe the destination page is reduced: the alt attribute doesn’t carry as much weight as anchor text.
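To make the contrast concrete, here is a sketch of the same link written several ways (the page names are hypothetical). Only the plain ‘href’ versions are reliably followable:

```html
<!-- A plain 'href' link: bots can follow this. -->
<a href="/products.html">Product Catalog</a>

<!-- Links bots typically cannot follow: -->
<a href="javascript:openPage('products')">Products</a>
<form action="/products"><input type="submit" value="Products"></form>

<!-- An image link is crawlable, but its alt text carries
     less weight than ordinary anchor text: -->
<a href="/products.html"><img src="catalog.gif" alt="Product Catalog"></a>
```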

By categorizing the contents of your website, and structuring them into a hierarchy of pages (ordered by importance) linked from your homepage, you provide more niche key phrases that SE bots can target easily. It also helps to add a sitemap to your website; this is another basic aspect of SEO. A sitemap is a single page that lists links to all the pages within a site, giving SE bots easy access to every page.
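A sitemap page can be as simple as the sketch below (file names are hypothetical); link to it from the homepage so bots find it early:

```html
<!-- sitemap.html: one page listing links to every page on the site. -->
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/products.html">Product Catalog</a></li>
  <li><a href="/ordering.html">Ordering Information</a></li>
  <li><a href="/contact.html">Contact Us</a></li>
</ul>
```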

Often, when SE bots cannot “crawl” the contents of your site, they simply move on to sites they can. Those other websites have the advantage of being crawled first, thereby earning a higher ranking in the SERP.

Another thing to consider is the site itself. If your server is often down, you cannot expect to see your site ranking high in the SERP. We all know what that feels like (clicking through air), and it’s frustrating as hell. SEs are also not keen on sending visitors to sites that are always under construction or are inexcusably difficult to reach. Those sites give both themselves and the search engine a bad reputation.

If your site is inaccessible too often, check your web host: your server may be up and running, but your domain may not resolve. Better yet, find a better host. How much weight this factor carries in Google’s algorithm is disputed by experts; nonetheless, a server that bots cannot reach is considered moderately detrimental to the “crawling” of a site.

Making your site more bot-friendly is an important step in the optimization process. It certainly helps your site gain more hits, more traffic, and a higher ranking.
