Friday, April 1, 2011

Are you crawlable?

Creating great content and making your site accessible to web users and search engine spiders go hand in hand. To get as much of your content indexed and linked to as possible, it needs to be highly visible, easily navigated and clearly structured.

There are a number of common ‘build’ obstacles that can affect how easily your site can be crawled. Fixing these hitches is a fundamental part of on-page optimization and can make a real difference to the volume of traffic you receive. Here are a few common bugbears:

Images with no alt text: Search engines cannot read images. Your page may look beautiful thanks to the glorious image you’ve included, but it will be completely invisible to the search engines’ spiders. Likewise, if you have submission forms, diagrams or text rendered as images, none of the important content you’ve created will get its due credit from the likes of Google. This is easily remedied: either add an alt attribute providing a text description of the image (also read aloud by screen readers), or ensure your text is stored as real HTML text rather than embedded in an image file.
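
To illustrate, here is a minimal HTML sketch; the file names and wording are hypothetical:

<!-- Invisible to spiders: any text lives inside the image file itself -->
<img src="header-banner.png">

<!-- Better: the alt attribute describes the image for spiders and screen readers -->
<img src="header-banner.png" alt="Acme Widgets spring sale banner">

<!-- Best for actual text: keep it as real HTML text and style it with CSS -->
<h1>Acme Widgets Spring Sale</h1>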

Flash-heavy and framed sites: As with images, spiders don’t deal very well with Flash or Java. While Flash can enhance a user’s experience of a site, provided it doesn’t slow down the loading time, it hides important content from the spiders. Highly creative sites commonly have their homepage in Flash, which looks great but is detrimental: most inbound links point to the homepage, yet the spiders can’t see it. Striking a balance, or at least ensuring a text alternative is available, is crucial to a successful ranking. Similarly, a framed site can cause spidering problems, as frames tend to fragment your site: content within the overall framework can be hidden because the spiders might only read one of the frames on a page. It is best to avoid building your site in frames if possible.
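
One common way to provide that text alternative, sketched below with hypothetical file names and copy, is to place HTML fallback content inside the object element that embeds the Flash movie: visitors with Flash see the movie, while spiders and everyone else get readable text and links.

<object type="application/x-shockwave-flash" data="intro.swf" width="800" height="400">
  <!-- Fallback content: rendered when Flash is unavailable, and readable by spiders -->
  <h1>Acme Widgets</h1>
  <p>Hand-made widgets, delivered worldwide.
     <a href="/products.html">Browse our product range</a>.</p>
</object>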

General functionality: Being too clever with whizzy graphics and the like can slow down your site. The user experience has to be as smooth and uninterrupted as possible, so avoid over-complicated functions, keep file sizes low for downloads and invest in some high-quality server hosting. Checking that your site works in different browsers is prudent, and it pays to check regularly for broken links, both to limit user frustration and to ensure that all content can be reached by the spiders at all times. A site map helps both user and spider: if your site is not vast, it means most of the content can be reached in two clicks.
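
A site map needn’t be anything elaborate; a single HTML page of links will do. A hypothetical sketch:

<!-- sitemap.html: one plain page linking to every major section of the site -->
<h1>Site Map</h1>
<ul>
  <li><a href="/about.html">About us</a></li>
  <li><a href="/products.html">Products</a>
    <ul>
      <li><a href="/products/widgets.html">Widgets</a></li>
      <li><a href="/products/gadgets.html">Gadgets</a></li>
    </ul>
  </li>
  <li><a href="/contact.html">Contact us</a></li>
</ul>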

Duplicate content: Search engine spiders like unique content. If they find duplicated versions of the same content, they will take one page as ‘canonical’ and display that single page as the result. This is particularly damaging to your ranking, as the strength you have built up through links to a particular page is diluted across the additional copies. It commonly occurs on a site’s homepage when multiple versions of the URL are in use, e.g. with and without the ‘www.’ prefix. As most external links tend to point to the homepage, it is important to protect its strength, which can be done simply with a 301 redirect. This lets you dictate the canonical page to the search engine spiders by pointing all duplicate URLs to the primary page.
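
On an Apache server, for example, the 301 can be set up in a few lines of .htaccess. This is a sketch assuming mod_rewrite is enabled, with example.com standing in for your own domain:

# Redirect the non-www version of the domain to the www version with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

With this in place, a request for example.com/page.html is answered with a permanent redirect to www.example.com/page.html, so spiders and link strength both converge on the single www version.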

Design and navigation: The design and navigation of your site can guide a user seamlessly through your content along a path you have planned, and the same is true for the search bots. By creating a clear, logical network of internal links, obvious navigation, breadcrumbs and well-written anchor text that makes it clear what a link will return, you not only create a smooth user experience but lead the bots deeper into your site. Search engines look for relevancy, so designing your navigation to act as a funnel, taking them from broader to narrower subject matter within an overall theme, helps them rate your relevancy and signals that there is depth and quality to your site.
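
For instance, a breadcrumb trail with descriptive anchor text (the paths here are hypothetical) tells both users and bots where they are and what each link will return:

<!-- Breadcrumb trail: broad to narrow, mirroring the funnel structure -->
<p>
  <a href="/">Home</a> &gt;
  <a href="/products.html">Products</a> &gt;
  <a href="/products/widgets.html">Widgets</a> &gt;
  Blue Widgets
</p>

<!-- Descriptive anchor text, rather than a bare "click here" -->
<a href="/products/widgets/blue.html">See our full range of blue widgets</a>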

These are just a few of the most common adjustments that can improve your site’s crawlability. As a general rule, the better the experience for the web user, the better it will be for the search engines. The key is to make sure a literal, textual translation of your content is available at all times for the internet’s creepy crawlies.
