Evaluating Your Website from an SEO, Indexing, and Crawlability Perspective

Tuesday, June 9, 2009

When you are evaluating your website from an SEO/indexing/crawlability perspective, there are a few things to keep in mind and some important questions you may not be asking yourself.

A traffic problem is not always a ranking problem. Many of us are too quick to assume that a drop in traffic is an indicator of some sort of issue with our search rank.

You may not have a ranking issue at all. You could have a crawling problem, parts of your site could be getting crawled but not indexed, you could have some sort of extraction issue; any number of things could be going on.

The thing to keep in mind here is that you need to develop some sort of infrastructure for diagnosing problems. Start with some ranking report benchmarks. Generally speaking, you want to know roughly where you stand in the SERPs for some of your top queries; that benchmark will give you a general idea of any significant ranking movement.
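As a minimal sketch of what that benchmark check might look like, here is one way to compare two rank snapshots and flag big moves. The file names (benchmark.csv, current.csv), the "query,rank" CSV layout, and the five-position threshold are all assumptions for illustration, not a prescribed format:

```python
import csv

# Hypothetical snapshot format: a header row, then one "query,rank" row per query
def load_ranks(path):
    with open(path) as f:
        return {row["query"]: int(row["rank"]) for row in csv.DictReader(f)}

baseline = load_ranks("benchmark.csv")  # ranks recorded when things were healthy
current = load_ranks("current.csv")     # ranks from today's report

THRESHOLD = 5  # flag moves of 5+ positions; tune to taste

for query, old_rank in sorted(baseline.items()):
    new_rank = current.get(query)
    if new_rank is None:
        print(f"{query!r}: dropped out of tracked results (was #{old_rank})")
    elif abs(new_rank - old_rank) >= THRESHOLD:
        direction = "down" if new_rank > old_rank else "up"
        print(f"{query!r}: moved {direction} from #{old_rank} to #{new_rank}")
```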

Organize your pages into categories, then analyze your server logs for search engine bot activity on a per-category basis. This will give you a better idea of how well the bots are spidering and indexing your content. You may find that categories 'A' and 'C' are being actively crawled by the search bots, but 'B' is getting very little attention from them.

These categories may also vary significantly in crawl rate: some sections may be crawled at 10 pages per day, others at 100. Seeing how many pages the crawlers pick up from each category gives you a good idea of how long it takes the bots to get through your whole site.
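As a rough sketch of that per-category tally, assuming Apache-style combined logs and URL paths whose first segment is the category (both assumptions; adjust the regex, the log path, and the bot pattern to your own setup):

```python
import re
from collections import Counter

# Matches the date and request path from an Apache combined-log line (assumed format)
LINE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4})[^\]]*\] "GET (\S+)')
BOT_RE = re.compile(r"Googlebot", re.IGNORECASE)  # add other bots of interest

crawl_counts = Counter()  # (category, date) -> pages crawled

with open("access.log") as log:
    for line in log:
        if not BOT_RE.search(line):
            continue  # skip non-bot traffic
        m = LINE_RE.search(line)
        if not m:
            continue
        date, path = m.groups()
        # Assume the first path segment is the category, e.g. /products/widget-1
        category = path.strip("/").split("/")[0] or "(root)"
        crawl_counts[(category, date)] += 1

for (category, date), pages in sorted(crawl_counts.items()):
    print(f"{date}  {category:<15} {pages} pages crawled")
```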

Search engines aren't going to spend all their time crawling all your content. Crawl efficiency is the name of the game... If you have a lot of pages, you need to let the crawlers know which pages are the most important for them to crawl. Registration pages, error pages, things like that are all non-productive pages, so you want to keep the engines off of them so they can spend more time dealing with the 'good stuff'.
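One common way to do that is with robots.txt. A minimal sketch, where the /register/, /error/, and /search paths are hypothetical stand-ins for whatever your non-productive areas actually are:

```
User-agent: *
Disallow: /register/
Disallow: /error/
Disallow: /search    # internal search results tend to waste crawl budget
```

Keep in mind this is a crawl directive for compliant bots, not a guarantee that already-indexed URLs will disappear from the index.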

Create comprehensive XML sitemaps for each of your categories, listing only the canonical version of each URL (the URLs have to be canonical before this is useful). Then create a sitemap index file that links your multiple sitemaps and submit it to Google Webmaster Central.
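A minimal sitemap index sketch, following the sitemaps.org schema (the domain and per-category file names are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-category-a.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-category-b.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-category-c.xml</loc>
  </sitemap>
</sitemapindex>
```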

This gives you a very granular and accurate assessment of how well Google is crawling the various parts of your site. Better yet, you get access to all the cool little graphs and tools in the Webmaster Central sitemap reports. These let you see not only how much of your content is being crawled and in which areas, but also how much of that content is being indexed.

So if you see that Google is crawling everything you have in category 'A' but only indexing 20% of it, you have a solid spot to start looking for reasons why.
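As a tiny sketch of that check, assuming you've copied submitted-vs-indexed counts per category out of the sitemap reports by hand (the numbers below are made up), you could flag the weak spots like this:

```python
# Hypothetical per-category counts copied from the sitemap reports
sitemap_stats = {
    "category-a": {"submitted": 1000, "indexed": 200},
    "category-b": {"submitted": 500, "indexed": 480},
    "category-c": {"submitted": 750, "indexed": 740},
}

for category, stats in sorted(sitemap_stats.items()):
    ratio = stats["indexed"] / stats["submitted"]
    flag = "  <-- investigate" if ratio < 0.5 else ""
    print(f"{category}: {ratio:.0%} of submitted URLs indexed{flag}")
```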

Make sure you actually have a problem before you start running around trying to fix one. It seems like it should go without saying, but a decrease in your overall indexed pages, for example, doesn't necessarily mean you have a problem. Google may have simply de-indexed some of your ineffective or duplicated pages.

If you haven't had a drop-off in search traffic, then you probably don't have a significant search problem. Changes are not necessarily problems. Seems like quite a few folks in this business have a little trouble with that distinction.

So there you have it: five pretty solid tips from one of our favorite former Googlers. I would of course urge you to check out our video to get it straight from her; she says it all a lot better than I do.
