Deep and Thorough Indexing is Critical As Second-Tier Search Becomes Important

Services for Real Estate Pros with MyST Technology Partners

Recently, a client with an integrated Google CSE contacted me and said, "Hey, my search engine is busted; it won't bring up a post that's in one of my blogs!"

This client uses MyST Blogsite as well as a few other blogging platforms, and he also uses our newest product, MyST/VS, an integrated custom search solution powered by Google. It turns out his custom search engine is fine; what *isn't* fine is that the content he was searching for is not in the Google index.

Many people believe that the simple exercise of creating a custom search engine (CSE) for a blog or collection of web sites automatically causes all content in the domains of the CSE to be included in search results. This is not the case at all; a Google custom search engine can only perform as well as the content that exists in the Google index itself.
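One quick way to see whether a given page is actually in the Google index is a manual `site:` query. As a small illustrative sketch (the example URLs below are hypothetical placeholders, not my client's pages), you can generate the check URLs to paste into a browser:

```python
# Sketch: build Google "site:" queries for manually checking whether
# specific pages are in the index. The page URLs are hypothetical.
from urllib.parse import quote_plus

def index_check_url(page_url):
    """Return a Google search URL that queries for the exact page."""
    return "https://www.google.com/search?q=" + quote_plus("site:" + page_url)

for page in ["http://example.com/blog/post-one",
             "http://example.com/blog/post-two"]:
    print(index_check_url(page))
```

If the query returns no results for a page you know exists, that page isn't in the index, and no custom search engine layered on top will ever find it.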

As many of you know, I'm a big proponent of a large and dominant footprint in Google. I won't rehash my viewpoints on this matter except to point out that with the emerging importance of second-tier search, deep and thorough indexing of our content is even more important. If every bit of your content isn't in the Google index, you will experience an ever-growing competitive disadvantage.

Many people on ActiveRain have challenged my philosophy about thorough indexing in the past; they handily dismissed the benefit of a large Google footprint as a business requirement. With the growing importance of comprehensive second-tier search, this subject emerges again, and the consequences of a partially indexed body of content are clear: visitors aren't given an accurate set of search results. Missing pages in search results are lost opportunities to provide a complete visitor experience that offers up all your content. An incomplete picture of what you've written may mean the difference between a new customer and a prospect who wanders off.

Can You Overcome This Problem?

Yes. Fortunately, Google's engineers anticipate that your content architecture may not be designed well enough to achieve deep and thorough indexing on its own. You can use Google's Sitemap capabilities to submit your unindexed pages directly to Google for crawling, so they can surface in your CSE's results. But this requires that you know all the URLs of your unindexed pages - a task that could take a fair bit of time to complete. It's also not ideal because you'll have to babysit this process time and again to make sure the sitemap is kept up to date.
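If you do have a list of unindexed URLs, generating the sitemap file itself is straightforward. Here's a minimal sketch that emits XML following the sitemaps.org protocol; the page URLs are hypothetical stand-ins for your own unindexed pages:

```python
# Minimal sketch: generate a sitemap.xml for pages that may be missing
# from Google's index, per the sitemaps.org protocol. Replace the
# hypothetical URLs below with your own unindexed pages.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap XML string listing the given page URLs."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        loc = ET.SubElement(entry, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

pages = [  # hypothetical unindexed pages
    "http://example.com/blog/post-one",
    "http://example.com/blog/post-two",
]
print(build_sitemap(pages))
```

You'd save the output as sitemap.xml, host it on your site, and submit it through Google's Webmaster Tools - and rerun the script whenever new pages appear, which is exactly the babysitting I mentioned above.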

There are some other tricks that can help, so if you're interested in learning more, feel free to contact me.


Tags: Web 2.0, Colorado Real Estate, LA Connection, custom search engine, unified search, deep indexing



Bill French
