Someone over at Kinetic Knowledge (who apparently doesn't want to be identified) published this justification for not having a thorough and dominant index of your content in search engines. If you base your service on technology that is unable to create and sustain a deep and thorough index of everything published, this is an expected justification.
While quality typically trumps quantity (no disagreement there), it's important to recognize that the debate was never [solely] about quantity; at its root, it is about thoroughness, and at a higher level, it's about size, because without size you can't achieve a sustainable marketing model for leveraging the long tail. Having a thorough and complete representation of all your content in Google is far better than having only a fraction of your content in Google, regardless of which axis of subject expertise the pages happen to align with.
"While having a large Google index might feel like something to brag about - take a step back and ask yourself if those pages are bringing you a return of value." - Kinetic Knowledge Author
There's one thing that's certain about this statement - if content cannot be found, it certainly won't provide any value to you or your prospects. ;-) But I have to ask - if you have pages that you've published, and they aren't in the search indices, why did you publish them? Furthermore, you don't get to decide which ones get indexed and which ones get passed over - what if many of the pages that aren't indexed are key value pages that you want prospects and future customers to find?
Unfortunately, the writer of this article bypasses any discussion of (or requirement for) creating a sustainable marketing model that depends on the sheer number of posts, each capturing a few long-tail hits every day and thus creating a continuous flow of new people to your brand. If you're blogging to reach the long tail, by definition, you must have lots of ways to reach the many markets of few in a way that will sustain the effort of doing it in the first place. A small index footprint cannot achieve this - a large footprint makes it possible, but not a certainty (the arithmetic is sketched below). Suggesting that it's okay not to achieve a thorough and sizeable index footprint dismisses the guidance (outlined here) from an array of SEO experts who have studied this exact requirement.
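To make the stakes concrete, here's a minimal sketch of the long-tail arithmetic. Every number is a hypothetical assumption for illustration, not data from any real site; the point is simply that long-tail traffic scales linearly with the indexed fraction of your posts.

```python
# Hypothetical long-tail traffic model. All numbers below are
# illustrative assumptions, not measurements from any real site.
POSTS_PUBLISHED = 1000        # total posts on the blog
HITS_PER_POST_PER_DAY = 2     # average long-tail visits per indexed post

def daily_visits(indexed_fraction):
    """Expected daily long-tail visits given the share of posts indexed."""
    indexed_posts = POSTS_PUBLISHED * indexed_fraction
    return indexed_posts * HITS_PER_POST_PER_DAY

print(daily_visits(1.0))  # fully indexed: 2000.0 visits/day
print(daily_visits(0.6))  # 40% of posts missing: 1200.0 visits/day
```

Under these assumed numbers, a 40% indexing gap costs 800 visits a day - every day - which is exactly the sustainability problem a small index footprint creates.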
The writer also fails to address a growing and serious implication of poor indexing performance - custom search. Google's CSE (custom search engine) solution depends on your pages being in the Google index; otherwise, like Google [worldwide], your own search engine will always be an incomplete version of reality. When visitors use your CSE, they expect it to contain 100% of everything you've published. If Google (itself) is unable (or unwilling) to index 30% or 40% of your content, your search engine will provide a fraction of the benefit (a rough way to audit this is sketched below). Perhaps one of the pages that isn't indexed is the difference between getting a new customer and not.
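For anyone who wants to measure this gap, here's a minimal sketch of a coverage audit, assuming Google's Custom Search JSON API. The API key, engine ID, and URL list are placeholders you would supply yourself, and querying a URL directly is only a rough heuristic for whether your CSE can surface that page - not a definitive index test.

```python
# Rough CSE coverage audit: ask the Custom Search JSON API whether each
# published URL can be surfaced by your own engine. API key, engine ID,
# and the URL list are placeholders to replace with your own values.
import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_API_KEY"    # placeholder credential
CX = "YOUR_ENGINE_ID"       # placeholder CSE identifier

def appears_in_cse(url):
    """Return True if the engine returns `url` when queried for it directly."""
    params = urllib.parse.urlencode({"key": API_KEY, "cx": CX, "q": url})
    endpoint = "https://www.googleapis.com/customsearch/v1?" + params
    with urllib.request.urlopen(endpoint) as resp:
        results = json.load(resp)
    return any(item.get("link") == url for item in results.get("items", []))

published_urls = [
    "https://example.com/post-1",   # hypothetical pages from your sitemap
    "https://example.com/post-2",
]
missing = [u for u in published_urls if not appears_in_cse(u)]
print(f"{len(missing)} of {len(published_urls)} published pages not surfaced")
```

If a meaningful fraction of your sitemap never surfaces, that gap is exactly the incomplete "version of reality" your visitors will experience.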
In my view, it isn't wise to advise anyone that anything less than comprehensive and thorough indexing is acceptable; complete indexing is a key business objective. Indeed, as the author concludes, nothing is perfect, but tolerating anything less than what's possible and reasonably feasible is simply an excuse for not improving the SEO craft.