<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Bill French's (bfrench) Blog</title>
    <link>https://activerain.com/blogs/bfrench</link>
    <description/>
    <language>en-us</language>
    <item>
      <guid>https://activerain.com/blogsview/934794/home-staging-channel</guid>
      <title>Home Staging Channel</title>
      <description>Recently, the folks at The Home Staging Channel (in concert with a prominent social media PR firm, Expansion+) launched a Ning-based social space for home stagers. Check it out here.
http://community.thehomestagingchannel.com
Ning (ning.com) is pretty good at creating and sustaining social networks that might need to scale quickly.</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Sun, 15 Feb 2009 03:42:17 -0800</pubDate>
      <link>https://activerain.com/blogsview/934794/home-staging-channel</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/630759/thorough-indexing-performance---the-debate-continues---</guid>
      <title>Thorough Indexing Performance - The Debate Continues...</title>
      <description>Someone over at Kinetic Knowledge (who apparently doesn't want to be identified) published this justification for not having a thorough and dominant index of content in search engines. If you base your service on technology that is unable to create and sustain a deep and thorough of everything published, this is an expected justification.
While quality typically trumps quantity (no disagreement there), it's important that we recognize the debate was never [solely] about quantity; at its root, it is about thoroughness, and at a higher level, it's about size, because without size you can't achieve a sustainable marketing model for leveraging the long tail. Having a thorough and complete representation of all your content in Google is far better than having only a fraction of your content in Google, regardless of how closely those pages align with your subject expertise.
"While having a large Google index might feel like something to brag about - take a step back and ask yourself if those pages are bringing you a return of value." - Kinetic Knowledge Author
There's one thing that's certain about this statement - if content cannot be found, it certainly won't provide any value to you or your prospects. ;-) But I have to ask - if you have pages that you've published, and they aren't in the search indices, why did you publish them? Furthermore, you don't get to decide which ones get indexed and which ones get passed over - what if many of the pages that aren't indexed are key value pages that you want prospects and future customers to find?
Unfortunately, the writer of this article bypasses any discussion of (or requirement for) creating a sustainable marketing model that depends on the sheer number of posts that [each day] capture a few long-tail hits, thus creating a continuous flow of new people to your brand. If you're blogging to reach the long tail, by definition, you must have lots of ways to reach the many markets of few in a way that will sustain the effort of doing it in the first place. A small index footprint is unable to achieve this - a large footprint makes it possible, but not a certainty. Suggesting that it's okay not to achieve a thorough and sizeable index footprint dismisses the guidance (outlined here) from an array of SEO experts who have studied this exact requirement.
The writer also fails to embrace a growing and serious implication of poor indexing performance - custom search. Google's CSE (custom search engine) solution depends on your pages being in the Google index; otherwise, like Google [worldwide], your own search engine will always be an incomplete version of reality. When visitors use your CSE, they expect that it contains 100% of everything you've published. If Google (itself) is unable (or unwilling) to index 30% or 40% of your content, your search engine will provide a fraction of the benefit. Perhaps one of the pages that isn't indexed is the difference between getting a new customer and not.
In my view, it isn't wise to advise anyone that anything less than comprehensive and thorough indexing is an acceptable business outcome. Indeed, as the author concludes, nothing is perfect, but tolerating anything less than what's possible and reasonably feasible is simply an excuse for not improving the SEO craft.</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Thu, 07 Aug 2008 13:17:05 -0700</pubDate>
      <link>https://activerain.com/blogsview/630759/thorough-indexing-performance---the-debate-continues---</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/628919/knol--i-think-this-is-going-to-be-big</guid>
      <title>Knol: I Think This Is Going to Be Big</title>
      <description>Google opened up Knol to everyone this week and I have a hunch this will be an important idea. Knol (unit of knowledge) is like Wikipedia but with some subtle differences. It will likely become an important player in the knowledge publishing space because ...
Google has great search technology, and no information has any value until it can be found.
It's an authenticated publishing model; like the scientific community, experts can perform checks and balances on other "experts".
The smaller or more discrete a knowledge artifact is, the greater its value. Knol accommodates this idea.
Create an account and tie your business content into it and vice-versa. I suspect this will generate some interesting visibility benefits for your expertise.</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Wed, 06 Aug 2008 12:46:39 -0700</pubDate>
      <link>https://activerain.com/blogsview/628919/knol--i-think-this-is-going-to-be-big</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/613956/wordpress-hacker-bot-surge---trojan-zombie-botnet-wordpress-spam-blogs</guid>
      <title>WordPress Hacker Bot Surge - Trojan/Zombie/Botnet WordPress Spam Blogs</title>
      <description>Hey folks - as many of you already know, I'm not a WordPress fan (i.e., bias - MyST Blogsite), but I am a fan of stopping hackers and other nefarious web activity that threatens businesses.
While reviewing some real-time data a few minutes ago, I noticed more than 400 WordPress hacker bots attacking one of our server banks in just the last 10 minutes. Thankfully we have some very sophisticated defense systems that protect our blogsite clients, but most WordPress sites are unable to defend against security vulnerabilities that are resident in WordPress to begin with.
I've read many posts on AR where folks using WordPress are particularly angry about being hacked, but I have a hunch that WordPress itself is a big part of the problem.
Perhaps (as a group) you should look carefully at your server logs and see which services are actually running lots of outbound requests and where they are going. I suspect those of you who are unknowingly harboring this threat might be able to apply a security patch or correction to remove this nasty beast. Kevin Burton didn't have a fix back in March, but he did know what it was - a Trojan/Zombie Botnet.
The data below shows a pattern representing more than 2 million requests in one day by more than 3,000 total bots hitting just one of our many servers. This is up more than 50% in the last few days, so this trojan worm seems to be spreading, and it's doing so on many versions of WordPress. Read more about compromised WordPress blogs.
&lt;img src="https://activerain.com/image_store/uploads/1/9/9/3/1/ar121730061913991.png"&gt;</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Mon, 28 Jul 2008 15:11:42 -0700</pubDate>
      <link>https://activerain.com/blogsview/613956/wordpress-hacker-bot-surge---trojan-zombie-botnet-wordpress-spam-blogs</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/600945/a-specialized-search-engine-of-free--and-near-free--content</guid>
      <title>A Specialized Search Engine of Free (and near-free) Content</title>
      <description>Many of you in this group might find GimmeFreeContent.com (and this post) helpful for finding free content for your blogs.
Enjoy...
bf</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Sat, 19 Jul 2008 15:51:44 -0700</pubDate>
      <link>https://activerain.com/blogsview/600945/a-specialized-search-engine-of-free--and-near-free--content</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/580526/deep-and-thorough-indexing-is-critical-as-second-tier-search-becomes-important-</guid>
      <title>Deep and Thorough Indexing is Critical As Second-Tier Search Becomes Important </title>
      <description>Recently, a client with an integrated Google CSE contacted me and said - "Hey, my search engine is busted; it won't bring up a post that's in one of my blogs!".
This client is using MyST Blogsite as well as a few other blogging platforms, and he also uses our newest product, MyST/VS, an integrated custom search solution powered by Google. It turns out his custom search engine is fine; what *isn't* fine is that the content he was searching for is not in the Google index.
Many people believe that the simple exercise of creating a custom search engine (CSE) for a blog or collection of web sites automatically causes all content in the domains of the CSE to be included in search results. This is not the case at all; a Google custom search engine can only perform as well as the content that exists in the Google index itself.
As many of you know, I'm a big proponent of a large and dominant footprint in Google. I won't rehash my viewpoints on this matter except to point out that with the emerging importance of second-tier search, deep and thorough indexing of our content is even more important. If every bit of your content isn't in the Google index, you will experience an ever-growing competitive disadvantage.
Many people on ActiveRain have challenged my philosophy about thorough indexing in the past; they handily dismissed the benefit of a large Google footprint as a business requirement. With the importance of comprehensive second-tier search, this subject emerges again; the consequences of a partially indexed body of content are clear - visitors aren't given an accurate set of search results. Missing pages in search results are lost opportunities to provide a complete visitor experience that offers up all your content. An incomplete picture of what you've written may mean the difference between a new customer, and a prospect that wanders off.
Can You Overcome This Problem?
Yes. Fortunately, Google's engineers anticipate the possibility that your content architecture isn't designed well enough to achieve deep and thorough indexing. You can use Google Sitemaps to submit your unindexed pages directly to Google so they can surface in your CSE. But this requires that you know all the URLs of your unindexed pages - a task that could take a fair bit of time to complete. It's also not ideal because you'll have to babysit this process time and again to keep it up to date.
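To make the sitemap step concrete, here's a minimal sketch (my own illustration, not part of any product; the URL is a hypothetical placeholder) that generates a Sitemap file for a list of unindexed pages using only Python's standard library:

```python
# Minimal sketch: build a sitemap.xml for pages you believe are unindexed.
# The URL list is a hypothetical placeholder - substitute your own pages.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return sitemap XML with one url/loc entry per page."""
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for page in urls:
        entry = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(entry, "{%s}loc" % SITEMAP_NS).text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap(["https://example.com/unindexed-post"])
```

You'd save the result as sitemap.xml and submit it through Google's webmaster tools; the hard part, as noted above, remains discovering which of your URLs are actually missing from the index.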
There are some other tricks that can help, so if you're interested in learning more, feel free to contact me.</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Sun, 06 Jul 2008 05:43:12 -0700</pubDate>
      <link>https://activerain.com/blogsview/580526/deep-and-thorough-indexing-is-critical-as-second-tier-search-becomes-important-</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/578699/improper-characterization-of-a--meme--is-ironically-a--meme-</guid>
      <title>Improper Characterization of a "Meme" is Ironically a "Meme"</title>
      <description>Sarah (and members):
Recently, I received a message from someone telling me that I had been "memed". The invitation to participate in what is ostensibly a chain letter said ...
"A meme is an internet game similar to a chain letter except you post on line."
I have a hunch that when Richard Dawkins (the foremost authority on genetics, atheism, and a profoundly critical thinker) first coined the term "meme", he had a definition in mind that is far different from an "internet game" of tagging one another in hopes that they will act in some [positive?] way.
Read my assessment of this mischaracterized use of the term "meme" and help me understand how so many people could be wrong about it, unless of course my assessment is correct, and the misuse of the term is itself, a "meme" and the definition is being improperly applied. ;-)
To be clear, I'm not attacking the nature of your definition of "memeing" (though it's debatable that such an act exists) - I think it's great to get to know each other better and make new social connections. I just want to understand [from a social perspective] how this group (and seemingly everyone at AR) justifies the use of this term in the context it is being applied.
Bill (I Don't Get It) French
MyST Technology Partners</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Fri, 04 Jul 2008 07:25:07 -0700</pubDate>
      <link>https://activerain.com/blogsview/578699/improper-characterization-of-a--meme--is-ironically-a--meme-</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/571873/easily-find-free-content-for-your-blogs</guid>
      <title>Easily Find Free Content for your Blogs</title>
      <description>I was having lunch with a few of our customers last week and one of them remarked that it was a pain looking through all the free content sites for images, articles, and stories that can be reused for blog posts and articles. I had also experienced this - the Creative Commons directories are wonderful, but not as friendly as good ol' Google's format and layout. And the new Creative Commons filters in Google are excellent, but your searches include the entire web - not always ideal if you're trying to narrow your scope to professionally produced images, videos, and text.
&lt;img src="http://gimmefreecontent.com/docs/GimmeFreeContent-logo.jpg" style="float: right;margin-left: 6px;margin-right: 6px;"&gt;Sooo... I decided to build a comprehensive aggregation of all the great content sites that are free (or mostly free) and host it at GimmeFreeContent.com (Note: the DNS may still be propagating). This search engine is still a youngster with plenty of room for growth, but like the content it recommends, it is free to use so spread the URL.
I also have plans to add more free content directories and lots of other resources related to helping you blog faster and better. Be sure to submit your suggestions and comments - I'm sure this could be greatly improved and I'm confident many of you know of some great resources.
I've never been a big fan of using free content for blogging, but I've changed my tune after researching this arena. I was amazed to see the number of sites and catalogs that offer great images for free and open use in blogs and websites.
Enjoy. Cheers! --bf</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Sun, 29 Jun 2008 12:57:35 -0700</pubDate>
      <link>https://activerain.com/blogsview/571873/easily-find-free-content-for-your-blogs</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/525245/fluid-search--more-on-unified-search-and-a-better-search-experience</guid>
      <title>Fluid Search: More on Unified Search and a Better Search Experience</title>
      <description>My last post was about unifying search across multiple domains to create a better search experience for your visitors. It’s no secret that real estate businesses typically have multiple domains, so I thought it fitting to write about the challenge of unified search here at Active Rain. However, any business with two (or more) domains is challenged to create a unified search experience, so this series of posts is important to every business segment.
At MyST we spend more time innovating than any other activity including selling; our CFO reminds us of this all the time. But we’re on a never-ending quest, and our recent research into custom search engines has uncovered some startling revelations that will change how we think about search.
Background
Until recently, our research and product development has focused primarily on helping business and marketing professionals create a sustainable and dominant presence in search engines and other Web 2.0 systems that help people find information. Our objectives have been constrained to produce content optimized for discoverability in Google (worldwide). I use the term “worldwide” to indicate the entire Google index; the one most people on the planet look to for recommendations.
As I mentioned in a previous post, a few months ago we started to see a pattern: website (and blogsite) visitors were increasingly dissatisfied with the second phase of search - the search process that begins after Google (worldwide) has recommended a handful of domains based on a given query.
The Two Phases of Search
The topology of the web has been slowly transforming into a sea of domain expertise nodes – clusters of smart and very focused sites created by individuals, teams, businesses, governments, clubs, social networks, etc. Blogs, websites, and social networks are the predominant implementations that enable domain expertise to surface on the web rapidly and become more findable. But the emergence of clusters representing focused expertise has created a new challenge – with all these loosely-coupled information silos, how do we manage the ability to search across multiple [disparate] applications without including the entire public web?
The search process is typically a two-tiered experience; for most people it starts with a generalized search that includes more than 100 billion pages on the Internet (and growing). People have learned that to get close to what they're looking for, they must focus their keywords and use more of them. But the results generally point to pools of content that are likely to contain answers. The second phase of the search process begins once a pool of information is recommended; typically this requires a deeper search within a company's domain. But (as noted above) the definition of a “company’s domain” has shifted with the advent of multiple domains and loosely-coupled information sources. This is where the trouble begins and the visitor experience may suffer.
While Google may find reasons to highly recommend one or two pages in the same domain for a given query, it typically won't consider all related company domains or sub-domains within a given search result. In fairness, Google wants to provide a diverse selection of good recommendations – this is a sensible approach. Imagine a single company dominating the top ten results for a particular query - Google users wouldn't find this behavior very useful. Given the way Google works, searchers may be able to get close to the content they want, but the second phase of the search experience (which begins when they land on one of many possible domains of a company) is typically lacking in many ways.
This is not surprising; search solutions are usually implemented differently from domain to domain, and the variety of publishing, content management, and web applications in use by a company tend to exacerbate search continuity challenges. Few systems have the ability to easily integrate or scope searchable content across specific collections of domains. Visitors [however] have far greater expectations – they expect to find answers within the domain(s) that Google recommends. In many cases, the first recommendation may not be exactly what the visitor was looking for, but most people are patient enough to search one more time using a site search feature. This represents a sizeable opportunity to engage the visitor, but if you want to capture attention and keep visitors from using the back button to go back to Google (worldwide), a unified search experience for 100% of your company's resources is necessary. The more resources you can offer them for easier discovery, the more likely they'll continue to investigate your content.
Participating in the new topology of search will require comprehensive organization of your customer-facing content, and Google is providing precisely the framework to make this possible. Everything you’ve published (website, blogsite, social network, forum, reservations tool, partner content, online catalog, etc.) matters even more. Access to this focused sphere of content becomes more critical the larger the Google index grows (e.g., as the size of Google’s worldwide index increases, the ability to find information [directly] decreases). The trend concerning custom search engines is clear; it's a new extension of a proven idea that will radically shape how we find good content, and it’s already happening – people now search for good places to search for what they truly want to consume.
The Revelation
The second phase of search must be agile and dynamic. In the realm of custom search engines, there are domains, facets, and refinements and each must be configured in a context. Simply stated, if your business content spans five domains, you need a search engine that includes all five domains. But more important – the context may change depending on each visitor’s experience and wants.
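As a sketch of what that multi-domain configuration can look like: a CSE's included sites can be described with an annotations XML file. The snippet below is my own illustration - the label name and the five domains are hypothetical placeholders, and you should verify the exact annotation format against Google's CSE documentation before using it:

```python
# Sketch: generate a CSE-style annotations file covering several domains.
# The label name and domain list are hypothetical placeholders.
import xml.etree.ElementTree as ET

def build_annotations(domains, label="_cse_examplelabel"):
    """Return annotations XML including each domain pattern in the engine."""
    root = ET.Element("Annotations")
    for domain in domains:
        note = ET.SubElement(root, "Annotation",
                             about="%s/*" % domain, score="1")
        ET.SubElement(note, "Label", name=label)
    return ET.tostring(root, encoding="unicode")

five_domains = ["www.example.com", "blog.example.com", "forum.example.com",
                "listings.example.com", "community.example.net"]
annotations_xml = build_annotations(five_domains)
```

The point of generating this file programmatically rather than hand-editing it is agility: when the visitor's context changes, the set of domains (and therefore the annotations) can change with it.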
To that end, our research forced us to coin the phrase “fluid search”. Before you hop over to Google and type that in, let me save you some time – there are only 5,700 references, most concerning hydraulic and bodily fluids. Fluid search is a simple idea – create a user experience that possesses enough agility to keep your visitors engaged. Here’s an example…
Imagine a visitor has landed on a specific blog post (from a Google query) about a new community being built in your town. While the post performed exceptionally well by ranking high in Google (worldwide), it’s just a start as far as the visitor is concerned; she wants more information and immediately clicks on a new unified search component feature that’s embedded right in the post. It says – “Click here for a deeper search on this subject.”.
With one click the search application provides a list of all posts in the blogsite and website related to the subject that attracted the visitor through Google. In this example, the list of resources is pretty thin because it’s a relatively new community. However, the search application has anticipated the possibility that broadening the scope of the search to include all pages linked to from the blog might produce additional interesting content, and it does.
The original unified search application has expanded its scope to find three additional stories that were linked to from other blog posts that weren’t closely related to the original query. This is an important point worth introspection –
The search engine has magically transformed itself (on the fly) to consider off-domain pages that the collective businesses’ domains have linked to about the community in question.
This is the definition of a fluid search application. Building a search experience like this is not easy; it requires a bunch of stuff that we don’t have time to go into. Besides, the business requirements are far more interesting to consider.
We’ve learned that our customers want the experience described in the example, and we’ve built the infrastructure to achieve it. More important – we built it in a way that seamlessly integrates into MyST Blogsite services which are easily upgraded across our services. We also designed the technology to integrate with non-MyST blogging tools and web applications. Drop me a note if you’d like to be considered for beta testing this technology.</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Mon, 26 May 2008 18:01:12 -0700</pubDate>
      <link>https://activerain.com/blogsview/525245/fluid-search--more-on-unified-search-and-a-better-search-experience</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/506274/search-across-all-your-sites---one-interface</guid>
      <title>Search Across All Your Sites - One Interface</title>
      <description>I've become somewhat of an expert with Google CSE (custom search engine - launched in 2006) and this is one very cool tool. Our clients typically ask for a unified search solution that encompasses their blogsite, website and perhaps other domains such as forums and social networks. Typically, these services are each separate systems designed by separate vendors, so providing a unified search solution is difficult.
Google CSEs make it possible for anyone to create and manage a fairly comprehensive solution to this problem. Furthermore, you can integrate the search UI and results into your blog, your website, or any other web page you have control over. There are many ways to integrate CSEs - my favorite is to use the Business Edition and take advantage of the XML API. But this is not required if you just want the basic functionality.
For giggles, I created a CSE for Vail Valley, Colorado - check it out. If you have questions, feel free to ask - I'm happy to provide additional guidance and insight on this.</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Sun, 11 May 2008 15:51:02 -0700</pubDate>
      <link>https://activerain.com/blogsview/506274/search-across-all-your-sites---one-interface</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/284875/real-estate-2-0-and-xml---i-see-little-progress-towards-true-web-2-0-services</guid>
      <title>Real Estate 2.0 and XML - I See Little Progress Towards True Web 2.0 Services</title>
      <description>I hear a lot of noise about "Real Estate 2.0", but I don't see any significant progress concerning the infrastructure necessary to participate in the Web 2.0 world as a web service. If we can assume that Real Estate 2.0 is an extension of Web 2.0, and if we can agree that this video exemplifies the complete mix of business and technical requirements on the path to becoming a "Web 2.0" business, why aren't we seeing more discussions and progress along the XML infrastructure axis? One explanation could be that, when properly implemented, you'll never know when an application is truly XML compliant. This is plausible, but I have to ask -
Is your website designed in a way that anyone can integrate your content into other use cases without asking you to write code?
Is your content designed in a way that would allow it to be easily moved to another domain without making any changes to the content itself?
With each new web site feature you add, do you build agility into it in a way that eliminates future code rewrites?
If a social networking platform asked you for a list of your employees in FOAF format, would you be able to provide it?
If a tagging/indexing service provided immediate integration of all your blog tags through XML, could you provide that?
These are just a few of the questions that are best resolved with an XML infrastructure. Another requirement of Web 2.0 is social data agility through XML protocols such as FOAF and microformats. Most indexing services are starting to require content in XML format - Trulia and GoogleBase come to mind. Are your sites prepared to expose content in these formats? I have a hunch that we're all talking a good RE2.0 game and mimicking Web 2.0 services, but few of us are actually building out our web applications with true Web 2.0 compliance in mind.
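To make the FOAF question concrete, here's a minimal sketch (the employee name and mailbox are made-up placeholders) of exposing an employee roster as FOAF/RDF XML using Python's standard library:

```python
# Sketch: serialize an employee roster as FOAF/RDF XML.
# The person and mailbox below are hypothetical placeholders.
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
FOAF = "http://xmlns.com/foaf/0.1/"

def foaf_roster(people):
    """Return an rdf:RDF document with one foaf:Person per employee."""
    ET.register_namespace("rdf", RDF)
    ET.register_namespace("foaf", FOAF)
    root = ET.Element("{%s}RDF" % RDF)
    for name, mbox in people:
        person = ET.SubElement(root, "{%s}Person" % FOAF)
        ET.SubElement(person, "{%s}name" % FOAF).text = name
        mb = ET.SubElement(person, "{%s}mbox" % FOAF)
        mb.set("{%s}resource" % RDF, mbox)
    return ET.tostring(root, encoding="unicode")

roster_xml = foaf_roster([("Jane Agent", "mailto:jane@example.com")])
```

If you can answer a request like this with a few lines of code, your data is agile; if it would require a redesign, that's exactly the gap I'm describing.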
I think that companies that actually undertake the task of building truly XML-based services will achieve a competitive advantage. What do you think? I'd love to see examples where you've created XML agility in your online content services.</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Sun, 25 Nov 2007 04:47:53 -0800</pubDate>
      <link>https://activerain.com/blogsview/284875/real-estate-2-0-and-xml---i-see-little-progress-towards-true-web-2-0-services</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/251033/web-2-0--bubble-2-0-</guid>
      <title>Web 2.0: Bubble 2.0?</title>
      <description>What if this new version of the Web (i.e., 2.0) is a bubble that's waiting to burst?</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Thu, 25 Oct 2007 18:35:26 -0700</pubDate>
      <link>https://activerain.com/blogsview/251033/web-2-0--bubble-2-0-</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/20635/advertising-on-your-blog--good-intentions---bad-idea</guid>
      <title>Advertising On Your Blog: Good Intentions - Bad Idea</title>
      <description>I'm amazed there are so many business people that believe they should try to monetize their weblog or website by selling ad space. Apparently many businesses are so dissatisfied with the benefits and performance of their online marketing strategy that they feel they must lower the total overall cost of this activity by going into a different business altogether. This is just silliness.I was on a panel at NAR this week with the folks at Blogging Systems; they're clearly bright people - they've written a book about business blogging. But they're misdirected. David Crockett, the panel moderator and owner of a community blog by Blogging Systems, suggested that it was a good idea to go around to your community members and try to sell them ad space on your blog for $75 per month. What? Did I hear that right? I thought he was a real estate professional? This suggests he's building an online portal to sell ads - isn't that Yahoo!'s gig?Consider the hidden costs of engaging your company, your time, and your brand in this activity.Your Brand Focus - You risk brand confusion by placing ads for other products and services on your pages. This is just one more way to show people how to leave your site or become interested in something besides that which you sell. Nothing says web site success like a bunch of links to another business.Your Company Focus - Ads suggest to customers and prospects that the advertisers must be more important to you than your own prospects and business focus, otherwise you would use that space to your own advantage. Your Message Focus - Exchanging valuable web page real estate for a banner or AdSense ad robs you of opportunities to say something important to your prospects. 
Consider the opportunity cost because it's a double whammy - not only are you bluring your marketing message, you are foregoing a chance to capture prospects by sending them elsewhere to buy other goods and services.Your Time - Why would any business spend their valuable minutes each day to even ponder this idea? It requires time to establish new revenue models - ad selling, contracts, negotiating, banner placement, responsibilities, ad changes, etc - all these tasks represent costs, and the revenues are miniscule compared to your own time-value.Think about it this way - if you could say something to a new prospect in the same place you put a banner ad, what would the value of that message opportunity be to your business? Placing a Goggle AdSense or banner ad might net you 15 cents per impression. Are you willing to forego using that space to say something important to your next web visitor in exchange for something that's valued less than a stick of gum? Anyone that would tell you this is a good idea is blessed with economic illiteracy.You might get $100 per month for a local ad, but that's only $3.33 a day, and for what - the chance to create a diversion for your audience? I put this into the you've-got-to-be-kidding-me class of business ideas. This is a fine strategy for people that write content for a living - indeed - people that are bloggers and have few options to monetize their content. I believe people like Mr. Crockett are simply misdirected by the booming voice of "bloggers" that believe business people should become "bloggers" and use blogs exactly as they have. In my view this is the first step to losing focus of what you sell, and why you blog for business objectives.Here's some advice - if the value of a new customer is at least as valuable as two month's total ad revenue, consider this a really bad idea because there's a good chance that greater marketing focus and a stronger message will net you at least one additional customer every 60 days. 
Another way to look at it - would you rather have an extra $900 or 6 more customers next year?

************** update ***************

I rarely update a post once I publish it, but since so many folks are beating me up on the premise that their own ads have actually increased credibility and improved user satisfaction without any risk or cost, I thought it would be a good idea to bring in some research that helped shape my own philosophy of ads. With specific regard to credibility, consider...

There's a really useful site called the Stanford Web Credibility Research center for understanding all this techno-mumbo-jumbo about credibility and web sites. Below are some anecdotes that relate to the true cost of hosting ads.

"If possible, avoid having ads on your site." - here

"Although banner ads are often said to be ignored, they are not transparent to users. Ads can reduce Web credibility in varying degrees." - Stanford-Makovsky's 2002 Web Credibility Study

Given that online marketing initiatives such as blogsites and websites are specifically intended [by most users of these technologies] to enhance visibility, it goes without saying that credibility is one success factor of that endeavor. Why would you purposefully do anything that erodes your credibility? Many sites do, but I sense they do so without factoring in the true cost. And to be clear, there is a context where you might answer this affirmatively - when the net revenue from ad serving is greater than the loss in perceived credibility.

"Sponsorship provides an interesting lens through which to view Web credibility. Sites that were advertised on the radio or other media were reported to get a moderate credibility boost (mean = 0.77). Asking about advertising from a different angle, our study found that the credibility gained by using targeted online ads was nearly negligible (mean = 0.22)."
- What Makes Websites Credible

An understanding of sponsorship (i.e., using banner and text ads) really compelled me to think carefully about the question of ads. If we think about how the sponsors of a web site affect credibility, we can state with near certainty that sites hosting ads that match the topic you are writing about will produce a 0.22 mean credibility advantage for the site itself. To get a perspective on what this really means - consider that if you advertise your site on the radio or another old media outlet, you can expect a 0.77 mean boost in credibility - or about 3.5 times more credibility than [just one] hosted ad that is specifically about your content. A 0.22 mean increase is not bad, but as the study found, it's very close to negligible.

"For the most part, our respondents reported that advertising damaged a site's credibility. Simply having an ad on the site led to a slight decrease in credibility (mean = -0.60), while pop-up ads were regarded even more harshly, seriously damaging the perceived credibility of the site (mean = -1.64). Finally, sites that made it difficult to distinguish between ads and content were reported to be the least credible of all; the mean here of -1.90 was the most negative score in this study." - What Makes Websites Credible

But the data is equally compelling and underscores a risk factor when you consider a site with an ad on every page - the mean credibility loss is -0.60; simply stated, with almost absolute certainty, we can predict that any site with an ad on every page has diminished credibility - not much, but absolutely a non-zero amount that is functionally equal [but opposite] to the credibility benefit of advertising on old media outlets.
A pop-up ad will net you a mean loss of -1.64, and blending ads so that they're difficult to pick out will create a negative credibility score of -1.90.

In my comments below, you will see my assertion that hosting ads typically comes with a cost - a potential net loss in perceived credibility is indeed [one] component of that cost.</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Sun, 27 May 2007 08:38:05 -0700</pubDate>
      <link>https://activerain.com/blogsview/20635/advertising-on-your-blog--good-intentions---bad-idea</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/22524/ripping-the-roof-off-real-estate--the-book-</guid>
      <title>Ripping the Roof Off Real Estate (the book)</title>
<description>I just finished Mollie Wasserman's new book, the same title as this post. It's an ideal book for someone like me - I'm just an ordinary consumer in the world of real estate. I do have a little more insight into this industry, but not enough to give me any significant advantage; perhaps just enough insight to be dangerous. This book is a ripping good reference that consistently reminds you how important a complete working knowledge of real estate is if you want to be a successful homebuyer or seller. Mollie explains in no uncertain terms how valuable a good real estate professional can be - no rational person could read this and ignore her insightful advice.

I like Mollie's style - very down-to-earth; she makes the entire subject of real estate more approachable for the average consumer. I especially liked the section about the four financial potholes to steer clear of, and I won't spoil her book sales by revealing them here. ;-) Overall, a very enjoyable and quick read that exposes a number of hidden aspects of real estate that serve as excellent guideposts for everyday homeowners and investors alike.</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Sat, 25 Nov 2006 14:13:13 -0800</pubDate>
      <link>https://activerain.com/blogsview/22524/ripping-the-roof-off-real-estate--the-book-</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/22183/print-media-versus-the-internet</guid>
      <title>Print Media Versus The Internet</title>
<description>This may be another one of those topics that I know least about - the impact of the Web on the use of print media in real estate. However, I'd love to get a better understanding from people that know a good deal about the use of magazines, newspaper ads, and other print media for advertising homes. I'm particularly curious about the relationship between print media and the Web today, and in the future.

The following dialog with a friend started me wondering about this tenuous relationship between the Internet and print media. Please read it and lend me your thoughts - my friend's comments are in quotes; my observations follow each one. I'm not suggesting any outcome is either good or bad - I'm just pondering what the future holds for print media.

"... most realtors have highly visual websites ..."

This is the stuff of Web 1.0 - an era when only humans trolled the web. As such, visually appealing sites that were designed for people and relatively simple in functionality met the exact requirements for businesses and their customers.

"Blogs aren't necessarily as visual as traditional websites."

Correct - the success attributes of a Web 2.0 site must factor in the machines (i.e., the crawlers, agents, bots, and every manner of non-human arbitration possible). To be successful they must be "visually" appealing to machines and humans, but machines have a different definition of "vision" - a definition that is largely based on business rules and content topology.

"If home buyers &amp; sellers are all searching online first and blogs generate SEO, then why would any realtor need print?"

Precisely the symptom of a coming disruption. When it becomes cheaper and more efficient to communicate without paper, paper will become irrelevant. Looking at it from another angle - when it becomes cheaper and more efficient to find and select homes without obtaining paper pamphlets, the internet will be most relevant.
It's easy to speculate that the tipping point is near - 77% of all real estate quests begin online. This might correlate with the slow but constant drop in print effectiveness we seem to hear about from time to time. But is the growth in online search really the harbinger we assume it to be?

We can see parallels in other industries. When was the last time you actually read a paper prospectus? Or stopped by your travel agent's office to select from a wide array of travel pamphlets? When was the last time you paid an accountant to use a pen to fill out your tax return? Print (and paper) is almost meaningless in these three business sectors. Why would the future of real estate be any different?</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Wed, 22 Nov 2006 15:09:06 -0800</pubDate>
      <link>https://activerain.com/blogsview/22183/print-media-versus-the-internet</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/20822/typo-squatting-and-misspell-squatting---bad-idea</guid>
      <title>Typo-Squatting and Misspell-Squatting - Bad Idea</title>
<description>At NAR this week a very nice realtor explained to me how many misspelled words she published to gain a competitive advantage in search - it was an impressive array of analysis and effort. It always amazes me how far someone will go to game the system to beat the competition - but who can blame them - they just want to get clicks.

I pointed out to this well-intentioned soul that this tactic is not sustainable and may soon become irrelevant. The very instant Google decides to correct search query misspellings automatically and/or browser companies leverage smart URL corrections [automatically], this tactic will start to work against you. Advice - when building *anything* on the web, always think about quality and sustainability.

Somewhat relevant... Typo-Squatting Infringes the Anti-Cybersquatting Protection Act

On September 1, 2006, a U.S. District Court in Wisconsin decided that the Defendants, a group of the Plaintiff's affiliates, acted in bad faith when they used typo-squatting to generate revenue from the Plaintiff's sales by linking to their commercial website. In the case of Lands' End, Inc. v. Remy, the Defendants were accused of acting in bad faith when they attempted to gain extra commissions from the Plaintiff's affiliate program via its website, http://www.landsend.com/. Lands' End sued under the Anti-Cybersquatting Consumer Protection Act, [15 U.S.C. §1125(d)] ("ACPA"). The Defendants argued that they did not act in bad faith; however, the court did not agree.</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Wed, 15 Nov 2006 09:11:42 -0800</pubDate>
      <link>https://activerain.com/blogsview/20822/typo-squatting-and-misspell-squatting---bad-idea</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/18992/blogs-are-woefully-underpowered-for-real-estate</guid>
      <title>Blogs are Woefully Underpowered for Real Estate</title>
<description>I've started to see a pattern - the real estate industry is very data-centric; this is to say that most conversations and activities in real estate are based on information - lots of it, and very discrete in nature. For example, when helping a buyer, data elements such as price, number of bedrooms, location, and distance to nearby public schools or college campuses are key elements of the discussion.

Blogs, however, are [by nature] very unstructured. A typical blog tool supports just two fields of information - a title and a text post field. Some support keywords, maybe categories, and a few support more discrete ideas like link collections that support MIME types (i.e., declaring, for example, that a link is a specific type of link like a podcast). But for the most part, even these more advanced capabilities are woeful underachievers - they lack the power to create more discrete information that is more findable and more relevant.

Structured blogging is an alternative and perhaps the next evolutionary step for business blogs in general. It's not a new idea, but it is a good one. It's simple to understand as well - imagine a weblog that has fields for title, body, and keywords, but is also embellished with these additional fields:

Property address
# of bedrooms
Price
Financing options
Homeowners dues
Proximity to beach
Nearest college

These are each attributes that describe a property with greater clarity. As you all know, it's very easy to describe a property in a blog post's text area, so why would you want to go to the trouble of entering this information in a more structured format? The same question was asked in 1995 when businesses "marked up" their HTML ecommerce pages with data including such things as product names, prices, warranty, availability, shipping cost, etc. The reasons for avoiding this extra work seemed numerous, but by 1998 most companies had recognized that separating data elements from the presentation code provided great benefits.
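To make the idea concrete, the extra fields above can be pictured as a simple structured record sitting alongside the free-form post. This is only a hypothetical sketch (in Python); the field names and values are invented for illustration and don't reflect any particular blog platform:

```python
# A hypothetical structured blog post: the free-form body is kept,
# but the discrete facts live in labeled fields (name-value pairs).
listing_post = {
    "title": "Charming 3-bedroom near the beach",
    "body": "Free-form property description goes here...",
    "keywords": ["beach", "family home"],
    # Structured fields (all values are made up for illustration):
    "property_address": "123 Example Ave",
    "bedrooms": 3,
    "price": 195990,
    "financing_options": ["conventional", "FHA"],
    "homeowners_dues_per_month": 120,
    "proximity_to_beach_miles": 0.5,
    "nearest_college": "Example State University",
}

# A machine can now answer discrete questions without parsing prose:
print(listing_post["bedrooms"])        # 3
print(listing_post["price"] > 150000)  # True
```

Because the price and bedroom count are labeled fields rather than words buried in a paragraph, a crawler or web service can filter and compare listings directly instead of guessing at the meaning of free text.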
It's much easier to determine the meaning of information if it is not embedded in text. Blogs will follow a similar progression because businesses want to use blogs for specific use cases - a blog post about an event, a post about someone important, or an article about hotels with the best golf deals. In this context it would be a blog post related to specific real estate data like price, location, etc.

Tagging - the idea of adding keywords to your posts to effectuate greater discovery through search and social networks - is a similar concept. By tagging documents in a specific way (such as Technorati's tagging model), you make it easier for these new Web 2.0 systems to find the tags on your page. Adding structure to your blogs is no different from tagging, with one exception - keyword tags are a single dimension - they are just words describing what your post is about. Structured blogging is multi-dimensional - it allows you to establish name-value pairs - this is techno-mumbo-jumbo for "data fields" much like you would find on a form. A name-value pair for a home's selling price would be "Selling Price: $195,990".
The name is "selling price" and the value is "$195,990".

Some of the reasons you might want to "tag" posts with real estate-specific name-value pairs include, but are not limited to, this list:

All posts with real estate listing information are automatically uploaded to Google Base (Google Real Estate).
All posts that contain real estate property data are searchable without seeing hits from posts that have no property-specific content.
RSS readers that understand the usefulness of property-specific fields can leverage that information to provide an improved customer experience.
Property-specific data is delivered to search engines as more discrete information about the post, thus making it more recommendable for data-specific searches.
Property-specific data can be leveraged more easily by web services such as Zillow, Trulia, and Edgeio.

These are just a few examples of why structured blogging will soon provide additional benefits to real estate blogging. Currently there are no blog tools that support structured blogging natively. There are a few plugins listed for WordPress and Movable Type here. We (at MyST) started experimenting with structured content integrated through web services processes - our first such attempt was for eBay store data, which has provided insight, but not significant success as yet. I think this is a useful idea for real estate, but I'd like to hear your thoughts.</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Mon, 06 Nov 2006 02:43:54 -0800</pubDate>
      <link>https://activerain.com/blogsview/18992/blogs-are-woefully-underpowered-for-real-estate</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/18556/fear-not-disintermediation--it-s-customer-relevance-you-should-focus-on</guid>
      <title>Fear Not Disintermediation; It's Customer Relevance You Should Focus On</title>
<description>In a separate thread (Open MLS for the Public but still No Complete MLS for Brokers) over on Mitchell Hall's AR blog, I saw a couple of comments that led to a comment that led to this post. Make sure you read the entire post and comments - it got me thinking.

"Selling a home is not an easy task. Most people prefer to have a broker do it for them." -- Mitchell Hall

This, I agree with. It's likely to be much harder than it looks. I'm just an ordinary consumer when it comes to real estate buying and selling. But I'm also aware that there are aspects of transactions that are foreign to me, and there are potential pitfalls. As such, I'm not likely to "go it alone" when it comes to selling my home. However, I am likely to do some of the tasks associated with buying my next home. It doesn't matter what those tasks are for this conversation - suffice it to say that there are slivers of activity that I might be more qualified to do than a real estate agent, and at less cost. Hold that thought...

Transparency and Disintermediation

There are other market parallels that have survived in the face of rampant transparency and [some] disintermediation. Many people still have stock brokers, and some people still pay slightly more to have a travel professional. But in these cases [where transparency has reshaped entire industries], one thing is abundantly clear - historical transaction costs have fallen. It doesn't mean stocks or travel cost less - it just means market topologies have been reshaped; certain aspects of industries that were once very relevant are now irrelevant.

But it's important to note that the Internet's ability to rapidly disintermediate entire business sectors is really just a capability that may or may not be a good idea for any given market.
The outcome (or forecast outcome) - i.e., what may happen because of this capability and why transparency leads to irrelevance - is the more important aspect that the real estate industry should pay attention to.

Transparency is no Guarantee of Disintermediation

Transparency *may* lead to disintermediation, but it may not; it depends on how consumers react to the pervasive availability of information in markets where the information has traditionally been inaccessible. My view is that the degree to which disintermediation actually impacts any business segment has little to do with how transparent the information becomes, and mostly to do with how irrelevant business providers become under the new light cast through increased transparency.

Disclaimer - I'm certainly not qualified to predict what will happen in real estate, but I have a hunch that there are two dimensions of transparency that may each create irrelevance independently and to different degrees:

The pervasive access to information relating to the selling of a home;
The pervasive access to information relating to the buying of a home.

Unlike, say, the travel industry - where we consume airline seats for a specific period of time but rarely sell one, save the occasional bump in exchange for some economic advantage - real estate is a market where we both buy and sell. Stocks are similar - we buy them and sell them. And even in this multi-dimensional market of activity, the stock buying and selling industry has been significantly reshaped.

"New Yorkers are not looking to dis-intermediate." - Mitchell Hall

I agree with this comment as well - no consumer wants to disintermediate. But all consumers want to optimize wealth, which is synonymous with controlling costs.
Regardless of how Mitchell shapes this statement, transaction costs will probably fall - the only question remaining is: who will benefit?

The real estate industry should not confuse the threat of disintermediation with the risk of becoming irrelevant to its customers. While your industry may seem like it's under attack by companies like Google, Trulia, and Zillow, these are simply artifacts of market forces that are attempting to seek lower transaction costs. Some of these ideas will fail, and some will succeed. But as I said earlier, transaction costs will most certainly fall - not because consumers believe you are overcharging them, but because transactions can be completed at lower cost. The businesses that manifest ways to lower transaction costs will be the likely beneficiaries, just as the businesses that create (or sustain) relevance in the new market topology will also emerge as highly competitive leaders.

I believe that real estate's challenge is to identify its own future relevance given the likelihood that greater information transparency will reshape some (perhaps many) of the present transaction inefficiencies. Could the answer be that simple? ;-)</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Sat, 04 Nov 2006 02:22:34 -0800</pubDate>
      <link>https://activerain.com/blogsview/18556/fear-not-disintermediation--it-s-customer-relevance-you-should-focus-on</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/18197/traffic--schmaffic---you-re-delusional</guid>
      <title>Traffic, Schmaffic - You're Delusional</title>
<description>We live in a traffic-centric world populated by very intelligent people that blindly assume traffic is the answer to everything, and that traffic metrics are the only thing we should worry about when measuring our online marketing strategies. This is a delusional attitude, and lots of business people suffer from it.

It's good to have traffic, but it's not an indicator of success in your core business objective - UNLESS, of course, your core business objective is to create clicks on banner and AdSense ads. Most realtors are not trying to earn a living three pennies at a time with banner clicks. Traffic (in and of itself) is almost as irrelevant as the number of people that happen to glance at your sign while trying to see what suite number their dentist is in.

Like cholesterol, traffic comes in two forms - good traffic and bad traffic. Driving traffic to your weblog based on relevant conversations that occur through natural and common interests is good traffic. Traffic acquired through participation in a link farm only adds to your delusions. Spending a lot of time trying to squeeze more clicks through to your blog or website is (for the most part) a fabrication of traffic, and it serves only to feed your sickness.

Most "bloggers" (i.e., people that blog all day) would have you believe that you must establish a large audience and high (and always growing) traffic numbers to be successful at business blogging. This is true if you want to be a "blogger". But I've learned that the vast majority of business people don't want to become bloggers - instead, they simply want to benefit from participation in the blogosphere. Traffic will naturally find your voice through long tail queries (search referrals on things you write about) and through participation in the blogosphere. But it's possible to leverage blogs successfully without spending a fortune in time doing it.
The traffic you earn naturally is the traffic you [really] want - these are the people that are looking for exactly what you know or what you sell.

I recommend a balanced diet of metrics because our websites and blogsites are used interchangeably to build brand, create awareness, promote products, sell stuff, and - oh yeah, my favorite - provide a high tech place where someone can grab the phone number for your circa-1968 technology known as the FAX machine. ;-) The variety and types of complex interactions that come from blogsites and websites are too diverse to lump into one measure that incompletely and inadequately describes business performance. Your traffic-centric sickness has convinced you that a click to your contact page about buying a home is of equal importance to a click to download a Rolling Stones ring-tone. Please... [sigh]

You can escape the traffic affliction by looking at three basic business metrics for the last 12 months. As an exercise, try this:

Divide your gross sales commissions by the number of page hits in each month - put the values in a column chart.
In a separate comparison column, plot your page hits by month.
In a third column, plot your gross sales commissions.

These three data elements will tell you many things about traffic - you're all bright people - I don't need to explain what sales-to-hits tells you. Some of you will quickly conclude that traffic is meaningless when measured against sales. In fact, some might discover that traffic is rising rapidly while sales remain the same. And some of you might find a positive correlation - good for you! You might not be delusional! Don't get too excited -- correlations are often not causal, so investigate deeper.

My advice -- take a chill pill on traffic and focus on measuring things that actually tell you something.
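For anyone who prefers code to a spreadsheet, the three-column exercise above can be sketched in a few lines of Python. The monthly figures below are made-up placeholders for illustration only; substitute your own hits and commissions:

```python
# Made-up monthly figures for illustration only.
monthly_page_hits = [1200, 1500, 2400, 3000]
monthly_commissions = [9000, 9000, 12000, 9000]  # gross sales commissions

# Column 1: commissions per page hit, month by month.
sales_per_hit = [c / h for c, h in zip(monthly_commissions, monthly_page_hits)]

for month, ratio in enumerate(sales_per_hit, start=1):
    print(f"Month {month}: {ratio:.2f} dollars of commission per page hit")
```

In this fabricated example, hits more than double while commissions stay roughly flat, so the sales-per-hit ratio falls from 7.50 to 3.00 - exactly the "traffic rising while sales remain the same" pattern to watch for when you plot the three columns.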
Here are some examples that are blog-centric:

Ratio of comments to total blog posts
Ratio of comments to total blog page hits
Ratio of contact-me page hits to total page hits
Ratio of total page hits to sales
Ratio of pages indexed to sales
Ratio of blog page hits to property search requests
The number of new names in your email archive, month over month

When tracked month over month, these metrics will tell you new things about your blogging effort and its relationship to specific business activity. I suspect you could create some new and creative measures for your web site as well.

Get well. ;-)</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Thu, 02 Nov 2006 07:14:24 -0800</pubDate>
      <link>https://activerain.com/blogsview/18197/traffic--schmaffic---you-re-delusional</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/18093/more-on-google-s-supplemental-index--yellow-band-secondary-index-</guid>
      <title>More on Google's Supplemental Index (Yellow Band/Secondary Index)</title>
<description>In case you missed it, there was a good debate ongoing in the numerous comments of a recent blog post by Mary McKnight. The debate closed in on a basic question - is there more than one class of index in Google? And if so, if you have a larger percentage of pages in the "secondary index", would it affect content discoverability? The answer is clearly yes - read on.

I've done a little more research and found that the idea of a yellow band (as coined by Mary), or secondary index (as suggested by me), is more commonly known as the "supplemental index".

"Hey, pages get added to the supplemental index using automatic algorithms. You can imagine a lot of useful criteria, including that we saw a url during the main crawl but didn't have a chance to crawl it when we first saw it. Think of this as icing on the cake. If there's an obscure search, we're willing to do extra work with this new experimental feature to turn up more results. The net outcome is more search results for people doing power searches." - GoogleGuy, Aug 27, 2003 (this is the first indication Google started experimenting with the supplemental index)

"As Google explains it, it’s a question of priorities. Supplemental results have a secondary priority. So they’re spidered less frequently and may well have less information held about them in the database. Google says that the PageRank is unaffected [by the supplemental index]. Currently there seem to be few supplemental results showing in typical keyword searches. That suggests to me it’s better to do what it takes to get your web pages into the regular index and avoid the supplemental index." - Barry Welford (Supplemental Results - A Word to the Wise)

I noticed on a forum that one webmaster grappling with the supplemental index wrote: "With a casual inspection I could see that all these pages in the supplemental were the php based dynamic URLs.
Google does not seem to index them and though they are linked to high pagerank pages, they can not get out of the supplemental. So the only way to reduce such instances is to rewrite your applications which generate the dynamic URLs and make them search engine friendly."

This is untrue. Blogsite is 100% dynamic and our customers average more than 90% of all their blogsite pages in the primary index; they achieve this by doing nothing special - they just blog. We believe our high rate of success is related to the architecture of our presentation layer (i.e., the way our platform generates HTML). Not many folks realize it, but the MyST platform (the foundation of Blogsite and Real Estate Blogsites) was designed for knowledge management and high search optimization.

Shimon Sandler offers a list of reasons why pages get shoved into the supplemental index:

You have little unique text on your webpages (maybe a lot of images, and little text)
Duplicate content
Your Title and Description meta tags are all identical
Your pages have similar header, sidebar, and footer sections
Your pages are dynamically generated from a database
Possibly most of your links are reciprocal links (not one-way incoming links)
Orphaned web pages, which are pages that no one links to, including yourself

Many of these points suggest (although not conclusively) that architectural issues concerning your HTML affect your ability to avoid the supplemental index. This seems to corroborate what we see with Blogsite.

The best evidence and overview of the supplemental index can be found at SEO Adept. "The supplemental index is not a good place for your pages to be, as pages in the supplemental index have almost no chance of ranking for good keywords." - Staying Out of Google's Supplemental Index

SEO Adept also offers these tips to help you get those pages out of the supplemental class:

Make sure that your pages have enough content - extremely short blog posts and other very brief pages sometimes end up in the supplemental index.
Make sure that your pages have unique content, both from each other and from other pages on the Internet.
Make sure that no one is duplicating your pages elsewhere on the Internet. You can run a search on some of the unique phrases in your page to see if other pages may be similar.
Try to acquire more and better links to your supplementally indexed pages.
Try to get keywords that people are searching for into the anchor text of links coming from authoritative, similarly themed pages.

These tips all make sense, of course - nothing new here. What *is* new (to me anyway) is that the supplemental index is apparently quite real, and avoiding it is an important success factor in terms of your online marketing strategy. Given this understanding, I'm going to continue to use the ratio of pages in the primary index to total pages in the index as a measure of index penetration success. This seems to be an excellent measure of blogging success because blogging already does a good job of addressing many of the [apparent] reasons that pages get supplementalized.</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Wed, 01 Nov 2006 15:59:53 -0800</pubDate>
      <link>https://activerain.com/blogsview/18093/more-on-google-s-supplemental-index--yellow-band-secondary-index-</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/17777/free-beer--free-coffee-if-you-can-find-me-at-nar</guid>
      <title>Free Beer, Free Coffee If You Can Find Me At NAR</title>
<description>I'll be at NAR this year, which is uncommon - I typically frequent technical trade shows, not places where actual business people must make real livings. ;-) We have other business units in Technology, Public Relations, Venture Capital, and Automotive (CES, my favorite), and of course Real Estate, so my schedule is constantly being tugged in different directions. However, this NAR thing looks like a blast.

In any case, here's an offer... If you can find me at NAR and I'm not busy speaking or involved in something I have no control over, I'll buy you a beer or coffee or any other fine beverage that happens to be reasonably affordable. Our CFO cut me loose with a Starbucks card with $500 on it and the corporate Visa. Yee haaa! Dang, I'm feeling thirsty already.

Seriously - I'm at NAR this year to hear about your industry and learn as much as I can. I have some answers as well, and I'm happy to spend time with anyone that has an interest in sharing, debating, educating, or just a good conversation about industry trends.

You'll be able to find me easily at a few venues - I think my official blog-related panel session is Saturday morning, 9am. I believe (not sure of the time/venue) Real Estate Blogsites (this link to them will have more content next week) is sponsoring a free blog session/panel/whatever topic that happens to come up, early Sunday morning in the convention hall. It's by invitation only - visit the REB booth 2060 to get an invitation card. Everyone's invited, including the folks at Blogging Systems and RSS Pieces. I thought they said free coffee and food - not sure on that either.

I'm staying at the Marriott/Convention Center - feel free to pick up any house phone and ring me.
I'm up by 7am and usually awake until 2am.You'll likely find me in here, here, or here and maybe here when I'm not at the convention hall.You can also contact me via my Treo MSN IM at bfrench@myst-technology.com - and my cell is 970-389-3126 and my email address is bfrench@myst-technology.com.I think I have some booth dudty assignments as well, but I'll be doing my best to ditch those, so come by the booth (#2060) and pull me away whenever you can.If you're thirsty for answers or beer (or both), hunt me down - I need lots of excuses to avoid working. ;-)</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Tue, 31 Oct 2006 07:25:23 -0800</pubDate>
      <link>https://activerain.com/blogsview/17777/free-beer--free-coffee-if-you-can-find-me-at-nar</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/17765/keywords-and-tags---do-less--get-more</guid>
      <title>Keywords and Tags - Do Less, Get More</title>
      <description>I see blog posts everywhere with really long lists of keyword tags, in hopes of ranking high for all of them. Burying your posts under more than a half dozen tags is really dumb because it actually hurts your ability to obtain better search recommendations. It sends a signal to indexing crawlers that the post is about a lot of different subjects, and that just makes it harder for the engines to determine a highly relevant category for your information. Think about it - the best ranks are achieved by blog posts (and content) that are about very focused subjects. Indeed, if you write an article about one thing, you are many times more likely to see a good ranking for the subject of that post. Keywords are equally effective at zeroing in on a specific subject - the more keywords you have, the more likely your post covers many subjects instead of just one. This is such an important issue that we instrumented our own Blogsite system to flag any post with more than ten tags in it. We actually send out a quality assurance notice when authors cross this arbitrary boundary. Here's a good rule - if you can't tag your post with five terms or fewer and feel comfortable that you hit the high points, then your post is probably not focused enough. Try doing less work - you might be surprised at how much more effective your weblog will be. ;-)</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Tue, 31 Oct 2006 06:48:18 -0800</pubDate>
      <link>https://activerain.com/blogsview/17765/keywords-and-tags---do-less--get-more</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/15971/clicktrack-claims-business-blog-visitor-s-22--more-likely-to-engage</guid>
      <title>ClickTracks Claims Business Blog Visitors 22% More Likely To Engage</title>
      <description>What impact does your blog have on your business web site? This was a question recently addressed by a company with some degree of bias, but it seems to have done a good job of analyzing the data for at least one of its clients. I don't think they have an axe to grind on the blog-versus-no-blog decision; rather, they sell tracking services. The approach they used looks reasonable - you can read the details here. The suggestion that their blog readers are 22% more likely to sign up for their newsletter was interesting, but the average time spent on the site seems much more valuable to me - "Even I was surprised by the results—people who read the blog spend about 60% more time on the site than people who didn't." -- ClickTracks Customer. This corresponds to something I've been saying for a long time - if you want to get a lot of visitors, you probably should be concerned with the number of pages in the [primary] search index. Likewise, if you want to get prospects, you probably need to be concerned with the amount of time they spend looking at your content.</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Sat, 21 Oct 2006 09:01:00 -0700</pubDate>
      <link>https://activerain.com/blogsview/15971/clicktrack-claims-business-blog-visitor-s-22--more-likely-to-engage</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/15939/are-long-tail-search-terms-identifiable-</guid>
      <title>Are Long Tail Search Terms Identifiable?</title>
      <description>I recently noticed this comment in AR... "... I know that I also rank for most long tail searches." I don't buy this argument at all. I'm not suggesting this person is a liar; I just don't believe that anyone can know what the long tail comprises for any subject, even the narrowest of subjects. My research indicates that the long tail is as fleeting as the conversation you are likely to have at a diner on Route 66 in Santa Fe, New Mexico five years from now. It is virtually unpredictable, and when inspected (in hindsight) it can be both shocking and surprising. This is the feeling many of you have commented about when inspecting the hundreds of unique phrases that caused people to be referred to your web sites. I've learned that long tail search queries are typically formulated much like impulse buyers make purchases. You're sitting in front of a television and you see a story about distressed sail boats after a recent hurricane. The news reporter indicates that every marina in Tampa has an excess of damaged boats at very cheap salvage prices. You jump on Google and you type "sail boats hurricane damage sale tampa". You get some really good hits, not because anyone [knowingly or purposefully] optimized their website for this - indeed, the storm just passed, and SEO doesn't work that quickly. Instead, a marina in Tampa that sells sail boats had written many articles about avoiding hurricane damage - they rank #1 and they get first crack at the business. I recently wrote about Blogging for Organic Visibility vs PPC Campaigns and I was surprised at some of the calls and emails I received. Here's a small sample of the ways this post ranks - almost all of which were completely unexpected.
I found these phrases by simply experimenting with some of the words in the post.
Hidden Side of PPC - #1
Google Blogging vs PPC - #5
MSN Blogging vs PPC - #3
AOL Blogging vs PPC - #1
AOL Blog Visibility vs PPC - #1
Blogging for Organic Visibility - #1
PPC Campaigns Freakonomics - #1
PPC Freakonomics - #3
organic vs ppc ranking opportunities - #5
blog vs ppc ROI - #8
Of all the terms I was able to find by experimentation, the next few took me totally by surprise. How did I find them? They are sitting in my server stats plain as day. ;-) These long tail terms actually caused new visitors to discover what I had to say on the subject in the first half of October 2006, and all related to the one blog post mentioned above.
click percentage organic results vs ppc - #2, #3 (2 clicks)
organic vs ppc percentage google - #2 (1 click)
check google visibility - #3 (1 click)
comscore AOL - #3 (1 click)
Blog Commercial Real Estate (1 click)
real estate training (5 clicks)
The strangest ones I happened to notice while scanning the tail -
why is it not smart to take out an interest only home mortgage? (1 click)
related:www.mortgagenewsdaily.com/mortgage_fraud/report_Kansas.asp (1 click)
internet visibility - #1, #2 (3 clicks)
If I really looked at 100% of the tail, I estimate I could find about 30 clicks over 15 days related to this one post. Imagine how many clicks I get from the tail with 350 posts in this blog channel alone. Do the math - it's more than 20,000 search referral clicks per month from the long tail alone. But what's really interesting - short tail clicks amount to less than 10% of the long tail clicks. That number is determined by removing all search referrals on phrases that generated five clicks or less - i.e., the more "popular" terms, which total less than 2,000 referrals per month. With data like this, should we continue to focus 90% of our SEO energy on 10% of the attainable target?
This is somewhat of a rhetorical question, of course - there are many reasons the answer should be yes, but I'd like to hear yours. If you believe you have a way to know exactly what the long tail will be at any point in the future, please share. In the meantime, I'm going to continue to attack the long tail of things that relate to my business by simply publishing good domain expertise. ;-)</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Sat, 21 Oct 2006 05:35:49 -0700</pubDate>
      <link>https://activerain.com/blogsview/15939/are-long-tail-search-terms-identifiable-</link>
    </item>
    <item>
      <guid>https://activerain.com/blogsview/15741/rankings-up--traffic-down--search-users-are-getting-smarter</guid>
      <title>Rankings Up, Traffic Down? Search Users are Getting Smarter</title>
      <description>A real estate agent recently mentioned to me that even though he's still at the top of Google for common terms about his local real estate market, his traffic is down significantly. What would explain this? How could this be possible? All indications are that his market (a southwest US city) is growing rapidly and the number of new residents moving there is climbing faster than in other cities. One of the explanations I've come to understand about search dynamics is that the general population of search engine users has changed significantly over the last 8 years - essentially, consumers are now better conditioned and know how to use search engines with greater efficiency. Many of us now know that to get better results from a search query we must use more focused terms, and typically more terms. Think about how you use Google versus someone who is less adept. Experienced search users have matured to be more efficient with search. Those of us who are search power users (i.e., anyone who has used Google for, say, more than five years) don't realize how efficient we really are. Furthermore, as advanced Internet users we [mistakenly] assume that less experienced Internet users continue to do things the way all newcomers to Internet search do things. This is simply not the case - almost all search users are traversing much the same learning curve that we more experienced users have followed - they are learning (with every query) that they can manipulate the results to get exactly what they want. The shift is ever so subtle - it's almost impossible to detect, but it is there nonetheless. Armed with the ability to find exactly what they want, is it possible that your target audience of searchers is simply skipping over your high-ranking terms that aren't so specific and getting right to the information they want? [read more...]</description>
      <dc:creator>Bill French (MyST Technology Partners)</dc:creator>
      <pubDate>Fri, 20 Oct 2006 02:57:43 -0700</pubDate>
      <link>https://activerain.com/blogsview/15741/rankings-up--traffic-down--search-users-are-getting-smarter</link>
    </item>
  </channel>
</rss>
