
(X)HTML - Use a full set of tags

By
Services for Real Estate Pros with eshowings.com

I guess I am a little amused by people trying to SEO their sites without thinking about the Optimization part of the equation.  They want their page to come up first in Google, but they don't really think about the rest of it.  All that is fine by me; to me it means that people are finally driven to care about the mess of HTML code that they are leaving behind.

Editors like this one, and FrontPage (the granddaddy of all hateful editors), have been leaving junk code behind them in volumes (petabytes per day, I am sure), with people happily adding their words, and as long as it "looks right on my screen" that is good enough.  Along comes Google, and they disregarded the meta tags.  And SEO was born.

I like SEO because the goal is to produce a page that can be read by a non-human agent!  If Google can make contextual sense out of a page, then maybe a reader like JAWS (used by the blind) will stand half a chance of wading through the mess in the back end.

To further this goal of making clean pages, I put my styles outside of my markup (in CSS files where they belong), I post a doctype (which is more than simply rendering instructions), and I use a FULL SET OF HTML TAGS.  Yes, and I will say it again, A FULL SET.  This includes the LABEL element, plus EM, STRONG, FIELDSET, THEAD, CAPTION, LEGEND, TH, TBODY, TFOOT, BLOCKQUOTE, DL, DT, DD, CODE, Q, BIG, BUTTON, SUP, SUB, PRE, ADDRESS and SMALL.  These very nice tags are simply ignored by my industry, and I don't know why.
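
For illustration, here is a rough sketch of a few of those tags in use (the form, the listing data, and the labels are all invented for the example):

```html
<!-- A hypothetical showing-request form and listings table, marked up semantically -->
<form action="/request-showing" method="post">
  <fieldset>
    <legend>Request a Showing</legend>
    <label for="addr">Property address</label>
    <input type="text" id="addr" name="addr" />
    <button type="submit">Send Request</button>
  </fieldset>
</form>

<table>
  <caption>Recent Listings</caption>
  <thead>
    <tr><th scope="col">Address</th><th scope="col">Price</th></tr>
  </thead>
  <tbody>
    <tr><td>123 Main St</td><td>$250,000</td></tr>
  </tbody>
  <tfoot>
    <tr><td colspan="2"><small>Prices current as of posting</small></td></tr>
  </tfoot>
</table>
```

A screen reader can announce the LEGEND along with each field, and the CAPTION/TH structure tells any agent, human or bot, what each cell means, which a pile of nested DIVs and Bs never will.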

Let me state my case.  Using a full set of tags increases the level of understanding that a rendering agent (like the GoogleBot) will have of your document.  If the goal of GoogleBot is to make sense of your document via contextual linking, then a full set of tags is a way for you to tutor GoogleBot about what portion of your document is important (your keywords).  And before any of the SEO guys tell me that only TITLE and H1 and B are important and read, I will remind you that the Google guys are UberGeeks and they care strongly about these things too.

Regards,

Jimmy McCall
JimmyMcCall.com - Cunningham, TN
The Ex-Mortgage Consultant
Gareth,  I didn't understand a single thing you said but it sounded good.  I guess I need to learn HTML now.  Thanks.
Mar 13, 2008 06:46 PM
Gareth Dirlam
eshowings.com - Bear, DE
Well Jim, thanks for commenting.  If you produce web pages, you probably should know enough HTML to look at the source and see if it's decent or not.  Or you should encourage your geeks to be diligent and do their job well.  Sorry, this post was a little bit of a rant.
Mar 13, 2008 11:46 PM
Bill French
MyST Technology Partners - Dillon, CO

Gareth:

No argument from me about the need to create highly competent HTML (and CSS) code. However, I will debate the need for business people to become highly skilled concerning these matters. It's my opinion that [most] business people don't want to become coders - but they certainly want to benefit from good code.

Businesses [by and large] don't think about building a good backhoe when they need to dig a ditch. For the same reasons, they probably should think twice before spending a lot of time and money becoming skilled HTML and CSS artisans. ;-) Jimmy's comment says it all - he doesn't understand a thing you said, but I'll bet he does understand how to provide great mortgage consulting advice. Does he earn more providing mortgage advice, or is there a chance he could earn more worrying about xHTML? I suspect it's the former. ;-)

I digress... How do business people benefit from good code without becoming involved in the development of good code?

One answer (of many) - use services that produce good code. As you know, this doesn't necessarily mean code that passes acid tests, or W3C compliance. The definition of good code is debatable until you add the requirement "effective". And even this term is slightly ambiguous. Effective at SEO? Effective human interface?

I digress further... it drives me nuts when I see real estate professionals "SEO'ing" their content.

This is absurd thinking. Anyone that believes they are doing themselves a favor by artificially transforming their content to sway a search engine is setting themselves up to have a large body of content that may not sustain high SEO benefits at all. Example - this post contains many comments that drift into this discussion but this particular comment describes precisely the problem - people believe they must exert SEO pressure on everything they write, and that's bad.

The alternative - use a platform that is designed for the science of networking which allows business people to do what they do best. ;-)

Mar 14, 2008 04:50 AM
Gareth Dirlam
eshowings.com - Bear, DE

First let me say thanks for taking the time to read the post.  I think you are at the heart of the matter.

I agree that business people do not need to know about the code, but I do think that business people should DEMAND high-quality tools, and need to know enough to DEMAND high-quality content that offers interoperability.  I remember HIPAA compliance in the medical industry, and I think the results are something that we can live with long term.  I would love to see more industries develop full XML standards for data interoperability.

I remember the early days of databases, where everything was stored in a "dog pile" and I/O streams were read out FIFO, grabbing the first line that met the criteria.  These days we use tools like SQL Server that make sure everything is set up correctly.  Maybe in 20 years the web will mature to the same level of automatic compliance.  This should improve 508 compliance, accessibility agents, and also interoperability with WAP clients.


Mar 14, 2008 06:50 AM
Bill French
MyST Technology Partners - Dillon, CO

Gareth:

"... need to know enough to DEMAND high quality content that offers interoperability."

Couldn't have said it better. ;-) However, business people are typically educated by vendors, and vendors will say just about anything to get a sale. How you address that issue is beyond me and the scope of my comments. But I agree - in a perfect world, assessing the performance of anything that generates code is a dicey subject. Some code is intentionally optimized for human consumption (i.e., Flex), while other tools might optimize for findability (i.e., Blogsite).

One approach for business people to take is to assess solutions based on performance. Largely these are word-of-mouth transactions, and the blogosphere is an ideal place to ferret out the losers and winners.

Another aspect of content quality is technical content quality. I've written about it here and here. If you are writing a business blog for strategic purposes, you need to think about the long-term effects of technical decay. Most tools won't help you plan for the day when you have 7,000 posts and 20% of them reference pages that are now gone or broken for whatever reason. That 20% defect threshold is likely to affect your visitors' experience and possibly search recommendations.

Much of this boils down to business requirements - consider them carefully before anything else. If you have a site built with traditional HTML tools and you're not getting good performance from an SEO perspective, it's time to look for alternative solutions.

Mar 14, 2008 07:55 AM
Gareth Dirlam
eshowings.com - Bear, DE

Did you say "Flex"?  I love Flex.

Once again, thanks for taking the time to write; lots of interesting stuff in that comment.

". . . and vendors will say just about anything to get a sale. How you address that issue is beyond me and . . ."  As a team leader and department head, I address the issue one developer at a time, by supporting web standards and making compliance with web standards compulsory in the "Best Practices" and "Standard Operating Procedures" of every group I lead.

I only got to do one project in Flex, but Flex was optimized for both back-end interoperability and human consumption.  I made a series of HTTP web services accessing a back-end SQL database, with Flex creating a series of XML files that used a just-in-time compiler on the web server, so that the individual content could be updated.  And the end product was a GREAT human (user) interface.  That was an AWESOME project, things making sense for a change.  I do believe that Flex is a lot more searchable than its Flash brethren.

In my little world, I think a site that can be read in JAWS (508 compliant) should be optimized for any reasonably competent spider-bot.  And if I can get to that level, I consider that I have performed my functions in a workmanlike manner.

If a page uses TITLE, headers, the title attributes on Hn's and anchors, and uses CAPTION, LEGEND, and LABEL (and to a lesser extent CITE, Q, and BLOCKQUOTE), I think that page gives the rendering agent hints on how to interpret it.  In my experience these very useful items are woefully lacking on the web, and instead we get keyword stuffing (cite your excellent article here: http://activerain.com/blogsview/401191/Monitor-Your-Brand-Monitor ) in the inner text of anchors.

But the above citation is part of my problem with these CMS/blog interfaces: do I really have to go into the HTML code in order to get that marked up properly?  Why isn't there a contextual menu that allows me to highlight the text and cite the source?  If the CITE tag were used, then the rendering agent (maybe a spider, maybe a browser, maybe a screen reader) would be able to figure out that I was pointing at your article.  And don't even get me started on why I am italicizing your statements instead of <Q> inline quoting them.
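
As a sketch, that citation could have been marked up with Q and CITE like so (the framing sentence is invented, the link text is guessed from the URL slug, and the quote is Bill's line from earlier in this thread):

```html
<!-- An inline quote with Q, and a cited article title with CITE -->
<p>As Bill said above, <q>assessing the performance of anything that
generates code is a dicey subject</q>; his post
<cite><a href="http://activerain.com/blogsview/401191/Monitor-Your-Brand-Monitor">Monitor
Your Brand</a></cite> digs into the content-quality side of it.</p>
```

Now a spider, a browser, or a screen reader can tell exactly which words are quoted and which element names the source, instead of inferring it from italics.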

But still I like SEO, because it at least starts people thinking about their electronic documents as more than just how they appear in THEIR OWN BROWSER.

"Another aspect of content quality is technical content quality"  I had read these posts previously, but did not comment on them.  When I was writing a content management system, I had come to the supposition that Google penalized quite heavily for broken links.

"Much of this boils down to business requirements"  Of course you're right there; enough said.

"you need to think about the long-term effects of technical decay"  Now that is an interesting statement, and you're right, of course I need to.  Cruft ("lint and crumbs under the bed") is the old industry term for artifacts, both working and non-functional, that get left in an environment.  I like to say that every line of code has an associated TCO (Total Cost of Ownership), especially the lines that don't work.

Mar 14, 2008 11:49 AM
Gareth Dirlam
eshowings.com - Bear, DE

Sorry, missed one...

"assessing the performance of anything that generates code is a dicey subject"  But I have watched web tools go from hand-writing vector lines (x, y, slope and curve, in the early days of Adobe PostScript), to visual tools that produced garbage, to the new set of XML-based tools.  Flex is a great example, but so is Visual Studio 2005; these tools produce highly workmanlike output.

Regards,

Mar 14, 2008 11:55 AM
Bill French
MyST Technology Partners - Dillon, CO

Yep - I did say Flex. ;-) We're experimenting with it as an extension to our Captyx plug-in widget interface.

"Cruft ("lint and crumbs under the bed") is the old industry term for artifacts, both working and non-functional, that get left in an environment."

Yep - we're pretty anal about stuff like that; our entire platform is "lint-friendly". We don't allow users to create objects that cannot be tested for integrity. This is a fundamental aspect of our platform and a business requirement for most of our customers. Unfortunately, smaller businesses rarely understand why these things are important because they typically do not consider the importance of content in a strategic sense. Rarely do business people such as real estate brokers ponder or concern themselves with sustainable content bases.

The world is different now; traditional web sites grew to some arbitrary practical level to address the ecommerce and marketing functions of the business. Even businesses with large catalogs of products typically saw a ceiling that represented a stasis in total web pages. Enter the blogosphere and business blogs; the ceiling has suddenly vanished. We have a customer with 67,000 indexed pages in their blogsite, and they will likely have 200,000 pages by 2010. Blogsites must scale, and maintaining quality and serviceability is critical.

"... do I really have to go into the HTML code in order to get that marked up properly."

Um, yeah. ;-) You do, today. I agree - editors pretty much suck through and through. You should build a new one and make a few bucks.

Mar 14, 2008 04:03 PM
Gareth Dirlam
eshowings.com - Bear, DE

As always, Bill, thanks for commenting; I am starting to look forward to your insightful comments on the industry.  They have been giving me lots to think about.

--Nice work on that Automated Artifact Unit Testing...  Whoopee!!  That is a great business thought, and a thought that should make you money.  If software JUST WORKS for the user, then it should sell itself and will retain its customer base.

"You should build a new one and make a few bucks."  I built mine in 1999, but I could not sell it (lack of personal marketing/sales skills and capital).  But it was fun, and the experience taught me a lot - not least that I am a geek and not a salesman.

"...editors pretty much suck through and through..."  I think that these current GUI tools suck because of designMode=on on the text area element.  What I mean by that is, all of these use the rich-text-area functionality embedded in the browser, and really don't extend from there.  I so often see my industry with BLINDERS totally covering everything they see.  It's easy to make this box, so EVERYONE, and I do mean everyone, just does it the easy way.
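
To show what I mean, this is roughly all it takes to get that browser-native editing surface (a sketch; the element id is made up).  The hard part, semantic markup and context, starts after this line, and most tools stop here:

```html
<!-- A minimal rich-text editor: the browser does nearly everything here -->
<iframe id="editor"></iframe>
<script type="text/javascript">
  var frame = document.getElementById("editor");
  // designMode turns the whole frame document into an editable surface.
  frame.contentDocument.designMode = "on";
  // The era-typical command API; note that "bold" tends to emit
  // presentational markup (B or font styling), not semantic STRONG,
  // unless the tool's author deliberately maps it:
  // frame.contentDocument.execCommand("bold", false, null);
</script>
```

Everything beyond that, toolbars, contextual menus, clean output, is left to the tool's author, and most of them never get there.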

We (developers) have GREAT editors, but we give our customers sucky editors.  "As if they are undeserving of an adequate user experience because they don't understand what goes on under the hood" (<-- not my opinion).  Or are developers too lazy, as an industry, to reach for user interfaces that do not inconvenience our clients?  Or do we think that if we gave them more, they would be unable to understand it? (<-- if that were the case, then the answer would be to refine the editor until it made sense, not dumb it down).

But these editors are going to have to get better.  The next step will be highlighting (choosing an area of the page you want to work with) and a dock window with the editor embedded in the dock, so it can be moved around.  But designers will still have to get to the point where they care about the contextual nature of the text they are working with.  The program itself should ask, "Is this a quote?", not wait for the user to say, "Make this a quote."

I was working with some very sharp young Turks last year.  These guys really cared about text pattern recognition and advanced algorithms, and guess what, they were "only" web developers working in a design shop.  It will be a brave new world when that group gets a seat at the (industry) round table.

"Rarely do business people such as real estate brokers ponder..."  But that is what I like about SEO: it is making business people think about the digital structure of their offerings.  Of course the SEO industry is selling "silver bullets and BS", but eventually real solutions always win in the marketplace.  I mean really, what is going to win: "XML/XSLT and digital content management, with work queues and sign-off", or "SEO copywriting and shadow gateways"?  The marketplace always chooses winners in the end.  And it sounds like your crew is trying to build sustainable solutions, so you must be cognizant that meeting actual business needs wins over snake oil in the end.


Mar 16, 2008 03:45 AM
Bill French
MyST Technology Partners - Dillon, CO

"Rarely do business people such as real estate brokers ponder..." But that is what I like about SEO, it is making business people think about the digital structure of their offerings. 

Actually, on this point I disagree in part. It is certainly forcing people to think about SEO, but it is not leading them to an understanding (or any appreciation of an understanding) about smart content structure. Instead, most (the vast majority actually) follow the blind advice of people (mostly peers) that say you should keyword-stuff a signature field, participate in a link-farm, bold every tenth word, or write about subject "x" and title the page subject "y". This slight benefit of awareness is far outweighed by the eventual lack of performance and huge waste of time by business people that should consider using platforms which tend to provide healthy SEO as a by-product.

But business people (however aware they may be of the importance of SEO) continue to make poor choices about SEO with the help of well-intentioned peers, and less-than-well-intentioned vendors and consultants.


Mar 16, 2008 04:23 AM
Greg Fox
Realty World Wichita - Wichita, KS
Techy Broker in Wichita Kansas

Interesting article.  In the OLD days (3 years ago) I built all my sites in FrontPage because it was easy.  I still code things there (because I want to see how the HTML looks), and then paste the body in other places.

I went to a template site simply because I didn't want to work on the formatting side of SEO.  I did spend considerable time on the body text, keyword research, and the meta tags.  When I hear SEO firms say they'll get me on page one, for 1 or 2 terms, on the major search engines, it makes my blood boil.

I want the #1 searched term in my location, on Google, first page.  With 70% of searches (or so) going to the top result, everyone else falls off quickly.  "Wichita Kansas real estate" is searched 700 times a day, and 450 of those are in Google (followed by Yahoo, then MSN).  The next closest term gets half the daily searches.

We need to be educated to understand what we're being sold.  We need to learn, or hire people who know such things, or we'll spend thousands of dollars on little results...

Jun 14, 2008 02:38 AM
Bill French
MyST Technology Partners - Dillon, CO

Greg -

"Wichita Kansas Real Estate is searched 700 times a day, and 450 of those are in Google (followed by Yahoo then MSN).  The next closest searched term is 1/2 the daily searches."

True, but the most popular searches for real estate still only represent 3% to 5% of all searches for real estate in the Wichita Kansas area. Targeting a narrow slice of the short tail is good if you're happy with marketing to a very small percentage of the entire market.

Popular terms are (by definition) the first terms typed by most searchers. However, actionable decisions are generally made at the end of the search process. Maybe it would be better to target the least popular terms as well - i.e., what was the last search query a user typed?

The last query indicates two probable outcomes - (i) they found what they were looking for and stopped, or (ii) they gave up looking.

bf

Jun 14, 2008 03:48 AM