What is the prevailing opinion on home warranties? Do you guys think they're worth it? I know some people think they're a good thing. I also know some people who think they're basically worthless, because they say they don't cover everything they claim to. I think offering a home warranty can be a good marketing tool, but whether the actual warranty is worth it...I can see both sides of the argument and I'm not sure myself. What's everyone's opinion?