Brian Cryer said:
running GOOD sites through the validator can be very
useful - because if you are unlucky (and I was once) an error in your markup
may stop a bot from crawling most of the page even though it renders fine in
the browser.
It's also not a bad way of seeing if a web design company knows their stuff -
if there are lots of errors, avoid that company.
If a company makes websites that look nice and are good to use for the
majority of people, the number of validator-reported mistakes matters
less, of course, than the individual severity of those mistakes
(errors are not all equal).
In reality, a company that produces error-prone sites can do well, and
often does, if the consequences of those errors go largely unnoticed by
regular visitors and, especially, by the site owners.
The number of formal mistakes is hardly a useful criterion for a
website owner who has little knowledge of the nuts and bolts (hence
the need to get someone else to build the site), because such people
have no sense of which errors actually matter.
Those who commission sites should, I agree, note validator reports as a
quick cluefulness indicator. But the more important thing to do
afterwards is to stress-test the sites: in different browsers and
operating systems, and while simulating reasonable variations in
eyesight (at minimum, with different text size settings). They should
also check whether the navigation system is logical, and so on - all
the things that escape formal validators.
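For what it's worth, the validator spot check itself is easy to script. Below is a
minimal sketch, assuming the W3C Nu HTML Checker's documented GET interface at
validator.w3.org/nu/ (the doc and out=json parameters); the example URL, the
User-Agent string and the error/warning split are just illustrative, and of course
it says nothing about navigation, readability or any of the other things above.

    import json
    import urllib.parse
    import urllib.request

    CHECKER = "https://validator.w3.org/nu/"

    def check_page(url: str) -> dict:
        """Ask the Nu checker to validate the page at `url`, return its JSON report."""
        query = urllib.parse.urlencode({"doc": url, "out": "json"})
        req = urllib.request.Request(
            f"{CHECKER}?{query}",
            # the service prefers a meaningful User-Agent; this name is made up
            headers={"User-Agent": "markup-spot-check/0.1"},
        )
        with urllib.request.urlopen(req, timeout=30) as resp:
            return json.load(resp)

    if __name__ == "__main__":
        report = check_page("https://example.com/")  # substitute the site to check
        messages = report.get("messages", [])
        errors = [m for m in messages if m.get("type") == "error"]
        others = [m for m in messages if m.get("type") != "error"]
        print(f"{len(errors)} errors, {len(others)} warnings/info messages")
        for m in errors[:5]:  # show the first few errors with their line numbers
            print(f"  line {m.get('lastLine', '?')}: {m.get('message')}")

Something like that gives a quick error count to glance at; judging which of those
errors actually matter is still a job for a human.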