S.T. said:
> I worry about what the marketplace has specified, not a W3C decade-long
> adventure in producing a "Recommendation" that sometimes is, sometimes
> is not followed.
The marketplace specifies sites that work. The recommendations are
followed by the browser developers more often than not; and regardless,
they are all we have to go on, as the browser developers don't share
their error-correction algorithms.
> W3C is like the United Nations for geeks. A lumbering organization that
> periodically produces some documents that the world then decides
> whether or not to follow. What they say means nothing to my users.
> What my users see and interact with is what matters to my users.
But you can't judge your work by empirical evidence as you can't see
every browser/mode/configuration, past, present and future. To ensure
that your sites work (and continue to work) in the maximum number of
environments, validating your markup is an essential first step. After
that you need to consider what scripts you will allow between your
markup and your users. Just because you can't see something fail
doesn't mean it isn't failing (or won't fail) for somebody somewhere.
You start out with no scripts, which means there is nothing to foul up
your documents. With each script that you add, particularly if you do
not know _exactly_ what it is doing, you increase the likelihood of
pissed-off users.
> I don't care, at all, about any document or specification the W3C
> produces. I only care about what the market does (or doesn't do) with
> those specifications.
But you can't quantify that. Just be assured that the browser
developers do care about those documents, so they represent your only
solid clues as to what browsers can be expected to do. Trying to
observe browsers to determine what will fly is folly. You could more
easily make general predictions about the behavior of birds by observing
pigeons at the park.
> I see where you're coming from. It's not a bad practice by any means --
> and perhaps I should put more effort into it -- but I'm not too worried
> about it.
It shouldn't take more than five minutes to clean up the typical invalid
document. The bogus entities are some of the easiest to spot and
correct. You shouldn't even need a validation service to fix those, but
should always use one to check your work (another five seconds or so).
Only then can you stop worrying about the problem as it will no longer
exist.
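(To illustrate with a made-up link -- not one taken from any site under
discussion -- the bare ampersand below is the classic bogus entity, and
escaping it costs nothing, as the browser still requests the same query
string:)

<!-- Invalid: the bare & reads as the start of an entity reference -->
<a href="results.php?type=widgets&color=blue">blue widgets</a>
<!-- Valid: the same link with the ampersand escaped -->
<a href="results.php?type=widgets&amp;color=blue">blue widgets</a>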
> Put it this way: if a browser comes out and cannot successfully handle
> <a href="page.php?a=1&b=2">link</a> -- it's not going to have any market
> share to warrant consideration.
You have no idea what a browser (or other agent, search engine, etc.)
may do in the future, even those that already enjoy a healthy share of
the market. Look what happens to bad sites every time a new version of
IE comes out. In most cases it is the sites, not the browser, that are
to blame. I validate my markup and CSS, use sound scripts with
appropriate feature testing, and I can't remember the last time I had to
change a thing due to a new browser coming out (and contrary to your
previous assertion, I primarily work on very complicated sites and
applications). Coincidence?
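(A minimal sketch of the sort of feature test meant here -- check that
the API exists before calling it, rather than sniffing browser names or
versions; the click handler is just a placeholder:)

if (document.addEventListener) {
    // The standard event model is available, so it is safe to use it.
    document.addEventListener('click', function(e) {
        // placeholder handler
    }, false);
}
// If the test fails, fall back or leave the enhancement out entirely.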
> Again, I'm not advocating nesting blocks within inlines -- not a good
> practice. But should it occur, it's really not a big deal. For
> instance, if I have:
You are very quick to dismiss what you perceive as small issues. Why
not just do things right and avoid the cumulative effect of such an
attitude, which is invariably undesirable behavior (if not today, then
tomorrow; and if not in your browser, then in one of your users')?
> <span class="blue">
> We sell blue widgets cheap!
> </span>
> ... and decide, for SEO purposes, I want:
> <span class="blue">
> We sell
> <h2 style="display: inline; font-size: 1em;">blue widgets</h2>
> cheap!
> </span>
Search engines can't stand tag soup.
> ... I'm not going to panic.
Of course not; you should simply structure your documents appropriately
from the start and the search engines will love them. If you find
yourself trying to fool the search engines, you are doing something
wrong.
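(For the widget example above, doing it right from the start might
simply mean marking the phrase up as a heading in the first place --
assuming it really warrants one -- and styling that, rather than wedging
an h2 inside a span:)

<h2 class="blue">We sell blue widgets cheap!</h2>

If it isn't really a heading, a styled strong or em inside the paragraph
conveys the emphasis without breaking the content model.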
> Maybe I'll get around to restructuring outlying tags, but it won't be
> because I'm worried whether I'll pass W3C validation.
There's no reason to worry about _why_ you are doing it right. If you
simply do things right, everything else (e.g. search engines, disabled
visitors, oddball browsers) will take care of itself.
> An unlikely example. I'd agree it's best to avoid Hx tags inside spans,
Yes, of course it is. It's a very silly rookie mistake and I only
pointed it out as the author in question had puffed about being an
"expert" on markup and didn't like hearing that he wasn't. Perhaps
he'll learn something from this dialog. Or perhaps not if I am any
judge of human nature.
> but objected to a scathing condemnation of Dojo's site because they had
> a block inside an inline and had the audacity to allow CSS to ensure the
> user sees the intended effect.
That was one of dozens of rookie mistakes detailed.
> Suggesting they swap the absolute positioned span to an absolute
> positioned div is fine. Mocking them because they haven't bothered to
> was absurd.
It wasn't mocking. "I'm okay, they suck." Now _that's_ mocking.
> I don't know what all browsers do with valid markup. I know what the
> overwhelming percentage of browser visits to my sites do with my markup.
> That's the best I can do.
You can't possibly know that, and certainly whatever you do know in that
regard has a near-term expiration date, as new browsers (and browser
versions) come out constantly these days. Add another five minutes of
effort (and a bit more understanding) to your best.
> I have no delusions of my pages being future-proof, whether they
> validate or not.
All you can do is your best.
> It's worked for me for a very long time. That's history, not delusion.
In contrast, your position sounds very much like eschewing smoke
detectors because you have no delusions of your house being fireproof.
Doesn't make any sense, does it?
> I think anyone who believes their pages are future-proof because they
> validate on W3C is kidding themselves.
It's just one measure. Put the detectors in. Granted, your house may
still burn down.
> Not sure exactly what you're asking. Not sure how to dump the DOM.
Firebug is one tool that will let you inspect the DOM. Newer browsers
have similar tools built in.
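(If a quick one-off dump is all that's wanted, something like the
following -- in a browser that supports outerHTML and has a console --
shows the tree the browser actually built from your markup, error
correction and all:)

if (window.console && document.documentElement.outerHTML) {
    // Serializes the parsed DOM, not the source sent over the wire.
    window.console.log(document.documentElement.outerHTML);
}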
> If you're talking computed styles, I tested if an absolute positioned
> span was rendered 'display: block' on various browsers
Well, that was a waste of time, as you could have just read the specs
you seem so keen on avoiding.
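(The relevant rule is the table in CSS 2.1, section 9.7: 'position:
absolute' on an inline element forces its computed 'display' to 'block'.
A quick check in any browser that supports getComputedStyle should
simply agree with that -- a sketch:)

var testSpan = document.createElement('span');
testSpan.style.position = 'absolute';
document.body.appendChild(testSpan);
if (window.getComputedStyle) {
    // Per CSS 2.1, section 9.7, this should report "block", not "inline".
    window.alert(window.getComputedStyle(testSpan, null).display);
}
document.body.removeChild(testSpan);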
> http://jsfiddle.net/9xrYg/
> Also tested innerText/textContent on <span>a<h1>b</h1></span> to see if
> current browsers rendered it as <span>a</span><h1>b</h1>. Didn't appear
> to be the case -- always returned 'ab'.
Current browsers?