My best *guess* would be 8, but I would hate to rely on that, because
it's sometimes wrong.
Sure. But at least I know for sure what a char *is*. I can't be as confident
that I know what someone meant when they called something a "byte".
I have used processors with a 16-bit "char", and no way to make an 8-bit
type (except as a bitfield). Nowhere in any of the documentation,
manuals, datasheets, or anywhere else was there any reference to a
"byte" that is not 8 bits. The documentation made clear that a /char/
was 16 bits rather than the usual 8, but those units were never called
"bytes".
I haven't used such devices much - but the vast majority of people who
use the term "byte" have never even heard of such devices, never mind
used them.
There are only two situations where "byte" does not automatically and
unequivocally mean "8 bits" - one is in reference to ancient computer
history (and documents from that era, such as network RFCs), and the
other is extreme pedantry. There is a time and place for both of these
- but you won't convince many people that you would ever /really/ think
a reference to a "byte" meant anything other than 8 bits.
(If you can give modern, or at least still-current, references to usage
of "byte" for something other than 8 bits, then I will recant and blame
the egg nog!)
A "char", as you say, has a well defined meaning - but not a well
defined size.