James Kanze
* James Kanze:
I actually use wchar_t and have used wchar_t a lot.
It plays about the same rôle as int, the platform's "natural" wide
character type.
I guess part of the problem is that there is no generally
established convention as to what a "natural" wide character
type should support.
And it has the same problems: some things are more portable with wchar_t
(such as std::wstring for holding platform-specific wide character
strings), and some things are less portable with wchar_t (such as code
that assumes a specific encoding, or makes other size-dependent
assumptions).
And the same issues seem to hold for plain char.
To some degree. The difference is that when I use char, I
expect to be dealing with different encodings. The usual reason
for going beyond char is to have a specific encoding (Unicode),
and to avoid multi-byte encodings.
In the future, of course, we'll have char32_t.
On a platform with 16-bit char I imagine one wouldn't use
std::string for holding a narrow character string.
Sure you would. On a platform with 16-bit char, there's no
smaller type available.
From a practical point of view: there are three sizes of
characters: 8 bit, 16 bit and 32 bit. (8 bit also encompasses
smaller encodings, like 7 bit ASCII. But these aren't very
relevant today.) And you have to know which one you're dealing
with. char is guaranteed to be at least 8 bit, so it's fine for
that. (Not completely, since char can be signed, which
introduces some problems. But it's usable.) wchar_t is also
guaranteed to be at least 8 bits, which is useless. In
practice, wchar_t can probably be counted on to be at least 16
bits on most general purpose platforms, so it can probably be
used for the 16 bit characters, but the 16 bit ones are the ones
I use the least. But what about the 32 bit ones? I currently
use uint32_t.