BartC said:
This would be opening a can of worms (with inheritance and all that crap).
Inheritance? We're talking about C, remember.
If I were proposing this, I wouldn't change the general behavior of
numeric types (or at least that would be a separate proposal).
Somebody suggested a new keyword "newtype" that acts like typedef except
that the new type is actually distinct. Consider:
    typedef int newint;   /* existing typedef: newint is only an alias for int */
    newtype int newint;   /* hypothetical keyword: newint is a distinct type
                             (changing the existing semantics of typedef is not
                             a reasonable option) */
Suppose int and long happen to be the same size. Then int, long,
and newint are all distinct signed integer types with the same
representation.
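To make that concrete, here's a small example that compiles today (the newint
parts are confined to comments, since the keyword is purely hypothetical); it
shows that int and long are already distinct types even on an implementation
where they happen to share a size and representation:

    #include <stdio.h>

    int main(void)
    {
        /* On, say, an ILP32 or LLP64 implementation, int and long have the
           same size and representation, yet they remain distinct types.   */
        printf("sizeof(int) = %zu, sizeof(long) = %zu\n",
               sizeof(int), sizeof(long));

        int  i  = 42;
        long l  = i;          /* fine: implicit conversion between distinct types */
        int *pi = &i;
        /* long *pl = &i; */  /* constraint violation: int* and long* are
                                 incompatible even when int and long have
                                 the same representation                   */
        (void)l; (void)pi;

        /* Under the hypothetical "newtype int newint;", newint would be a
           third such type: same representation as int, but distinct.      */
        return 0;
    }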
Can you pass a newint to a function that takes an int? (Apparently not,
according to your other post).
Yes, just as you can pass a long to such a function.
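For instance (ordinary C; the newint behavior exists only in the comment):

    #include <stdio.h>

    static void take_int(int x)
    {
        printf("%d\n", x);
    }

    int main(void)
    {
        long l = 1234;
        take_int(l);    /* fine: with the prototype in scope, the long
                           argument is implicitly converted to int       */
        /* Under the proposal as described, a newint argument would be
           converted the same way.                                       */
        return 0;
    }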
Can you do newint+newint? If not, then this will make life difficult. If
yes, then it means sometimes newint is treated as int, and sometimes it
isn't.
Yes (otherwise there's not much point).
Yes; as for any "+" operation, the operands are converted to a common type.
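Again by analogy with long (the expanded conversions for newint are
hypothetical and appear only in the comment):

    #include <stdio.h>

    int main(void)
    {
        int  i = 10;
        long l = 20;
        long sum = i + l;   /* usual arithmetic conversions: i is converted
                               to long before the addition                  */
        printf("%ld\n", sum);
        /* Under the proposal, newint + newint would work the same way, and
           a mixed newint + int would likewise convert both operands to a
           common type.                                                     */
        return 0;
    }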
What type is the literal 1234?
int, just as it's always been.
Can one do newint=int without a cast?
Yes, just as you can do long=int.
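A quick illustration, again with long standing in for the hypothetical newint:

    #include <stdio.h>

    int main(void)
    {
        long l = 1234;   /* the literal 1234 has type int; the assignment
                            converts it to long with no cast               */
        printf("%ld\n", l);
        /* "newint n = 1234;" would likewise need no cast under the
           proposal, just as long = int needs none today.                  */
        return 0;
    }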
For numeric types, "newtype" can't create a new type with arbitrary
characteristics; it can only create a new type with the same
characteristics as some existing type. The "usual arithmetic
conversions" would have to be expanded slightly, so that a numeric
type created by "newtype" is implicitly converted to the predefined
type it's based on.
Given C's prolific implicit conversions between numeric types,
using "newtype" to create a new *numeric* type wouldn't be all
that much more useful than a typedef; it would matter mainly
for pointer-to-newtype types. (For example, you'd be guaranteed
that passing an int* to time() would be a constraint violation;
currently it's either a constraint violation or perfectly legal,
depending on the implementation.)
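For example (standard C; whether the commented-out call compiles really does
depend on the implementation's choice of time_t):

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t t;
        int    i = 0;

        time(&t);        /* always valid: time() takes a time_t*           */
        /* time(&i); */  /* compiles only if this implementation happens to
                            make time_t compatible with int; elsewhere it's
                            a constraint violation.  If time_t were created
                            with "newtype", passing an int* would be
                            rejected everywhere.                            */

        printf("%lld\n", (long long)t);   /* assumes time_t is an integer
                                             type, as it is in practice    */
        (void)i;
        return 0;
    }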
<OT>
Ada has predefined integer types similar to C's, but they're
distinct types, and there are no implicit conversions between
them; furthermore, the user can create new integer types.
It means you need a lot more explicit conversions than in C.
(Literals are of an anonymous "universal integer" type, which can be
implicitly converted.) Personally, I don't think that's a bad thing.
Most expressions probably shouldn't mix numeric values of different
types anyway; if you're doing that, it may be a sign that you should
have declared your objects more consistently.
</OT>
But I wouldn't suggest changing C in this way; it would break too
much existing code (and too many C programmers' minds).