c99 and the lack of warnings when int operations are applied to a bool

Keith Thompson

Jens Gustedt said:
On 04/27/2012 08:28 PM, Keith Thompson wrote:

> I don't see any good reason to forbid conversions *to* bool.

Remember that the whole point of this hypothetical language is that it's
reasonably similar to C, but with no concern for compatibility, and
removing features that *I personally* dislike. It's not so much a
proposed replacement for C as it is what C might have been like if I had
designed it myself, knowing what I know now.

Which makes this of questionable topicality, but perhaps it can provide
some insight into the design of C as it actually is.

By not having implicit conversions to bool, you don't have to define the
semantics of such conversions. I find the convention that zero is false
and anything else is true to be arbitrary and unnecessary. (I learned
Pascal before I learned C, and I spent a number of years programming in
Ada. Neither of them has such a convention, and both, IMHO, demonstrate
that a language can get along just fine without it.)
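
For reference, this is what the C99 convention actually says (a quick
illustration of the rule I'd be dropping):

    #include <stdbool.h>

    void demo(void)
    {
        bool b1 = 42;        /* any nonzero value converts to true  */
        bool b2 = 0.0;       /* zero of any scalar type gives false */
        char *p = "hello";
        bool b3 = p;         /* a non-null pointer converts to true */
    }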

bool (or Boolean, or ...) is a type with values false and true.
Comparisons yield bool results. Conditions must be of type bool. It
requires a little extra typing now and then, but I personally find such
a design cleaner than C's handling of conditions (with type _Bool tacked
on years after the language was originally designed).
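
In other words (a sketch of the hypothetical language, not valid C):

    bool done = (count == limit);   /* comparisons yield bool        */
    if (done) { ... }               /* fine: the condition is a bool */
    if (count) { ... }              /* error: count is not a bool    */
    if (count != 0) { ... }         /* fine: explicit comparison     */
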
> While I can imagine the confusion that was at the origin of this
> thread (using bool in arithmetic), the converse rules for how to
> interpret numeric and pointer types in a boolean context are nice
> and clean.

Ok, but I find "if (num != 0)" and "if (ptr != NULL)" much clearer than
"if (num)" and "if (ptr)", respectively. For one thing, the more
verbose versions make it more obvious what kind of value is being
tested, if it has a name that's not as obvious as "num" or "ptr".
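
For example, with a name that doesn't telegraph its type:

    if (result) { ... }          /* a count? a pointer? who knows */
    if (result != 0) { ... }     /* clearly numeric               */
    if (result != NULL) { ... }  /* clearly a pointer             */
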
> Also I don't think that the current conversion rules for integer
> types are ideal, since they are inconsistent between the
> preprocessor phase and the rest of the compilation. Why not just
> have all integer arithmetic happen in intmax_t or uintmax_t (and
> let the optimizer do a bit of work if it can prove that smaller
> types are sufficient)?
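
The inconsistency is real; here's a concrete instance of it (assuming
a 32-bit int):

    #if (1 << 40) > 0       /* #if arithmetic is done in intmax_t or
                               uintmax_t, so this is well defined,
                               and true */
    #endif
    long long x = 1 << 40;  /* here the shift happens in 32-bit int:
                               undefined behavior */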

Interesting idea, but you could get some performance glitches as small
changes in an expression alter the optimizer's ability to use a smaller
type. Imagine a small tweak to an expression affecting the speed of an
inner loop by a factor of 10 or so. intmax_t and uintmax_t can be very
inefficient, and in an expression as simple as "x + y" it can be
impossible for the optimizer to prove that it won't overflow.
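
Even a trivial expression shows the problem. A sketch, assuming 32-bit
int and 64-bit intmax_t:

    #include <stdint.h>

    int mid(int lo, int hi)
    {
        /* under the "all arithmetic in intmax_t" rule, lo + hi can
           never overflow, so when the mathematical sum exceeds
           INT_MAX the 64-bit quotient differs from what a narrowed
           32-bit add would produce; the optimizer cannot legally
           use a 32-bit add here */
        return (int)(((intmax_t)lo + hi) / 2);
    }
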
> I don't see the value of this either. The problem in C is not with
> 0 (and other values) as a null pointer constant but with the
> underspecification of NULL.

The problem is that I dislike the use of an int constant 0 to denote a
null pointer value. It contributes to the confusion about the
relationship between integers and pointers. The comp.lang.c FAQ,
<http://c-faq.com/>, has an entire section on null pointers; in my
hypothetical language, that section could probably be reduced to one
question, if that. (And section 9, "Boolean Expressions and Variables",
would probably also be a lot smaller.)
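
A taste of what that FAQ section has to untangle:

    char *p = 0;           /* legal: the constant 0 is a null
                              pointer constant */
    char *q = (void *)0;   /* also legal, and what NULL often
                              expands to */
    int zero = 0;
    /* char *r = zero; */  /* constraint violation: a variable whose
                              value happens to be 0 is not a null
                              pointer constant */
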
> Having a catch-all initializer for all data types is quite
> convenient; I would miss that.

I agree; being able to write:

some_type obj = { 0 };

and have it work for *any* type is very nice. I can think of ways to
get similar semantics in my hypothetical language, but this is getting
too long already.
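
For anyone who hasn't run into the idiom, it really does cover
everything:

    struct point { double x, y; };
    struct point pt = { 0 };   /* both members zeroed        */
    int arr[100]    = { 0 };   /* all 100 elements zeroed    */
    int n           = { 0 };   /* even scalars accept braces */
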
> this would be a good one, why not call it nullptr to keep it in
> sync with C++? And in current C an identifier that is forced to
> the value (void*)0 would perfectly play that role.

My hypothetical language explicitly rejects compatibility with C; why
should I care about staying in sync with C++? 8-)}

I think C++ used the name "nullptr" to avoid conflicting with
identifiers used in existing code. I like the name "nil" (and there's
precedent for it or for "null", in other languages).
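
And as Jens says, current C can already approximate it (a two-line
sketch; pick whatever name you like):

    #define nil ((void *)0)   /* an identifier forced to (void *)0 */
    char *p = nil;
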
> I personally wouldn't like that one.

Fair enough. I do.
> I'd add another thing to that list: unprototyped functions. They
> are particularly bad because of the lack of pointer conversion for
> their arguments.
>
> One could enforce an implicit conversion rule to void* for them,
> with the consequence that passing an `int const*` to an unspecified
> parameter wouldn't compile at all, or just forbid unprototyped
> functions completely. I would prefer the latter, I think, with the
> consequence of also banning va_arg functions.

I hadn't thought about that particular feature, but yes, old-style
function declarations and definitions would definitely be on the
chopping block. They've been "obsolescent" in C since the 1989
ANSI standard.
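
To make the hazard Jens describes concrete (the function names here
are just for illustration):

    void with_proto(const void *p);  /* prototyped: arguments are
                                        converted to the parameter
                                        types */
    int  old_style();                /* unprototyped: arguments only
                                        undergo the default
                                        promotions */

    void g(int *ip)
    {
        with_proto(ip);  /* fine: int * converts to const void * */
        old_style(ip);   /* ip is passed as an int *; if old_style
                            is defined to take, say, a void *, the
                            call is undefined behavior on any ABI
                            where the representations differ */
    }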

I wouldn't ban variadic functions without having a good replacement
for them. printf is error-prone, but very powerful. (C++'s use of
an overloaded "<<" operator is nice, but the fact that specifying
a numeric output format changes the state of the stream is *ugly*.
(But I digress.))
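
The error-proneness is easy to demonstrate:

    #include <stdio.h>

    int main(void)
    {
        printf("%s\n", 42);  /* wrong type for %s: undefined
                                behavior; most compilers warn about
                                a literal format string, but with a
                                non-literal format they can't check
                                anything at all */
        return 0;
    }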

And for those who dislike my hypothetical language, I don't think
there's much chance that I'll ever get around to specifying it, let
alone implementing it.
 
