If you could change the C or C++ or Java syntax, what would you like different?

Keith Thompson

Seebs said:
But it can be used as one, since the jump from type-name to named type
is unambiguous.

No, a type-name cannot be used as a type. A type-name is the name
of a type, used to *refer* to a type.

Keith is a person. "Keith" is not a person. "Keith" is a name that
refers to a person. They are very different things. The question
How many letters are there in "Keith"?
makes sense, and has a definitive answer. The question
How many letters are there in Keith?
does not make sense (unless I've been going to the mailbox for lunch).
Right. And the thing is... If we can say that "int is a type", then we
can just as accurately say that "foo is a type". They're both actually
names that refer to a type, but it turns out that you can simply use the
type name to refer to the type when talking about code, just as you
can in the code.

Of course. That's what names are for.
It's like identifiers.

int x;

This declares a variable, x, of type int... but wait! Actually, x isn't a
variable; it's the name that identifies the variable. And it's not of
type int. It's of the type denoted by the name int.

x is a variable. "x" is its name.
int is its type. "int" is the name of its type.

(The quotation marks indicate that I'm referring to fragments of C
source code, not C string literals.)
But it's completely unconfusing to just say "x is a variable of type int"
instead of "the name x denotes a variable of the type denoted by int".

It's not
The name x denotes a variable of the type denoted by int.
It's
The name "x" denotes a variable of the type denoted by "int".
I would argue that the abstraction exists only during compilation, and
types don't exist at runtime.

Hmm. Ok, that's probably a fair point. Further thought may be required.
Ahh, but here's where we get into fuzziness. That entity's name isn't "int"
any more than it's "signed int". Or, after a typedef, "foo".

What? Why not?

That entity (a type) has multiple names: "int", "signed int", and "foo".
I think that's a thing which is only situationally important. I've
never before this conversation had any reason to say "the type which
is denoted by the name int" rather than "the type int". And, similarly,
I've never once needed the phrase "the type which is denoted by the
name size_t".

Ok -- but isn't this very discussion a situation where it's important?
I think it's not so much that there's not a distinction there, as that
you can pretty much ignore that distinction while programming, and you
only really need it when trying to explain the "under the hood" parts
of typedef. The rest of the time, "x is a variable of type int" is good
enough.

I agree that "x is a variable of type int" is good enough. My point is
that size_t and <something> (where <something> might be unsigned int,
unsigned long, or something else) *are the same type*.
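
To make that concrete with the foo example from earlier (a minimal sketch; the
function and variable names are only illustrative):

    typedef int foo;        /* "foo" is now another name for the existing type int */

    void demo(void)
    {
        int n = 0;
        foo *pf = &n;       /* no diagnostic: foo and int are the same type */
        int *pi = pf;       /* likewise in the other direction */
        (void)pi;
    }

A conforming compiler accepts this silently; the typedef introduced a new name,
not a new type.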
 
Seebs

No, a type-name cannot be used as a type. A type-name is the name
of a type, used to *refer* to a type.

Sure, but there's no way to interact with types in C except via type
names. So any time we talk about needing a "type", what we really need
is a type name.
Keith is a person. "Keith" is not a person. "Keith" is a name that
refers to a person. They are very different things. The question
How many letters are there in "Keith"?
makes sense, and has a definitive answer. The question
How many letters are there in Keith?
does not make sense (unless I've been going to the mailbox for lunch).

But because it doesn't make sense, people will automatically disambiguate
by hopping to the next nearest question, and say "five".
x is a variable. "x" is its name.
int is its type. "int" is the name of its type.
(The quotation marks indicate that I'm referring to fragments of C
source code, not C string literals.)

The thing is... I'm not sure it's really true that x is a variable.
I think the variable and the name stay separated, just as the type-name
and the type do.
It's not
The name x denotes a variable of the type denoted by int.
It's
The name "x" denotes a variable of the type denoted by "int".

Ah-hah!

Okay. So consider: We can simplify that to:
The name "x" denotes a variable of type int.

And everyone seems to agree that this short-hand is perfectly reasonable.

Now...
size_t z;

The name "z" denotes a variable of type size_t.

Behold! While "size_t" is a type name, size_t is a type.
What? Why not?
That entity (a type) has multiple names: "int", "signed int", and "foo".

Right. So it's not clear to me that any one of those merits the title of
"the" name.
Ok -- but isn't this very discussion a situation where it's important?

Sort of! It's a discussion where it's important to think about that
distinction in order to determine whether the distinction is important
to the programmers we're hypothetically talking about who may or may
not be confused by the claim "typedef defines types".
I agree that "x is a variable of type int" is good enough. My point is
that size_t and <something> (where <something> might be unsigned int,
unsigned long, or something else) *are the same type*.

Okay, so if I concede this, we now have the useful statement:

size_t is a type.

So when typedef defines the type-name "size_t" as referring to that particular
type, it does something as a result of which it becomes meaningful and
communicative to claim that size_t (not quoted) is a type now. But it
wasn't before the typedef.

So something happened, before which you couldn't talk about "a variable
of type size_t", and after which you could talk about "a variable of
type size_t". So that's pretty much the same as saying a type was defined;
not a "type" in the sense that we're using when we say that
size_t and some other type denote the same "type", but a "type" in the
sense that we're using when we say "z is of type size_t".

It is a shorthand, I agree. It's just that it's a shorthand we use so
consistently and so commonly with other types that I don't think it's a
problem to use it for "types" that were defined by using typedef to
define new typedef names.

-s
 
Joshua Maurice

Another way of looking at it:  The definition is always in the dictionary
sense, the thing that really tells you the specifics of what something is,
not just roughly what kind of thing it is.  For typedef names and
enumeration constants, you can have exactly one per name per translation
unit.  For objects and functions, you can have exactly one per name per
final program.

(Inline functions being an exception to that last rule.)
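
As a rough sketch of the object/function rule (the file names are hypothetical;
the object may be declared in any number of translation units, but defined
exactly once in the final program):

    /* file1.c */
    extern int counter;                         /* declaration: may appear in many TUs */
    int get_counter(void) { return counter; }

    /* file2.c */
    int counter = 0;                            /* definition: exactly one per program */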
 
Keith Thompson

Seebs said:
Sure, but there's no way to interact with types in C except via type
names. So any time we talk about needing a "type", what we really need
is a type name.


But because it doesn't make sense, people will automatically disambiguate
by hopping to the next nearest question, and say "five".

The thing is... I'm not sure it's really true that x is a variable.
Hmm?

I think the variable and the name stay separated, just as the type-name
and the type do.

Right. When I say x without quotation marks, that's a shorthand for the
entity to which the name "x" refers. So x is an object.

(I seem to recall someone arguing that the word "variable" refers to the
name rather than to the object. Since the standard defines the word
"object" but not "variable", I suggest we stick with the term "object".)
Ah-hah!

Okay. So consider: We can simplify that to:
The name "x" denotes a variable of type int.
Yes.

And everyone seems to agree that this short-hand is perfectly reasonable.

Now...
size_t z;

The name "z" denotes a variable of type size_t.

Behold! While "size_t" is a type name, size_t is a type.
Yes.

Right. So it's not clear to me that any one of those merits the title of
"the" name.

Oh, that's what you meant. "size_t" is *a* name of the type. It's not
*the* name, because the type has multiple names. (In fact the number of
names is infinite if you ignore implementation limits on line and
identifier length.)
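
A quick sketch of what I mean (the chain of names here is arbitrary; each
typedef simply adds another name for the same one type):

    typedef int    alias1;
    typedef alias1 alias2;
    typedef alias2 alias3;   /* ...and so on, bounded only by implementation limits */

    void demo(void)
    {
        int    *p = 0;
        alias3 *q = p;       /* no diagnostic: alias3 still names the type int */
        (void)q;
    }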
Sort of! It's a discussion where it's important to think about that
distinction in order to determine whether the distinction is important
to the programmers we're hypothetically talking about who may or may
not be confused by the claim "typedef defines types".

I'd really like to nail down the language-level concepts as a
prerequisite to clearing up any confusion about the concepts built
on top of those concepts. Of course we can do both, but let's not
mix them up.
Okay, so if I concede this, we now have the useful statement:

size_t is a type.
Ok.

So when typedef defines the type-name "size_t" as referring to that particular
type, it does something as a result of which it becomes meaningful and
communicative to claim that size_t (not quoted) is a type now. But it
wasn't before the typedef.

So something happened, before which you couldn't talk about "a variable
of type size_t", and after which you could talk about "a variable of
type size_t".

Right. What happened was that a new type-name was created. Nothing
changed about the type itself.
So that's pretty much the same as saying a type was defined;
not a "type" in the sense that we're using when we say that
size_t and some other type denote the same "type", but a "type" in the
sense that we're using when we say "z is of type size_t".

It is a shorthand, I agree. It's just that it's a shorthand we use so
consistently and so commonly with other types that I don't think it's a
problem to use it for "types" that were defined by using typedef to
define new typedef names.

But it's *still* important, if you want to understand the language,
to know that size_t and <something> (where <something> is some
(standard or extended) unsigned integer type) are names for the
same type. You might not make use of that fact when writing correct
and portable code.

And even if you don't think it's important to remember it, it's
still true.
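
For what it's worth, here's a small sketch of how that plays out when printing
a size_t (assuming a C99 printf; the point is that portable code never names
the underlying type):

    #include <stdio.h>
    #include <stddef.h>

    int main(void)
    {
        size_t n = sizeof(long);
        /* printf("%lu\n", n);      not portable: size_t need not be unsigned long */
        printf("%zu\n", n);                 /* C99: %zu prints a size_t             */
        printf("%lu\n", (unsigned long)n);  /* or convert explicitly                */
        return 0;
    }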
 
Seebs

Right. When I say x without quotation marks, that's a shorthand for the
entity to which the name "x" refers. So x is an object.

Okay.

So why does it seem surprising to do the same thing with size_t?
Oh, that's what you meant. "size_t" is *a* name of the type. It's not
*the* name, because the type has multiple names. (In fact the number of
names is infinite if you ignore implementation limits on line and
identifier length.)

Pretty much.
I'd really like to nail down the language-level concepts as a
prerequisite to clearing up any confusion about the concepts built
on top of those concepts. Of course we can do both, but let's not
mix them up.

Fair enough.
Right. What happened was that a new type-name was created. Nothing
changed about the type itself.

But there's two separate things we can mean by "the type". When we
say "size_t is a type", we are talking both about "the type denoted
by size_t" and "the denotation of that type by the name size_t". Sort
of.

Basically, there's a thing you can do before which size_t is not a
type and after which size_t is a type; it strikes me as totally
reasonable to say that we just defined the type size_t.
But it's *still* important, if you want to understand the language,
to know that size_t and <something> (where <something> is some
(standard or extended) unsigned integer type) are names for the
same type. You might not make use of that fact when writing correct
and portable code.
Yup.

And even if you don't think it's important to remember it, it's
still true.

True. I just don't see any conflict between that and the claim that
the type size_t was defined. Outside of objects and functions, "defined"
doesn't mean "newly created", I don't think.

Basically, I'm pretty sure that "define a type" is notational shorthand,
but since what it denotes is pretty easy to figure out, it's a reasonable
shorthand. Again, I've never heard of anyone being surprised by the
discovery that the newly defined type isn't a different type from the
previously existing one.

-s
 
Keith Thompson

Seebs said:
But there's two separate things we can mean by "the type". When we
say "size_t is a type", we are talking both about "the type denoted
by size_t" and "the denotation of that type by the name size_t". Sort
of.

The latter is certainly not what I mean by "the type".

And if we call "the denotation of that type by the name size_t"
"the type", then the statement in C99 6.7.7p3:

A typedef declaration does not introduce a new type, only a synonym
for the type so specified.

is, at least arguably, false.
Basically, there's a thing you can do before which size_t is not a
type and after which size_t is a type; it strikes me as totally
reasonable to say that we just defined the type size_t.


True. I just don't see any conflict between that and the claim that
the type size_t was defined. Outside of objects and functions, "defined"
doesn't mean "newly created", I don't think.

Basically, I'm pretty sure that "define a type" is notational shorthand,
but since what it denotes is pretty easy to figure out, it's a reasonable
shorthand. Again, I've never heard of anyone being surprised by the
discovery that the newly defined type isn't a different type from the
previously existing one.

I've seen it flatly denied in this thread.
 
Seebs

The latter is certainly not what I mean by "the type".

But it's arguably the thing that's defined by a typedef.
And if we call "the denotation of that type by the name size_t"
"the type", then the statement in C99 6.7.7p3:
A typedef declaration does not introduce a new type, only a synonym
for the type so specified.
is, at least arguably, false.

For that meaning, yes.

I am pretty sure that the standard uses the word "type" in two separate
modes, and does not clearly distinguish them. Similar to the way that
the standard explicitly says that sometimes an identifier is used to
denote the thing denoted by the identifier.
I've seen it flatly denied in this thread.

In a different sense, most likely.

That's the thing. It's obviously different in that it is clearly Bad
Code to pass a size_t to a printf using %lu. But I've never seen anyone
who was surprised by the information that, in some cases, they were
"the same" in that the compiler would regard pointers to them as
compatible pointer types.

In short, I'm pretty sure that every time I've seen someone claim that
it's a "different type", they've been talking at a different level of
abstraction -- I've never seen someone come in complaining that the
compiler didn't catch the bug when they passed a foo * to a function
expecting int *.

-s
 
Felix Palmen

* Joshua Maurice said:
I guess there's not much more I can do on this argument, though. I
think it's an affront to good practice in any discipline to start
mixing inconsistent ontologies in a single document. The C standard
should be defined in terms of a single ontology, preferably one which
makes sense to the programmer.

Indeed, that would be much better, so the way typedef works could be
considered a quirk in C.
The programmer should not need to adopt
a different inconsistent ontology to be a good programmer. In fact, I
don't think you do. It works perfectly well just to think that "The
definition of size_t is unspecified and dependent on implementation.
It's likely a typedef for an unsigned integer type. However, as it's
unspecified, don't rely on any particular implementation." There's no
need to think that "size_t is a distinct type",

Just compare the complexity. The latter is
- what it really SHOULD mean (and, by the way, is there any statement in
the standard that forces size_t to be a typedef to an integer type?)
- much more comprehensible

In fact, I know what's probably going on "under the hood" when using
types like 'size_t' and I still prefer to think of them as distinct
types.
and I think that's a
very bad way to think about it because it implies certain untrue
things, such as:
unsigned int * x;
size_t * y = x;
is guaranteed to produce a diagnostic (specifically a type error) on a
conforming implementation.

I disagree. There are lots of ways to write logically incorrect code in
C that aren't needed to be diagnosed according to the standard. GCC goes
a long way trying to diagnose them anyway, given appropriate command
line arguments. I don't know whether it can diagnose this one and can't
test it right now, but generally, a C programmer just knows he can't
rely on the compiler issuing warnings about any logical error.

Regards,
Felix
 
Ike Naar

But for things like enum constants and typedef names, they're unique for
each translation unit, [...]
For typedef names and
enumeration constants, you can have exactly one per name per translation
unit.

Not sure what you mean here.
The (valid) translation unit below uses the typedef name
`mytype' more than once.

int main(void)
{
    { typedef double mytype; }
    { typedef long mytype; }
    return 0;
}
 
Jon

Nick said:
typedef, sizeof and offsetof all look pretty compound to me

I'm open to it; I spoke before I considered it, but I immediately saw it
as better than 'typedef'. (Time will tell if "typealias" shows up in a
new language.)
 
Jon

Nick said:
only C programmers. Ada for instance creates new types.


argument by analogy is often flawed. A word is not a type.

Argument by strawman or "both sides of one's mouth" is "flawed". How old
are you? How long have you been a C programmer? And you *still* don't use
the terminology of *the standard*? "Something's gotta give".
 
Jon

Nick said:
being raised by lawyers I learned at an early age to spot shift of
subject used as a debating tactic. If defeated on one field, move
fields.

"noted". That makes you.... what? A lawyer (apparently you rejected
that)? A linguistic mime in your own promotion? A snake-oil salesman (for
what company? your own?).

All questions rhetorical.
 
Jon

Nick said:
I think he was trying to avoid admitting that he was initially
mistaken in his assumption as to what typedef did. I remember being a
bit disappointed to find that typedef didn't define distinct types.
typedef defines a new name for an existing type. You can sloppily
contract that to "it defines a type" but I think it's a slop too far.

I think a lot of C programmers just assume it has to be that way. And
probably didn't realise that struct actually defines distinct types.

I think "old" people assume too much.
 
Joshua Maurice

Indeed, that would be much better, so the way typedef works could be
considered a quirk in C.

I still think you're not getting me. Let me try it like this. The C
standard and the programming community at large have a very clear
definition of distinct types and what it means to specify a new type.
typedef does not do that. Anyone who says that typedef specifies new
types is inventing his own ontology which is inconsistent with the C
standard, the C type system, and type theory at large, and I think
it is horrible to confuse the terms like that.
Just compare the complexity. The latter is
- what it really SHOULD mean (and, by the way, is there any statement in
  the standard that forces size_t to be a typedef to an integer type?)
- much more comprehensible

Lots of places require that size_t is an unsigned integer type, such
as 7.17 Common definitions <stddef.h> / 2.

However, nothing requires that size_t is a typedef name. However
again, the recommended practice according to the C standard is that
size_t is defined as a typedef, see 7.17 Common definitions <stddef.h> / 4.
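
For illustration only, an implementation's <stddef.h> might contain something
along these lines (the particular underlying type here is hypothetical; the
standard only requires some unsigned integer type):

    /* hypothetical fragment of one implementation's <stddef.h> */
    typedef unsigned long size_t;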
In fact, I know what's probably going on "under the hood" when using
types like 'size_t' and I still prefer to think of them as distinct
types.

You're welcome to think of them as distinct types, but you would be
technically incorrect. What really gets me is that this confusion of
terms may impede learning by new people about what a type is, what a
type system is, and so on. In fact, it leads to fun conversations like
this one where people still insist that typedef specifies new types
where it most clearly does not.

It's also a huge pet peeve of mine when people redefine terms because
the redefinition is appealing. The truth of a statement is entirely
independent of its utility if true, and of its desirability as true.
It might be nice if size_t were a distinct type, and we might want size_t
to be guaranteed to be a distinct type, but it's not.
I disagree. There are lots of ways to write logically incorrect code in
C that aren't needed to be diagnosed according to the standard. GCC goes
a long way trying to diagnose them anyway, given appropriate command
line arguments. I don't know whether it can diagnose this one and can't
test it right now, but generally, a C programmer just knows he can't
rely on the compiler issuing warnings about any logical error.

That's a rather bleak picture of C, and also incorrect. There are lots
of kinds of errors which are undefined behavior, no diagnostic
required. However, simple type errors like the one above are not among
them. Assigning a pointer to an incompatible pointer type is a constraint
violation, so a diagnostic is required. gcc will diagnose the following,
as will any C compiler at all worth using:
int main()
{
    short *x;
    int *y = x;
}

Just tested it now. Comeau online compiler gives a fatal error. gcc
4.1.2 with no options gives a warning. gcc 4.1.2 with -pedantic-errors
gives a fatal error.
 
Jon

BartC said:
LOL

(Also consider this is the first time I've ever used 'LOL' in 15
years of internet...)

Well I gotta start somewhere. "right", bitch? (There *is* life after C
after all?).
 
Jon

Rui said:
Particularly when "we kids" just so happen to be right.

Well don't get too excited and do stupid stuff on that. Research: "youth
is wasted on the young", and hopefully that will calm you down. (LOL, it
won't! I'm telling jokes to myself!).
 
Seebs

But for things like enum constants and typedef names, they're unique for
each translation unit, [...]
For typedef names and
enumeration constants, you can have exactly one per name per translation
unit.
Not sure what you mean here.
The (valid) translation unit below uses the typedef name
`mytype' more than once.

Whoops, right you are. Forgot about scope.
int main(void)
{
    { typedef double mytype; }
    { typedef long mytype; }
    return 0;
}

That's a brilliant example, thanks.

-s
 
Keith Thompson

Joshua Maurice said:
However, nothing requires that size_t is a typedef name. However
again, the recommended practice according to the C standard is that
size_t is defined as a typedef, see 7.17 Common definitions
<stddef.h> / 4.
[...]

I've read 7.17p4 several times. This is the second time you've
made this claim, and it just isn't true.

(It's worth noting that that paragraph does not exist in the C99
standard; it was added by one of the Technical Corrigenda, and
appears in N1256, which is close enough.)

But that paragraph does not mention "typedef", nor does the word
"typedef" appear anywhere in 7.17. (In fact, the *only* occurrences
of the word "typedef" in section 7 are in 7.18 (<stdint.h>),
and in 7.26.8, which refers to <stdint.h>.)

Here's 7.17p4:
The types used for size_t and ptrdiff_t should not have an
integer conversion rank greater than that of signed long int
unless the implementation supports objects large enough to make
this necessary.

That's just about *which* types should be used; it neither says nor
implies anything about using typedef that wasn't already implied
by the previous paragraphs.

7.17p1 says "The following types ... are defined", and 6.7.7, which
describes typedef, is titled "Type definitions". And if that's not
enough to imply the use of typedef, there is no other construct in
standard C that could define wchar_t, ptrdiff_t, and size_t in a way
that would satisfy the requirements. (#define wouldn't do the job;
strictly conforming code may use those identifiers in inner scopes.)
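
A minimal sketch of why a macro wouldn't do (assuming, hypothetically, an
implementation that picked unsigned long):

    #include <stddef.h>

    void f(void)
    {
        int size_t = 0;   /* fine: size_t is not reserved at block scope, and a
                             block-scope object may hide the file-scope typedef */
        (void)size_t;
    }

    /* Had the header said "#define size_t unsigned long" instead of a typedef,
       the declaration above would expand to "int unsigned long = 0;" -- a
       syntax error. */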

Yes, the types "defined" in <stddef.h> are almost certainly typedefs.
No, paragraph 4 doesn't recommend that.
 
Seebs

Yes, the types "defined" in <stddef.h> are almost certainly typedefs.
No, paragraph 4 doesn't recommend that.

Man, now you've got me wondering.

Imagine, if you will, an implementation which has generic multi-precision
integers available.

And imagine that <stddef.h> were to do:
__INT_TYPE(36) size_t;
thus defining size_t to be a 36-bit type... Assuming this complied with
the integer conversion rank rules, so far as I can tell, this would be
completely legit. There wouldn't be any other type compatible with it,
and potentially it wouldn't even create a compatible type if you wrote:
__INT_TYPE(36) my_size_t;
because it's up to them whether their custom type declaration magic creates
compatible types or not. :)

That said, it seems pretty obvious that the *intent* is that you can declare
these types with typedef, and that doing so "defines types", whatever the
heck that means.

-s
 
