If you could change the C or C++ or Java syntax, what would you like different?


Nick Keighley

Well, as this applies to nearly anything you have written in response,
I'll just put it here:

You can either pretend you're a compiler and don't know anything more
abstract than your grammar (and the standard defining it) -- or you can
just apply some human abstraction to understand the /intention/ behind
C's language constructs and ask yourself in that context whether typedef
is a good name or not.

My decision is clear: the name is perfect.

I'm not sure you've grasped the distinction between distinct types and
aliased types

int main (void)
{
struct Apple {int i;};
struct Orange {int i;};
typedef struct Apple Pear;
struct Apple apple = {0};
struct Orange orange = {0};
Pear pear = {0};

apple = pear; /* this is ok */
apple = orange; /* this isn't */

return 0;
}

apples and oranges are different types but apples and pears aren't (in
this program)
 

Nick Keighley

Just try -- for a second -- to realize there is no confusion at all.

assuming for a moment that black is white...
There *is* confusion - and I suspect you are one of the confused
people!
Then think about why people decided to give typedef its name. Probably
to confuse people, right?

assume there is no confusion but it was named to cause confusion.
Sorry, come again? I'm confused: is there confusion or not?

I suspect a type alias was easy to hack in and there was an existing
mechanism to define structured types (called struct) anyway. typedef
probably seemed short and snappy and good enough. I think there are
plenty of other not-quite-perfect things about C. But mostly they're
good enough.
 

Nick Keighley

Most people use the term "create a new type" to mean "create a new type".

only C programmers. Ada, for instance, creates new types.
By your logic, the dictionary does not define words.  Which, I'm sure in
your twisted mind, it doesn't (as you - or one of your socks - will no
doubt explain that to me any minute now)...

argument by analogy is often flawed. A word is not a type.

If I define an int it's a new int.
 

Nick Keighley

I did, more than enough. As long as you refuse to think about the
/meaning/ of a typedef (to the programmer, that is .. YES that's one
abstraction level ABOVE your standards), this discussion is just
pointless.

the semantics of the C language are defined by the standard. If you
want to layer another set of semantics on top of that, that's your
lookout. I wouldn't encourage anyone to do that
 

Nick Keighley

Just out of curiosity, did anyone else understand the point he was
trying to make?

I think he was trying to avoid admitting that he was initially
mistaken in his assumption as to what typedef did. I remember being a
bit disappointed to find that typedef didn't define distinct types.
typedef defines a new name for an existing type. You can sloppily
contract that to "it defines a type" but I think it's a slop too far.

I think a lot of C programmers just assume it has to be that way. And
probably didn't realise that struct actually defines distinct types.
 

Felix Palmen

* Nick Keighley said:
assuming for a moment that black is white...
There *is* confusion - and I suspect you are one of the confused
people!

I suspect /you/ are, because:
I suspect a type alias was easy to hack in and there was an existing
mechanism to define structured types (called struct) anyway. typedef
probably seemed short and snappy and good enough. I think there are
plenty of other not-quite-perfect things about C. But mostly they're
good enough.

That's kind of what I said. My argument is the name "typedef" is perfect
because it describes what the programmer should use it for, NOT what
happens internally. The mismatch between the name "typedef" and what it
really does (aka how it is "implemented" in M1 layer, in that case, the
C standard) is neither accidental nor meant to confuse people but just
pragmatic.

Regards,
Felix
 

Nick Keighley

Well I didn't.  The passing shot (quoted above) suggests that he was
making a point, disconnected from the C language, about how one should
think about a typedef'd type.  However, I don't think that was the point
he was making earlier in the thread.

being raised by lawyers I learned at an early age to spot shift of
subject used as a debating tactic. If defeated on one field, move
fields.
 

Willem

Nick Keighley wrote:
) On 18 Oct, 16:39, (e-mail address removed) (Felix Palmen) wrote:
)> I'll just put it here:
)>
)> You can either pretend you're a compiler and don't know anything more
)> abstract than your grammar (and the standard defining it) -- or you can
)> just apply some human abstraction to understand the /intention/ behind
)> C's language constructs and ask yourself in that context whether typedef
)> is a good name or not.
)>
)> My decision is clear: the name is perfect.
)
) I'm not sure you've grasped the distinction between distinct types and
) aliased types

This distinction is irrelevant. The argument is that 'typedef' denotes
something in a higher, more abstract way.
Why do you keep on making these irrelevant points?

) <snip>
)
) apples and oranges are different types but apples and pears aren't (in
) this program)

Technically, they aren't. Conceptually, they are.
Because good programmers think in concepts, 'typedef' is a perfect name.
Who cares if it's not a good name from a compiler-writer point of view?

That would be the same as using different commands to move a file across
filesystems than on the same filesystem. (For example, the default action
when dragging files in windows is different when you're going to another
drive. This confuses the hell out of a lot of people.)


SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT
 

Joshua Maurice

Nick Keighley wrote:

) On 18 Oct, 16:39, (e-mail address removed) (Felix Palmen) wrote:
)> I'll just put it here:
)>
)> You can either pretend you're a compiler and don't know anything more
)> abstract than your grammar (and the standard defining it) -- or you can
)> just apply some human abstraction to understand the /intention/ behind
)> C's language constructs and ask yourself in that context whether typedef
)> is a good name or not.
)>
)> My decision is clear: the name is perfect.
)
) I'm not sure you've grasped the distinction between distinct types and
) aliased types

This distinction is irrelevant.  The argument is that 'typedef' denotes
something in a higher, more abstract way.
Why do you keep on making these irrelevant points?

Our point is that's a purposeful confusion of terms.

In your model in UML on the whiteboard, you are welcome to call
whatever you want a type.

However, in the C programming language, types are defined according to
the C standard, and to some extent the programming culture at large.
Luckily, they are entirely consistent on this point. A type is defined
in terms of a type system. A type in C is defined in terms of C's type
system. By that definition, C's typedef does not create types, it does
not introduce types, and it does not define types. typedef does
absolutely nothing with types besides provide a new type name which
refers to an already-defined type. Please see: "6.7.7 Type
definitions / 4" for the clearest text in the C standard on this
subject, specifically "the type of distance is int" which very clearly
proves that the typedef name MILES is a type name which refers to the
same type as the type name "int" - They are synonyms. MILES is not a
distinct type from int. They are the same type. All other conflicting
text in the standard which I've encountered thus far can be attributed
to a looseness of terms and shorthand - conflating the name and the
named thing. Where it matters, in the text defining typedef and the
following example, the C standard very clearly states that typedef
just defines a type name which is a synonym for an already-defined
type.
 

Seebs

I think that is, at best, vastly overstating the case. If the standard
wanted to make this clear, why is this simple supposed fact about
typedefs not present in the section that defines and discusses typedefs?

Well, it's there in the header. :)
More often than not (it is hard to count exactly) the standard is
careful to talk about typedef *names* being defined (and, unfortunately,
sometimes "declared"). If the intent was to be clear that typedefs
define types, why are there so many uses of the alternate wording?

I suspect that this leads to the central observation:

Because types don't have any existence in object modules, types *are*
purely nomenclature, and the name and the type are largely interchangeable.

You're right, though, it doesn't so much clearly say that typedefs define
types, as it clearly uses the phrase "define types" to refer to things
done by typedef. We're left to infer that if typedef does something referred
to as "define types", that typedef defines types.

I was thinking about this more. I think the real issues become clearer
when you look at pairs of source modules. Consider a pair of modules:

foo.c:
struct foo { int a; } x;

bar.c:
struct foo { double d; } y;

Consider: You can compile both of these, link them into a single
program, and *nothing is wrong*. There is no clash, there is no
conflict. Sure, you have two definitions of the type "struct foo".
Sure, those definitions are incompatible.

But no one cares! Since there's no case where the type leaks over from
one to the other, it's not an issue.

For it to become an issue, you have to do something like:

foo.c:

struct foo { int a; };
extern int foos_a(struct foo *f) { return f->a; }

bar.c:

struct foo { double d; };
extern int foos_a(struct foo *f);
int main(void) { struct foo z = { 0 }; return foos_a(&z); }

Suddenly, you have undefined behavior, not because you have conflicting
definitions, but because a thing matching one definition was passed to
a thing using another.

C is very strangely affected by the fact that each module can define its
own types, and there's no way at all to actually have "the same type"
between two modules unless it's one of the base-or-derived-types. You
can't have the same struct type in two modules; you can only have compatible
struct types.

The C standard has carefully avoided requiring linkers to be smarter than
that. (That's also why we have that strangeness we saw recently with the
inline function not providing a valid definition for a call to it in the
same file.)

-s
 

Seebs

int main (void)
{
struct Apple {int i;};
struct Orange {int i;};
typedef struct Apple Pear;
struct Apple apple = {0};
struct Orange orange = {0};
Pear pear = {0};

apple = pear; /* this is ok */
apple = orange; /* this isn't */

return 0;
}
apples and oranges are different types but apples and pears aren't (in
this program)

I would argue that Apple and Pear are *logically* different types, and
that it's a flaw in the program to mix them whether or not it works.

.... You know, come to think of it. I think this is getting at another
thing that's been bugging me.

There are cases where we clearly mean typedef merely to provide a
convenient shorthand for something, but we don't intend to make a
new and distinct type:

typedef struct foo *foo_t;

There are cases where we clearly mean typedef to make a new and distinct
type:

typedef unsigned int size_t;

I think a big part of the confusion comes from the fact that we only
have one keyword for the two uses. We thus end up with a keyword which
is used for two very different things, and for one of them, you need
to keep in mind that it's just a synonym, but for the other, you will write
better code if you pretend it's a completely distinct and incompatible
type.

-s
 

Seebs

the semantics of the C language are defined by the standard. If you
want to layer another set of semantics on top of that, that's your
lookout. I wouldn't encourage anyone to do that

I would, at least when it comes to "keeping logical types distinct
even if you happen to know that they're mechanically interchangeable."

-s
 

Ike Naar

Because 'bar' and 'struct foo *' are interchangeable, so the compiler
can't tell them apart.

Nitpick:
'bar' and 'struct foo *' are incompatible (non-pointer vs. pointer).
(end of nitpick)

There must be a reason why a C compiler won't tell ``bar'' and
``struct foo'' apart. And a very plausible reason is that, at the
language level, the types are indistinguishable in every aspect,
so one might as well say that they are the same type.

The types may (or may not) differ at a higher level of abstraction,
but one cannot tell this from the code alone. One can only assume.

(There are languages, e.g. Ada, where the programmer can specify
whether new types are compatible or incompatible with existing types,
but C is not such a language).
 

Keith Thompson

Seebs said:
You miss my point. It DOES NOT MATTER to a programmer trying to write
good code whether the two types are compatible -- they should be
treated as fully distinct types anyway.

I for one understand and agree with that point.

But the fact remains that they are not fully distinct types, and I see
no point in refusing to acknowledge that fact.

It's possible to think about types on more than one level of
abstraction.
 

Keith Thompson

Joshua Maurice said:
Fine. I can't call that position idiotic. However, I can and will call
your supposition that "signed integer overflow is a type error"
retarded. You either have a very serious English problem, you have no
clue what you're talking about, or your brain left you for a short
period of time.

Perhaps you could do something with your urge to spew personal abuse
other than posting it here.
 

Joshua Maurice

[...]
Fine. I can't call that position idiotic. However, I can and will call
your supposition that "signed integer overflow is a type error"
retarded. You either have a very serious English problem, you have no
clue what you're talking about, or your brain left you for a short
period of time.

Perhaps you could do something with your urge to spew personal abuse
other than posting it here.

True. I was just overcome that someone with so little knowledge could
make it to the C standards committee. Either he's a troll, and he
deserved what he got, or he actually was on the standards committee
and he needs to go read a book. He actually said that signed integer
overflow is a type error. /sigh
 

Seebs

I for one understand and agree with that point.
But the fact remains that they are not fully distinct types, and I see
no point in refusing to acknowledge that fact.

It's not so much refusing to acknowledge it as ignoring it or treating it
as irrelevant. It turns out that it's never useful to me, so while I'm
aware of it, I treat it about the way I treat information like "the
mass of an object changes with its velocity". True, but essentially
totally irrelevant to my daily life. There is no circumstance under
which I can make better decisions about driving based on the information
that things moving at a substantial fraction of c have increased mass.

In theory, quantum mechanics says that, while incredibly unlikely,
it's possible for water to freeze when you put it on the stove instead
of boiling. This information is apparently true, but it's completely
irrelevant -- knowing this will not make you a better cook.
It's possible to think about types on more than one level of
abstraction.

True, but very hard to talk about more than one at a time -- and people
tend to be influenced by the level they spend the most time working at.

"typedef" is a great name at one level of abstraction, and a fairly
dubious name at another. As it happens, the level where it's a good
name is the level where I find it most useful to stay when actually
writing code which is not the implementation of typedef in a compiler.

If someone tells me that typedef defines types, my first assumption
isn't that they're wrong, but that they're working at a higher level
of abstraction and ignoring the underlying details of the type
system. If someone tells me that typedef doesn't define types, my
first assumption is that they're wrong, but that's 'cuz I'm sort
of slow on the uptake; I quickly revise it with the realization that
they're talking about the mechanics of the type system rather than
the abstract type system, and they're probably right there. (The
question of what "define" means takes some poking at.)

In general, it is unusual for me to encounter anyone who is making
statements about what typedef does who is actually wrong, but it's
quite common for them to not immediately disclose to me which level
of abstraction they're working with.

-s
 

Seebs

Perhaps you could do something with your urge to spew personal abuse
other than posting it here.

He's not nearly as amusing as my previous kook, but at least he doesn't
post as much.

I think what probably happened was I got two lines of discussion
conflated. We had at one point been talking about when "x = y" was
valid or invalid -- and of course, if both are signed integer types,
and y has larger rank, it could be "invalid" in that it can
create undefined behavior. I was thinking, not just about "will
be flagged by the compiler at compile time due to the formal type
system", but "invalid" -- which is a much broader category.

Consider:
int *p;
free(p);

it's obvious to me that this is "invalid" -- but the invalidity hasn't
got anything to do with compatible or incompatible types. But if
you talk about "compatibility" meaning "you can safely do x = y",
there's an ambiguity as to whether you're talking about type checking
only, or also about whether the code is sure not to produce undefined
behavior.

It is not in my usual habits of thought to assume that something
which compiles is therefore "valid". Obviously, if you restrict
the discussion only to type checking,... oh, wait. If you restrict
the discussion only to type checking *of unqualified types*,
compatibility is an equivalence relation.

Otherwise, consider:
char *s = "foo";
const char *t = "bar";
s = t;
t = s;

One of these is "valid", the other isn't.

Responding here 'cuz I plonked the guy for being a jerk. I don't care
how smart you are, if you can't be bothered to be polite, or to read
the posts you're responding to, I have better uses of my time.

Which reminds me: Felix, Keith, you were talking past each other, and
so far as I can tell, each of you was correctly describing C, just in
ways that used incompatible terminology. Felix, don't killfile Keith,
he's a smart guy and he knows C. Sometimes people talk past each other;
don't make a big deal about it, just move on.

-s
 

Joshua Maurice

He's not nearly as amusing as my previous kook, but at least he doesn't
post as much.

I think what probably happened was I got two lines of discussion
conflated.  We had at one point been talking about when "x = y" was
valid or invalid -- and of course, if both are signed integer types,
and y has larger rank, it could be "invalid" in that it can
create undefined behavior.  I was thinking, not just about "will
be flagged by the compiler at compile time due to the formal type
system", but "invalid" -- which is a much broader category.

Consider:
        int *p;
        free(p);

it's obvious to me that this is "invalid" -- but the invalidity hasn't
got anything to do with compatible or incompatible types.  But if
you talk about "compatibility" meaning "you can safely do x = y",
there's an ambiguity as to whether you're talking about type checking
only, or also about whether the code is sure not to produce undefined
behavior.

It is not in my usual habits of thought to assume that something
which compiles is therefore "valid".  Obviously, if you restrict
the discussion only to type checking,... oh, wait.  If you restrict
the discussion only to type checking *of unqualified types*,
compatibility is an equivalence relation.

Otherwise, consider:
        char *s = "foo";
        const char *t = "bar";
        s = t;
        t = s;

One of these is "valid", the other isn't.

Responding here 'cuz I plonked the guy for being a jerk.  I don't care
how smart you are, if you can't be bothered to be polite, or to read
the posts you're responding to, I have better uses of my time.

Which reminds me:  Felix, Keith, you were talking past each other, and
so far as I can tell, each of you was correctly describing C, just in
ways that used incompatible terminology.  Felix, don't killfile Keith,
he's a smart guy and he knows C.  Sometimes people talk past each other;
don't make a big deal about it, just move on.

That's twice in this thread you've made this mistake, confusing value
conversion errors for type errors.

Also, still not going to reply to "6.7.7 Type definitions / 4"? It
clearly states that the typedef type "is" the original type.
Specifically
typedef int MILES;
MILES distance;
The type of distance is int.
 

Joshua Maurice

It's not so much refusing to acknowledge it as ignoring it or treating it
as irrelevant.  It turns out that it's never useful to me, so while I'm
aware of it, I treat it about the way I treat information like "the
mass of an object changes with its velocity".  True, but essentially
totally irrelevant to my daily life.  There is no circumstance under
which I can make better decisions about driving based on the information
that things moving at a substantial fraction of c have increased mass.

In theory, quantum mechanics says that, while incredibly unlikely,
it's possible for water to freeze when you put it on the stove instead
of boiling.  This information is apparently true, but it's completely
irrelevant -- knowing this will not make you a better cook.


True, but very hard to talk about more than one at a time -- and people
tend to be influenced by the level they spend the most time working at.

"typedef" is a great name at one level of abstraction, and a fairly
dubious name at another.  As it happens, the level where it's a good
name is the level where I find it most useful to stay when actually
writing code which is not the implementation of typedef in a compiler.

If someone tells me that typedef defines types, my first assumption
isn't that they're wrong, but that they're working at a higher level
of abstraction and ignoring the underlying details of the type
system.  If someone tells me that typedef doesn't define types, my
first assumption is that they're wrong, but that's 'cuz I'm sort
of slow on the uptake; I quickly revise it with the realization that
they're talking about the mechanics of the type system rather than
the abstract type system, and they're probably right there.  (The
question of what "define" means takes some poking at.)

In general, it is unusual for me to encounter anyone who is making
statements about what typedef does who is actually wrong, but it's
quite common for them to not immediately disclose to me which level
of abstraction they're working with.

Please pick a consistent argument, or at least address when you're
switching contexts. This makes me think that you're trying to weasel
out of your earlier arguments. You have been arguing that the C type
system is different than what wiki calls a type system and what every
other sensible programmer calls a type system. In fact, you said that
typedef types are different types, or something, and this is according
to the wording of the C standard, not some other abstraction level.

In your newest post here, you just went back on that. You're now
saying that the usage of typedef in the C standard is "dubious". This
is quite a departure from your earliest points in this thread as I
understand them.

My points have always been simple:

Firstly, "To define a type" has a well accepted meaning in any
programming language with a type system. That meaning is a programmer
specifying the "definition" or "contents" of the type to
the compiler and/or type system checker. Every type is distinct, and
the type checker will tell you that (though what exactly that means
depends on the specifics of the type system and the type checker). In
C, this means that if "x = y" is valid code, then the type of x is the
same type as y, or there is an implicit conversion from the type of y
to the type of x.

Secondly, this position is upheld by the C standard. In short, lots of
parts are vague, lots of it uses a loose terminology where it equates
a name with the named thing, some parts use ambiguous or even
downright wrong wording, and so on. The most clear wording I can find
on the subject is in two places.

One, "6.7.7 Type definitions / 4" which clearly states that the
typedef type is the same type as the "original" type.

Two, "7.17 Common definitions <stddef.h> / 4" which says that the
recommended practice is that size_t is a typedef of one of the "basic"
integer types. However, in that other section, it clearly talks about
a header which "defines" size_t. The most reasonable reading, which
also happens to be consistent with the programming community at large,
is to say that when it talks about "defining size_t", it really means
that it's defining a type name. Otherwise, we have to conclude that
"6.7.7 Type definitions / 4" is just wrong, and conclude that C uses
terminology quite inconsistent with the programming world at large.

Thirdly, I claim that you should not introduce new meaning to existing
terms with clear meaning in such a way as to introduce ambiguity in
the usage of the term. When you start talking about different
abstraction levels, and how it might be a distinct type in one but not
another abstraction level, this only serves to cause confusion.

Finally, for types like "size_t", there are already well-defined
terms which capture the essence of such "logically distinct" types.
Such terms include "it's an opaque type", "it's a forward declared
type", and "the type is implementation dependent". Sometimes the
English contract definition of an opaque type is that it's a typedef
for another type, like an integer type. This allows us to infer basic
properties of the opaque type, but it also strongly suggests to us a
proper programming style to remain portable and correct. The type is
not a distinct type from all other types. Instead, its definition is
opaque and platform dependent. Perhaps this distinction is not crucial
to being a good programmer, but it does a disservice to say that the
distinction is not there or to incorrectly use terms.
 
