It's not so much refusing to acknowledge it as ignoring it or treating it
as irrelevant. It turns out that it's never useful to me, so while I'm
aware of it, I treat it about the way I treat information like "the
mass of an object changes with its velocity". True, but essentially
totally irrelevant to my daily life. There is no circumstance under
which I can make better decisions about driving based on the information
that things moving at a substantial fraction of c have increased mass.
In theory, quantum mechanics says that, while incredibly unlikely,
it's possible for water to freeze when you put it on the stove instead
of boiling. This information is apparently true, but it's completely
irrelevant -- knowing this will not make you a better cook.
True, but very hard to talk about more than one at a time -- and people
tend to be influenced by the level they spend the most time working at.
"typedef" is a great name at one level of abstraction, and a fairly
dubious name at another. As it happens, the level where it's a good
name is the level where I find it most useful to stay when actually
writing code which is not the implementation of typedef in a compiler.
If someone tells me that typedef defines types, my first assumption
isn't that they're wrong, but that they're working at a higher level
of abstraction and ignoring the underlying details of the type
system. If someone tells me that typedef doesn't define types, my
first assumption is that they're wrong, but that's 'cuz I'm sort
of slow on the uptake; I quickly revise it with the realization that
they're talking about the mechanics of the type system rather than
the abstract type system, and they're probably right there. (The
question of what "define" means takes some poking at.)
In general, it is unusual for me to encounter anyone who is making
statements about what typedef does who is actually wrong, but it's
quite common for them to not immediately disclose to me which level
of abstraction they're working with.
Please pick a consistent argument, or at least address when you're
switching contexts. This makes me think that you're trying to weasel
out of your earlier arguments. You have been arguing that the C type
system is different from what wiki calls a type system and what every
other sensible programmer calls a type system. In fact, you said that
typedef types are distinct types, or something, and that this is
according to the wording of the C standard, not some other abstraction
level.
In your newest post here, you just went back on that. You're now
saying that the usage of typedef in the C standard is "dubious". This
is quite a departure from your earliest points in this thread as I
understand them.
My points have always been simple:
Firstly, "to define a type" has a well-accepted meaning in any
programming language with a type system: a programmer specifies the
"definition" or "contents" of the type to the compiler and/or type
checker. Every type is distinct, and
the type checker will tell you that (though what exactly that means
depends on the specifics of the type system and the type checker). In
C, this means that if "x = y" is valid code, then the type of x is the
same as the type of y, or there is an implicit conversion from the
type of y to the type of x.
Secondly, this position is upheld by the C standard. In short, lots of
its parts are vague, lots of it uses loose terminology that equates a
name with the thing it names, some parts use ambiguous or even
downright wrong wording, and so on. The clearest wording I can find
on the subject is in two places.
One, "6.7.7 Type definitions / 4" which clearly states that the
typedef type is the same type as the "original" type.
Two, "7.17 Common definitions <stddef.h> / 4", which says that the
recommended practice is that size_t be a typedef of one of the "basic"
integer types. However, elsewhere in that same section, the standard
talks about a header which "defines" size_t. The most reasonable reading, which
also happens to be consistent with the programming community at large,
is to say that when it talks about "defining size_t", it really means
that it's defining a type name. Otherwise, we have to conclude that
"6.7.7 Type definitions / 4" is just wrong, and conclude that C uses
terminology quite inconsistent with the programming world at large.
Thirdly, I claim that you should not give existing terms with a clear
meaning a new meaning in a way that introduces ambiguity in the usage
of the term. When you start talking about different abstraction
levels, and how something might be a distinct type at one level but
not another, this only serves to cause confusion.
Finally, for types like "size_t", there are already well-defined
terms which capture the essence of such "logically distinct" types.
Such terms include "it's an opaque type", "it's a forward-declared
type", and "the type is implementation dependent". Sometimes the
documented contract of an opaque type is that it's a typedef for
another type, such as an integer type. This allows us to infer basic
properties of the opaque type, but it also strongly suggests to us a
proper programming style to remain portable and correct. The type is
not a distinct type from all other types. Instead, its definition is
opaque and platform dependent. Perhaps this distinction is not crucial
to being a good programmer, but it does a disservice to say that the
distinction is not there or to use terms incorrectly.