The 'hypothetical' was enough. 'Hypothetical' means 'I have no
counterexample so I'll just start playing games'.
It doesn't bother me that I have no counter-example, because the point
I'm making concerns only what the standard mandates and allows.
Keeping track of what's actually true on all current implementations of
C is something that would require more work than I have time for.
If that constitutes "playing games" in your book, then feel free to
think of it that way; I'll continue to think of what you're doing as
"making unjustified assumptions".
....
How nice.
Nope. I just program _real_ computers instead of hypothetical ones.
I program real computers with code that will work even on hypothetical
ones, so long as those machines have a compiler that is at least
backward-compatible with C99; which means my code will continue to
work even if your assumptions about what "real machines" do become
inaccurate.
....
Besides, your claim to know what i care about or not is not only
presumptuous, but also quite pigheaded.
You're correct: I can only infer what you care about from your actual
words; you could secretly care about these issues far more than your
words imply. But since I don't have the power to read your mind, I can
only pay attention to your words, and I'll have to form my guesses
about what you care about from them. If those guesses are inaccurate,
it's because your words failed to correctly reflect your true beliefs.
....
....
Your rather pompous assertions about my sense of responsibility
I said nothing about your sense of responsibility, only your
responsibilities themselves. That you might be insensible of those
responsibilities seems entirely plausible, and even likely.
....
Nope. I just wanted to know whether or not you have any idea what
you're talking about. It appears you have no idea.
I have an idea; it's just not my specialty. I know enough to know that
all three of the methods permitted by the C standard for representing
signed integers have actually been implemented, at the hardware level,
on real machines, and that only one of those methods (admittedly, the
most popular) provides any support for your arguments. That is, it
would provide such support if it were the only possible way of doing
signed integer arithmetic; but it isn't.
....
In terms of hardware, the actual gates being used, both 1's complement
(mainly because of the double 0) and signed magnitude are expensive.
Which is very different from being impossible, which is what would need
to be the case to support your argument. The other ways of representing
signed integers have their own peculiar advantages, or they would never
have been tried. When our current technologies for making computer chips
have become obsolete, and something entirely different comes along to
replace them, with its own unique cost structure, the costs could
easily swing the other way.
I don't know whether this particular assumption will ever fail. But if
you make a habit of relying on such assumptions, as seems likely, I can
virtually guarantee that one of your assumptions will fail. Code which
avoids making unnecessary assumptions about such things will survive
such a transition; the programmers who have to port the other code will
spend a lot of time swearing at you.