DSF
Yes. But there are lots of platforms where pointers cast to uint will
have lost some of their bits, and where you might occasionally see
strange behavior.
Assume ints are 32-bit, and pointers 64-bit, and consider what you
get from (uint) 0x100000010 - (uint) 0x0FFFFFFF0.
I'm well aware of that. At the time, speed (of finishing the
project, not execution time) was the priority. I don't recall right
now what that code was part of, but I believe it's still a work in
progress. (I was sick for over a month.)
In that case, I think it's a safe bet that you would spend an order
of magnitude less time converting to a modern compiler than you are
spending writing code that's less maintainable and more likely to
have subtle bugs than what a modern compiler would do.
True. But that is bound to be a major undertaking and I have
several projects in progress that I need yesterday! The time
learning/getting used to a new compiler plus some code conversion
would set me back months.
Which sort of renders the entire exercise a little silly, no?
No. Because it wasn't an exercise. I needed 16-bit character
support and converting the 16-bit code to an 8-bit counterpart for
each of the string functions I wrote was a simple matter.
#define TONGUE_IN_CHEEK
Who knows? It might still be incredibly faster than a current
compiler. When I get the two remaining projects done and have some
time to look into finding/learning a new compiler, I'll let you know.
#undef TONGUE_IN_CHEEK
DSF
"'Later' is the beginning of what's not to be."
D.S. Fiscus