David Brown
Skybuck said: Well that clearly sucks.
The world is not completely 64-bit; the world is not static, it fluctuates.
Sometimes the program only needs 32 bits, sometimes 64 bits.
Always choosing 64 bits would hurt performance LOL.
So if your program needs 32 bits, use 32 bits. If it needs 64 bits, use
64 bits.
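A minimal C sketch of that advice - my illustration, not part of the original post - assuming a hosted C99 environment; the variable names are made up. <stdint.h> makes the width choice explicit instead of leaving it to the platform:

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    int32_t pixel_count = 1000000;     /* exactly 32 bits on every platform */
    int64_t file_offset = 5000000000;  /* value needs more than 32 bits */
    int_fast32_t loop_sum = 0;         /* at least 32 bits, whatever is fastest here */

    loop_sum += pixel_count;
    printf("%" PRId32 " %" PRId64 " %" PRIdFAST32 "\n",
           pixel_count, file_offset, loop_sum);
    return 0;
}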
I work in the world of embedded systems - it can often make a huge
difference whether you pick 8 bits, 16 bits, or 32 bits for your data.
People sometimes prefer 24 bits or 40 bits - whatever makes sense for
the task in question. This all makes far more of a difference than a
choice of 32-bit or 64-bit integers (on a 32-bit or 64-bit processor),
yet programmers have no trouble dealing with it.
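As a sketch of the odd-width point - again my own example, and one that assumes a compiler allowing uint32_t bitfields, which is implementation-defined but common (gcc and clang both accept it) - a 24-bit field can be packed with spare bits into one 32-bit word, a typical embedded trick for audio or sensor data:

#include <stdint.h>
#include <stdio.h>

/* 24-bit payload plus 8-bit status packed into one 32-bit word
   (member names are illustrative). */
struct sample {
    uint32_t value : 24;  /* unsigned bitfield: arithmetic wraps modulo 2^24 */
    uint32_t flags : 8;   /* leftover bits used for status */
};

int main(void)
{
    struct sample s = { .value = 0xFFFFFF, .flags = 0x01 };
    s.value += 1;  /* wraps to 0 - the field behaves as a 24-bit integer */
    printf("value=%u flags=%u size=%zu\n",
           (unsigned)s.value, (unsigned)s.flags, sizeof(struct sample));
    return 0;
}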
If you are trying to get performance, figure out how to use libraries
written by experts rather than rolling your own code at this level.
You haven't a chance of writing optimal code until you first
understand what you want your program to do, and then understand the
issues that actually make a difference in real-life programming,
rather than in some little test snippet of code.