Hi,
I am a beginner programmer. I would like to know the exact difference
between long and int. Why do we have two names for the same size of
variable?
They are not necessarily the same. In fact, on many implementations,
they are not.
All that you can ever portably rely on is that a signed int must be
capable of representing integers at least in the range -32767 to
+32767, and a signed long int at least in the range -2147483647 to
+2147483647.
Now, as it happens on some popular platforms, both signed int and
signed long have the same range: -2147483648 to +2147483647.
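If you're curious, you can ask <limits.h> what the actual ranges are
on your implementation (a minimal sketch; the values it prints depend
entirely on your platform):

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* These macros report the actual limits on this particular
       implementation; the values printed differ between, say,
       a 16-bit compiler and a typical 32- or 64-bit one. */
    printf("int : %d to %d\n",   INT_MIN,  INT_MAX);
    printf("long: %ld to %ld\n", LONG_MIN, LONG_MAX);
    return 0;
}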
On those platforms, it doesn't much matter which one you decide to
use. However, when you're coding portable apps, the decision process
(for me, at least) is usually:
If I don't expect to need to represent integers outside of
the range [-32767,32767], then
    using a short or an int should be fine.
Else, if the range [-2147483647,2147483647] is likely to suffice, then
    using a long int should be fine.
Else, if even that range may not be enough, then
    consider using a long long or intmax_t (defined in <stdint.h>,
    which <inttypes.h> includes), or maybe a more flexible,
    variably-sized integer library (see the sketch below).
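For the record, here's what using intmax_t might look like (a minimal
sketch, assuming a C99 compiler; PRIdMAX is the matching printf format
macro from <inttypes.h>):

#include <inttypes.h>  /* pulls in <stdint.h>; also defines PRIdMAX */
#include <stdio.h>

int main(void)
{
    /* intmax_t is the widest signed integer type the
       implementation offers (C99 and later). */
    intmax_t big = INTMAX_C(4000000000);  /* too big for a 32-bit long */
    printf("big = %" PRIdMAX "\n", big);
    return 0;
}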
Note that none of these options (with the possible exception of using
a variable-length int library) obviates the need for you to always
check to make sure that you don't over-/underflow the variable. Always
check for this: it will make your life much easier, in the long
run. In some extreme cases, failing to check has literally cost lives
(http://en.wikipedia.org/wiki/Therac-25).
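One way to do such a check for addition, before the overflow can
happen (a minimal sketch; add_checked is just a name I made up for
illustration):

#include <limits.h>

/* Returns 1 and stores a + b in *sum if the addition is safe,
   0 if it would overflow. Signed overflow is undefined behaviour
   in C, so the test must happen *before* the addition. */
int add_checked(int a, int b, int *sum)
{
    if ((b > 0 && a > INT_MAX - b) ||
        (b < 0 && a < INT_MIN - b))
        return 0;               /* would over-/underflow */
    *sum = a + b;
    return 1;
}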
Do the sizes of long and int change across architectures?
YES.
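You can see this for yourself (another minimal sketch; the numbers it
prints depend on your compiler and OS):

#include <stdio.h>

int main(void)
{
    /* Typical results: int=2/long=4 on old 16-bit compilers,
       4/4 on 32-bit systems and 64-bit Windows (LLP64),
       4/8 on 64-bit Linux and most Unixes (LP64). */
    printf("sizeof(int)  = %zu\n", sizeof(int));
    printf("sizeof(long) = %zu\n", sizeof(long));
    return 0;
}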
HTH,
Micah