ymuntyan
jacob navia said:
Richard said:
Dann Corbit said:
<snip>
[...] Those who want a
super-fast bignum library would be well advised to use GMP or Miracl.
Neither of those libraries is a good choice for 64-bit operations.
Quite so, although the way I see it, if you need more than 32 bits, you
probably need arbitrarily many bits, or at least way more than 64. The
most common use I'm aware of that needs more than C90 gives you is
calcs involving RSA and D-H, for both of which 64 bits isn't anywhere
near enough.
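(For that kind of thing GMP is the usual answer. A minimal sketch of
modular exponentiation with mpz_powm, the workhorse of both RSA and
D-H; the toy operands below are made up, real keys run to hundreds of
digits, and you link with -lgmp:)

#include <stdio.h>
#include <gmp.h>

int main(void)
{
    mpz_t base, exp, mod, result;

    /* made-up toy values; real RSA/D-H operands are far larger */
    mpz_init_set_str(base, "123456789012345678901234567890", 10);
    mpz_init_set_str(exp, "65537", 10);
    mpz_init_set_str(mod, "340282366920938463463374607431768211507", 10);
    mpz_init(result);

    /* result = base^exp mod mod */
    mpz_powm(result, base, exp, mod);

    gmp_printf("%Zd\n", result);

    mpz_clear(base);
    mpz_clear(exp);
    mpz_clear(mod);
    mpz_clear(result);
    return 0;
}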
<snip>
Never used a file bigger than 4GB?
You really don't get it, do you? Right - file sizes are growing rapidly.
What makes you think they will stop growing when they hit 2^64?
With disks of 500GB now, a database of more than 4GB is
quite common.
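(As an aside, C90's long doesn't portably get you past 2GB offsets
anyway; on a POSIX system the usual way out, as far as I know, is to
ask for a 64-bit off_t and use fseeko/ftello. A rough sketch, assuming
glibc-style large file support and a C99-ish printf:)

/* ask for a 64-bit off_t before including any system header */
#define _FILE_OFFSET_BITS 64
#define _POSIX_C_SOURCE 200112L

#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    FILE *fp = fopen("big.dat", "rb");
    off_t pos;

    if (fp == NULL)
        return 1;

    /* seek 5GB into the file, which a 32-bit long offset cannot express */
    if (fseeko(fp, (off_t)5 * 1024 * 1024 * 1024, SEEK_SET) != 0) {
        fclose(fp);
        return 1;
    }

    pos = ftello(fp);
    printf("offset is now %lld bytes\n", (long long)pos);

    fclose(fp);
    return 0;
}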
At least one UK organisation (the Met Office) currently adds more than a
Terabyte to its data store *every day*, and the rate of increase has
itself been increasing over the years. My point is not that 32 bits are
enough, but that 64 are *not* enough.
You really don't get it, do you? Real hardware
and real files are a little different from imaginary
exponential growth of the Earth's population. 2^64 bytes
is 2^24 terabytes, which is sixteen million terabytes;
at a hundred terabytes a day that takes some 460 years
to fill. Well, never mind, perhaps that organization
can have sixteen million terabytes of storage. It can
probably even write a new OS with a fancy filesystem
to make that storage into one big file. Okay. Is it
going to use bignums for file offsets? Still unlikely.
Most likely it will use 128 bits, and that is going to
be enough for the next ten years for sure.
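(The arithmetic is easy to check; a quick C99 sketch, with the
hundred-terabytes-a-day rate being an assumed figure:)

#include <stdio.h>
#include <inttypes.h>

int main(void)
{
    /* 2^64 bytes = 2^24 terabytes, since a terabyte is 2^40 bytes */
    uint64_t total_tb = (uint64_t)1 << 24;   /* 16,777,216 TB */
    uint64_t tb_per_day = 100;               /* assumed ingest rate */
    uint64_t days = total_tb / tb_per_day;

    printf("%" PRIu64 " TB at %" PRIu64 " TB/day: %" PRIu64
           " days (about %.0f years)\n",
           total_tb, tb_per_day, days, (double)days / 365.25);
    return 0;
}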
Anyway, is it really such a strange idea that
for some range of problems 32 bits are enough,
and for some bigger range of problems 64 bits
are enough? Say, 64 bits *are* enough if you
want to deal with files today...
Yevgen