C90 long long


ymuntyan

jacob navia said:
Richard said:
Dann Corbit said:

<snip>
[...] Those who want a
super-fast bignum library would be well advised to use GMP or Miracl.
Neither of those libraries is a good choice for 64-bit operations.
Quite so, although the way I see it, if you need more than 32 bits, you
probably need arbitrarily many bits, or at least way more than 64. The
most common use I'm aware of for needing more than C90 gives you is
calcs involving RSA and D-H, for both of which 64 bits isn't anywhere
like enough.
<snip>
Never used a file bigger than 4GB?

You really don't get it, do you? Right - file sizes are growing rapidly.
What makes you think they will stop growing when they hit 2^64?
With disks of 500GB now, a database of more than 4GB is
quite common.

At least one UK organisation (the Met Office) currently adds more than a
Terabyte to its data store *every day*, and the rate of increase has
itself been increasing over the years. My point is not that 32 bits are
enough, but that 64 are *not* enough.

You really don't get it, do you? Real hardware
and real files are a little different from imaginary
exponential growth of the Earth's population. 2^64
bytes is 2^24 terabytes - sixteen million of them;
at a hundred terabytes a day, that is some 460 years.
Well, never mind: perhaps that organization can have
sixteen million terabytes of storage. Perhaps it can
even write a new OS with a fancy filesystem to make
that storage into one big file. Well, okay. Is it
going to use bignums for file offsets? Still unlikely.
Most likely it will use 128 bits, and that's going to
be enough for the next ten years for sure.
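
(A quick sanity check of that arithmetic - a throwaway C99 sketch,
taking 1 terabyte = 2^40 bytes and an assumed ingest rate of 100 TB/day;
it prints about 16.8 million TB and 459 years:)

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double bytes     = pow(2.0, 64);          /* a 2^64-byte store       */
        double terabytes = bytes / pow(2.0, 40);  /* 2^24 = ~16.8 million TB */
        double days      = terabytes / 100.0;     /* at 100 TB per day       */
        printf("%.0f TB, filled in about %.0f years\n",
               terabytes, days / 365.25);
        return 0;
    }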

Anyway, is it really such a strange idea that
for some range of problems 32 bits are enough,
and for some bigger range of problems 64 bits
are enough? Say, 64 bits *are* enough if you
want to deal with files today...

Yevgen
 

Ian Collins

Richard said:
At least one UK organisation (the Met Office) currently adds more than a
Terabyte to its data store *every day*, and the rate of increase has
itself been increasing over the years. My point is not that 32 bits are
enough, but that 64 are *not* enough.
Support is already there: the largest supported file size in the largest
file system I know of (ZFS) is 16 exbibytes (2^64 bytes). Mind you, the
largest pool size is 256 zebibytes (2^78 bytes), so you could store quite
a few of them - 2^14, or 16384, in fact.
 

Bart

jacob navia said:
Richard said:
Dann Corbit said:

<snip>
[...] Those who want a
super-fast bignum library would be well advised to use GMP or Miracl.
Neither of those libraries is a good choice for 64-bit operations.
Quite so, although the way I see it, if you need more than 32 bits, you
probably need arbitrarily many bits, or at least way more than 64. The
most common use I'm aware of for needing more than C90 gives you is
calcs involving RSA and D-H, for both of which 64 bits isn't anywhere
like enough.
<snip>
Never used a file bigger than 4GB?

You really don't get it, do you? Right - file sizes are growing rapidly.
What makes you think they will stop growing when they hit 2^64?

64 bits is useful /now/ (and has been for a while) for such things as
file sizes.

64-bit datatypes are commonly available to anyone who doesn't insist
on C90, for example.

Exceeding 64 bits for file sizes is not going to be common, and there
are a number of techniques for dealing with it: emulate 96/128-bit
values, use a 2-part value, address files in units other than bytes,
and so on. Most people won't have to bother.
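
(One possible shape for the 2-part approach - a minimal sketch assuming
C99 <stdint.h>; the off128 type and helper below are hypothetical
illustrations, not any real API:)

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical 128-bit file offset emulated as two 64-bit words. */
    typedef struct {
        uint64_t hi;   /* upper 64 bits */
        uint64_t lo;   /* lower 64 bits */
    } off128;

    /* Add a 64-bit delta, carrying into the high word on wrap-around. */
    static off128 off128_add(off128 o, uint64_t delta)
    {
        o.lo += delta;
        if (o.lo < delta)          /* unsigned wrap implies a carry */
            o.hi++;
        return o;
    }

    int main(void)
    {
        off128 pos = { 0, UINT64_MAX };   /* just below the 64-bit limit */
        pos = off128_add(pos, 1);         /* crosses into the high word  */
        printf("hi=%llu lo=%llu\n",
               (unsigned long long)pos.hi, (unsigned long long)pos.lo);
        return 0;
    }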

Totally disregarding migration from 32 to 64 bits, simply because one
day in the future 64 bits may not quite be enough, is silly. It's like
not buying that medium-sized car now because, in ten years, you might
need a bigger one...

You may have noticed that many machines now have a 64-bit native type,
and totally ignoring that fact is also silly.
 

Richard

Richard Heathfield said:
jacob navia said:
Richard said:
Dann Corbit said:


<snip>

[...] Those who want a
super-fast bignum library would be well advised to use GMP or Miracl.
Neither of those libraries is a good choice for 64-bit operations.

Quite so, although the way I see it, if you need more than 32 bits, you
probably need arbitrarily many bits, or at least way more than 64. The
most common use I'm aware of for needing more than C90 gives you is
calcs involving RSA and D-H, for both of which 64 bits isn't anywhere
like enough.

<snip>

Never used a file bigger than 4GB?

You really don't get it, do you? Right - file sizes are growing rapidly.
What makes you think they will stop growing when they hit 2^64?

They won't - but do you realise just how big a file can get with 64 bits?
 

jacob navia

Richard said:
They won't - but do you realise just how big a file can get with 64 bits?

Let's suppose the machines arrive at the density of DNA, i.e. around
(very roughly) 50 atoms/bit.

We have:
50 * 2^64 --> 922337203685477580800 atoms
Divided by Avogadro's number, 6.022E23, that is about 0.0015 mole of
atoms; assuming (again very roughly) a litre of material per mole, it
comes to 0.001531612 liters.

1.5 ml. The volume, then, is just 1.5 ml.

Of course there is the packing overhead; let's say it multiplies the
bit density by 10. You arrive at 15 ml for a file of 2^64 bits!

In bytes you multiply by 8: about 120 ml is the volume needed
to store a file of 2^64 bytes.

Very feasible.

Probably by 2020-2025
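
(The same back-of-envelope in C, using the rough assumptions above -
50 atoms/bit, one litre of material per mole, a 10x packing factor;
it prints about 15 ml and 123 ml:)

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        const double bits           = pow(2.0, 64); /* bits in a 2^64-bit file */
        const double atoms_per_bit  = 50.0;         /* rough DNA-like density  */
        const double avogadro       = 6.022e23;     /* atoms per mole          */
        const double litres_per_mol = 1.0;          /* very rough assumption   */
        const double packing        = 10.0;         /* packing overhead        */

        double litres = bits * atoms_per_bit / avogadro
                        * litres_per_mol * packing;
        printf("2^64 bits : %.1f ml\n", litres * 1000.0);
        printf("2^64 bytes: %.1f ml\n", litres * 8.0 * 1000.0);
        return 0;
    }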
 

Ben Bacarisse

Richard Heathfield said:
Dann Corbit said:
[...] Those who want a
super-fast bignum library would be well advised to use GMP or Miracl.

Neither of those libraries is a good choice for 64-bit operations.

Quite so, although the way I see it, if you need more than 32 bits, you
probably need arbitrarily many bits, or at least way more than 64. The
most common use I'm aware of for needing more than C90 gives you is calcs
involving RSA and D-H, for both of which 64 bits isn't anywhere like
enough.

No, but every doubling of the available integer width roughly
doubles the speed of RSA and D-H calculations. If C99 (or C0x)
becomes widespread, faster portable RSA and D-H code will be possible.
The fastest cryptographic code will always use non-portable
extensions, but a portable 32-bit multiply (giving a 64-bit result)
would be very useful.
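
(For illustration - a minimal sketch of that widening multiply. In C99
it falls out of unsigned long long; in pure C90, where unsigned long is
only guaranteed 32 bits, you have to build it from 16-bit halves:)

    #include <stdio.h>

    /* C99: a 32x32 -> 64 multiply is just a cast and a multiplication. */
    unsigned long long mul32x32(unsigned long a, unsigned long b)
    {
        return (unsigned long long)(a & 0xFFFFFFFFUL) * (b & 0xFFFFFFFFUL);
    }

    /* Portable C90 equivalent: split each operand into 16-bit halves
       and accumulate the four partial products into two 32-bit words. */
    void mul32x32_c90(unsigned long a, unsigned long b,
                      unsigned long *hi, unsigned long *lo)
    {
        unsigned long al = a & 0xFFFFUL, ah = (a >> 16) & 0xFFFFUL;
        unsigned long bl = b & 0xFFFFUL, bh = (b >> 16) & 0xFFFFUL;
        unsigned long ll = al * bl, lh = al * bh;
        unsigned long hl = ah * bl, hh = ah * bh;
        unsigned long mid = (ll >> 16) + (lh & 0xFFFFUL) + (hl & 0xFFFFUL);

        *lo = (ll & 0xFFFFUL) | ((mid & 0xFFFFUL) << 16);
        *hi = hh + (lh >> 16) + (hl >> 16) + (mid >> 16);
    }

    int main(void)
    {
        unsigned long hi, lo;
        mul32x32_c90(0xDEADBEEFUL, 0xCAFEBABEUL, &hi, &lo);
        printf("%08lX%08lX vs %016llX\n", hi, lo,
               mul32x32(0xDEADBEEFUL, 0xCAFEBABEUL));
        return 0;
    }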
 

Noob

Richard said:
A computer whose job is to record every human born on this planet
(assuming the population continues to grow at about 10% every 7
years) will not overflow a 64-bit ID until the year 3604

Wait a minute.

The number of world births has remained stable for several years,
and is projected to remain stable for several years to come.

http://www.census.gov/ipc/prod/ib96_03.pdf

(There were 133M births in 1996, 129M births in 2004.)

+10% every 7 years = +1.371% every year

Your assumptions mean we should expect...

o ~500 million births in 2104
o ~2 billion births in 2204

Unpossible :)
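
(Out of curiosity, redoing Richard's projection in C - a rough sketch;
the 2007 start year and 130M births/year are assumed starting figures,
and with them the 64-bit ID space runs out around the year 3577, in the
same ballpark as the quoted 3604:)

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double births = 130e6;                 /* assumed births per year */
        double growth = pow(1.10, 1.0 / 7.0);  /* +10% every 7 years      */
        double total  = 0.0;
        double limit  = pow(2.0, 64);          /* 64-bit ID space         */
        int year;

        for (year = 2007; total < limit; year++) {
            total += births;                   /* issue one ID per birth  */
            births *= growth;
        }
        printf("64-bit IDs exhausted around the year %d\n", year - 1);
        return 0;
    }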
 
