JohnF
I'm getting a tiny-cum-microscopic, but nevertheless fatal,
difference in the behavior of the exact same C code compiled
on one 64-bit Linux machine...
o dreamhost.com
    uname -a:     Linux mothman 2.6.32.8-grsec-2.1.14-modsign-xeon-64 #2 SMP
                  Sat Mar 13 00:42:43 PST 2010 x86_64 GNU/Linux
    cc --version: cc (Debian 4.3.2-1.1) 4.3.2
versus two other 32-bit machines...
o panix.com
    uname -a:     NetBSD panix3.panix.com 6.1.2 NetBSD 6.1.2 (PANIX-USER) #0:
                  Wed Oct 30 05:25:05 EDT 2013 i386
    cc --version: cc (NetBSD nb2 20110806) 4.5.3
o my own local box running Slackware 14.0 32-bit
    cc --version: cc (GCC) 4.7.1
The code is an encryption/decryption utility, forkosh.com/fm.zip,
which is way too many lines to ask anybody to look at.
But my own debugging has failed to identify where the
difference creeps in, and googling hasn't suggested
where to look more deeply.
Firstly, both executables "work", i.e., if you encrypt and
then decrypt, you get back the exact same original file.
But if you encrypt using the 32-bit executable, scp the
encrypted file to the 64-bit machine (the md5sums match), and
then decrypt there, the result is exactly the same length and
almost identical, except that roughly one byte in a thousand
doesn't match. Vice versa (encrypt on 64-bit, decrypt on
32-bit) gives the same behavior. (By the way, the 32-bit and
64-bit encrypted files also differ in about one byte in a
thousand, so both stages exhibit this small problem.)
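To put a number on a mismatch like that, a throwaway byte-diff
counter along these lines (my own quick sketch, not part of fm.zip;
cmp -l file1 file2 | wc -l gives the same count) does the job:

    /* bytediff.c --- quick sketch (not part of fm.zip): count how
     * many byte positions differ between two equal-length files. */
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        FILE *fa, *fb;
        long total = 0, ndiff = 0;
        int ca, cb;

        if (argc != 3) {
            fprintf(stderr, "usage: %s file1 file2\n", argv[0]);
            return 1;
        }
        if ((fa = fopen(argv[1], "rb")) == NULL) { perror(argv[1]); return 1; }
        if ((fb = fopen(argv[2], "rb")) == NULL) { perror(argv[2]); return 1; }
        while ((ca = getc(fa)) != EOF && (cb = getc(fb)) != EOF) {
            total++;
            if (ca != cb) ndiff++;    /* tally mismatched positions */
        }
        printf("%ld of %ld bytes differ\n", ndiff, total);
        fclose(fa);
        fclose(fb);
        return 0;
    }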
And I tried cc -m32 on the 64-bit machine, but it fails for lack
of a gnu/stubs-32.h header (presumably the 32-bit development
libraries aren't installed there). So instead I compiled with
cc -static on my own box, and that static executable does work on
the 64-bit machine when run against files encrypted on either
32-bit box. So the problem doesn't seem to be the 64-bit OS, but
rather the executable built by its cc, though I'm not 100% sure.
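(For what it's worth, an obvious concrete difference between the
boxes is the width of the basic C types: a trivial check like the
following sketch, compiled on each machine, shows int staying
4 bytes everywhere while long, size_t and pointers go from 4 bytes
on the i386 boxes to 8 bytes on the x86_64 box.)

    /* typewidths.c --- trivial sanity check: print the basic type
     * sizes.  Expected: int=4 everywhere; long, size_t and void*
     * are 4 on the i386 (ILP32) boxes and 8 on x86_64 (LP64). */
    #include <stdio.h>

    int main(void)
    {
        printf("int=%zu long=%zu long long=%zu size_t=%zu void*=%zu\n",
               sizeof(int), sizeof(long), sizeof(long long),
               sizeof(size_t), sizeof(void *));
        return 0;
    }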
What I'm really finding weird is that ~one-byte-in-a-thousand
diff. The program uses several streams of random numbers
(generated by its own code) to xor bytes, permute bits, etc.
The slightest problem would garble up the data beyond belief.
Moreover, it's got a verbose flag, and with it I can see that the
streams are identical on both machines. And everywhere else I've
thought to look seems okay, too, as far as I can tell.
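To make concrete the kind of difference I'm asking about, here is a
hypothetical sketch (emphatically not code from fm.zip) of two
width-dependent patterns that are self-consistent on each machine,
so encrypt-then-decrypt still round-trips locally, yet diverge
between an ILP32 and an LP64 build. Whether something like this
would show up in every byte or only the occasional byte depends on
how often the extra width actually matters and whether the
divergence feeds into later bytes.

    /* width.c --- hypothetical sketch, NOT from fm.zip: two ways the
     * "exact same C code" can be self-consistent on each machine yet
     * differ between them, because long is 32 bits on the i386 boxes
     * and 64 bits on x86_64. */
    #include <stdio.h>

    /* (a) Intermediate kept in unsigned long: wraps mod 2^32 on i386
     *     but keeps the full product on x86_64, so the two builds
     *     disagree only on steps where the true product exceeds 2^32. */
    static unsigned char mix_byte(unsigned char in, unsigned r1, unsigned r2)
    {
        unsigned long t = (unsigned long)r1 * r2;
        return (unsigned char)(in ^ (t % 251u));
    }

    /* (b) Shift count that can reach 32: fine with a 64-bit long,
     *     undefined with a 32-bit long (x86 masks the count, so
     *     1UL << 32 typically comes out as 1). */
    static unsigned long bit_mask(unsigned bitpos)   /* bitpos in 0..32 */
    {
        return 1UL << bitpos;
    }

    int main(void)
    {
        printf("%02x\n", (unsigned)mix_byte(0x41, 70000u, 70000u)); /* product > 2^32 */
        printf("%lx\n", bit_mask(32));   /* 1 (or 0) on i386, 100000000 on x86_64 */
        return 0;
    }

If the fm.zip code has anything of that general flavor buried in it,
each build would still round-trip on its own, which is consistent
with what I'm seeing, but I haven't been able to spot it.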
So I'm asking about weird-ish 32/64-bit cc differences
that might give rise to this kind of behavior. Presumably,
there's some subtle bug that I'm failing to see in the code,
and which the output isn't helping me to zero in on. Thanks.