I refer you to the file format standard for ANYTHING EVER,...
And folks who write code that deals with these formats need to be
fully up to speed on the format, don't they? And in the case of
evolving formats, they need to consider upgrading so they can continue
to read newer formats. This thread has touched on many of the
*tools* (e.g. network transport layers) available to deal with these
binary formats, AND THAT'S THE POINT: you need all this *stuff* and
knowledge.
Text is simple. You stop even *thinking* about a lot of stuff.
And it has the advantage of easy human readability, a "nice to have"
for debugging and maintenance purposes.
Binary, in comparison, is a headache. (-:
Without any of the computers that used it? Pretty close to zero,
even with the help of an electron microscope.
No, it would be silly of me to mean that.
Assuming you have no hex editor,...
Hey, I'll even grant you the hex editor!
...but you do have a computer and a text editor, then
obviously text will be easier to display.
Even if you can examine the hex, do you see the hassle required to
analyse what all those bits *mean*? Compare that to a text file
that very likely *tags* (labels) the data! I mean, come on, how
can you beat named, trivial-to-view data?
And ya see that? Even given equal ability to examine the raw file
(that is, sans intelligent interpreter), text is a monster winner.
Contrariwise, if you have no text editor but do have a hex editor,
binary will be easier to display.
Ummmm, you're winging it here. (-: First, really, a hex viewer, but
no text viewer? I think that'd be a first in computing history, but
stranger things have happened.
Second, doncha think viewing the text in the hex viewer would still
be a lot more obvious (given those labels) than the raw bin bits?
Even when you tilt the playing field insanely, text still wins! (-:
Neither will necessarily be easier to interpret unless you have a
copy of the relevant file format standard, and then the point is
pretty much moot anyway.
Well, right, we're assuming the file format is lost or unavailable.
And even if we somehow lost the "format" to text/plain, the pattern
of text lines with repeating delimiters is a red flag. Consider too
that at this extreme--where we've forgotten ASCII--how much harder
would it be to figure out binary storage formats (remember there's
likely no clue where object boundaries are)?
How often have people come here to ask help in writing "Hello
world!" programs? How often have people come to sci.crypt to
ask help in "deciphering" cryptograms? If you're saying that a
lot of people are stupid, I'm inclined to agree with you.
No (well, actually, yes that's true, but not my point right now).
I'm pointing out--comparing like with like--that no one stumbling on a
text file containing important data comes begging for interpretation.
Cryptograms are play, and I doubt the urgent, often work-related
situation ever comes up in sci.crypt.
Ick, floating point!
[bwg] Exactly my point! Which would you rather deal with:
"99.1206" 0x42c63dbf
Seriously, I don't have much experience with floating point, but I
would expect you'd either use a fixed-point representation (common
in the domains in which I work),...
Let me guess. CAD/CAM or NC or something involving physical coords?
Fixed point isn't uncommon in environments where you know the range
of values expected. When you don't know the range and need the largest
one possible (or when you DO and it's huge), you need floating point.
How do you save a floating-point number to a text file?
As you'd expect. printf("%.17g") ... strtod()
Within certain parameters, close enough. Once you're dealing with FP,
you sorta have to give up the concept of lossless. Experts in FP know
how to deal with it to make the pain as low as possible, but FP is all
about approximation.
If you need absolute precision, you could always save the bytes as a
hex string. Fast and easy in and out.
How many lines of <your PLOC here> code is that?
Only a few surrounding strtod() if you don't mind a little edge loss.
(IIRC, within precision limits, text<=>FP *is* fully deterministic?)