botfood
after extensive testing I am stuck with the realization that a database
file salvaged from a corrupted webserver has been messed up just enough
that tie() fails to open it. I don't know whether it was a virus, a
mechanical crash, or a software bug. The -f and -r checks pass the file
as existing and readable, but the tie() using DB_File fails and $!
reports 'No such file or directory'.
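
For reference, the failing open boils down to something like this (the
filename here is a placeholder, and $DB_HASH is only my guess at the
type the original code used):

#!/usr/bin/perl
use strict;
use warnings;
use DB_File;
use Fcntl qw(O_RDONLY);

my $dbfile = 'members.db';    # placeholder, not the real path

# the plain file tests pass fine...
print "file exists\n"   if -f $dbfile;
print "file readable\n" if -r $dbfile;

# ...but the tie() itself fails, and $! says 'No such file or directory'
my %hash;
tie(%hash, 'DB_File', $dbfile, O_RDONLY, 0644, $DB_HASH)
    or die "tie failed: $!\n";

untie %hash;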
....how I determined that took two days, and some kicking around in a
different thread. What I am after now is any thoughts people might have
on how to 'fix' the file to recover as much of the data as possible.
To work with the file I pulled a copy down from the Linux server to my
PC with a binary transfer, in hopes I can pick the data out...
If I open the file with WordPad (too big for Notepad) I think I see a
pattern of readable text, with lines of binary gibberish mixed in that
I assume has to do with DB_File indexing and Berkeley DB internal
markers. The trouble with this is that the human-readable data is in
chunks, and the first record of each chunk doesn't have the key value
discernible the way the rest of each chunk does. Those look like:
156100.1DANICA||ROMEROI...and more fields
where the value set off by the squares is what I recognize as the key.
The first record in each chunk typically has a long string of gibberish
with the values starting right after it, rather than the value sitting
between 'block' characters, so I don't see how to pick out the key for
the first value string in each block of human-readable text.
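
To poke at that pattern outside of WordPad, I have been sketching a
crude salvage pass along these lines (the filenames, the printable-ASCII
assumption, and the minimum run length of 8 are all my guesses):

#!/usr/bin/perl
use strict;
use warnings;

my $dbfile = 'members.db';    # placeholder for the salvaged copy

# slurp the whole file as raw bytes
open(my $in, '<:raw', $dbfile) or die "can't open $dbfile: $!\n";
my $raw = do { local $/; <$in> };
close $in;

# dump every run of 8 or more printable ASCII characters, one per line,
# so the key/field pattern can be eyeballed in a plain text editor
open(my $out, '>', 'salvage.txt') or die "can't write salvage.txt: $!\n";
while ($raw =~ /([\x20-\x7E]{8,})/g) {
    print $out "$1\n";
}
close $out;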
What I am putting to the group here is whether it is worth my time to
attempt to extract the pattern of readable text from the file created
as a tie()ed DB_File, save it as plain text, and then write another
import tool to write the text back into a fresh tie()ed file.
or... lost cause?
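
If I do go that route, the import half of the plan would be something
like this, assuming I can first hand-clean the salvaged text into
key<TAB>value lines (filenames are placeholders, and $DB_HASH is again
a guess at the type):

#!/usr/bin/perl
use strict;
use warnings;
use DB_File;
use Fcntl qw(O_RDWR O_CREAT);

# write hand-cleaned 'key<TAB>value' lines into a fresh tie()ed DB_File
my %db;
tie(%db, 'DB_File', 'members_rebuilt.db', O_RDWR | O_CREAT, 0644, $DB_HASH)
    or die "tie failed: $!\n";

open(my $in, '<', 'salvage_cleaned.txt') or die "can't open salvage: $!\n";
while (my $line = <$in>) {
    chomp $line;
    my ($key, $value) = split /\t/, $line, 2;
    next unless defined $key and length $key and defined $value;
    $db{$key} = $value;
}
close $in;
untie %db;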
I tried to paste in a more complete 'chunk' of what I am looking at so
you can see the pattern, but some of the special characters won't go
into the Google message composition window.