Newbie
Hi all,
I wondered if you could help.
I have created a little backup routine (using classic ASP and ADO) to get my
data from my database into a bespoke XML file, i.e. I query the data into a
recordset, concatenate XML tags around the field values and then write the
result to a text file using the FileSystemObject.
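In outline, the backup loop looks like this (the connection string, MyTable
and backup.xml are just placeholders for my real names):

' Query a recordset, wrap each field value in XML tags,
' then write the whole string out with the FileSystemObject.
Dim conn, rs, fld, xml, fso, ts
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=myServer;Initial Catalog=myDb;"
Set rs = conn.Execute("SELECT * FROM MyTable")

xml = "<rows>"
Do While Not rs.EOF
    xml = xml & "<row>"
    For Each fld In rs.Fields
        ' fld.Value & "" turns Nulls into empty strings
        xml = xml & "<" & fld.Name & ">" & _
              Server.HTMLEncode(fld.Value & "") & "</" & fld.Name & ">"
    Next
    xml = xml & "</row>"
    rs.MoveNext
Loop
xml = xml & "</rows>"

Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set ts = fso.CreateTextFile(Server.MapPath("backup.xml"), True)
ts.Write xml
ts.Close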
The backup procedure works fine; the problem comes when I go to restore from
it. I'm using the XML DOM object to get the data back by tag name and then
building up the INSERT INTO statements by concatenating them with the
XML data, but my ASP script always times out before this process completes
on any dataset with a large amount of data in it.
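The restore side is shaped like this (again, names simplified; conn is the
same open connection as in the backup sketch):

' Load the backup file with MSXML, walk the <row> nodes by tag name
' and build one INSERT INTO per row from the child elements.
Dim xmlDoc, rowList, rowNode, colNode, cols, vals, sql, i, j
Set xmlDoc = Server.CreateObject("MSXML2.DOMDocument")
xmlDoc.async = False
xmlDoc.Load Server.MapPath("backup.xml")

Set rowList = xmlDoc.getElementsByTagName("row")
For i = 0 To rowList.length - 1
    Set rowNode = rowList.item(i)
    cols = "" : vals = ""
    For j = 0 To rowNode.childNodes.length - 1
        Set colNode = rowNode.childNodes.item(j)
        If j > 0 Then cols = cols & ", " : vals = vals & ", "
        cols = cols & colNode.nodeName
        ' double up single quotes so the value is safe inside the SQL literal
        vals = vals & "'" & Replace(colNode.text, "'", "''") & "'"
    Next
    sql = "INSERT INTO MyTable (" & cols & ") VALUES (" & vals & ")"
    conn.Execute sql
Next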
PLEASE NOTE: my scripting timeout is capped at 15 seconds by my host, and I
cannot change this setting or switch ISPs at the moment.
The backup/restore process falls into place really nicely, apart from the
fact that I can't get the XML restore to run fast enough to get the data
back into the database. For example, I had an XML file of 6000+ rows with
18 columns/fields per row. As a simple test I extracted this data with two
nested FOR...NEXT loops and just wrote the field/tag values to the page,
and the ASP couldn't even complete that before the timeout. It's as if the
XML DOM is too slow to handle this amount of data. Is that the case?
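The timing test was essentially just this, with no INSERTs or database work
at all (against the same placeholder backup.xml):

' Two nested loops that only read the DOM and write values to the page;
' even this doesn't finish within the 15-second limit on the 6000-row file.
Dim doc, rowsList, r, c
Set doc = Server.CreateObject("MSXML2.DOMDocument")
doc.async = False
doc.Load Server.MapPath("backup.xml")
Set rowsList = doc.getElementsByTagName("row")
For r = 0 To rowsList.length - 1
    For c = 0 To rowsList.item(r).childNodes.length - 1
        Response.Write rowsList.item(r).childNodes.item(c).text & "<br>"
    Next
Next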
When I use ADO's Recordset.Save to persist to binary (ADTG) it works like
lightning, and its XML offering isn't much slower either, but I don't like
the way the resulting file is put together, because I can't combine the
queries into one file: with the ADO binary/XML method I end up with about 22
separate files, whereas my bespoke XML file is a single file with all the
data in it.
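For reference, the ADO persistence route I'm comparing against is just this
(adPersistADTG = 0 is the binary format, adPersistXML = 1 the XML one; one
call per query, hence the 22 separate files):

' Persist a recordset straight to disk; Save raises an error if the
' file already exists, so delete any old copy first.
Const adPersistXML = 1
Dim rsSave
Set rsSave = Server.CreateObject("ADODB.Recordset")
rsSave.Open "SELECT * FROM MyTable", conn
rsSave.Save Server.MapPath("MyTable.xml"), adPersistXML
rsSave.Close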
Other users are going to be backing up my database via an ASP page, so that
they only back up the data I want them to back up (hence why I don't just
use a direct SQL backup tool), but I seem to be resigned to the fact that I
will only be able to use the ADO version instead of my own.
Has anybody had this problem and got round it?
Thanks