Paul
I have users who want to search six different large, flat XML documents,
but I can only fit three of these documents in memory at one time,
so I continually have to swap XML documents in and out of memory.
Is it best to use DOM or SAX, or maybe something else?
Using SAX seems like the technology of choice for large XML files,
because there is no need to hold the whole document in memory. But under
load, wouldn't numerous concurrent searches re-reading a big XML file
from disk cause a hard-disk bottleneck?
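To make the trade-off concrete, here is a minimal sketch of the SAX approach in Python, assuming a hypothetical flat document made of <record> elements (the element name and search term are illustrative, not from any real schema). The whole file streams past the handler; only the current record's text is ever in memory.

```python
import io
import xml.sax


class SearchHandler(xml.sax.ContentHandler):
    """Collects the text of every <record> element containing the term."""

    def __init__(self, term):
        super().__init__()
        self.term = term
        self.in_record = False
        self.buffer = []
        self.matches = []

    def startElement(self, name, attrs):
        if name == "record":
            self.in_record = True
            self.buffer = []

    def characters(self, content):
        if self.in_record:
            self.buffer.append(content)

    def endElement(self, name):
        if name == "record":
            text = "".join(self.buffer)
            if self.term in text:
                self.matches.append(text)
            self.in_record = False


def sax_search(source, term):
    # Each call re-parses the source from the start: cheap on memory,
    # but every concurrent search means another full pass over the file.
    handler = SearchHandler(term)
    xml.sax.parse(source, handler)
    return handler.matches


doc = io.BytesIO(b"<root><record>apple pie</record><record>banana</record></root>")
print(sax_search(doc, "apple"))  # -> ['apple pie']
```

Note that the memory saving comes at exactly the cost described above: every search is a fresh end-to-end read of the file.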
Using DOM would give really quick search times, but since the
different XML files need to keep swapping in and out of memory, surely
constantly re-parsing the files into memory hammers the hard disk just as
much as SAX?
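The DOM-plus-swapping idea can be sketched with an LRU cache capped at three parsed documents, matching the three-in-memory constraint. This is only an illustration under assumed names: the file paths and the <record> element are hypothetical, and `functools.lru_cache` stands in for whatever eviction policy is actually used.

```python
import functools
import xml.dom.minidom


@functools.lru_cache(maxsize=3)
def load_dom(path):
    # Parsing happens only on a cache miss, so a document is re-read
    # from disk only after it has been evicted to make room for another.
    return xml.dom.minidom.parse(path)


def dom_search(path, term):
    # Searches against a cached in-memory tree are fast; the disk is
    # touched only when the document is not one of the three cached.
    dom = load_dom(path)
    return [node.firstChild.data
            for node in dom.getElementsByTagName("record")
            if node.firstChild and term in node.firstChild.data]
```

Whether this hammers the disk as much as SAX depends entirely on the hit rate: repeated searches against the same three documents never touch the disk, while a workload that cycles through all six documents re-parses on almost every request.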
So presumably SAX is the best of a bad lot?
Or is there some other technique that would be better? (Please discount
conventional databases and native XML databases; I know these would be
faster, but we need a quick fix.)