jdev8080
We are looking at creating large XML files containing binary data
(encoded as base64) and passing them to transformers that will parse
and transform the data into different formats.
Basically, we have images that have associated metadata and we are
trying to develop a unified delivery mechanism. Our XML documents may
be as large as 1GB and contain up to 100,000 images.
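For concreteness, here is a minimal sketch (Java/StAX) of the kind of document we have in mind. The element names (<images>, <image>, <data>) are placeholders, not a settled schema:

import java.io.FileOutputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamWriter;

public class ImageFeedWriter {
    public static void main(String[] args) throws Exception {
        try (OutputStream out = new FileOutputStream("images.xml")) {
            XMLStreamWriter w = XMLOutputFactory.newInstance()
                    .createXMLStreamWriter(out, "UTF-8");
            w.writeStartDocument("UTF-8", "1.0");
            w.writeStartElement("images");
            for (String name : args) { // image files passed on the command line
                w.writeStartElement("image");
                w.writeAttribute("name", name);
                w.writeStartElement("data");
                // Whole-file encode shown for brevity; at our sizes the
                // base64 encoding would have to be chunked, not buffered.
                w.writeCharacters(Base64.getEncoder()
                        .encodeToString(Files.readAllBytes(Paths.get(name))));
                w.writeEndElement(); // data
                w.writeEndElement(); // image
            }
            w.writeEndElement(); // images
            w.writeEndDocument();
            w.close();
        }
    }
}

The writer itself streams, so the output document never has to fit in memory, though the per-image encode shown here would still need chunking at our scale.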
My question is, has anyone done anything like this before?
What are the performance considerations?
Do the current parsers support this size of XML file? (A sketch of the
kind of streaming read we would need follows these questions.)
Has anyone used Fast Infoset for this type of problem?
Is there a better way to deliver large sets of binary files (e.g. zip
files or something like that)?
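On the parser question: our working assumption is that a DOM tree for a 1GB document would not fit in memory, so we would need a streaming parser (SAX or StAX). Roughly the read loop we have in mind, against the placeholder layout above:

import java.io.FileInputStream;
import java.io.InputStream;
import java.util.Base64;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class ImageFeedReader {
    public static void main(String[] args) throws Exception {
        try (InputStream in = new FileInputStream("images.xml")) {
            XMLStreamReader r = XMLInputFactory.newInstance()
                    .createXMLStreamReader(in);
            while (r.hasNext()) {
                if (r.next() == XMLStreamConstants.START_ELEMENT
                        && "data".equals(r.getLocalName())) {
                    // getElementText() buffers one image's worth of base64
                    // at a time, but never the whole document.
                    byte[] image = Base64.getMimeDecoder()
                            .decode(r.getElementText());
                    System.out.println("decoded " + image.length + " bytes");
                }
            }
        }
    }
}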
Any input would be great. If there is a better board to post this on,
please let me know.
Thx,
Bret