Vince
Hi everybody
For one of my customers I need to program an adapter that receives about 25'000 MQ messages, parses them, and finally writes the data into an Oracle DB.
I should mention that every message has a variable length and thus a variable number of data fields that must be written to the DB.
During a workshop I got into an argument with another developer: he said we should use PreparedStatements instead of plain Statements because they perform better, being compiled only once on the DB server. I generally agreed, but argued that in our case PreparedStatements won't help performance, since we have a variable number of columns for each SQL INSERT statement. IMHO that would actually add overhead, because we would have to build each PreparedStatement individually (25'000 times) and then assign a value to every question mark. We would be better off concatenating the INSERT statement once for each row of data, adding it to a batch, and committing the batch to the DB...
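To make that concrete, this is roughly what I have in mind for the Statement variant. It's only a sketch: the MESSAGE_DATA table, the Map representation of a parsed message, and the naive quoting are made up for illustration.

import java.sql.Connection;
import java.sql.Statement;
import java.util.List;
import java.util.Map;

public class StatementBatchSketch {

    // Sketch of the "concatenate once per row, then batch" idea.
    // MESSAGE_DATA and the Map shape of a parsed message are made up.
    public static void insertMessages(Connection con, List<Map<String, String>> rows) throws Exception {
        con.setAutoCommit(false);
        try (Statement stmt = con.createStatement()) {
            for (Map<String, String> row : rows) {
                StringBuilder cols = new StringBuilder();
                StringBuilder vals = new StringBuilder();
                for (Map.Entry<String, String> field : row.entrySet()) {
                    if (cols.length() > 0) {
                        cols.append(", ");
                        vals.append(", ");
                    }
                    cols.append(field.getKey());
                    // naive quoting, just for the sketch
                    vals.append("'").append(field.getValue().replace("'", "''")).append("'");
                }
                stmt.addBatch("INSERT INTO MESSAGE_DATA (" + cols + ") VALUES (" + vals + ")");
            }
            stmt.executeBatch();
            con.commit();
        }
    }
}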
My question: how would you solve this? Would PreparedStatements still improve performance, given the variable number of columns?
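For comparison, this is roughly how I picture the PreparedStatement variant my colleague suggests: one statement per distinct column combination, reused and batched. Again only a sketch, with the same made-up table and field representation as above.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class PreparedBatchSketch {

    // Sketch of the PreparedStatement variant: cache one statement per distinct
    // column combination, reuse it, and batch. MESSAGE_DATA and the Map shape are made up.
    public static void insertMessages(Connection con, List<Map<String, String>> rows) throws Exception {
        con.setAutoCommit(false);
        Map<String, PreparedStatement> bySignature = new HashMap<>();
        try {
            for (Map<String, String> row : rows) {
                // sort the field names so the same column set always yields the same SQL text
                TreeMap<String, String> sorted = new TreeMap<>(row);
                String signature = String.join(", ", sorted.keySet());

                PreparedStatement ps = bySignature.get(signature);
                if (ps == null) {
                    String placeholders = String.join(", ", Collections.nCopies(sorted.size(), "?"));
                    ps = con.prepareStatement(
                            "INSERT INTO MESSAGE_DATA (" + signature + ") VALUES (" + placeholders + ")");
                    bySignature.put(signature, ps);
                }

                int i = 1;
                for (String value : sorted.values()) {
                    ps.setString(i++, value);
                }
                ps.addBatch();
            }
            for (PreparedStatement ps : bySignature.values()) {
                ps.executeBatch();
            }
            con.commit();
        } finally {
            for (PreparedStatement ps : bySignature.values()) {
                try { ps.close(); } catch (Exception ignore) {}
            }
        }
    }
}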
Thanks for your opinions and help.
Vince