Rob Meade
Dear all,
We have purchased a product called SA-FileUp, which we have used in our
organisation for a number of years, and I'm quite an avid supporter of it.
Recently I've been trying to ensure that some of our older applications,
which used native .NET uploading, are moved across to our .NET
implementation of SA-FileUp.
Because there is development time involved in this, I have now been asked by
my manager to "test" both the native and SA techniques to see how bad the
problem would be if we didn't use the SA-FileUp component.
I appreciate that there is a degree of "sales" talk behind the reasons given
on their website for using their product. However, there is a lot of mention
of the native approach not being great because it stores uploaded files
temporarily in the server's memory prior to the final save to disk, as
opposed to their product, which saves to a temporary file prior to the final
save. I personally believe the product is a significant tool for us, and as
such I need to prove it!
My understanding of the two processes is limited, but I think it goes
something like this...
If we use SA-FileUp, the component streams the file and saves it as a
temporary file on the server before finishing the transaction and saving it
as a normal file.
The native .NET approach, on the other hand, loads the file into memory on
the server prior to saving it to disk.
I've been quizzed on the native approach along the lines of "surely after the
file is saved, the memory is released?", which would make sense, but I would
argue that if the application is putting a heavy demand on the server,
either in the number of uploads at any one point, or the size of the files in
question, or both, this could have a significant effect on the memory
consumption on the server.
Would anyone here, perhaps someone else who has used it, or who just has some
theories, be able to provide me with any technical information regarding the
two approaches?
Any links would also be advantageous, as would any suggestions for
conducting a test of both methods. I suggested to my manager that this
might be hard to simulate, as we'd want to test it in anger with many files,
many simultaneous uploads, no other network traffic to the server, and so on.
Any information from any testing you may have performed yourselves would be
really appreciated.
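For what it's worth, the rough harness I had in mind for the load test looks like the sketch below: it spins up a throwaway local HTTP server that just drains and discards uploads, then fires a batch of concurrent POSTs at it. In a real test you'd point the client at the native and SA-FileUp pages in turn and watch the worker process's memory counters; the endpoint, sizes and concurrency here are all placeholders.

```python
# Rough concurrent-upload harness sketch. The local server is only a
# stand-in target so the script is self-contained; swap it for the real
# upload pages when testing for real.
import http.client
import http.server
import os
import threading
from concurrent.futures import ThreadPoolExecutor

FILE_SIZE = 1 * 1024 * 1024   # size of each fake upload (placeholder)
CONCURRENT = 8                # simultaneous uploads (placeholder)


class DiscardHandler(http.server.BaseHTTPRequestHandler):
    def do_POST(self):
        # Drain the request body in chunks, then acknowledge it.
        remaining = int(self.headers.get("Content-Length", 0))
        while remaining:
            remaining -= len(self.rfile.read(min(65536, remaining)))
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass                   # keep the console quiet


# Bind to port 0 so the OS picks a free port for the throwaway server.
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), DiscardHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]


def upload(i):
    # One simulated upload: POST FILE_SIZE random bytes and return the status.
    conn = http.client.HTTPConnection("127.0.0.1", port)
    conn.request("POST", "/upload", body=os.urandom(FILE_SIZE))
    status = conn.getresponse().status
    conn.close()
    return status


with ThreadPoolExecutor(max_workers=CONCURRENT) as pool:
    statuses = list(pool.map(upload, range(CONCURRENT)))

server.shutdown()
print("statuses:", statuses)
```

While a run like this is in flight you'd watch the server's memory (PerfMon against the worker process, in our case) for each of the two upload implementations and compare the peaks.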
As I said initially, I am a fan of the product and would hope that it
remains in use in this organisation; sadly, I am now faced with having to
prove its worth for financial reasons.
Regards
Rob