Geoff
Well, I've changed the whole process to use a std::array... and it
hasn't helped at all. 8<{{
So, okay, I've got some other problem - somewhere in the data
conversion and preparation process. Here's what I'm doing (without
exposing you all to an enormous amount of detail code):
1. I am reading a CSV file, which I parse into individual fields.
Simple enough, you're looking for commas and gobbling the data between
them.
2. For each field (~15 of them), I either trim the alphanumeric data or
convert the numeric data to populate the data object (which has been a
std::map, then a std::vector, and is now a std::array).
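Steps 1 and 2 above can be sketched roughly as follows. This assumes plain comma-separated fields with no quoting or escaping; the names split_csv and trim are hypothetical, not the poster's actual functions.

```cpp
#include <sstream>
#include <string>
#include <vector>

// Split one CSV line on commas. No quoted-field handling: this is
// only the "looking for commas and gobbling the data between them"
// approach described above.
std::vector<std::string> split_csv(const std::string& line) {
    std::vector<std::string> fields;
    std::string field;
    std::istringstream in(line);
    while (std::getline(in, field, ','))
        fields.push_back(field);
    return fields;
}

// Trim leading and trailing whitespace from an alphanumeric field.
std::string trim(const std::string& s) {
    const auto first = s.find_first_not_of(" \t");
    if (first == std::string::npos) return "";
    const auto last = s.find_last_not_of(" \t");
    return s.substr(first, last - first + 1);
}
```

Neither of these should be slow at ~30,000 records, which points the suspicion elsewhere.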
3. I do some data validation on the data fields before I post the
object to the container.
Describe "validation". String matching can be expensive. Post the
code.
4. As each object is stored, I display a running count of objects
processed. This is where I can see that the activity is running slowly
and erratically: the running counter does not update smoothly, and it
is quite slow for the large volume of records (~30,000) I'm processing.
Is this running count appearing in a window? Is that code running
inside your processing loop or monitoring a counter for which you have
read-only asynchronous access? Multithreading? Mutex?
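One common culprit behind the symptom in step 4: repainting or flushing the display on every single record can easily dominate a 30,000-record run. A minimal sketch of the usual fix, refreshing only every N records (the function name and the interval of 1000 are assumptions, not the poster's code):

```cpp
#include <cstddef>
#include <cstdio>

// Refresh the displayed count only every `interval` records rather
// than on every insertion; interval = 1000 is an assumed value.
// Returns true when a refresh actually happened (handy for testing).
bool report_progress(std::size_t processed, std::size_t interval = 1000) {
    if (processed % interval != 0)
        return false;
    std::printf("\rrecords processed: %zu", processed);
    std::fflush(stdout);
    return true;
}
```

If the count is drawn in a GUI window, the equivalent is to post a repaint only every N records instead of per record.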
The problem wasn't really noticeable before running this large
input file. I had been dealing with ~2,000 input records and never
noticed a problem.
I now see that something in the above has substantial processing
cost, but I don't yet know where or what. For now, I'm going to
comment out various functions to see if I can determine where the
overhead is. <sigh...>
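Rather than commenting functions out one at a time, wrapping each stage in a std::chrono timer pinpoints where the time goes in a single run. A minimal sketch; the stage names (parse, convert, validate, store) are placeholders for the poster's actual functions:

```cpp
#include <chrono>
#include <cstdio>

// Time one stage of the pipeline. Fn is any callable wrapping the
// stage body (parse, convert, validate, store - placeholder names).
// Prints and returns the elapsed time in milliseconds.
template <typename Fn>
double time_stage(const char* name, Fn&& stage) {
    const auto start = std::chrono::steady_clock::now();
    stage();
    const auto stop = std::chrono::steady_clock::now();
    const double ms =
        std::chrono::duration<double, std::milli>(stop - start).count();
    std::printf("%s: %.2f ms\n", name, ms);
    return ms;
}
```

Usage would look like `time_stage("validate", [&]{ validate(record); });` for each suspect stage, accumulated across the 30,000 records.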
Post the code.