Thomas Lehmann
So many articles have been written, but none really provides
the answer I'm looking for.
Given a scenario with 100,000 data elements per day (500,000 per week),
we might potentially see 3,000-5,000 updates to individual elements
each second.
Imagine a filter that decides, on each update, whether a data element
will be added, updated or removed.
The final container, which is sorted by a user-defined criterion, must
be able to handle this amount of data.
Info: basically, each value in that sorted container is a pointer to the
real data.
Final goal: we display the data in a table that provides index-based
positions (rows).
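To make the requirement concrete, here is a minimal sketch of the access pattern, assuming a sorted std::vector of pointers (the Element type and the SortByKey/SortedView names are purely illustrative):

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Illustrative only: the real data lives elsewhere; this container
// holds pointers, kept sorted by the user-defined criterion.
struct Element {
    int sortKey;  // stands in for the user-defined sort criterion
};

struct SortByKey {
    bool operator()(const Element* a, const Element* b) const {
        return a->sortKey < b->sortKey;
    }
};

class SortedView {
    std::vector<Element*> items_;  // always kept sorted by SortByKey
public:
    void insert(Element* e) {
        auto pos = std::lower_bound(items_.begin(), items_.end(), e, SortByKey{});
        items_.insert(pos, e);  // O(log n) search, but O(n) element shift
    }
    void erase(Element* e) {
        // Assumes unique sort keys; with duplicates you would scan equal_range.
        auto pos = std::lower_bound(items_.begin(), items_.end(), e, SortByKey{});
        if (pos != items_.end() && *pos == e)
            items_.erase(pos);  // again an O(n) shift
    }
    Element* at(std::size_t row) const { return items_[row]; }  // O(1) row access
    std::size_t size() const { return items_.size(); }
};
```

The table view only needs at(row) and size(); the open question is whether the O(n) shift on every insert/erase is acceptable at 3,000-5,000 updates per second over ~100,000 pointers.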
NOW, what is the right way to implement a fast container for this?
Some of the problems I was confronted with:
- With a std::vector, deletion/insertion in the middle is expensive, isn't it?
- With a std::set, I have no index-based access. Or is there a way to handle this properly? (see the sketch after this list)
- Somebody told me about a weight-balanced tree as a possible solution.
- Somebody else told me to use standard containers only.
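As an illustration of what a tree with index access could look like: GCC ships a non-standard policy-based tree (__gnu_pbds) whose order-statistics node update gives O(log n) insert/erase and O(log n) access by row index. This is a GCC extension, not standard C++, and the int key here is just a placeholder for the real sort criterion:

```cpp
#include <ext/pb_ds/assoc_container.hpp>
#include <ext/pb_ds/tree_policy.hpp>
#include <iostream>

// GCC-specific: a red-black tree augmented with subtree sizes,
// supporting both set semantics and index-based lookup.
using ordered_set = __gnu_pbds::tree<
    int,                                       // sort key (placeholder)
    __gnu_pbds::null_type,                     // no mapped value (set semantics)
    std::less<int>,
    __gnu_pbds::rb_tree_tag,
    __gnu_pbds::tree_order_statistics_node_update>;

int main() {
    ordered_set s;
    s.insert(30);
    s.insert(10);
    s.insert(20);

    // find_by_order(k): iterator to the element at row index k.
    std::cout << *s.find_by_order(1) << '\n';  // prints 20

    // order_of_key(x): number of elements strictly less than x.
    std::cout << s.order_of_key(30) << '\n';   // prints 2
}
```

Something like this would avoid both the O(n) shift of a vector and the missing index access of std::set, at the price of being compiler-specific.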