* Ioannis Vranos:
In TC++PL3 on page 464 (Standard Container Operations), times of the kind
O(log(n)) are mentioned.
As far as I know, asymptotic running times of general algorithms are at
least O(n) (that is, at least proportional to the number of elements), so how
can standard library container operations be O(log(n)) on non-parallel
systems?
It's not clear exactly what you're referring to, nor is it clear exactly what
you don't understand.
But anyway, let's start with the big-oh notation.
Formally big-oh denotes only an upper bound. But the Holy Standard uses this
notation more in the sense of big-Theta, which implies both a lower and an
upper bound. That is, in the standard's usage, if some operation has complexity
(number of fundamental operations) f(n), and the standard says it's O(g(n)),
then that means that when n is sufficiently large f(n) has K1*g(n) as a lower
bound and K2*g(n) as an upper bound, for some positive constants K1 and K2.
For example, f(n) = 3*n^2 + 5 is O(n^2) in this sense because for sufficiently
large n there exist constants K1 and K2 such that K1*n^2 <= f(n) <= K2*n^2.
Concretely, K1 = 3 and K2 = 4 work for all n >= 3, since then n^2 >= 9 > 5.
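Just to make those constants concrete, here's a little throwaway check (not
from the book or the standard, just a sketch of the definition, with K1 = 3 and
K2 = 4 chosen by hand as above):

  #include <cassert>

  int main()
  {
      // Check numerically that 3*n^2 <= f(n) <= 4*n^2 for f(n) = 3*n^2 + 5,
      // over a modest range of n >= 3 (it holds for all n >= 3, since then
      // n^2 >= 9 > 5).
      for (long long n = 3; n <= 1000000; ++n)
      {
          long long const f = 3*n*n + 5;
          assert( 3*n*n <= f && f <= 4*n*n );
      }
  }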
Logarithmic times mainly occur when a set of items can be recursively
partitioned into subsets, where those subsets can be treated independently. For
example, a std::map may store the items in a sorted and fairly balanced tree.
Assuming such a tree, and that it's binary (branching factor 2), then starting
at the root of the tree the search for one of the n items reduces to either
finding it at the root, or searching among approx. n/2 items in the left
sub-tree, or searching among approx. n/2 items in the right sub-tree. Descending
into the tree, at each branching choice the number of items left to search among
is halved. For n items you can only halve n approximately lg(n) times before
you've reduced the number of items to 1, at which point the sought-for item is
either found or known not to be there.
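And here's a minimal sketch of that halving in practice, assuming a typical
tree-based std::map implementation (e.g. a red-black tree). The counting
comparator is just something cooked up for illustration, and the exact number
of comparisons will vary between library implementations, but it should come
out close to lg(n):

  #include <cmath>
  #include <cstddef>
  #include <iostream>
  #include <map>

  // Comparator that counts its invocations, so we can see roughly how
  // many items a single std::map::find inspects.
  struct CountingLess
  {
      static std::size_t count;
      bool operator()( int a, int b ) const
      {
          ++count;
          return a < b;
      }
  };
  std::size_t CountingLess::count = 0;

  int main()
  {
      int const n = 1000000;
      std::map<int, int, CountingLess> items;
      for (int i = 0; i < n; ++i) { items[i] = 0; }

      CountingLess::count = 0;
      items.find( n/2 );                   // One lookup among n items.

      std::cout << "comparisons for one find: " << CountingLess::count << "\n";
      std::cout << "lg(n) is about:           "
                << std::log( double(n) )/std::log( 2.0 ) << "\n";
  }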
Cheers & hth.,
- Alf