Dan Stromberg
Yes, that is one major detail. There must be many more.
This is not just a detail: O(1) tends to beat O(log n) pretty easily
for large n.
Dan Stromberg said: This is not just a detail: O(1) tends to beat O(log n) pretty easily for large n.
There is no O(1) hash table.
Dan Stromberg said:
Ian said: In my view the second one is wrong. a += b should be understood as being equivalent to a = a + b, but with the *possible* and by no means guaranteed optimization that the operation may be performed in-place. In fact, if you read the documentation for lists, you may notice that while they clearly cover the + operator and the extend method, they do not explicitly document the list class's += operator.
Please elaborate.
Dan Stromberg said: [...] it's still amortized O(1).
This interpretation is at odds with the Language Reference, section 6.2.1, Augmented Assignment Statements:
"An augmented assignment expression like x += 1 can be rewritten as x = x + 1 to achieve a similar, but not exactly equal effect... when possible, the actual operation is performed in-place, meaning that rather than creating a new object and assigning that to the target, the old object is modified instead."
Note that it says "when possible", not "if the implementation feels like it".
The "when possible" clause, together with the fact that lists are mutable, implies that it *will* be done in-place. There is no need to document all the in-place operators explicitly for every type.
A hash table of fixed size is O(1) until you fill it so full that you
start getting enough collisions to make it O(n), as one bucket becomes
a linked list. This is because the hash function is O(1), and looking
up a value in a C array is O(1).
A more flexible hash table will not have a fixed size; instead it will
grow itself as needed. This growth operation tends to be O(n), but
happens vanishingly infrequently as the table grows, so it's still
amortized O(1).
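Dan's growth scheme can be sketched in a few lines of Python. This is illustrative only; the class and method names (MiniDict, _resize) are invented here, and CPython's real dict uses open addressing rather than chaining.

```python
# Separate chaining with a bucket array that doubles when the load
# factor passes a threshold: O(n) rehashes, but amortized O(1) inserts.

class MiniDict:
    def __init__(self, nbuckets=8):
        self.buckets = [[] for _ in range(nbuckets)]
        self.size = 0

    def _index(self, key, nbuckets):
        return hash(key) % nbuckets

    def put(self, key, value):
        bucket = self.buckets[self._index(key, len(self.buckets))]
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # overwrite existing key
                return
        bucket.append((key, value))
        self.size += 1
        # The O(n) rehash fires only O(log n) times over n inserts,
        # so the amortized cost per insert stays O(1).
        if self.size > 2 * len(self.buckets):
            self._resize()

    def _resize(self):
        old = self.buckets
        self.buckets = [[] for _ in range(2 * len(old))]
        for bucket in old:
            for k, v in bucket:
                self.buckets[self._index(k, len(self.buckets))].append((k, v))

    def get(self, key):
        for k, v in self.buckets[self._index(key, len(self.buckets))]:
            if k == key:
                return v
        raise KeyError(key)
```

As the table fills, each resize doubles the bucket count, so the chains stay short on average and lookups remain fast.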
On Sun, Mar 9, 2014 at 4:03 PM, Gregory Ewing wrote: [...]
That's quite vague, and not much stronger a guarantee than "maybe". It's
technically "possible" for this augmented assignment to be performed in
place:
x = 12
x += 4
But it's not done in-place, because ints are meant to be immutable.
In any case, this means that whether the operation is actually performed
in-place is an implementation detail -- if not of the Python
implementation then at least of the class -- and not something the user
should take for granted.
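The distinction Ian draws can be observed directly with id(). A quick demonstration (not from the thread):

```python
# += mutates in place only when the type defines __iadd__ and is mutable.
lst = [1, 2]
before = id(lst)
lst += [3]            # list defines __iadd__: extends in place
assert id(lst) == before

tup = (1, 2)
before = id(tup)
tup += (3,)           # tuples are immutable: a new tuple is created
assert id(tup) != before

x = 12
before = id(x)
x += 4                # ints are immutable: x is rebound to a new object
assert id(x) != before
```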
There is no O(1) hash table.
A hash table can also only ever be O(1) in the average case.
Steven D'Aprano said: Proof: I create a hash table that accepts unsigned bytes as keys, where [...]
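The quoted proof is truncated, but the construction it starts presumably runs along these lines: with a bounded key space, every key can get its own slot, so there are no collisions at all. A sketch under that assumption (ByteTable is an invented name, not Steven's code):

```python
# Keys are unsigned bytes (0..255); the "hash" is the identity function
# and the table has one slot per possible key, so lookup is O(1) even
# in the worst case -- no chains, no probing, no resizing.

class ByteTable:
    def __init__(self):
        self.slots = [None] * 256   # one slot per possible key

    def put(self, key, value):
        if not 0 <= key <= 255:
            raise ValueError("key must be an unsigned byte")
        self.slots[key] = value     # direct array index: O(1)

    def get(self, key):
        return self.slots[key]      # direct array index: O(1)
```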
That's incorrect. Ints aren't merely "meant" to be immutable, which
would imply that it's optional; they are defined as immutable by the
language specification and the reference implementation. Any
interpreter where ints are mutable *is not Python*.
Whether += operates in place or not is part of the interface of the
class, not the implementation.
Would you say that whether list.append operates in place or creates a new
list is an implementation detail? Whether str.upper() creates a new
string or modifies the existing one in place? Mutability versus
immutability is part of the interface, not the implementation,
notwithstanding that somebody could create an alternative class with the
opposite behaviour: a MutableStr, or ImmutableList.
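The documented behaviours behind those rhetorical questions are easy to check in an interpreter:

```python
# list.append is documented as modifying the list in place
# (and, like most in-place methods, it returns None).
lst = [1, 2]
result = lst.append(3)
assert result is None
assert lst == [1, 2, 3]

# str.upper is documented as returning a copy; the original is untouched.
s = "abc"
t = s.upper()
assert t == "ABC"
assert s == "abc"
```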
The O(f(n)) notation has no meaning when n is limited.
This is not just pedantry.
This is not just a detail: O(1) tends to beat O(log n)
pretty easily for large n.
The discussion was about how a balanced tree
fares in comparison with hash tables. A simplistic O(1) vs O(log n)
comparison was presented as proof that balanced trees don't have a chance
in practice or in theory. I wasn't so easily convinced.
That's true, but is beside the point, which is that "when possible" is
not very meaningful.
Of course not. list.append is documented as modifying the list.
str.upper is documented as returning a copy of the string.
If the in-place behavior of += is held to be part of the interface, then
we must accept that += is not polymorphic across mutable and immutable
types, which in my mind largely* defeats the purpose of having it.
After all, there should be one -- and preferably only one -- obvious way
to do it. If you want in-place concatenation, the obvious way to do it
is by calling extend. If you want copy concatenation, the obvious way
to do it is with the + operator. Why then should immutable sequences,
and not just mutable ones, even offer the += operator?
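The polymorphism problem described above shows up whenever += is applied to an argument whose mutability the function doesn't control. A small illustration (the function name is invented):

```python
def add_suffix(seq, suffix):
    # The same statement has different observable effects depending on
    # whether seq is mutable: a list argument is mutated in the caller's
    # frame, while a tuple argument leaves the caller's object untouched.
    seq += suffix
    return seq

lst = [1, 2]
add_suffix(lst, [3])
assert lst == [1, 2, 3]      # caller's list was modified in place

tup = (1, 2)
add_suffix(tup, (3,))
assert tup == (1, 2)         # caller's tuple is unchanged
```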
Steven D'Aprano said: If I am right, that certainly would explain your apparent inability to
understand the difference between "is" and == operators, your
insistence that object IDs are addresses, and your declaration that
object identity is philosophically untenable.
In other words, I suspect you are trolling.
You and I certainly have a very hard time communicating. Your
paraphrasing of my positions demonstrates that.