For floats it is understandable. But for integers, seriously, 4% is a lot. I would never have thought an interpreter would have differences like this in syntax for something as fundamental as adding 1.
It's, seriously, not even kind of a lot at all. Percentages without
context are meaningless: it's 4% slower, sure -- but that is 4% of an
incredibly small, probably constant-time amount of time.
Picking "i += 1" over "i = i + 1" based on one being 4% slower is sorta
kinda crazy. The difference in speed is probably related to churn and
cache as much as anything else (its not as consistent on my machine, for
example): or the ceval loop doing a few more case-tests between them as
others have mentioned. All in all, if 4% of a nanomicrofraction of a
chunk of time is that meaningful, you're probably best served not using
Python.
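(If you really want to stare at the 4% yourself, here's a rough sketch
with timeit -- the exact numbers will bounce around per machine, Python
version, and run, so don't read too much into them:)

import timeit

# Rough sketch: time both spellings of an integer increment.
# Numbers vary per machine and per run.
aug = timeit.timeit("i += 1", setup="i = 0", number=10000000)
plain = timeit.timeit("i = i + 1", setup="i = 0", number=10000000)
print("i += 1    : %.3fs" % aug)
print("i = i + 1 : %.3fs" % plain)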
That said: my advice is always to avoid += like the plague. It is magic
and impossible to predict without intimate knowledge of exactly what's
on the left side.
i += 1
n += x
Those two things look very similar, but they may do -completely-
different things depending on just what "n" is.
It may or may not do something that is like:
n = n + x
Or, it may do something that's more akin to
n.extend(x)
n = n
Those aren't even kind of equivalent actions. And things get more
complicated if 'n' is, say, n[0] (especially if something goes wrong
between the extend and the rebinding).
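To make that concrete, a quick sketch (standard CPython behavior; the
names are purely illustrative):

i = 1
before = id(i)
i += 1
print(id(i) == before)    # False: ints are immutable, += rebinds to a new object

n = [1, 2]
before = id(n)
n += [3]
print(id(n) == before)    # True: the list was extended in place, then rebound to itself

# And the n[0] flavor: the extend happens, then the rebinding blows up.
t = ([1, 2],)
try:
    t[0] += [3]           # the list is mutated, then the tuple assignment raises
except TypeError:
    pass
print(t)                  # ([1, 2, 3],) -- changed despite the TypeError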
Python's usually all explicit and pretty well-defined in how its basic
syntax and behaviors operate, and you usually don't really have to know
details about how a data-type works to predict exactly what it's doing:
in fact, it's often beneficial to not pay too much attention to such
details, and just assume the data type will work approximately as you'd
expect. That way people can slip something-something to you and wink and
say of /course/ it's a dict, darling. Try it, you'll like it, okay? This
sorta thing is encouraged, but it kinda depends on trusting objects to
behave a certain way and for things to be predictable in both how they
work and how they fail.
With "i = i + 1", I know that generally speaking, my "i" is being
assigned a new object and that's that, no matter what type "i" is.
(Okay: I do know that you could modify __add__ to do something
underhanded here, tweaking internal state and then returning self.
People going out of their way to behave unpredictably is not my
objection: supposedly easy and straightforward normal Python-fu being
inherently unpredictable is).
For example: I just /know/ that it doesn't matter who or what may have
their own binding to that object before I go and increment it, they
won't be affected and everything just will work fine. With augmented
assignment, I can't be sure of that. Now, while I admit, you generally
do have to keep track in your head of which of your data-types are
mutable vs immutable and take care with sharing mutables, the fact that
"n += x" is described and generally thought of as merely syntactical
sugar for:
n = n + x
... lets one easily think that this should be entirely safe, even with
mutable objects, because if += were merely syntactical sugar, it would
be. But it's not! Because += is wiggly. It can do more than one entirely
different kind of behavior.
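That wiggle is exactly what bites when something else shares the
binding -- a quick illustration (names made up):

a = [1, 2]
alias = a
a = a + [3]        # the "i = i + 1" spelling: builds a brand new list
print(alias)       # [1, 2] -- the other binding is untouched

a = [1, 2]
alias = a
a += [3]           # the augmented spelling: extends the shared list in place
print(alias)       # [1, 2, 3] -- the other binding changed underneath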
Anyways. </rant> I've been kinda annoyed at augmented assignment for
years now.
--
Stephen Hansen
... Also: Ixokai
... Mail: me+list/python (AT) ixokai (DOT) io
... Blog: http://meh.ixokai.io/