Steven D'Aprano said:
No, repeating it ninety-seven times doesn't make it easier to read, it
makes it *harder* to read, and I'm sure I don't need to explain why.
You do, if you claim you're consistent in your "never _wrong_"
assertion. How many totally useless repetitions are "never wrong"? I
claim: ZERO; even ONE totally useless statement is one too many. You
appear to want to put the bar elsewhere, but then it's not clear WHERE.
That's debatable. Why does Python have decorators when there was already a
perfectly usable syntax for setting a method to function(method)? And
let's not even mention x += 1 etc.
A decorator "takes away" some redundancy and repetition from an
important idiom:
@deco
def somefunction ...
    ...

stands for

def somefunction ...
    ...
somefunction = deco(somefunction)
where 'somefunction' needed to be repeated THREE times. Similarly,
though not quite as dramatically,
somevariable += whatever
can stand for
somevariable = somevariable + whatever
again taking away one (now needless) repetition. (Actually, the
existence of in-place addition for some types makes += even more useful,
as it polymorphically resolves to "what it _should_" resolve to.)
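Both equivalences are easy to check concretely; here's a minimal sketch
(all names, 'deco' included, made up purely for illustration):

def deco(f):
    f.marked = True                       # a trivial made-up decorator
    return f

@deco
def somefunction():
    pass

def otherfunction():
    pass
otherfunction = deco(otherfunction)       # the pre-decorator spelling

print(somefunction.marked, otherfunction.marked)    # True True

# The polymorphic resolution of +=: a list defines in-place addition
# (__iadd__) and mutates itself; an int does not, so += falls back to
# __add__ and simply rebinds the name.
alias = nums = [1, 2]
nums += [3]
print(alias)             # [1, 2, 3] -- mutated in place, alias sees it

other = count = 1
count += 1
print(count, other)      # 2 1 -- count was rebound, other is unaffected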
In sharp and enormous contrast, the construct you're defending ADDS
useless repetition -- a statement that carries NO relevant information
whatsoever, and in particular needlessly repeats a variable name.
How can you SERIOUSLY claim ANY equivalence between constructs that
REMOVE redundancy, and your defense of one that ADDS it?!
Sure, in the *specific* example given, the body of the function was so
short that it would be a pretty poor developer who didn't know it was a
global.
But in a more substantial function, one using lots of variables, it might
not be clear which were global and which weren't unless you studied the
code, line-by-line.
If a function is so long and complicated, and in particular uses so many
variables, that you lose track of what variables are local and which
ones global, then adding a 'global' statement (or more) is tantamount to
putting a bandaid over a large, gaping wound that's bleeding copiously
and unarrestably. Forget such pitiful attempts at half-hearted kludgey
"remedies", and refactor mercilessly -- that one, huge, hopelessly
confused and confusing function, MUST become a (hopefully small) set of
small, shiny, crystal-clear ones. In this sense, the presence of a
totally useless "global" statement, which may have been used in the vain
hope of effecting such an unworkable "remedy", is yet another red flag
waving -- it may indicate the programmer suspects he's overflowed the
boundaries of good taste and maximum sensible complication, while
lacking the guts to do the refactoring such a situation desperately
calls for.
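To make "refactor mercilessly" concrete, here's a deliberately tiny,
made-up sketch of the general idea (real cases are messier, of course):

# Before: one sprawling function that rebinds a global (abridged):
total = 0

def process(items):
    global total
    for item in items:
        total += item * 2     # ...imagine many more entangled steps here

# After: small, self-contained functions passing values explicitly:
def scaled(item):
    return item * 2

def process_refactored(items):
    return sum(scaled(item) for item in items)

print(process_refactored([1, 2, 3]))    # 12 -- no global state touched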
I'm not going to defend *any* of those practices. But I don't think
explicitly stating that a name is global, even when strictly unnecessary,
is in the same category. In practice, I wouldn't do so for a function that
was as short as the one the Original Poster used.
I think it's an even more horrible practice than all the others I list
(except for my later note on comments that lie). Not only would I never
use it, but I would never tolerate it in any way, shape, or form: I
would not pass a code review for any code using it, if a book used it or
defended it I would recommend to all potential readers to avoid the
book, if I was teaching a programming course and a student used it I
would fail the student, if I was interviewing a programming candidate
and the candidate used it I would not hire the candidate, and so on,
and so forth.
But consider also something like this:
def func():
    x, y = 1, 2
    z = x + y
    # lots more code doing many things here
    # some of which involve w
    return z + w
Let's pretend that there is sufficient code in there that it isn't obvious
at a glance that w is a global, okay?
If there's too much code before the FIRST use of w, so that it's not
obvious that w is never set before use (and thus must be global), then
'func' is too big and complicated.
There's an off-by-one error in the code, which we fix:
def func():
    x, y = 1, 2
    z = x + y
    # lots more code doing many things here
    # some of which involve w
    w = w + 1
    return z + w
"UnboundLocalError". Oops.
Now, I cheerfully admit that this scenario is contrived. Some people might
even argue that it is good for newbies to run into this error sooner
rather than later, but for those who don't think so, defensively inserting
a global statement might help prevent the issue from coming up.
Do you realize what you're advocating? A 'global w' would make that "w
= w + 1" (better expressed as "w += 1", of course) into a POLLUTION OF
THE GLOBAL NAMESPACE -- instead of an error easily identified by the
runtime (and easily fixed by removing that silly statement and using
instead a "return z + w + 1"), now thanks to the "defensively inserting"
you're ADVOCATING, you're introducing an exceedingly subtle shift in the
semantics of your whole program, including future calls to func and any
other function that uses that horridly-named 'w' (single-letter global
names are hardly a defensible practice, either).
If the coder has no clue as to whether w is local or global, then that
coder MOST DEFINITELY has no business whatsoever in REBINDING the
(global!) name w. "defensively" (HA!) putting yourself into a position
where you may end up rebinding global names WITHOUT KNOWING YOU'RE DOING
SO, i.e. POLLUTING the global namespace, is an even stronger reason than
any I'd yet advanced for loathing those ``useless'' global statements, and
strongly confirms my unbounded detestation of them.
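To see the contrast in a dozen lines, here's a minimal sketch (function
names made up, reusing the 'w' of the example above):

w = 10

def func_no_global():
    w = w + 1      # w is assigned in this body, so the compiler makes
    return w       # it local to the WHOLE function: UnboundLocalError

def func_with_global():
    global w
    w = w + 1      # silently REBINDS the module-level w
    return w

try:
    func_no_global()
except UnboundLocalError as e:
    print(e)                  # the error "easily identified by the runtime"

print(func_with_global())     # 11
print(func_with_global())     # 12 -- the global drifts on every call,
print(w)                      # 12    and every other reader of w sees it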
I'm a big believer in letting newbies walk before running. I'd rather see
beginner programmers over-use global than under-use it. You're welcome to
disagree, but since UnboundLocalError seems to be one of the more
perplexing errors newbies suffer from, I think it is better for them to
avoid it until they've got a little more experience.
I think encouraging newbies to overuse globals is a horrible practice,
and if you're ever teaching newbies you're seriously damaging their
chance to learn to program decently, or at least ensuring they'll take a
needlessly long time for that learning. And I've both taught and
practiced programming at a huge variety of levels, and quite
successfully, so I do NOT believe your opinions on this subject should
carry exactly the same weight as mine -- to be frank, I believe that in
this case your opinion has a NEGATIVE worth, and that by putting it in
practice you're seriously hurting the people you think you're helping.
Alex