Is this a valid import sequence?


Stef Mientki

This might be a very weird construction,
but it's the easiest way to translate another language into Python (for simulation).

Although it works, I'd like to know if this is a valid construction:

I've defined a class, like this,
attaching a not-yet-defined global to itself:

class T6963_device (tDevice):
    def __init__ (self):
        global LCD
        LCD = self


In the same module I have a function
that runs a method of the above class instance
and uses the global defined in the __init__ of the class:

def Write_LCD_Data ( data ):
    global LCD
    LCD.Write_Data ( data )

In another module I create one and only one instance of the class,
in the normal way:

Graphical_LCD = T6963_device('', Pos=[196,240], Color=wx.CYAN, Timer_On=True)


thanks
Stef Mientki
 

Steven D'Aprano

This might be a very weird construction,
but it's the easiest way to translate another language into Python (for simulation).

Although it works, I'd like to know if this is a valid construction:

Since it works, how can it NOT be a valid construction?

However, using global variables is almost always a bad idea. Passing
parameters around is really cheap in Python, that's almost always a better
solution.
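A minimal sketch of the parameter-passing alternative Steven suggests. The names here (Device, write_data) are stand-ins, not the OP's actual T6963 API: the point is only that the caller hands the instance in, so no module-level global is needed.

```python
class Device:
    """Hypothetical stand-in for the OP's T6963_device class."""
    def __init__(self):
        self.written = []

    def write_data(self, data):
        self.written.append(data)

def write_lcd_data(lcd, data):
    # The caller supplies the device instance explicitly.
    lcd.write_data(data)

lcd = Device()
write_lcd_data(lcd, 0x42)
```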
 

Stef Mientki

thanks Steven,
Since it works, how can it NOT be a valid construction?
ok that seems a plausible reasoning
However, using global variables is almost always a bad idea. Passing
parameters around is really cheap in Python, that's almost always a better
solution.
Yes I know,
but that's a fact of the "real thing" I'm simulating ;-)

cheers,
Stef
 

Scott David Daniels

Stef said:
... I've defined a class, like this, ...

class T6963_device (tDevice):
    def __init__ (self):
        global LCD
        LCD = self
... In the same module I've a function,
that runs a method of the above class instance, ...

def Write_LCD_Data ( data ):
    global LCD
    LCD.Write_Data ( data )

The global statement in Write_LCD_Data is completely unnecessary. The
only time you need "global" is when you want to rebind the global
name to another object (such as LCD = LCD + 1 or whatever). Here you
only read the global name-to-object mapping (though you may be using
methods on the named object to alter the referenced object). You only
need "global" when you need to "write" (re-bind) the global
name-to-object mapping.
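A short illustration of the rule Scott states: reading a global needs no declaration, rebinding one does, and mutating the object a global name refers to is not a rebind.

```python
counter = 0
log = []

def read_counter():
    return counter        # plain read: no "global" statement needed

def bump_counter():
    global counter        # rebinding the module-level name requires "global"
    counter = counter + 1

def record(item):
    log.append(item)      # mutating the object named by "log": still no "global"

read_counter()
bump_counter()
record("hello")
```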

--Scott David Daniels
 

Steven D'Aprano

The global statement in Write_LCD_Data is completely unnecessary. The
only time you need "global" is if you want to reassociate the global
name to another object (such as LCD = LCD + 1 or whatever).

That's technically true, but declaring it with global makes the code
self-documenting and therefore easier to read.

It's never _wrong_ to use the global statement, even if it is strictly
unnecessary for the Python compiler.
 

Stef Mientki

Steven said:
That's technically true, but declaring it with global makes the code
self-documenting and therefore easier to read.

It's never _wrong_ to use the global statement, even if it is strictly
unnecessary for the Python compiler.
Although I'm not an expert,
I guess you're both right.

thanks and cheers,
Stef Mientki
 

Alex Martelli

Steven D'Aprano said:
That's technically true, but declaring it with global makes the code
self-documenting and therefore easier to read.

It's never _wrong_ to use the global statement, even if it is strictly
unnecessary for the Python compiler.

So, repeat that global statement ninety-seven times -- that's not
"wrong", either, in exactly the same sense in which it's not "wrong" to
have it once -- the Python compiler will not complain. And by repeating
it over and over you make it less likely that a reader could miss it, so
it's even more "self-documenting" and "easier to read", right?

"Perfection is reached, not when there is no longer anything to
add, but when there is no longer anything to take away", as Antoine de
Saint-Exupery wrote. Since that global statement is utterly useless
(it's impossible to read and understand any substantial amount of Python
code without realizing that accessing a variable not locally assigned
means you're accessing a global, so the "self-documenting" character
claimed for that redundancy is quite fallacious), it IS perfectly
suitable to take away, and so it's at least a serious imperfection. It
violates Occam's Razor, by multiplying entities (specifically
statements) without necessity. It's just about as bad as sticking a
semicolon at the end of every statement (to make it "self-documenting"
that the statement ends there), parentheses around the conditions in if
and while statements and the argument in return statements (to make it
"self-documenting" where those expressions start and end), and a few
other less common ways to waste pixels, screen space, readers' attention
spans, and everybody's time. In other words, it's almost as bad as it
can get in Python without outright breakage of syntax or semantics
("almost" only because long comments that lie outright are worse:).


Alex
 

Steven D'Aprano

So, repeat that global statement ninety-seven times -- that's not
"wrong", either, in exactly the same sense in which it's not "wrong" to
have it once -- the Python compiler will not complain. And by repeating
it over and over you make it less likely that a reader could miss it, so
it's even more "self-documenting" and "easier to read", right?

No, repeating it ninety-seven times doesn't make it easier to read, it
makes it *harder* to read, and I'm sure I don't need to explain why.

"Perfection is reached, not when there is no longer anything to
add, but when there is no longer anything to take away", as Antoine de
Saint-Exupery wrote.

That's debatable. Why does Python have decorators when there was already a
perfectly usable syntax for setting a method to function(method)? And
let's not even mention x += 1 etc.

Since that global statement is utterly useless
(it's impossible to read and understand any substantial amount of Python
code without realizing that accessing a variable not locally assigned
means you're accessing a global, so the "self-documenting" character
claimed for that redundancy is quite fallacious),

Sure, in the *specific* example given, the body of the function was so
short that it would be a pretty poor developer who didn't know it was a
global.

But in a more substantial function, one using lots of variables, it might
not be clear which were global and which weren't unless you studied the
code, line-by-line.

it IS perfectly
suitable to take away, and so it's at least a serious imperfection. It
violates Occam's Razor, by multiplying entities (specifically
statements) without necessity. It's just about as bad as sticking a
semicolon at the end of every statement (to make it "self-documenting"
that the statement ends there), parentheses around the conditions in if
and while statements and the argument in return statements (to make it
"self-documenting" where those expressions start and end), and a few
other less common ways to waste pixels, screen space, readers' attention
spans, and everybody's time.

I'm not going to defend *any* of those practices. But I don't think
explicitly stating that a name is global, even when strictly unnecessary,
is in the same category. In practice, I wouldn't do so for a function that
was as short as the one the Original Poster used.

But consider also something like this:

def func():
    x, y = 1, 2
    z = x + y
    # lots more code doing many things here
    # some of which involve w
    return z + w

Let's pretend that there is sufficient code in there that it isn't obvious
at a glance that w is a global, okay?

There's an off-by-one error in the code, which we fix:

def func():
    x, y = 1, 2
    z = x + y
    # lots more code doing many things here
    # some of which involve w
    w = w + 1
    return z + w

"UnboundLocalError". Oops.


Now, I cheerfully admit that this scenario is contrived. Some people might
even argue that it is good for newbies to run into this error sooner
rather than later, but for those who don't think so, defensively inserting
a global statement might help prevent the issue from coming up.

I'm a big believer in letting newbies walk before running. I'd rather see
beginner programmers over-use global than under-use it. You're welcome to
disagree, but since UnboundLocalError seems to be one of the more
perplexing errors newbies suffer from, I think it is better for them to
avoid it until they've got a little more experience.
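The error described above can be reproduced in a few lines. This sketch shows why the later assignment breaks the earlier read: because w is assigned somewhere in the function body, the compiler treats w as local for the whole function.

```python
w = 10

def func():
    z = 3
    # w is assigned below, so the compiler treats w as local for the
    # WHOLE body -- and this read fails at runtime before the assignment.
    w = w + 1
    return z + w

try:
    func()
    raised = False
except UnboundLocalError:
    raised = True
```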
 

Alex Martelli

Steven D'Aprano said:
No, repeating it ninety-seven times doesn't make it easier to read, it
makes it *harder* to read, and I'm sure I don't need to explain why.

You do, if you claim you're consistent in your "never _wrong_"
assertion. How many totally useless repetitions are "never wrong"? I
claim: ZERO; even ONE totally useless statement is one too many. You
appear to want to put the bar elsewhere, but then it's not clear WHERE.

That's debatable. Why does Python have decorators when there was already a
perfectly usable syntax for setting a method to function(method)? And
let's not even mention x += 1 etc.

A decorator "takes away" some redundancy and repetition from an
important idiom:

@deco
def somefunction ...
    ...

stands for

def somefunction ...
    ...
somefunction = deco(somefunction)

where 'somefunction' needed to be repeated THREE times. Similarly,
though not quite as dramatically,

somevariable += whatever

can stand for

somevariable = somevariable + whatever

again taking away one (now needless) repetition. (Actually, the
existence of inplace addition for some types makes += even more useful,
as it polymorphically resolves to "what it _should_" resolve to:).
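The polymorphic resolution Alex mentions is easy to see with a list (which defines in-place addition) versus a tuple (which does not):

```python
a = [1, 2]
alias_a = a
a += [3]                   # list.__iadd__ mutates in place: the alias sees it
list_shared = alias_a is a

b = (1, 2)
alias_b = b
b += (3,)                  # tuples lack __iadd__: a new tuple is bound to b
tuple_shared = alias_b is b
```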

In sharp and enormous contrast, the construct you're defending ADDS
useless repetition -- a statement that carries NO relevant information
whatsoever, and in particular needlessly repeats a variable name.

How can you SERIOUSLY claim ANY equivalence between constructs that
REMOVE redundancy, and your defense of one that ADDS it?!

Sure, in the *specific* example given, the body of the function was so
short that it would be a pretty poor developer who didn't know it was a
global.

But in a more substantial function, one using lots of variables, it might
not be clear which were global and which weren't unless you studied the
code, line-by-line.

If a function is so long and complicated, and in particular uses so many
variables, that you lose track of what variables are local and which
ones global, then adding a 'global' statement (or more) is tantamount to
putting a bandaid over a large, gaping wound that's bleeding copiously
and unarrestably. Forget such pitiful attempts at half-hearted kludgey
"remedies", and refactor mercilessly -- that one, huge, hopelessly
confused and confusing function, MUST become a (hopefully small) set of
small, shiny, crystal-clear ones. In this sense, the presence of a
totally useless "global" statement, which may have been used in the vain
hope of effecting such unworkable "remedy", is yet another red flag
waving -- it may indicate the programmer suspects he's overflowed the
boundaries of good taste and maximum sensible complication, while
lacking the guts to do the refactoring such a situation desperately
calls for.

I'm not going to defend *any* of those practices. But I don't think
explicitly stating that a name is global, even when strictly unnecessary,
is in the same category. In practice, I wouldn't do so for a function that
was as short as the one the Original Poster used.

I think it's an even more horrible practice than all others I list
(except for my later note on comments that lie). Not only would I never
use it, but I would never tolerate it in any way, shape, or form: I
would not pass a code review for any code using it, if a book used it or
defended it I would recommend to all potential readers to avoid the
book, if I was teaching a programming course and a student used it I
would fail the student, if I was interviewing a programming candidate
and the candidate used it I would not hire the candidate, and so on,
and so forth.

But consider also something like this:

def func():
    x, y = 1, 2
    z = x + y
    # lots more code doing many things here
    # some of which involve w
    return z + w

Let's pretend that there is sufficient code in there that it isn't obvious
at a glance that w is a global, okay?

If there's too much code before the FIRST use of w, so that it's not
obvious that w is never set before use (and thus must be global), then
'func' is too big and complicated.

There's an off-by-one error in the code, which we fix:

def func():
    x, y = 1, 2
    z = x + y
    # lots more code doing many things here
    # some of which involve w
    w = w + 1
    return z + w

"UnboundLocalError". Oops.

Now, I cheerfully admit that this scenario is contrived. Some people might
even argue that it is good for newbies to run into this error sooner
rather than later, but for those who don't think so, defensively inserting
a global statement might help prevent the issue from coming up.

Do you realize what you're advocating? A 'global w' would make that "w
= w + 1" (better expressed as "w += 1", of course) into a POLLUTION OF
THE GLOBAL NAMESPACE -- instead of an error easily identified by the
runtime (and easily fixed by removing that silly statement and using
instead a "return z + w + 1"), now thanks to the "defensively inserting"
you're ADVOCATING, you're introducing an exceedingly subtle shift in the
semantics of your whole program, including future calls to func and any
other function that uses that horridly-named 'w' (single-letter global
names are hardly a defensible practice, either).

If the coder has no clue as to whether w is local or global, then that
coder MOST DEFINITELY has no business whatsoever in REBINDING the
(global!) name w. "defensively" (HA!) putting yourself into a position
where you may end up rebinding global names WITHOUT KNOWING YOU'RE DOING
SO, i.e. POLLUTING the global namespace, is an even stronger reason than
any I'd yet advanced for loathing those ``useless'' global statements, and
strongly confirms my unbounded detestation of them.
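What the "defensive" global actually does, in a sketch: the UnboundLocalError disappears, but every call now silently rebinds the module-level name, so later callers see different results.

```python
w = 0

def func():
    global w          # the "defensive" declaration being objected to
    z = 3
    w = w + 1         # no UnboundLocalError now -- but the global changes
    return z + w

first = func()        # w becomes 1, returns 4
second = func()       # w becomes 2, returns 5: same call, different answer
```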

I'm a big believer in letting newbies walk before running. I'd rather see
beginner programmers over-use global than under-use it. You're welcome to
disagree, but since UnboundLocalError seems to be one of the more
perplexing errors newbies suffer from, I think it is better for them to
avoid it until they've got a little more experience.

I think encouraging newbies to overuse globals is a horrible practice,
and if you're ever teaching to newbies you're seriously damaging their
chance to learn to program decently, or at least ensuring they'll take a
needlessly long time for that learning. And I've both taught and
practiced programming at a huge variety of levels, as well as quite
successfully, so I do NOT believe your opinions on this subject should
carry exactly the same weight as mine -- to be frank, I believe that in
this case your opinion has a NEGATIVE worth, and that by putting it in
practice you're seriously hurting the people you think you're helping.


Alex
 

Kay Schluehr

Since that global statement is utterly useless
(it's impossible to read and understand any substantial amount of Python
code without realizing that accessing a variable not locally assigned
means you're accessing a global, so the "self-documenting" character
claimed for that redundancy is quite fallacious), it IS perfectly
suitable to take away, and so it's at least a serious imperfection.

Allow me a pun: self is pretty self-documenting.

With Python 3.0 we finally get even two declarations for accessing a
name for assignment from an outer scope. Maybe the Python Zen should
be altered in the following way:

- Namespaces are one honking great idea -- let's do more of those!
+ Accessor declaratives are one honking great idea -- let's do more of
those!

Kay
 

Steven D'Aprano

On Sat, 23 Jun 2007 21:11:42 -0700, Alex Martelli wrote a lot, with lots
of YELLING.

Wow.

What can I say?

Given the amount of SHOUTING in your post, and the fact that you feel so
strongly about the trivial question of the redundant use of the global
statement that you would "fail a student" who did it -- even if they did
everything else correctly, efficiently and elegantly -- it seems to me
that you are beyond rational discussion on this subject.

Perhaps you should consider writing a PEP to make the redundant use of the
global statement a compile-time error? Then there would be absolutely
no doubt in anyone's mind that it is _wrong_ (and not just unnecessary or
redundant) to use the global statement in the circumstances discussed.

Then we can move on to removing floats of the form 1.e0, unary-plus on
numeric types, and string-slices like s[:].

I'm not sure where you got the idea that I'm "encouraging newbies to
overuse globals", when I wrote in an earlier post to this same thread:

"However, using global variables is almost always a bad idea. Passing
parameters around is really cheap in Python, that's almost always a better
solution."

If you manage to calm down enough to answer without exaggerating and
misrepresenting my views, I would like to hear your opinion of whether the
following two functions are equally as wrong:

def f1(gizmo):
    global spam  # holds the frommet needed for the gizmo
    gizmo.get_frommet(spam)

def f2(gizmo):
    # global spam holds the frommet needed for the gizmo
    gizmo.get_frommet(spam)

I'm sure they're both wrong, but I'd like to know if there are degrees of
wrongness.
 

Marc 'BlackJack' Rintsch

Steven said:
Perhaps you should consider writing a PEP to make the redundant use of the
global statement a compile-time error?

Sometimes I wish it were a compile time error, or at least triggered a
warning, when ``global`` is used at module level. It seems a
common error from people used to declaring variables at that level in other
languages.

Ciao,
Marc 'BlackJack' Rintsch
 

Scott David Daniels

Steven said:
On Sat, 23 Jun 2007 21:11:42 -0700, Alex Martelli wrote a lot, with lots
of YELLING.

Given the amount of SHOUTING in your post, and the fact that you feel so
strongly about the trivial question of the redundant use of the global
statement that you would "fail a student" who did it -- even if they did
everything else correctly, efficiently and elegantly -- it seems to me
that you are beyond rational discussion on this subject.

I, for one, appreciate a second voice responding to your (Steve's)
vehement rejection of my technically correct and non-condemnatory post
explaining that one use of global in the OP's code was superfluous.

You said (in the previous post):
> That's technically true, but declaring it with global makes the code
> self-documenting and therefore easier to read.
>
> It's never _wrong_ to use the global statement, even if it is strictly
> unnecessary for the Python compiler.

Your post led a newbie to presume the extra use of global was "good
style," while I think you'll find there is no such consensus.

--Scott David Daniels
 

Alex Martelli

Scott David Daniels said:
I, for one, appreciate a second voice responding to your (Steve's)
vehement rejection of my technically correct and non-condemnatory post
explaining that one use of global in the OP's code was superfluous.

Glad to hear this! I think the root of the issue is in learning to read
"superfluous" as a NEGATIVE word -- follow Occam, and learn to not
multiply entities beyond need:).

You said (in the previous post):

Your post led a newbie to presume the extra use of global was "good
style," while I think you'll find there is no such consensus.

I concur: having discussed style issues at many Python shops, I'm quite
convinced that the general consensus is closer to the "redundant is bad"
approach. Exhaustively listing all of the redundancies that are to be
eschewed would of course take far too long; a more common approach is to
try to identify those extremely few cases where redundancy IS explicitly
deemed OK (and leave all other redundancies intrinsically disapproved).

The cases I've seen with reasonable frequency for accepting certain
redundancies basically boil down to accepting some "redundant
parentheses". Python has many levels of priorities in expressions, and
while they do tend to work "just right" there are always some corner
cases where even a frequent Python coder MAY feel uncertain for a moment
(and these uncertainties grow for coders that also have to use, e.g., C,
or Fortran, &c, frequently). So, spelling things out as, e.g.,
(-a) ** b
versus
-(a ** b)
is not unreasonable (vs just coding '-a**b' and relying on the reader to
know exactly which of the two cases applies). An important subcase has
to do with tuples -- while I personally prefer to use parentheses around
tuples only where they're indispensable, I understand the opposite
stance, where parentheses are always placed around tuples (it may be
hard to memorize exactly all cases where they're required, e.g. when the
tuple is the expression in a listcomp...).

A more debatable case, IMHO, is slicing (and the related cases of range
and xrange). Do you ever write x[0:N:1], xrange(0, N), etc? Or are the
simpler x[:N], xrange(N), etc, always to be preferred? This is one of
the few cases where I've seen group consensus fail to emerge in
discussions about Python style even in close-knit teams...
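The spellings in question are exact equivalents, which is the heart of the style disagreement: the longer forms add characters but no information. A quick check (using range here, since xrange is Python 2 only):

```python
x = list(range(10))

# fully spelled-out slice vs. the minimal form
assert x[0:5:1] == x[:5]

# a full copy, three equivalent ways
assert x[0:len(x):1] == x[:] == list(x)

# range with and without the default start
assert list(range(0, 10)) == list(range(10))
```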


Alex
 

Michele Simionato

On Jun 24, 1:29 pm, Steven D'Aprano
I would like to hear your opinion of whether the
following two functions are equally as wrong:

def f1(gizmo):
    global spam  # holds the frommet needed for the gizmo
    gizmo.get_frommet(spam)

def f2(gizmo):
    # global spam holds the frommet needed for the gizmo
    gizmo.get_frommet(spam)

I'm sure they're both wrong, but I'd like to know if there are degrees of
wrongness.

I am not Alex Martelli, but I will tell you my opinion anyway.
To me f2 is not wrong: at worst you can say that the comment
is redundant since it is already clear from the code that
spam is a global, but it is not a big deal. As a code
reviewer I would not have had issues with f2. OTOH I would
have had serious issues with f1. Since the global
statement in correct Python code is solely used to declare
that a global variable is being set in an inner scope, I
would have to guess that:

1. function f1 is wrong; maybe the author cut and pasted it
from someplace, forgetting the line where the global
variable spam was set;

2. maybe f1 is right, but then the author forgot to remove
the global declaration after the cut & paste;

3. the author does not know Python, and he believes that he
has to use global to denote the fact that the method
gizmo.get_frommet(spam) is setting a global variable.

So I would have had to look at get_frommet to see that actually
'spam' is not set there, and finally I would have reached the
conclusion that

4. the author was completely wrong and used global without
knowing its meaning.

All that analysis would have cost me some time, potentially
a lot of time depending on the complexity of the code, and
all that time would have been wasted time.
So f1 is misleading code, and I consider misleading code
actually *worse* than wrong code, since it makes you waste
your time without a good reason.


Michele Simionato
 

Steven D'Aprano

On Jun 24, 1:29 pm, Steven D'Aprano

I am not Alex Martelli, but I will tell you my opinion anyway.
To me f2 is not wrong: at worst you can say that the comment
is redundant since it is already clear from the code that
spam is a global, but it is not a big deal. As a code
reviewer I would not have had issues with f2. OTOH I would
have had serious issues with f1. Since the global
statement in correct Python code is solely used to declare
that a global variable is being set in an inner scope, I
would have to guess that:

1. function f1 is wrong; maybe the author cut and pasted it
from someplace, forgetting the line where the global
variable spam was set;

2. maybe f1 is right, but then the author forgot to remove
the global declaration after the cut & paste;

3. the author does not know Python, and he believes that he
has to use global to denote the fact that the method
gizmo.get_frommet(spam) is setting a global variable.

So I would have had to look at get_frommet to see that actually
'spam' is not set there,

Why do you do that? I'm not arguing that you shouldn't, but I'm trying to
understand your reasoning. Are you assuming (for the sake of the argument)
that there's a bug somewhere in the code? If you're trying to track down a
bug, you'll likely need to look at get_frommet regardless of the presence
or absence of the global statement. Or are you trying to analyze the
entire module? If so, you also have to dig into get_frommet.

(I repeat, I'm not saying you shouldn't, but I'm trying to understand why
you think the way you do.)

and finally I would have reached the
conclusion that

4. the author was completely wrong and used global without
knowing its meaning.

So you're with Alex that "redundant" == "wrong"?

I still can't get my head around that. To me, redundant and wrong are
orthogonal, not synonyms.

This code is wrong but not redundant (assuming you have a need for such
a function):

def sin_deg(x):
    """Return the sine of x degrees."""
    return math.sin(x/math.pi*180)  # oops! should be x*math.pi/180


To me, this code is redundant but not wrong:

def sin(x):
    return math.sin(x)

It's not wrong, because it does everything that it is supposed to do, and
nothing that it isn't supposed to do.

Am I wrong?
 

Michele Simionato

To me, this code is redundant but not wrong:

def sin(x):
    return math.sin(x)

It's not wrong, because it does everything that it is supposed to do, and
nothing that it isn't supposed to do.

I told you, redundant/useless/misleading/poor code is worse than wrong:
wrong code speaks (you see the bug, you have no choice but to fix it)
whereas redundant code is silent: you see how damaging it is only when
doing maintenance, i.e. too late, so it tends to perpetuate itself
forever (whereas a bug *has* to be fixed, otherwise the application
does not work).


Michele Simionato
 
