def a((b,c,d),e):


AdSR

Fellow Pythonistas,

Please check out

http://spyced.blogspot.com/2005/04/how-well-do-you-know-python-part-3.html

if you haven't done so yet. It appears that you can specify a function
explicitly to take n-tuples as arguments. It actually works, checked
this myself. If you read the reference manual at
http://docs.python.org/ref/function.html
really carefully, you will find that it is indeed part of the language
spec, but it's a likely candidate for the least advertised Python
feature. Small wonder since it looks like one of those language
features that make committing atrocities an order of magnitude easier.

Has anyone actually used it in real code?

Cheers,

AdSR
 

Simon Percivall

You can always unpack a tuple that way, like in:

>>> import sys
>>> for (index, (key, value)) in enumerate(sys.modules.iteritems()):
...     pass
 

Michael Spencer

AdSR said:
> Fellow Pythonistas,
>
> Please check out
>
> http://spyced.blogspot.com/2005/04/how-well-do-you-know-python-part-3.html
>
> if you haven't done so yet. It appears that you can specify a function
> explicitly to take n-tuples as arguments. It actually works, checked
> this myself. If you read the reference manual at
> http://docs.python.org/ref/function.html
> really carefully, you will find that it is indeed part of the language
> spec, but it's a likely candidate for the least advertised Python
> feature.

See also the source of inspect.getargs for just how much this complicates the
argument-passing logic!
> Small wonder since it looks like one of those language
> features that make committing atrocities an order of magnitude easier.
>
> Has anyone actually used it in real code?

It appears in a handful of places in the stdlib, mostly tests:
#Search C:\Python23\Lib
# Files *.py
# For def [\w]+\(\(
c:\python23\lib\test\test_compile.py(49) def comp_args((a, b)):
c:\python23\lib\test\test_compile.py(53) def comp_args((a, b)=(3, 4)):
c:\python23\lib\test\test_grammar.py(159) def f5((compound, first), two): pass
c:\python23\lib\test\test_scope.py(318) def makeAddPair((a, b)):
c:\python23\lib\test\test_scope.py(319) def addPair((c, d)):
c:\python23\lib\site-packages\wx-2.5.3-msw-ansi\wx\lib\imageutils.py(36) def makeGray((r,g,b), factor, maskColor):
c:\python23\lib\cgitb.py(82) def html((etype, evalue, etb), context=5):
c:\python23\lib\cgitb.py(168) def text((etype, evalue, etb), context=5):
c:\python23\lib\urlparse.py(118) def urlunparse((scheme, netloc, url, params, query, fragment)):
c:\python23\lib\urlparse.py(127) def urlunsplit((scheme, netloc, url, query, fragment)):
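The urlparse entries are the easiest to try out: the caller just passes an ordinary 6-tuple, and the unpacking happens in the signature. A quick sanity check (shown with the modern import path, urllib.parse; in Python 2 the module is simply urlparse):

```python
# urlunparse consumes a (scheme, netloc, url, params, query, fragment) 6-tuple;
# in Python 2.3 that unpacking is written directly in the def.
from urllib.parse import urlunparse  # Python 2: from urlparse import urlunparse

parts = ('http', 'example.com', '/path', '', 'q=1', '')
assert urlunparse(parts) == 'http://example.com/path?q=1'
```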
 

Fredrik Lundh

AdSR said:
> Small wonder since it looks like one of those language features
> that make committing atrocities an order of magnitude easier.

eh?

def f((a, b)):
    ...

is short for

def f(tmp):
    a, b = tmp
    ...

if you think this is an "atrocity", maybe programming isn't for you.

> Has anyone actually used it in real code?

yes. grep the standard library for a number of typical use cases.

</F>
 

Fredrik Lundh

Michael said:
> See also the source of inspect.getargs for just how much this complicates the
> argument-passing logic!

it doesn't complicate the argument-passing in any way at all -- it complicates
the reverse engineering code a bit, though, since it has to convert the bytecode
for

def f(tmp):
    a, b = tmp
    ...

back to the original source form

def f((a, b)):
    ...

but that has absolutely nothing whatsoever to do with how argument passing
works at run time.

</F>
 

AdSR

Fredrik Lundh said:
> if you think this is an "atrocity", maybe programming isn't for you.

My resume might suggest otherwise but I guess that's not the main topic
here. Maybe I got carried away -- this one took me completely by
surprise.

Anyway, this gets interesting:

def z(((a, b), (c, d)), (e, f)):
    pass

although I see that it could be perfectly valid in some contexts.
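For reference, the nested signature is just shorthand for explicit unpacking at the top of the function; a sketch of the equivalent (the parameter names pair_of_pairs and ef are made up here), which also shows what a call binds:

```python
# Equivalent of: def z(((a, b), (c, d)), (e, f)): ...
def z(pair_of_pairs, ef):
    (a, b), (c, d) = pair_of_pairs  # two levels of nesting
    e, f = ef
    return a, b, c, d, e, f

assert z(((1, 2), (3, 4)), (5, 6)) == (1, 2, 3, 4, 5, 6)
```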

Cheers,

AdSR
 

Diez B. Roggisch

AdSR said:
> Fellow Pythonistas,
>
> Please check out
>
> http://spyced.blogspot.com/2005/04/how-well-do-you-know-python-part-3.html
>
> if you haven't done so yet. It appears that you can specify a function
> explicitly to take n-tuples as arguments. It actually works, checked
> this myself. If you read the reference manual at
> http://docs.python.org/ref/function.html
> really carefully, you will find that it is indeed part of the language
> spec, but it's a likely candidate for the least advertised Python
> feature. Small wonder since it looks like one of those language
> features that make committing atrocities an order of magnitude easier.
>
> Has anyone actually used it in real code?

Yes, but usually not so much in function arguments but more in
list-comprehensions or other places where unpacking was useful. I love the
feature - I just don't have nested enough data to use it more :)

What python offers in this respect can be seen as a limited form of
pattern-matching known from functional programming - and instead of being
considered an atrocity it's actually frequently requested to be enhanced.
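A minimal sketch of the list-comprehension use mentioned above (nested unpacking in the loop target works on any sequence of nested tuples; the data here is invented for illustration):

```python
# each element is a (label, (x, y)) pair; all three names bind in the target
data = [('a', (1, 2)), ('b', (3, 4))]
sums = [x + y for (label, (x, y)) in data]
assert sums == [3, 7]
```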
 

AdSR

Diez said:
> Yes, but usually not so much in function arguments but more in
> list-comprehensions or other places where unpacking was useful. I love the
> feature - I just don't have nested enough data to use it more :)

I use tuple unpacking in its typical uses, it's one of the first
language features I learned about. Somehow it never occurred to me that
you could use it in function arguments this way - I only knew f(*args,
**kwargs) style in this context. That's what I made the whole fuss
about...

AdSR
 
François Pinard

[Diez B. Roggisch]
> AdSR wrote:
> > It appears that you can specify a function explicitly to take
> > n-tuples as arguments. [...] Has anyone actually used it in real
> > code?

I do not use it often in practice, but sometimes, yes. If the feature
was not there, it would be easy to do an explicit tuple unpacking from
the argument at the start of the function. It allows me to spare
inventing a name for the compound formal argument. :)

> [...] as a limited form of pattern-matching [...]

In one application, written long ago, I had a flurry of functions
kept in a list, which were tried in turn until the call does not
raise an Exception, so indicating a match. Since then, I used other
means, first hoping to save at least the time it takes for creating a
traceback object (but I did not time it), and later trying to get a
better-than-linear time while trying to find the correct function.

I never considered the feature to be atrocious. Implicit tuple
unpacking occurs in a few places in Python already, it is only elegant
that it also occurs while functions receive their argument. The most
useful place for implicit tuple unpacking, in my experience, is likely
at the left of the `in' keyword in `for' statements (and it is even
nicer when one avoids extraneous parentheses).
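A small sketch of that `for' case, without the extraneous parentheses (the points dict is invented for illustration; Python 2 would spell the method iteritems()):

```python
# implicit tuple unpacking to the left of `in'
points = {'origin': (0, 0), 'corner': (3, 4)}
distances = {}
for name, (x, y) in points.items():
    distances[name] = (x * x + y * y) ** 0.5
assert distances == {'origin': 0.0, 'corner': 5.0}
```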
 

John Machin

AdSR said:
> Fellow Pythonistas,
>
> Please check out
>
> http://spyced.blogspot.com/2005/04/how-well-do-you-know-python-part-3.html
>
> if you haven't done so yet. It appears that you can specify a function
> explicitly to take n-tuples as arguments. It actually works, checked
> this myself. If you read the reference manual at
> http://docs.python.org/ref/function.html
> really carefully, you will find that it is indeed part of the language
> spec, but it's a likely candidate for the least advertised Python
> feature. Small wonder since it looks like one of those language
> features that make committing atrocities an order of magnitude easier.

Thanks for pointing this out. However I see no atrocity potential here
-- what did you have in mind?

See below. Better documentation in the "def" (even better than having
say "year_month_day" instead of my lazy "dt_tup"). No overhead;
byte-code is the same.
...     year, month, day = dt_tup
...     return

  2           0 LOAD_FAST                0 (dt_tup)
              3 UNPACK_SEQUENCE          3
              6 STORE_FAST               3 (year)
              9 STORE_FAST               1 (month)
             12 STORE_FAST               2 (day)

  3          15 LOAD_CONST               0 (None)
             18 RETURN_VALUE

...     return

  1           0 LOAD_FAST                0 (.0)
              3 UNPACK_SEQUENCE          3
              6 STORE_FAST               1 (year)
              9 STORE_FAST               2 (month)
             12 STORE_FAST               3 (day)

  2          15 LOAD_CONST               0 (None)
             18 RETURN_VALUE
Cheers,
John
 

Diez B. Roggisch

AdSR said:
> I use tuple unpacking in its typical uses, it's one of the first
> language features I learned about. Somehow it never occurred to me that
> you could use it in function arguments this way - I only knew f(*args,
> **kwargs) style in this context. That's what I made the whole fuss
> about...

Well, if you think about it the whole positional argument passing is nothing
more than tuple unpacking. It's like having an anonymous variable that gets
unpacked:

def foo(a, b, c = i_m_so_anonymous):
    pass


So it's just orthogonal to have the full functionality of unpacking
available for function arguments - including the nested tuples.
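The same point viewed from the call site: `f(*args)` spreads a tuple across the parameter list, which is the mirror image of that anonymous unpacking. A minimal sketch:

```python
def foo(a, b, c):
    return a, b, c

args = (1, 2, 3)
# binding the parameters behaves like `a, b, c = args`
assert foo(*args) == foo(1, 2, 3) == (1, 2, 3)
```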
 

AdSR

John Machin said:
> Thanks for pointing this out. However I see no atrocity potential here
> -- what did you have in mind?

Bad choice of words. I meant obfuscated, something like

def z(((a, b), (c, d)), e, f):
    pass

but much worse. But it looks like there is nothing unusual about it
after all. Oh, well...

AdSR
 

George Sakkis

François Pinard said:
> The most useful place for implicit tuple unpacking, in my experience,
> is likely at the left of the `in' keyword in `for' statements (and
> it is even nicer when one avoids extraneous parentheses).

... and would be nicest (IMO) if default arguments and *varargs were
allowed too; check http://tinyurl.com/dcb2q for a relevant thread.

George
 
François Pinard

[George Sakkis]
> François Pinard wrote:
> ... and would be nicest (IMO) if default arguments and *varargs were
> allowed too; check http://tinyurl.com/dcb2q for a relevant thread.

It's appealing, indeed, trying to create more uniformity between tuple
unpacking and argument passing. There are two approaches towards such
uniformity, either upgrading tuple unpacking (as the above thread
discusses) or downgrading argument passing (as suggested by those who
found the current behaviour atrocious).

I started recently to study the R system and language, and saw many good
ideas in there about argument passing. Translated into Python terms, it
would mean that `*varargs' and `**keywords' are not necessarily last,
that named keywords may be intermixed with positional arguments, that
keywords may be abbreviated, and, much more hairy, that the default
values for keywords are not pre-evaluated at `def' time, and that
the computation of actual expressions given as arguments is lazily
postponed until their first use within the function. It surely looks
all strange at first, but these choices are surprisingly productive in
practice, as I merely begin to understand. Curious minds may start at
http://cran.r-project.org/doc/manuals/R-lang.html#Arguments and read
down. I do not know if there will ever be cross-pollination between R
and Python, but I would guess good things might come out of this...
 

George Sakkis

François Pinard said:
> I started recently to study the R system and language, and saw many good
> ideas in there about argument passing. Translated into Python terms, it
> would mean that `*varargs' and `**keywords' are not necessarily last,
> that named keywords may be intermixed with positional arguments, that

This would be neat indeed. Occasionally I come across situations where
I wished to be able to specify default arguments after positional, as
for example in "def accumulate(*items, default=0)". The current
possible workarounds are:
- pass an iterable instead of positional arguments: "def
accumulate(items, default=0)". This is not always elegant, especially
if the function is called with few independently derived items (as
opposed, for example, to items derived by a list/generator
comprehension, which is already an iterable).
- pass named keywords instead of default: "def accumulate(*items,
**kwds)". I tend to think of **kwds as a nice feature for functions
with almost open-ended functionality, that are intended to be extended in
the future with more customization options. In more typical cases though of
a function with one or two defaults, using **kwds obscures the
function's signature without a visible benefit. Also, I'm not sure of
the efficiency penalty imposed by constructing and passing a dict of
length 1 or 2 instead of default arguments.
- Put the default(s) before the positional arguments: "def
accumulate(default=0, *items)". This practically negates the purpose of
using defaults in the first place since the first passed argument is
bound to default.

Allowing non-default arguments after *varargs doesn't make sense, but
it does for default arguments. The parameter binding rule would just
need to be augmented so that default parameter arguments after *varargs
would be bound to their default value unless specified explicitly in
the call as named arguments:

accumulate(1,2) == accumulate(1,2, default=0) and
accumulate([1],[2],default=[]) == accumulate([1],default=[], [2]) ==
accumulate(default=[], [1], [2])
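For comparison, the **kwds workaround from the second bullet looks roughly like this (accumulate and default are just the running example here, not a real API; note that later Python versions added keyword-only arguments, which allow the `def accumulate(*items, default=0)` spelling directly):

```python
def accumulate(*items, **kwds):
    # emulate `def accumulate(*items, default=0)`, not expressible in Python 2
    total = kwds.pop('default', 0)
    if kwds:
        raise TypeError('unexpected keyword arguments: %r' % sorted(kwds))
    for item in items:
        total = total + item
    return total

assert accumulate(1, 2) == 3
assert accumulate(1, 2, default=10) == 13
assert accumulate(default=5) == 5
```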
> keywords may be abbreviated, and much more hairy,

Hmm.. -1 on this. It may save a few keystrokes, but it's not good for
readability and maintainability.
> that the default
> values for keywords are not pre-evaluated at `def' time, and that


Definitely a +1 on this. I've always seen pre-evaluation as a wart,
especially with respect to mutable default values. If some sort of
state needs to be preserved between calls, the right way is to
encapsulate it in a class, not in a default value.
> the computation of actual expressions given as arguments is lazily
> postponed until their first use within the function.

Is this like an implicit lambda before each argument? If so, why does
it have to be restricted to function arguments? It seems to me that
argument passing and lazy evaluation are orthogonal dimensions. Let's
keep the "should python become lazy?" question for a future thread :)

Concerning default argument expressions, a neat feature would be to
allow an expression to refer to other arguments, as in "def
foo(a,b,s=a+b)" instead of the current workaround which goes like:

def foo(a, b, s=None):
    if s is None: s = a + b

Or for a more exotic use:

def prologLikeSum(a=s-b, b=s-a, s=a+b):
    return a,b,s

prologLikeSum(1,2) == prologLikeSum(1,s=3) == prologLikeSum(b=2,s=3) ==
(1,2,3)

This seems pretty hard to change though. For one thing, it would
require new syntax to denote which names in the expression refer to
other arguments instead of the enclosing scope. Also the binding of
arguments to values would no more be considered "parallel"; the order
of the bindings would be significant, and even worse, it would have to
be computed for each call, as the prologLikeSum example shows.

Probably-I'm-just-rambling-ly yrs
George
 
François Pinard

[George Sakkis]
> Allowing non-default arguments after *varargs doesn't make sense,

In R, one may use the ... format argument anywhere in the argument list,
but may later test if a non-default argument was provided or not. I
tend to write R a bit like I would write Python, but hopefully, I'll
eventually understand R enough that I could break that habit.
> Hmm.. -1 on this. It may save a few keystrokes, but it's not good for
> readability and maintainability.

That was my first impression too. Yet, interactively, I found that
feature very convenient. Already with Python, I use shortcuts
interactively that I would not write nor keep in real, saved, permanent
programs. And for very common functions and features, which are not
really fluctuating anymore, some abbreviations became well known idioms.
> Is this like an implicit lambda before each argument?

Not exactly true, but that's surely a way of understanding it. Such
things are not new: I first saw them in Algol-60, except that once an
argument has been evaluated, it is cached and not evaluated again. It's
true that in R, much more than in Python, evaluation of an argument may
often be computationally expensive.

Moreover, in R, the initial writing of the argument is preserved in
the form of a parsed tree which can be operated upon. This allows
for strange things (at least for a Python eye), like computing the
symbolic derivative of an argument. I toyed with this facility to build
mathematical images, and then, animations. For an example, see:

http://pinard.progiciels-bpi.ca/plaisir/NRart/

and from there, click on `nr.image' near the end.
> If so, why does it have to be restricted to function arguments? It
> seems to me that argument passing and lazy evaluation are orthogonal
> dimensions.

In R, laziness is automatic while calling functions, but not otherwise,
and from what I saw so far, less meaningful in other contexts anyway.
However, I think laziness is available explicitly if needed (there is a
library function that returns a "promise" of its argument).
> Or for a more exotic use:
> def prologLikeSum(a=s-b, b=s-a, s=a+b):
>     return a,b,s
> prologLikeSum(1,2) == prologLikeSum(1,s=3) == prologLikeSum(b=2,s=3) ==
> (1,2,3)

This is exactly how R people use the feature most of the times (at least
so far that I naively saw, as I'm still pretty new at all this).
> This seems pretty hard to change though.

Oh, I would not even dream about it for Python. The idea would have to
make its way first within the developers, and this might take years, if
ever. The best I (we) could do is keep the idea in the air, for a good
while. It would likely never survive all the debates it would generate.
But who knows! :)
> [...] and even worse, [bindings] would have to be computed for each
> call, as the prologLikeSum example shows.

Yet, already, as it stands, argument passing in Python is not innocuous.
A bit more, a bit less, nobody would notice! :)
 

George Sakkis

François Pinard said:
> That was my first impression too. Yet, interactively, I found that
> feature very convenient. Already with Python, I use shortcuts
> interactively that I would not write nor keep in real, saved, permanent
> programs. And for very common functions and features, which are not
> really fluctuating anymore, some abbreviations became well known
> idioms.

Yes, interactive use can be very different from saved modules to be
used, read and modified in the future. However if such a feature is
part of the language, it is up to the programmer's experience and
responsibility to use it only interactively. As for interactive
shortcuts, I've been using IPython for some months now as my standard
interpreter; with features such as tab completion, logged history,
'macros' and integration with the shell, shortcuts are everywhere! Tab
completion does not work for named function arguments yet, but it
should be possible with introspection.

George
 

Bengt Richter

George Sakkis said:
> ... and would be nicest (IMO) if default arguments and *varargs were
> allowed too; check http://tinyurl.com/dcb2q for a relevant thread.

You can be a little devious about the left of the 'in' in 'for' statements:

----< tupk.py >------------------------------
class Tupk(object):
    def _fset(self, arg):
        # implement unpacking (a, (x, y='default'))
        self.a = arg[0]
        if type(arg[1]) is not tuple: # accept (a, x) in place of (a,(x,))
            self.x = arg[1]
            self.y = 'default for non-tuple'
        else:
            self.x = arg[1][0]
            self.y = len(arg[1])==2 and arg[1][1] or 'default'
    u = property(fset=_fset)
    def __iter__(self):
        return iter((self.a, self.x, self.y))

def test():
    upk = Tupk()
    for upk.u in [(1,(2,3)), (4,(5,)), (7,8)]:
        print upk.a, upk.x, upk.y
    upk.u = (9,10)
    a,b,c = upk
    print 'a=%r, b=%r, c=%r' % (a,b,c)
    print list(upk), tuple(upk)

if __name__=='__main__': test()
---------------------------------------------
---------------------------------------------
Output:
1 2 3
4 5 default
7 8 default for non-tuple
a=9, b=10, c='default for non-tuple'
[9, 10, 'default for non-tuple'] (9, 10, 'default for non-tuple')
You could obviously give Tupk an __init__(fmt, defaults) method that would accept
an unpacking spec like 'a, (x, y=%0))', [<default-value 0>]
And give its instances a __call__ method so you can use it like
a,b,c = upk('a, (x, y=%0))', [555])((7,8)) => a,b,c == (7, 8, 555)

How useful how often though?

Regards,
Bengt Richter
 

Greg Ewing

AdSR said:
> if you haven't done so yet. It appears that you can specify a function
> explicitly to take n-tuples as arguments.
>
> Has anyone actually used it in real code?

Yes. In PyGUI I have some point and rectangle manipulation
utilities that do things like

def add_pt((x1, y1), (x2, y2)):
    return (x1 + x2, y1 + y2)

In cases like this, it can help to make things more concise
and probably also slightly more efficient.
> it looks like one of those language features that make
> committing atrocities an order of magnitude easier.

I don't remember ever being seriously burned by using it.
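For completeness, the explicit-unpacking spelling of the same utility (the form the compiler generates anyway, per Fredrik's expansion earlier in the thread), which runs on any Python version:

```python
def add_pt(p1, p2):
    # equivalent of: def add_pt((x1, y1), (x2, y2)): ...
    (x1, y1), (x2, y2) = p1, p2
    return (x1 + x2, y1 + y2)

assert add_pt((1, 2), (3, 4)) == (4, 6)
```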
 
