Python syntax in Lisp and Scheme

Kenny Tilton

Pascal said:
Many programming languages require you to build a model upfront, on
paper or at least in your head, and then write it down as source code.
This is especially one of the downsides of OOP - you need to build a
class hierarchy very early on without actually knowing if it is going to
work in the long run.

Whoa! The MOP and CLOS went to a lot of trouble to create an OOP for
Lisp that lived up to the Lisp heritage of figuring things out as we go.
I am forever refactoring class hierarchies, dragging slots from here to
there, adding some, erasing others, changing initforms and inheritance.
Best of all, I can do all this for a couple of hours after landing in a backtrace, then simply pick an appropriate stack frame from which to restart, and all the existing instances adjust themselves on the fly.
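
A minimal sketch of the kind of thing I mean (class and slot names are invented purely for illustration):

(defclass reactor ()
  ((temperature :initarg :temperature :initform 0 :accessor temperature)))

(defvar *r* (make-instance 'reactor :temperature 300))

;; Later, redefine the class in the running image -- add a slot, change an
;; initform, whatever.  Existing instances are updated on the fly the next
;; time they are touched; the new slot picks up its initform.
(defclass reactor ()
  ((temperature :initarg :temperature :initform 20 :accessor temperature)
   (pressure :initform 1 :accessor pressure)))

(pressure *r*)  ; => 1, without rebuilding *R*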

kenny
 
Raffael Cavallaro

Matthias Blume said:
Most of the things that macros can do can be done with HOFs with just
as little source code duplication as with macros.

Most, but not all. From <http://okmij.org/ftp/papers/Macros-talk.pdf>

"One sometimes hears that higher-order functions (and
related non-strictness) make macros unnecessary. For
example, In Haskell, 'if' is a regular function. However,
every language with more syntax than lambda-calculus
has phrases that are not expressions. Examples of such
second-class forms are: type, module, fixity and other
declarations; binding forms; statements. Only macros
can expand into a second-class object. The result of a
function is limited to an expression or a value."

Matthias Blume continued:
(And with macros only the source code does not get duplicated, the same not being true for compiled code. With HOFs even executable code duplication is often avoided -- depending on compiler technology.)

So you're willing here to trade away some readability to keep compiled code size down. The pro-macro camp (myself included) finds that macros make source code easier to read and write than the equivalent HOF solution. We're willing to pay a little compiled code size for that ease of use, especially when this means you can write your code in what amounts to a domain-specific language.

Matthias Blume also said:
This is false. Writing your own macro expander is not necessary for getting the effect. The only thing that macros give you in this regard is the ability to hide the lambda-suspensions.

But this hiding of the lambda-suspensions is the whole point. Why look
at how the code works unless you have to? Why not work in a syntax, a
domain specific language, that matches the problem? Put the complexity
into one place (the macro) and make the rest of the code easier to
write, and clearer to read.
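
To make that concrete, here is a toy sketch -- CALL-WITH-TIMING and WITH-TIMING are made-up names, not from any library. The HOF version makes every caller spell out the lambda; the macro hides it:

(defun call-with-timing (thunk)
  (let ((start (get-internal-real-time)))
    (prog1 (funcall thunk)
      (format t "~&took ~D ticks~%" (- (get-internal-real-time) start)))))

(defmacro with-timing (&body body)
  `(call-with-timing (lambda () ,@body)))

;; HOF style:   (call-with-timing (lambda () (run-simulation)))
;; macro style: (with-timing (run-simulation))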

For me, macros are about making the code one writes match the problem
one is thinking about. HOFs seem to me to be about looking cleverly
functional, not making the code look like the problem domain.
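
And on the quoted point about second-class forms: only a macro can expand into a top-level definition. A tiny invented example (DEFINE-GETTER is not from any library):

(defmacro define-getter (slot)
  ;; Expands into a DEFUN -- a definition, which no function could return.
  `(defun ,(intern (concatenate 'string "GET-" (symbol-name slot))) (plist)
     (getf plist ,(intern (symbol-name slot) :keyword))))

(define-getter temperature)
;; behaves as if we had written:
;; (defun get-temperature (plist) (getf plist :temperature))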
 
Greg Ewing (using news.cis.dfn.de)

Alex said:
Besides,
"if you want PL/I you know where to find it" has nice precedents (in
the only other language which was widely successful in the real world
while adhering to "provide only one way to perform an operation" as
one of its guiding principles -- not perfectly, but, close enough:).

Pardon? Wasn't PL/I the language that had two wildly different
syntaxes for declaring variables, one bearing a close resemblance
to Fortran, and the other looking suspiciously like Cobol?
(All right, two... there are *two* ways to do it...)
 
Bengt Richter

Bengt Richter wrote:
...

In theory, yes, I think it could (and wrt my similar idea with 'do' it has the advantage of not requiring a new keyword). In practice, trying to hack the syntax to allow it seems like a bit of a nightmare. Wanna try your hand at it? I'm thinking of Grammar/Grammar and Modules/parsermodule.c ... Also tokenizer.c, so as not to ignore indentation when tokenizing a nameless def inside a bracketed expression where (in|de)dents are otherwise ignored.

The thing is, the current tokenizer doesn't know def from foo, just that they're names. So either indenting has to be generated all the time, and the job of ignoring it passed on upwards, or the single keyword 'def' could be recognized by the parser in a bracketed context, and it would generate a synthetic indent token in front of the def name token, as wide as if all spaces preceded the def, and then continue doing indent/dedent generation as for a normal def until the def suite closed, at which point it would resume ordinary expression processing (if it was within brackets -- otherwise it would just be a discarded expression evaluated in statement context, and in/de/dent processing would be on anyway). (This is speculative until really getting into it ;-) Special-casing on a keyword in the tokenizer might be a practical implementation shortcut, but it wouldn't be very aesthetic ;-/

IWT there would also be changes in the compiler/code generator so it can handle generating code for an anonymous def which will plug in like lambda in an expression, as opposed to binding a name in a statement context -- hopefully a slight change, since it would be stacking a code object and calling makefunction either way. The anonymous def just won't do the store to bind a name. Generating the code object for the nameless def should be identical to the normal def, IWT, almost by definition ;-)

But the whole thing will be a bit of a chore I'm sure ;-)
Would it have a chance of getting adopted, do you think?

(Recent [cross]postings from the friendly :)))) people got me playing with writing a little toy Scheme environment in Python, which is sooo pleasant compared to using MASM on a 16-MHz 386 with 2 MB of RAM (whoo, time flies).)

Regards,
Bengt Richter
 
Daniel P. M. Silva

Alex said:
Of course not -- but it *cannot possibly* do what Gat's example of macros, WITH-MAINTAINED-CONDITION, is _claimed_ to do... "reason" about the condition it's meant to maintain (in his example a constraint on a variable named temperature), about the code over which it is to be maintained (three functions, or macros, that start, run, and stop the reactor), presumably infer from that code a model of how a reactor _works_, and rewrite the control code accordingly to ensure the condition _is_ in fact being maintained. A callable passed as a parameter is _atomic_ -- you call it zero or more times with arguments, and/or you store it somewhere for later calling, *THAT'S IT*. This is _trivially simple_ to document and reason about, compared to something that has the potential to dissect and alter the code it's passed to generate a completely new one, most particularly when there are also implicit models of the physical world being inferred and reasoned about.

Don't fear code that rewrites code. Hell, if Psyco didn't exist (or if I didn't think it did a good enough job), I'd sure like to say:

optimized:
    for x in range(2**63):
        pass

and let an 'optimized' special form rewrite my code into just "pass".

Or maybe I don't want to manually delete unexported functions in my modules:

clean_module exporting [a, b, d] and exporting_from {some.module: [fun1, fun2]}:
    import some.module
    def a(): pass
    def b(): pass
    def c(): pass
    def d(): pass

becomes:

import some.module
def a(): pass
def b(): pass
def c(): pass
def d(): pass
fun1 = some.module.fun1
fun2 = some.module.fun2
del c

Or cleanly write code using another module's namespace:

using_module some.module:
    print x, some.other.module.y
using_module __main__:
    print x

In this case using_module inspects the code and rewrites it, leaving all
qualified identifiers untouched but modifying the global references, so the
example becomes:

print some.module.x, some.other.module.y
print x

Is that so dangerous?

Alex also said:
Without macros, when you see you want to design a special-purpose language you are motivated to put it OUTSIDE your primary language, and design it WITH its intended users, FOR its intended purposes, which may well have nothing at all to do with programming. You parse it with a parser (trivial these days, trivial a quarter of a century ago), and off you go.

Hmm, but isn't every program a language?

http://blogs.gotdotnet.com/emeijer/PermaLink.aspx
ea13a4da-7421-44af-99e8-fc86de84e29c

Guy Steele agrees:

http://java.sun.com/features/2003/05/steele_qa.html

////////////////////
Q: You have written that "a language design can no longer be a thing. It
must be a pattern -- a pattern for growth - a pattern for growing the
pattern for defining the patterns that programmers can use for their real
work and their main goal." You said that a good programmer does not just
write programs, but engages in language design, building on the frame of a
base language. Could you elaborate on this?

A: Sure. Every time you write a new function, a new method, and give it a
name, you have invented a new word. If you write a library for a new
application area, then the methods in that library are a collection of
related words, a new technical jargon for that application domain. Look at
the Collection API: it adds new words (or new meanings for words) such as
"add", "remove", "contains", "Set", "List", and "LinkedHashSet". With that
API added to Java, you have a bigger vocabulary, a richer set of concepts
to work with.

Some concepts are more powerful, more general, more widely used than others
-- "liberty" and "mortgage" are more widely used than "belly button ring"
or "faucet wrench". But every new word, every new meaning, every new idiom
enriches the language.
////////////////////////


But talk is cheap. Maybe my next project should be a Python preprocessor :)

- Daniel
 
Rayiner Hashem

From that point of view, "car" and "cdr" are as good
as anything!
Well, if you're going to call the thing a 'cons' you might as well go
all the way and use 'car' and 'cdr' as operators. A little flavor is
nice, although I think that "4th" would be easier to read than
"cadddr"...
 
Pascal Costanza

Rayiner said:
Well, if you're going to call the thing a 'cons' you might as well go
all the way and use 'car' and 'cdr' as operators. A little flavor is
nice, although I think that "4th" would be easier to read than
"cadddr"...

...but cadddr might not be "fourth". It might be some leaf in a tree. Or
something completely different. "fourth" doesn't always make sense.

(And just for the sake of completeness, Common Lisp does have FOURTH and
also (NTH 3 ...).)
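
For instance, on a plain list all three spellings agree; it's only when the conses encode something other than a flat list that "fourth" stops being the right word:

(cadddr '(a b c d e))  ; => D
(fourth '(a b c d e))  ; => D
(nth 3 '(a b c d e))   ; => D  (NTH is zero-based)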

Pascal
 
John Roth

Greg Ewing (using news.cis.dfn.de) said:
I don't think code blocks per se are regarded as a bad thing.
The problem is that so far nobody has come up with an entirely
satisfactory way of fitting them into the Python syntax as
expressions.

I know. I played around with the idea a bit after it came up a couple
of weeks ago, and identified a number of issues.

1. One code block, or a code block for any parameter?
This isn't as simple as it seems. Ruby does one code block
that is an implicit parameter to any method call, but in
Smalltalk any method parameter can be a code block.

2. How do you invoke a code block? Does it look just like
a function? I presume so. If you do one code block per
method call, though, it gets a bit sticky. Again, Ruby
uses a special keyword ('yield') to invoke such a code
block, while if code blocks were simply anon functions,
then it's a non-issue.

3. Expression or statement syntax? Ruby avoids the
problem by making its single code block a special
construct that immediately follows the method
call parameter list, and it doesn't have the chasm
between expression and statement syntax that's built
into Python.

4. Do we want it to be smoothly substitutable for
lambda? I presume so, simply based on the principle
of minimum surprise. Then that forces multiple
code blocks in a method, which in turn reduces
a lot of other issues.

5. Is ugliness really an issue? One of the major
discussion points (read: flame war issues) any time
expanding expression syntax comes up is that
expressions that are too long become unreadable
very rapidly.

So what I come up with at this point is twofold:

1. We need to be able to insert a code block in
any parameter, and

2. Code blocks need to have statement syntax.

So let's say I want to use a code block instead of
a lambda or a named function in a map:

foobar = map(def (x, y, z):
        astatement
        anotherstatement
    list1, list2, list3)

This doesn't actually look anywhere near as bad
as I thought it might. The indentation, though, is a
bit peculiar. The first point is that the statements
in the code block are indented with respect to the
enclosing statement, NOT with respect to the first
word ('def') that starts the code block.

The second point is that the continuation of the
embedding expression has to dedent to close the
code block without closing the embedding statement,
and this has to be visually identifiable.

A third item is that I don't really care if we use 'def'
or not. Borrowing the vertical bar from Ruby, the map
example becomes:

foobar = map(| x, y, z |
        astatement
        anotherstatement
    list1, list2, list3)

I kind of like this better, except for one really unfortunate issue: it's going to wreak havoc with code coloring algorithms for a while.

John Roth
 
Marcin 'Qrczak' Kowalczyk

Most, but not all. From <http://okmij.org/ftp/papers/Macros-talk.pdf>

"One sometimes hears that higher-order functions (and
related non-strictness) make macros unnecessary. For
example, In Haskell, 'if' is a regular function.

It's not. It could easily be a regular function which would look like
'if condition branch1 branch2' and behave exactly the same (arguments
would often have to be parenthesized), but it's a keyword with the syntax
'if condition then branch1 else branch2' (condition and branches don't
have to be parenthesized because of 'then' and 'else' delimiters).
OTOH && and || are regular functions.

Raffael Cavallaro said:
So you're willing here to trade away some readability to keep compiled code size down. The pro-macro camp (myself included) finds that macros make source code easier to read and write than the equivalent HOF solution. We're willing to pay a little compiled code size for that ease of use, especially when this means you can write your code in what amounts to a domain-specific language.

Note that Lisp and Scheme have a rather unpleasant anonymous function syntax, which creates a stronger pull toward macros than in e.g. Ruby or Haskell.

In Haskell one often passes around monadic actions instead of anonymous nullary functions, so it's not only the lambda syntax. Putting such an action in a function argument doesn't make it run. Laziness also reduces the number of anonymous functions. Partial application doesn't require lambda; binary operators can be partially applied on either argument. The 'do' notation and list comprehensions are another case where other languages would use anonymous functions. Yes, they are built into the language rather than being library features - but with all these things only a few anonymous functions remain, and thus they are not so scary.

I happen to be in the other camp. Macros indeed make it easier to embed a
domain-specific language, OTOH they require the rest of the syntax to be
more regular than pretty (so they can examine code) and they make the
language and its implementations complicated. Just a tradeoff...
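
A toy illustration of what "examine code" means when programs are just lists (SWAP-ARGS is an invented example, nothing standard):

(defmacro swap-args (form)
  ;; The macro receives FORM as a plain list and can take it apart.
  (destructuring-bind (op a b) form
    `(,op ,b ,a)))

(swap-args (- 1 10))  ; expands to (- 10 1) => 9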
 
Stephen Horne

I used to think something like that would be more logical, too.
Until one day it occurred to me that building lists is only
one possible, albeit common, use for cons cells. A cons cell
is actually a completely general-purpose two-element data
structure, and as such its accessors should have names that
don't come with any preconceived semantic connotations.

From that point of view, "car" and "cdr" are as good
as anything!

"left" and "right" - referring to 'subtrees'?
 
Pascal Costanza

Stephen said:
"left" and "right" - referring to 'subtrees'?

Sure, why not?

(defun left (tree)
  (car tree))

(defun right (tree)
  (cdr tree))


;-)

Note: Why break anyone else's code just because you prefer a different
vocabulary?

(Yes, this is different from the Python mindset. What I have learnt from
this thread is that the languages might seem similar on the technical
level, but the "social" goals of the languages are vastly different.)


Pascal
 
Paolo Amoroso

[The original followup was to comp.lang.python. But since Alex mostly discusses Lisp features, and probably neither of us subscribes to the other's group, I follow up to both.]

Alex said:
As in, no lisper will ever admit that a currently existing feature is
considered a misfeature?-)

Paul Graham is possibly the best known such lisper. You may check the
documents about the Arc dialect at his site.


[Alex Martelli]
Of course not -- but it *cannot possibly* do what Gat's example of macros,
WITH-MAINTAINED-CONDITION, is _claimed_ to do... "reason" about the
condition it's meant to maintain (in his example a constraint on a variable
named temperature), about the code over which it is to be maintained
(three functions, or macros, that start, run, and stop the reactor),
presumably infer from that code a model of how a reactor _works_, and
rewrite the control code accordingly to ensure the condition _is_ in fact
being maintained. A callable passed as a parameter is _atomic_ -- you
call it zero or more times with arguments, and/or you store it somewhere
for later calling, *THAT'S IT*. This is _trivially simple_ to document and
reason about, compared to something that has the potential to dissect
and alter the code it's passed to generate a completely new one, most
particularly when there are also implicit models of the physical world being
inferred and reasoned about. Given that I've seen nobody say, for days!,

The word "reason" looks a bit too AI-sh: macros do much more mundane
things. If I correctly understand Erann Gat's example in the nuclear
reactor context, things would work like this.

Some domain primitives--e.g. for controlling temperature,
starting/stopping the reactor, etc.--would be written by, or with the
help of, nuclear reactor experts. These primitives, typically
implemented as ordinary functions/classes, would embody a model of how
the reactor works. At this point, there's nothing different from what
would be done with other languages.

Now suppose you have a code module in which you have to "maintain" a
certain condition in the reactor. By "maintain" I mean arrange a
possibly long sequence of calls to domain primitives in such a way
that the condition is maintained (e.g. call the function that starts
the reactor with appropriate arguments, call functions for getting
temperature sensor readings with other arguments, check the
temperature readings and take appropriate decisions based on the
values, etc.). I guess this is also what would be done with other
languages--and Lisp.

A WITH-MAINTAINED-CONDITION macro would just provide syntactic sugar
for that possibly long statement/expression sequence for "maintaining"
the condition. That's it. It would typically accept parameters
describing the condition, and would generate the right sequence of
domain primitives with appropriate parameters.

WITH-MAINTAINED-CONDITION wouldn't have its own nuclear reactor model,
or other physical model. It would merely generate code templates,
mostly calls to ordinary functions, that the programmer would write
anyway (or put into a higher level function). Documenting such a macro
would be as easy as documenting the individual functions and/or an
equivalent function with internal calls to domain primitives.
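
A deliberately naive sketch of the kind of expansion I have in mind -- CORRECT-FOR, READ-TEMPERATURE and friends are invented placeholders, not Gat's actual code:

(defmacro with-maintained-condition (condition &body body)
  ;; Weave a check of CONDITION around each step of BODY; CORRECT-FOR
  ;; stands in for whatever domain primitive restores the condition.
  `(progn
     ,@(loop for form in body
             collect `(progn
                        (unless ,condition
                          (correct-for ',condition))
                        ,form))))

;; (with-maintained-condition (< (read-temperature) 100)
;;   (start-reactor)
;;   (run-reactor)
;;   (stop-reactor))
;; expands into the same three calls with the temperature check
;; inserted before each one.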

Erann: is my understanding correct?

Alex: how would this way of using macros be dangerous?


Paolo
 
Michele Dondi

It's certainly true that mathematicians do not _write_
proofs in formal languages. But all the proofs that I'm
aware of _could_ be formalized quite easily. Are you
aware of any counterexamples to this? Things that
mathematicians accept as correct proofs which are
not clearly formalizable in, say, ZFC?

I am not claiming that it is a counterexample, but I've always met with some difficulty imagining how the usual proof of Euler's theorem about the number of corners, sides and faces of a polyhedron (correct terminology, BTW?) could be formalized. Also, however that could be done, I'm left with an unsatisfied feeling about how complex it would be compared to the conceptual simplicity of the proof itself.
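
(The theorem in question is presumably Euler's polyhedron formula, V - E + F = 2, relating the numbers of vertices, edges and faces of a convex polyhedron.)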


Just a thought,
Michele
--
Comments should say _why_ something is being done.
Oh? My comments always say what _really_ should have happened. :)
- Tore Aursand on comp.lang.perl.misc
 
Pascal Bourguignon

Pascal Costanza said:
Sure, why not?

(defun left (tree)
  (car tree))

(defun right (tree)
  (cdr tree))

;-)

Wrong:

(defun left (tree) (car (car tree)))
(defun right (tree) (cdr (car tree)))
(defun label (tree) (cdr tree))
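
That is, assuming a node is represented as ((left . right) . label) -- my guess at the intended layout:

(defparameter *node* (cons (cons 'l-subtree 'r-subtree) 'node-label))
(left *node*)   ; => L-SUBTREE
(right *node*)  ; => R-SUBTREE
(label *node*)  ; => NODE-LABEL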
 
Sean Ross

John Roth said:
foobar = map(| x, y, z |
        astatement
        anotherstatement
    list1, list2, list3)

I kind of like this better, except for one really unfortunate


Hi.
There was a short discussion along these lines back in June where I
mentioned the idea of using thunks, or something like that
http://groups.google.ca/groups?hl=en&lr=&ie=UTF-8&[email protected]

The idea goes something like this:

Let's take imap as an example:

def imap(function, *iterables):
    iterables = map(iter, iterables)
    while True:
        args = [i.next() for i in iterables]
        if function is None:
            yield tuple(args)
        else:
            yield function(*args)


Here, you would use imap as follows:

mapped = imap(lambda x: x*x, sequence)

My idea would be to define imap as follows:

def imap(&function, *iterables):
    iterables = map(iter, iterables)
    while True:
        args = [i.next() for i in iterables]
        if function is None:
            yield tuple(args)
        else:
            yield function(*args)

and the use could be more like this:

mapped = imap(sequence) with x:
    print x  # or whatever
    return x*x

with x: ... creates a thunk, or anonymous function, which will be fed as an
argument to the imap function in place of the &function parameter.

I also had some hazy notion of having a two-way feed, where a thunk associated with an iterator function could be fed arguments and executed on each iteration. Something like this:

y = 0
itertools.count(10) do with x:
    y += x
    print y


Anyway. I was thinking that the foobar example would look cleaner if the
block did not have to be included directly as an argument to the function
call, but could instead be associated with the function call, tacked onto
the end, like so

foobar = map(list1, list2, list3) with x, y, z:
    astatement
    anotherstatement

or maybe

foobar = map(list1, list2, list3) { (x, y, z):
    astatement
    anotherstatement
}

or, if we want to be explicit:

foobar = map(&thunk, list1, list2, list3) with x, y, z:
    astatement
    anotherstatement

so that we know where the thunk is being fed in as an argument. But this would probably limit the number of blocks you could pass to a function.

Anyway, as I've said, these are just some fuzzy little notions I've had in
passing. I'm not advocating their inclusion in the language or anything like
that. I just thought I'd mention them in case they're of some use, even if
they're just something to point to and say, "we definitely don't want that".

Sean
 
Jon S. Anthony

<typical unthinking stuff>


You're unrelenting tenacity to remain ignorant far exceeds any
inclination on my part to educate.

/Jon
 
Dave Brueck

<typical unthinking stuff>


You're unrelenting tenacity to remain ignorant far exceeds any
inclination on my part to educate.

/Jon

+1 QOTW, under the comedy section.

-Dave
 
Stephen Horne

Note: Why break anyone else's code just because you prefer a different
vocabulary?

I wasn't really suggesting a change to lisp - just asking if they
might be more appropriate names.

Actually, I have been having a nagging doubt about this.

I had a couple of phases when I learned some basic lisp, years ago. A
bit at college in the early nineties, and IIRC a bit when I was still
at school in the mid eighties. This was well before common lisp, I
believe.

Anyway, I'd swear 'car' and 'cdr' were considered backward
compatibility words, with the up-to-date words (of the time) being
'head' and 'tail'.

Maybe these are/were common site library conventions that never made
it into any standard?

This would make some sense. After all, 'head' and 'tail' actually
imply some things that are not always true. Those 'cons' thingies may
be trees rather than lists, and even if they are lists they could be
backwards (most of the items under the 'car' side with only one item
on the 'cdr' side) which is certainly not what I'd expect from 'head'
and 'tail'.
 
John Roth

Sean Ross said:
Anyway. I was thinking that the foobar example would look cleaner if the
block did not have to be included directly as an argument to the function
call, but could instead be associated with the function call, tacked onto
the end, like so

foobar = map(list1, list2, list3) with x, y, z:
    astatement
    anotherstatement

or maybe

foobar = map(list1, list2, list3) { (x, y, z):
    astatement
    anotherstatement
}

or, if we want to be explicit:

foobar = map(&thunk, list1, list2, list3) with x, y, z:
    astatement
    anotherstatement

so that we know where the thunk is being fed to as an argument. But, this
would probably limit the number of blocks you could pass to a function.

That's the basic problem with the Rubyesque syntaxes: they limit you to one block per function, and they make it an implicit parameter. I don't know that that's bad per se - people who like Ruby don't seem to feel it's a huge limitation. However, it simply doesn't slide into Python well.
That's why I used map() as my example: it's a function that almost has to
take another function to do anything useful, and that function is a specific
parameter.

John Roth
 
