What's better about Ruby than Python?

  • Thread starter Brandon J. Van Every

Hans Nowak

Doug said:
I just don't find that argument compelling. By that logic we should
write the most restrictive language possible on the most restrictive
platform possible (ie VB on Windows) because allowing choice is
clearly a bad thing.

Don't introduce a feature because it would be so cool that everyone
would use it? That's just plain weird.

The problem is that a macro system that is too powerful can be harmful.

Let's say you write a useful module. Python 3.6 just added a very powerful
macro system, and you use it to, say, write functions with lazy evaluation,
make strings mutable, and write your own flavor of the for-loop. Now I cannot
read your code anymore. A simple function call, or a loop, does not mean what
it used to mean.

One of Python's strengths is that you can create powerful abstractions with
functions and classes. But no matter what you do with these, they will always
be functions and classes that adhere to some common language rules. There is
no way to go "over the top" and change the language proper. Right now I can
read everybody's Python code (unless deliberately obfuscated); this would
change if there were macros that were so powerful that they would change
language constructs, or allow new ones. Maybe they would make your problems
easier to solve, but Python itself would lose in the long run.

My $0.02,
 

Dave Brueck

Ramon Leon Fournier said:
Well, not just for low-level 3D graphics. There are many other things
you would not code in Python unless you are a complete fool. Your
company's mail server, for instance.

Actually, that's not a very good example - Python is *very* well
suited for many types of servers, mail servers included. The I/O-heavy
nature of many servers lessens the significance of Python being slow
in terms of raw CPU speed. Lots of I/O can also make the effects of
the GIL less of a factor on multi-CPU boxes than they otherwise would
be. Finally, given that you don't see too many buffer
overruns and other similar security holes in Python, I'd sleep
*better* at night implementing my server in Python than in C++.
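
(To make the I/O-bound point concrete, here is a minimal sketch -- not
from the original post -- of a threaded TCP echo server. CPython
releases the GIL whenever a thread blocks in socket I/O, so threads
overlap on exactly this kind of workload. The port and names are
arbitrary.)

import socketserver

class EchoHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # recv/send block inside C code; CPython drops the GIL while
        # a thread waits on I/O, letting other threads make progress
        for line in self.rfile:
            self.wfile.write(line)

# one thread per connection -- fine for I/O-bound workloads
with socketserver.ThreadingTCPServer(("localhost", 9025), EchoHandler) as srv:
    srv.serve_forever()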

But I do agree with the notion that Python isn't good for *all*
problems, as does everyone else it seems. ;-)

-Dave
 

JanC

(e-mail address removed) (John J. Lee) wrote:
Mind you, I am the type who, when faced with a new language, tends to
read everything slowly, chew the cud, *then* start writing.

Sounds familiar, you're not the only "sick puppy"... ;-)
 

JanC

Brandon J. Van Every said:
You and I have different social theories. My social theory is, people
are very stubborn. Nobody will engage in Right behavior the minute
you tell them to. But if I killfile people, and tell them why (i.e.
"Because you are a Troll Hunter, and such people are useless."), then
someday they may wake up and figure it out. It may be 6 months from
now, it may be 2 years from now. The point is to have a cumulative
effect on people's newsgroup behavior.

You know the proverb: "change the world, start with yourself"?
 

Olivier Drolet

Alex Martelli said:
Doug Tolton wrote:
...
(...)

... a zillion mediocre ones.


Alex


Macros, as found in Common Lisp, do not change the underlying language
at all! Common Lisp macros, when run, always expand into 100% ANSI
Common Lisp code! Using macros to become more productive is no
different from using function abstractions or class hierarchies to
become more productive. They all require that you, the programmer,
become familiar with them. Macros don't cause Common Lisp to fork any
more than function or class abstractions do. They only alter the
readability of "program code" (usually for the better), just like
function or class abstractions do.

Saying that all hell will break loose in the Python community seems
rather unfounded and a bit knee-jerk. None of what you claim would
eventually happen within Python circles is currently happening within
the Common Lisp community. After years of macro use, ANSI Common Lisp
is still the same. Macros don't bypass ANSI committees any more than
they would the Guidos of this world. On the contrary, they preclude
the need to bypass them in the first place, and all parties end up
getting what they need: on the one hand, a static base language, and
on the other, much greater expressiveness.

Speaking of expressiveness, someone asked on comp.lang.lisp.fr what
macros were good for, concretely, and what quantitative difference
they made in commercial applications (cf. "Macros: est-ce utile ?
(attn Marc)"). The responses (in French) were quite enlightening. It
boils down to using multiple macros, in multiple instances, thus
allowing you to reduce total code size (of otherwise pure CL code) by VERY
significant margins. You can think of it as reuse (as per OOP) or as
code compression.

Macros do not have to be used all the time or at all. There are times
when a macro should not be used, e.g. when a function would do just
fine. But they are very powerful. As Paul Graham put it, macros allow
you to program up towards the problem at hand, as opposed to adapting
the problem to fit the language specification. They allow greater
expressiveness, when you need it. They allow you to "use" many lines
of code you no longer have to write. And the lines of code you don't
have to write are also the lines of code you don't have to debug (as
it were).

Cheers.
 

Alex Martelli

Kenny Tilton wrote:
...
thread: Python is not trying to be everything. Fair enough. Let Python
be Python, let Lisp be Lisp.

ie, If someone wants macros, they probably would also like special
variables and closures and lexical scope and multi-methods and they may
as well get it over with and learn Lisp and stop trying to make Python
more than it wants to be.

Hear, hear! Or if you just can't stand the nested-parentheses idea,
then that's what Dylan was designed for: much the same semantics
and power as Lisp, including all of the features you mention, but with
infix syntax.


Alex
 

Alex Martelli

Andrew Dalke wrote:
...
I confess to being fond of the

atom = Atom()

idiom. I know it breaks down, eg, for a function which returns
a newly created class, but it's too ingrained in me.

Back when I gave Eiffel a serious try, I easily slid into [the
equivalent of]:
itsAtom = Atom()
[for an instance member variable -- anAtom for a local,
theirAtom for a class-variable, etc -- Robert Martin's idea
to distinguish those lexically in languages which confuse
the scopes]. In other words, naming a basically-anonymous
"generic instance of class Atom" hardly requires a case-sensitive
language, IMHO.


Alex
 

Alex Martelli

Andrew Dalke wrote:
...
be that way. It's the inability to get into another's shoes -
to understand views different from one's own - which annoys
me the most. A deficiency sadly typical of all too many
enthusiastic new language designers.

Just as prevalent is the wish to please EVERYone -- that's
how one gets, say, PL/I, or perl... by NOT deliberately refusing
to "get into other's shoes" and rejecting their "different views"
for purposes of inclusion into the new language. Even GvR
historically did some of that, leading to what are now his mild
regrets (lambda, map, filter, ...).


Alex
 

Alex Martelli

Andrew Dalke wrote:
...
(Need to use the function instead of a class with __call__
so that the method gets bound correctly. And I believe

You could use a class if its instances were descriptors, i.e.
supported __get__ appropriately -- that's basically what
functions do nowadays. See Raymond Hettinger's HOWTO
on descriptors -- a *wonderful* treatment of the subject.
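
(An illustrative sketch, not part of the original post: a minimal class
whose instances support __get__ and therefore bind on attribute access
the way plain functions do. All names here are invented.)

class Trace:
    # wraps a function; instances are descriptors, so Trace(f) stored
    # in a class behaves like a (traced) method
    def __init__(self, func):
        self.func = func
    def __get__(self, obj, cls):
        # attribute access on an instance lands here; return the
        # "bound" callable
        def bound(*args, **kwargs):
            print("calling", self.func.__name__)
            return self.func(obj, *args, **kwargs)
        return bound

class A:
    def greet(self, name):
        return "hello " + name
    greet = Trace(greet)

print(A().greet("world"))   # prints the trace line, then: hello world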


Alex
 

Alex Martelli

Dave said:
I agree with your position, more or less, but I am curious - how far do
you think this deprecation should go? Just changing a function in
__builtins__? Changing a function in any module? Or changing *anything*
about a module from the outside?

Good question! I don't have any preset answers yet. We definitely do
want to impose those restrictions that have "multiple" benefits, i.e.
that both let speed increase (by allowing the compiler to 'inline' the
calls to built-in functions, once it does know they're built-ins) AND
encourage correctness -- but we don't want to cripple ordinary useful
usage, particularly not when the benefits are uncertain. Where the
line should consequently be drawn is not 100% obvious -- which is in
part why nothing about this ended up in Python 2.3, but rather it's all
scheduled for consideration in 2.4.
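
(A tiny illustration, added for clarity, of why the compiler cannot
currently inline built-ins: the name is resolved at call time through
locals, then globals, then builtins, and any code may rebind it. The
rebinding below is deliberately pathological; builtins is the modern
module name.)

import builtins

def total(xs):
    return len(xs)          # 'len' is looked up at each call, so the
                            # compiler cannot safely inline the builtin

print(total([1, 2, 3]))     # 3

builtins.len = lambda xs: 42    # legal today; the restrictions under
                                # discussion would forbid exactly this
print(total([1, 2, 3]))         # 42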


Alex
 

Andrew Dalke

Alex:
Even GvR
historically did some of that, leading to what are now his mild
regrets (lambda, map, filter, ...).

and != vs. <>

Can we have a deprecation warning for that? I've never
seen it in any code I've reviewed.

Andrew
(e-mail address removed)
 

Andrew Dalke

Alex:
You could use a class if its instances were descriptors, i.e.
supported __get__ appropriately ... See Raymond Hettinger's
HOWTO on descriptors -- a *wonderful* treatment of the subject.

I'm slowly easing my way into this new stuff. I read it through
once, but it didn't all sink in. Could you post an example of how
to use it for my cache example?

Andrew
(e-mail address removed)
 

Alex Martelli

Ramon said:
Well, not just for low-level 3D graphics. There are many other things
you would not code in Python unless you are a complete fool. Your
company's mail server, for instance. But you wouldn't do that in C#

I deeply disagree: Twisted, coded in Python, lets you do a GREAT job
coding such applications as mail servers -- highly scalable, etc etc.
And even without Twisted, Python is splendid for such I/O bound jobs.

Granted that my motivation to use Python is different from that of many
others here. I have work to do, and to finish it as quickly as
possible. I figured that Python is a great language for people like me,

I think that's the TYPICAL motivation for using Python: it lets you do
your job with great productivity.

BTW: What exactly has "Smell the Windows" to do with the current Ruby
vs. Python debate? Are you actually serious about learning any of

I dunno: back when I checked carefully, Python was MUCH better integrated
with Windows, thanks to win32all etc etc, while Ruby perched precariously
on cygwin &c. A Windows-centric view would thus appear to favour Python
over Ruby (unless Ruby's Windows implementation has made great strides).


Alex
 

Borcis

Andrew said:
Olivier Drolet:



I've created a new language, called Speech. It's based on the core
primitives found in the International Phonetic Alphabet. I've made some
demos using Speech. One is English and another is Xhosa. This just
goes to show how powerful Speech is because it can handle so many
domains. And it's extensible! Anything you say can be expressed in
Speech!

I believe you unwittingly locate an issue. Machine translation of human
languages has been an inescapable project for computer scientists, a challenge
that has consistently proven harder to achieve than expected. Idiomatic
machine translation of *programming* languages, in comparison, looks like a
toy problem, an appetizer. But all the endless debates in the p.l. newsgroups
certainly show one thing: we don't expect idiomatic translation between
computer languages to solve our problems. While it clearly could.

I believe our reasons for not doing it boil down to: (a) the issue of
*conservation* of programming-language *biodiversity* not having gained
attention as the key issue it is; (b) lack of imagination by programmers too
engrossed with pet language advocacy.

What I mean is that the metaphor you use puts the joke on you (or us). You
should really distinguish between the case of translating between *existing*
"sibling" languages (be they human languages or programming languages) and
otoh the case of translating between a newly-bred variant of a language and a
parent language.

Isn't it the case that most objections to macros fail to persist if we
set the purpose of our "macro" facility to that of *grafting one
language's surface syntax and basic control structures onto another
language's objects and library*? This, to illustrate and set a positive
criterion, well enough that (say) we would be able to manage, with
(basically) the first language's syntax mode under emacs, what will
really run as code in the second language?
 

Matthias

Chris said:
I'm curious. Why do you feel such a need for macros? With metaclasses,
etc., etc., what significant advantage would macros buy you? Do you have
any examples where you think you could make a significantly crisper and
easier to read and understand program with macros than without?

The first thing I would do with macros in Python is build an inline facility
which allows certain functions to be expanded wherever they are called.
This would hopefully increase speed at the expense of space efficiency.
Historically, macros have been used for building Prolog-like languages,
constraint-programming languages, lazy languages, very flexible
object-oriented languages, etc. on top of Lisp.

These days, design patterns can be coded into the language via macros.
This lets you program at a higher level of abstraction than with
classes. You can express your patterns more clearly and much more
briefly than without them.

Indeed, the C++ community, which does not have Lisp-style macros, uses (some
would say: misuses) their template facility to do metaprogramming. They
implement design patterns, inline numerical code, etc. The resulting template
code is ugly, the compiler messages unreadable, the programming painful. But
developers feel it is worth doing because the abstractions they build are
powerful and run fast.

Having said that, I totally agree with earlier posters who said that macros
in the hands of bad programmers are catastrophic. Where I don't agree is
that the language should therefore be more limited than necessary.

Matthias
 

Andrew Dalke

Alex Martelli:
...     def __get__(self, obj, cls):
...         self.obj = obj
...         return self.cached_call

That's the part where I still lack understanding.

class Spam:
    def f(self):
        pass
    f = CachedCall(f)

obj = Spam()
obj.f()

Under old-style Python
obj.f is the same as getattr(obj, "f")
which fails to find 'f' in the instance __dict__
so looks for 'f' in the class, and finds it
This is not a Python function, so it does not
get bound to self. It's simply returned.

obj.f() takes that object and calls it. In my original
code (not shown) I tried implementing a __call__
which did get called, but without the instance self.


Under new-style Python
obj.f is the same as getattr(obj, "f")
which fails to find 'f' in the instance __dict__ so
looks for 'f' in the class, and finds the CachedCall.

Python checks if the object implements __get__,
in which case it's called a descriptor. If so, it's
called with the 'obj' as the first parameter. The
return value of this call is used as the value for
the attribute.

Is that right?
should closely mimic your semantics, including ignoring
what I call obj and you call self in determining whether
a certain set of arguments is cached.

Why should obj make a difference? There's only
one CachedCall per method per .... Ahh, because it's
in the class def, not the instance. Adding support for
that using a weak dict is easy.
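
(A sketch of that weak-dict variant -- not Andrew's actual code: one
cache per instance, keyed on the instance itself, so entries disappear
when the instance is garbage-collected. It assumes instances are
weak-referenceable and the positional args are hashable.)

import weakref

class CachedCall:
    def __init__(self, func):
        self.func = func
        # maps each instance to its own {args: result} cache
        self.caches = weakref.WeakKeyDictionary()
    def __get__(self, obj, cls):
        def cached(*args):
            cache = self.caches.setdefault(obj, {})
            if args not in cache:
                cache[args] = self.func(obj, *args)
            return cache[args]
        return cached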

Yeah, and my approach won't work with kwargs nor any
other unhashable element. Since I didn't know what the
Lisp code did nor how Lisp handles unhashable elements,
I decided just to implement the essential idea.

Andrew
(e-mail address removed)
 

Aahz

Alex:

and != vs. <>

Can we have a deprecation warning for that? I've never
seen it in any code I've reviewed.

We will, probably 2.4 or 2.5. (Whenever 3.0 starts getting off the
ground.)
 

Alex Martelli

Andrew said:
Alex Martelli:

That's the part where I still lack understanding.

class Spam:
    def f(self):
        pass
    f = CachedCall(f)

That's an oldstyle class -- use a newstyle one for smoothest
and most reliable behavior of descriptors
obj = Spam()
obj.f()

Under old-style Python
obj.f is the same as getattr(obj, "f")

This equivalence holds today as well -- the getattr
builtin has identical semantics to direct member access.
which fails to find 'f' in the instance __dict__
so looks for 'f' in the class, and finds it
This is not a Python function, so it does not
get bound to self. It's simply returned.

obj.f() takes that object and calls it. In my original
code (not shown) I tried implementing a __call__
which did get called, but without the instance self.
Sure.

Under new-style Python
obj.f is the same as getattr(obj, "f")
Yes.

which fails to find 'f' in the instance __dict__ so
looks for 'f' in the class, and finds the CachedCall.
Sure.

Python checks if the object implements __get__,
in which case it's called a descriptor. If so, it's
Exactly.

called with the 'obj' as the first parameter. The
return value of this call is used as the value for
the attribute.

Is that right?

Yes! So what is it that you say you don't get?

Why should obj make a difference? There's only
one CachedCall per method per .... Ahh, because it's
in the class def, not the instance. Adding support for
that using a weak dict is easy.

If obj is such that it can be used as a key into a dict
(weak or otherwise), sure. Many class instances of some
interest can't -- and if they can you may not like the
result. Consider e.g.

class Justanyclass:
    def __init__(self, x): self.x = x
    def compute(self, y): return self.x + y

pretty dangerous to cache THIS compute method -- because,
as a good instance method should, it depends crucially
on the STATE of the specific instance you call it on.

Yeah, and my approach won't work with kwargs nor any
other unhashable element. Since I didn't know what the
Lisp code did nor how Lisp handles unhashable elements,
I decided just to implement the essential idea.

An automatically cachable method on general objects is
quite tricky. I don't think the Lisp code did anything
to deal with that trickiness, though, so you're right
that your code is equivalent. Anyway, I just wanted to
show how the descriptor concept lets you use a class,
rather than a function, when you want to -- indeed any
function now has a __get__ method, replacing (while
keeping the semantics of) the old black magic.
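
(For completeness, a runnable reconstruction of the descriptor under
discussion. Only the __get__ fragment quoted earlier is Alex's; the
rest is one plausible completion and, as stated above, the cache key
deliberately ignores obj.)

class CachedCall:
    def __init__(self, func):
        self.func = func
        self.cache = {}
    def __get__(self, obj, cls):        # the fragment quoted earlier
        self.obj = obj
        return self.cached_call
    def cached_call(self, *args):
        # the key ignores self.obj, so the cache is shared across all
        # instances -- exactly the hazard Justanyclass illustrates
        if args not in self.cache:
            self.cache[args] = self.func(self.obj, *args)
        return self.cache[args]

class Spam:
    def f(self):
        return 42
    f = CachedCall(f)

obj = Spam()
print(obj.f())   # 42 -- computed once, cached thereafter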


Alex
 

Anton Vredegoor

[defines cognitive macro called Speech]
I've created a new language, called Speech. It's based on the core
primitives found in the International Phonetic Alphabet. I've made some
demos using Speech. One is English and another is Xhosa. This just
goes to show how powerful Speech is because it can handle so many
domains. And it's extensible! Anything you say can be expressed in
Speech!

[snip lots of examples using it, trying to make macros look bad?]
In short, no one is denying that the ability to create new macros is
a powerful tool, just like no one denies that creating new words is
a powerful tool. But both require extra training and thought for
proper use, and while they are easy to write, it takes more effort
for others to understand you. If I stick to Python/English then
more people can understand me than if I mixed in a bit of Erlang/
Danish, *even* *if* the latter makes a more precise description
of the solution.

You have found a wonderful analogy; however, you seem to assume that
your prejudices are so self-explanatory that the conclusion that
macros are bad is natural.

I am not a native English speaker, and so my expressiveness in this
language is severely handicapped, while I consider myself a person
with good linguistic abilities.

Obviously there are people from other linguistic backgrounds
participating in discussions in this newsgroup who have the same
problems. Maybe they are better speakers than I am and yet still have
more problems using English.

However this does not in the least cause this newsgroup to deviate
substantially (or maybe it does but as a non-native speaker I can not
discern the difference) from English. Rather we all strive to speak
the same language in order to make as many people as possible
understand what we are saying.

While using Python as a programming language we strive for Pythonicity
and for combining elegance with conciseness and readability. We are
using docstrings to comment our code and answer questions about it in
the newsgroup. Helpful people debug our code and assist in formulating
our algorithms in Python.

IMO there is a strong tendency towards unification and standardization
among the readers of this newsgroup and the need to conform and the
rewards this brings are well understood.

Seeing all this it would be a missed chance not to give the community
the freedom of redefining the language to its advantage.

Of course there are risks that the community would dissolve in
mutually incompatible factions and it would be wise to slow down the
process according to the amount of responsibility the group can be
trusted with.

The rewards would be incomparably great, however, even to the point
that I would be ready to sacrifice Python only to give this thing a
tiny chance. Suppose you could make a bet for a dollar with an
expected reward of a thousand dollars? Statistically it doesn't matter
whether you get a .999 chance of getting a thousand dollars or a
.000999 chance of getting a million dollars.

Therefore, the only thing pertinent to this question seems to be the
risk and gain assessments.
By this analogy, Guido is someone who can come up with words
that a lot of people find useful, while I am someone who can come
up with words appropriate to my specialization, while most
people come up with words which are never used by anyone
other than close friends. Like, totally tubular dude.

Another relevant meme that is running around in this newsgroup is the
assumption that some people are naturally smarter than other people.
While I can certainly see the advantage for certain people of keeping
this illusion going (it's a great way to make money, the market
doesn't pay for what it gets but for what it thinks it gets) there is
not a lot of credibility in this argument.

The "hardware" that peoples minds are running on doesn't come in
enough varieties to warrant such assumptions. For sure, computer
equipment can vary a lot, but we as people all have more or less the
same brain.

Of course there is a lot of variation between people in the way they
are educated and some of them have come to be experts at certain
fields. However no one is an island and one person's thinking process
is interconnected with a lot of other persons' thinking processes. The
idea that some kind of "genius" is solely responsible for all this
progress is absurd and a shameful deviation into the kind of
"leadership" philosophical atrocities that have caused many wars.

To come back to linguistic issues, there's a lot of variation in the
way people use their brain to solve linguistic problems. There are
those that first read all the prescriptions before uttering a word and
there are those that first leap and then look. It's fascinating to see
"look before you leap" being deprecated in favor of "easier to ask
forgiveness than permission" by the same people who would think twice
before starting to program without being sure they know all the syntax.

In my case, for example, studying old Latin during high school there was
a guy sitting next to me who always knew the different conjugations
the Latin words were in, and as a result he managed to get high grades
with exact but uninteresting translations. My way of translating Latin
was a bit different, instead of translating word for word and looking
up each form of each separate word (is it a genitivus, ablativus
absolutus, imperativus, etcetera) I just read each word and, going from
the approximate meaning of all the words put in a sequence of sentences,
I ended up with a translation that was seventy percent correct and
that had a lot of internal consistency and elegance. It was usually
enough to get a high enough grade and also some praise: "se non è
vero, è ben trovato" ("if it's not true, it's well invented"), or
something like that.

What this all should lead to I am not really sure, but I *am* sure
that breaking out of formal mathematical and linguistic and
programmatic rules is the only way to come to designs that have great
internal consistency and that can accommodate new data and
procedures.

It is sometimes impossible for a language designer to exactly pinpoint
the reasons for a certain decision, while at the same time being sure
that it is the right one.

The ability to maintain internal consistency and the tendency of other
people to fill in the gaps so that the final product seems coherent is
IMO the main reason for this strange time-travel-like ability of
making the right decisions even before all the facts are available.

Well, maybe I have made the same mistake as you by providing arguments
to the contrary of my intention of advocating the emancipation of the
average Python user to the level of language designer.

However if I have done so, rest assured that my intuition "knows" from
before knowing all the facts that this is the way to go, and the
rewards are infinitely more appealing than the risks of breaking up
the Python community are threatening.

One way or the other this is the way programming will be in the
future, and the only question is: Will Python -and the Python
community- be up to the task of freeing the programmers expressiveness
and at the same time provide a home and starting point to come back
to, or will it be left behind, as has been the fate of so many other
valiant efforts?

Anton
 

Jacek Generowicz

Andrew Dalke said:
As a consultant, I don't have the luxury of staying inside a singular
code base. By your logic, I would need to learn each different
high level abstraction done at my clients' sites.

The alternative is to understand (and subsequently recognize) the
chunks of source code implementing a given pattern for which no
abstraction was provided (often implemented slightly differently in
different parts of the code, sometimes with bugs), each time that it
occurs.

I'd rather use multimethods that implement the visitor pattern.

I'd rather look at multimethods, than at code infested with
implementations of the visitor pattern.

(The above comments are _not_ about the visitor pattern per se.)
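
(For concreteness, a sketch of the flavour of dispatch being referred
to, using functools.singledispatch -- which entered the standard
library long after this thread, and is single dispatch rather than
true multimethods -- showing how visitor boilerplate collapses into
per-type functions. The toy node classes are invented.)

from functools import singledispatch

class Num:                  # toy AST nodes
    def __init__(self, value): self.value = value

class Add:
    def __init__(self, left, right): self.left, self.right = left, right

@singledispatch
def evaluate(node):         # replaces Visitor.visit_* methods
    raise TypeError(type(node))

@evaluate.register
def _(node: Num):
    return node.value

@evaluate.register
def _(node: Add):
    return evaluate(node.left) + evaluate(node.right)

print(evaluate(Add(Num(2), Num(3))))   # 5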
The inference is that programming language abstractions should not
be more attractive than sex.

Why ever not? Don't you want to put the joy back into programming :)
Functions and modules and objects, based on experience, promote
code sharing. Macros, with their implicit encouragement of domain
specific dialect creation, do not.

I don't believe you can reasonably draw a rigid and well-defined
boundary between functions, modules and objects on one side, and
macros on the other. They all offer means of abstraction. All are open
to abuse. All can be put to good use.

In all four cases, I'd rather have the opportunity to create
abstractions, rather than not.

I find your suggestion that macros are in some way more "domain
specific" than modules, or objects or functions, bogus.
A language which allows very smart people the flexibility to
customize the language, means there will be many different flavors,
which don't all taste well together.
A few years ago I tested out a Lisp library. It didn't work
on the Lisp system I had handy, because the package system
was different. There was a comment in the code which said
"change this if you are using XYZ Lisp", which I did, but that
that's a barrier to use if I ever saw one.

You are confusing the issues of

- extensibility,
- standard non-conformance,
- not starting from a common base,
- languages defined by their (single) implementation.

A few days ago I tested out a C++ library. It didn't work on the C++
system I had handy because the STL implementation was
different/template support was different. etc. etc.

A few days ago I tested out a Python library. It didn't work on the
implementation I had handy because it was Jython.
4) a small change in a language to better fit my needs has
subtle and far-reaching consequences down the line. Instead,
when I do need a language variation, I write a new one
designed for that domain, and not tweak Python.

So, what you are saying is that faced with the alternatives of

a) Tweaking an existing, feature rich, mature, proven language, to
move it "closer" to your domain.

b) Implementing a new language from scratch, for use in a single
domain

you would choose the latter?

If so, you are choosing the path which pretty much guarantees that
your software will take much longer to write, and that it will be a
lot buggier.

It's an extreme form of Greenspunning.

How do you reconcile
when I do need a language variation, I write a new one designed for
that domain, and not tweak Python.
with

Functions and modules and objects, based on experience, promote
code sharing. Macros, with their implicit encouragement of domain
specific dialect creation, do not.

?

You criticize macros for not encouraging code sharing (they do, by
encouraging you to share the (vast) underlying language while reaching
out towards a specific domain), while your preferred solution seems to
be the ultimate code non-sharing, by throwing away the underlying
language, and re-doing it.
 
