The type/object distinction and possible synthesis of OOP and imperative programming languages

Ian Kelly

Well that bumps our count to five then:
Six.
....
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: type 'ellipsis' is not an acceptable base type
 
Terry Jan Reedy

Well that bumps our count to five then:

--> NoneType = type(None)
--> NoneType
<class 'NoneType'>
--> class MoreNone(NoneType):
...     pass
...
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: type 'NoneType' is not an acceptable base type

'NoneType' is not a builtin name in builtins, which is precisely why you
accessed it the way you did ;-). From issue 17279 (for 3.3):

"Attached subclassable.py produces these lists:
Among named builtin classes, these cannot be subclassed:
bool, memoryview, range, slice,
Among types classes, these can be subclassed:
ModuleType, SimpleNamespace,"
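
The attached subclassable.py isn't reproduced here, but a hypothetical re-creation of that kind of probe (function and class names are mine) might look like:

```python
def subclassable(cls):
    """Return True if cls accepts being used as a base class in CPython."""
    try:
        # Creating a throwaway subclass is the simplest possible probe.
        type("Sub", (cls,), {})
    except TypeError:
        return False
    return True

# Print which of these named builtins reject subclassing.
for cls in (int, str, dict, bool, range, slice, memoryview):
    print(cls.__name__, subclassable(cls))
```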
 
Mark Janssen

Actually, I'm not sure how you'd go about inheriting from a function.
Why not just create a bare class, then assign its __call__ to be the
function you're inheriting from?
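
That suggestion can be sketched in a few lines (class and function names here are illustrative; note that __call__ has to live on the class, since Python looks special methods up on the type, not the instance):

```python
def greet(name):
    return "Hello, " + name

class FunctionProxy:
    """A bare class whose instances behave like the wrapped function."""
    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        # Delegate calls to the wrapped function.
        return self.func(*args, **kwargs)

g = FunctionProxy(greet)
print(g("world"))          # Hello, world

# Unlike the builtin function type, this wrapper *is* subclassable:
class LoudProxy(FunctionProxy):
    def __call__(self, *args, **kwargs):
        return super().__call__(*args, **kwargs).upper()
```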

I think his point remains valid, from a theoretical POV. Python
prides itself on the idea of "first-class functions" and such, but
unlike the world of lambda calculus, this selling point is a bit
invalid, because Python (like any C-based language) is rooted
squarely in the Turing machine and its real-world implementation.

(Note this contrasts starkly with Java(script), which doesn't seem
to be based on anything -- can anyone clarify where Java actually
comes from?)
 
Ian Kelly

I think his point remains valid, from a theoretical POV. Python
prides itself on the idea of "first-class functions" and such, but
unlike the world of lambda calculus, this selling point is a bit
invalid, because Python (like any C-based language) is rooted
squarely in the Turing machine and its real-world implementation.

I'm having a hard time following what you're trying to say here.
Lambda calculus and Turing machines are theoretical models of
computation, not languages. You can model Lisp programs with a Turing
machine, and you can model C programs with lambda expressions.
Practically speaking you would probably have an easier time doing it
the other way around, due to the procedural nature of the Turing
machine versus the functional nature of the lambda calculus.

By the usual definition of "first-class function" [1], Python
functions are first-class; this has nothing to do with functional vs.
procedural programming (although it is more commonly found in the
former) or to do with Turing machines (which don't even include
functions as a concept).
(Note this contrasts starkly with Java(script), which doesn't seem
to be based on anything -- can anyone clarify where Java actually
comes from?)

I don't understand why you would consider Python to be "C-based" or
"Turing machine-based" but not Java or Javascript.

[1] http://en.wikipedia.org/wiki/First-class_citizen
 
Steven D'Aprano

I think his point remains valid, from a theoretical pov. Python prides
itself on the idea of "first-class functions" and such, but unlike the
world of lambda calculus, this selling point is a bit invalid.


Python functions are first-class functions, which is short-hand for
saying "functions which are also capable of being treated as values,
which means they can be created at run-time, returned from functions,
passed around as arguments, and assigned to variables".

Python's function type is not a first-class object-type, because it
cannot be subclassed in at least three of the main implementations. But
this has nothing to do with whether or not functions are first-class
functions, which is an unrelated meaning. One can conceive of a language
where FunctionType is a first-class type capable of being subclassed, but
functions are *not* first-class values capable of being passed around as
arguments.
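
Both halves of that distinction can be demonstrated at the CPython prompt; a minimal sketch:

```python
import types

def make_adder(n):
    def add(x):                       # a function created at run time...
        return x + n
    return add                        # ...and returned as a value

add2 = make_adder(2)                  # assigned to a variable
results = list(map(add2, [1, 2, 3]))  # passed around as an argument
print(results)                        # [3, 4, 5]

# The function *type*, however, rejects subclassing in CPython:
try:
    class MyFunction(types.FunctionType):
        pass
except TypeError as e:
    print(e)   # e.g. "type 'function' is not an acceptable base type"
```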


Because for Python (and any C-based language),

Python-the-language is not C-based, or at least, very little of Python is
C-based. Its main influences are, according to GvR, Lisp and ABC, with
Pascal, Haskell and of course C also having some influence. Syntax-wise,
it is much more of an Algol-inspired language than a C-inspired language.


it is rooted squarely in the
Turing machine and its real-world implementation.

Python is certainly not inspired by Turing machines. Since a Turing
machine is a simple device with an infinitely long paper tape which can
have marks added and deleted from it, very few real programming languages
are based on Turing machines.

It is, however, Turing-complete. Just like every programming language
worthy of the name, whether it has syntax like Python, C, Lisp, Forth,
INTERCAL, Ook!, AppleScript, Inform-7, Java, PHP, or x86 assembler.

(Note this contrasts starkly with Java(script), which doesn't seem
to be based on anything -- can anyone clarify where Java actually comes
from?)

C.
 
Antoon Pardon

Op 16-04-13 18:49, Terry Jan Reedy schreef:
As a practical matter, the change is non-trivial. Someone has to be
motivated to write the patch to enable subclassing, write tests, and
consider the effect on internal C uses of slice and stdlib Python uses
of slice (type() versus isinstance).

I see. It seems I have underestimated the work involved.

Did the idea actually require that instances *be* a slice rather than
*wrap* a slice?

As far as I remember, I wanted my slice object to be usable for slicing
lists. But Python doesn't allow duck typing when you use your object to
"index" a list. No matter how much your object resembles a slice, when
you actually try to use it to get a slice of a list, Python throws a
TypeError with the message "object cannot be interpreted as an index".
This, in combination with slice not being subclassable, effectively
killed the idea.
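
A sketch of the rejection described here (the class name is illustrative, and the exact error message varies between Python versions):

```python
class SliceLike:
    """Resembles a slice, but list indexing will not accept it."""
    start, stop, step = 1, 4, None

    def indices(self, length):
        # Mimic slice.indices(), for all the good it does us here.
        return slice(self.start, self.stop, self.step).indices(length)

data = [0, 1, 2, 3, 4]
print(data[slice(1, 4)])   # [1, 2, 3] -- a real slice works
try:
    data[SliceLike()]      # the look-alike is rejected, whatever its methods
except TypeError as e:
    print(e)
```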

As I already said, I don't know if the idea would have turned up
something useful. In the years since, I never felt it would have been
great to have been able to pursue the idea. I just thought it was a
pity I was so thoroughly stopped at the time.
 
Michael Torrie

(Note this contrasts starkly with Java(script), which doesn't seem
to be based on anything -- can anyone clarify where Java actually
comes from?)

Java is not equal in any way to JavaScript. The only thing they share
are semicolons and braces. Naming ECMAScript JavaScript was a very
unfortunate thing indeed.

For the record, JavaScript is what they call a "prototype-based
language." http://en.wikipedia.org/wiki/Prototype-based_programming.
You can emulate an OOP system with a prototype-based language.
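
Python is class-based rather than prototype-based, but the delegation idea behind prototypes is easy to sketch in a few lines (a toy illustration, not how JavaScript actually works internally):

```python
class Proto:
    """A tiny prototype-style object: attribute lookups that fail on the
    object itself are delegated to a parent prototype, not to a class."""

    def __init__(self, proto=None, **slots):
        self.__dict__.update(slots)
        self.__dict__["proto"] = proto

    def __getattr__(self, name):
        # Only called when normal lookup fails; walk up the prototype chain.
        if self.proto is not None:
            return getattr(self.proto, name)
        raise AttributeError(name)

animal = Proto(legs=4)
dog = Proto(proto=animal, sound="woof")
print(dog.sound, dog.legs)   # woof 4 -- 'legs' is found on the prototype
```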

I highly recommend you read a book on formal programming language theory
and concepts.
 
Neil Cerutti

Java is not equal in any way to JavaScript. The only thing
they share are semicolons and braces. Naming ECMAScript
JavaScript was a very unfortunate thing indeed.

For the record, JavaScript is what they call a "prototype-based
language." http://en.wikipedia.org/wiki/Prototype-based_programming.
You can emulate an OOP system with a prototype-based language.

I highly recommend you read a book on formal programming
language theory and concepts.

Let me recommend Concepts, Techniques and Models of Computer
Programming, Van Roy and Haridi.

http://www.info.ucl.ac.be/~pvr/book.html
 
Steven D'Aprano

For the record, JavaScript is what they call a "prototype-based
language." http://en.wikipedia.org/wiki/Prototype-based_programming.
You can emulate an OOP system with a prototype-based language.

Prototype languages *are* OOP. Note that it is called OBJECT oriented
programming, not class oriented, and prototype-based languages are based
on objects just as much as class-based languages. They are merely two
distinct models for OOP.
 
Roy Smith

Steven D'Aprano said:
Prototype languages *are* OOP. Note that it is called OBJECT oriented
programming, not class oriented, and prototype-based languages are based
on objects just as much as class-based languages. They are merely two
distinct models for OOP.

One of the nice things about OOP is it means so many different things to
different people. All of whom believe with religious fervor that they
know the true answer.
 
Mark Janssen

One of the nice things about OOP is it means so many different things to
different people. All of whom believe with religious fervor that they
know the true answer.

Here's a simple rule to resolve the ambiguity. Whoever publishes
first, gets to claim origin of a word and its usage, kind of like a
BDFL. The rest can adapt around that, make up their own word, or be
corrected as the community requires.
 
Ned Batchelder

Here's a simple rule to resolve the ambiguity. Whoever publishes
first, gets to claim origin of a word and its usage, kind of like a
BDFL. The rest can adapt around that, make up their own word, or be
corrected as the community requires.

You won't solve the problem of confusing, ambiguous, or conflicting
terminology by making up a rule. "Object-oriented" means subtly
different things to different people. It turns out that computing is a
complex field with subtle concepts that don't always fit neatly into a
categorization. Python, Java, Javascript, Ruby, Smalltalk, Self, PHP,
C#, Objective-C, and C++ are all "object-oriented", but they also all
have differences between them. That's OK. We aren't going to make up a
dozen words as alternatives to "object-oriented", one for each language.

You seem to want to squeeze all of computer science and programming into
a tidy hierarchy. It won't work, it's not tidy. I strongly suggest you
read more about computer science before forming more opinions. You have
a lot to learn ahead of you.

--Ned.
 
Mark Janssen

You won't solve the problem of confusing, ambiguous, or conflicting
terminology by making up a rule. "Object-oriented" means subtly different
things to different people.

That's a problem, not a solution.
It turns out that computing is a complex field
with subtle concepts that don't always fit neatly into a categorization.

But that is the point of having a *field*.
Python, Java, Javascript, Ruby, Smalltalk, Self, PHP, C#, Objective-C, and
C++ are all "object-oriented", but they also all have differences between
them. That's OK. We aren't going to make up a dozen words as alternatives
to "object-oriented", one for each language.

Well, you won't, but other people *in the field* already have,
fortunately. They have names like dynamically-typed,
statically-typed, etc.
You seem to want to squeeze all of computer science and programming into a
tidy hierarchy.

No on "squeeze" and "tidy". Maybe on "hierarchy".
It won't work, it's not tidy. I strongly suggest you read
more about computer science before forming more opinions. You have a lot to
learn ahead of you.

Okay, professor is it, master? What is your provenance anyway?

-- :)
 
Steven D'Aprano

That's a problem, not a solution.

It's a fact, not necessarily a problem.

"Sandwich" means subtly different things to different people in different
places, but human beings manage to cope, and very few people accidentally
eat a sandwich of differently doped silicon crystals (i.e. a transistor)
when they intended to eat a sandwich of bread and cheese.

So long as people recognise the existence and nature of these subtle
differences, it's all good. Java's OOP model is different from Python's,
which is different from Lua's, which is different from Smalltalk's.
That's all grand, they all have their strengths and weaknesses, and if
all programming languages were the same, there would only be one. (And it
would probably be PHP.)

But that is the point of having a *field*.

Reality is the way it is. However we would like fields of knowledge to
neatly fit inside pigeonholes, they don't.

Well, you won't, but other people *in the field* already have,
fortunately. They have names like dynamically-typed, statically-typed,
etc.

They are not names for variations of OOP. They are independent of whether
a language is OOP or not. For example:


Java is statically typed AND object oriented.
Haskell is statically typed but NOT object oriented.

Python is dynamically typed AND object oriented.
Scheme is dynamically typed but NOT object oriented.
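
For the Python row of that classification, a quick illustration of what "dynamically typed" means in practice:

```python
# Dynamic typing: types belong to values, not to names,
# and every value is an object in the class hierarchy.
x = 3
print(type(x))                 # <class 'int'>
x = "three"                    # rebinding a name to another type is legal
print(type(x))                 # <class 'str'>
print(isinstance(x, object))   # True -- even strings are objects
```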
 
rusi

That all being said, the thrust of this whole effort is to possibly
advance Computer Science and language design, because in-between the
purely concrete "object" architecture of the imperative programming
languages and the purely abstract object architecture of
object-oriented programming languages is a possible middle ground that
could unite them all.

Just been poking around with eclipse.
And saw this: http://softwaresimplexity.blogspot.in/2013/02/where-java-fails.html

For context, the guy seems to be big in the Java/Eclipse community,
or at least is well hyped -- he's a finalist for Eclipse's best
developer award.

This context is needed to underscore something he says in one of the
comments on that page:

"OO just sucks…"
 
