Chris Carlen wrote:
Hi:
From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.
However, those articles were no more objective than the descriptions of
OOP I've read in making a case. Ie., what objective
data/studies/research indicates that a particular problem can be solved
more quickly by the programmer, or that the solution is more efficient
in execution time/memory usage when implemented via OOP vs. procedural
programming?
None. Definitively. wrt/ developer time, it's mostly a matter of
whether the paradigm fits your brain. If it does, you'll find it
easier; else choose another programming style. wrt/ cpu time and
memory, and using 'low-level' languages (C/C++/Pascal etc.), OO is
usually worse than procedural for simple programs. For more complex
ones, I'd say it tends to converge, since these programs, when
written procedurally, usually rely on many abstraction/indirection
layers anyway.
The problem for me is that I've programmed extensively in C and .asm on
PC DOS way back in 1988. Then didn't program for nearly 10 years during
which time OOP was popularized. Starting in 1999 I got back into
programming, but the high-level-ness of PC programming and the
completely foreign language of OOP repelled me. My work was in analog
and digital electronics hardware design, so naturally I started working
with microcontrollers in .asm and C. Most of my work involves low-level
signal conditioning and real-time control algorithms, so C is about as
high-level as one can go without seriously losing efficiency.
You may still want to have a look at some more functional languages like
Haskell, OCaml or Erlang. But if you find OO alien, I doubt you'll have
a strong feeling for functional programming.
The
close-to-the-machine-ness of C is ideal here. This is a realm that I
truly enjoy and am comfortable with.
Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.
OTOH, OO is about machines - at least as conceived by Alan Kay, who
invented the term and most of the concept. According to him, each
object is (a simulation of) a small machine.
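To make this a bit more concrete, here's a quick Python sketch (the
class and the names are mine, purely illustrative): an object bundles
some private state with the 'messages' it answers to, pretty much
like a small machine:

class Counter:
    """A tiny 'machine': internal state plus the messages it
    answers to."""
    def __init__(self):
        self.count = 0       # the machine's internal state

    def tick(self):          # a 'message' that advances the state
        self.count += 1

    def read(self):          # a 'message' that reports the state
        return self.count

c = Counter()
c.tick()
c.tick()
print(c.read())              # -> 2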
Perhaps the only thing that may have clicked regarding OOP is that in
certain cases I might prefer a higher-level approach to tasks which
involve dynamic memory allocation.
While OO without automatic memory management can quickly become a major
PITA, OO and GC are two orthogonal concepts - some languages have
builtin support for OO but nothing specific for memory management
(ObjectPascal, C++, ObjectiveC), and some non-OO languages do have
builtin memory management (mostly but not only in the functional camp).
If I don't need the execution
efficiency of C, then OOP might produce working results faster by not
having to worry about the details of memory management, pointers, etc.
It's not a feature of OO per se. But it's clear that not having (too
much) to worry about memory management greatly enhances productivity.
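As a trivial (made-up) illustration: no malloc/realloc/free
bookkeeping anywhere - the list grows as needed, and the runtime
reclaims everything once it's no longer referenced:

def read_samples(path):
    samples = []
    with open(path) as f:        # the file is closed for us on exit
        for line in f:
            samples.append(float(line))
    return samples               # the caller never frees anything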
But I wonder if the OOP programmers spend as much time creating classes
and trying to organize everything into the OOP paradigm as the C
programmer spends just writing the code?
Don't you design your programs? AFAICT, correct design is not easier
with procedural programming.
Now to answer your question, I'd say it depends on your experience of
OO, and of course on the kind of OO language you're using. With
statically typed languages that require explicit declarations - like
C++, Java etc. - you are forced into a lot of upfront design (way too
much IMHO). Dynamic
languages like Smalltalk, Python or Ruby are much more lightweight in
this area, and tend to favor a much more exploratory style - sketch a
quick draft on a napkin, start coding, and evolve the design while
you're coding.
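FWIW, here's the kind of throwaway draft I mean (names made up for
the example) - no type declarations, no interface to write up front,
and you can reshape it as the design evolves:

class Channel:
    # just enough to start exploring - no upfront declarations
    def __init__(self, name):
        self.name = name
        self.readings = []       # grows as the design does

    def record(self, value):
        self.readings.append(value)

ch = Channel("thermocouple-1")
ch.record(23.4)
ch.gain = 2.0                    # attributes can even be added on the fly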
And FWIW, Python doesn't *force* you into OO - while you'll be *using*
objects, you can write most of your code in a procedural way, and only
"fall down" into OO for some very advanced stuff.
Ultimately I don't care what the *name* is for how I program. I just
need to produce results.
Indeed !-)
So that leads back to objectivity. I have a
problem to solve, and I want to find a solution that is as quick as
possible to learn and implement.
Problem:
1. How to most easily learn to write simple PC GUI programs
GUIs are one of the best (and most successful) applications of OO -
and as a matter of fact, even GUI toolkits implemented in plain C tend
to take an OO approach (GTK+ being a clear example, but even the old
Pascal/C Mac GUI API does have a somewhat "object based" feeling).
that will
send data to remote embedded devices via serial comms, and perhaps
incorporate some basic (x,y) type graphics display and manipulation
(simple drawing program). Data may result from user GUI input, or from
parsing a text config file. Solution need not be efficient in machine
resource utilization. Emphasis is on quickness with which programmer
can learn and implement solution.
So what you want is a high-level, easy-to-learn language with a rich
collection of libraries. The GoodNews(tm) is that Python is one of the
possible answers.
2. Must be cross-platform: Linux + Windows.
Idem. You can even add most unices and MacOS X to the list.
This factor can have a big
impact on whether it is necessary to learn a new language, or stick with
C. If my platform was only Linux I could just learn GTK and be done
with it. I wouldn't be here in that case.
Possible solutions:
Form 1: Use C and choose a library that will enable cross-platform GUI
development.
Pro: Don't have to learn new language.
Con: Probably will have difficulty with cross-platform implementation
of serial comms. This will probably need to be done twice. This will
waste time.
Con: C is a low-level language (not a criticism - it was designed that
way), which greatly impacts productivity.
Con: the only serious C (not++) cross-platform GUI toolkit I know is
GTK+, which is less cross-platform than wxWidgets, and *is* OO.
Form 2: Use Python and PySerial and TkInter or wxWidgets.
I'd probably go for wxWidgets.
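FWIW, the Python binding is wxPython, and a minimal window is only a
few lines (a rough sketch - details may vary a bit with the wxPython
version):

import wx                    # wxPython, the binding for wxWidgets

app = wx.App()
frame = wx.Frame(None, title="Serial console")
frame.Show()
app.MainLoop()               # the same code runs on Linux and Windows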
Pro: Cross-platform goal will likely be achieved fully.
Very likely. There are a couple of things to take care of, but nothing
close to what you'd have to do in C.
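Mostly it boils down to platform details like port names - the
PySerial API itself stays the same everywhere. A rough sketch (port
name, speed and the command sent are made up for the example):

import serial                # PySerial

# only the port *name* is platform-specific: 'COM1' on Windows,
# '/dev/ttyS0' or '/dev/ttyUSB0' on Linux
port = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1)
port.write(b"SET GAIN 2\r\n")    # hypothetical device command
reply = port.readline()
port.close()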
Have a
programmer nearby with extensive experience who can help.
Con: Must learn new language and library.
Yes, obviously. The (other) GoodNews(tm) is that, by most estimates,
an experienced programmer can become productive in Python in a matter
of weeks at worst (some manage it in a few days). This won't mean
you'll master the language and use it at its best, but don't worry,
you'll get things done, and perhaps in less time than with C.
Must possibly learn a
completely new way of thinking (OOP)
Not necessarily. While Python is OO all the way down - meaning that
everything you'll work with will be an object (functions included) -
it doesn't *force* you into OO (IOW: you don't have to define classes
to write a Python program). You can as well use a procedural - or even
somewhat functional - approach, and most Python programs I've seen so
far are usually a mix of the three.
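A small (contrived) example of such a mix - plain procedural
functions, a functional-flavored map(), and method calls on objects:

def celsius_to_fahrenheit(c):    # plain procedural function
    return c * 9.0 / 5.0 + 32.0

readings = [20.0, 21.5, 23.0]

# functional flavor: map a function over a sequence
converted = list(map(celsius_to_fahrenheit, readings))

# OO flavor: lists and strings are objects with methods
readings.sort()
print(", ".join("%.1f" % f for f in converted))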
not just a new language syntax.
You forgot one of the most important parts of a language: idioms. And
it's definitely *not* idiomatic in Python to use classes when a
simpler solution (using plain functions and modules) is enough.
Not necessarily that much.
Form 3: Use LabVIEW
Pro: I think that the cross-platform goal can be met.
Con: Expensive. I would prefer to use an Open Source solution. But
that isn't as important as the $$$. I have also generally found the 2D
diagrammatical programming language of "G" as repelling as OOP. I
suspect that it may take as much time to learn LabVIEW as Python.
I don't have much knowledge of LabVIEW so I can't comment on this. But I
remember a thread here about G, and I guess you'll find Python much more
familiar - even if you'll need some 'thinking adjustment' to grok it.
In
that case the time spent on Python might be better spent since I would
be learning something foundational as opposed to basically just learning
how to negotiate someone's proprietary environment and drivers.
IMHO, the biggest gain (in learning Python vs LabVIEW) is that you'll
add a very valuable tool to your toolbox - the missing link between C
and shell scripts.
HTH