Chris Carlen
Hi:
From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP, and I tend to relate to them.
However, in making their case, those articles were no more objective
than the descriptions of OOP I've read. That is, what objective
data/studies/research indicate that a particular problem can be solved
more quickly by the programmer, or that the solution is more efficient
in execution time/memory usage, when implemented via OOP vs. procedural
programming?
The problem for me is that I've programmed extensively in C and .asm on
PC DOS way back in 1988. Then I didn't program for nearly 10 years,
during
which time OOP was popularized. Starting in 1999 I got back into
programming, but the high-level-ness of PC programming and the
completely foreign language of OOP repelled me. My work was in analog
and digital electronics hardware design, so naturally I started working
with microcontrollers in .asm and C. Most of my work involves low-level
signal conditioning and real-time control algorithms, so C is about as
high-level as one can go without seriously losing efficiency. The
close-to-the-machine-ness of C is ideal here. This is a realm that I
truly enjoy and am comfortable with.
Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.
Perhaps the only thing that may have clicked regarding OOP is that in
certain cases I might prefer a higher-level approach to tasks which
involve dynamic memory allocation. If I don't need the execution
efficiency of C, then OOP might produce working results faster by not
having to worry about the details of memory management, pointers, etc.
But I wonder whether OOP programmers spend as much time creating
classes and organizing everything into the OOP paradigm as the C
programmer spends just writing the code.
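For illustration, here is roughly the contrast I mean (an untested
Python sketch; as I understand it, the built-in list manages its own
storage, so there is no equivalent of malloc/realloc/free):

    # Growing a buffer of samples with no explicit memory management.
    # In C this would mean malloc/realloc/free and pointer
    # bookkeeping; here the list resizes itself as needed.
    samples = []
    for value in range(1000):
        samples.append(value)   # storage grows automatically
    print(len(samples), samples[:5])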
Ultimately I don't care what the *name* is for how I program. I just
need to produce results. So that leads back to objectivity. I have a
problem to solve, and I want to find a solution that is as quick as
possible to learn and implement.
Problem:
1. How to most easily learn to write simple PC GUI programs that will
send data to remote embedded devices via serial comms, and perhaps
incorporate some basic (x,y) type graphics display and manipulation
(simple drawing program). Data may result from user GUI input, or from
parsing a text config file (see the sketch after this list). The
solution need not be efficient in machine resource utilization; the
emphasis is on the quickness with which the programmer can learn and
implement it.
2. Must be cross-platform: Linux + Windows. This factor can have a big
impact on whether it is necessary to learn a new language, or stick with
C. If my platform were only Linux I could just learn GTK and be done
with it. I wouldn't be here in that case.
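For concreteness, here is the kind of thing I have in mind for the
config-file path (an untested sketch assuming pySerial; the port
names, the file name "settings.txt", and its key=value format are all
placeholders):

    import sys
    import serial  # pySerial

    # Placeholder port names; the real device names would differ.
    PORT = "COM1" if sys.platform.startswith("win") else "/dev/ttyS0"

    def load_config(path):
        # Parse key=value lines, skipping blanks and # comments.
        config = {}
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith("#"):
                    key, _, value = line.partition("=")
                    config[key.strip()] = value.strip()
        return config

    cfg = load_config("settings.txt")
    link = serial.Serial(PORT, baudrate=9600, timeout=1)
    try:
        for key, value in cfg.items():
            link.write(("%s=%s\n" % (key, value)).encode("ascii"))
    finally:
        link.close()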
Possible solutions:
Form 1: Use C and choose a library that will enable cross-platform GUI
development.
Pro: Don't have to learn a new language.
Con: Will probably have difficulty with a cross-platform
implementation of serial comms, which would likely need to be written
twice. This will waste time.
Form 2: Use Python with pySerial and Tkinter or wxPython.
Pro: Cross-platform goal will likely be achieved fully. Have a
programmer nearby with extensive experience who can help.
Con: Must learn a new language and library, and possibly a completely
new way of thinking (OOP), not just a new language syntax. This might
be difficult.
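To gauge how foreign this would actually be, here is an untested
sketch of the GUI side (Tkinter, Python 3 spelling; in Python 2 the
module is "Tkinter"; the serial hookup from the earlier sketch would
go in send()):

    import tkinter as tk

    root = tk.Tk()
    root.title("Point editor")

    # An (x,y) canvas: clicking plots a point, as in a simple
    # drawing program.
    canvas = tk.Canvas(root, width=400, height=300, bg="white")
    canvas.pack()
    points = []

    def plot(event):
        points.append((event.x, event.y))
        canvas.create_oval(event.x - 2, event.y - 2,
                           event.x + 2, event.y + 2, fill="black")

    canvas.bind("<Button-1>", plot)

    def send():
        # Placeholder: write `points` out over the serial link here,
        # e.g. with pySerial as sketched above.
        print("would send %d points" % len(points))

    tk.Button(root, text="Send", command=send).pack()
    root.mainloop()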
Form 3: Use LabVIEW
Pro: I think that the cross-platform goal can be met.
Con: Expensive. I would prefer to use an Open Source solution, but
that preference matters less than the $$$. I have also generally found
the 2D diagrammatic programming language "G" as repellent as OOP. I
suspect that it may take as much time to learn LabVIEW as Python. In
that case the time spent on Python would be better spent, since I
would be learning something foundational as opposed to just learning
how to negotiate someone's proprietary environment and drivers.
Comments appreciated.
--
Good day!
________________________________________
Christopher R. Carlen
Principal Laser&Electronics Technologist
Sandia National Laboratories CA USA