Andrew said:
Doug Tolton:
That point has been made over and over to you. The argument is
that expressive power for a single developer can, for a group of
developers and especially those comprised of people with different
skill sets and mixed expertise, reduce the overall effectiveness of the
group.
Yes, and I have repeatedly stated that I disagree with it. I simply do
not buy that allowing expressiveness via high-level constructs detracts
from the effectiveness of the group. That argument is plainly
ridiculous; if it were true, then Python would be worse than Java,
because Python is *far* more expressive.
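To put a toy example behind that claim (my own illustration, nothing
definitive): a filter-and-transform that is a single comprehension in
Python takes an explicit loop and a temporary list in Java-1.4-style
code.

```python
# Toy illustration of expressiveness: squares of the even numbers.
nums = [1, 2, 3, 4, 5, 6]

# Python: one declarative line.
squares_of_evens = [n * n for n in nums if n % 2 == 0]

# A Java-1.4-style translation, spelled out imperatively for comparison:
result = []
for n in nums:
    if n % 2 == 0:
        result.append(n * n)

assert squares_of_evens == result == [4, 16, 36]
```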
If this is indeed the crux, then any justification which says "my brain"
and "I" is suspect, because that explicitly ignores the argument.
Apparently you can't read very well. I simply stated that I believe
that issue to be our point of contention; I never said I believe it
because of some vague theory inside my head.
By
comparison, Alex's examples bring up
- teaching languages to others
- interference between his code and others' (the APL example)
- production development
"Imagine a group of, say, a dozen programmers, working together ...
to develop a typical application program of a few tens of thousands
of function points -- developing about 100,000 new lines of delivered
code plus about as much unit tests, and reusing roughly the same amount"
- writing books for other people
With the exception of writing books for other people, I have done all of
those things. I have worked on fairly large development teams of more
than 20 people. I have built multi-million-dollar systems. I have
taught people programming languages, both on the job and as a university
course. So don't take the attitude that I have no idea what I'm
talking about.
Macros are better precisely for large groups of people. Any time you
are building systems with large groups of people, and you want to have
reusable code, you abstract it. There are all kinds of ways to do
that; Macros are just one. I have never seen any large, successful
coding project that does not abstract things well. If you are incapable
of abstracting software successfully and usefully, then no non-trivial
project will succeed.
which at the very least suggests the expertise and background
by which to evaluate the argument. It may be that his knowledge of
how and when to use macros is based on the statements of people he
respects rather than personal experience, but given the discussions on
this topic and the exhibited examples of when macros are appropriately
used, it surely does seem that metaclasses, higher-order functions, and
iterators can be used to implement a solution with a roughly equal amount
of effort and clarity. The only real advantage to macros I've seen is the
certainty of "compile-time" evaluation, hence better performance than
run-time evaluation.
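To sketch the kind of equivalence being claimed: a construct that a
Lisper might write as a with-timing macro comes out in Python as a
plain higher-order function. This is a minimal sketch with hypothetical
names, not a claim about any particular Lisp codebase.

```python
import functools
import time

def timed(func):
    """Higher-order function standing in for a Lisp-style
    'with-timing' macro: wraps func and reports elapsed time."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__} took {elapsed:.6f}s")
        return result
    return wrapper

@timed
def total(n):
    # Some work worth timing.
    return sum(range(n))

assert total(1000) == 499500
```

The decorator runs at function-definition time rather than macro-expansion
time, but from the caller's point of view the effect is the same wrapping
of a body of code.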
As I said to Alex, that's because you don't understand Macros. Relying
on what someone else says about Macros only gets you so far. At some
point, if you don't want to look like a complete idiot, you might want
to really learn them or just shut up about them. It's very difficult to
have a conversation with someone who really doesn't know what they are
talking about, but is instead just spouting an opinion they picked up
from someone else. The discussion doesn't go anywhere at that point.
Macros are like anything else: a tool in your toolbox. If you know
how to use them, they can be used very effectively. If you don't, you
can probably work around the problem and solve it a different way.
However, as the toolset differential gets bigger, the person with more
tools in their arsenal will be able to outperform the people with
fewer.
Alex:
You mean "estimating"; for measuring I suspect you can use a
combination of a clock and a calendar. (This from a guy who recently
posted that the result of 1+1 is 4.)
No, what I was referring to wasn't estimation. Rather, I was referring
to the study that found that programmers on average write the same
number of lines of code per year regardless of the language they write
in. Therefore the only way to increase productivity is to write
software in a language that needs fewer lines to accomplish something
productive. See Paul Graham's site for a discussion.
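The arithmetic behind that claim, with made-up round numbers rather
than the study's actual figures:

```python
# If annual LOC output is roughly constant, productivity scales
# inversely with the lines a language needs per feature.
# All figures here are illustrative, not from the study.
LINES_PER_YEAR = 10_000  # assumed constant across languages
loc_per_feature = {"verbose_lang": 100, "concise_lang": 25}

features_per_year = {lang: LINES_PER_YEAR // loc
                     for lang, loc in loc_per_feature.items()}

# At a quarter of the lines per feature, the concise language
# delivers four times the features for the same annual line count.
assert features_per_year["verbose_lang"] == 100
assert features_per_year["concise_lang"] == 400
```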
You should use McConnell as a more recent reference than Brooks.
(I assume you are arguing from Mythical Man Month? Or from his
more recent writings?) In any case, in Rapid Development McConnell
considers various alternatives then suggests using LOC, on the view
that LOC is highly correlated with function points (among 3rd
generation programming languages! see below) and that LOC has a
good correlation to development time, excluding extremes like APL
and assembly. However, his main argument is that LOC is an easy
thing to understand.
The tricky thing about using McConnell's book is the implications
of table 31-2 in the section "Using Rapid Development Languages",
which talks about languages other than the 3rd generation ones used
to make his above estimate.
Table 31-2 shows the approximate "language levels" for a wider
variety of languages than Table 31-1. The "language level" is
intended to be a more specific replacement for the level implied
by the phrases "third-generation language" and "fourth-generation
language." It is defined as the number of assembler statements
that would be needed to replace one statement in the higher-level
language. ...
The numbers ... are subject to a lot of error, but they are the best
numbers available at this time, and they are accurate enough to
support this point: from a development point of view, you should
implement your projects in the highest-level language possible. If
you can implement something in C, rather than assembler, C++
rather than C, or Visual Basic rather than C++, you can develop
faster.
And here's Table 31-2:

                             Statements per
Language              Level  Function Point
--------              -----  --------------
Assembler               1         320
Ada 83                  4.5        70
AWK                    15          25
C                       2.5       125
C++                     6.5        50
Cobol (ANSI 85)         3.5        90
dBase IV                9          35
spreadsheets          ~50           6
Focus                   8          40
Fortran 77              3         110
GW Basic                3.25      100
Lisp                    5          65
Macro assembler         1.5       215
Modula 2                4          80
Oracle                  8          40
Paradox                 9          35
Pascal                  3.5        90
Perl                   15          25
Quick Basic 3           5.5        60
SAS, SPSS, etc.        10          30
Smalltalk (80 & V)     15          20
Sybase                  8          40
Visual Basic 3         10          30

Source: Adapted from data in 'Programming Languages
Table' (Jones 1995a)
I'll use Perl as a proxy for Python; given that it was pre-OO
Perl, I think it's reasonable to treat that as a minimum level for
Python. Compare the Lisp and Perl numbers:

Lisp    5   65
Perl   15   25
You are saying that Python and Perl are similarly compact?!?
You have got to be kidding, right?
Perl is *far* more compact than Python is. That is just ludicrous.
and the difference in "statements per function point" (which isn't
quite "LOC per function point") is striking. It suggests that
Python is more than twice as concise as Lisp, so if LOC is
used as the estimate for implementation time, then it's a strong
recommendation to use Python instead of Lisp, because it
will take less time to get the same thing done. And I do believe
Lisp had macros back in the mid-1990s.
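To make that comparison explicit, a quick back-of-the-envelope sketch
using the Table 31-2 numbers (the 500-function-point project size is
made up for illustration):

```python
# Statements per function point, taken from Table 31-2 above.
stmts_per_fp = {"Lisp": 65, "Perl": 25, "C": 125, "Smalltalk": 20}

# Estimated statements for a hypothetical 500-function-point project.
project_fp = 500
estimates = {lang: s * project_fp for lang, s in stmts_per_fp.items()}

assert estimates["Lisp"] == 32_500
assert estimates["Perl"] == 12_500

# The ratio driving the argument: 65 / 25 = 2.6, i.e. the table has
# Lisp needing about 2.6x the statements of (pre-OO) Perl per feature.
assert stmts_per_fp["Lisp"] / stmts_per_fp["Perl"] == 2.6
```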
Sadly, this is a secondary reference and I don't have a
copy of
Jones, Capers, 1995a. "Software Productivity Research
Programming Languages Table," 7th ed. March 1995.
and the referenced URL of
www.spr.com/library/langtbl.htm
is no longer valid and I can't find that table on their site.
It's always nice just to chuck some arbitrary table into the
conversation which conveniently backs some point you were trying to
make, and which also conveniently can't be located for anyone to check
the methodology.
If you want some real world numbers on program length check here:
http://www.bagley.org/~doug/shootout/
Most of those programs are trivially small and didn't use Macros.
Macros, as well as higher-order functions etc., only come into play in
non-trivial systems.
I just don't buy these numbers or the chart from McConnell on faith. I
would have to see his methodology and understand what his motivation in
conducting the test was.
It was a long-winded digression into how LOC can be the
wrong basis by which to judge the appropriateness of a
language feature.
It still wasn't relevant to Macros. However, because neither of you
understand Macros, you of course think it is relevant.
See that smiley and the "--"? This is a throwaway point at the end
of the argument, and given Alex's noted verboseness, if it was a
serious point he would have written several pages on the topic.
This is something we are very much in agreement on.
My response was just a little dig, because it does seem to be indicative
of his attitude in general IMO.