Ben Bacarisse
Rainer Weikusat said:
Ben Bacarisse said:
[...]
ela said:
I'd appreciate it if I could learn more from you about your thinking
philosophy. As I said previously, I could only think of a lot of "if"s
and "hash"es and was never able to use functions to wrap up some of the
concepts. Would you mind telling me which cues trigger you to think
about using a function?
There's no easy answer to that. You must try to get into the habit of
dreaming. You think: what function, if it were available, would make
the job easier? The academic answer is that you think "top down" -- you
imagine the very highest level of the program before you have any idea
how to write it:
table = read_table();
for each group:
    print classify(group, table);
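
To make that concrete, here's roughly what the top level might look
like in Python, say. The names read_table and groups_of, the
tab-separated input format, and the group id in the first column are
just placeholders for illustration; classify is deliberately a stub,
because writing it is the next step:

import sys
from collections import defaultdict

def read_table(stream):
    # One row per line, tab-separated; the first field names the group.
    return [line.rstrip("\n").split("\t") for line in stream if line.strip()]

def groups_of(table):
    # Gather each group's data rows under its group id.
    groups = defaultdict(list)
    for row in table:
        groups[row[0]].append(row[1:])
    return groups

def classify(group, table):
    return 'inconsistent'  # stub: refined in the next step

def main():
    table = read_table(sys.stdin)
    for name, group in groups_of(table).items():
        print(name, classify(group, table))

if __name__ == "__main__":
    main()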
You know that classify will need both the group list and the full table
to do its job so you pass these as parameters. Then you break down
classify:
classify(group, table):
    for each column in table
        top_item = most_common_item_in(column, table);
        freq = freq_of(top_item, column, table);
        if freq > threshold
            return top_item
    return 'inconsistent'
Here you go "ah, classify needs to know the threshold", so you revise
the parameter list.
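
Carrying on the Python sketch, classify with the revised parameter
list might come out like this. I've written most_common_item_in and
freq_of the obvious way, and I've taken them to operate on the group's
rows rather than the whole table, which is one plausible reading of
the pseudocode:

def most_common_item_in(column, rows):
    # Scan one column of the rows and return its most frequent value.
    items = [row[column] for row in rows]
    return max(set(items), key=items.count)

def freq_of(item, column, rows):
    # Scan the same column again and count occurrences of one value.
    return sum(1 for row in rows if row[column] == item)

def classify(group, table, threshold):
    # The group is classified by the first column in which some item
    # is frequent enough; otherwise it is 'inconsistent'.
    for column in range(len(group[0])):
        top_item = most_common_item_in(column, group)
        if freq_of(top_item, column, group) > threshold:
            return top_item
    return 'inconsistent'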
When you write most_common_item_in and freq_of you will find that they
do very similar things, and you may decide to combine them. That's what
I did.
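
In the Python sketch, for instance, one way the merge can go is via
collections.Counter, which reports an item and its count together, so
the two scans collapse into one call per column:

from collections import Counter

def top_item_and_freq(column, rows):
    # One pass over the column gives both answers: the most common
    # item and how often it occurs.
    (item, freq), = Counter(row[column] for row in rows).most_common(1)
    return item, freq

classify then makes a single call per column in place of the two
helpers.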
The trouble with this plan (and why I say this is the academic answer)
is that this breakdown interacts at all stages with the design of the
data structures that you will use.
I assure you that this is not 'the academic answer' but a perfectly
workable design methodology
I did not say it was unworkable. It's perfectly workable. However, the
description I gave is "academic" in the sense that it is too simple. At
least that is my experience. One rarely finds novel algorithms that
way, so when a problem has an interesting algorithmic core, top-down
design will often miss some interesting solutions. Also it does not
always lead to good data structures first time round. I often find
myself backing up a few levels, re-jigging the data and setting off
again in a slightly different direction.
which has essentially been ignored ever
since its invention in the last century, using whatever pretext seemed
to be most suitable for that.
Has it been ignored? I was taught it and I taught it to others. The
last time I knew about such things (about a decade ago) it was widely
taught in UK universities.
For as long as the intent is to create
working and easily maintainable code in order to solve problems (as
opposed to, say, "contribute your name to the Linux kernel changelog
for the sake of it being in there") 'stepwise refinement' is
definitely worth trying instead of just assuming that it cannot
possibly work and hence - thank god! - 'we' can continue with the
time-honoured procedure of 'hacking away at it until it's all in pieces'.
I hope you did not think I was suggesting that it could not possibly
work. I explained my stepwise approach to the problem precisely because
it led to a simple and clean solution. Maybe you took "academic" to
mean "impractical" -- I meant only "simplified for pedagogic reasons".