E. Robert Tisdale said:
I used Google
http://www.google.com/
to search for
+"premature optimization"
and I found lots of stuff.
<snip>
Here's a document I came across a while ago that I like. It talks
specifically about Perl, but most of the points are equally valid for
other languages:
http://magnonel.guild.net/~schwern/talks/How_To_Be_Lazy/full_slides/rules_of_optimization.html
Snippet:
The Rules Of Optimization
Rule #0 - These are not rules
They are heuristics. They're not hard and fast, just a strong set of
guidelines. They should be walked through in order, deciding at each
one whether it should be violated.
Rule #1 - Don't
Do not optimize. Optimization is the killer of schedules and destroyer
of clean code.
* Optimizations increase code entropy
The more you screw with things, the weirder you make your code.
* They will often wind up slowing down the whole system
For example, pseudo-hashes were added as a way to have faster,
more memory-efficient hashes with fixed keys. They're faster by
about 10-15% when used carefully. However, the extra code slows
down *all* hashes and arrays by 10-15%.
* Adds bugs
It's new code, and often more complicated than what it replaces. New code
means new bugs.
* Takes more time
Often the unoptimized version is the fastest one to code.
* Leaks encapsulation
Many optimizations remove subroutine calls or increase the scope
of routines.
* Readability damaged
Micro-optimizations often do funny tricks to get more speed out.
For example, replacing lexical variables with $_
* Hampers future optimizations
Often, adding in a poorly thought-out optimization will actually
make future, more wide-ranging ones more difficult, for a net loss
in the long run. Consider the case of replacing $obj->foo with
$obj->{foo} (this is bad, BTW; see the sketch after this list).
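
To make that last example concrete, here's a rough sketch of my own
(not from Schwern's slides; the Counter class is made up) of why
bypassing the accessor is bad: the moment the class changes how it
stores the value internally, every caller that used $obj->{foo}
breaks, while callers that paid for the method call keep working.

    #!/usr/bin/perl
    use strict;
    use warnings;

    package Counter;    # hypothetical class, for illustration only

    sub new {
        my ($class, %args) = @_;
        # The value used to live in $self->{foo}; a later refactor
        # (a rename, lazy computation, whatever) moved it to _foo.
        my $self = { _foo => $args{foo} };
        return bless $self, $class;
    }

    # The accessor is the real interface; it hides the storage detail.
    sub foo { return $_[0]->{_foo} }

    package main;

    my $obj = Counter->new( foo => 42 );

    print $obj->foo(), "\n";    # 42: the "slow" accessor still works
    print defined $obj->{foo} ? $obj->{foo} : "undef (oops)", "\n";
                                # the "optimized" direct access broke

If profiling really does show the accessor on a hot path, that's the
time to weigh Rule #1 against the saved method calls, not before.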
-Kevin