> Considering that rapiding took about 1200ms (ish - again, cold cache)
> previously, adding even just 250ms is noticeable.
Please excuse my skepticism, but in my experience, that would probably
mean in practice:
.... rapiding took about 1200ms, plus or minus 200ms, plus 500ms if the
system is under load, plus 800ms if the developer vagued out for a
moment, plus 1900ms if he happened to be scratching an itch, plus 2700ms
if the anti-virus happened to be scanning something, plus 4100ms if the
dev decided this was a good time to take a sip of coffee, plus 437000ms
if he needed to make the coffee first, plus 72000000ms if he was just
taking a moment to check something on Reddit or answer an email...
I don't have a lot of sympathy for this sort of micro-optimization of
interactive software, where the random variation from run to run is often
larger than the time being optimized, and where the human element is
usually one or two orders of magnitude greater still. Yes, developers
complain when they have to wait for the computer for half a second, or at
least the one time in thirty that they *notice* that they're waiting. The
other 29 times the computer is actually waiting for them.
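If you want to put a number on that run-to-run variation, here's a
rough sketch of the sort of measurement I mean (Python assumed, and the
task, sleep time and jitter are all invented purely for illustration):

    import random
    import statistics
    import time

    def task():
        # Stand-in for whatever "rapiding" does. The ~1200ms base
        # and the jitter are made-up figures for demonstration.
        time.sleep(1.2 + random.uniform(-0.2, 0.5))

    samples = []
    for _ in range(10):
        start = time.perf_counter()
        task()
        # Record elapsed wall-clock time in milliseconds.
        samples.append((time.perf_counter() - start) * 1000)

    print("mean:  %.0f ms" % statistics.mean(samples))
    print("stdev: %.0f ms" % statistics.stdev(samples))
    print("range: %.0f-%.0f ms" % (min(samples), max(samples)))

If the spread you get back dwarfs the 250ms you're chasing, good luck
demonstrating that the optimization did anything at all.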
Still, I'm probably no better. Only instead of optimizing code, I tend to
"optimize" travel time with "short cuts" that are guaranteed[1] to shave
off a minute from a journey that takes thirty minutes, plus or minus six
minutes. Show me a way to avoid waiting at traffic lights for 30s, and
I'll take it, even if it means waiting for a break in the traffic at a
Give Way sign for three minutes. So I shouldn't mock too much.
You're right, of course, that 1/4 second is noticeable. I just find it
hard to credit that it's *significant* in the circumstances you're
describing. But I could be wrong.
[1] Guarantee void in the presence of rain, fog, heavy winds, light
winds, police radar traps, industrial action, civil protests, riots,
wars, earthquakes, acts of God, or other cars.