C.DeRykus wrote:
x> If the OP's worry about size is justified, I'd go with this:
x> /.cc$/ and delete $h{$_} while defined ($_=each %h);
x> As your method builds two in-memory lists: one with all the keys and
x> one with only the keys meeting the criteria.
sure, that could be faster, but it may not be. you make N calls to
delete and N calls to each in a loop, and perl ops are the major
bottleneck in perl speed. if the slice lists are short enough (and they
could still be pretty long), IMO my solution should be faster. but only
benchmarking can tell for sure. my method does use more ram, but ram is
cheaper than speed.
I don't know how meaningful a raw opcode count is, but the hash-slice
delete does have fewer:
perl -MO=Concise -e 'delete @h{ grep !/\.cc$/, keys %h }' | wc -l
17

perl -MO=Concise -e '/.cc$/ and delete $h{$_} while defined ($_=each %h);' | wc -l
25
"Perl Hacks" (#82) cites B::COP timing info to see how long the
opcodes are taking.