John W. Kennedy
Snail said:
Thank you for your response. So does this behavior stem from hardware,
like how a CPU handles it? (You said "hardware architect's design"
above.)
The design of C supposes that / and % for integers are mostly used with
positive numbers, and that object code should therefore be generated
that will run as fast as possible with positive numbers. The "divide"
instruction on most architectures will do this easily. But to force the
results for negative numbers to fit any particular pattern will take
extra instructions on some machines. Therefore, C gives the compiler
designer the freedom to create the fastest code for positive numbers, no
matter what the results are for negative numbers, and a well-thought-out
C compiler will normally do that. (Some compilers may include options to
force one philosophy or another.)
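To make that concrete, here is a minimal sketch. (One caveat: since
C99 the language does pin this down, requiring truncation toward
zero; it is under the older C89/C90 rules that the rounding direction
for negative operands was implementation-defined, as described above.)

    #include <stdio.h>

    int main(void)
    {
        /* With truncation toward zero (required since C99, common but
           not guaranteed under C89/C90), the quotient rounds toward 0
           and the remainder takes the sign of the dividend. */
        printf("-7 / 2 = %d\n", -7 / 2);  /* -3 truncated; -4 if floored */
        printf("-7 %% 2 = %d\n", -7 % 2); /* -1 truncated;  1 if floored */
        return 0;
    }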
Snail said:
I did not realize that C (and C++?) do not define it themselves. If it
is related to hardware, as I am now suspecting, that might explain why.
But even so, I don't think it would have been difficult to program the
conversion algorithms to work a certain way. It seems, from what I've
gathered so far in this thread, that that is what Perl does.
Perl does not compile to true object code, so forcing the decision has
only a very minor effect on speed. Forcing a rule on C might make divide
operations expand from about two instructions to about six. Forcing it
in Perl might make it expand from, say, about fifty to about fifty-four.
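For what it's worth, here is a rough sketch of those "extra
instructions": a floored modulus (the rule Perl uses for %) built on
top of C's truncating operators. The name floor_mod is just for
illustration.

    #include <stdio.h>

    /* Floored modulus: the result takes the sign of the divisor, as
       Perl's % does. The test-and-adjust after the divide is the
       extra work a forced rule would impose on every % operation. */
    int floor_mod(int a, int b)
    {
        int r = a % b;                      /* truncating remainder */
        if (r != 0 && (r < 0) != (b < 0))
            r += b;                         /* shift sign to match divisor */
        return r;
    }

    int main(void)
    {
        printf("%d\n", floor_mod(-7, 2));   /* 1, matching Perl's -7 % 2 */
        printf("%d\n", floor_mod(7, -2));   /* -1, matching Perl's 7 % -2 */
        return 0;
    }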