Michel Rouzic
I need to determine how long an addition takes and how long a multiplication takes. So far I've been using the clock() function in my programs to find out how long the CPU took to compute something, but the results I get are fairly inconsistent, inconsistent enough that I can't tell which of two pieces of code runs faster without running each about ten times and averaging the reported CPU times.
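Here's the basic pattern I've been using (a minimal sketch; my real test programs vary the loop body):

#include <stdio.h>
#include <time.h>

int main(void)
{
    volatile float x = 1.5f;   /* volatile so the compiler can't drop the loop */
    clock_t start, end;
    long i;

    start = clock();
    for (i = 0; i < 100000000L; i++)
        x = x + 0.001f;        /* the operation I'm trying to time */
    end = clock();

    printf("CPU time: %f s\n", (double)(end - start) / CLOCKS_PER_SEC);
    return 0;
}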
Right now my main problem is to determine whether three float additions are slower or faster than one float multiplication. I tried writing small programs that loop over some additions or some multiplications, something like the sketch below, but not only do I still get inconsistent results, I'm also not sure the test is even meaningful (I doubt that performing 3*5 about one billion times tells me much).
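For the comparison itself, the two loops look roughly like this (a sketch using the same clock() approach; the volatile variables are only there so the compiler can't fold or remove the arithmetic, and multiplying by 1.0f keeps the float from overflowing):

#include <stdio.h>
#include <time.h>

#define N 100000000L

int main(void)
{
    volatile float a = 1.0f;
    volatile float b = 1.000001f;   /* volatile so the compiler can't see its value */
    clock_t t;
    long i;

    t = clock();
    for (i = 0; i < N; i++) {       /* three float additions per iteration */
        a = a + b;
        a = a + b;
        a = a + b;
    }
    printf("3 additions x %ld:      %f s\n", N,
           (double)(clock() - t) / CLOCKS_PER_SEC);

    a = 1.0f;
    b = 1.0f;                       /* multiply by 1 so the value never overflows */
    t = clock();
    for (i = 0; i < N; i++)         /* one float multiplication per iteration */
        a = a * b;
    printf("1 multiplication x %ld: %f s\n", N,
           (double)(clock() - t) / CLOCKS_PER_SEC);

    return 0;
}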
So how do I reliably determine how long it takes my CPU to compute an addition or a multiplication (or other operations)?