counting in long variable in perl


Wayne Fulton

A CGI counter (code that I copied from somewhere) was resetting to zero
frequently. My guess is that it was when it got to 65535, overflowing
16 bits (I am not entirely sure of that, but it seems logical).

Code was:

sub incrementCount {
    $counterFile = "filenamehere.txt";
    if (-e $counterFile) {
        open(COUNT, "$counterFile") || die("Can't open $counterFile: $!\n");
    }
    $total = <COUNT>;
    chop $total;
    close(COUNT);
    $total++;
    open(COUNT, ">$counterFile") || die "$0: can't open $counterFile: $!\n";
    print COUNT "$total\n";
    close(COUNT);
}

Since Perl variables are not typed, I reasoned that it was the final
print doing the 16-bit reset. I changed it to

printf (COUNT "%lu\n", $total);

and seemingly have corrected the problem - it counts past 65535 now.

Questions:

Why was it resetting in the first place? I know mostly C, but I
thought Perl sensed what variable type was necessary? Why was it
truncated to a short? I assume this must be a print default, but I am
unable to find any reference to this in Programming Perl, 2nd edition.

Probably a seek 0 would be more efficient than two opens/closes, but is
this %lu format as good as any way of fixing the "short" problem?

Comments appreciated. Thanks.
 

Tad McClellan

Wayne Fulton said:
A CGI counter (code that I copied from somewhere) was resetting to zero
frequently.


That can happen in a multitasking environment when you
don't implement file locking, and the crufty code you snarfed
doesn't do file locking.

It was clearly written by an amateur. Be careful with it...

My guess is that it was when it got to 65535, overflowing
16 bits (I am not entirely sure of that, but it seems logical).


I doubt that that is it.

Numbers are double precision floating point internal to Perl; they can
go waaaaay beyond 65536.
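
A quick standalone check (nothing here is specific to the counter
script, it is just an illustration) shows a scalar passing the 16-bit
boundary without wrapping:

#!/usr/bin/perl
use strict;
use warnings;

# Perl scalars are not fixed-width integers, so incrementing
# past 65535 just keeps counting.
my $n = 65535;
$n++;
print "$n\n";    # prints 65536, not 0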

if (-e $counterFile) {
open(COUNT,"$counterFile") || die("Can't open $counterFile: $!\n");


There is a race there.

(and I don't see what the point of the -e file test is either.)

chop $total;


Looks like the code is 8 years old or so.

In contemporary Perl that is done with:

chomp $total;

close(COUNT);
open(COUNT,">$counterFile") || die "$0: can\'t open $counterFile:
$!\n";


There is yet another race.

Why was it resetting in the first place?


Because of a race, most likely.

Probably a seek 0 would be more efficient than two opens/closes,


It isn't a question of efficiency.

It is a question of what can work and what cannot work.

The code you posted cannot work (in a multitasking environment).

Comments appreciated. Thanks.


Throw that code where it belongs (in the trash) and rewrite it
to use file locking.


perldoc -q "\block"

How can I lock a file?

Why can't I just open(FH, ">file.lock")?

I still don't get locking. I just want to increment the number in the file. How can I do this?
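
A minimal sketch along the lines of that last FAQ answer, assuming the
count lives in a file called counter.txt (the name is just a
placeholder): one handle is opened read-write and locked, then read,
rewound, truncated, and rewritten, so the file never sits empty where
another process could read it.

#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:DEFAULT :flock);

my $counterFile = "counter.txt";    # placeholder name

# Open one read-write handle, creating the file if needed.
# Crucially, this does not truncate the existing contents.
sysopen(my $fh, $counterFile, O_RDWR | O_CREAT)
    or die "Can't open $counterFile: $!";

# Block until we hold an exclusive lock.
flock($fh, LOCK_EX) or die "Can't lock $counterFile: $!";

my $total = <$fh>;
$total = 0 unless defined $total;   # brand-new, empty file
chomp $total;
$total++;

# Rewind, wipe the old value, write the new one.
seek($fh, 0, 0)  or die "Can't rewind $counterFile: $!";
truncate($fh, 0) or die "Can't truncate $counterFile: $!";
print $fh "$total\n";

close($fh) or die "Can't close $counterFile: $!";   # releases the lock

Because the file is never opened with a bare ">", the truncate only
happens while the lock is held, which closes the window the original
script leaves open.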
 

Wayne Fulton

That can happen in a multitasking environment when you
don't implement file locking, and the crufty code you snarfed
doesn't do file locking.

It was clearly written by an amateur. Be careful with it...


It came from a free script site. I used this one routine out of it to
increment a count file, a long time back. It is locked prior to the call,
via a named file. I know flock is better, important, but not in this
case. The value of this file is the timestamp it was last written, but
since it does count to change that timestamp, I'm curious why it doesn't
count right, why it appears to reset.

A race might occur, but for a counter that can only increment by one, I
don't see how such a race could ever reset or decrease the count - it seems
to me that it might only miss a count or two, now and then. Such a miss is
not important in this case, but I'm puzzled why it resets.

Numbers are double precision floating point internal to Perl; they can
go waaaaay beyond 65536.

That's why I suspected the print statement, but I don't know the cause. I
take it that you don't think print is doing it.

I have not seen it reset to zero, nor turn over at 65K; that was just my
guess. The evidence I have seen (a few times) is that a prior larger
number like, say, 41234 might be 12926 the next time I look (like after a few
months). I've never seen it past 65K, and I know it is counting several
times an hour, 24/7, so even if it is missing many of the counts, it
should be way over 65K by now, and I don't see how it can ever decrease.

OK, thanks, I will keep looking for the cause. I'll probably add flock
too, but I don't see how that can reset a counter.
 

Sam Holden

It came from a free script site. I used this one routine out of it to
increment a count file, a long time back. It is locked prior to the call,
via a named file. I know flock is better, important, but not in this
case. The value of this file is the timestamp it was last written, but
since it does count to change that timestamp, I'm curious why it doesn't
count right, why it appears to reset.

A race might occur, but for a counter that can only increment by one, I
don't see how such a race could ever reset or decrease the count - it seems
to me that it might only miss a count or two, now and then. Such a miss is
not important in this case, but I'm puzzled why it resets.

If it is of the form:

1. open for reading
2. read number
3. increment number
4. close file #can leave this one out
5. open for writing
6. write number
7. close file

Then there is an obvious race that resets the counter; in fact, it's the one
I think I've seen used as an example the most, since it is the reason why
locking a file after opening it for plain old writing doesn't work.

Two processes execute the code. The first process runs all the way to
step 5 (ie. it has just opened the file for writing) then the second
process starts up and executes steps 1 and 2.

The second process will read nothing from the file, since the first
process truncated it upon opening. Since perl will convert the undef to
0 when arithmetic is done on it, the second process will end up writing
a 1 to the file.
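
For anyone who wants to see that failure mode without juggling two
real processes, here is a small self-contained illustration;
demo_counter.txt is an arbitrary scratch file, and the two handles
stand in for the two processes:

#!/usr/bin/perl
use strict;
use warnings;

my $file = 'demo_counter.txt';      # arbitrary scratch file

# Seed the counter with some prior value.
open(my $seed, '>', $file) or die $!;
print $seed "41234\n";
close($seed);

# "Process 1" reaches step 5: opening with '>' truncates at once.
open(my $writer, '>', $file) or die $!;

# "Process 2" now runs steps 1 and 2 before process 1 has printed.
open(my $reader, '<', $file) or die $!;
my $total = <$reader>;              # undef: the file is empty
close($reader);

# Step 3: undef counts as 0, so the increment yields 1.
$total++;
print "process 2 would write: $total\n";    # prints 1, not 41235

close($writer);
unlink $file;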
 

Wayne Fulton

If it is of the form:

1. open for reading
2. read number
3. increment number
4. close file #can leave this one out
5. open for writing
6. write number
7. close file

Then there is an obvious race that resets the counter; in fact, it's the one
I think I've seen used as an example the most, since it is the reason why
locking a file after opening it for plain old writing doesn't work.

Two processes execute the code. The first process runs all the way to
step 5 (ie. it has just opened the file for writing) then the second
process starts up and executes steps 1 and 2.

The second process will read nothing from the file, since the first
process truncated it upon opening. Since perl will convert the undef to
0 when arithmetic is done on it, the second process will end up writing
a 1 to the file.


Ah so, I can see that reset now. Thanks very much Sam.
 

Martien Verbruggen

It came from a free script site.

In that case, maybe you should have a look at
http://nms-cgi.sourceforge.net/ before downloading more from that
site. The NMS stuff is generally of a good quality, and doesn't suffer
from many problems that other free CGI stuff suffers from.

They do have a counter script there as well.

Martien
 

Wayne Fulton

In that case, maybe you should have a look at
http://nms-cgi.sourceforge.net/ before downloading more from that
site. The NMS stuff is generally of a good quality, and doesn't suffer
from many problems that other free CGI stuff suffers from.


Thanks Martien. Looks good, I will bookmark it, but meanwhile
I have implemented the flock method referenced in perldoc.
Best part is that I think I've learned something too.
 
