alexteo21
I have created a Python script that batch-processes data
files every hour.
The script runs on Solaris, Python version 2.3.3:
t = open(filename, 'rb')
data = t.read()
# processing data...
t.close()
The script works fine on the day of execution and processes
the data files every hour. However, processing fails one day
later, i.e., once the date has incremented by 1:
Traceback (most recent call last):
File "./alexCopy.py", line 459, in processRequestModule
sanityTestSteps(reqId,model)
File "./alexCopy.py", line 699, in sanityTestSteps
t = open(filename, 'rb')
IOError: [Errno 24] Too many open files:
I have explicitly closed the file. Is there something else I need
to do?
Appreciate your comments
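One likely cause, assuming the snippet above runs inside the hourly loop: if the processing step raises an exception before `t.close()` is reached, the file descriptor leaks, and after enough hours the process hits its descriptor limit (Errno 24). A `try`/`finally` (available in Python 2.3) guarantees the close regardless of exceptions. A minimal sketch, using a temporary file as a hypothetical stand-in for one of the data files:

```python
import os
import tempfile

# Hypothetical stand-in for one of the hourly data files.
fd, filename = tempfile.mkstemp()
os.write(fd, b'sample data')
os.close(fd)

t = open(filename, 'rb')
try:
    data = t.read()
    # processing data... (may raise)
finally:
    t.close()  # runs even if processing raised, so the descriptor is released

os.remove(filename)
```

If the close is already guaranteed, the leak may come from other handles (sockets, pipes, or files opened elsewhere in the loop); `pfiles <pid>` on Solaris lists the descriptors the process currently holds.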