Gilles Ganault
Hello
I'm using urllib2 to download web pages. The strange thing in the code
below is that urllib2.urlopen seems to retry indefinitely by itself
instead of raising an exception:
=====
import socket
import sys
import time
import urllib2

headers = {'User-Agent': 'Mozilla/5.0'}  # placeholder; the real script sets this elsewhere

timeout = 30
socket.setdefaulttimeout(timeout)
i = 0
while i < 5:
    try:
        url = 'http://www.acme.com'
        print url
        req = urllib2.Request(url, None, headers)
        response = urllib2.urlopen(req).read()
    except:
        # Never called :-/
        print "Timed-out."
        if i == 4:
            print "Exiting."
            connection.close(True)  # 'connection' is set up elsewhere in the script
            sys.exit()
        else:
            print "Trying again"
            i = i + 1
            time.sleep(10)
            continue
=====
I haven't found a switch within urllib2 that would tell it to raise an
exception after it times out trying to download a web page. Any idea
how to have it stop trying after 5 tries?
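For reference, here is a minimal, self-contained sketch of what I'd
expect to work, assuming a timed-out request raises urllib2.URLError
or socket.timeout (the User-Agent header is just a placeholder):
=====
import socket
import sys
import time
import urllib2

# Placeholder headers; the real script builds its own.
headers = {'User-Agent': 'Mozilla/5.0'}
url = 'http://www.acme.com'

socket.setdefaulttimeout(30)

for attempt in range(5):
    try:
        req = urllib2.Request(url, None, headers)
        response = urllib2.urlopen(req).read()
        break  # got the page, stop retrying
    except (urllib2.URLError, socket.timeout):
        print "Timed-out."
        if attempt == 4:
            print "Exiting."
            sys.exit(1)
        print "Trying again"
        time.sleep(10)
=====
Is catching urllib2.URLError the right approach here?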
Thank you.