John Nagle
I have an application which uses feedparser (Python 2.6, of course),
on a Linux laptop. If the network connection is up at startup, it
works fine. If the network connection is down, feedparser reports
a network error. That's fine.
However, if the network, including access to DNS, is down
when the application starts but comes up later, the program
keeps reporting a network error.
So, somewhere in the stack, someone is probably caching something.
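(One way to narrow that down would be to test the resolver alone from the
same long-running process and see whether a bare getaddrinfo starts
succeeding once the network is back while feedparser keeps failing.
Just a sketch, not something already in the program; the hostname
extraction and port 80 are assumptions for the test:

    import socket
    import urlparse

    def dns_ok(url) :
        #   Diagnostic only: does the URL's host resolve right now?
        host = urlparse.urlparse(url).hostname
        try :
            socket.getaddrinfo(host, 80)
            return True
        except socket.gaierror :
            return False
)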
My own code looks like this:
    def fetchitems(self) :                    # fetch more items from feed source
        try :                                 # try fetching
            now = time.time()                 # timestamp
            #   fetch from URL
            d = feedparser.parse(self.url, etag=self.etag,
                modified=self.modified)
            #   if network failure
            if d is None or not hasattr(d, "status") :
                raise IOError("of network or news source failure")
            if d.status == 304 :              # if no new items
                self.logger.debug("Feed polled, no changes.")
                return                        # nothing to do
            self.logger.debug("Read feed: %d entries, status %s" %
                (len(d.entries), d.status))
            if d.status != 200 :              # if bad status
                raise IOError("of connection error No. %d" % (d.status,))
            ...
The exception that gets raised is the IOError("of network or news
source failure") one, i.e. feedparser.parse comes back with no
"status" attribute.
Looking in feedparser.py, "parse" calls "_open_resource",
which, after much fooling around, builds a urllib2 request,
builds an "opener" via urllib2, and calls its "open" method.
So I'm not seeing any state that should persist from call
to call. What am I missing?
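(For comparison, a bare urllib2 fetch that builds a fresh opener on every
call, much as _open_resource seems to, would look roughly like the sketch
below. If this also keeps failing after the network comes back, the stale
state is below feedparser, e.g. in the resolver or a proxy setting picked
up from the environment; if it recovers while feedparser.parse does not,
the state is inside feedparser. Sketch only, not feedparser's actual code:

    import urllib2

    def direct_fetch(url) :
        #   Diagnostic only: fetch with a fresh opener each call,
        #   so no opener state is shared between calls.
        opener = urllib2.build_opener()
        try :
            f = opener.open(url)
            data = f.read()
            f.close()
            return len(data)
        except urllib2.URLError, e :          # DNS failures arrive as URLError
            return e
)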
John Nagle