Hi,
I am trying to fetch some URLs using LWP, but I keep getting a read
timeout error at times.
Here is what I am doing:
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Request;
use Data::Dumper;

foreach my $url (@urls) {
    my $ua = LWP::UserAgent->new(
        timeout => 300,
        agent   => 'Mozilla/5.0',
    );
    my $req = HTTP::Request->new(POST => $url);
    $req->content("...");
    my $res = $ua->request($req);
    my $content;
    if ($res->is_success) {
        $content = $res->content;
    }
    else {
        print "Request : ",  Dumper $req;
        print "Response : ", Dumper $res;
        die "Could not get content : Message : ", $res->message;
    }
}
The loop runs fine for 10-15 URLs and then dies with the error:
read timeout at d:\perllib\URLBatch.pm
It doesn't die consistently at the same URL. If I view the failing URL
in a browser, it loads fine, well within the 300-second timeout.
I can't figure out what I am doing wrong.
Any pointers would be highly appreciated.
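In case it helps frame the question: since the dumped response below shows LWP reporting its own timeout as a client-generated 500 with the message 'read timeout', I have been considering a retry wrapper as a workaround. This is only a sketch of what I mean; the helper name `fetch_with_retries` and the retry counts are my own invention, not anything from LWP:

```perl
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Request;

# Hypothetical helper: retry a POST a few times before giving up,
# sleeping between attempts so a momentarily slow server can recover.
sub fetch_with_retries {
    my ($ua, $url, $body, $tries) = @_;
    $tries ||= 3;
    my $res;
    for my $attempt (1 .. $tries) {
        my $req = HTTP::Request->new(POST => $url);
        $req->content($body);
        $res = $ua->request($req);
        return $res if $res->is_success;
        # LWP labels its own client-side timeouts "read timeout"
        # (as in the Dumper output below); anything else looks like
        # a real server error, so stop retrying.
        last unless $res->message =~ /read timeout/i;
        sleep(2 * $attempt);    # simple linear backoff
    }
    return $res;
}

# Usage inside the loop would then be something like:
#   my $res = fetch_with_retries($ua, $url, "ProjectID=111&Command=GetDetails");
```

Of course this only papers over the timeouts rather than explaining them, so I would still like to understand the underlying cause.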
The output from the Dumper statements is below.
Thanks,
-Pradeep
$VAR1 = bless( {
    '_method'  => 'POST',
    '_headers' => bless( {
        'user-agent' => 'Mozilla/5.0'
    }, 'HTTP::Headers' ),
    '_uri' => bless( do{\(my $o = 'http://xyz.com/URLBatchFactory.cgi')}, 'URI::http' ),
    '_content' => 'ProjectID=111&Command=GetDetails'
}, 'HTTP::Request' );

$VAR1 = bless( {
    '_request' => bless( {
        '_method'  => 'POST',
        '_headers' => bless( {
            'user-agent' => 'Mozilla/5.0'
        }, 'HTTP::Headers' ),
        '_uri' => bless( do{\(my $o = 'http://xyz.com/URLBatchFactory.cgi')}, 'URI::http' ),
        '_content' => 'ProjectID=209Command=GetDetails'
    }, 'HTTP::Request' ),
    '_headers' => bless( {
        'client-date' => 'Mon, 23 Feb 2004 11:18:20 GMT'
    }, 'HTTP::Headers' ),
    '_msg' => 'read timeout',
    '_rc' => 500,
    '_content' => ''
}, 'HTTP::Response' );