Christoph Kukulies
Sorry, it's a bit off topic, but I don't know which newsgroup to turn to.
I'm using libfetch under FreeBSD and
http_fetcher (http://http-fetcher.sourceforge.net/) under Linux, since there
is no libfetch there (unfortunately, as libfetch is richer in features and
less cumbersome than http_fetcher). Anyway, my website runs a CGI
program which in turn fetches a URL.
This fetch fails strangely when I access certain websites,
e.g.:
#include <stdio.h>
#include <stdlib.h>
#include "http_fetcher.h"

int main(int argc, char **argv)
{
    char *buf;          /* http_fetch allocates this buffer */
    int groesse;        /* number of bytes fetched */

    if (argc < 2) {
        fprintf(stderr, "usage: %s url\n", argv[0]);
        exit(1);
    }
    if (http_setUserAgent(NULL) != 0) {
        http_perror(http_strerror());
        exit(3);
    }
    /* pass the *address* of the buffer pointer,
       not an uninitialized char ** */
    groesse = http_fetch(argv[1], &buf);
    fprintf(stderr, "%s, %d\n", argv[1], groesse);
    if (groesse <= 0) {
        http_perror(http_strerror());
        exit(2);
    }
    free(buf);
    return 0;
}
When I pass the following URL to this program,
it times out after a couple of minutes without an error message,
and without writing anything to the output
(it seems to take another exit path).
Does anyone know of other libraries to fetch URLs?
(I need to do this in C, not in a scripting language.)
Again, sorry for the OT.