Hi All,
I am looking for simple C/C++ web crawler code.
It should be very simple, with minimal functionality.
You will have to consult the newsgroup of the platform you wish the web
crawler to run on.
For example:
- DOS: BIOS, memory models, interrupts, screen handling, hardware
- MS/Windows: mice, DLLs, hardware
- MS 32-bit API
- OS/2 programming
- Macintosh programming
- General Unix: processes, pipes, POSIX, curses, sockets
- news:comp.unix.[vendor]: various Unix vendors
- Linux application programming
I am particularly interested in the code to grab the content of a URL
and the code to search this content for other URLs.
Retrieving URLs is not possible in standard C and needs a third-party
library (or a set of platform-specific wrapper functions), but it is
on-topic to discuss how to extract URLs from the retrieved content.
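To illustrate the "wrapper functions" route, here is a rough sketch that assumes a POSIX socket API (getaddrinfo/socket/connect - not standard C, but widely available). It issues a bare HTTP/1.0 GET and dumps the response; redirects, chunked encoding, HTTPS, and most error handling are all deliberately omitted:

```c
#include <netdb.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Format a minimal HTTP/1.0 GET request into `out`. */
static int build_request(char *out, size_t outlen,
                         const char *host, const char *path)
{
    return snprintf(out, outlen,
                    "GET %s HTTP/1.0\r\nHost: %s\r\n\r\n", path, host);
}

/* Connect to `host` on port 80, send the request, and write the raw
 * response (headers plus body) to stdout.  Returns 0 on success. */
static int fetch(const char *host, const char *path)
{
    char req[1024], buf[4096];
    struct addrinfo hints, *res;
    int fd;
    ssize_t n;

    memset(&hints, 0, sizeof hints);
    hints.ai_socktype = SOCK_STREAM;
    if (getaddrinfo(host, "80", &hints, &res) != 0)
        return -1;
    fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0) {
        freeaddrinfo(res);
        return -1;
    }
    freeaddrinfo(res);
    build_request(req, sizeof req, host, path);
    send(fd, req, strlen(req), 0);
    while ((n = recv(fd, buf, sizeof buf, 0)) > 0)
        fwrite(buf, 1, (size_t)n, stdout);
    close(fd);
    return 0;
}
```

Call something like fetch("example.com", "/") from your own main. In practice a library such as libcurl does all of this (and HTTPS) for you, which is why the usual advice is to use one.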
The simplest way to detect a URL is to call char *strstr( const char
*string, const char *strCharSet ), the first parameter being a line of the
file being scanned, and the second parameter being a simple "http://". If
you want to do a case-insensitive search, you will have to roll your own
function: "stristr" is not standard C and exists only as a non-standard
extension on some implementations.
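As a sketch of that approach - the name my_stristr is my own, since no standard function exists:

```c
#include <ctype.h>
#include <stddef.h>
#include <string.h>

/* Case-insensitive strstr.  Returns a pointer into `haystack` at the
 * first match of `needle`, or NULL if there is none. */
static char *my_stristr(const char *haystack, const char *needle)
{
    size_t nlen = strlen(needle);
    for (; *haystack != '\0'; haystack++) {
        size_t i;
        for (i = 0; i < nlen; i++) {
            if (tolower((unsigned char)haystack[i]) !=
                tolower((unsigned char)needle[i]))
                break;
        }
        if (i == nlen)
            return (char *)haystack;
    }
    return nlen == 0 ? (char *)haystack : NULL;
}

/* Find the first URL in a line by matching "http://" without regard
 * to case (extend with "https://", "ftp://", ... as needed). */
static char *find_url(const char *line)
{
    return my_stristr(line, "http://");
}
```

A case-sensitive-only crawler would miss links written as "HTTP://..." or "Http://...", which do occur in real pages, so the extra function is worth having.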
I can't tell you how to detect the end of the URL - you will have to
take a look at the RFCs for that (RFC 3986 defines the generic URI syntax).
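A crude but common approximation, rather than the full RFC grammar: accept the characters the URI syntax permits and stop at the first one it does not (whitespace, quotes, '<', '>', ...), then trim trailing '.' or ',', which is usually sentence punctuation rather than part of the URL:

```c
#include <stddef.h>
#include <string.h>

/* Given a pointer to the start of a URL, return the length of the
 * run of URI-permitted characters, with trailing sentence
 * punctuation ('.' or ',') trimmed off.  This is an approximation,
 * not a full RFC 3986 parser. */
static size_t url_span(const char *start)
{
    static const char allowed[] =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"
        "0123456789-._~:/?#[]@!$&'()*+,;=%";
    size_t len = strspn(start, allowed);

    while (len > 0 && (start[len - 1] == '.' || start[len - 1] == ','))
        len--;
    return len;
}
```

Combined with the strstr search above it gives you (start, length) pairs for each candidate URL, which is enough for a minimal crawler's link extraction.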