library for http/ftp requests

A.Leopold

hi,

which is a good library to work with
to enable downloading files / http pages in my project?

thanks,

leo
 
maverik

> hi,
>
> which is a good library to work with
> to enable downloading files / http pages in my project?

There is no library (as part of the C++ language or the C++ standard)
that provides downloading of files, HTTP pages, and so on.
 
Thomas J. Gritzan

Sam said:
> The only one I know of is W3C's libwww, but it's very old and is not
> actively maintained.

There's curl / libcurl, which is easy to use and seems to be actively
maintained.
 
jason.cipriani

> hi,
>
> which is a good library to work with
> to enable downloading files / http pages in my project?

I'm personally a big fan of curl. However, check out this page:

http://curl.haxx.se/libcurl/competitors.html

It contains a good list of alternatives. The list is, of course, a bit
biased towards curl, but just ignore the spin. I do like curl, though.
Releases are available for many platforms on their download page:
Releases are available for many platforms on their download page:

http://curl.haxx.se/download.html

Also you may have platform-specific options. For example, if you are
using C++ Builder (or RAD Studio), the VCL comes with some HTTP client
components. If ActiveX is an option, google for "activex http client",
a number of components come up.

HTH,
Jason
 
jason.cipriani

> The only one I know of is W3C's libwww, but it's very old and is not
> actively maintained.
>
> However, http is not rocket science. Neither is ftp. Both are fairly simple
> protocols to implement from the client side. If you start now, by this time
> tomorrow you should have bare-bones, but working, http and ftp client
> code that you can proceed with.

There are enough actively maintained and functional HTTP clients, with
support for non-trivial features such as compression, SSL, etc., out
there that reinventing the wheel here is not a good solution.

And while it may not be rocket science, per se, if you want to make a
conforming client, you have a 176-page standard to work through:

ftp://ftp.rfc-editor.org/in-notes/rfc2616.txt

Jason
 
jason.cipriani

> If all you need is a way to grab a document given its URL, using HTTP, you
> do not need to waste time coding support for byte ranges, chunked encoding,
> or 95% of the stuff described in that document.

That's incorrect if the URL is arbitrary, of course. You *could* leave
out all of those features if, say, you had full control over the HTTP
server, or you were able to safely make assumptions about how the
server will deliver a response (or if it doesn't matter when those
assumptions fail). Even then, grabbing curl and using it gives you a
full-featured HTTP client in *less* than a few dozen lines of code,
and you don't need any knowledge of networking to do it (perhaps *you*
have no problem putting that together, but reinventing wheels demands
a level of knowledge that isn't required when using code somebody else
is responsible for). That's certainly more desirable than a client
hacked together in haste, and probably poorly.

In any case, the OP was specifically looking for a library to solve
the problem. Advising a programmer to use their own poor
reimplementation of something that has been done dozens of times
already, and is maintained by a community of other programmers, is
never good advice.

Jason
 
jason.cipriani

The HTTP server has no say in it. Perhaps you are not familiar with HTTP. If
the client does not indicate that it is capable of accepting HTTP/1.1
chunked encoding, or byte ranges, it's not going to get them from the
server.


A response from a compliant HTTP server that uses an easily parsable minimum
subset of HTTP is guaranteed, if one knows what he's doing. The reverse is
also true. I don't know if you did that yourself, but I have implemented a
bare-bones HTTP client, and server, that had no interoperability problems
whatsoever with peers that supposedly implemented the full HTTP/1.1 suite
(for example, Java SOAP libraries, among others).

The notion that one cannot reliably implement an interoperable, but simple,
HTTP client or server without the aid of some library is absurd.


Only if you think that fork()+exec() is sufficient for a robust
implementation.


As if it was some kind of a bad thing.


Perhaps, but it's not rocket science.


The management of an 800-lb Wall Street gorilla would disagree with
that assessment.


That's rather arrogant of you, taking it for granted that the OP's
reimplementation is guaranteed to turn out "poor".

But I do agree that, over the last couple of years, there's been a noticeable
trend of people demanding pre-digested food, because chewing their own is
just too darn difficult for them, even if the results go down smoother and
provide better nutrition.



The fundamental flaw that invalidates every portion of what you just
wrote is that if I knew your full name, I'd never consider hiring you
to work on a project knowing how much of your time and my money you
wasted reinventing the wheel. End of story.
 
