Ray Chen
Hi,
I am currently working with some code for fetching webpages, and I
have run into a problem: the current implementation does not fetch
webpages with an underscore in the subdomain, e.g. http://a_b.google.com.
I have poked around the forum posts and read that an underscore in the
subdomain violates the relevant RFC, but in my case it is necessary to
retrieve those pages regardless. Before I dig further into this code,
which I inherited, has anyone successfully retrieved such pages?
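
For reference, this is the failure I am seeing. A minimal
reproduction (on the Ruby versions affected, URI.parse raises before
Net::HTTP is ever involved):

  require 'uri'

  # On affected Ruby versions this raises URI::InvalidURIError,
  # because the default parser's hostname grammar does not allow "_".
  URI.parse('http://a_b.google.com')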
The code uses URI.parse for URI parsing and Net::HTTP for page
retrieval, and it currently breaks at the URI.parse call. Will it
suffice just to rewrite the URI.parse step, or do I need to find an
alternative to Net::HTTP as well?
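
For concreteness, here is the kind of rewrite I have in mind. It is
only a sketch: it assumes Net::HTTP.start accepts the host as a plain
string without re-validating it, and the regexp is a simplistic
stand-in for real URL parsing (http only, optional port and path):

  require 'net/http'

  def fetch(url)
    # Split the URL by hand so URI.parse never sees the underscore.
    m = url.match(%r{\Ahttp://([^/:]+)(?::(\d+))?(/.*)?\z}) or
      raise ArgumentError, "unsupported URL: #{url}"
    host, port, path = m[1], (m[2] || 80).to_i, (m[3] || '/')

    # Net::HTTP.start takes host and port directly as strings/ints,
    # so no RFC hostname check happens here.
    Net::HTTP.start(host, port) { |http| http.get(path) }
  end

  response = fetch('http://a_b.google.com/')
  puts response.code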
Thanks in advance.