pantagruel
Hi,
I am wondering whether this is a problem with the Squid setup on my network or a problem with Java. I have the following code:
import java.net.*;
import java.io.*;

public class WebSiteReader {
    public static void main(String[] args) {
        String nextLine;
        URL url = null;
        URLConnection urlConn = null;
        InputStreamReader inStream = null;
        BufferedReader buff = null;
        try {
            // Create the URL object that points to the default file index.html
            System.setProperty("http.proxyHost", "the proxy server");
            System.setProperty("http.proxyPort", "80");
            url = new URL("some URL to read in");
            System.out.println("opening connection");
            urlConn = url.openConnection();
            System.out.println("connection opened");
            inStream = new InputStreamReader(urlConn.getInputStream());
            buff = new BufferedReader(inStream);
            System.out.println("before while");
            // Read and print the lines from index.html
            while (true) {
                nextLine = buff.readLine();
                if (nextLine != null) {
                    System.out.println(nextLine);
                } else {
                    break;
                }
            }
        } catch (MalformedURLException e) {
            System.out.println("Please check the URL: " + e.toString());
        } catch (IOException e1) {
            System.out.println("Can't read from the Internet: " + e1.toString());
        }
    }
}
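To see what the proxy actually sends back, rather than guessing from the exception text, I could cast the connection to HttpURLConnection and print the status line. This is a minimal sketch with my proxy details left as placeholders:

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class ResponseInspector {
    // Returns a short status summary for a URL, or the exception text.
    // Only main() below actually touches the network.
    static String describe(String spec) {
        try {
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(spec).openConnection();
            int code = conn.getResponseCode();  // e.g. 200, 404, 502
            return code + " " + conn.getResponseMessage();
        } catch (IOException e) {
            return "I/O error: " + e;
        }
    }

    public static void main(String[] args) {
        // Placeholder proxy settings -- substitute the real host/port.
        System.setProperty("http.proxyHost", "the proxy server");
        System.setProperty("http.proxyPort", "80");
        System.out.println(describe("http://www.google.com/"));
    }
}
```

If the proxy is answering with an error page, this prints the real status code instead of hiding it behind an exception.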
If I put in a URL that is a top-level domain, for example http://www.google.com, it returns the Apache directory-listing page, whatever that's called, the one that says "Index of /", / being the path. (This is on a primarily Windows network, but there are probably some Linux setups on it that I don't know of; at any rate, the Apache server is running on Ubuntu, probably in a VM somewhere.)
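Since a bare http://www.google.com comes back as a local Apache listing, one quick check (just a sketch, nothing specific to my setup) would be to print what the hostname resolves to from Java; a LAN address here would mean DNS or the proxy is redirecting the name locally:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class DnsCheck {
    public static void main(String[] args) {
        try {
            InetAddress addr = InetAddress.getByName("www.google.com");
            System.out.println(addr.getHostAddress());
            // A 10.x.x.x or 192.168.x.x answer would mean the name is
            // being answered by something on the local network.
            System.out.println("site-local? " + addr.isSiteLocalAddress());
        } catch (UnknownHostException e) {
            System.out.println("cannot resolve: " + e);
        }
    }
}
```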
If I try any URL that is not a top-level one, for example
http://www.google.com/ig?hl=en&esrch=BetaShortcuts&btnG=Search
I get a java.io.FileNotFoundException on the URL.
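As far as I know, HttpURLConnection throws java.io.FileNotFoundException whenever the server (or the proxy) answers with a 404, so the exception doesn't necessarily mean Java mangled the URL. A quick sketch to rule out the URL class itself:

```java
import java.net.MalformedURLException;
import java.net.URL;

public class UrlParseCheck {
    public static void main(String[] args) throws MalformedURLException {
        URL url = new URL(
            "http://www.google.com/ig?hl=en&esrch=BetaShortcuts&btnG=Search");
        // If these print the pieces correctly, the URL class is not at
        // fault and the 404 is coming from the proxy or the server.
        System.out.println(url.getHost());   // www.google.com
        System.out.println(url.getPath());   // /ig
        System.out.println(url.getQuery());  // hl=en&esrch=BetaShortcuts&btnG=Search
    }
}
```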
Now, if I try any URL in a browser, it goes straight through, of course. If I try any URL in curl (not just top-level domains), it tells me it can't connect to the host; if I run curl with the same proxy configuration I set in my Java code above, it fetches all URLs correctly.
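Since curl only works when told about the proxy explicitly, maybe I should take the system properties out of the picture and hand the proxy straight to the connection. A sketch with the same placeholder host and port:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.net.URL;
import java.net.URLConnection;

public class ExplicitProxy {
    // Builds an HTTP proxy descriptor; host/port are placeholders.
    static Proxy makeProxy(String host, int port) {
        return new Proxy(Proxy.Type.HTTP, new InetSocketAddress(host, port));
    }

    public static void main(String[] args) throws IOException {
        Proxy proxy = makeProxy("the proxy server", 80);
        URL url = new URL("http://www.google.com/");
        // openConnection(Proxy) bypasses http.proxyHost/http.proxyPort
        // entirely, which helps isolate property-handling problems.
        URLConnection conn = url.openConnection(proxy);
        System.out.println("using " + proxy);
        // conn.getInputStream() would then go through the given proxy.
    }
}
```

This is the closest Java analogue to passing the proxy on the curl command line.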
So, is there a setting I should change in my Java code, or is this something that should be fixed in the proxy configuration on the server? Can anyone point to something that would cause this, or think of a way to track the problem down? I don't want to complain to the admin unless it's absolutely necessary.
Thanks.