Javayang
Howdy folks,
Wondering if anyone has any experience with, or a solution to, the following
problem:
I have a UDP client and server application. On the client side I set a
timeout of 2000 ms on the DatagramSocket using setSoTimeout(). It seems that
every time the server sends out a packet, if the receive() on the
DatagramSocket times out, the packet the server sent is lost. This happens
consistently, every time. I am running these applications on Windows 2000
machines.
I know UDP is unreliable, but the consistency of the lost packet
corresponding to the timeout seems rather odd. When the
InterruptedIOException is caught (timeout), I checked the
bytesTransferred value; it is 0.
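For reference, here is roughly what the client-side receive loop looks like. This is just a minimal sketch of the setup described above; the local port, buffer size, and the send side of the exchange with the server are made up / omitted for illustration.

import java.io.InterruptedIOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class UdpClientSketch {
    public static void main(String[] args) throws Exception {
        DatagramSocket socket = new DatagramSocket(5000); // hypothetical local port
        socket.setSoTimeout(2000);                        // 2000 ms receive timeout

        byte[] buf = new byte[1024];
        DatagramPacket packet = new DatagramPacket(buf, buf.length);

        while (true) {
            try {
                packet.setLength(buf.length);   // reset length before reusing the packet
                socket.receive(packet);         // blocks for up to 2000 ms
                System.out.println("Received " + packet.getLength() + " bytes");
            } catch (InterruptedIOException e) {
                // Timeout: no datagram arrived within 2000 ms.
                // bytesTransferred is 0 here, matching the behaviour described.
                System.out.println("Timed out, bytesTransferred=" + e.bytesTransferred);
            }
        }
    }
}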
Has anyone else experienced this problem and found a workaround?
I guess I could instead spawn a thread that does a blocking read (0 timeout)
and use a separate timer thread to monitor how much time has elapsed since
the start of the blocking read. If the time expires, I can kill the thread
doing the blocking read. Does that sound reasonable?
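Something along these lines is what I have in mind. This is only a sketch of the idea: rather than killing the reader thread outright (Thread.stop is deprecated), it closes the socket from the waiting thread to unblock the receive(), and the local port and buffer size are again made up for illustration.

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.SocketException;

public class BlockingReadWithTimer {
    public static void main(String[] args) throws Exception {
        final DatagramSocket socket = new DatagramSocket(5000); // hypothetical local port
        final byte[] buf = new byte[1024];
        final DatagramPacket packet = new DatagramPacket(buf, buf.length);

        // Reader thread does a blocking receive (no SO_TIMEOUT set).
        Thread reader = new Thread(new Runnable() {
            public void run() {
                try {
                    socket.receive(packet);   // blocks indefinitely
                    System.out.println("Received " + packet.getLength() + " bytes");
                } catch (SocketException e) {
                    // Thrown when the other thread closes the socket to cancel the read.
                    System.out.println("Read cancelled: " + e.getMessage());
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        });

        reader.start();

        // Acts as the timer: wait up to 2000 ms for the reader to finish.
        reader.join(2000);
        if (reader.isAlive()) {
            // Closing the socket unblocks the pending receive() in the reader thread.
            socket.close();
            reader.join();
        }
    }
}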
Thanks in advance for any input, insight, or help.
Kent