Timeout causes loss of Datagram Packet (UDP).

Javayang

Howdy folks,

Wondering if anyone has any experience with / solution to the following
problem:

I have a UDP client and server application. On the client side I
set a timeout value of 2000 ms on the DatagramSocket using
setSoTimeout(). It seems that every time the server sends out a packet,
if the receive() on the DatagramSocket times out, the packet that the
server sent is lost. This happens consistently, every time. I am
running these applications on Windows 2000 machines.

I know UDP is unreliable, but the consistency of the lost packet
corresponding to the timeout seems rather odd. When the
InterruptedIOException is caught (timeout), I checked the
bytesTransferred value; it is 0.
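For reference, the receive side looks roughly like this (a stripped-down
sketch only; the port number and buffer size here are made up, not from my
actual code):

    import java.io.InterruptedIOException;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;

    public class UdpClientSketch {
        public static void main(String[] args) throws Exception {
            DatagramSocket socket = new DatagramSocket(4445); // illustrative local port
            socket.setSoTimeout(2000);                        // 2000 ms receive timeout

            byte[] buf = new byte[1024];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            try {
                // Blocks until a packet arrives or the 2000 ms timeout expires.
                socket.receive(packet);
                System.out.println("Received " + packet.getLength() + " bytes");
            } catch (InterruptedIOException e) {
                // Timeout: nothing was read, so bytesTransferred is 0.
                System.out.println("receive() timed out, bytesTransferred="
                        + e.bytesTransferred);
            } finally {
                socket.close();
            }
        }
    }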

Has anyone else experienced this problem and found a workaround?

I guess I could spawn a thread that does a blocking read (0 timeout)
instead and simply use a separate timer thread to monitor how much time
has elapsed since the start of the blocking read. And if the time
expires, I can kill the thread doing the blocking read. Does that sound
reasonable?
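Something like the following is what I have in mind (just a sketch; since
there is no safe way to actually kill a thread stuck in receive(), the timer
closes the socket instead, which makes the blocking receive() throw and lets
the reader thread finish; port and buffer size are made up):

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.SocketException;
    import java.util.Timer;
    import java.util.TimerTask;

    public class BlockingReadWithWatchdog {
        public static void main(String[] args) throws Exception {
            final DatagramSocket socket = new DatagramSocket(4445); // illustrative port

            // Reader thread: blocking receive() with no SO_TIMEOUT set.
            Thread reader = new Thread(new Runnable() {
                public void run() {
                    byte[] buf = new byte[1024];
                    DatagramPacket packet = new DatagramPacket(buf, buf.length);
                    try {
                        socket.receive(packet); // blocks indefinitely
                        System.out.println("Got " + packet.getLength() + " bytes");
                    } catch (SocketException e) {
                        // Thrown when the watchdog closes the socket.
                        System.out.println("Read abandoned: " + e.getMessage());
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            });

            // Watchdog: give up after 2 seconds by closing the socket.
            Timer watchdog = new Timer(true);
            watchdog.schedule(new TimerTask() {
                public void run() {
                    socket.close();
                }
            }, 2000);

            reader.start();
            reader.join();
            watchdog.cancel();
        }
    }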

Thanks in advance for any input, insight, help.

Kent
 
Steve Horsley

Howdy folks,

Wondering if anyone has any experience with / solution to the following
problem:

I have a UDP client and server application. On the client side I
set a timeout value of 2000 ms on the DatagramSocket using
setSoTimeout(). It seems that every time the server sends out a packet,
if the receive() on the DatagramSocket times out, the packet that the
server sent is lost. This happens consistently, every time. I am
running these applications on Windows 2000 machines.

I know UDP is unreliable, but the consistency of the lost packet
corresponding to the timeout seems rather odd. When the
InterruptedIOException is caught (timeout), I checked the
bytesTransferred value; it is 0.

That sounds chicken-and-egg to me. Getting the exception means that
we waited for 2 seconds, and nothing arrived during that time. That may
well be because the packet got lost somehow. So it's not that the
exception causes the lost packet, it's that the lost packet causes the
exception.

Or it might be that you don't wait long enough (if the server is slow),
in which case the exception IS the cause of the lost packet, because it
causes you to give up waiting.

Has anyone else experienced this problem and found a workaround?

I guess I could spawn a thread that does a blocking read (0 timeout)
instead and simply use a separate timer thread to monitor how much time
has elapsed since the start of the blocking read. And if the time
expires, I can kill the thread doing the blocking read. Does that sound
reasonable?

Sounds reasonable, although possibly not necessary. I would certainly use
a different thread from the GUI event thread, to avoid frozen pauses. And
since you are using UDP, I would implement a retransmission timer and try
several times if I don't get an answer the first time.
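For instance, something along these lines (only a sketch; the host, port,
retry count and timeout are made up):

    import java.io.InterruptedIOException;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    public class RetryingUdpRequest {
        public static void main(String[] args) throws Exception {
            InetAddress server = InetAddress.getByName("localhost"); // illustrative
            int port = 4445;                                         // illustrative
            byte[] request = "ping".getBytes();

            DatagramSocket socket = new DatagramSocket();
            socket.setSoTimeout(2000); // per-attempt timeout

            try {
                for (int attempt = 1; attempt <= 3; attempt++) {
                    // (Re)send the request, then wait up to 2000 ms for a reply.
                    socket.send(new DatagramPacket(request, request.length, server, port));
                    try {
                        byte[] buf = new byte[1024];
                        DatagramPacket reply = new DatagramPacket(buf, buf.length);
                        socket.receive(reply);
                        System.out.println("Reply received on attempt " + attempt);
                        return;
                    } catch (InterruptedIOException e) {
                        // No answer this time; go round the loop and retransmit.
                        System.out.println("Attempt " + attempt + " timed out, retrying");
                    }
                }
                System.out.println("Gave up after 3 attempts");
            } finally {
                socket.close();
            }
        }
    }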

Steve
 
