Christoph Boget
I'm experiencing a very odd problem, and it's happening only in IE6; IE7,
Safari, Opera and Firefox all work properly. I'm using XHR request
responses to update the page's DOM, and there appears to be a conflict
and/or race condition that locks the browser down for several minutes
until one of the requests times out. A huge red flag stands out to me in
the XHR object documentation
(http://msdn.microsoft.com/en-us/library/ms536648(VS.85).aspx):
----------------------------
XMLHttpRequest.open(sMethod, sUrl [, bAsync] [, sUser] [, sPassword])
Performance Note When bAsync is set to false, send operations are
synchronous, and Windows Internet Explorer does not accept input or produce
output while send operations are in progress. Therefore, this setting should
not be used in situations where it is possible for a user to be waiting on
the send operation to complete.
----------------------------
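(For clarity, the pattern this note warns about is exactly what the code
below does; url here is just a placeholder:)

    xmlhttp.open("GET", url, false);  // bAsync = false: IE accepts no input, produces no output
    xmlhttp.send(null);               // returns only once the response has arrived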
What we are seeing is that the DOM update from the first request doesn't
seem to complete (inasmuch as the new data is not presented to the user
onscreen) before subsequent XHR requests are sent out. So my guess is that
because IE is prevented from rendering the DOM update from the first
request, it has issues rendering the DOM updates from subsequent requests.
Does that sound right to you? It doesn't to me. And the issue seems to
present itself only after the first DOM update, as the code below will
illustrate.
So consider the following. This is the function I'm using to get an XHR
object; it works fine in and of itself in every browser I've tested.
function getXHRObject()
{
    var oXMLHttp = null;
    try
    {
        // Native object first (IE7, Firefox, Safari, Opera).
        oXMLHttp = new XMLHttpRequest();
        // XHR is our namespace object, declared elsewhere (abbreviated
        // out of this snippet). Memoizing here lets later calls skip
        // the feature detection.
        XHR.getXHRObject = function()
        {
            return new XMLHttpRequest();
        };
    }
    catch(e)
    {
        // Fall back to the ActiveX versions for IE6 and older.
        var aMSxml = ['MSXML2.XMLHTTP.3.0', 'MSXML2.XMLHTTP',
                      'Microsoft.XMLHTTP'];
        for (var nI = 0; nI < aMSxml.length; nI++)
        {
            try
            {
                oXMLHttp = new ActiveXObject(aMSxml[nI]);
                // Memoize the first ProgID that works.
                XHR.getXHRObject = function()
                {
                    return new ActiveXObject(aMSxml[nI]);
                };
                break;
            }
            catch(e)
            {
            }
        }
    }
    return oXMLHttp;
}
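(Usage sketch only: the point of the self-replacing assignment is that the
first call pays for the feature detection, and every later call through
XHR.getXHRObject goes straight to the cached constructor.)

    var req1 = getXHRObject();      // first call: runs the detection above
    var req2 = XHR.getXHRObject();  // later calls: memoized, no try/catch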
Here is the main process that is causing the lockup; a few things have
been abbreviated, and only minimal error checking is included here...
dojo.io.queueBind( {
    url: pageUrl,
    handler: function(type, data, evt)
    {
        try
        {
            var childDiv = dojo.byId(divId);
            childDiv.innerHTML = data;
            var head = document.getElementsByTagName("head")[0];
            aJavascriptIncludes = // Regex here to pull out all the
                                  // javascript src= files from the
                                  // data variable
            for (var idx = 0; idx < aJavascriptIncludes.length; ++idx)
            {
                var scriptElement = document.createElement('SCRIPT');
                var current = aJavascriptIncludes[idx];
                // Pull the source.
                var src = current.match(/src=\"(.*?)\"/i);
                if (src != null)
                {
                    var xmlhttp = getXHRObject();
                    if (xmlhttp === null)
                    {
                        alert("Browser does not support asynchronous HTTP requests");
                    }
                    else
                    {
                        // Make the call (synchronously); src[1] is the
                        // captured URL.
                        xmlhttp.open("GET", src[1], false);
                        xmlhttp.send(null);
                        scriptElement.text = xmlhttp.responseText;
                    }
                }
                // Append to the DOM.
                head.appendChild(scriptElement);
            }
        }
        catch (e)
        {
            alert('Error Data received:\n' + data);
        }
    }
} );
Please note, and let me stress, that all of the above works perfectly fine
in all the browsers mentioned above. The only browser that seems to have an
issue with it is IE6.
It seems to me that IE6 is locking up somewhere between this line:
childDiv.innerHTML = data;
and these lines
xmlhttp.open("GET", src[1], false);
xmlhttp.send(null);
scriptElement.text = xmlhttp.responseText;
....
head.appendChild(scriptElement);
Because the synchronous xmlhttp.send() prevents IE6 from producing output,
the browser never gets the opportunity to finish rendering the childDiv's
new innerHTML, and it then locks up when it tries to append the
scriptElement to the DOM.
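To make that theory concrete: if the fetches were asynchronous, IE should
be free to paint while each request is in flight. A rough sketch of what I
mean (loadScriptAsync is a hypothetical helper name, not something in our
code):

    function loadScriptAsync(src, onDone)
    {
        var req = getXHRObject();
        req.onreadystatechange = function()
        {
            if (req.readyState == 4)   // 4 = request complete
            {
                var el = document.createElement('SCRIPT');
                el.text = req.responseText;
                document.getElementsByTagName("head")[0].appendChild(el);
                if (onDone) onDone();
            }
        };
        req.open("GET", src, true);    // true = asynchronous
        req.send(null);
    }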
One thing I tried (while verifying that it was requesting the proper JS
files) was to issue an alert() for each file. When I did that, the entire
process worked fine and IE6 did not lock up. So I thought that perhaps I
just needed a delay between each send() request. After much
experimentation and testing, it turned out that I only needed a delay
after the innerHTML of the childDiv was updated. When I changed the above
code to the following:
dojo.io.queueBind( {
    url: pageUrl,
    handler: function(type, data, evt)
    {
        try
        {
            var childDiv = dojo.byId(divId);
            childDiv.innerHTML = data;
            var head = document.getElementsByTagName("head")[0];
            // Defer the script fetches so IE6 can finish rendering the
            // innerHTML update first.
            var pauseProcess = setTimeout( function()
            {
                var dataInfo = data;
                aJavascriptIncludes = // Regex here to pull out all the
                                      // javascript src= files from the
                                      // dataInfo variable
                for (var idx = 0; idx < aJavascriptIncludes.length; ++idx)
                {
                    var scriptElement = document.createElement('SCRIPT');
                    var current = aJavascriptIncludes[idx];
                    // Pull the source.
                    var src = current.match(/src=\"(.*?)\"/i);
                    if (src != null)
                    {
                        var xmlhttp = getXHRObject();
                        if (xmlhttp === null)
                        {
                            alert("Browser does not support asynchronous HTTP requests");
                        }
                        else
                        {
                            // Make the call (synchronously); src[1] is
                            // the captured URL.
                            xmlhttp.open("GET", src[1], false);
                            xmlhttp.send(null);
                            scriptElement.text = xmlhttp.responseText;
                        }
                    }
                    // Append to the DOM.
                    head.appendChild(scriptElement);
                }
            }, 500 );
        }
        catch (e)
        {
            alert('Error Data received:\n' + data);
        }
    }
} );
everything started working again in IE; no lockups occurred, so it's not
as if there were problems in going out, getting the files and ultimately
appending them to the DOM. Just by waiting 500ms (250ms is what we tested
as the bare minimum) before starting the requests for the JS files, all
was well. This is why I said earlier that the only delay needed seems to
be one between updating the childDiv's innerHTML and the first request.
If we reduce the delay to less than 250ms, we've seen IE6 instead lock up
when it tries to fetch the images that are part of what the innerHTML was
set to.
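If it helps frame the question: ideally I'd drop the magic 500ms number
entirely, e.g. by chaining the fetches so each one starts only after the
previous script has been appended and the browser has had a chance to
paint. Sketch only, building on the hypothetical loadScriptAsync above
(srcList stands in for the array the regex step produces):

    function loadScriptsInOrder(srcList, idx)
    {
        idx = idx || 0;
        if (idx >= srcList.length) { return; }
        loadScriptAsync(srcList[idx], function()
        {
            // Yield back to IE6 between scripts so it can render.
            setTimeout(function() { loadScriptsInOrder(srcList, idx + 1); }, 0);
        });
    }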
So, what's going on in IE6? Why is it having issues with this and causing a
lock up? Any help and/or advice would be *greatly* appreciated!
thnx,
Christoph