Hi.
This is just a disaster management question.
I am using XMLHTTP for the dynamic loading of content in a very
crucial area of my web site. Same as an IFrame, but using XMLHTTP and
a DIV. I got the core of the javascript from here:
http://www.dynamicdrive.com/dynamicindex17/ajaxcontent.htm
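Simplified, the pattern I am using looks roughly like this (the div id
and function name here are made up for illustration, not copied from
that script):

<div id="contentarea"></div>
<script>
// create a request object, fetch the page fragment, and copy it
// into the div once it arrives
function ajaxLoad(url) {
  var req = window.XMLHttpRequest ? new XMLHttpRequest()
                                  : new ActiveXObject("Microsoft.XMLHTTP");
  req.onreadystatechange = function() {
    if (req.readyState == 4 && req.status == 200)
      document.getElementById("contentarea").innerHTML = req.responseText;
  };
  req.open("GET", url, true);
  req.send(null);
}
</script>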
I noticed in the demo that sometimes the content takes a long
time to load. That is not the case with my Dev Box, but, then again,
all the hardware is right here.
I am wondering whether using XMLHTTP for the dynamic loading of
content could start to hurt performance as the site gains in
popularity. I don't want to lose customers because of content
wait-time.
Is this a risk?
Thanks.
post a url and we can help with specific advice. You have to get the
content from somewhere, but using xhr is probably not the best way to
scale up a site, no. Instead, consider using just one large page
containing many divs, each named after the content you need in the
"main" div; then, instead of xhr with multiple http requests, you just
swap the divs around. Job done, completely scalable no matter how
large the site. However, it all depends on what kind of site you are
running - dynamic content etc. - which is why a url is good when you
ask a question like this; your site will be public one day anyway.
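A minimal sketch of what I mean (the ids are made up for
illustration):

<div id="main">
  <div id="content-home">Welcome to the home page...</div>
  <div id="content-contact" style="display:none">Contact us at...</div>
</div>
<script>
// show the named content div and hide the rest - no further
// http requests needed once the page has loaded
function showContent(name) {
  var divs = document.getElementById("main").getElementsByTagName("div");
  for (var i = 0; i < divs.length; i++)
    divs[i].style.display = (divs[i].id == "content-" + name) ? "" : "none";
}
</script>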
Don't you think this is inappropriate advice? Can you imagine e.g.
gmail loading all its data in DIVs and then swapping divs around? I
don't think this would be a good idea. Maybe if the amount of data is
really small your advice could be acceptable, but in general, loading
all the content and then displaying just what is needed is a very bad
solution, for both the server and the client. Please correct me if I
misunderstood you.
no, not bad advice really - my caveat was that it depends on the site.
The reason ajax is such a good idea for an email site is that it's
entirely dynamic. However, gmail and others actually "preload" a
massive amount of speculative data, a huge amount of content - in case
it is needed - not as divs but as json, and "swap" it around the main
div with no further set-up and tear-down http costs. (Of course, with
the div approach the content is already in the page, which means you
should keep the number of divs down.)
here are the stats for a single hit to gmail - roughly 700KB of
preloaded data, most of which is text/html (sizes in bytes):
text/javascript: 93,471
text/html: 534,526
image/x-icon: 1,150
flash: 7,106
~headers: 17,274
image/png: 27,551
text/plain: 36,287
image/gif: 11,515
Even this is deceptive: the images are loaded using the same
technique. A single image containing 20 or so "views" is loaded,
because it decreases the number of http requests the application
needs; a chunk of text/css is then loaded to control the position of
that image, about 1KB of extra text per image, but it is still a
fantastic trade-off. The technique of preloading content speculatively
is just the same, except that it requires js controllers and lots of
extra text/html disguised as javascript, similar to json.
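The image trick looks something like this (the sprite file name and
offsets are made up for illustration):

<style>
/* one sprite sheet holds many icons; shift the background
   to pick out the one you want */
.icon      { display: inline-block; width: 16px; height: 16px;
             background: url(sprite.png); }
.icon-mail { background-position: 0 0; }
.icon-star { background-position: -16px 0; }
</style>
<span class="icon icon-mail"></span>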
If a website is static and has, say, 10 pages, the cost of downloading
the data would be only a few KB, and you could of course download it
as json strings in javascript, but all at once, speculatively, with a
single call to a page. Also, gmail and others are a bad example to use
in support of AJAX as a good method for swapping content: they have
huge redundancy headroom, so scalability is not a problem for them.
For this guy with one single server, minimising http requests is a
great solution and worth a few KB, which is cacheable - unless he is
running a site full of dynamic content, a caveat I gave in my last
post.
> To the topic author: no, xmlhttp is not so bad to use, as a matter
> of fact it has become quite popular, and if you're talking about
> loading just specific elements on the page it is the best idea.
Popular as in fashionable; xhr is not a solution to most of the
problems it is being used for. For instance, where is the
accessibility of xhr, when it relies so much on javascript? xhr should
be used only where it makes sense, and using it to deliver bits of
pages just isn't good enough when a good old-fashioned accessible <a
href="page2.htm">page 2</a> works just the same, is cached, is
accessible, and is just as fast - if you have your images, js and css
sent with the correct headers.
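(By "the correct headers" I mean long-lived caching on the static
files - for example, a response along these lines, the values being
just an illustration:)

Cache-Control: max-age=31536000
Expires: Thu, 31 Dec 2009 23:59:59 GMT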
> If you're loading complete pages, then stick to the traditional
> navigating to the page of interest. If you look around, you'll find
> out that almost all popular webmail applications use ajax,
you will also find that when you turn javascript off, these
applications degrade gracefully - a fact most people who use ajax to
create their websites ignore. These applications tend to use ajax
because it makes sense in their environment: gmail, youtube and other
highly changeable content sites must use xhr, but facebook, with less
changeable content, doesn't rely on ajax, and amazon, ebay, bebo, even
flickr make comparatively small use of ajax.
> and if speed was an issue, they certainly wouldn't accept it.
It's not speed, it's the concurrent http requests that make it a
scalability nightmare, unless it is absolutely needed - as opposed to
loading
<html><head>
<script>
var objPreloadedContent =
{
  "pages":
  [
    {
      "div_heading": "contact",
      "div_title": "Contact Us",
      "div_content": "<p>Please use this form to contact us...",
      "div_footer": "Please feel free to contact us in any way you wish"
    },
    {
      "div_heading": "sales",
      "div_title": "Sales Department",
      "div_content": "<p>Here is a list of our sales team...",
      "div_footer": "We are happy to sell sell sell.."
    }
  ]
}
</script></head><body>...
within the homepage, and swapping content in and out as you need it.
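(A rough sketch of the swapping, with made-up element ids:)

<script>
// look up the named page in the preloaded object and copy its
// pieces into the visible divs - no http request involved
function swapTo(name) {
  var pages = objPreloadedContent.pages;
  for (var i = 0; i < pages.length; i++) {
    if (pages[i].div_heading == name) {
      document.getElementById("title").innerHTML   = pages[i].div_title;
      document.getElementById("content").innerHTML = pages[i].div_content;
      document.getElementById("footer").innerHTML  = pages[i].div_footer;
      return;
    }
  }
}
</script>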
> Ajax actually speeds things up, because you don't have to load the
> whole new page in order to display just a single new piece of
> information to the user.
but using ajax in this "single piece" mode means you do have to
request the data piecewise, so you get latency, header overhead, http
costs, server cpu costs.
AJAX is in general a waste of resources, unless you have a clear need
that cannot be met by using a more conventional approach.
ShimmyShack -
Thanks. I think I am going to go with multiple DIVs and manipulation
of display:none. One question along those lines,
though. This will mean that the page will have a massive amount
of HTML with tons (I mean tons) of hidden <TR> elements. Is
there any harm here?
Thanks again.