Dynamic Content in High-Traffic Sites

pbd22

Hi.

This is just a disaster management question.
I am using XMLHTTP for the dynamic loading of content in a very
crucial area of my web site. Same as an IFrame, but using XMLHTTP and
a DIV. I got the core of the javascript from here:

http://www.dynamicdrive.com/dynamicindex17/ajaxcontent.htm
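
(For anyone skimming later: the pattern that script implements is roughly the following; this is a minimal sketch only, with a placeholder id and URL rather than anything from my actual site.)

<div id="content"></div>
<script>
// Minimal XHR-into-a-div sketch; "content" and "page2.htm" are placeholders.
function loadInto(divId, url) {
  var xhr = window.XMLHttpRequest ? new XMLHttpRequest()
                                  : new ActiveXObject("Microsoft.XMLHTTP");
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById(divId).innerHTML = xhr.responseText;
    }
  };
  xhr.open("GET", url, true);
  xhr.send(null);
}
loadInto("content", "page2.htm");
</script>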

I noticed in the demo that sometimes the content takes a long
time to load. That is not the case with my Dev Box, but, then again,
all the hardware is right here.

I am wondering whether using XMLHTTP for the dynamic loading of content
could suffer in performance as the site starts to gain popularity.
I don't want to lose customers because of content wait-time.

Is this a risk?

Thanks.
 
shimmyshack


Post a URL and we can help with specific advice. You have to get the
content from somewhere, but no, using XHR is probably not the best way to
scale up a site. Instead, consider using just one large page
containing many divs, each named after a piece of content you want to show in the
"main" div; then, instead of XHR with multiple HTTP requests, you just
swap the divs around. Job done, and it scales no matter how
large the site gets. However, it all depends on what kind of site you are
running - dynamic content etc. - which is why, when you ask a question
like this, a URL is good; your site will be public one day anyway.
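
A minimal sketch of that div-swapping idea (all ids here are made up): every piece of content is in the page from the start, and a click just changes which div is visible - no HTTP request at all.

<div id="main">
  <div id="contact"><p>Please use this form to contact us...</p></div>
  <div id="sales" style="display:none"><p>Here is a list of our sales team...</p></div>
</div>
<script>
// Show the requested section and hide the rest - purely a DOM update.
function showSection(id) {
  var children = document.getElementById("main").childNodes;
  for (var i = 0; i < children.length; i++) {
    if (children[i].nodeType === 1) { // element nodes only
      children[i].style.display = (children[i].id === id) ? "" : "none";
    }
  }
}
// e.g. <a href="#sales" onclick="showSection('sales'); return false;">Sales</a>
</script>

The server sends one page (cacheable), and every subsequent "navigation" is free.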
 
Darko


Don't you think this is inappropriate advice? Can you imagine, e.g.,
gmail loading all of its data into DIVs and then swapping the divs around? I don't
think this would be a good idea. Maybe if the amount of data is really
small your advice could be acceptable, but in general, loading all
the content and then displaying just what is needed is a very bad solution,
for both the server and the client. Please correct me if I
misunderstood you.

To the topic author: no, xmlhttp is not so bad to use; as a matter of
fact it has become quite popular, and if you're talking about loading
just specific elements on the page it is the best approach. If you're
loading complete pages, then stick to traditionally navigating to
the page of interest. If you look around, you'll find that almost
all popular webmail applications use ajax, and if speed were an issue,
they certainly wouldn't accept it. Ajax actually speeds things up,
because you don't have to load a whole new page in order to display
just a single new piece of information to the user.
 
-Lost

Darko said:
Don't you think this is inappropriate advice? Can you imagine, e.g.,
gmail loading all of its data into DIVs and then swapping the divs around? I don't
think this would be a good idea. Maybe if the amount of data is really
small your advice could be acceptable, but in general, loading all
the content and then displaying just what is needed is a very bad solution,
for both the server and the client. Please correct me if I
misunderstood you.

No, actually it isn't. And I am sure from shimmyshack's post history it
is evident he didn't mean load 19 megabytes' worth of markup and then
switch it via hidden and shown DIVs.

And in fact, bytefx uses this very same method.

http://www.devpro.it/bytefx/

The only problem that arises is when JavaScript is disabled, so you
should also make sure each DIV has a named anchor in or near it, so that
links (or whatever you are clicking) can still reach that
section of information. Or just provide links to the pages in question,
which is also acceptable, degradable JavaScript.
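
For what it's worth, the degradable markup described above might look something like this (the ids and the file name are made up, and showSection is the hypothetical swap helper sketched earlier in the thread): with JavaScript on, the click swaps content in place; with it off, the href still takes the visitor somewhere useful.

<!-- named-anchor fallback -->
<a href="#sales" onclick="showSection('sales'); return false;">Sales</a>
<!-- or, degrading to a real page instead of an in-page anchor: -->
<!-- <a href="sales.htm" onclick="showSection('sales'); return false;">Sales</a> -->
<div id="sales">
  <a name="sales"></a>
  <p>Here is a list of our sales team...</p>
</div>
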
Darko also said:
To the topic author: no, xmlhttp is not so bad to use; as a matter of
fact it has become quite popular, and if you're talking about loading
just specific elements on the page it is the best approach. If you're
loading complete pages, then stick to traditionally navigating to
the page of interest. If you look around, you'll find that almost
all popular webmail applications use ajax, and if speed were an issue,
they certainly wouldn't accept it. Ajax actually speeds things up,
because you don't have to load a whole new page in order to display
just a single new piece of information to the user.

Your suggestion is just as bad, in my opinion worse. You recommend
using XMLHttpRequests based on their popularity? The original poster
was asking a "disaster (management) recovery" question. So recommending
he use a technology simply because you think it is a popular option is
not wise.

Regardless of whether or not it is suitable or scalable.
 
shimmyshack


No, not bad advice really - my caveat was that it depends on the site.
The reason ajax is such a good idea for an email site is that it's
entirely dynamic. However, gmail and others actually "preload" a
massive amount of speculative data - a huge amount of content, in case
it is needed - not as divs but as json, and "swap" it
into the main div with no further set-up and tear-down http
costs. (Of course, with the div approach the content is already in the DOM,
which means you should keep the number of divs down.)

Here are the stats for a single hit to gmail - roughly 700KB of preloaded data,
most of it text/html (sizes in bytes):
text/javascript: 93,471
text/html: 534,526
image/x-icon: 1,150
flash: 7,106
~headers: 17,274
image/png: 27,551
text/plain: 36,287
image/gif: 11,515

Even this is deceptive: the images are loaded using the same technique -
a single image containing 20 or so "views" is loaded, because it decreases
the number of http requests needed for the application; a large amount
of text/css is then loaded to control the positioning within that image,
about 1KB of extra text per image, but it is still a fantastic trade-off.
The technique of preloading content speculatively is just the
same, except that it requires js controllers and lots of extra text/
html disguised as javascript, similar to json.

If a website is static and you had, say, 10 pages, the cost of
downloading the data would be only a few KB; you could of course
download it as json strings in javascript, but all at once,
speculatively, with a single call to a page. Also, gmail and the others are
a bad example to use in support of AJAX as a good method for swapping
content: they have huge redundancy headroom, so scalability is not a
problem for them. For this guy, with one single server, minimising http
requests is a great solution and worth a few KB, which is cacheable -
unless he is running a site full of dynamic content, a caveat I gave
in my last post.
Darko said:
To the topic author: no, xmlhttp is not so bad to use; as a matter of
fact it has become quite popular, and if you're talking about loading
just specific elements on the page it is the best approach.

Popular as in fashionable; xhr is not a solution to most of the
problems it is being used for. For instance, where is the accessibility
of xhr when it relies so much on javascript? xhr should be used only
where it makes sense, and using it to deliver bits of pages just
isn't good enough when a good old fashioned, accessible <a
href="page2.htm">page 2</a> works just the same, is cached, accessible,
and just as fast - if you have your images, js and css sent with the
correct headers.
Darko continued:
If you're loading complete pages, then stick to traditionally navigating to
the page of interest. If you look around, you'll find that almost
all popular webmail applications use ajax...

You will also find that, with javascript turned off, these applications
degrade gracefully - a fact most people who use ajax to create their
websites ignore. These applications tend to use ajax because it makes
sense in their environment: gmail, youtube and other highly changeable
content sites must use xhr, but facebook, with less changeable content,
doesn't rely on ajax, and amazon, ebay, bebo, even flickr make
comparatively small use of ajax.
Darko continued:
...and if speed were an issue, they certainly wouldn't accept it.
It's not speed, it's the concurrent http requests that make it a
scalability nightmare, unless it is absolutely needed - as opposed to
loading

<html><head>
<script>var objPreloadedContent =
{
  "pages":
  [
    {
      "div_heading": "contact",
      "div_title": "Contact Us",
      "div_content": "<p>Please use this form to contact us...",
      "div_footer": "Please feel free to contact us in any way you wish"
    },
    {
      "div_heading": "sales",
      "div_title": "Sales Department",
      "div_content": "<p>Here is a list of our sales team...",
      "div_footer": "We are happy to sell sell sell.."
    }
  ]
}
</script></head><body>...

within the homepage and swapping the content around as needed.
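
(A minimal sketch of the "swap" half of that idea, with made-up element ids - objPreloadedContent is the object above, and no further request is made:)

<script>
// Copy the chosen preloaded "page" into the visible elements - a DOM update only, no XHR.
function showPage(name) {
  var pages = objPreloadedContent.pages;
  for (var i = 0; i < pages.length; i++) {
    if (pages[i].div_heading === name) {
      document.getElementById("title").innerHTML = pages[i].div_title;
      document.getElementById("content").innerHTML = pages[i].div_content;
      document.getElementById("footer").innerHTML = pages[i].div_footer;
      return;
    }
  }
}
// e.g. <a href="contact.htm" onclick="showPage('contact'); return false;">Contact</a>
</script>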

Darko continued:
Ajax actually speeds things up, because you don't have to load a whole
new page in order to display just a single new piece of information to the user.

But using ajax in this "single piece" mode means you do have to
request the data piecewise, so you get latency, header overhead, http
costs and server cpu costs.
AJAX is in general a waste of resources, unless you have a clear need
that cannot be met by a more conventional approach.
 
pbd22


ShimmyShack -

Thanks. I think I am going to go with multiple DIVs and manipulation
of display:none. One question along those lines,
though. This will mean that the page will have a massive amount
of HTML with tons (I mean tons) of hidden <TR> elements. Is
there any harm here?

Thanks again.
 
shimmyshack


Well, I was assuming your code would be a few pages of content (one
piece of content per div), but remember that any content you
store in divs will still be rendered unless it is hard-coded in the
markup to have display:none - setting it only after page load
wouldn't be good, because the whole page would have to render before
some divs were hidden. So yes, storing a lot of hidden tables is not
good; a "ton" of TRs just shouldn't exist any more, as it has been years
since css replaced the need for layout tables!

I was assuming your code was modern semantic markup with css for the
display/look&feel. If you are using tables, then you could consider
storing your content as javascript strings of pure text and creating
the table dynamically. In the end, though, the best bet might well be
to simply use good old fashioned links until your markup is modern, and
then apply the modern techniques to it, ending up with a very clean,
easy-to-update site.
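
If you do go the javascript-strings route for tabular data, a rough sketch (with purely illustrative data and ids) might look like this: keep the rows as plain values and only build the table when it is actually asked for, instead of shipping thousands of hidden TRs.

<div id="report"></div>
<script>
// Row data kept as plain javascript values instead of hidden <tr> elements in the markup.
var salesRows = [
  ["Alice", "North", "120"],
  ["Bob",   "South", "95"]
];

// Build the table only when the user actually asks to see it.
function buildTable(rows, targetId) {
  var html = "<table>";
  for (var i = 0; i < rows.length; i++) {
    html += "<tr><td>" + rows[i].join("</td><td>") + "</td></tr>";
  }
  html += "</table>";
  document.getElementById(targetId).innerHTML = html;
}

buildTable(salesRows, "report");
</script>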
 
pbd22


Hi, thanks. I have no problem trying to learn "the right way" to do
things once I know what those things are. By "modern techniques" do
you mean a page with only DIV tags and CSS - no tables at all? I am
assuming this is the way to go? Thanks again.
 
shimmyshack


Yeah, just the type of html/xhtml that you choose to use.
Bear in mind that it takes a bit of thinking to change, but that the
learning curve is WELL worth it. Back in 2004 there was this huge
table-driven site where each page's source code printed to 7 A4 pages of
closely packed text; with css, it went down to 2 nicely formatted
pages. It's better for you, for those who use assistive devices, for
your search engine rating, and for your clients, as the rendering time
is slashed.

I recommend checking out sites like http://alistapart.com/
Check out the source code to alistapart and see - no tables!
The front page looks as if it could use tables, but download firefox
and use "view->page style->none" to see that it is just css styling
that produces the site's look and feel. That means there are just
a couple of separate documents, included in the head section of each
html page, that dictate the entire look and feel of the website. If
you feel like a change, just change the css document and your whole
site changes in an instant; or offer multiple look&feels
for those who require high visibility, allow your site to zoom in,
etc. - all with no changes to any of the new-style html you are going
to write.

Tools you can use include the Web Developer extension for firefox, to
highlight all the elements (<p>, <h1>, <h2>... <ul>) that you will start
using more often now, so you can see what the bounding box for these elements
looks like and how to shimmy them around the page using css. You
can use Firebug to edit the css live, or other extensions like that,
and you're on your way.
Consider that when you use javascript for functionality in your pages,
it should not be "core" to the website; it should add to an already
working website. So code your website to work in the old fashioned way,
then add a layer of unobtrusive javascript over the top that hijacks
the links and does the fancy stuff.
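
A minimal sketch of that hijacking layer (the class name is made up, and loadInto is the hypothetical XHR helper from earlier in the thread - showSection would work the same way for preloaded divs): without javascript, the plain links still navigate normally.

<script>
// Run after the page loads; if this never runs, the ordinary links still work.
window.onload = function () {
  var links = document.getElementsByTagName("a");
  for (var i = 0; i < links.length; i++) {
    if (links[i].className === "hijack") {        // only take over links marked for it
      links[i].onclick = (function (href) {
        return function () {
          loadInto("main", href);                 // fetch and display without a full page load
          return false;                           // cancel the normal navigation
        };
      })(links[i].getAttribute("href"));
    }
  }
};
</script>
<!-- <a class="hijack" href="sales.htm">Sales</a> degrades to a normal link -->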
Once you start using css+(x)html you won't be worrying about
maintainability, you won't mind having 20 pages of markup per site, and you
will find it easier to code a website, hijacking it afterwards, and
your work is done. The old table-based sites are so hard to maintain,
once a change is needed, that the work involved means you reach around
for shortcuts, dragging in content from iframes and so on and on...
 
