AJAX and search engines - solutions?


Qu0ll

I have done some research into the problem of AJAX not playing well with
search engines, and it seems that the best solution available at the moment
to get all your AJAX pages linked is to create two actual sites: one with
the AJAX-ified pages, which is presented to the user, and another with static
HTML, which is presented to the spider to index the pages and which links
back to the actual site. Other possible solutions don't seem to be scalable
IMHO, in that they would require possibly hundreds or thousands of static
links whose only purpose is to direct the spider to static pages of linkable
content.

However, some say that this dual-site solution is known as "cloaking" and
that sophisticated spiders will recognise this and not index the pages
appropriately.

Does anyone know of any other potential solutions to this problem, which must
face many web site designers, or whether the cloaking fear is confirmed?

--
And loving it,

-Q
 

Henry

> I have done some research into the problem of AJAX not playing
> well with search engines and it seems that the best solution
> available at the moment to get all your AJAX pages linked is
> to create two actual sites, one with the AJAX-ified pages
> which is presented to the user and another with static
> HTML which is presented to the spider to index the pages
> which links back to the actual site.
Insanity!

> Other possible solutions don't seem to be scalable

Doing everything twice is at least twice the scale it could be to
start with.
> IMHO in that they would require possibly hundreds or
> thousands of static links whose only purpose is to direct
> the spider to static pages of linkable content.

A worse alternative doesn't justify an inherently bad idea.
> However, some say that this dual-site solution is known as
> "cloaking" and that sophisticated spiders will recognise
> this and not index the pages appropriately.

They may well.
> Does anyone know of any other potential solutions to this
> problem which must face many web site designers or if the
> cloaking fear is confirmed?

Yes: realise that AJAX is a design option, not a necessity. AJAX is
great in particular contexts, such as web applications, and a
seriously bad idea in others (evidently including those where search
engine handling is significant).
 

David Mark

> I have done some research into the problem of AJAX not playing well with
> search engines and it seems that the best solution available at the moment

Search engines have no concept of Ajax.
> to get all your AJAX pages linked is to create two actual sites, one with
> the AJAX-ified pages which is presented to the user and another with static
> HTML which is presented to the spider to index the pages which links back to

Forget that.
> the actual site.  Other possible solutions don't seem to be scalable IMHO in
> that they would require possibly hundreds or thousands of static links whose
> only purpose is to direct the spider to static pages of linkable content.

I don't know what that means.
> However, some say that this dual-site solution is known as "cloaking" and

It is a bad idea by any name.
> that sophisticated spiders will recognise this and not index the pages
> appropriately.

> Does anyone know of any other potential solutions to this problem which must
> face many web site designers or if the cloaking fear is confirmed?

It is very simple. First mark up your content in semantic documents
without any script at all. Then add scripted enhancements as
appropriate. Using Ajax for navigation is not appropriate. If you
feel you must do that (which is what I assume you mean by "AJAX-ify"),
then use server-side scripts to send just the content portion of each
page when responding to Ajax requests.
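
A minimal sketch of that last suggestion, assuming a Node.js/Express server
(which of course postdates this thread); the route and the two render helpers
are hypothetical placeholders:

const express = require('express');
const app = express();

// Placeholder renderers; a real site would build these from its templates.
function renderFragment(id) {
  return '<div id="content">Article ' + id + '</div>';
}
function renderFullPage(id) {
  return '<!DOCTYPE html><html><body>' + renderFragment(id) + '</body></html>';
}

app.get('/articles/:id', function (req, res) {
  if (req.xhr) {
    // Ajax request (X-Requested-With: XMLHttpRequest): send only the
    // content portion of the page.
    res.send(renderFragment(req.params.id));
  } else {
    // Ordinary request from a browser or a spider: send the full,
    // crawlable HTML document for the same URL.
    res.send(renderFullPage(req.params.id));
  }
});

app.listen(3000);

The point is that spiders and script-less browsers only ever see complete,
linkable pages; only the page's own script asks for the bare fragment.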
 

slebetman

<snip>

> However, some say that this dual-site solution is known as "cloaking" and
> that sophisticated spiders will recognise this and not index the pages
> appropriately.

> Does anyone know of any other potential solutions to this problem which must
> face many web site designers or if the cloaking fear is confirmed?

Developing two copies of your site is not maintainable. That is not
the solution at all (not even a potential solution).

There are only two ways to do this decently:
1. graceful degradation
2. progressive enhancement

Both require you to think about browsers with JavaScript turned off
while developing your AJAX page. Essentially, you need to make the
JavaScript stuff optional. Of the two, progressive enhancement is
better. Here's how you do it:

1. Develop an old-school Web 1.0 site without any JavaScript at all. In
fact, most people will tell you to go really old school and not even
worry about layout at this point: just use the default colors, black
text on a white background. But remember to tag bits with appropriate
ids and classes and to group relevant stuff in divs.

2. Have your art guy do the layout for the site with CSS, modifying the
page content as little as possible. It may sometimes be necessary to
reposition stuff or add extra divs or spans due to limitations of CSS,
but this should be kept to a minimum.

3. Write JavaScript to enhance the interactivity of the page. For
simple stuff you can easily add behaviors directly to HTML elements
for onclick, onmouseover, etc. For more complex stuff you can rip
elements out of the page and replace them with dynamically constructed
JavaScript-based widgets.

The thing to realise here is that you can remove and insert elements
into pages with JavaScript. So your "basic" page does not need to be
AJAXified -- use JavaScript to AJAXify it after it has loaded (see the
sketch below).
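
A minimal sketch of that last step, assuming the server co-operates as
David Mark describes above (returning just the content fragment for Ajax
requests) and assuming hypothetical "nav" and "content" ids. The links keep
their real hrefs, so spiders and script-less browsers are unaffected; the
script only takes over once the page has loaded:

document.addEventListener('DOMContentLoaded', function () {
  var links = document.querySelectorAll('#nav a');

  Array.prototype.forEach.call(links, function (link) {
    link.addEventListener('click', function (event) {
      event.preventDefault(); // take over navigation only when script runs

      var xhr = new XMLHttpRequest();
      xhr.open('GET', link.href, true);
      // Tell the server to return only the content fragment.
      xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
      xhr.onload = function () {
        if (xhr.status === 200) {
          document.getElementById('content').innerHTML = xhr.responseText;
        } else {
          window.location = link.href; // fall back to a normal page load
        }
      };
      xhr.send();
    });
  });
});

Note that browsers do not add the X-Requested-With header by themselves
(libraries such as jQuery do), which is why it is set explicitly here so the
server-side check can tell the two kinds of request apart.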
 

Peter Michaux

> Search engines have no concept of Ajax.

That may not be true anymore, or at least not true for much longer. If
developers are making sites that depend on Ajax for navigation, then it
will be in the best interests of the search engine companies to find a
way to index the pages.

> It is very simple. First mark up your content in semantic documents
> without any script at all. Then add scripted enhancements as
> appropriate. Using Ajax for navigation is not appropriate. If you
> feel you must do that (which is what I assume you mean by "AJAX-ify"),
> then use server-side scripts to send just the content portion of each
> page when responding to Ajax requests.

I agree that is good advice, especially for a content-based site that
is not behind a login, which seems to be the situation under
consideration here.

Peter
 

Henry

> That may not be true anymore or at least not true for much
> longer. If developers are making sites that depend on Ajax for
> navigation then it will be in the best interest of the search
> engine companies to find a way to index the pages.
<snip>

Whether it would be in their "best interests" or not may not be the
deciding factor. When crawling HTML sites it is relatively easy to
find and interpret HREF, SRC and ACTION attributes (and so follow them
to other material and present references to that material to the
search engine user). With JavaScript-driven interaction it is not even
easy to determine which DOM elements trigger significant events, let
alone to work out which sequences of user interactions would need to
be run through in order to see all of the site in question and make a
worthwhile search. And then, assuming it could be made to work, how
would you present the results to the search engine user, if the
information they want is only reached by visiting a particular URL and
then clicking button X followed by button Y to get it displayed?
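
To make the contrast concrete (a hypothetical illustration, not from the
thread): a spider can lift a destination straight out of an HREF without
running anything, but with script-driven navigation the destination exists
only inside a handler, so the crawler would have to execute the script and
simulate the click just to learn that the content exists.

// Discoverable by reading the markup alone:
//
//   <a href="/products/42">Super Widget</a>
//
// Invisible to a crawler that does not run script ('product-42' and
// loadPanel are hypothetical names):
function loadPanel(url) {
  // stand-in for whatever fetches and injects the fragment
}
var trigger = document.getElementById('product-42');
if (trigger) {
  trigger.addEventListener('click', function () {
    loadPanel('/fragments/products/42');
  });
}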
 
