Generated javascript from .pl files

TonyV

Hi all,

I'm trying to use some javascript code in Internet Explorer 6.0 that's
being generated by a Perl file. I have a line in my header that looks
something like this:

<script src="jscode.pl?parm1=foo&parm2=monkey" type="text/javascript" /

Just for the sake of simplicity, here's a very simplified version of
the Perl code:

print "Content-Type: text/javascript; charset=ISO-8859-1\n\n";
print "alert('Testing, 1-2-3!');\n";


The Perl code returns a header with mime type text/javascript. (I've
tried application/javascript to no avail.) Everything works great in
Firefox, and I get a message box that says "Testing, 1-2-3!" that pops
up when I load the page. However, in Internet Explorer 6.0, even
though the results of pulling up the URL manually look great, the
code isn't executing. No message box, no errors, no nothing. It's
silently dying.

I think the problem is that Internet Explorer is seeing the .pl
extension and thinking, "Hmm, that's not a Javascript file!" and
bypassing it. It might have something to do with this:
http://msdn2.microsoft.com/en-us/library/ms775148.aspx

Does anyone know if there's anything I can do to get a Perl script to
be read by Internet Explorer as a legitimate javascript file? Is
there something special I have to do on the server (IIS 6.0 on Win2k3
Server)? Or is it just plain not possible?
 
David Mark

Hi all,

I'm trying to use some javascript code in Internet Explorer 6.0 that's
being generated by a Perl file. I have a line in my header that looks
something like this:

<script src="jscode.pl?parm1=foo&parm2=monkey" type="text/javascript" /



Just for the sake of simplicity, here's a very simplified version of
the Perl code:

print "Content-Type: text/javascript; charset=ISO-8859-1\n\n";
print "alert('Testing, 1-2-3!');\n";

Wrong MIME type, despite what the type attribute would seem to imply.
The Perl code returns a header with mime type text/javascript. (I've
tried application/javascript to no avail.)

Try application/x-javascript.

Everything works great in Firefox, and I get a message box that says
"Testing, 1-2-3!" that pops up when I load the page. However, in
Internet Explorer 6.0, even though the results of pulling up the URL
manually look great, the code isn't executing. No message box, no
errors, no nothing. It's silently dying.

I think the problem is that Internet Explorer is seeing the .pl
extension and thinking, "Hmm, that's not a Javascript file!" and

User agents do not determine MIME types from extensions. URIs are
not file paths anyway. Servers map requested URIs to file paths or
processes and, in some cases, map file extensions to MIME types for
responses.
bypassing it. It might have something to do with this:
http://msdn2.microsoft.com/en-us/library/ms775148.aspx

No.

Does anyone know if there's anything I can do to get a Perl script to
be read by Internet Explorer as a legitimate javascript file? Is
there something special I have to do on the server (IIS 6.0 on Win2k3
Server)? Or is it just plain not possible?

It is quite possible. Looking at your script declaration:

<script src="jscode.pl?parm1=foo&parm2=monkey" type="text/javascript" /

I assume the tag is closed after the slash, so you should realize that
you cannot use self-closing tags with script elements. Change to:

<script src="jscode.pl?parm1=foo&parm2=monkey" type="text/javascript"></script>

Fix that and the MIME type and it should work fine. Post a link to a
sample page if not.
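For illustration, the whole response the server should produce can be sketched as a minimal CGI, written here as portable shell rather than Perl since only the output matters: the Content-Type header, a blank line, then the script body.

```shell
#!/bin/sh
# Minimal CGI sketch: emit the suggested MIME type, a blank line,
# then the JavaScript body (the same output the Perl version prints).
printf 'Content-Type: application/x-javascript; charset=ISO-8859-1\r\n\r\n'
printf "alert('Testing, 1-2-3!');\n"
```

The Perl equivalent only changes the first print statement of the original script to application/x-javascript.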
 
Thomas 'PointedEars' Lahn

TonyV said:
I'm trying to use some javascript code in Internet Explorer 6.0 that's
being generated by a Perl file. I have a line in my header that looks
something like this:

<script src="jscode.pl?parm1=foo&parm2=monkey" type="text/javascript" /
------------------------------------------------------------------------^

Just for the sake of simplicity, here's a very simplified version of
the Perl code:

print "Content-Type: text/javascript; charset=ISO-8859-1\n\n";
print "alert('Testing, 1-2-3!');\n";

[...] However, in Internet Explorer 6.0, even
though the results of pulling up the URL manually look great, the
code isn't executing. No message box, no errors, no nothing. It's
silently dying.

I think the problem is that Internet Explorer is seeing the .pl
extension and thinking, "Hmm, that's not a Javascript file!" and
bypassing it. It might have something to do with this:
http://msdn2.microsoft.com/en-us/library/ms775148.aspx

I find it more likely that your pseudo-XHTML (see the marker above) is
causing the problem. IE does not support X(HT)ML, so the `script' element
does not end. However, if using HTML instead does not solve the problem:
You are not required to serve Perl-generated content as a resource with a
.pl suffix; in fact, it is recommended not to do so; use content negotiation
instead (why not .js.pl, requested as .js?). [1]


PointedEars
___________
[1] http://www.w3.org/QA/Tips/uri-choose pp., especially
http://www.w3.org/Provider/Style/URI
 
Bart Van der Donck

Thomas said:
TonyV wrote:
<script src="jscode.pl?parm1=foo&parm2=monkey" type="text/javascript"

simplified version of the Perl code:
print "Content-Type: text/javascript; charset=ISO-8859-1\n\n";
print "alert('Testing, 1-2-3!');\n";

[...] However, in Internet Explorer 6.0, even
though the results of pulling up the URL manually look great, the
code isn't executing. No message box, no errors, no nothing. It's
silently dying.
I think the problem is that Internet Explorer is seeing the .pl
extension and thinking, "Hmm, that's not a Javascript file!" and
bypassing it. It might have something to do with this:
http://msdn2.microsoft.com/en-us/library/ms775148.aspx

I find it more likely that your pseudo-XHTML (see the marker above) is
causing the problem. IE does not support X(HT)ML, so the `script' element
does not end. However, if using HTML instead does not solve the problem:
You are not required to serve Perl-generated content as resource with a .pl
suffix; in fact, it is recommended not to do so; use content negotiation
instead (why not .js.pl, and using .js?).

I think a better approach here is to change the .pl extension to .js
and tell the server to execute .js as CGI. It's simple and robust - I
use it a lot.

If you're on *nix, add a file named ".htaccess" in the same directory,
with content:

AddHandler cgi-script .js

This will cause .js files to be executed as (Perl) CGI scripts and
output the actual content to the <script> call. Then make sure to use
the default "application/x-javascript" MIME type, and there is no
danger of browser incompatibility.
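A sketch of such a per-directory setup (directory name hypothetical). Note that CGI execution must also be allowed in that directory, or Apache will refuse to run the handler:

```apache
# .htaccess in the directory holding the generated-JS scripts
Options +ExecCGI
AddHandler cgi-script .js
# Each script must print its own header, e.g.
#   Content-Type: application/x-javascript
```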
 
Thomas 'PointedEars' Lahn

Bart said:
I think a better approach here is to alter the .pl extension to .js,
and tell server to execute .js as CGI. It's simple and robust - I use
it a lot.

If you like driving with the handbrake pulled, you might want to do that.


PointedEars
 
Bart Van der Donck

Thomas said:
Bart Van der Donck wrote:

If you like driving with the handbrake pulled, you might want to do that.

If this were a problem, you shouldn't let the file be parsed by the
server in the first place. That consumes much more memory than an
.htaccess lookup.

Alternatively, you could add the setting in Apache's general conf file
instead of htaccess.

Some systems keep htaccess files in their operating memory, so they
don't need to be re-parsed.
 
Bart Van der Donck

TonyV said:
I'm trying to use some javascript code in Internet Explorer 6.0 that's
being generated by a Perl file. I have a line in my header that looks
something like this:

<script src="jscode.pl?parm1=foo&parm2=monkey" type="text/javascript" /

Just for the sake of simplicity, here's a very simplified version of
the Perl code:

print "Content-Type: text/javascript; charset=ISO-8859-1\n\n";
print "alert('Testing, 1-2-3!');\n";

The Perl code returns a header with mime type text/javascript. (I've
tried application/javascript to no avail.) Everything works great in
Firefox, and I get a message box that says "Testing, 1-2-3!" that pops
up when I load the page. However, in Internet Explorer 6.0, even
though the results of pulling up the URL manually look great, the
code isn't executing. No message box, no errors, no nothing. It's
silently dying.

I've had good experiences with making the .js file writable, putting
the Perl program (as non-CGI) in a cronjob, and periodically rewriting
the content from the cronjob into the .js file.

The .js call then becomes a read action instead of an execute action,
which dramatically improves performance under heavy load (and
especially when database queries are involved).

Of course, the cronjob frequency depends on how often your js data
should be updated.

It's a bit more work, but if you're on a heavy traffic website, I
think you should definitely consider it.

I've been using such a setup for many years and have found it an
excellent way to reduce the load on that machine.
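The setup above can be sketched as a crontab entry (paths and schedule hypothetical). Writing to a temporary file and renaming it into place keeps the update atomic, so a request never reads a half-written file:

```
# regenerate jscode.js every 5 minutes from the non-CGI generator
*/5 * * * * perl /path/to/gen_jscode.pl > /var/www/jscode.js.tmp && mv /var/www/jscode.js.tmp /var/www/jscode.js
```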
 
Thomas 'PointedEars' Lahn

Bart said:
If this were a problem, you shouldn't let the file be parsed by the
server in the first place.

No, it is very inefficient to have the server execute .js as CGI, because
that would apply to *every* .js file. Instead, enabling content negotiation
and having only resources with e.g. the suffix .js.pl executed as CGI, while
they are accessed with a .js suffix and the server "decides" what to do
with them, is much more efficient, flexible (it allows for using *any*
server-side language to generate the client-side script code) and easier to
maintain.

Have you followed the URI advisory link in my first followup in this thread?
This consumes much more memory than an .htaccess lookup.

And "This" would be what exactly?
Alternatively, you could add the setting in Apache's general conf file
instead of htaccess.

Some systems keep htaccess files in their operating memory, so they
don't need to be re-parsed.

You've lost me. What has .htaccess got to do with it? And, IIUC, how did
you get the idea that content negotiation has nothing to do with .htaccess?


PointedEars
 
Bart Van der Donck

Thomas said:
No, it is very inefficient to have the server execute .js as CGI, because
that would apply to *every* .js file.

I'm sorry? You can certainly set this for a whole webserver, for a
single directory, or for one file. It wouldn't be too wise to set such
a rule for the whole server indeed.
Instead, enabling content negotiation
and having only resources with e.g. suffix .js.pl executed as CGI while they
are accessed with a .js suffix and letting the server "decide" what to do
with it, is much more efficient, flexible

I don't see where the difference is. You do it for .js.pl (on whole
server, so it seems), and I do it for .js on file/directory base. You
set your httpd.conf/htaccess to output .js.pl as application/x-javascript,
and I tell it to parse .js as CGI, right? Why would yours
be more efficient or flexible then?
(allows for using *any* server-side language to generate the client-side
script code)

And why would that not be the case in my scenario? All I say is that
the js file in question should be parsed as CGI by the webserver. It's
just a coincidence that Perl is the CGI language in this case. It can be
anything. Even more: when you write your rule for .js.pl, you at least
*suggest* Perl is the language, while my solution doesn't suggest
anything about that.
and easier to maintain.

In both our scenarios, only one config line is to be maintained.
Maintenance is identical.
And "This" would be what exactly?

This = parsing a CGI programme that's written in Perl. Apache must
load the code into memory, pass it to the Perl executable, get back
the result, and output it to the client. This process requires more
memory than a single .htaccess lookup.
You've lost me. What has .htaccess got to do with it?

So that the same .htaccess file does not need to be loaded again for
every new request (as long as it doesn't change, of course). This
"buffer" causes less load for Apache (it holds the directives for the
file "ready"), though I'd say you'll only see a noticeable impact on
heavy-traffic websites.
And, IIUC, how did you get the idea that content negotiation has nothing
to do with .htaccess?

Well, uhm, what about, I didn't have that idea? :) Of course they are
related.
 
TonyV

A short update:

I'm sorry for the confusion, I'm unfortunately not using Apache, the
server is IIS 6.0. (I'm not being facetious; I normally use a LAMP
setup which is why I'm a little out of my league on this one.)

I tried replacing the <script ... /> with <script ...></script> and I
got the same results. I haven't tried changing the MIME type from
text/javascript to application/x-javascript yet, that's what I'm going
to do now and I'll post the results.

I'd rather avoid giving Perl files a .js extension if possible to avoid
confusion server-side. I'm not the only one who's writing scripts for
that server. .js.pl is a possibility (and the more I think about it,
a really good idea!), but it seems to me that trying to retrieve a
file with a .js.pl extension wouldn't be functionally different than
retrieving one with a .pl extension.

At any rate, I'm off to try changing the MIME type and see what
happens. Thanks for the help, I'll post again when I have more info.

-KS
 
Thomas 'PointedEars' Lahn

Bart said:
I'm sorry? You can certainly set this for a whole webserver, for a
single directory, or for one file. It wouldn't be too wise to set such
a rule for the whole server indeed.

But it adds maintenance effort for every .js file that is not to be
parsed on the server or in the directory (and its subdirectories), or for
every single file that is to be parsed if the per-server or per-directory
approach was not taken.
I don't see where the difference is. You do it for .js.pl (on whole
server, so it seems), and I do it for .js on file/directory base. You
set your httpd.conf/htaccess to output .js.pl as application/x-javascript,
and I tell it to parse .js as CGI, right? Why would yours
be more efficient or flexible then?

Because with "my" approach it can be anything from .js.bash to .js.php,
without having to change the request over time. And I don't have to watch
for a directive for each relevant (group of) file(s) on my server.
In both our scenarios, only one config line is to be maintained.
Maintenance is identical.

I don't think so.
This = parsing a CGI programme that's written in Perl. Apache must
load the code into memory, pass it to the Perl executable, gets back
compiled result and outputs it to client. This process requires more
memory than a single htaccess lookup.

That's true, but I still don't see the connection regarding our discussion
which approach is the most reasonable one.
You've lost me. What has .htaccess got to do with it?

So that the same .htaccess file does not need to be loaded again for
every new request (as long as it doesn't change, of course). [...]

Neither does the server-wide configuration file. But the relevance of that
regarding our discussion is still zero as you are willing to concede that
the position of the configuration directive does not really matter for
either approach (BTW, it was you who introduced that distinction).


PointedEars
 
Thomas 'PointedEars' Lahn

TonyV said:
[...] I'm unfortunately not using Apache, the server is IIS 6.0. [...]

IIS 6.0 supports Content Negotiation, too. Requires a bit of tweaking, though.

http://www.websiteoptimization.com/speed/tweak/rewrite/
I tried replacing the <script ... /> with <script ...></script> and I
got the same results. I haven't tried changing the MIME type from
text/javascript to application/x-javascript yet, that's what I'm going
to do now and I'll post the results.

It would not be valid markup if you changed the media type from one that is
registered and fully supported, albeit recently marked deprecated, to one
that is experimental, in other words proprietary. Especially with HTTP, it
probably breaks interoperability in the long term.
[...] .js.pl is a possibility (and the more I think about it,
a really good idea!), but it seems to me that trying to retrieve a
file with a .js.pl extension wouldn't be functionally different than
retrieving one with a .pl extension.

It would, because you would use only .js in the request and let the server
see about which .js.suffix file matches that. And the only thing that you
would need to change over time is the server configuration that says how
.suffix (here .pl) is to be handled. Yes, Content Negotiation *is* a
*really* *good* idea; I'm glad it does exist.


PointedEars
 
Bart Van der Donck

Thomas said:
TonyV wrote:
[...] .js.pl is a possibility (and the more I think about it,
a really good idea!), but it seems to me that trying to retrieve a
file with a .js.pl extension wouldn't be functionally different than
retrieving one with a .pl extension.

It would, because you would use only .js in the request

Yes, always.
and let the server see about which .js.suffix file matches that.

This is where I'm lost.
And the only thing that you would need to change over time is the
server configuration that says how .suffix (here .pl) is to be
handled.

Could you show the relevant httpd.conf lines for how you would achieve
this?

Thanks
 
Thomas 'PointedEars' Lahn

Bart said:
Thomas said:
TonyV said:
[...] .js.pl is a possibility (and the more I think about it,
a really good idea!), but it seems to me that trying to retrieve a
file with a .js.pl extension wouldn't be functionally different than
retrieving one with a .pl extension.
It would, because you would use only .js in the request [...]
and let the server see about which .js.suffix file matches that.

This is where I'm lost.
And the only thing that you would need to change over time is the
server configuration that says how .suffix (here .pl) is to be
handled.

Could you show the relevant httpd.conf lines for how you would achieve
this?

In its most simple form,

Options +MultiViews


PointedEars
 
Bart Van der Donck

Thomas said:
Bart Van der Donck wrote:


In its most simple form,

Options +MultiViews

I see. I wasn't aware of that technique. But I doubt it's the best
here. For the OP's question, I would still counsel using

AddHandler cgi-script .js

in a specific directory holding his perl-generated javascript files.
 
Thomas 'PointedEars' Lahn

Bart said:
I see. I wasn't aware of that technique.

Now come on, how many times have I written "Content Negotiation" already?
But I doubt it's the best here.

If you deem a technique not the best solution to a problem, you have to
come up with an alternative that is objectively better, i.e. one with a
considerable benefit over the former; otherwise such a statement stands
and falls with your credibility among your readers.

I have been using Content Negotiation for years now. It works just fine,
even with Apache 1.x; it gives me little to no headache, and the performance
loss for the additional lookup really is insignificant. For example, do you
think I am updating the date of modification in [1] *manually* or that there
even exists an "es-matrix" directory? Do you think that I am using any
programming to determine what translation should be used automatically
according to the users preferences in [2]? I could go on, but it would be
better if you read the (Apache) docs on how CN automagically can work for you.

[1] http://pointedears.de/scripts/es-matrix
[2] http://pointedears.de/scripts/test/whatami
For the OP's question, I would still counsel using

AddHandler cgi-script .js

in a specific directory holding his perl-generated javascript files.

That would impose an unnecessary restriction, where

AddHandler cgi-script .pl
Options +MultiViews

would not. So in what way would the former be better, please?


PointedEars
 
Bart Van der Donck

Thomas said:
If you deem a technique to be not the best as a solution to a problem, you
would have to come up with an alternative that is objectively a better one,
i.e. has a considerable benefit over the former, or such a statement stands
and falls with your credibility among your readers.

See below.
I have been using Content Negotiation for years now. It works just fine,
even with Apache 1.x; it gives me little to no headache, and the performance
loss for the additional lookup really is insignificant. For example, do you
think I am updating the date of modification in [1] *manually* or that there
even exists an "es-matrix" directory? Do you think that I am using any
programming to determine what translation should be used automatically
according to the users preferences in [2]? I could go on, but it would be
better if you read the (Apache) docs on how CN automagically can work for you.

[1] http://pointedears.de/scripts/es-matrix
[2] http://pointedears.de/scripts/test/whatami
For the OP's question, I would still counsel using
AddHandler cgi-script .js
in a specific directory holding his perl-generated javascript files.

That would impose an unnecessary restriction, where

AddHandler cgi-script .pl
Options +MultiViews

would not. So in what way would the former be better, please?

1. Because it's much simpler. The original poster doesn't need all
this stuff like automatic language selection, automatic last-update
dates, etc. that you mention. He just needs his js files to be parsed
by Perl, and my solution does just that. Nothing more, nothing less.

2. Because proxies might interfere. How many websites use Content
Negotiation for language settings and turn out to be wrong?

3. Because it's much faster. Multiviews are known to cause
considerable overhead. You're right that this shouldn't matter much on
low-profile sites; but once you enter the heavier stuff, every
single bit matters and you will have far better ways to spend your
memory resources.

4. Because web servers might use a different HTTP specification than
the visiting browser, so possible HTTP version conflicts arise.

5. Because you're *fully* browser-independent. Multiviews rely on
browser information and the preferred order of the file extensions
they request. Browsers have errors in that regard.

Two reputable sources (the rest of the Googling is for you):

http://www.debian.org/intro/cn.en.html says:
"We also suggest you use lynx when testing. It is the only browser we
have found to comply 100% with the HTTP specifications for content
negotiation."

http://en.wikipedia.org/wiki/Criticisms_of_Internet_Explorer says:
"Internet Explorer does not fully support HTTP/1.1 content
negotiation"
 
Thomas 'PointedEars' Lahn

Bart said:
1. Because it's much simpler. The original poster doesn't need all
this stuff like automatic language selection, automatic last-update
dates, etc. that you mention. He just needs to parse js files by perl
and my solution just does that. Not more, not less.

CN can be configured so that only the feature that is wanted is supported.
I, personally, want all the features. And I still can't see why your
solution would be simpler in any way; not regarding configuration, and not
regarding maintenance. But perhaps we leave it at that, we are very much
off topic already.
2. Because proxies might interfere. How many websites use Content
Negotiation for language settings and turn out to be wrong ?

I know of some that use CN and none of them has done it wrong yet.
I did not do any really *thorough* testing on them (with several UAs and
settings), though. I just don't have the time.
3. Because it's much faster. Multiviews are known to cause
considerable overhead. You're right that this shouldn't matter much on
low-profile sites; but once you enter the more heavy stuff, every
single bit matters and you will have far better ways to spend your
memory resources.

As I said, I don't perceive, and can't think of, a considerable performance
loss, no matter the size of the site. Mainly because large sites need fast
servers anyway.
4. Because web servers might use a different HTTP specification than
the visiting browser, so possible HTTP version conflicts arise.

Doesn't apply here.
5. Because you're *fully* browser-independent. Multiviews rely on
browser information and the preferred order of the file extensions
they request. Browsers have errors in that regard.

Doesn't apply here.
http://www.debian.org/intro/cn.en.html says:
"We also suggest you use lynx when testing. It is the only browser we
have found to comply 100% with the HTTP specifications for content
negotiation."

http://en.wikipedia.org/wiki/Criticisms_of_Internet_Explorer says:
"Internet Explorer does not fully support HTTP/1.1 content
negotiation"

Both quotes are as ambiguous as it gets, though debian.org (which would
have been one of my examples of how CN works just fine for me) at least
goes into the details. But neither quote applies to the particular
problem we are discussing here either. Your fears really are unfounded :)


Regards,
PointedEars
 
Bart Van der Donck

Thomas said:
CN can be configured so that only the feature that is wanted is supported.
I, personally, want all the features. And I still can't see why your
solution would be simpler in any way; not regarding configuration, and not
regarding maintenance.

It is simpler because:
- You only need one line: "AddHandler cgi-script .js"
- The meaning of this line is very simple: "don't serve this as a
regular js file; a CGI must output the js code"

If you want to explain how Content Negotiation works, you need half a
manual to cover it.
But perhaps we leave it at that, we are very much
off topic already.

You're right. This discussion belongs in alt.apache.configuration.
I know of some that use CN and none of them has done it wrong yet.

Perhaps you should visit some more then.
As I said, I don't perceive and I can't think of a considerable performance
loss, no matter the size of the site.

Yes, the performance loss is considerable. Of course, this must be
viewed in the right context. One mosquito weighs 0.1 grams, but
10,000 mosquitoes make a kilo.
Mainly because large sites need fast servers anyway.

That would be ridiculous to deny.
Doesn't apply here.

Yes it does - it applies anywhere. The web server might use HTTP/1.1
and the browser HTTP/1.0.
Doesn't apply here.

It certainly does. The weakness is fundamental: your solution depends
on browser information, and you have no control over (1) whether it's
correct, (2) whether the data have been transferred correctly, (3)
whether the browser is defective, or (4) whether firewalls or proxies
have altered or blocked the data. You can only "hope" that your
information is correct; this fact alone makes my solution better, as
it does not have this uncertainty.
Both quotes are as ambiguous as it can get, with debian.org (which would
have been one of my examples on how CN works just fine with me) is at least
going into the details.

I could at most live with a statement that Wikipedia isn't always
reputable, though I'd say that is not the case here. I don't agree
with calling Debian advisories "as ambiguous as it can get", and I
doubt that many in this group would agree with you on that.
Your fears really are unfounded :)

There's a core of truth in them.

I'm not saying CN is bad as such.
 
Thomas 'PointedEars' Lahn

(For brevity and appreciation of the potential reader, I am leaving out
everything now that is not related to the problem at hand.)
Thomas said:
Bart said:
Thomas 'PointedEars' Lahn wrote:
Bart Van der Donck wrote:
For the OP's question, I would still counsel using
AddHandler cgi-script .js
in a specific directory holding his perl-generated javascript files.
That would impose an unnecessary restriction, where
AddHandler cgi-script .pl
Options +MultiViews
would not. So in what way would the former be better, please?
[...]
4. Because web servers might use a different HTTP specification than
the visiting browser, so possible HTTP version conflicts arise.
Doesn't apply here.

Yes it does - it applies anywhere. The web server might use HTTP/1.1
and the browser HTTP/1.0.

First, there can be no HTTP communication that uses two different versions
of HTTP. Either both the client and the server support HTTP/1.1 or it's
going to be HTTP/1.0 in both directions (HTTP/0.9 being unlikely on the Web;
I'd like to see an otherwise usable Web client that does not support
HTTP/1.1 yet, BTW.)

Second, the possible problem of different HTTP version support does *not*
apply here. Content Negotiation was not introduced with HTTP/1.1, nor is
HTTP/1.1 an absolute requirement of it. CN has many different aspects. The
one that is used here does not rely on the (Content-* request) HTTP headers.

If you have only one file with prefix `foo.js' it does not matter what HTTP
version the client supports. As long as CN is supported and enabled on the
server, it "knows" that it should pick foo.js.pl (and pass it to the CGI
module) if the request says `foo.js'.
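As a concrete sketch of that arrangement (file and directory names hypothetical):

```apache
# httpd.conf or .htaccess for the scripts directory
Options +MultiViews +ExecCGI
AddHandler cgi-script .pl

# On disk:      /scripts/foo.js.pl   (Perl CGI that prints JavaScript)
# Requested as: /scripts/foo.js
# mod_negotiation matches foo.js against foo.js.*, picks foo.js.pl,
# and the cgi-script handler executes it.
```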
Doesn't apply here.

It certainly does. The weakness is fundamental: your solution depends
on browser information (1) [...]

Same here. I am beginning to get the impression that you do not really know
what you are talking about. BTDT, WFM.
I could at most live with a statement that Wikipedia isn't always as
reputable;

My bad, they do explain their statement, and the explanation is sound.
Unfortunately, you had not posted it:

| Internet Explorer does not fully support HTTP/1.1 content negotiation,
| because the browser does not specify, in its requests, what character
| encodings it can accept.

The accepted character encoding is *completely irrelevant* here.


PointedEars
 
