DanielEKFA
Hey hey
I'm having a lot of problems with this. My brother asked me to help out with
his art school's web page: he really wants random images from the web shown
in the background.
My idea is this: use the CNN Top Stories RSS feed to harvest keywords, pick
a random keyword from that harvest to search Google, collect links from the
results, follow random links to find links to images, and finally load those
images into the art page's background.
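If it helps, the harvesting step I have in mind looks roughly like this (just a sketch: `harvestKeywords`, `randomKeyword`, and the stopword list are names I made up, and the titles array would come from the parsed RSS feed):

```javascript
// Rough sketch of the keyword-harvesting step. The function names and
// the stopword list are made up; the titles array would come from the
// parsed CNN Top Stories RSS feed.
var STOPWORDS = ["the", "and", "for", "with", "from", "that", "this"];

function harvestKeywords(titles) {
  var keywords = [];
  for (var i = 0; i < titles.length; i++) {
    // Split each title into lowercase words, dropping punctuation.
    var words = titles[i].toLowerCase().split(/[^a-z]+/);
    for (var j = 0; j < words.length; j++) {
      var w = words[j];
      // Keep reasonably long, non-stopword, not-yet-seen words.
      if (w.length > 3 && STOPWORDS.indexOf(w) === -1 &&
          keywords.indexOf(w) === -1) {
        keywords.push(w);
      }
    }
  }
  return keywords;
}

function randomKeyword(titles) {
  var keywords = harvestKeywords(titles);
  return keywords[Math.floor(Math.random() * keywords.length)];
}
```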
My first thought was to use an XMLHttpRequest object, as I've played around
with it before and know what it can do. The trouble is that Firefox requires
some kind of script signing before it will retrieve data from a different
domain, and I don't really understand how that works. Konqueror gives no
warnings but fails to actually load the external content, perhaps because of
the same restriction. On top of that, IE seems to work but pops up a warning
dialog telling the user that a script is trying to access external content,
which is unsafe. I wouldn't click OK if I saw that kind of dialog on another
site, so I can't expect anyone to do it on this one either.
Okay, so I looked for another way and read about iframes, thinking they
would be perfect, especially given the .links and .images properties that
already exist on a document object (making a custom lexer/parser pretty
much superfluous). The iframes themselves work great in Konqueror, Firefox,
and IE, loading content happily. The problem, as I understand it, is that
iframes aren't document objects but elements, and therefore have neither
a .links nor an .images property. The Firefox console gives "property not
defined" errors on document.getElementById('myIframe').document and on both
document.frames['myIframe'].document and window.frames['myIframe'].document.
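For reference, here's roughly what I'm trying ('myIframe' is just my made-up id); I've also seen a contentWindow property mentioned, but I assume the same cross-domain restriction would block that too:

```html
<iframe id="myIframe" src="http://www.example.com/"></iframe>
<script type="text/javascript">
  var frame = document.getElementById('myIframe');
  // What I tried -- both give "property not defined" style errors:
  //   frame.document
  //   window.frames['myIframe'].document
  // The contentWindow property does exist on the iframe element,
  // but for a cross-domain src I expect the browser to block access:
  var doc = frame.contentWindow.document;
  alert(doc.images.length + ' images, ' + doc.links.length + ' links');
</script>
```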
I could actually do fine without either of those properties if I could just
find a way to import the external HTML and make it accessible to script, so
that I can parse it with a state machine and extract what I need.
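Something like this is what I have in mind for the extraction step, if I can just get the raw HTML into a string (a crude sketch only: `extractImageUrls` is my own name for it, and the scanning is simplified):

```javascript
// Crude scanner: walk the raw HTML string, find each <img> tag, and
// collect its src attribute. Works on a plain string, no DOM needed.
// Simplified -- a real version would need more states (comments,
// quoted ">" characters, etc.).
function extractImageUrls(html) {
  var urls = [];
  var lower = html.toLowerCase();
  var i = 0;
  while (true) {
    // State 1: find the start of the next <img> tag.
    var tag = lower.indexOf('<img', i);
    if (tag === -1) break;
    // State 2: find the end of that tag.
    var end = lower.indexOf('>', tag);
    if (end === -1) break;
    // State 3: pull the src attribute out of the tag's attributes.
    var attrs = html.substring(tag, end);
    var m = attrs.match(/src\s*=\s*["']?([^"'\s>]+)/i);
    if (m) urls.push(m[1]);
    i = end + 1;
  }
  return urls;
}
```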
I've thought about a PHP solution too, but haven't looked into it thoroughly
yet, as I understand that very few servers allow a local PHP script to read
files from other domains. Or is that wrong?
I'm eager for any kind of tip, as long as it's cross-platform and not based
on Java or anything else (Flash, etc.) that would require visitors to have
an add-on installed. The site should work in Safari, IE (Mac & PC),
Konqueror, and Firefox. I only have the rest of the day to get this working
(approx. 10 hours).
Thanks in advance,
Daniel