
Getting URL content and storing it as a variable


vbachtiar

Hi, I am trying to get the content of a URL and store it as a variable for further processing. Here's my code:

<html>
<head>
<script type="text/javascript">
  function readFile(url) {
    pageRequest = new XMLHttpRequest();
    pageRequest.open('GET', url, false);
    pageRequest.send(null);
    return pageRequest.responseText;
  }
</script>
</head>
<body>
<script type="text/javascript">
  txt = readFile('http://maps.googleapis.com/maps/api/elevation/xml?locations=39.7391536,-104.9847034|36.455556,-116.866667&sensor=false');
  document.write(txt);
</script>
</body>
</html>

Now, it only works in IE; whenever I open it in Firefox, the page is blank. I googled this and found that 'new XMLHttpRequest()' should work in Firefox, so I am confused as to why the code doesn't work. Is there another way of doing this that is cross-browser compatible? Thanks in advance for any help!


Have you checked the response format from Google? It looks like you're requesting XML:

http://maps.googleapis.com/maps/api/elevation/xml?locations=39.7391536,-104.9847034|36.455556,-116.866667&sensor=false

To verify, just put that URL in your web browser. Try changing the last line to

return pageRequest.responseXML;

http://www.w3schools.com/ajax/ajax_xmlhttp...st_response.asp

You will probably want to look up how to parse XML in JavaScript, if that is indeed what is causing the problem.
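For example, here is one quick way to pull values out of that response once you have the text. This is only a sketch using a plain regular expression rather than a real XML parser, and the sample string below just mimics the shape of the Elevation API's XML response:

```javascript
// Sketch: pull the <elevation> values out of the response text with a
// regular expression. Not a substitute for a real XML parser, but it
// shows the idea. The sample string only mimics the API's shape.
function extractElevations(xmlText) {
  var values = [];
  var re = /<elevation>([^<]+)<\/elevation>/g;
  var match;
  while ((match = re.exec(xmlText)) !== null) {
    values.push(parseFloat(match[1]));
  }
  return values;
}

var sample = '<ElevationResponse><status>OK</status>' +
  '<result><elevation>1608.6379395</elevation></result>' +
  '<result><elevation>-50.7890358</elevation></result>' +
  '</ElevationResponse>';
console.log(extractElevations(sample)); // [ 1608.6379395, -50.7890358 ]
```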


Doh. I thought for some reason Google Maps was exempt, because I was using it here: http://www.seasonsnh.com/directions.html but then I realized I did it with an iframe. So yes, I'm not sure why it's working in IE (could you show us what pageRequest.responseText returns?), but you will have to use a server-side language (like PHP with cURL) to get the content of that URL, parse it, and make it available to JavaScript. One way would be to have the page make an AJAX request on document ready to a PHP script you've set up that would make the cURL request; the script would then return the response back to your JavaScript, which could do stuff with it.
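To make that round trip concrete, here is a rough sketch of the flow. The "proxy.php" name is hypothetical, and the transport is passed in as a callback so the shape of the flow can be followed without a browser; in a real page the transport would wrap an XMLHttpRequest to the same-origin proxy script:

```javascript
// Sketch of the proxy round trip: JavaScript -> proxy -> remote URL.
// "proxy.php" is a hypothetical script name; the transport is injected
// as a callback so this reads (and runs) without a browser.
function fetchViaProxy(targetUrl, transport, onDone) {
  // The remote URL rides along as a query-string parameter.
  var proxied = 'proxy.php?url=' + encodeURIComponent(targetUrl);
  transport(proxied, function (responseText) {
    onDone(responseText); // e.g. parse the XML, then update the page
  });
}

// Stand-in transport for illustration: pretend the server answered "OK".
fetchViaProxy('http://example.com/data.xml', function (url, cb) {
  cb('OK');
}, function (text) {
  console.log(text); // OK
});
```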


Thanks for your replies. pageRequest.responseText returns:

OK 39.7391536 -104.9847034 1608.6379395 36.4555560 -116.8666670 -50.7890358

I tried using 'return pageRequest.responseXML;' as initially suggested:

<html>
<head>
<script type="text/javascript">
function readContent(url) {
	if (window.XMLHttpRequest) {
		xmlhttp = new XMLHttpRequest();
	} else {
		xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
	}
	// third argument false = synchronous, so the response is
	// available when send() returns
	xmlhttp.open("GET", url, false);
	xmlhttp.send();
	return xmlhttp.responseXML;
}
</script>
</head>
<body>
<script type="text/javascript">
xml = readContent('http://maps.googleapis.com/maps/api/elevation/xml?locations=39.7391536,-104.9847034|36.455556,-116.866667&sensor=false');
var txt = '';
x = xml.getElementsByTagName('lat');
for (i = 0; i < x.length; i++) {
	txt = txt + x[i].childNodes[0].nodeValue + "<br />";
}
document.write(txt);
</script>
</body>
</html>

Again, it works fine in IE but not in Firefox. I'm actually a beginner at this stuff, so it's a bit difficult for me to understand your explanation. Furthermore, to explain my situation: I have no access to a server/webserver. I am running my .html file from my local disk. This is for a university project, so I need to get the .html file "working" (as in, I double-click it and it does what I want it to do), then give it to my supervisor, who will give it to the IT guys to put on my uni webpage. (At least that's my understanding of what will happen...)


If it's eventually going to be on a web server, then you might as well plan for that. IE uses different security settings on an online page vs. a local file. It may work locally in IE, but it's not going to work once it's online, because of the same limitation that Firefox carries over to local files.

To get around this, people use a server proxy, which is just an extra script sitting on the server. The way that works is that your JavaScript sends a request to the proxy, which is allowed since it's on the same server. The proxy reads the URL you want to look up and fetches its content, which it's allowed to do regardless of where the content is. When it has the response, it sends it back to your JavaScript. So the proxy is just a middleman between JavaScript and the remote content, there to get around the security restrictions.

Since it depends on what the server supports, you may want to ask your IT group what you can use. Tell them you need a server-side proxy script to act as a workaround for cross-domain AJAX requests. They should be able to understand that request and direct you to something if they have it available. At a minimum, they should be able to tell you what languages you can use to write your own.

A very basic proxy in PHP that does no error checking only needs this much code:

<?php
$file = isset($_GET['url']) ? $_GET['url'] : '';
if (!empty($file))
{
  echo @file_get_contents($file);
}
?>

You would call that proxy like this:

proxy.php?url=http%3A%2F%2Fwww.google.com

The output would be the source code of the page at www.google.com.
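The encoding matters: the target URL has to go through encodeURIComponent so its own "?" and "&" don't get mixed into the proxy's query string. A small sketch of building that call from JavaScript:

```javascript
// Build the proxy request URL. The target URL must be percent-encoded
// so its own "?" and "&" don't get confused with proxy.php's own
// query-string parameters.
function proxyUrl(target) {
  return 'proxy.php?url=' + encodeURIComponent(target);
}

console.log(proxyUrl('http://www.google.com'));
// proxy.php?url=http%3A%2F%2Fwww.google.com
```

In this thread's case the Elevation API URL contains an "&", which is exactly why the encoding step is needed; without it, sensor=false would arrive as a parameter of proxy.php instead of staying part of the Google URL.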
