Accessing Elements and Attributes from a Live Feed XML - javascript

I am trying to access the time until the next bus arrives at a given stop for the Asheville, NC live-feed bus transit system, but I keep getting two console errors:
"time is not defined"
and
"Cannot read property 'getElementsByTagName' of undefined"
You can use "470" as an example stopID to see the XML file.
I have made sure the right stop ID is being appended, although I am not sure I am adding the ID onto the URL correctly.
If the element is nested inside another element, is that an issue?
var feedURL = "http://webservices.nextbus.com/service/publicXMLFeed?command=predictions&a=art&stopId=" + stopID;
var xhttp = new XMLHttpRequest();
xhttp.onreadystatechange = function () {
    if (this.readyState == 4 && this.status == 200) {
        attribute(this);
    }
};
xhttp.open("GET", feedURL, true);
xhttp.send();

function attribute(feedURL) {
    var y;
    var xmlDoc = xml.responseXML;
    var time = "";
    y = xmlDoc.getElementsbyTagName('prediction');
    time = x.getAttribute('minutes');
}
console.log(time);
I expect to get the number of minutes until the bus arrives at the given stop.

You are trying to log the value of the variable time outside of the function in which it is defined. There are also three other bugs: the parameter is named feedURL but the body reads from an undefined variable xml, the method is getElementsByTagName (capital B), and you read the attribute from an undefined variable x instead of one of the matched elements. With all of that fixed:
function attribute(xml) {
    var xmlDoc = xml.responseXML;
    var predictions = xmlDoc.getElementsByTagName('prediction');
    var time = predictions[0].getAttribute('minutes');
    console.log(time);
}
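The feed usually returns several prediction elements, one per upcoming bus, so you may want every minutes value rather than just the first. A minimal sketch under that assumption; the helper name listPredictionMinutes is mine, and it takes the parsed XML document (xhr.responseXML) as an argument so it stays independent of the request object:

```javascript
// Collect the 'minutes' attribute from every <prediction> element.
// Pass in this.responseXML from the onreadystatechange handler.
function listPredictionMinutes(xmlDoc) {
    var predictions = xmlDoc.getElementsByTagName('prediction');
    var minutes = [];
    for (var i = 0; i < predictions.length; i++) {
        minutes.push(predictions[i].getAttribute('minutes'));
    }
    return minutes;
}
```

Called as listPredictionMinutes(this.responseXML) from the handler, it returns something like ["5", "17"] for a stop with two upcoming buses.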

Related

adding JSON in javascript with multiple urls

I need to parse several JSON files, extract different values, and then compare those values. I'm having trouble getting more than one JSON file to show up, though. I think the request.onload only works once in the for loop, so maybe that's the reason.
var myArray = [7138, 6237];
for (i = 0; i < myArray.length; i++) {
    //var id = 55;
    var head = "https://developer.trimet.org/ws/V2/arrivals/locIDs/";
    var tail = "/appID/30BE7218095886D573C04A41C/xml='true'";
    var url = head + myArray[i] + tail;
    //console.log(url);
    var request = new XMLHttpRequest();
    request.open('GET', url);
    request.responseType = 'json';
    request.send();
    request.onload = function() {
        var arrivalData = request.response;
        console.log(arrivalData.resultSet);
    }
}
Explanation of the code: the array myArray contains two bus-stop IDs which are found with another piece of code (I am making a small app to find the 10 closest bus stops and then tell the user how long before the next bus arrives at each one; to test it out, I am just using two constant IDs). These IDs are plugged into a URL that returns JSON describing the bus schedule for that stop. I want to extract the JSON and save it as a separate JSON object within the code. I think the current code does this, but it only seems to work once.

In the end, the for loop will add the arrival times to an array, these times will be compared to see which comes sooner, and then the soonest arrival time and its corresponding bus-stop ID will be reported. Finally, I want to turn this into a function that can take any array of bus-stop IDs, so that I can find the soonest bus arrival time for any set of bus stops.

If you want to look at the arrival times, you can follow the URL and see the JSON: arrival times can be either "estimated" or "scheduled", and the values are in milliseconds since Jan 1 1970. If someone could help me just be able to access the JSONs outside of the request.onload function, I'd be very grateful.
XMLHttpRequest works asynchronously, while the for loop is synchronous, which is why you're facing this problem. I think the piece of code below solves it:
var myArray = [7138, 6237];
myArray.forEach(function (id) {
    //var id = 55;
    var head = "https://developer.trimet.org/ws/V2/arrivals/locIDs/";
    var tail = "/appID/30BE7218095886D573C04A41C/xml='true'";
    var url = head + id + tail;
    //console.log(url);
    var request = new XMLHttpRequest();
    request.open('GET', url);
    request.responseType = 'json';
    request.send();
    request.onload = function() {
        var arrivalData = request.response;
        console.log(arrivalData.resultSet);
    }
})
Adding as an answer too
var myArray = [7138, 6237];
//Define results globally
var results = [];
for (i = 0; i < myArray.length; i++) {
    //var id = 55;
    var head = "https://developer.trimet.org/ws/V2/arrivals/locIDs/";
    var tail = "/appID/30BE7218095886D573C04A41C/xml='true'";
    var url = head + myArray[i] + tail;
    //console.log(url);
    var request = new XMLHttpRequest();
    //Register the onload handler before send()
    request.onload = function() {
        //Use `this` rather than `request`: by the time the handler
        //runs, the loop has reassigned `request` to the last instance
        var arrivalData = this.response;
        results.push(arrivalData.resultSet);
    };
    request.open('GET', url);
    request.responseType = 'json';
    request.send();
}
While this should work, I think your problem would be better solved with promises, which would also make your code much more understandable.
Have a read of the Promise API, and especially Promise.all().
EDIT
Just a note on the results array: since XMLHttpRequest and onload are asynchronous, the results array will be empty if you access it directly after running that for loop. This is one of the reasons promises are so powerful.
With the current approach you have no way of knowing when both requests have finished, short of constantly checking the length of the results array in a while loop or something, which would block the thread and make everything much worse.
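To make the promise suggestion concrete, here is a sketch of one way to do it. The getJSON wrapper and soonestArrival helper are my names, and the resultSet/arrival/estimated/scheduled shape is assumed from the question's description of the TriMet payload, so verify it against a real response:

```javascript
// Wrap one XMLHttpRequest in a Promise.
function getJSON(url) {
    return new Promise(function (resolve, reject) {
        var request = new XMLHttpRequest();
        request.open('GET', url);
        request.responseType = 'json';
        request.onload = function () { resolve(request.response); };
        request.onerror = function () { reject(new Error('Request failed: ' + url)); };
        request.send();
    });
}

// Pure helper: given one resultSet per stop, find the soonest arrival.
// Prefers the live "estimated" time, falling back to "scheduled";
// times are milliseconds since Jan 1 1970, so smaller means sooner.
function soonestArrival(resultSets) {
    var best = null;
    resultSets.forEach(function (resultSet, stopIndex) {
        (resultSet.arrival || []).forEach(function (arrival) {
            var t = arrival.estimated || arrival.scheduled;
            if (t && (best === null || t < best.time)) {
                best = { time: t, stopIndex: stopIndex };
            }
        });
    });
    return best;
}

// Fire all requests at once, then wait for every one to finish.
function findSoonest(stopIds, head, tail) {
    return Promise.all(stopIds.map(function (id) {
        return getJSON(head + id + tail);
    })).then(function (responses) {
        return soonestArrival(responses.map(function (r) { return r.resultSet; }));
    });
}
```

findSoonest([7138, 6237], head, tail).then(function (best) { ... }) runs the callback only after both requests are done, which is exactly the guarantee the results-array version lacks.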

How to delay a method until another is finished first , javascript?

I'm currently working on a project for school using a Pokémon API to display the information needed to evolve a Pokémon (please note that I'm completely new to JavaScript and HTML).
Link: http://pokeapi.co/docsv2/
The website asks the user for a name, and that name is used to get a URL for the main information I'm looking for.
For example: if someone enters pikachu, the program requests the object for pikachu, which contains the URL for pikachu's evolution chain, and that URL provides the main information for the website.
Currently the code looks like this:
var pokemon = new XMLHttpRequest();
var name = prompt("Whats the name of the pokemon you have?").toLowerCase();
var url = "http://pokeapi.co/api/v2/pokemon-species/" + name;
var url2;
pokemon.onreadystatechange = function () {
    if (pokemon.readyState == 4 && pokemon.status == 200) {
        var myArr = JSON.parse(pokemon.responseText);
        var url2 = myArr.evolution_chain;
    }
}
pokemon2.onreadystatechange = function () {
    if (pokemon2.readyState == 4 && pokemon2.status == 200) {
        var myArr2 = JSON.parse(pokemon2.responseText);
        console.log(myArr2.chain.species.name);
    }
}
var pokemon2 = new XMLHttpRequest();
pokemon2.open("GET", url2, true).done(onreadystatechange);
pokemon2.send();
pokemon.open("GET", url, true);
pokemon.send();
However, the program doesn't work because both requests fire at the same time; pokemon2 should only be sent after pokemon has finished, since pokemon's response contains the URL that pokemon2 needs.
Does anyone know how to be able to accomplish this?
Many thanks! :).
You can send pokemon2 once pokemon finishes, and register its response handler before sending:
pokemon.onreadystatechange = function () {
    if (pokemon.readyState == 4 && pokemon.status == 200) {
        var myArr = JSON.parse(pokemon.responseText);
        var url2 = myArr.evolution_chain;
        // Now that url2 is known, fire the second request
        var pokemon2 = new XMLHttpRequest();
        pokemon2.onreadystatechange = function () {
            if (pokemon2.readyState == 4 && pokemon2.status == 200) {
                var myArr2 = JSON.parse(pokemon2.responseText);
                console.log(myArr2.chain.species.name);
            }
        };
        pokemon2.open("GET", url2, true);
        pokemon2.send();
    }
}
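For comparison, this kind of "second request depends on the first response" flow is easy to express with fetch and promise chaining. This is a sketch, not tested against the live API: in the real PokeAPI payload the chain URL appears to live at evolution_chain.url rather than at evolution_chain itself, so I've pulled that lookup into a small helper (chainUrl is my name) that you should verify against an actual response:

```javascript
// Extract the evolution-chain URL from a species payload.
// Assumes the shape { evolution_chain: { url: "..." } }.
function chainUrl(species) {
    return species.evolution_chain && species.evolution_chain.url;
}

// Each .then() runs only after the previous request has finished,
// so the second GET always has the URL it depends on.
function baseSpeciesName(name) {
    return fetch('http://pokeapi.co/api/v2/pokemon-species/' + name)
        .then(function (response) { return response.json(); })
        .then(function (species) { return fetch(chainUrl(species)); })
        .then(function (response) { return response.json(); })
        .then(function (chain) { return chain.chain.species.name; });
}
```

baseSpeciesName('pikachu').then(console.log) would then log the first species in the chain.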

AJAX loop, calling function onload vs. manually in the console/code

I have 2 functions:
function find_existing_widgets() {
    xmlHttp = new XMLHttpRequest();
    xmlHttp.open('GET', './_includes/check-load-user-settings.php', true);
    xmlHttp.onreadystatechange = open_existing_widgets;
    xmlHttp.send(null);
}

function open_existing_widgets() {
    if (xmlHttp.readyState == 4 && xmlHttp.status == 200) {
        xmlResponse = xmlHttp.responseXML;
        root = xmlResponse.documentElement;
        var widget_id = root.getElementsByTagName('widget_id');
        var name = root.getElementsByTagName('name');
        var type = root.getElementsByTagName('type');
        var value = root.getElementsByTagName('value');
        for (i = 0; i < name.length; i++) {
            nameText = name.item(i).childNodes[0].nodeValue;
            widgetID = widget_id.item(i).childNodes[0].nodeValue;
            //var value = name.item(i).firstChild.data;
            alert(nameText);
            nameText = nameText.replace(/\s+/g, '_');
            //alert(value); WORKS
            widget_creator();
            load = "./widgets/" + nameText + "/index.php?widget_id=" + widgetID;
            //alert(load); WORKS
            $('.wrapper:first .main_sec').load(load);
            $('.wrapper:first').fadeIn(1000);
        }
    }
}
Four scenarios:
1. Running find_existing_widgets() from the onload attribute of the <body> tag works perfectly: it runs once and shows the data accordingly.
2. Running the same function just before the </body> tag, like <script>find_existing_widgets()</script>, causes an infinite loop.
3. Running it as window.onload = find_existing_widgets(); in the JS code causes the same infinite loop as scenario 2.
4. If I run it manually from the console, it works perfectly.
Why the loop? What's the difference?
I am using AJAX; all the JavaScript can be seen in Chrome's dev tools, and once there you'll see the execution that is looping.
Example:
josesebastianmanunta.com/animated3/login.php (user: smanunta, pass: password)

How can I iterate through a function with for loop?

I want to pass the y variable to okayid, but there seems to be a problem with the looping. The loop works fine with the first use of y in okay.item(y), but it is not looping through okayid.item(y). It seemed like a scope problem to me, but I am not sure.
var okay = document.getElementsByClassName("Okay");
var okayid = document.getElementsByClassName("OkayID");
var numberOkays = okay.length;
for (y = 0; y <= numberOkays; y++) {
    okay.item(y).onclick = function () {
        xmlhttp = new XMLHttpRequest();
        xmlhttp.onreadystatechange = function () {
            if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
                alert('vote Sent to picture with id = ' + okayid.item(y).innerHTML);
            }
        };
        xmlhttp.open("GET", "ajax/vote.php", true);
        xmlhttp.send();
    };
}
Here is the html ...
<a class="Link1A Okay" href="#"><span class="OkayID">[id]</span><div class="Vote1A">Okay</div></a>
You've got lots of issues:
1. In the for loop, you don't initialize y with var, so y becomes part of the global scope, which can cause problems.
2. The loop condition y <= numberOkays runs one iteration too far, since valid indexes run from 0 to length - 1, so okay.item(y) is undefined on the last pass.
3. You don't need to retrieve the OkayID elements up front; you can grab the appropriate element inside the onclick handler.
4. By the time a click event fires, the loop has finished and y holds its final value, so okayid.item(y) would always refer to the last element (or, because of problem 2, undefined). Using this refers to the element that was actually clicked, which is effectively what you intended with okay[y].
Here's an updated version of your code, with a link to a working jsFiddle below:
var okay = document.getElementsByClassName("Okay");
for (var y = 0; y < okay.length; y++) {
    okay[y].onclick = function () {
        var idElem = this.getElementsByClassName("OkayID")[0];
        var xmlhttp = new XMLHttpRequest();
        xmlhttp.onreadystatechange = function () {
            if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
                alert('vote Sent to picture with id = ' + idElem.innerHTML);
            }
        };
        xmlhttp.open("GET", "ajax/vote.php", true);
        xmlhttp.send();
    };
}
jsFiddle
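The capture problem behind issue 4 is worth seeing in isolation. In ES2015 you can also keep the indexed style by declaring the loop variable with let, which gives each iteration its own binding. A minimal demonstration (the helper names are mine):

```javascript
// With var, every callback closes over the SAME y,
// which holds its final value once the loop ends.
function makeCallbacksWithVar(n) {
    var callbacks = [];
    for (var y = 0; y < n; y++) {
        callbacks.push(function () { return y; });
    }
    return callbacks;
}

// With let, each iteration gets its OWN y binding,
// so every callback remembers the index it was created with.
function makeCallbacksWithLet(n) {
    var callbacks = [];
    for (let y = 0; y < n; y++) {
        callbacks.push(function () { return y; });
    }
    return callbacks;
}

makeCallbacksWithVar(3).map(function (f) { return f(); }); // [3, 3, 3]
makeCallbacksWithLet(3).map(function (f) { return f(); }); // [0, 1, 2]
```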

Browser crashes after 10-15 mins

In my app I'm displaying 10 charts (from dygraphs) to monitor data. To populate the charts I get data from my server by sending an AJAX request to 4 servlets every 5 seconds. After 10-15 minutes (I don't know the exact time) my browser crashes with the "Aw, Snap!" page. What could be the reason? Is it the JavaScript, or is it because I'm sending a request every 5 seconds?
Browsers tested: Firefox and Chrome.
Note: when I refresh the browser after the crash, it works fine again for another 10-15 minutes.
JS code:
var i = 0;
var loc = new String();
var conn = new String();
var heapUsage = new String();
var cpuUsage = new String();
var thrdCnt = new String();
var heapUsageConsole = new String();
var cpuUsageConsole = new String();
var thrdCntConsole = new String();
var user = new String();
var MemTotal = new String();

function jubking() {
    var xmlhttp;
    if (window.XMLHttpRequest) {
        xmlhttp = new XMLHttpRequest();
    } else {
        xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
    }
    var url = "MonitorDBServlet";
    xmlhttp.open("POST", url, false);
    xmlhttp.send(null);
    var str = xmlhttp.responseText;
    var strArr = str.split(",");

    url = "MonitorTomcatServlet";
    xmlhttp.open("POST", url, false);
    xmlhttp.send(null);
    var appstr = xmlhttp.responseText;
    var appArr = appstr.split(",");

    url = "MonitorConsoleTomcatServlet";
    xmlhttp.open("POST", url, false);
    xmlhttp.send(null);
    var appstrConsole = xmlhttp.responseText;
    var appArrConsole = appstrConsole.split(",");

    url = "CpuMemoryServlet";
    xmlhttp.open("POST", url, false);
    xmlhttp.send(null);
    var statesStr = xmlhttp.responseText;
    var states = statesStr.split(",");

    if (i > 30) {
        loc = loc.substring(loc.indexOf("\n") + 1);
        loc += i + "," + strArr[0] + "," + strArr[1] + "\n";
        //--- Do same thing for all other vars
    } else {
        loc += i + "," + strArr[0] + "," + strArr[1] + "\n";
        //--- Do same thing for all other vars
    }

    document.getElementById("dbSize").innerHTML = strArr[3];
    document.getElementById("HeapMemoryUsageMax").innerHTML = appArr[1];
    document.getElementById("HeapMemoryUsageMaxConsole").innerHTML = appArrConsole[1];

    g = new Dygraph(document.getElementById("dbLocks"),
        ",locksheld,lockswait\n" + loc + "");
    g = new Dygraph(document.getElementById("activeConnection"),
        ",Connections\n" + conn + "");
    g = new Dygraph(document.getElementById("example2"),
        ",heapUsage\n" + heapUsage + "");
    g = new Dygraph(document.getElementById("example3"),
        ",cpuUsage\n" + cpuUsage + "");
    g = new Dygraph(document.getElementById("example4"),
        ",thread,peakThread\n" + thrdCnt + "");
    g = new Dygraph(document.getElementById("example6"),
        ",heapUsage\n" + heapUsageConsole + "");
    g = new Dygraph(document.getElementById("example7"),
        ",\n" + cpuUsageConsole + "");
    g = new Dygraph(document.getElementById("example8"),
        ",thread,peakThread\n" + thrdCntConsole + "");
    g = new Dygraph(document.getElementById("cpuStates"),
        ",user,system,nice,idle\n" + user + "");
    g = new Dygraph(document.getElementById("memStates"),
        ",MT,MF,B,C,ST,SF\n" + MemTotal + "");

    i = i + 1;
    setTimeout("jubking()", 5000);
}
You can use about:crashes in FF to view the specific reason for your crash. As mentioned by others, you could be leaking memory if you're caching data (assigning it to a variable) returned by your AJAX calls and never clearing it when the next call is made.
Edit:
Just saw your comment - 1,923,481 K is definitely too much; you're leaking data somewhere. What OS are you running? If you run FF from a console on *nix, you usually get some form of dump in the console when something goes wrong (not sure about Windows).
You could also try polling less frequently and stepping through the script with Firebug or Chrome's debugger to see what's happening. Worst case, start commenting things out until you figure out exactly what is making your app crash, and then figure out a way to fix it :)
I suspect that your dygraphs usage is, as you note in your comments, the source of your trouble. It looks like you're binding new graphs over and over again when you only want to update the data; using a moving window over the data would also help. Try reworking your updater along the lines of this pseudo-JavaScript:
var graphs = {
    dbLocks: {
        graph: new Dygraph(/* ... */),
        data: [ ]
    },
    activeConnection: {
        graph: new Dygraph(/* ... */),
        data: [ ]
    },
    // etc.
};

var DATA_WINDOW_SIZE = 1000; // Or whatever works for you.

function update(which, new_data) {
    var g = graphs[which];
    g.data.push(new_data);
    if (g.data.length > DATA_WINDOW_SIZE)
        g.data.shift();
    g.graph.updateOptions({ file: g.data });
}

function jubking() {
    // Launch all your AJAX calls and bind a callback to each
    // one. The success callback would call the update() function
    // above to update the graph and manage the data window.

    // Wait for all the above asynchronous AJAX calls to finish and
    // then restart the timer for the next round.
    setTimeout(jubking, 5000);
}
The basic idea is to keep a window over your data with a reasonable maximum width, so that the data can't grow to chew up all your memory: as you push a new data point onto the end of your data cache, you drop an old one off the other end once you hit your maximum comfortable size.
You can find techniques for waiting for several asynchronous AJAX calls to finish over here: How to confirm when more than one AJAX call has completed? (disclosure: yes, that's one of my other answers).
The answer above advocates re-using your Dygraph object and calling g.updateOptions({file:...}) to reduce memory usage. This is a great way to do it.
The other way is to call g.destroy() before you redefine the Dygraph object. This will make dygraphs clear out all of its internal arrays and DOM references. Example:
g = new Dygraph(...);
g.destroy();
g = new Dygraph(...);
Read more here: http://blog.dygraphs.com/2012/01/preventing-dygraphs-memory-leaks.html
