My partner and I are trying to get a domain that I own to communicate over HTTP with an iOS app written in Objective-C. He is using the code provided in this link: Sending an HTTP POST request on iOS.
He can do a GET to read the data in my .txt page, but when he performs a PUT to write to that file (so that I can read the data back), it fails. We are both rather new to HTTP, so it is possible we are missing something. One concern is that he doesn't have the privileges to write to that file. Any advice would help, thanks!
Here is the JavaScript I am using on my side. I added a header to try to resolve the CORS issue.
(function () {
    window.onload = function () {
        httpGetAsync("http://students.washington.edu/bharatis/distances.txt", processData);
        //alert("hello inside onload");
        document.getElementById("first").innerHTML = leader1;
        document.getElementById("second").innerHTML = leader1;
        document.getElementById("third").innerHTML = leader1;
        //window.onbeforeunload = update;
    }

    function processData(responseText) {
        //alert(responseText);
        var txt = "";
        // responseText is a plain string, so it has to be parsed before
        // getElementsByTagName can be used on it.
        var doc = new DOMParser().parseFromString(responseText, "text/xml");
        var x = doc.getElementsByTagName('Distance'); // Talk to alex about
        for (var i = 0; i < x.length; i++) {
            txt += x[i].childNodes[0].nodeValue;
        }
        var result = parseFloat(txt); // JavaScript has parseFloat, not parseDouble
        alert(result);
    }

    function httpGetAsync(theUrl, callback) {
        var xmlHttp = new XMLHttpRequest();
        xmlHttp.onreadystatechange = function () {
            if (xmlHttp.readyState == 4 && xmlHttp.status == 200)
                callback(xmlHttp.responseText);
        }
        xmlHttp.open("GET", theUrl, true); // true for asynchronous
        // Note: Access-Control-Allow-Origin is a *response* header the server
        // must send; setting it on the request has no effect.
        xmlHttp.setRequestHeader("Access-Control-Allow-Origin", "*");
        xmlHttp.send(); // a GET request has no body
    }
})();
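One way to separate the server-permission question from the iOS code is to try the same PUT from JavaScript first. This is only an illustrative sketch (the body value is a placeholder, and it is best run from a page on the same students.washington.edu domain so CORS preflighting doesn't get in the way); if the server answers with 403 or 405, the problem is server configuration and write permissions rather than the Objective-C code.

function httpPutAsync(theUrl, body, callback) {
    var xmlHttp = new XMLHttpRequest();
    xmlHttp.onreadystatechange = function () {
        // Report whatever final status comes back (403/405 would point
        // to the server not permitting PUT on that path).
        if (xmlHttp.readyState == 4)
            callback(xmlHttp.status, xmlHttp.responseText);
    };
    xmlHttp.open("PUT", theUrl, true);
    xmlHttp.setRequestHeader("Content-Type", "text/plain");
    xmlHttp.send(body); // the body becomes the new file contents, if the server allows it
}

// Placeholder body just for the test.
httpPutAsync("http://students.washington.edu/bharatis/distances.txt", "12.5", function (status, text) {
    alert("PUT finished with status " + status);
});

Many static web hosts reject PUT outright, so a 403 or 405 here would confirm the concern about write privileges.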
My code works in Chrome and Safari, but it hangs in FF.
I removed the parts of the code that aren't necessary.
I added console.log calls to show how far the loop gets; the second log, right before the xhr open and send calls, fires fine.
If the open/send calls are present, the loop only runs once; if I remove them, the loop completes successfully.
I'm currently using FF 62 Nightly, but this issue has plagued me since Quantum came out, and I'm now trying to figure out why it doesn't work right.
for (i = 0; i < length; i++) {
    (function(i) {
        // new XMLHttpRequest
        xhr[i] = new XMLHttpRequest();
        // gets machine url from href tag
        url = rows[i].getElementsByTagName("td")[0].getElementsByTagName('a')[0].getAttribute('href');
        // Insert the desired values at the end of each row;
        // will try to make this customizable later as well
        insertVNC[i] = rows[i].insertCell(-1);
        insertSerial[i] = rows[i].insertCell(-1);
        insertVersion[i] = rows[i].insertCell(-1);
        insertFreeDiskSpace[i] = rows[i].insertCell(-1);
        // the fun part: this function takes each url, loads it in the background,
        // retrieves the values needed, and then discards the page once the function is complete;
        // In theory you could add whatever you want without taking significantly longer
        // as long as it's on this page
        console.log(i);
        xhr[i].onreadystatechange = function() {
            if (xhr[i].readyState == 4 && xhr[i].status == 200) {
            }
        };
        // "GET" the url... true means asynchronous
        console.log(url);
        xhr[i].open("GET", url, true);
        xhr[i].send(null);
    })(i); // end IIFE
} // end for loop
I cannot tell you why it gives issues in Firefox, but I would not trust sending off arbitrarily many requests at once in any browser.
I would personally try this instead, since it will not fire off the next request until the previous one has finished:
// The first <td> in each row holds the link (nth-child is 1-based, so use 1, not 0).
const urls = [...document.querySelectorAll("tr>td:nth-child(1) a")].map(x => x.href);
let cnt = 0;
let xhr = new XMLHttpRequest();

function getUrl() {
    console.log(urls[cnt]);
    xhr.open("GET", urls[cnt], true);
    xhr.send(null);
}

xhr.onreadystatechange = function() {
    if (xhr.readyState == 4 && xhr.status == 200) {
        // process xhr.responseText here, then move on to the next url
        cnt++;
        if (cnt < urls.length) getUrl();
    }
};

getUrl();
I have some JavaScript code that I use to retrieve data from a json file and populate a dropdown list.
Everything was working fine.
I added some code and it went into an infinite loop.
I deleted this code, but since then it no longer works on the HTTP server I set up using Python.
HOWEVER it works perfectly fine when I load it on a network server.
I deleted Python, reinstalled and still not working.
Logically it can't be the code, because it works on the network server... I am totally lost. Any and all help much appreciated. I am getting nowhere fast. (I can't work from the network server, so I need this working locally.)
Below is the JavaScript:
// this will hold the data from JSON
var teamSkillsData;
var xhttp = new XMLHttpRequest();
xhttp.onreadystatechange = function() {
    if (this.readyState == 4 && this.status == 200) {
        // retrieve data from the external json file
        var response = JSON.parse(xhttp.responseText);
        teamSkillsData = response.teamSkills;
        var select = document.getElementById("teamList");
        alert("nn");
        // populate the teamList drop down menu
        for (var i = 0; i < teamSkillsData.length; i++) {
            // assign the team names
            var opt = teamSkillsData[i].team;
            var el = document.createElement("option");
            el.textContent = opt;
            el.value = i;
            select.appendChild(el);
        }
        // Typical action to be performed when the document is ready:
        document.getElementById("demo").innerHTML = xhttp.responseText;
    }
};
xhttp.open("GET", 'Data.json', true);
xhttp.send();

function teamChanged(teamSelected) {
    var skills = teamSkillsData[teamSelected].skillset;
    for (var i = 0; i < skills.length; i++) {
        var skillsRequired = skills[i];
        alert(skillsRequired);
    }
}
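One way to narrow down the local-server failure is to log exactly what the Python server returns for Data.json. This is just a hypothetical diagnostic helper (not part of the original code); possible culprits it would expose include a 404 if Data.json isn't in the directory the server was started from, or an error page coming back instead of JSON so that JSON.parse throws.

// Hypothetical diagnostic: fetch Data.json separately and report what comes back.
function checkDataJson() {
    var probe = new XMLHttpRequest();
    probe.onreadystatechange = function() {
        if (probe.readyState == 4) {
            console.log("status:", probe.status,
                        "content-type:", probe.getResponseHeader("Content-Type"));
            console.log("first 200 chars of body:", probe.responseText.substring(0, 200));
        }
    };
    probe.onerror = function() {
        console.log("Network-level error while requesting Data.json");
    };
    probe.open("GET", "Data.json", true);
    probe.send();
}
checkDataJson();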
I'm perplexed by this issue of my xmlhttp requests failing to complete. The issue only arises when I have multiple calls; oddly enough, only the last one completes. I feel as if the first one times out or something. Watching the code in the debugger, the readystatechange function fires a few times, but readyState is always 1, and eventually it jumps out and performs the next call. Is there a way of fixing this? Maybe adding a delay? Any advice is much appreciated.
<script>
    <!--var plant_select = document.createElement("select"); -->
    var datafile = '';
    var xmlhttp = new XMLHttpRequest();
    xmlhttp.open("GET", "http://localhost:8080/res/plants.csv", true);
    xmlhttp.send();
    xmlhttp.onreadystatechange = function()
    {
        if (xmlhttp.status == 200 && xmlhttp.readyState == 4)
        {
            processCSV(xmlhttp.responseText, document.getElementById("plant_select"), "Select Plant");
        }
    }
</script>
</select>
</div>
<div class="menu_element" id="plantType_div">
<select class="Menu" id="plantType_select">
<script>
    <!--var plant_select = document.createElement("select"); -->
    var datafile = '';
    var xmlhttp = new XMLHttpRequest();
    xmlhttp.open("GET", "http://localhost:8080/res/planttypes.csv", true);
    xmlhttp.send();
    xmlhttp.onreadystatechange = function()
    {
        if (xmlhttp.status == 200 && xmlhttp.readyState == 4)
        {
            processCSV(xmlhttp.responseText, document.getElementById("plantType_select"), "Select Plant Type");
        }
    }
</script>
You are using the same global variable for each request: var xmlhttp. Each subsequent request then operates on that same variable, so you only see the last one complete, because it held the last value written before any of the responses came back.
Wrap each instance in a function so you are dealing with locally scoped variables instead of globals.
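A minimal sketch of that idea, reusing the processCSV helper, element ids, and labels from your own code:

function loadCSV(url, selectId, label) {
    // Each call gets its own locally scoped request object.
    var xmlhttp = new XMLHttpRequest();
    xmlhttp.onreadystatechange = function() {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
            processCSV(xmlhttp.responseText, document.getElementById(selectId), label);
        }
    };
    xmlhttp.open("GET", url, true);
    xmlhttp.send();
}

loadCSV("http://localhost:8080/res/plants.csv", "plant_select", "Select Plant");
loadCSV("http://localhost:8080/res/planttypes.csv", "plantType_select", "Select Plant Type");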
You are using the same object every time:
var xmlhttp = new XMLHttpRequest();
Try giving each request its own variable name.
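For instance (these variable names are just illustrative):

var plantsRequest = new XMLHttpRequest();      // for plants.csv
var plantTypesRequest = new XMLHttpRequest();  // for planttypes.csv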
In my app I'm displaying 10 charts (the charts are from dygraphs) to monitor data. To display the charts I fetch data from my server by sending AJAX requests to 4 servlets every 5 seconds. After 10-15 minutes (I don't know the exact time) my browser crashes with "Aw, Snap!". What could be the reason? Is it the JavaScript that is causing it, or is it because I'm sending requests every 5 seconds?
Browsers tested: Firefox and Chrome.
Note: when I refresh the browser after the crash, it works fine again for 10-15 minutes.
JS code:
var i = 0;
var loc = new String();
var conn = new String();
var heapUsage = new String();
var cpuUsage = new String();
var thrdCnt = new String();
var heapUsageConsole = new String();
var cpuUsageConsole = new String();
var thrdCntConsole = new String();
var user = new String();
var MemTotal = new String();

function jubking() {
    var xmlhttp;
    if (window.XMLHttpRequest) {
        xmlhttp = new XMLHttpRequest();
    } else {
        xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
    }

    var url = "MonitorDBServlet";
    xmlhttp.open("POST", url, false);
    xmlhttp.send(null);
    var str = xmlhttp.responseText;
    var strArr = str.split(",");

    url = "MonitorTomcatServlet";
    xmlhttp.open("POST", url, false);
    xmlhttp.send(null);
    var appstr = xmlhttp.responseText;
    var appArr = appstr.split(",");

    url = "MonitorConsoleTomcatServlet";
    xmlhttp.open("POST", url, false);
    xmlhttp.send(null);
    var appstrConsole = xmlhttp.responseText;
    var appArrConsole = appstrConsole.split(",");

    url = "CpuMemoryServlet";
    xmlhttp.open("POST", url, false);
    xmlhttp.send(null);
    var statesStr = xmlhttp.responseText;
    var states = statesStr.split(",");

    if (i > 30) {
        loc = loc.substring(loc.indexOf("\n") + 1);
        loc += i + "," + strArr[0] + "," + strArr[1] + "\n";
        //--- Do same thing for all other vars
    } else {
        loc += i + "," + strArr[0] + "," + strArr[1] + "\n";
        //--- Do same thing for all other vars
    }

    document.getElementById("dbSize").innerHTML = strArr[3];
    document.getElementById("HeapMemoryUsageMax").innerHTML = appArr[1];
    document.getElementById("HeapMemoryUsageMaxConsole").innerHTML = appArrConsole[1];

    g = new Dygraph(document.getElementById("dbLocks"),
                    ",locksheld,lockswait\n" + loc);
    g = new Dygraph(document.getElementById("activeConnection"),
                    ",Connections\n" + conn);
    g = new Dygraph(document.getElementById("example2"),
                    ",heapUsage\n" + heapUsage);
    g = new Dygraph(document.getElementById("example3"),
                    ",cpuUsage\n" + cpuUsage);
    g = new Dygraph(document.getElementById("example4"),
                    ",thread,peakThread\n" + thrdCnt);
    g = new Dygraph(document.getElementById("example6"),
                    ",heapUsage\n" + heapUsageConsole);
    g = new Dygraph(document.getElementById("example7"),
                    ",\n" + cpuUsageConsole);
    g = new Dygraph(document.getElementById("example8"),
                    ",thread,peakThread\n" + thrdCntConsole);
    g = new Dygraph(document.getElementById("cpuStates"),
                    ",user,system,nice,idle\n" + user);
    g = new Dygraph(document.getElementById("memStates"),
                    ",MT,MF,B,C,ST,SF\n" + MemTotal);

    i = i + 1;
    setTimeout("jubking()", 5000);
}
You can use about:crashes in FF to view the specific reason for your crash. As mentioned by others, you could be leaking memory if you're caching off data (assigning it to a variable) returned by your AJAX call and not clearing it when the next call is made.
Edit:
Just saw your comment - 1,923,481 K is definitely too much - you're leaking data somewhere. What OS are you running? If you run FF from console in *nix, you usually get some form of a dump into console when something's going wrong (not sure about Windows).
You could possibly try decreasing your poll intervals to once every few seconds and step through the script using Firebug or Chrome's debugger to see what's happening. Worst case, start commenting things out until you figure out exactly what is making your app crash. And then, figure out a way to fix it :)
I suspect that your dygraphs usage is, as you note in your comments, the source of your trouble. It looks like you're binding new graphs over and over again when you only want to update the data; using a moving window for the data would also help. Try reworking your updater to work like this pseudo-JavaScript:
var graphs = {
    dbLocks: {
        graph: new Dygraph(/* ... */),
        data: [ ]
    },
    activeConnection: {
        graph: new Dygraph(/* ... */),
        data: [ ]
    },
    // etc.
};

var DATA_WINDOW_SIZE = 1000; // Or whatever works for you.

function update(which, new_data) {
    var g = graphs[which];
    g.data.push(new_data);
    if (g.data.length > DATA_WINDOW_SIZE)
        g.data.shift();
    g.graph.updateOptions({ file: g.data });
}

function jubking() {
    // Launch all your AJAX calls and bind a callback to each
    // one. The success callback would call the update() function
    // above to update the graph and manage the data window.

    // Wait for all the above asynchronous AJAX calls to finish and
    // then restart the timer for the next round.
    setTimeout(jubking, 5000);
}
The basic idea is to use a window on your data with a reasonable maximum width so that the data doesn't grow to chew up all your memory. As you add a new data point at the end of your data cache, you drop old ones off the other end once you hit your maximum comfortable size.
You can find some techniques for waiting for several asynchronous AJAX calls to finish over here: How to confirm when more than one AJAX call has completed? (disclosure: yes, that's one of my other answers).
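As a rough illustration of the "wait for all calls, then re-arm the timer" part, here is one callback-counting sketch. It is only a sketch: the servlet names come from the question, the post() helper is hypothetical, and the parsing of each response (which would feed the update() function above) is elided.

// Hypothetical helper: fire one asynchronous POST and hand the response text to a callback.
function post(url, callback) {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4 && xhr.status == 200)
            callback(xhr.responseText);
    };
    xhr.open("POST", url, true); // asynchronous, unlike the original code
    xhr.send(null);
}

function jubking() {
    var urls = ["MonitorDBServlet", "MonitorTomcatServlet",
                "MonitorConsoleTomcatServlet", "CpuMemoryServlet"];
    var pending = urls.length;

    urls.forEach(function(url) {
        post(url, function(text) {
            // Parse `text` and call update("dbLocks", ...) etc. here.
            if (--pending === 0) {
                // All four responses are in; schedule the next round.
                setTimeout(jubking, 5000);
            }
        });
    });
}

jubking();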
The answer above advocates re-using your Dygraph object and calling g.updateOptions({file:...}) to reduce memory usage. This is a great way to do it.
The other way is to call g.destroy() before you redefine the Dygraph object. This will make dygraphs clear out all of its internal arrays and DOM references. Example:
g = new Dygraph(...);
g.destroy();
g = new Dygraph(...);
Read more here: http://blog.dygraphs.com/2012/01/preventing-dygraphs-memory-leaks.html