I am calling an AJAX function on the page-load event. When I load the page, there is no corresponding entry in the server log (I log entry and exit on every MVC server method). The request from JavaScript to the server takes some time to arrive, 2 to 3 minutes. The strange thing is that it works fine when I test locally and on the test server; it only happens when I deploy the project to the real server!
I found some posts about different ways to make AJAX calls, XMLHttpRequest and $.ajax(). I have already tried both, but the problem persists.
This is my JavaScript code:
$(document).ready(function () {
    $.ajax({
        type: "GET",
        url: allservicesUrl,
        async: true,
        success: function (result, status, xhr) {
            // Parse the raw response and render a card for each service
            var services = JSON.parse(xhr.responseText);
            for (var i = 0; i < services.length; i++) {
                createServicecard(services[i]);
            }
        },
        error: function (xhr, status, err) {
            alert(xhr.responseText);
        }
    });
});
I want it to execute immediately. How do I fix this problem?
Thanks for the advice.
Update: I read the answer and comments, then looked at the dev tools (Network tab), and finally found the problem. Now that I have corrected it, it works well. Thank you very much.
P.S. The problem: trying to connect to the internet from inside a closed network.
Use the debugging/developer tools in your browser to troubleshoot this issue.
Look in the console to see if you have any JS errors, then look in the Network tab. Clear the entries and then trigger the AJAX call again.
You should be able to see whether your script is slow to send the request to the server or the server is slow to answer.
Until you figure out whether the bottleneck is in the script or on the server, you can't fix it.
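A minimal way to measure this from the browser side (a sketch, assuming the same allservicesUrl endpoint as in the question) is to timestamp the request and the response:

var t0 = Date.now(); // time just before the request is queued
$.ajax({
    type: "GET",
    url: allservicesUrl,
    success: function (result, status, xhr) {
        // A large elapsed time together with an empty server log points at
        // transit, proxies, or DNS rather than at the MVC method itself
        console.log("response after " + (Date.now() - t0) + " ms");
    },
    error: function (xhr, status, err) {
        console.log("failed after " + (Date.now() - t0) + " ms: " + status);
    }
});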
This behavior was not present all the time; it appeared out of nowhere about a month ago and then disappeared just as suddenly. The problem is that I can't identify what happened, and I have no server debugging tools because it only takes place in production.
Roughly 100 AJAX requests are triggered at the same time, using a loop like this:
var url = "example.com/";
var methods = ["method1", "method2", "method3", "method4"]; // roughly 100 in the real code
$.each(methods, function (index, value) {
    // Fire one POST per method; jQuery issues these without waiting
    $.ajax({
        url: url + value,
        method: "POST",
        data: { params: "whatever", otherParams: "whatever" }
    }).done(function (data) {
        console.log(data);
    });
});
On the server side (Apache + PHP) there are selects, updates and inserts against a relational database. Each request gets its own Apache worker, since Apache is listening and handles them independently as they arrive.
When I look at the network console, all the requests start at (roughly) the same time, but here is the problem: each response arrives only after the previous one has finished. If request 1 starts at 0 and takes 5 seconds, request 2 is answered starting at 5, and request 3 is answered only once request 2 has finished. All browsers show the same behavior.
The most logical explanation I could come up with is that the database is locking some table when it performs an update or insert; some tables are huge and, without indexes, could take too long. But the staging environment points to the same database and works perfectly in parallel. So what is going on? Is it possible that PHP or Apache could get stuck this way for some reason? I had another crazy idea about write problems with log files in the OS (Debian), but I have no idea how that works. I would be glad for any suggestion. Maybe I could then reproduce the problem in a controlled environment and do something to prevent it from happening again.
Some additional information: the API has two clients, one in Angular and the other in JavaScript + PHP. The behavior is exactly the same with both clients.
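One way to narrow down where the serialization happens (a sketch, reusing url and methods from the snippet above, and assuming the page and API are same-origin, since cross-origin Resource Timing fields are zeroed out):

var t0 = Date.now();
$.each(methods, function (index, value) {
    $.ajax({
        url: url + value,
        method: "POST",
        data: { params: "whatever", otherParams: "whatever" }
    }).done(function (data) {
        console.log(value + " finished at +" + (Date.now() - t0) + " ms");
    });
});

function reportTimings() {
    // Call this from the console once all responses have arrived
    $.each(performance.getEntriesByType("resource"), function (i, e) {
        if (e.name.indexOf(url) !== -1) {
            // Long fetchStart->requestStart: the browser queued the request
            // (per-host connection limit). Long requestStart->responseStart:
            // the server took that long to start answering.
            console.log(e.name,
                "queued:", Math.round(e.requestStart - e.fetchStart), "ms,",
                "waiting:", Math.round(e.responseStart - e.requestStart), "ms");
        }
    });
}

If the requests are all dispatched together but the waiting times stack up one after another, the serialization is server-side rather than in the browser.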
I have a JavaScript function that runs every 5 seconds and requests information from the same server via a jQuery AJAX call. The function runs indefinitely once the page is loaded.
For some reason the AJAX query fails about once every minute or two, showing
ERR_EMPTY_RESPONSE
in the console. The odd thing is, it fails for exactly 60 seconds, then starts working fine for another minute or two.
So far I've tried the following, with no success:
Different browser
Different internet connection
Changing the polling interval of the function (it still fails in 60-second stretches: e.g. running every 10 seconds it fails 6 times in a row, at 5 seconds 12 times, at 60 seconds once)
Web searches, which suggested flushing the IP settings on my computer
I never had any problem on my last server, which was a VPS. I'm now running this off shared hosting with GoDaddy, and I wonder if there's a problem at that end. Other sites and AJAX calls to the server work fine during the downtimes, though.
I also used to run the site over HTTPS; now it's plain HTTP only. Not sure if that's relevant.
Here's the guts of the function:
var interval = null;

function checkOrders() {
    // Poll the server every 5 seconds until the interval is cleared
    interval = window.setInterval(function () {
        $.ajax({
            type: "POST",
            dataType: "json",
            url: "http://www.chipshop.co.nz/ajax/check_orders.php",
            data: { shopid: 699 },
            error: function (errorData) {
                // handle error
            },
            success: function (data) {
                // handle success
            }
        });
    }, 5000);
}
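For context, a sketch of how this polling loop is started and stopped (the stop call is an assumption; only the start exists in the original code):

checkOrders();                    // begin polling once the page is ready

// Later, to stop polling:
window.clearInterval(interval);
interval = null;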
Solved: it turned out the problem was with GoDaddy hosting. Too many POST requests resulted in a 60-second 'ban' from accessing that file. Changing to GET avoided this.
This page contains the answer, from user emrys57:
For me, the problem was caused by the hosting company (GoDaddy) treating POST operations which had substantial response data (anything more than tens of kilobytes) as some sort of security threat. If more than 6 of these occurred in one minute, the host refused to execute the PHP code that responded to the POST request during the next minute. I'm not entirely sure what the host did instead, but I did see, with tcpdump, a TCP reset packet coming as the response to a POST request from the browser. This caused the HTTP status code returned in a jqXHR object to be 0.
Changing the operations from POST to GET fixed the problem. It's not clear why GoDaddy imposes this limit, but changing the code was easier than changing the host.
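In practice the fix was a one-word change to the polling call from the question, sketched here:

// The same polling request, switched from POST to GET so the host's
// POST rate limiting no longer kicks in; jQuery appends the data to
// the query string automatically for GET requests.
$.ajax({
    type: "GET",
    dataType: "json",
    url: "http://www.chipshop.co.nz/ajax/check_orders.php",
    data: { shopid: 699 },
    success: function (data) { /* handle success */ },
    error: function (errorData) { /* handle error */ }
});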
I am working with CodeIgniter and jQuery. I am using AJAX to send some info to a CodeIgniter function that performs a DB operation, in order to update the page.
$.ajax({
    type: "POST",
    url: BASE_URL + "Update/update",
    data: { i: searchIDs, m: message },
    dataType: 'json',
    success: function (data) {
        console.log(data);
    },
    error: function (data) {
        console.log(data);
    },
    complete: function () {
        // Runs after either success or error
        alert("REFRESHING..");
        window.location.href = "pan_controller/reply";
    }
});
After the operation is complete I want to reload the page. This works normally on my local WAMP setup. However, when I deploy to a shared host I usually have to reload the page manually (using the reload button in the browser). I see no errors in Firebug. I've tried a variety of fixes but have recently started to wonder if this is a caching issue on my shared hosting server. If so, is there a better way to reload using jQuery, rather than just redirecting, so as to avoid the cache on the shared host?
Have you established that the complete function is running in those cases where the page doesn't reload? If so, then I think you are right: it's a caching problem.
You can try adding a timestamp parameter to the end of the window.location.href value to make it unique each time and avoid any issues with caching, although a better approach would be to send the correct headers back with the response, so that the browser knows not to cache that page.
window.location.href = "pan_controller/reply?t=" + (new Date().getTime());
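Combining both checks, the complete handler from the question becomes (the console.log is only there to confirm the handler actually fires; remove it once verified):

complete: function () {
    console.log("complete fired, redirecting...");
    // The unique timestamp parameter defeats any cached copy of the target page
    window.location.href = "pan_controller/reply?t=" + (new Date().getTime());
}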
On localhost everything works fine, but the problem occurs when I deploy the code on Heroku.
This is the simple AJAX call that I am using in my application.
I am using AJAX to send some data to the server for processing.
When I add a large amount of data to the request, it fails.
If I send less data with the AJAX request, it works fine.
$.ajax({
    url: 'Ajax.php',
    data: "data to send",
    type: 'POST',
    success: function (data) {
        console.log("success");
    },
    error: function (XMLHttpRequest, textStatus, errorThrown) {
        console.log("failed");
    }
});
Can anyone suggest why this is happening?
Heroku only allows 30 seconds for a request before it times out.
https://devcenter.heroku.com/articles/request-timeout
When this happens the router will terminate the request if it takes longer than 30 seconds to complete. The timeout countdown begins when the request leaves the router. The request must then be processed in the dyno by your application, and then a response delivered back to the router within 30 seconds to avoid the timeout.
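If the large payload is what pushes processing past that limit, one common workaround (not from the answer above; a sketch under the assumption that Ajax.php can accept and recombine partial batches) is to split the data into smaller requests so each one completes well inside the 30-second window:

function sendInChunks(items, chunkSize) {
    // Each request carries only a slice of the data, keeping per-request
    // processing time short enough for Heroku's router timeout
    for (var i = 0; i < items.length; i += chunkSize) {
        $.ajax({
            url: 'Ajax.php',
            type: 'POST',
            data: { batch: JSON.stringify(items.slice(i, i + chunkSize)) },
            success: function (data) { console.log("chunk ok"); },
            error: function (xhr, textStatus, errorThrown) {
                console.log("chunk failed: " + textStatus);
            }
        });
    }
}

// Usage (largeArray is hypothetical):
sendInChunks(largeArray, 100);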
It can also have to do with system memory usage, especially in a scenario where you are parsing a lot of JSON objects back and forth.
And not only JSON data: any large payload sent via POST/GET can result in failure, because many JS-oriented frameworks like Node.js still materialize the whole body as an object in the end.
I'm building an iPhone app that displays a UIWebView pointing to a web application I've created.
The web application makes frequent web service calls for data items which are used to animate controls on a canvas. The calls for data use jQuery ajax, passing parameters via JSON and receiving an XML response.
I'm finding that while user interactions with the UIWebView are occurring, JavaScript setTimeout callbacks are blocked and don't seem to execute at all. Fair enough; there are ways around this.
But the major problem is that every now and then, after user interactions (zooming, panning, etc.), the AJAX web service calls start failing consistently and I can't establish why. Even if they are made repeatedly, for the next few minutes none of them even get through to the web service. If you leave the UIWebView completely alone, they never fail, as long as the web service is up and connectivity is present.
Can anyone suggest why, and how to fix/work around this?
Quick update: according to the mobile Safari debugger, the 'response' object in the error function is undefined. (The error handler itself works: if, for example, I make the URL invalid, lastError can then be read from Objective-C via [webView stringByEvaluatingJavaScriptFromString:@"lastError"], but it throws an exception for this 'touched the UIWebView' error):
$.ajax({
    type: "POST",
    url: "WebService.asmx/GetValues",
    async: true,
    data: "{'pageVersionIndex': " + PageVersionIndex + " , 'timeStreamIndex': '" + TimeStream + "'}",
    contentType: "application/json; charset=utf-8",
    dataType: "xml",
    success: function (response) { UpdateControls(response); },
    error: function (response, status, errorthrown) {
        calling = false;
        lastError = response.statusText; // throws an exception: response is undefined here
        connectionInterrupted = true;
        // Schedule the next attempt
        DataRoutine = window.setTimeout(DataService, dataFrequency);
    }
});
I'm afraid you are toast... in a way. In iOS Safari and in UIWebView alike, system processes have priority over the browser, and if there is a sudden demand for more CPU power or memory for native processes (like handling touch, etc.), any running JavaScript may simply be stopped from executing to reduce the memory or CPU load. The worst part is that it won't throw any errors or anything... it just stops your code execution as if nothing had happened.
I'm afraid that if this happens a lot in your app, the only way around it is to add some kind of timer that checks whether the request was silently dropped and, if so, retries it until it succeeds, as sketched below.
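A minimal sketch of that retry idea, reusing names from the question (UpdateControls, DataService, dataFrequency); the 10-second watchdog window is an assumption:

var watchdog = null;

function callWithRetry() {
    // If neither success nor error fires within the window, assume the
    // request was silently dropped and try again (10 s is an assumption)
    watchdog = window.setTimeout(callWithRetry, 10000);
    $.ajax({
        type: "POST",
        url: "WebService.asmx/GetValues",
        contentType: "application/json; charset=utf-8",
        dataType: "xml",
        // data: same JSON body as in the question, omitted here for brevity
        success: function (response) {
            window.clearTimeout(watchdog);   // got an answer, cancel the retry
            UpdateControls(response);
        },
        error: function () {
            window.clearTimeout(watchdog);
            DataRoutine = window.setTimeout(DataService, dataFrequency);
        }
    });
}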
Ups and downs of iOS: they really want you to go native rather than web :)
Hope it helps,
Tom