I have what should be a very simple little process. I have an interval timer which, when it expires, makes an ajax call to the server, parses the JSON that comes back, and updates the DOM. The timer is set to one second. The server then does a very simple SELECT on the database, a query which takes milliseconds to execute, if that.
On our test server it works fine. But when it's deployed to our customer it is most of the time NOT actually hitting the database. We can see this in Query Analyzer: there should be a continuous flow of queries, but they are sporadic at best, often with 40 or more seconds between hits.
Here's the code:
setTimeout(function run() {
    // When the timer elapses, get the data from the server
    GetData(0);
    setTimeout(run, _refreshRate);
}, 1000);
function GetData(isFirstLoad) {
    //console.log("Attempting to obtain the data...");
    jQuery.ajax({
        url: "something.ashx",
        type: "GET",
        contentType: 'application/json; charset=utf-8',
        success: function(resultData) {
            //console.log("Got the data.");
            ParseJson(resultData);
            // show the last refresh date and time
            $('#refreshTime').html(GetDateTime());
        },
        error: function(xhr, textStatus, errorThrown) {
            if (textStatus == 'timeout') {
                //console.log("Timeout occurred while getting data from the server. Trying again.");
                // If a timeout happens, DON'T STOP. Just keep going forever.
                $.ajax(this);
                return;
            }
        },
        timeout: 0
    });
}
Everything within ParseJson(resultData); works fine. And with this line...
$('#refreshTime').html(GetDateTime());
...the time gets refreshed every second like clockwork, even though the database never gets hit.
I can also put a breakpoint inside the error handler in the dev tools, and it never gets hit.
If we hit refresh, it works for a few seconds (we can see queries hitting the database), but then it slows way down again.
The frustrating part is that it works flawlessly on our test server. But there is clearly something I'm overlooking.
EDIT:
Ok, this is really weird. When I have debugger open, it works. As soon as I close the debugger, it stops working. I don't even have to have the network tab running and capturing events. Just the debugger window open makes it work.
This is IE, which is what the client is using so it's our only option.
Found the answer here:
jQuery ajax only works in IE when the IE debugger is open
Turns out IE, and only IE, will cache ajax GET responses. You have to tell it not to. Adding cache: false did the trick:
function GetData(isFirstLoad) {
    //console.log("Attempting to obtain the data...");
    jQuery.ajax({
        url: "something.ashx",
        type: "GET",
        contentType: 'application/json; charset=utf-8',
        cache: false,
        success: function(resultData) {
            // ...same success and error handlers as before
        }
    });
}
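For what it's worth, jQuery's cache: false works by appending a unique underscore timestamp parameter to the URL, so the cache never sees the same request twice. A minimal sketch of the manual equivalent, if you ever need the same effect without the option:
jQuery.ajax({
    // Manual cache-buster: a unique timestamp per request keeps IE from
    // serving a cached response (roughly what cache: false does internally)
    url: "something.ashx?_=" + new Date().getTime(),
    type: "GET",
    contentType: 'application/json; charset=utf-8'
});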
I am binding the beforeunload event of a page to make a quick synchronous ajax call (not best practice, but it is essentially a ping, so we are trying it as an idea).
$(window).bind('beforeunload', function(){
makeAjaxPUTcall();
//return 'Send a message!';
});
This works fine when built, deployed to a test environment and tested manually.
I was then trying to add some Selenium tests (ChromeDriver) to automate the testing, but the Ajax call does not seem to get made. Setting a breakpoint in the browser shows the code running through the Ajax call, and if I uncomment the return I get an (unwanted) alert before the page unloads. For the sake of argument, I have also bound the JavaScript function to the blur event of the page controls:
$(':input').on('blur', function() { makeAjaxPUTcall(); });
and it works without problem in Selenium (as well as in manual testing). When run in the beforeunload event, however, logging shows that the call never hits the server.
The JavaScript function is like this:
function makeAjaxPUTcall() {
    $.ajax({
        type: 'PUT',
        async: false,
        url: urlVal,
        data: null,
        processData: false,
        dataType: 'json',
        contentType: 'application/json',
        success: function() {}
    });
}
So, in summary, from where I'm standing I've proved, in various ways, that the code to make the REST call works client side and server side. I have also 'proved' by manual dev testing that these calls are triggered when a page navigation takes place manually. I have also proved that the beforeunload event is being triggered in the Selenium tests (by adding a return string and seeing an alert pop up).
I have also converted the PUT to a GET, put a breakpoint in the test code, navigated a different tab to the URL provided, and proved that the call is triggered and hits a breakpoint in the (Java) controller code.
It SEEMS to me that the issue must be something to do with the way Selenium handles the navigation. Can anyone point me in the right direction?
Try it with fetch:
window.onbeforeunload = () => {
    fetch(url, {
        method: 'PUT',
        headers: {
            'Content-Type': 'application/json'
        },
        body: JSON.stringify({foo: "bar"})
    });
};
Why? fetch creates a promise, and I believe Chrome will try to let that promise resolve before it unloads the page.
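If that alone isn't reliable, a hedged addition: modern browsers, including Chrome, support a keepalive flag in the fetch options that is designed for exactly this unload case; it asks the browser to finish the request even after the page is gone:
window.onbeforeunload = () => {
    fetch(url, {
        method: 'PUT',
        // keepalive lets the request outlive the page; without it the
        // browser may cancel in-flight requests when navigation starts
        keepalive: true,
        headers: {
            'Content-Type': 'application/json'
        },
        body: JSON.stringify({foo: "bar"})
    });
};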
I have a web application that loads different content into the same div (.content) using either an ajax request or a .load() request.
A simplified version of the ajax request is:
$(document).on('click', '.button1', function() {
    $.ajax({
        type: "POST",
        url: "/path/to/file/ajax.php",
        data: {
            'param1': 'xyz',
            'param2': '123'
        },
        timeout: 10000,
        dataType: "json",
        success: function(data) {
            if (data.status == 'Success') {
                $('.content').html(data.result);
            } else if (data.status == 'Error') {
                console.log('Something went wrong');
            }
        },
        error: function(x, status, error) {
            if (status === "timeout") {
                console.log('Request Timed Out');
            }
        }
    });
});
On another button I have:
$(document).on('click', '.button2', function() {
    $('.content').load('/path/to/file/getContent.php?ref=foo');
});
The problem is that if I click on button 1 and then click on button 2 whilst the ajax request is still executing, nothing happens (i.e. getContent.php doesn't seem to return anything for anywhere between 15 and 30 seconds).
Clicking on button 2 alone returns the results instantly, but as soon as button 1 is clicked and its ajax request is being processed, the entire web app seems to "stall" until the request completes (or errors).
(I've also tried using the abort() method to cancel the ajax request, but the same problem persists, even when the abort is successful).
UPDATE
See solution/Answer below
Following on from the tip from @adeneo, I did some more research into session_start and ajax calls, which led me to this article:
http://konrness.com/php5/how-to-prevent-blocking-php-requests/
The problem absolutely was to do with session_start() locking the session file, which blocks any concurrent ajax requests on the back end. Following the advice in the article above, I made use of session_write_close() in my PHP and everything is working great.
If it helps anyone else, I actually changed my code from session_start() to:
if (session_status() == PHP_SESSION_NONE) {
    session_start();
    session_write_close();
}
This makes the session variables readable but no longer locks the session file, so concurrent ajax requests are not delayed. (It also checks that session_start() hasn't already been called.)
I have a local application that contains a web server for exchanging JSON with a web application. The web application itself is served from the web, so browsers see the two as cross-origin.
When the application is running, it provides correct cross-origin headers to allow the interchange. However, what I want is the option to quickly detect if the application is running or not.
The current method is to use AJAX to a "heartbeat" URL on the localhost service:
pg.init.getHeartbeat = function(port) {
    var url = pg.utils.baseUrl('heartbeat', port); // returns a localhost URL
    $.ajax({
        url: url,
        type: 'GET',
        async: true,
        success: function(data) {
            // Hooray! Application is there. Do 'hooray' stuff
        },
        error: function(data) {
            // Whoah the bus. Application not there. Do 'no application' stuff.
        }
    });
};
Works great in WebKit. WebKit tries to get a heartbeat, fails quickly, and does the failure stuff almost immediately.
Problem is in Firefox. Firefox tries to get a heartbeat, and takes between 4 and 10 seconds to fail. It might not seem like much, but 4 seconds before the UI moves the user to the next step is making the app feel very slow and unresponsive.
Any other ideas out there? As far as I can tell, changing an iFrame's src attribute and capturing a failure isn't working, either. It's not triggering the error event. And even when I can get an error to trigger from sample code, it's still taking 4 seconds, so there is no net improvement.
The web server side of things should not have any server-side scripting languages (PHP, etc.); I need the JavaScript to be able to take care of it independently.
You can set a timeout if the browser is Firefox:
var timeoutcall = 0;
if (navigator.userAgent.toLowerCase().indexOf('firefox') > -1) {
    timeoutcall = 100;
}

$.ajax({
    url: url,
    type: 'GET',
    timeout: timeoutcall,
    async: true,
    success: function(data) {
        // Hooray! Application is there. Do 'hooray' stuff
    },
    error: function(data) {
        // Whoah the bus. Application not there. Do 'no application' stuff.
    }
});
If the timeout is 0, there is no timeout at all. So if I am in Firefox I set the timeout to 100 ms, and if I'm not in Firefox I leave it unlimited.
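A hedged alternative that avoids user-agent sniffing altogether: since the heartbeat endpoint is on localhost, it should answer in well under 100 ms when the application is running, so a short timeout is arguably safe in every browser (the 250 ms budget below is an assumption to tune):
$.ajax({
    url: url,
    type: 'GET',
    // Fail fast everywhere rather than special-casing Firefox; a local
    // heartbeat should answer almost instantly if the app is running
    timeout: 250,
    success: function(data) {
        // Application is there
    },
    error: function(xhr, textStatus) {
        // Application not there, or too slow to answer
    }
});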
I have an AJAX long polling request below:
$.ajax({
    type: "GET",
    url: "events_controller.php",
    dataType: "json",
    success: function (data) {
        eventsTimer = setTimeout(function() { eventsTimerHandler(); }, 1000);
    }
});
On the server, if some event happens it returns what has happened, and the request above displays a notification.
A problem I am having: if I do something in the browser that triggers an event, say, 10 seconds in the future, and then immediately go to a different page, the new page creates a new long polling request while the previous one is still active, and no notification ever reaches the user.
I hope I'm making sense.
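A minimal sketch of one way around this, assuming the page keeps a handle to the pending poll so it can be aborted before navigating away (pollXhr and poll are hypothetical names):
var pollXhr = null;

function poll() {
    // Keep the jqXHR so the outstanding request can be cancelled later
    pollXhr = $.ajax({
        type: "GET",
        url: "events_controller.php",
        dataType: "json",
        success: function (data) {
            eventsTimer = setTimeout(poll, 1000);
        }
    });
}

// Abort the pending long poll when leaving the page so the server isn't
// left holding a connection for a user who has already navigated away.
window.onbeforeunload = function () {
    if (pollXhr) {
        pollXhr.abort();
    }
};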
My application relies on being able to set a user Online/Offline state.
I have solved this by using window.onbeforeunload to detect when the user leaves a page.
The problem is that the below code does not always get executed and is therefore still saying that some people are online when they have left the page.
This is my current code:
window.onbeforeunload = function (event) {
    $.ajax({
        type: 'GET',
        url: './changeStatus.php?cid=' + ccID + '&status=0',
        contentType: "application/json",
        dataType: 'json',
        cache: false,
        async: false,
        success: function (data) {},
        error: function (e) {
            //console.log(e);
        }
    });
    return;
};
Any ideas how I can force this to always get executed (in all browsers)?
The script is probably always executed when the page closes, but that doesn't mean that it always succeeds in calling the server.
If for example the user disconnects the internet connection before closing the page, the script can't send anything to the server. You simply can't rely on getting a call from the browser every time the page is closed.
You can keep track on the server of when the user was last heard from, so that you can set the user as offline when he hasn't requested any pages for a while. Another alternative is to send a request to the server at regular intervals as long as the user is on the page; when the requests stop coming, the server knows that the user has gone away.
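A minimal sketch of that second approach, assuming the existing changeStatus.php endpoint also accepts status=1 as an "I'm still here" ping (that parameter value is an assumption, mirroring status=0 for offline):
// Ping the server every 30 seconds while the page is open. Server side,
// mark the user offline once no ping has arrived for, say, 90 seconds.
setInterval(function () {
    $.ajax({
        type: 'GET',
        // status=1 as an "online" ping is assumed here
        url: './changeStatus.php?cid=' + ccID + '&status=1',
        cache: false
    });
}, 30000);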