I have a LoadingStatus function that has two options: SHOW or HIDE.
SHOW is triggered when the jQuery POST is made, and HIDE happens after the response comes back.
The issue I'm having is that sometimes this happens so fast that it makes for a bad experience. I thought about putting in a JavaScript pause, but if the POST takes a while to respond, the pause would only make the wait even longer.
How can I make my SHOW and HIDE functions work together, so that the SHOW message is displayed to the user for at least half a second?
function saveBanner (action) {
if (action == 'show') {
// Display the AJAX Status MSG
$("#ajaxstatus").css("display","block");
$("#msg").text('Saving...');
}
else if (action == 'hide') {
$("#ajaxstatus").css("display","none");
$("#msg").text('');
}
};
Thanks
In your ajax success callback, you can put the hide command in a setTimeout() for 1500 milliseconds:
success: function(results) {
setTimeout(function(){
saveBanner("hide");
}, 1500);
}
Of course that would merely add 1.5 seconds onto however long the process itself took. Another solution would be to record the time the process started, with the Date object. Then, when the callback takes place, record that time and find the difference. If it's less than a second and a half, set the timeout for the difference.
/* untested */
var start = new Date();
success: function(results) {
var stop = new Date();
var elapsed = stop.getTime() - start.getTime();
var remaining = (elapsed < 1500) ? 1500 - elapsed : 0;
setTimeout(function(){
saveBanner("hide");
}, remaining);
}
You can perform this math either inside your callback or within the saveBanner() function itself: in the show portion you would record the starting time, and in the hide portion you would check the difference and set the setTimeout().
You can use setTimeout/clearTimeout to only show the status when the response takes longer than a set amount of time to load.
Edit:
Some untested code:
var t_id = 0;
function on_request_start()
{
t_id = setTimeout(show_message, 1000);
}
function on_request_completed()
{
clearTimeout(t_id);
hide_message();
}
The jQuery handlers should look something like the above. The message will not be shown if you receive a reply in less than a second.
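Here is a rough, untested sketch of how those handlers might be wired into the jQuery call itself, using the beforeSend and complete options (the URL and data are placeholders; show_message and hide_message could simply wrap saveBanner('show') and saveBanner('hide') from the question):
// Untested sketch: beforeSend starts the 1-second timer, complete cancels it.
$.ajax({
    url: '/save',                    // placeholder URL
    type: 'POST',
    data: { banner: 'example' },     // placeholder data
    beforeSend: on_request_start,    // starts the 1-second timer
    complete: on_request_completed,  // fires on success or error
    success: function (results) {
        // handle the response here
    }
});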
var shownTime;
function saveBanner (action) {
if (action == 'show') {
// Display the AJAX Status MSG
$("#ajaxstatus").css("display","block");
$("#msg").text('Saving...');
shownTime = new Date().getTime();
}
else if (action == 'hide') {
var hideIt = function() {
$("#ajaxstatus").css("display","none");
$("#msg").text('');
};
var timeRemaining = 1500 - (new Date().getTime() - shownTime);
if (timeRemaining > 0) {
setTimeout(hideIt, timeRemaining);
}
else {
hideIt();
}
}
};
As of jQuery 1.5, you are able to extend the $.ajax functionality by using prefilters. I wanted a similar experience where a message was shown a minimum amount of time when an ajax call is made.
By using prefilters, I can now add a property to the ajax call named "delaySuccess" and pass it a time in milliseconds. The time that is passed in is the minimum amount of time the ajax call will wait to call the success function. For instance, if you passed in 3000 (3 seconds) and the actual ajax call took 1.3 seconds, the success function would be delayed 1.7 seconds. If the original ajax call lasted more than 3 seconds, the success function would be called immediately.
Here is how I achieved that with an ajax prefilter.
$.ajaxPrefilter(function (options, originalOptions, jqXHR) {
if (originalOptions.delaySuccess && $.isFunction(originalOptions.success)) {
var start, stop;
options.beforeSend = function () {
start = new Date().getTime();
if ($.isFunction(originalOptions.beforeSend))
originalOptions.beforeSend();
};
options.success = function (response) {
var that = this, args = arguments;
stop = new Date().getTime();
function applySuccess() {
originalOptions.success.apply(that, args);
}
var difference = originalOptions.delaySuccess - (stop - start);
if (difference > 0)
setTimeout(applySuccess, difference);
else
applySuccess();
};
}
});
I first check to see if the delaySuccess and success options are set. If they are, I then override the beforeSend callback in order to set the start variable to the current time. I then override the success function to grab the time after the ajax call has finished and subtract the elapsed time from the original delaySuccess value. Finally, a timeout is set to the computed time which then calls the original success function.
I found this to be a nice way to achieve this effect and it can easily be used multiple times throughout a site.
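For example, a call using the prefilter might look like this (untested; the URL and data are placeholders, and the success handler just hides the banner from the question):
$.ajax({
    url: '/save-banner',        // placeholder URL
    type: 'POST',
    data: { id: 42 },           // placeholder data
    delaySuccess: 1500,         // success will not fire for at least 1.5 seconds
    success: function (response) {
        saveBanner('hide');
    }
});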
I am running an HTTP request to a file, and depending on the response ("200" or something else) either a success or an error function is run. This request takes place every second.
The problem I am facing is that when I get lots of error responses they all run together, and the last interval is never stopped before a new one starts.
The red light begins to flash way too fast. Can anyone help me out? My code is below; I have been playing with it for a few hours now but can't seem to get to the bottom of it.
var requestResponses = {
greenLight: $('.cp_trafficLight_Light--greenDimmed'),
redLight: $('.cp_trafficLight_Light--redDimmed'),
greenBright: 'cp_trafficLight_Light--greenBright',
redBright: 'cp_trafficLight_Light--redBright',
init: function (url) {
setInterval(function () {
requestResponses.getResponse(url);
}, 1000);
},
successResponse: function () {
var redBright = requestResponses.redBright,
greenBright = requestResponses.greenBright;
requestResponses.errorCode = false;
requestResponses.redLight.removeClass(redBright);
requestResponses.greenLight.addClass(greenBright);
},
errorResponse: function () {
requestResponses.runOnInterval();
},
runOnInterval: function () {
// clearInterval(runInterval);
var redBright = requestResponses.redBright,
greenBright = requestResponses.greenBright,
redLight = requestResponses.redLight;
requestResponses.greenLight.removeClass(greenBright);
var runInterval = setInterval(function () {
if (requestResponses.errorCode === true) {
redLight.toggleClass(redBright);
}
}, 400);
},
getResponse: function (serverURL) {
$.ajax(serverURL, {
success: function () {
requestResponses.errorCode = false;
requestResponses.successResponse();
},
error: function () {
requestResponses.errorCode = true;
requestResponses.errorResponse();
},
});
},
errorCode: false
}
requestResponses.init('/status');
Appreciate the help.
JavaScript is an event-driven language. Do not loop infinitely to check things periodically. There are places to do so, but most of the time either calling a delay function (setTimeout) repeatedly when needed or using a callback is a better method.
When using setInterval with requests, think about what happens if requests start taking longer than your interval.
In your case, you have two loops created with setInterval. The first one is the request, which runs every second. Instead of using setInterval, you can modify your code to schedule a setTimeout only after a request finishes, and do the other tasks just before re-running the next request:
function runRequest(...) {
$.ajax(serverURL, {
...
complete: function () {
setTimeout(runRequest, 1000);
}
...
});
}
function lightsOnOff() {
var redBright = requestResponses.redBright,
greenBright = requestResponses.greenBright,
redLight = requestResponses.redLight;
requestResponses.greenLight.removeClass(greenBright);
if (requestResponses.errorCode === true) {
redLight.toggleClass(redBright);
}
}
setInterval(lightsOnOff, 400);
The setInterval() method repeats itself over and over, not just one time. Your error response handler then invokes the routine that creates another setInterval(), and so on, until you have so many processes running that you get the flashing red light issue.
The solution is to invoke the logic that makes the setInterval() call only once. Or, even better, use setTimeout() to call the routine: it runs one time and is likely better for your use.
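As a rough, untested sketch of that idea, reusing the requestResponses object from the question (blinkRedOnce is a made-up name): the error handler schedules a single one-shot toggle instead of creating a new interval on every error.
// Untested sketch: call this from errorResponse() in place of runOnInterval().
function blinkRedOnce() {
    requestResponses.greenLight.removeClass(requestResponses.greenBright);
    setTimeout(function () {
        requestResponses.redLight.toggleClass(requestResponses.redBright);
    }, 400);
}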
In my app I make several nested AJAX calls to the LiquidPlanner API that limits requests to 30 requests every 15 seconds. When I hit the limit, I want to set a timeout of sorts to stop sending requests to the API until the 15 seconds have elapsed. This (at the moment) will only be used by one person ever, so multiple clients are not a concern.
Upon hitting the rate limit the response is:
{
"type":"Error",
"error":"Throttled",
"message":"34 requests, exceeds limit of 30 in 15 seconds. Try again in 7 seconds, or contact support#liquidplanner.com"
}
Here is some code, simplified for brevity:
$.getJSON('/dashboard/tasks/123', function(tasks) {
$.each(tasks, function(t, task) {
$.getJSON('/dashboard/project/987', function(project) {
$.getJSON('/dashboard/checklist-items/382983', function(checklist_items) {
// form some html here
});
});
});
});
So at any point in this process I could hit the limit and need to wait until the timeout has completed.
I am also open to suggestions to better form the requests instead of nesting them.
Another solution that probably prevents hammering better is a queue - however you need to be aware that the order of requests could be significantly different using this method. And that only one request will ever run at a time (so total response times may increase significantly depending on the use case).
//Keep track of queue
var queue = [];
//Keep track of last failed request
var failed_request = false;
function do_request(url, callback) {
//Just add to queue
queue.push({
url:url,
callback:callback
});
//If the queue was empty send it off
if (queue.length === 1) attempt_fetch();
}
function attempt_fetch() {
//If nothing to do just return
if (queue.length === 0 && failed_request === false) return;
//Get the url and callback from the failed request if any,
var parms;
if (failed_request !== false) {
parms = failed_request;
} else {
//otherwise first queue element
parms = queue.shift();
}
//Do request
$.getJSON(parms.url, function(response) {
//Detect throttling
if (response.type === 'Error' && response.error === 'Throttled') {
//Store the request
failed_request = parms;
//Call self in 15 seconds
setTimeout(function(){
attempt_fetch();
}, 15000);
} else {
//Request went fine, let the next call pick from the queue
failed_request = false;
//Do your stuff
parms.callback(response);
//And send the next request
attempt_fetch();
}
});
}
...your logic still remains largely unchanged:
do_request('/dashboard/tasks/123', function(tasks) {
$.each(tasks, function(t, task) {
do_request('/dashboard/project/987', function(project) {
do_request('/dashboard/checklist-items/382983', function(checklist_items) {
// form some html here
});
});
});
});
Disclaimer: Still completely untested.
As far as design patterns for chaining multiple requests, take a look at the chaining section in the following article: http://davidwalsh.name/write-javascript-promises . Basically, you could create a service that exposes a method for each type of request, which returns the promise object and then chain them together as needed.
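As a rough, untested illustration of that chaining idea (assuming jQuery 1.8+, where .then() chains promises returned from the callbacks; the helper names are made up), the nested calls could be flattened like this:
// Each helper returns the jqXHR promise from $.getJSON, so the calls can be chained.
function getTasks()          { return $.getJSON('/dashboard/tasks/123'); }
function getProject()        { return $.getJSON('/dashboard/project/987'); }
function getChecklistItems() { return $.getJSON('/dashboard/checklist-items/382983'); }

getTasks()
    .then(function (tasks) {
        // use tasks here, then move on to the next request
        return getProject();
    })
    .then(function (project) {
        return getChecklistItems();
    })
    .then(function (checklistItems) {
        // form some html here
    });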
As far as your question about setting a timeout, given the information you provided it is a bit difficult to advise you on it, but if that is absolutely all we have, I would create a request queue (a simple array that allows you to push new requests at the end and pop them from the head). I would then execute the known requests in order and inspect the response. If the response was a throttling error, set a timeout flag that the request executor would honor; if it was successful, either queue additional requests or create the html output. This is probably a pretty bad design, but it is all I can offer given the information you provided.
Write a wrapper that will detect the rate-limited response:
//Keep track of state
var is_throttled = false;
function my_wrapper(url, callback) {
//No need to try right now if already throttled
if (is_throttled) {
//Just call self in 15 seconds time
setTimeout(function(){
return my_wrapper(url, callback);
}, 15000);
return;
}
//Get your stuff
$.getJSON(url, function(response) {
//Detect throttling
if (response.type === 'Error' && response.error === 'Throttled') {
/**
* Let "others" know that we are throttled - the each-loop
* (probably) makes this necessary, as it may send off
* multiple requests at once... If there's more than a couple
* you will probably need to find a way to also delay those,
* otherwise you'll be hammering the server before realizing
* that you are being limited
*/
is_throttled = true;
//Call self in 15 seconds
setTimeout(function(){
//Throttling is (hopefully) over now
is_throttled = false;
return my_wrapper(url, callback);
}, 15000);
} else {
//If not throttled, just call the callback with the data we have
callback(response);
}
});
}
Then you should be able to rewrite your logic to:
my_wrapper('/dashboard/tasks/123', function(tasks) {
$.each(tasks, function(t, task) {
my_wrapper('/dashboard/project/987', function(project) {
my_wrapper('/dashboard/checklist-items/382983', function(checklist_items) {
// form some html here
});
});
});
});
Disclaimer: Totally untested - my main concern is the scope of the url and callback... But it's probably easier for you to test.
Hi all, I have a getJSON call and I'm wondering how I can check its response (siteContents): if it is empty, or if it doesn't contain a required string (for example seasonEpisode=), I want to call getJSON again. Can we call getJSON itself from within it? My goal is to get a correct response from getJSON. Hope you guys can help me. Thanks
$.getJSON('http://www.mysite.com/doit.php?value=55', function(data){
//$('#output').html(data.contents);
var siteContents = data.contents;
// ...
});
Try this:
var handler = function(data){
//$('#output').html(data.contents);
var siteContents = data.contents;
if (!siteContents) {
$.getJSON('http:/...', handler);
return;
}
// handle siteContents
}
$.getJSON('http://...', handler);
Edit: the above would spam the server with repeated attempts if siteContents keeps coming back empty, creating an infinite loop and high load. I would suggest two improvements (a combined sketch follows the list):
1) count how many repeated empty-siteContents loops you have made, and cancel the loop with an error message (if appropriate) after some failure threshold (e.g. 20 attempts).
2) do the iteration with setTimeout(function() { $.getJSON(...) }, delay) where delay is some milliseconds to wait between retries.
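A rough, untested sketch combining both improvements (the threshold and delay values are arbitrary examples):
// Retry with a counter and a delay between attempts.
function fetchSite(attempt) {
    var maxAttempts = 20, delay = 2000; // example values
    $.getJSON('http://www.mysite.com/doit.php?value=55', function (data) {
        var siteContents = data.contents;
        if (!siteContents || siteContents.indexOf('seasonEpisode=') === -1) {
            if (attempt >= maxAttempts) {
                $('#output').html('Giving up after ' + attempt + ' attempts.');
                return;
            }
            setTimeout(function () { fetchSite(attempt + 1); }, delay);
            return;
        }
        // handle siteContents
        $('#output').html(siteContents);
    });
}
fetchSite(1);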
It sounds like the better question is why doesn't your server return the 'correct' response on the first try? Or as NuclearGhost points out, why does it return different responses for the same request?
But to accomplish what you're asking for requires recursion. You can't just do it in a loop because the response is asynchronous. But if you name a function, you can call that function in the success handler, something like this:
function getJSONRecursive(maxRetries, count) {
if(!count) count = 1;
if (count > maxRetries) {
alert('Giving up after '+count+' retries.');
return;
}
$.getJSON('http://www.mysite.com/doit.php?', function(data) {
if(!data || !data.contents || data.contents.indexOf('seasonEpisode') == -1) {
getJSONRecursive(maxRetries, ++count);
} else {
$('#output').html(data.contents);
}
})
}
Then invoke it like this:
getJSONRecursive(5);
I wouldn't recommend doing this without the count, because otherwise the polling will never stop if the correct response never comes back. If the situation you're avoiding is a server timeout or overload problem, then I would recommend putting the recursive call in a timeout, like so:
if(!data || !data.contents || data.contents.indexOf('seasonEpisode') == -1) {
setTimeout(function() { getJSONRecursive(maxRetries, ++count) }, 5000);
// etc.
This waits an extra 5 seconds between calls, but the extra time will help ensure your server doesn't get slammed and your getJSON calls don't burn through their retries too quickly.
Not sure if I'm phrasing my question right, but I have the code below. Basically I want to make an AJAX request, check the callback, and re-perform the AJAX call until it gets the desired response (in the example, connectedvoter == 1).
The problem is each request only takes about 80 ms, so the number of XHRs gets huge really fast at that speed. I tried to come up with a way to 'pause', but everything I could think of ate up CPU.
Is there a way to 'slow down' the requests, to say once every second or two, without eating up CPU?
var connctedvoter = 0;
var govoters = function () {
$.ajaxSetup({
async: false
});
var url = "getconnectedvoter.php";
var data = {
userid: userid
};
$.getJSON(url, data, callback);
};
var pausevoters = function () {
console.log("pausing ajax voters");
};
var callback = function (response) {
if (response.error) {
return;
}
if (response.connectedvoter == 0) {
setTimeout(govoters, 150);
//govoters();
} else {
$('#vanid').html(response.vanid);
$('#name').html(response.name);
$("#mapurl").attr("src", response.mapurl);
$('.call').fadeIn();
return;
}
};
//DO THIS TO START
setTimeout(govoters, 150);
pausevoters();
Just increase the delay of the timeout:
if(!response.connectedvoter) {
setTimeout(govoters, 2000); // instead of 150
}
Instead of several requests per second:
setTimeout(govoters, 150); // one request every 150 milliseconds
Note that for start you can change from this:
//DO THIS TO START
setTimeout(govoters, 150);
To this:
//DO THIS TO START
govoters();
There isn't really a need for a timeout here.
Just change lines with this code:
setTimeout(govoters, 150);
with:
setTimeout(govoters, 1000); // 1 second
I'm a novice-to-intermediate JavaScript/jQuery programmer, so concrete/executable examples would be very much appreciated.
My project requires using AJAX to poll a URL that returns JSON containing either content to be added to the DOM, or a message { "status" : "pending" } that indicates that the backend is still working on generating a JSON response with the content. The idea is that the first request to the URL triggers the backend to start building a JSON response (which is then cached), and subsequent calls check to see if this JSON is ready (in which case it's provided).
In my script, I need to poll this URL at 15-second intervals up to 1:30 mins., and do the following:
If the AJAX request results in an error, terminate the script.
If the AJAX request results in success, and the JSON content contains { "status" : "pending" }, continue polling.
If the AJAX request results in success, and the JSON content contains usable content (i.e. any valid response other than { "status" : "pending" }), then display that content, stop polling and terminate the script.
I've tried a few approaches with limited success, but I get the sense that they're all messier than they need to be. Here's a skeletal function I've used with success to make a single AJAX request at a time, which does its job if I get usable content from the JSON response:
// make the AJAX request
function ajax_request() {
$.ajax({
url: JSON_URL, // JSON_URL is a global variable
dataType: 'json',
error: function(xhr_data) {
// terminate the script
},
success: function(xhr_data) {
if (xhr_data.status == 'pending') {
// continue polling
} else {
success(xhr_data);
}
},
contentType: 'application/json'
});
}
However, this function currently does nothing unless it receives a valid JSON response containing usable content.
I'm not sure what to do on the lines that are just comments. I suspect that another function should handle the polling, and call ajax_request() as needed, but I don't know the most elegant way for ajax_request() to communicate its results back to the polling function so that it can respond appropriately.
Any help is very much appreciated! Please let me know if I can provide any more information. Thanks!
You could use a simple timeout to recursively call ajax_request.
success: function(xhr_data) {
console.log(xhr_data);
if (xhr_data.status == 'pending') {
setTimeout(function() { ajax_request(); }, 15000); // wait 15 seconds then call ajax_request again
} else {
success(xhr_data);
}
}
Stick a counter check around that line and you've got a max number of polls.
if (xhr_data.status == 'pending') {
if (cnt < 6) {
cnt++;
setTimeout(function() { ajax_request(); }, 15000); // wait 15 seconds then call ajax_request again
}
}
You don't need to do anything in your error function unless you want to put an alert up or something. The simple fact that it errored will prevent the success function from being called and possibly triggering another poll.
Thank you very much for the function. It is a little bit buggy, but here is the fix. roosteronacid's answer doesn't stop after reaching 100%, because of incorrect usage of the clearInterval function.
Here is a working function:
$(function ()
{
var statusElement = $("#status");
// this function will run each 1000 ms until stopped with clearInterval()
var i = setInterval(function ()
{
$.ajax(
{
success: function (json)
{
// progress from 1-100
statusElement.text(json.progress + "%");
// when the worker process is done (reached 100%), stop execution
if (json.progress == 100) clearInterval(i);
},
error: function ()
{
// on error, stop execution
clearInterval(i);
}
});
}, 1000);
});
The clearInterval() function gets the interval ID as a parameter, and then everything is fine ;-)
Cheers
Nik
Off the top of my head:
$(function ()
{
// reference cache to speed up the process of querying for the status element
var statusElement = $("#status");
// this function will run each 1000 ms until stopped with clearInterval()
var i = setInterval(function ()
{
$.ajax(
{
success: function (json)
{
// progress from 1-100
statusElement.text(json.progress + "%");
// when the worker process is done (reached 100%), stop execution
if (json.progress == 100) i.clearInterval();
},
error: function ()
{
// on error, stop execution
i.clearInterval();
}
});
}, 1000);
});
You can use the JavaScript setInterval function to reload the contents every 3 seconds.
var auto= $('#content'), refreshed_content;
refreshed_content = setInterval(function(){
auto.fadeOut('slow').load("result.php").fadeIn("slow");
}, 3000);
For your reference-
Auto refresh div content every 3 sec
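If you ever need to stop the auto-refresh, the interval ID returned by setInterval() can be cleared. A small hypothetical example (the #stop-refresh button is made up; refreshed_content holds the interval ID from above):
// Hypothetical stop button that cancels the 3-second refresh loop.
$('#stop-refresh').click(function () {
    clearInterval(refreshed_content);
});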