Exit setTimeout loop within AJAX GET request - javascript

I have a function that performs an AJAX GET request every 1 second to retrieve data and update a progress bar. I'm using setTimeout to accomplish this. Here is the code:
function check_progress(thread_id) {
    function worker() {
        $.get('ecab_run/progress/' + thread_id, function(data) {
            progress = data['progress'];
            status_message = data['status_message'];
            if (progress < 100) {
                $('.progress-bar').css('width', progress+'%').attr('aria-valuenow', progress);
                $('.progress-bar')[0].innerHTML = progress+"%";
                $('#status-message')[0].innerHTML = status_message
                timer = setTimeout(worker, 1000);
                // console.log(timer);
            } else if ( progress == 100 ){
                $('.progress-bar').css('width', progress+'%').attr('aria-valuenow', progress);
                $('.progress-bar')[0].innerHTML = progress+"%";
            }
        })
        return status_message;
    }
    worker();
}
The function is called after a successful POST as such:
$('#ecab-run-dates').submit(function(e){
    e.preventDefault();
    var formData = $('form').serialize();
    $.ajax({
        url: '',
        type: 'post',
        data: formData,
        success: function(data){
            thread_id = data;
            $('.progress-container').show();
            check_progress(thread_id);
        }
    });
});
The backend returns status messages as the behind-the-scenes functions execute, and when it encounters an error it returns something like Error: could not compile data.
My initial thought was to use that error message as a condition for stopping check_progress. I've read several answers, including this one, that say to use clearTimeout, but I'm not exactly sure how to structure that in my code. How can I exit the setTimeout loop when an error is encountered?
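One way to structure this, as a minimal sketch rather than a definitive fix: the loop only continues when setTimeout schedules the next worker() call, so checking the status message and simply not rescheduling (or clearing the pending timer) stops the polling. The check for an "Error" prefix below is an assumption based on the message format described above.
function check_progress(thread_id) {
    var timer;
    function worker() {
        $.get('ecab_run/progress/' + thread_id, function(data) {
            var progress = data['progress'];
            var status_message = data['status_message'];
            $('.progress-bar').css('width', progress + '%').attr('aria-valuenow', progress);
            $('.progress-bar')[0].innerHTML = progress + "%";
            $('#status-message')[0].innerHTML = status_message;
            // assumption: the backend reports failures in status_message,
            // e.g. "Error: could not compile data"
            if (status_message.indexOf('Error') === 0) {
                clearTimeout(timer);  // defensive; no new timeout has been scheduled yet anyway
                return;               // stop polling
            }
            if (progress < 100) {
                timer = setTimeout(worker, 1000);  // only reschedule while still in progress
            }
        });
    }
    worker();
}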

Related

jQuery ajax requests only working linear and not parallel

I would like to set up a progress bar showing the progress of a long-running task that imports a large CSV file and writes it to the database. I start the import with an initial jQuery.ajax call and set up a timeout to fetch the percentage of processed lines from that file.
The problem is that once I start the initial ajax call, all other ajax calls just wait to be executed until the initial call is done.
So this is my code:
var progress = false;

var update_progress = function() {
    if (progress) {
        $.ajax({
            url: 'index.php?do=update_progress',
            success: function(json) {
                // Something < 100
                if (json.perc !== undefined) {
                    $('#progress').css('width', json.perc + '%');
                }
                setTimeout(update_progress, 1000);
            }
        });
    }
}

var start_import = function(i) {
    // Setting progress allowed
    progress = true;
    // start the update in 1s
    setTimeout(update_progress, 1000);
    // start the database-import (20-30 seconds runtime on server)
    $.ajax({
        url: 'index.php?do=start_import',
        success: function(json) {
            // Import finished, disallow progressing
            progress = false;
            // Finally always complete: json.perc is 100
            if (json.perc !== undefined) {
                $('#progress').css('width', json.perc + '%');
            }
        }
    });
};

start_import();
This is a bit confusing, because I thought each call would run asynchronously on its own. What is wrong?
Regards, Tim
Why not call setInterval() instead of setTimeout()? That is the problem: with setTimeout, the next call to update_progress() only happens after the callback from the previous AJAX call has returned.
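A minimal sketch of that suggestion, polling with setInterval while the import runs. It reuses the endpoints from the question and uses $.getJSON for brevity; treat it as an illustration of the idea, not code tested against the original backend.
var progressTimer = null;

var start_import = function() {
    // poll the progress endpoint every second, independently of the import call
    progressTimer = setInterval(function() {
        $.getJSON('index.php?do=update_progress', function(json) {
            if (json.perc !== undefined) {
                $('#progress').css('width', json.perc + '%');
            }
        });
    }, 1000);

    // start the long-running import; stop polling once it returns
    $.getJSON('index.php?do=start_import', function(json) {
        clearInterval(progressTimer);
        if (json.perc !== undefined) {
            $('#progress').css('width', json.perc + '%');
        }
    });
};

start_import();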

setInterval() loop still running after clearInterval() is called

I know that setInterval() returns a unique ID, and that clearInterval must be called with that ID to kill the loop. I believe my code does this, yet the loop continues to run indefinitely.
var refresh = setInterval(function () {
    $.get(url, function (data) {
        success: {
            if (data < 5) {
                data = 5;
            }
            var width = data + "%";
            $("#recalculation-progress-bar").css('width', width);
            if (data > 99) {
                clearInterval(refresh);
                $("#recalculation-message").text("Recalculation complete.");
            }
        }
    });
}, 3000);
I have checked this in debug and clearInterval() is definitely being called and no errors are thrown. Am I doing something obviously wrong?
1. data in $.get is a string by default (https://api.jquery.com/jquery.get/), so convert it first:
data = parseInt(data, 10)
2. You are adding a second level of asynchrony: are you sure each request finishes in less than 3000 ms?
3. If data is > 99, the code works.
My concern is that the request takes longer than 3 seconds, so you are still receiving responses from connections you launched earlier.
if (data > 99) {
    clearInterval(refresh);
    $("#recalculation-message").text("Recalculation complete.");
} else {
    // relaunch the request here and remove the interval
}
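A minimal sketch that combines those points: parse the value, and reschedule with setTimeout only after each response arrives instead of firing a fixed interval. The url and element IDs are taken from the question; treat this as an illustration, not tested code.
function poll() {
    $.get(url, function (data) {
        data = parseInt(data, 10);      // $.get delivers a string here; convert it
        if (data < 5) {
            data = 5;
        }
        $("#recalculation-progress-bar").css('width', data + "%");
        if (data > 99) {
            $("#recalculation-message").text("Recalculation complete.");
        } else {
            setTimeout(poll, 3000);     // relaunch only after this response has arrived
        }
    });
}
poll();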

Making multiple ajax requests synchronously

Let's suppose I have some function called makeRequest(), which makes an AJAX request to a server.
Now let's suppose I am given the number of times this request should be made, but I can't fire them all at once: each request must run only after the previous one has finished.
For instance, given the number 5, I shall call makeRequest(), and when it's done call it again, and so on, until I end up having called it 5 times.
I'm no expert at JavaScript but I found it easy to handle asynchronous calls by the use of callbacks.
So, my makeRequest() function takes a callback argument that is to be executed when the request has succeeded.
In my previous example, I had to make the request 5 times, so the behaviour should look like:
makeRequest(function () {
    makeRequest(function () {
        makeRequest(function () {
            makeRequest(function () {
                makeRequest(function () {
                });
            });
        });
    });
});
How can I design this to behave the same for any argument given to me, be it 6, 12 or even 1?
PS: I have tried many approaches, most of them involving while loops that wait until a flag is set by a finished request. All of these approaches make the browser think the script has crashed and prompt the user to stop it.
Simple: recursively call the ajax request while keeping track of a count variable:
function makeRequest(count, finalCallback){
    someAjaxCall(data, function(){
        if (count > 1){
            makeRequest(count - 1, finalCallback);
        } else {
            finalCallback && finalCallback();
        }
    });
}
finalCallback is an optional callback (function) that will be executed when all the requests have completed.
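For example (assuming someAjaxCall and data are defined elsewhere, as in the snippet above), making the request five times and then reporting completion:
makeRequest(5, function () {
    console.log('All 5 requests completed');
});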
You can do it this way:
var i = 5; // number of calls to your function making the ajax request
recurs(i); // call it initially
function recurs(count) {
    makeRequest(function() {
        count--; // decrement count
        if (count > 0) { // keep going until all the calls have been made
            recurs(count); // call the function again
        }
    });
}
Here I have written multiple Ajax calls using promises. The calls run sequentially, and the callback receives the position of each response as it is executed.
var ajaxs = {
    i: 0,
    callback_fun: null,
    param: null,
    exec_fun: function (i) {
        let data_send = this.param[i];
        let url = this.url;
        this.promise = new Promise(function (res, rej) {
            $.ajax({
                url: url,
                method: 'POST',
                data: data_send,
                dataType: 'json',
                success: function (resvalidate) {
                    res(resvalidate);
                }
            });
        });
        this.promise.then(function (resvalidate) {
            let callback_fun = ajaxs.callback_fun;
            callback_fun(resvalidate, ajaxs.i);
            ajaxs.i++;
            if (ajaxs.param[ajaxs.i] != undefined) {
                ajaxs.exec_fun(ajaxs.i);
            }
        });
    },
    each: function (url, data, inc_callback) {
        this.callback_fun = inc_callback;
        this.param = data;
        this.url = url;
        this.exec_fun(ajaxs.i);
    }
};

let url = "http://localhost/dev/test_ajax.php";
let data_param = [{data: 3}, {data: 1}, {data: 2}];
ajaxs.each(url, data_param, function (resp, i) {
    console.log(resp, i);
});
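For comparison, here is a more compact sketch of the same sequential idea that chains the promise $.ajax itself returns. It assumes jQuery 1.8 or later, where .then() returns a new chained promise; eachSequential is just an illustrative name.
// Run one POST per parameter set, strictly one after another, in order.
function eachSequential(url, params, callback) {
    params.reduce(function (chain, data, i) {
        return chain.then(function () {
            return $.ajax({ url: url, type: 'POST', data: data, dataType: 'json' })
                .then(function (resp) { callback(resp, i); });
        });
    }, $.Deferred().resolve().promise());
}

eachSequential("http://localhost/dev/test_ajax.php", [{data: 3}, {data: 1}, {data: 2}], function (resp, i) {
    console.log(resp, i);
});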

jQuery AJAX polling for JSON response, handling based on AJAX result or JSON content

I'm a novice-to-intermediate JavaScript/jQuery programmer, so concrete/executable examples would be very much appreciated.
My project requires using AJAX to poll a URL that returns JSON containing either content to be added to the DOM, or a message { "status" : "pending" } that indicates that the backend is still working on generating a JSON response with the content. The idea is that the first request to the URL triggers the backend to start building a JSON response (which is then cached), and subsequent calls check to see if this JSON is ready (in which case it's provided).
In my script, I need to poll this URL at 15-second intervals for up to 1 minute 30 seconds, and do the following:
If the AJAX request results in an error, terminate the script.
If the AJAX request results in success, and the JSON content contains { "status" : "pending" }, continue polling.
If the AJAX request results in success, and the JSON content contains usable content (i.e. any valid response other than { "status" : "pending" }), then display that content, stop polling and terminate the script.
I've tried a few approaches with limited success, but I get the sense that they're all messier than they need to be. Here's a skeletal function I've used with success to make a single AJAX request at a time, which does its job if I get usable content from the JSON response:
// make the AJAX request
function ajax_request() {
    $.ajax({
        url: JSON_URL, // JSON_URL is a global variable
        dataType: 'json',
        error: function(xhr_data) {
            // terminate the script
        },
        success: function(xhr_data) {
            if (xhr_data.status == 'pending') {
                // continue polling
            } else {
                success(xhr_data);
            }
        },
        contentType: 'application/json'
    });
}
However, this function currently does nothing unless it receives a valid JSON response containing usable content.
I'm not sure what to do on the lines that are just comments. I suspect that another function should handle the polling, and call ajax_request() as needed, but I don't know the most elegant way for ajax_request() to communicate its results back to the polling function so that it can respond appropriately.
Any help is very much appreciated! Please let me know if I can provide any more information. Thanks!
You could use a simple timeout to recursively call ajax_request.
success: function(xhr_data) {
    console.log(xhr_data);
    if (xhr_data.status == 'pending') {
        setTimeout(function() { ajax_request(); }, 15000); // wait 15 seconds, then call ajax_request again
    } else {
        success(xhr_data);
    }
}
Stick a counter check around that line and you've got a max number of polls.
if (xhr_data.status == 'pending') {
    if (cnt < 6) {
        cnt++;
        setTimeout(function() { ajax_request(); }, 15000); // wait 15 seconds, then call ajax_request again
    }
}
You don't need to do anything in your error function unless you want to show an alert or something. The simple fact that it errored will prevent the success function from being called and possibly triggering another poll.
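Putting those pieces together, a minimal sketch (it reuses JSON_URL, cnt and the success() handler from the snippets above; 6 polls at 15-second intervals roughly matches the 1:30 limit from the question):
var cnt = 0;

function ajax_request() {
    $.ajax({
        url: JSON_URL,       // JSON_URL is a global variable, as in the question
        dataType: 'json',
        error: function (xhr_data) {
            // do nothing: not scheduling another poll terminates the script
        },
        success: function (xhr_data) {
            if (xhr_data.status == 'pending') {
                if (cnt < 6) {                        // at most 6 polls (~1:30 at 15 s each)
                    cnt++;
                    setTimeout(ajax_request, 15000);  // poll again in 15 seconds
                }
            } else {
                success(xhr_data);                    // usable content: display it, stop polling
            }
        }
    });
}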
Thank you very much for the function. It is a little bit buggy, but here is the fix: roosteronacid's answer doesn't stop after reaching 100% because of wrong usage of the clearInterval function.
Here is a working function:
$(function () {
    var statusElement = $("#status");
    // this function will run every 1000 ms until stopped with clearInterval()
    var i = setInterval(function () {
        $.ajax({
            success: function (json) {
                // progress from 1-100
                statusElement.text(json.progress + "%");
                // when the worker process is done (reached 100%), stop execution
                if (json.progress == 100) clearInterval(i);
            },
            error: function () {
                // on error, stop execution
                clearInterval(i);
            }
        });
    }, 1000);
});
The clearInterval() function receives the interval ID as its parameter, and then everything is fine ;-)
Cheers
Nik
Off the top of my head:
$(function () {
    // reference cache to speed up the process of querying for the status element
    var statusElement = $("#status");
    // this function will run each 1000 ms until stopped with clearInterval()
    var i = setInterval(function () {
        $.ajax({
            success: function (json) {
                // progress from 1-100
                statusElement.text(json.progress + "%");
                // when the worker process is done (reached 100%), stop execution
                if (json.progress == 100) i.clearInterval();
            },
            error: function () {
                // on error, stop execution
                i.clearInterval();
            }
        });
    }, 1000);
});
You can use the JavaScript setInterval function to reload the content every 3 seconds.
var auto = $('#content'), refreshed_content;
refreshed_content = setInterval(function(){
    auto.fadeOut('slow').load("result.php").fadeIn("slow");
}, 3000);
For your reference:
Auto refresh div content every 3 sec

Best way to add a 'callback' after a series of asynchronous XHR calls

I stumbled on a piece of Ajax code that is not 100% safe since it mixes asynchronous and synchronous styles. Basically, in the code below a jQuery.each grabs information from the elements and launches an Ajax GET request for each:
$(search).each(function() {
    $.ajax({
        url: 'save.x3?id=' + $(this).attr("id") + '&value=' + $(this).data("value"),
        success: function(o){
            // Update UI
        },
        error: function(o){
            // Update UI
        }
    });
});
//code to do after saving...
So obviously the 'code to do after saving...' often gets executed before all the requests have completed. Ideally I would have the server-side code handle all of them at once and move the 'code to do after saving' into the success callback, but assuming that is not possible, I changed the code to the following to make sure all requests came back before continuing. I'm still not in love with it:
var recs = [];
$(search).each(function() {
    recs[recs.length] = 'save.x3?id=' + $(this).attr("id") + '&value=' + $(this).data("value");
});
var counter = 0;

function saveRecords(){
    $.ajax({
        url: recs[counter],
        success: function(o){
            // Update progress
            if (counter < recs.length - 1){
                counter++;
                saveRecords();
            } else {
                doneSavingRecords();
            }
        },
        error: function(o){
            // Update progress
            doneSavingRecords(o.status);
        }
    });
}

function doneSavingRecords(text){
    // code to do after saving...
}

if (recs.length > 0){
    saveRecords(); // will recursively call itself until a failed request or until all records were saved
} else {
    doneSavingRecords();
}
So I'm looking for the 'best' way to add a bit of synchronous functionality to a series of asynchronous calls ?
Thanks!!
Better Answer:
function saveRecords(callback, errorCallback){
    $('<div></div>').ajaxStop(function(){
        $(this).remove(); // keep future AJAX events from affecting this
        callback();
    }).ajaxError(function(e, xhr, options, err){
        errorCallback(e, xhr, options, err);
    });

    $(search).each(function() {
        $.get('save.x3', { id: $(this).attr("id"), value: $(this).data("value") });
    });
}
Which would be used like this:
saveRecords(function(){
    // Complete will fire after all requests have completed with a success or error
}, function(e, xhr, options, err){
    // Error will fire for every error
});
Original Answer: This is good if the requests need to run in a certain order, or if you have other regular AJAX events on the page that would interfere with the use of ajaxStop, but it will be slower:
function saveRecords(callback){
    var recs = $(search).map(function(i, obj) {
        return { id: $(obj).attr("id"), value: $(obj).data("value") };
    }).get(); // .get() converts the jQuery object into a plain array so shift() works

    var save = function(){
        if (!recs.length) return callback();
        $.ajax({
            url: 'save.x3',
            data: recs.shift(), // shift removes/returns the first item in an array
            success: function(o){
                save();
            },
            error: function(o){
                // Update progress
                callback(o.status);
            }
        });
    }
    save();
}
Then you can call it like this:
saveRecords(function(error){
    // This function will run on error or after all
    // commands have run
});
If I understand what you're asking, I think you could use $.ajaxStop() for this purpose.
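A minimal sketch of that idea: $.ajaxStop registers a global handler that fires once every outstanding jQuery AJAX request has finished (in jQuery 1.8+ it must be bound to document).
$(document).ajaxStop(function () {
    // all pending requests have completed (with success or error)
    // ... code to do after saving ...
});

$(search).each(function () {
    $.get('save.x3', { id: $(this).attr("id"), value: $(this).data("value") });
});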
This is easily solved by calling the same check function as each AJAX call completes. You just need a simple queue shared between the functions and a quick length check (no loops, timers, promises, etc.).
// a list of URLs for which we'll make async requests
var queue = ['/something.json', '/another.json'];

// will contain our return data so we can work with it
// in our final unified callback ('displayAll' in this example)
var data = [];

// work through the queue, dispatch requests, check if complete
function processQueue( queue ){
    queue.forEach(function( url ) {
        $.getJSON( url, function( returnData ) {
            data.push(returnData);
            // reduce the length of queue by 1
            // don't care what URL is discarded, only that length is -1
            queue.pop();
            checkIfLast(function() { displayAll(data); });
        }).fail(function() {
            throw new Error("Unable to fetch resource: " + url);
        });
    });
}

// see if this is the last successful AJAX call (when queue.length == 0 it is the last)
// if this is the last success, run the callback
// otherwise don't do anything
function checkIfLast(callback){
    if (queue.length == 0){
        callback();
    }
}

// when all the things are done
function displayAll(things){
    console.log(things); // show all data after the final ajax request has completed
}

// begin
processQueue(queue);
Edit: I should add that I specifically aimed for an arbitrary number of items in the queue. You can simply add another URL and this will work just the same.
>> In the ideal world I would like to have the server-side code handle all of them at once and move //code to do after saving in the success callback
You'll need to think about this in terms of events.
Closure's net.BulkLoader (or a similar approach) will do it for you:
http://closure-library.googlecode.com/svn/trunk/closure/goog/docs/class_goog_net_BulkLoader.html
http://closure-library.googlecode.com/svn/trunk/closure/goog/docs/closure_goog_net_bulkloader.js.source.html
See goog.net.BulkLoader.prototype.handleSuccess_ (for individual calls) and goog.net.BulkLoader.prototype.finishLoad_ (for completion of all calls).
