I am trying to call a controller action with AJAX 10 times, with a 2-second delay between calls, in my MVC5 application.
Here is the code I've written:
$(document).ready(function () {
    (function loop(i) {
        setTimeout(function () {
            var d = new Date();
            console.log(d.getTime());
            callAjax();
            console.log("works " + i);
            if (--i) loop(i);
        }, 2000); // ms
    })(10);

    function callAjax() {
        $.ajax({
            url: '/Home/StartElection',
            type: 'POST',
            data: "test",
            async: true
        })
        .done(function (partialViewResult) {
            $("#partialTable").html(partialViewResult);
        });
    };
});
The console log is as expected (with a 2-second delay), but the calls to the controller seem to happen instantly: when I set a breakpoint in Visual Studio on the controller action, the next call after continuing arrives about 2 ms later.
I can't see why this is happening - can anyone help?
Edit: I added a console log of Date.getTime() just before the AJAX call, and there are 2000 ms between each call.
You only have to change this one line: async: true -> async: false,
because the AJAX calls are made asynchronously when this property is set to true,
which is why your AJAX calls appear to have no time delay between them.
Hope this helps.
As far as your client-side code is concerned, the calls themselves seem to be working fine. With that said, here are a few things to consider when dealing with AJAX:
1) You have control over the number of times you call a remote service, but you have no control over how long that service will take to respond.
2) As such, it is usually good practice not to make AJAX calls in a loop (this somewhat defeats the purpose of AJAX). Rather, use the response to each call to trigger the next call (though of course we would need to know exactly what you are trying to build to suggest an exact solution).
So the closest thing to what you are looking for, I think, would be something like this:
$(document).ready(function () {
    /*(function loop(i) {
        setTimeout(function() {
            var d = new Date();
            console.log(d.getTime());
            callAjax();
            console.log("works " + i);
            if (--i) loop(i);
        }, 2000); // ms
    })(10);*/

    var i = 0;

    function callAjax() {
        var requestTimeMessage = "fetch #" + i + " at: " + new Date().getTime();
        console.log(requestTimeMessage);
        $.ajax({
            url: '/Home/StartElection',
            type: 'POST',
            data: "test",
            async: true
        })
        .done(function (partialViewResult) {
            var responseTimeMessage = "response #" + i + " at: " + new Date().getTime();
            console.log(responseTimeMessage);
            $("#partialTable").html(partialViewResult);
            i++;
            if (i < 10) {
                /* Note this assumes a quick response for each call.
                 * Otherwise the response time together with this delay will total a wait longer than 2 seconds.
                 */
                setTimeout(function () {
                    callAjax();
                }, 2000);
            }
        });
    };

    // kick off the first request
    callAjax();
});
But as I said, I would need to know exactly what you are trying to achieve to give a more appropriate answer to your question.
Related
I would like to set up a progress bar showing the progress of a long-running task which imports a large CSV file and writes it to the database. I start the import process with an initial jQuery.ajax call and set up a timeout to poll for the percentage of lines processed so far.
The problem is that once I start the initial AJAX call, all other AJAX calls just wait until the initial call is done.
So this is my code:
var progress = false;

var update_progress = function() {
    if(progress) {
        $.ajax({
            url: 'index.php?do=update_progress'
        },
        function(json) {
            // Something < 100
            if(json.perc !== undefined) {
                $('#progress').css('width', json.perc + '%');
            }
            setTimeout(update_progress, 1000);
        });
    }
}

var start_import = function(i) {
    // Setting progress allowed
    progress = true;
    // start the update in 1s
    setTimeout(update_progress, 1000);
    // start the database-import (20-30 seconds runtime on server)
    $.ajax({
        url: 'index.php?do=start_import'
    },
    function(json) {
        // Import finished, disallow progressing
        progress = false;
        // Finally always complete: json.perc is 100
        if(json.perc !== undefined) {
            $('#progress').css('width', json.perc + '%');
        }
    });
};

start_import();
This is a bit confusing, because I thought each call would run asynchronously on its own. What is wrong?
Regards, Tim
Why don't you call setInterval() instead of setTimeout()? That is the problem: the next call to update_progress() only happens after the callback from the previous AJAX call has returned.
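To illustrate the suggestion, here is a minimal sketch of what the polling side might look like with setInterval(). It assumes the same index.php?do=update_progress endpoint and #progress element from the question, and that json.perc reaches 100 when the import is done.

var progressTimer = null;

function start_progress_polling() {
    // Poll the progress endpoint every second, independently of how long
    // each individual request takes to come back.
    progressTimer = setInterval(function () {
        $.getJSON('index.php?do=update_progress', function (json) {
            if (json.perc !== undefined) {
                $('#progress').css('width', json.perc + '%');
            }
            // Stop polling once the import reports completion.
            if (json.perc >= 100) {
                clearInterval(progressTimer);
            }
        });
    }, 1000);
}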
Some AJAX code that I am using to request a page is consuming too much memory and is making the browser slow and laggy. It seems like there is recursion going on and I don't know of any way to prevent it. Here is what the code looks like.
$(".item").each(function() {
$this = $(this);
var dataString = {s: "<?echo $_SESSION['currentview_'.$stamp]?>", r:"<?echo $search_usernumber?>", st: "<?echo $stamp?>"};
$.ajaxSetup({cache:false});
function timeLeft() {
$.ajax({
type: "POST",
url: "get_content_home.php",
dataType: "html",
data: dataString,
success: function(result) {
$this.html(result);
//console.log("a");
window.setInterval(function() {
timeLeft();
}, 500);
}
});
}
timeLeft();
});
How can I solve this problem? Thanks in advance.
You are recursing, and you shouldn't be using this form of nested setInterval. Doing this will cause an explosion of interval instances. Instead of using setInterval, schedule additional requests using setTimeout.
setInterval will fire and continue firing every interval until you tell it to stop.
setTimeout will fire once.
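As a minimal illustration of that difference (the poll() function here is hypothetical, not from the question):

function poll() {
    console.log("polling at " + new Date().getTime());
}

// Option A - setInterval: fires poll() every 500 ms, forever,
// until you explicitly stop it with clearInterval(intervalId).
// var intervalId = setInterval(poll, 500);

// Option B - setTimeout: fires once; you reschedule it yourself,
// which gives you a natural place to decide whether to continue.
function pollOnceThenReschedule() {
    poll();
    setTimeout(pollOnceThenReschedule, 500);
}
pollOnceThenReschedule();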
Let's consider the following code, which should address some of the issues you are having in this question as well as your other two questions.
First off, as we said before, don't use setInterval unless you actually want it to run forever. Additionally, don't nest the setInterval creations unless you actually mean to.
Instead, let's create a recursive function getTimeLeft() that will handle firing the request and scheduling the next check for time left after some duration.
This example also mocks the $.ajax() function so that you can see the function in action since we don't have an actual back end to use.
// Fake server side data to keep track of time lefts
const timeLefts = {
  foo: 0,
  bar: 0,
  fizz: 0,
  buzz: 0
};

const timeLeftsUpdateInterval = setInterval(() => {
  for (const [key, val] of Object.entries(timeLefts)) {
    timeLefts[key] = Math.min(val + Math.random() * 10, 100);
  }
  if (Object.entries(timeLefts).every(([k, v]) => v >= 100)) {
    clearInterval(timeLeftsUpdateInterval);
  }
}, 1000);

// Mock $.ajax function to stub sending AJAX requests
function $ajax(kwargs) {
  return {
    done: cb => {
      setTimeout(() => {
        cb(timeLefts[kwargs.data.x]);
      }, 500);
    }
  };
}

// We will check for an update every second after the last request finishes
const timeLeftCheckInterval = 1000;

// Continuously check to see how much time is left for an element
function getTimeLeft(el) {
  // Make our request data
  const dataString = {
    s: "<?echo $_SESSION['currentview_'.$stamp]?>",
    r: "<?echo $search_usernumber?>",
    st: "<?echo $stamp?>",
    // My custom property to make this work
    x: el.dataset.item
  };
  // Make our request to get the time left
  const req = $ajax({ // Using our mock $.ajax
    type: "POST",
    url: "get_content_home.php",
    dataType: "html",
    data: dataString
  });
  // Once the request has finished
  req.done(data => {
    // Set the time left to the element
    el.innerHTML = data;
    // Have some condition so that you don't check for time left forever.
    // The mock above reports a value that climbs to 100, so stop once it gets there;
    // with a real back end you would stop once there is no time left. Why keep checking?
    if (data >= 100) return;
    // Schedule another round of checking for time left after some duration
    setTimeout(() => {
      getTimeLeft(el);
    }, timeLeftCheckInterval);
  });
}

// Kick off getting timeleft for all .items
Array.from(document.querySelectorAll(".item"))
  .forEach(el => getTimeLeft(el));
<ul>
  <li class="item" data-item="foo"></li>
  <li class="item" data-item="bar"></li>
  <li class="item" data-item="fizz"></li>
  <li class="item" data-item="buzz"></li>
</ul>
This code will address the issue that you are having in 2 Ajax non-blocking, because each element has its own logic for fetching its time left and updating itself.
This also addresses the issue that you are potentially facing in Timer in Ajax - Preemption, because now an element won't check how much time is left again until after the previous check is finished.
Is there a way via $.ajaxSetup to throttle requests so that anything beyond X requests in Y seconds doesn't happen until enough time has passed for the throttle to reset itself? I know I could use jQuery throttle,
but I am looking for a global way to affect all $.ajax, $.get, and $.post calls.
For example, I might set it up so that I only allow 20 requests in 5 seconds. On the 21st request I would have an error handler display a message. Every 5 seconds the counter would reset so requests can be made again.
Here is a way to rate limit all AJAX requests. You can define a rate limit interval in milliseconds and how many requests are allowed in that time period.
var RATE_LIMIT_IN_MS = 100;
var NUMBER_OF_REQUESTS_ALLOWED = 2;
var NUMBER_OF_REQUESTS = 0;

setInterval(function ()
{
    NUMBER_OF_REQUESTS = 0;
}, RATE_LIMIT_IN_MS);

$.ajaxSetup({
    beforeSend: function canSendAjaxRequest()
    {
        var can_send = NUMBER_OF_REQUESTS < NUMBER_OF_REQUESTS_ALLOWED;
        NUMBER_OF_REQUESTS++;
        return can_send;
    }
});
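If you also want the 21st request to trigger a visible message, as described in the question, one option is to invoke your own handler before cancelling the request. This is a minimal sketch using the question's numbers (20 requests per 5 seconds); the onRateLimited() function is hypothetical and you would define it yourself. Returning false from beforeSend cancels the request.

var RATE_LIMIT_IN_MS = 5000;          // reset window from the question
var NUMBER_OF_REQUESTS_ALLOWED = 20;  // max requests per window
var NUMBER_OF_REQUESTS = 0;

// Hypothetical handler: show a message, log, etc.
function onRateLimited() {
    alert("Too many requests - please wait a few seconds and try again.");
}

setInterval(function () {
    NUMBER_OF_REQUESTS = 0;
}, RATE_LIMIT_IN_MS);

$.ajaxSetup({
    beforeSend: function () {
        NUMBER_OF_REQUESTS++;
        if (NUMBER_OF_REQUESTS > NUMBER_OF_REQUESTS_ALLOWED) {
            onRateLimited();
            return false; // cancel this request
        }
    }
});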
Couldn't you put the ajax call into a function and track how often that function is called?
var globalCount = 0;
var ajaxOperations = {
    init: function () {
        //init
    },
    loadNewsletterTemplate: function (templateDataFile) {
        globalCount++;
        $.ajax({
            url: "data/" + templateDataFile,
            type: "GET",
            success: function (response) {
                globalCount--;
                formProcessing.convertToHTML_newsLetter_weeklyUpdates(response);
                //console.log("response" + response);
            }
        });
    }
}
Every time ajaxOperations is called you increment a global counter, and every time a success fires you decrement it, thereby maintaining a steady flow: if (globalCount < 5) { call a new ajax event }.
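A minimal sketch of how that gate might be used with the object above; the template file name is just an example.

// Only fire a new request while fewer than 5 are still in flight.
function requestTemplateIfIdleEnough(templateDataFile) {
    if (globalCount < 5) {
        ajaxOperations.loadNewsletterTemplate(templateDataFile);
    } else {
        console.log("Too many requests in flight, skipping " + templateDataFile);
    }
}

requestTemplateIfIdleEnough("weekly_update.json"); // hypothetical template file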
Throttle one function, isSafe(), that returns true if it's good to go or false otherwise.
In ajaxSetup, have beforeSend() return the result of isSafe(). If it is false, the request will not be sent.
I haven't tested it, but it would be something like:
const isSafe = $.throttle(numberOfMillis, () => true);
$.ajaxSetup({
beforeSend: isSafe
});
I'm a novice-to-intermediate JavaScript/jQuery programmer, so concrete/executable examples would be very much appreciated.
My project requires using AJAX to poll a URL that returns JSON containing either content to be added to the DOM, or a message { "status" : "pending" } that indicates that the backend is still working on generating a JSON response with the content. The idea is that the first request to the URL triggers the backend to start building a JSON response (which is then cached), and subsequent calls check to see if this JSON is ready (in which case it's provided).
In my script, I need to poll this URL at 15-second intervals up to 1:30 mins., and do the following:
If the AJAX request results in an error, terminate the script.
If the AJAX request results in success, and the JSON content contains { "status" : "pending" }, continue polling.
If the AJAX request results in success, and the JSON content contains usable content (i.e. any valid response other than { "status" : "pending" }), then display that content, stop polling and terminate the script.
I've tried a few approaches with limited success, but I get the sense that they're all messier than they need to be. Here's a skeletal function I've used with success to make a single AJAX request at a time, which does its job if I get usable content from the JSON response:
// make the AJAX request
function ajax_request() {
    $.ajax({
        url: JSON_URL, // JSON_URL is a global variable
        dataType: 'json',
        error: function(xhr_data) {
            // terminate the script
        },
        success: function(xhr_data) {
            if (xhr_data.status == 'pending') {
                // continue polling
            } else {
                success(xhr_data);
            }
        },
        contentType: 'application/json'
    });
}
However, this function currently does nothing unless it receives a valid JSON response containing usable content.
I'm not sure what to do on the lines that are just comments. I suspect that another function should handle the polling, and call ajax_request() as needed, but I don't know the most elegant way for ajax_request() to communicate its results back to the polling function so that it can respond appropriately.
Any help is very much appreciated! Please let me know if I can provide any more information. Thanks!
You could use a simple timeout to recursively call ajax_request.
success: function(xhr_data) {
    console.log(xhr_data);
    if (xhr_data.status == 'pending') {
        setTimeout(function() { ajax_request(); }, 15000); // wait 15 seconds, then call ajax_request again
    } else {
        success(xhr_data);
    }
}
Stick a counter check around that line and you've got a max number of polls.
if (xhr_data.status == 'pending') {
    if (cnt < 6) {
        cnt++;
        setTimeout(function() { ajax_request(); }, 15000); // wait 15 seconds, then call ajax_request again
    }
}
You don't need to do anything in your error function unless you want to put up an alert or something. The simple fact that it errored will prevent the success function from being called and possibly triggering another poll.
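Putting those pieces together, a complete version of the polling function might look something like this sketch. The counter cap of 6 matches the snippet above and roughly corresponds to polling every 15 seconds for up to 1:30; JSON_URL and success() are the asker's own global variable and handler.

var cnt = 0;

function ajax_request() {
    $.ajax({
        url: JSON_URL,          // global variable from the question
        dataType: 'json',
        contentType: 'application/json',
        error: function () {
            // an error simply ends the polling; nothing else to do here
        },
        success: function (xhr_data) {
            if (xhr_data.status == 'pending') {
                // still pending: poll again in 15 seconds, at most 6 more times
                if (cnt < 6) {
                    cnt++;
                    setTimeout(ajax_request, 15000);
                }
            } else {
                // usable content arrived: hand it to the success() handler and stop
                success(xhr_data);
            }
        }
    });
}

ajax_request();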
Thank you very much for the function. It is a little bit buggy, but here is the fix: roosteronacid's answer doesn't stop after reaching 100%, because clearInterval is used incorrectly.
Here is a working function:
$(function ()
{
    var statusElement = $("#status");

    // this function will run each 1000 ms until stopped with clearInterval()
    var i = setInterval(function ()
    {
        $.ajax(
        {
            success: function (json)
            {
                // progress from 1-100
                statusElement.text(json.progress + "%");

                // when the worker process is done (reached 100%), stop execution
                if (json.progress == 100) clearInterval(i);
            },
            error: function ()
            {
                // on error, stop execution
                clearInterval(i);
            }
        });
    }, 1000);
});
The clearInterval() function takes the interval ID as a parameter, and then everything is fine ;-)
Cheers,
Nik
Off the top of my head:
$(function ()
{
    // reference cache to speed up the process of querying for the status element
    var statusElement = $("#status");

    // this function will run each 1000 ms until stopped with clearInterval()
    var i = setInterval(function ()
    {
        $.ajax(
        {
            success: function (json)
            {
                // progress from 1-100
                statusElement.text(json.progress + "%");

                // when the worker process is done (reached 100%), stop execution
                if (json.progress == 100) i.clearInterval();
            },
            error: function ()
            {
                // on error, stop execution
                i.clearInterval();
            }
        });
    }, 1000);
});
You can use the JavaScript setInterval function to reload the content every 3 seconds.
var auto = $('#content'), refreshed_content;

refreshed_content = setInterval(function () {
    auto.fadeOut('slow').load("result.php").fadeIn("slow");
}, 3000);
For your reference:
Auto refresh div content every 3 sec
I am looking into QUnit for JavaScript unit testing. I am in a strange situation where I am checking against the value returned from the Ajax call.
In the following test I am purposely trying to make it fail.
// test to check if the persons are returned!
test("getPersons", function() {
    getPersons(function(response) {
        // persons = $.evalJSON(response.d);
        equals("boo", "Foo", "The name is valid");
    });
});
But it ends up passing all the time. Here is the getPersons method that makes the Ajax call.
function getPersons(callback) {
    var persons = null;
    $.ajax({
        type: "POST",
        dataType: "json",
        data: {},
        contentType: "application/json",
        url: "AjaxService.asmx/GetPersons",
        success: function(response) {
            callback(response);
        }
    });
}
Starting and stopping using the QUnit library seems to be working!
// test to check if the persons are returned!
test("getPersons", function() {
    stop();
    getPersons(function(response) {
        persons = $.evalJSON(response.d);
        equals(persons[0].FirstName, "Mohammad");
        start();
    });
});
The real problem here isn't needing to call the start() and stop() methods; in fact, you can get into trouble with that approach if you have additional .ajax() calls and are not careful to call stop() again at the end of each callback. You then find yourself in a snarled mess of tracking whether all the callbacks have fired to know whether you still need to call stop() again.
The root of the problem involves the default behavior of asynchronous requests - and the simple solution is to make the .ajax() request happen synchronously by setting the async property to false:
test("synchronous test", function() {
$.ajax({
url:'Sample.asmx/Service',
async:false,
success: function(d,s,x) { ok(true, "Called synchronously"); }
});
});
Still, the best approach is to allow the asynchronous behavior and use the right test method: asyncTest(). According to the docs, "Asynchronous tests are queued and run one after the other. Equivalent to calling a normal test() and immediately calling stop()."
asyncTest("a test", function() {
$.ajax({
url: 'Sample.asmx/Service',
success: function(d,s,x) {
ok(true, "asynchronous PASS!");
start();
}
});
});
There are a lot of QUnit tests in my project, like:
module("comment");
asyncTest("comment1", function() {
expect(6);
$.ajax({
url: 'http://xxx.com/comment',
dataType: "json",
type: "GET",
timeout: 1000
}).done(function(data) {
ok(true, "loaded");
ok(data.data.length>1, "array size");
ok(data.total, "attr total");
var c = data.data[0];
ok(c.id, "attr c.id");
ok(c.user_id, "attr c.user_id");
ok(c.type == 4, "attr c.type equal 4");
}).fail(function(x, text, thrown) {
ok(false, "ajax failed: " + text);
}).always(function(){
start();
});
});
I've done some QUnit testing with AJAX; it's not pretty. The best thing I could come up with is stopping the test when the AJAX request is fired and starting it again in the success callback (using the start() and stop() methods). This meant one AJAX request at a time, but I could live with that. Good luck.
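A minimal sketch of that pattern, assuming a hypothetical /api/thing endpoint: stop() pauses QUnit before the request fires, and start() resumes it once the assertions in the callback have run.

test("stop/start around a single ajax call", function() {
    stop(); // pause QUnit until start() is called
    $.ajax({
        url: '/api/thing',      // hypothetical endpoint
        dataType: 'json',
        success: function(data) {
            ok(data !== undefined, "got a response");
            start();            // resume QUnit
        },
        error: function() {
            ok(false, "request failed");
            start();            // resume even on failure so the suite doesn't hang
        }
    });
});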