Multiple JavaScript Ajax calls are blocking the UI (Laravel REST) - javascript

I'm fetching data (summarized timeframes) for my dashboard asynchronously using $.get(). The script is simple: I wait until the page (fonts, icons, ...) is completely rendered by using $(window).load(function () {}).
Then I use document.querySelectorAll('[data-id]') to find the related ids and start a query for each one in a for loop.
// some date ranges I'm using for the request
var request = [
    [moment(), moment(), 'today'],
    ( ... )
    [moment().subtract(1, 'year').startOf('year'), moment().subtract(1, 'year').endOf('year'), 'last-year']
];

// find all data-id elements
var t = document.querySelectorAll('[data-id]');

for (var i = 0; i < t.length; i++) {
    // extract the id
    var mid = t[i].getAttribute('data-id');
    // iterate the request array
    for (var j = 0; j < request.length; j++) {
        requestData(mid, request[j]);
    }
}
function requestData(id, time) {
    $.ajax({
        url: "/api/v1/data/point/" + id,
        type: 'GET',
        data: {
            from: time[0].format('YYYY-MM-DD'),
            to: time[1].format('YYYY-MM-DD'),
            sum: true
        },
        dataType: 'json',
        success: function (response) {
            // find elements by id and replace the innerHTML with the response
            // value (it is just too long to display here, but nothing special).
        }
    });
}
Question:
While the page is performing the ~5-12 GET requests, it is completely blocked and I cannot load another page by clicking a link. So what is basically wrong here? Could the behavior also be attributable to the capacity of the web server, i.e. those 12 GET requests causing heavy load? I've also noticed that if I use jQuery's $(document).ready function, the icons are rendered only after $.ajax finishes - this results in squares instead of icons.
Edit: I thought that maybe the MySQL calls made by the API are blocking the server?
Edit2: async is true by default (http://api.jquery.com/jquery.ajax/#jQuery-ajax-settings)

You can add async: false to the AJAX call.
function requestData(id, time) {
    $.ajax({
        url: "/api/v1/data/point/" + id,
        type: 'GET',
        async: false,
        data: {
            from: time[0].format('YYYY-MM-DD'),
            to: time[1].format('YYYY-MM-DD'),
            sum: true
        },
        dataType: 'json',
        success: function (response) {
            // find elements by id and replace the innerHTML with the response
            // value (it is just too long to display here, but nothing special).
        }
    });
}
And if that doesn't work, replace it with true.

Below are my ideas on how you can improve the situation:
Have you tracked the timeline of the page (in the Timeline tab of the Chrome dev tools)? It will show the peaks and troughs of page performance. The reason for the blocking can be something other than Ajax.
I am sure you know that the browser can only run a limited number of requests to the same host at the same time, no matter whether they are sync or async. In your case it's 6. Can you cache the requests so that you don't make the real request every time? One simple way is to memoize the pending requests, as in the sketch below.
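A minimal sketch of such a cache, assuming the same endpoint and requestData signature as in the question (the pending map and its key format are illustrative choices, not part of the original code): store the jqXHR promise per id/date-range pair and reuse it for repeated calls.
// Illustrative request cache: one stored jqXHR per id/date-range pair.
// "pending" and the key format are assumptions for this sketch.
var pending = {};

function requestData(id, time) {
    var from = time[0].format('YYYY-MM-DD');
    var to = time[1].format('YYYY-MM-DD');
    var key = id + '|' + from + '|' + to;
    if (!pending[key]) {
        // $.ajax returns a jqXHR promise, so it can be cached and reused
        pending[key] = $.ajax({
            url: '/api/v1/data/point/' + id,
            type: 'GET',
            data: { from: from, to: to, sum: true },
            dataType: 'json'
        });
    }
    pending[key].done(function (response) {
        // update the matching [data-id] element here,
        // as in the original success handler
    });
}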

Related

DurandalJS and jQuery not adding to Select

I am creating a SPA in DurandalJS with MVC, and have jQuery loading a <select> with options loaded from a database. Setting breakpoints, I was able to follow the stack all the way down the chain and verify that all of my ajax loading and jQuery calls were occurring, but when I go to check the select box it is empty, including the inner html of the tag and the dropdown elements themselves.
What's weird, though, is that the items will load if I navigate to another page and then come back to the original page (since this is an ajax-ified single page application, it doesn't actually navigate in the traditional sense).
Why is this happening and how can I fix it?
Code to load the data:
function addProjectSelectorOptions(projects) {
    $('#project-picker').empty();
    for (var i = 0; i < projects.length; i++) {
        console.log(projects[i]);
        $('#project-picker').append(new Option(projects[i]["Name"], projects[i]["Id"]));
    }
}

function loadData() {
    $.ajax({
        url: '/ClubhouseData/GetProjects',
        type: 'GET',
        cache: false,
        success: function (results) {
            console.log(results);
            addProjectSelectorOptions(results);
        }
    });
}

loadData();
Solved:
I wrapped loadData() in a jQuery document-ready call and it fixed the problem (see the sketch below).
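A minimal sketch of that fix, assuming the same loadData function as above: defer the call until the DOM is ready, so that #project-picker exists before options are appended to it.
// Run loadData() only once the DOM is ready, so #project-picker
// exists by the time addProjectSelectorOptions() appends to it.
$(document).ready(function () {
    loadData();
});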

JQuery - Looping a .load() inside a 'for' statement

I'm not sure if this will actually be possible, since load() is an asynchronous method, but I need some way to load several little bits of pages, one at a time, get some data included in them via JavaScript, and then send that over via Ajax so I can put it into a database I made.
Basically I get this from my page, where all the links I'll be having to iterate through are located:
var digiList = $('.2u');
var link;

for (var i = 0; i < digiList.length; i++) {
    link = "http://www.digimon-heroes.com" + digiList.eq(i).find('map').children().attr('href');
So far so good.
Now, I'm going to have to load each link (only a specific div of the full page, not the whole thing) into a div I have somewhere around my page, so that I can get some data via jQuery:
    var contentURI = link + ' div.row:nth-child(2)';
    $('#single').load('grabber.php?url=' + contentURI, function() {
        ///////////// And I do a bunch of jQuery stuff here, and save stuff into an object
        ///////////// Aaaand then I call up an ajax request.
        $.ajax({
            url: 'insertDigi.php',
            type: 'POST',
            data: { digimon: JSON.stringify(digimon) },
            dataType: 'json',
            success: function(msg) {
                console.log(msg);
            }
            //////// This calls up a script that handles everything and makes an insert into my database.
        }); // END ajax
    }); // END load callback function
} // END 'for' statement

alert('Inserted!');
Naturally, as would be expected, the loading takes too long, and the rest of the for statement just keeps going, not caring about letting the load finish its business, since the load is asynchronous. The alert('Inserted!') is called before I even get the chance to load the very first page. This, in turn, means the content is only loaded into my div before I can even process its information and send it over to my script.
So my question is: Is there some creative way to do this such that I could iterate through multiple links, load them, do my business with them, and be done with it? And if not, is there a synchronous alternative to load that could produce roughly the same effect? I know it would probably block up my page completely, but I'd be fine with that, since the page does not require any input from me.
Hopefully I explained everything with the necessary detail, and hopefully you guys can help me out with this. Thanks!
You probably want a recursive function that waits for one iteration to finish before going on to the next one:
(function recursive(i) {
    var digiList = $('.2u');
    var link = digiList.eq(i).find('map').children().attr('href') + ' div.row:nth-child(2)';
    $.ajax({
        url: 'grabber.php',
        data: {
            url: link
        }
    }).done(function(data) {
        // do stuff with "data"
        $.ajax({
            url: 'insertDigi.php',
            type: 'POST',
            data: {
                digimon: digimon
            },
            dataType: 'json'
        }).done(function(msg) {
            console.log(msg);
            if (i + 1 < digiList.length) { // stop after the last element
                recursive(i + 1); // do the next one ... when this one is done
            }
        });
    });
})(0);
Just in case you want them to run together, you can use a closure to preserve each number in the loop:
for (var i = 0; i < digiList.length; i++) {
    (function(num) { // num here as the argument is actually i
        var link = "http://www.digimon-heroes.com" + digiList.eq(num).find('map').children().attr('href');
        var contentURI = link + ' div.row:nth-child(2)';
        $('#single').load('grabber.php?url=' + contentURI, function() {
            ///////////// And I do a bunch of jQuery stuff here, and save stuff into an object
            ///////////// Aaaand then I call up an ajax request.
            $.ajax({
                url: 'insertDigi.php',
                type: 'POST',
                data: {
                    digimon: JSON.stringify(digimon)
                },
                dataType: 'json',
                success: function(msg) {
                    console.log(msg);
                }
                //////// This calls up a script that handles everything and makes an insert into my database.
            }); // END ajax
        }); // END load callback function
    })(i); // <-- pass in the number from the loop
}
You can always use synchronous ajax, but there's no good reason for it.
If you know the number of documents you need to download (you can count them, or just hardcode it if it's constant), you could run a callback function on each success, and once everything is done, proceed with the logic that needs all the documents.
To make it even better, you could trigger an event (on document or any other object) when everything is downloaded (e.g. "downloads_done") and listen for this event to do whatever you need to do; see the sketch after this answer.
But all of the above is for the case where you need to do something once everything is done. However, I'm not sure I understood your question correctly (just read it again).
If you want to download something -> do something with the data -> download another thing -> do something again...
Then you can also use a JavaScript waterfall (a library, or build your own) to make it simple and easy to use. With a waterfall you define what should happen when each async function is done, one by one.
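For illustration, a minimal sketch of the counter-plus-event idea using the question's own selectors (the event name downloads_done comes from this answer; the rest mirrors the question's code and is only an assumption about how it fits together):
// Fire a custom "downloads_done" event once every request has completed.
var digiList = $('.2u');
var remaining = digiList.length;

$(document).on('downloads_done', function () {
    alert('Inserted!'); // safe here: every request has finished
});

digiList.each(function (i, el) {
    var link = "http://www.digimon-heroes.com" + $(el).find('map').children().attr('href');
    $.ajax({
        url: 'grabber.php',
        data: { url: link + ' div.row:nth-child(2)' }
    }).always(function () {
        // .always() counts both successes and failures
        if (--remaining === 0) {
            $(document).trigger('downloads_done');
        }
    });
});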

how to use for-loop to do form submission in javascript?

I have written a javascript in the console of chrome to download many files from a web. The javascript is as follows:
for (var i = 0; i < 100; i++) {
    $("input[name=id]").val(i);
    $("form").submit();
}
I expected that I could download all the files whose ids run from 0 to 99; however, all I am able to download is the last file (i.e. the file with id = 99). Why does this happen?
I think you are missing the responses. When you submit the first form, the browser starts the connection, but just after that you ask for the second form, which cancels the first. You need to issue those form requests with AJAX, storing each request together with its response in some variable.
Your problem is like clicking too many links on a page before letting the first one load: the page that finally loads is the last one clicked.
TRY THIS SOLUTION:
You need to keep some time between calls to handle the response and start the download, so write a function like this:
function submitFormAndWait(i) {
    // this is your code to submit
    $("input[name=id]:hidden").val(i);
    $("form").submit();
    // wait some time and call the next form (ids run from 0 to 99)
    if (i < 99) {
        setTimeout(function() { submitFormAndWait(i + 1); }, 3000);
    }
}
POSSIBLE PROBLEM:
When you submit a form in the browser, it will load the response as the current page, and the script will start from zero each time.
The best way to do this is using AJAX and FormData.
Assuming you're using jQuery on your site, some pseudocode for this:
var url = '/some/ajax/url';

for (var i = 0; i < 100; ++i) {
    var formData = new FormData();
    formData.append('some-number-field', i);
    // Send data using Ajax POST
    $.ajax({
        url: url,
        type: 'POST',
        data: formData,
        processData: false, // required when passing a FormData object to jQuery
        contentType: false, // let the browser set the multipart boundary
        success: function(response) {
            // Do something on success
        },
        error: function(xhr, status, errorThrown) {
            // Do something on error
        }
    });
}

Will reinitializing a JavaScript object at global scope cause a memory leak?

I have links in a jQuery DataTable that use jQuery UI's tooltip feature. Each link has a tooltip that is populated by an Ajax call. I would like to limit the number of Ajax calls to as few as possible. The DataTable uses server-side processing, and the results are paginated, so there will never be more than ten links on the page at any one time.
The data that is returned by the Ajax call will never change and thus can be safely cached. In my testing, I have seen that the browser does cache the result of each Ajax call, so that it only makes one call per link, and then uses the cache thereafter. My concern is that some user might have their browser configured in such a way that it doesn't use the cache for some reason, and they will be firing off one Ajax call after another, every time they mouse over a link.
Here is the JavaScript for the tooltip:
$('.jobId').tooltip({
    content: function(callback) {
        var jobId = $(this).text();
        $.ajax({
            url: 'myUrl',
            data: { jobId: jobId },
            dataType: 'json',
            success: function(data) {
                var html = formatResults(data);
                callback(html);
            },
            error: function() {
                callback('An error has occurred.');
            }
        });
    }
});
I considered storing the result of each Ajax call in a JavaScript object declared at global scope, and then checking that before making the Ajax call, but I have the vague sense that this might cause a memory leak somehow.
var gJobs = {};

$('.jobId').tooltip({
    content: function(callback) {
        var jobId = $(this).text();
        if (gJobs[jobId]) {
            callback(gJobs[jobId]);
        } else {
            $.ajax({
                url: 'myUrl',
                data: { jobId: jobId },
                dataType: 'json',
                success: function(data) {
                    var html = formatResults(data);
                    gJobs[jobId] = html;
                    callback(html);
                },
                error: function() {
                    callback('An error has occurred.');
                }
            });
        }
    }
});
I am also concerned that if the table has a large number of rows, the gJobs object could end up using a lot of memory. To prevent the gJobs object from growing indefinitely, every time the user goes to the next or previous page of results in the DataTable, I use the fnDrawCallback function to reinitialize gJobs:
$('#jobsTable').dataTable({
    ...
    "fnDrawCallback": function() {
        gJobs = {};
    }
});
I should mention that since the data returned by each Ajax call doesn't change, I could also just store the data in the JSP as static text, and populate the tooltips that way instead of using Ajax. However, I have to make a separate web service call to get the data for each link, and rather than make ten web service calls every time the user pages forward or back, I would rather load the data on demand via Ajax.
Is there anything wrong with this approach? Is there any way this can cause a memory leak? Should I explicitly delete all the properties of gJobs before reinitializing it? Thanks for your help.

How to wait for an async call from a callback using jQuery?

I'm using select2, the jQuery-based replacement for combo boxes, and I have to define a callback to process the data I receive from a JSON REST web service.
The problem is that, in the same callback, I have to issue another GET request to get the total number of matching records, so that select2 can decide whether it has to load more results (it has an infinite scroll feature).
The code is something like this:
$("#country").select2({
ajax: { // instead of writing the function to execute the request we use Select2's convenient helper
url: 'http://localhost:9000/api/countries',
dataType: 'json',
data: function(term, page) {
return {
filter: term,
page: page,
len: 10
};
},
results: function(data, page) {
return {
results: data, more: ????
};
}
}
});
The problem is that I don't know how to issue an async request (I'm issuing a cross-domain request, and the docs say synchronous operation, i.e. async: false, is not supported in that case) and wait for it to finish before returning from the results callback.
The example from select2 page is like this:
results: function (data, page) {
    var more = (page * 10) < data.total; // whether or not there are more results available
    // notice we return the value of more so Select2 knows if more results can be loaded
    return { results: data.movies, more: more };
}
The problem is that my web service returns the total number of records from a different endpoint, so I have to make another request, like this: http://localhost:9000/api/countries?filter=term
Any idea?
You can't wait for an async callback in JavaScript. You have to restructure your code to do all future work based on the async response from the actual callback.
If you need to make multiple consecutive ajax calls, then you issue the first one, and in the success handler or response handler for the first ajax call you issue the second ajax call, and in the response handler for the second one you carry out whatever you want to do with the data.
I see that you're using the .select2() framework. In that framework, the results callback is where the ajax call returns. It is in that function that you would issue the second ajax call using a normal jQuery ajax call, and in the success handler of that second call you would carry out whatever you're trying to do with the eventual data you got back. You won't be able to use the normal return value of the results callback, because you won't have your final data yet at the point where you need to return. I think this is just a limitation of .select2(), in that it only supports a single ajax call. It just means you can't use a little bit of the built-in behavior and have to apply the result yourself with your own code, but it doesn't mean you have to throw out .select2() for everything else you were using it for.
It looks like you might want to just hook the change event directly and not use their built-in ajax stuff, since it doesn't look like it really provides you with much if you need two serialized ajax calls. A minimal sketch of such chaining follows.
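To illustrate the chaining described above, a minimal sketch using the question's endpoints (fetchCountries is a hypothetical helper name; the /count endpoint matches the asker's eventual solution below):
// Sketch: serialize two ajax calls by issuing the second from the
// success handler of the first. fetchCountries is a hypothetical helper.
function fetchCountries(term, page, len, callback) {
    $.getJSON('http://localhost:9000/api/countries', { filter: term, page: page, len: len })
        .done(function (data) {
            // second, dependent request: total count for the same filter
            $.getJSON('http://localhost:9000/api/countries/count', { filter: term })
                .done(function (total) {
                    // only now is everything available to the caller
                    callback({ results: data, more: (page * len) < parseInt(total, 10) });
                });
        });
}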
I studied the source code of select2, and finally came up with this solution:
var ajax = {
    url: 'http://localhost:9000/api/countries',
    len: 3
};

$("#country").select2({
    query: function(options) {
        var data = {
            filter: options.term,
            page: options.page,
            len: ajax.len
        };
        $.ajax({
            url: ajax.url,
            data: data,
            dataType: 'json',
            type: 'GET',
            success: function(data) {
                $.ajax({
                    url: ajax.url + '/count',
                    data: { filter: options.term },
                    dataType: 'json',
                    success: function(resp) {
                        var total = parseInt(resp, 10);
                        var more = (options.page * ajax.len) < total;
                        options.callback({ results: data, more: more });
                    }
                });
            }
        });
    }
});
As you can see, when the first fetch (ajax.url) completes, I issue another request (ajax.url + '/count'), and only when this second request completes do I call options.callback, effectively serializing both ajax calls...
In fact the ajax function from select2 has more functionality, such as throttling and dropping out-of-order responses; I ported those too, but I left them out of this answer in order not to complicate the example...
In addition to jfriend00's answer (which is excellent, BTW), I found the following workaround, which is basically to issue the request synchronously; in spite of the jQuery docs it seemed to work (at least with Chromium 18.0 and jQuery 1.8.0).
I'm just posting it in case anybody finds it useful...
var config = {
    url: 'http://localhost:9000/api/countries',
    len: 20,
    term: ''
};

$("#country").select2({
    ajax: {
        url: config.url,
        dataType: 'json',
        data: function(term, page) {
            config.term = term;
            return {
                filter: term,
                page: page,
                len: config.len
            };
        },
        results: function(data, page) { // parse the results into the format expected by Select2
            var more = false;
            $.ajax({
                url: config.url + '/count',
                data: { filter: config.term },
                cache: false,
                async: false,
                success: function(resp) {
                    var total = parseInt(resp, 10);
                    more = (page * config.len) < total;
                }
            });
            return {
                results: data,
                more: more
            };
        }
    }
});
