jquery ajax call synchronous for databases - javascript

I'm creating a web site using ASP.NET with a large client side that handles many of the site's events. On the client side, via AJAX, I update, delete and add to the database (in that order!).
My question is, because the order of the tasks is very important (first update the database, second delete from it, third add to it):
Should I make the AJAX calls synchronous by changing "async" to false?
Or should I leave it as true, the default? Which approach should I take?

You should do this by sending only one AJAX call for all the operations you need, and making that call async: false.

In this case it would be better to instead use async: true and chain your requests so that they happen one after the other.
$.ajax({
    type: 'put',
    url: '/model/7256185',
    data: { name: 'Lucy' }
}).then(function () {
    return $.ajax({
        type: 'delete',
        url: '/model/7256186'
    });
}).then(function () {
    return $.post('/model', { name: 'bob' });
}).then(function (result) {
    console.log("All Done!");
    console.log(result);
}, function () {
    console.log('An error has occurred!');
    console.log(arguments);
});
This ensures that the requests happen in order, and it doesn't cause your page to appear broken during the requests (which is what happens with a synchronous request).
It also allows you to use a loading gif if you so wish; with synchronous requests, loading gifs won't spin.
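For reference, the same sequence can be written with async/await; this is a minimal sketch assuming jQuery 3+ (whose $.ajax return values are Promises/A+ compatible), and the function name syncModels is just illustrative:
async function syncModels() {
    try {
        // Each await suspends until the request resolves, preserving order
        await $.ajax({ type: 'put', url: '/model/7256185', data: { name: 'Lucy' } });
        await $.ajax({ type: 'delete', url: '/model/7256186' });
        var result = await $.post('/model', { name: 'bob' });
        console.log('All Done!', result);
    } catch (err) {
        console.log('An error has occurred!', err);
    }
}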

You DEFINITELY need to use transactions, either at the business layer or at the data layer of your backend.
I usually prefer performing brief tasks (each entity with its own repository) and keeping the connection open for as little time as possible in the data layer, then managing the transactional logic in the business layer using, for example, the TransactionScope class.
After this, it doesn't really matter whether you call the service/method in a sync or async fashion.

Related

Nested AJAX requests without a callback success function

After reading the thread jQuery Ajax Request inside Ajax Request, I need some explanation about the following situation.
I've just been working on the code of a former member of my development team and found many parts of the code where he makes asynchronous AJAX calls within other AJAX calls.
My question is: can anyone explain the advantages and disadvantages of this practice, and whether it is a good or bad practice?
Here is an example of code:
// first ajax (starting ajax call)
$.ajax({
    url: "script1.php",
    type: "POST",
    data: { paramFirstAjax: "first-ajax" },
    success: function (response) {
        alert(response);
    }
});
script1.php
<script>
// second ajax
$.ajax({
    url: "script2.php",
    type: "POST",
    data: { paramFirstAjax: "<?= $_POST['paramFirstAjax'] ?>", paramSecondAjax: "second-ajax" },
    success: function (response) {
        alert(response);
    }
});
</script>
<?php
// some operations on database server
echo "page2 operations with param: paramFirstAjax-> {$_POST['paramFirstAjax']}";
?>
script2.php
<?php
// some operations on database server
echo "page3 operations with params: firstParam -> {$_POST['paramFisrtAjax']} and secondParam-> {$_POST['paramSecondAjax']}";
?>
Something tells me it's not a good thing, because I think the correct way is to use the success callback function.
Like this: jquery nested ajax calls formatting
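For comparison, here is a sketch of that callback-based approach: the second request is fired from the success handler of the first, keeping both calls in client-side code instead of echoing a <script> tag from PHP (URLs and parameters mirror the example above):
// first ajax
$.ajax({
    url: "script1.php",
    type: "POST",
    data: { paramFirstAjax: "first-ajax" },
    success: function (response) {
        // second ajax, fired only after the first one succeeds
        $.ajax({
            url: "script2.php",
            type: "POST",
            data: { paramFirstAjax: "first-ajax", paramSecondAjax: "second-ajax" },
            success: function (response2) {
                alert(response2);
            }
        });
    }
});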
There are advantages and disadvantages here.
The advantages:
1) You make an async call, making the request a lot faster. You do not wait for the callback function, and thus do not wait for a response that might take time to return. You do everything in the background rather than 'straight forward'.
This is useful when you call multiple methods and do not want the delay of waiting for each callback.
2) You are able to fetch a far greater amount of data through your calls while minimizing how long the end client has to wait.
This is useful when you have a large amount of data to display and want to show it with minimal effort.
The disadvantages:
1) Error handling is a pain. If something fails within the inner calls, it takes time to detect where the failure occurred and in which method.
When waiting for the callback, you can detect right away where the error occurred, as it will return a response of success or error.
2) If there is a mismatch in the data, it is hard to track back and see where the missing part originated; you will have to go through each request one by one, using developer tools and/or Fiddler, since these are async calls in the end.
3) It is easy to put too much load on the client, since maintaining this kind of technique might result in calling multiple methods that work at the same time, creating overload on the client, and locks on threads or the DB when working with server-side code.
With this explained, you can now decide for yourself which type of method you would like to continue with in your code.

Optimistic Update using localStorage. Is this bad design?

Scenario:
So I have a JavaScript component that basically holds an instance of some global list of data. (In my actual application this resides in a Flux store, but I'm just referring to it as a global variable for simplicity's sake.)
It contains functions to ADD/DELETE data by making AJAX calls to a REST API.
Design:
Since I want the users to be able to immediately view the updated list, instead of having to wait until the Ajax success callback, I'm performing an "Optimistic Update."
That is, I'm updating the list before performing the actual AJAX call, while keeping the original copy of the list in localStorage in case the AJAX call fails.
(1) If the AJAX call succeeds, then update the list with the API response (which should basically be the same as the optimistically updated list)
(2) If the AJAX call fails, then undo the optimistic update by retrieving the original copy from the localStorage.
Here is my Implementation:
// Some global data list
var myData = ["data1", "data2", /* ... */];

function addData(dataToAdd) {
    // Store original in cache before optimistic update
    // (localStorage only stores strings, so serialize first)
    localStorage.setItem("MY_DATA", JSON.stringify(myData));
    // Do optimistic update
    myData = myData.concat(dataToAdd);
    $.ajax({
        url: REST_API_ADD,
        method: "POST",
        data: { data: dataToAdd },
        dataType: "json",
        success: function (response) {
            // API returns the updated list when successful
            myData = response;
        },
        error: function (xhr, status, err) {
            console.log(err);
            // Cancel optimistic update and retrieve old data from cache
            myData = JSON.parse(localStorage.getItem("MY_DATA"));
        }
    });
}
function deleteData(dataToDelete) {
    // Store original in cache before optimistic update
    localStorage.setItem("MY_DATA", JSON.stringify(myData));
    // Do optimistic update
    // I'm using Underscore.js here to delete data from the list
    myData = _.without(
        myData,
        _.findWhere(myData, { id: dataToDelete.id })
    );
    $.ajax({
        url: REST_API_DELETE,
        method: "DELETE",
        data: { data: dataToDelete },
        dataType: "json",
        success: function (response) {
            // API returns the updated list when successful
            myData = response;
        },
        error: function (xhr, status, err) {
            console.log(err);
            // Cancel optimistic update and retrieve old data from cache
            myData = JSON.parse(localStorage.getItem("MY_DATA"));
        }
    });
}
Is there anything wrong with this idea?
My primary concern is a race condition that might occur when the user performs ADD and DELETE operations almost simultaneously.
I've thought of possible scenarios, but it seems like concurrency is not a problem in my case, since the callback functions never modify the localStorage. All they do is "get."
Can anyone think of situations where my design might cause problems?
Is this a bad design overall? If so, then can you suggest an alternative approach?
Thanks
My primary concern is a race condition that might occur when the user performs ADD and DELETE operations almost simultaneously.
There should be no concern here since AJAX executed from the UI thread isn't asynchronous in terms of parallelism. Web browsers use an execution queue where each enqueued action is dequeued synchronously.
For example, it will never happen that you modify the DOM and perform an AJAX request concurrently. Actually, UI tasks have higher priority than AJAX, but in any case they will be executed one by one.
AFAIK, there's only one way you could run into concurrency problems: web workers. Unless you use them, you're absolutely safe.
I think this is not a bad design, but you can still have more control over users. At the very least you can block them from taking another action while an AJAX call is waiting for success.
I can think of one problem: if you are dealing with lots of data, writing to and reading from localStorage may lead to performance problems.
Besides these, there should be no problems at all with your approach.
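As a sketch of that kind of blocking, the triggering button can simply be disabled until the request completes (the selector and payload here are illustrative, not from the question):
$('#add-button').on('click', function () {
    // Disable the button while the request is in flight
    var $btn = $(this).prop('disabled', true);
    $.ajax({
        url: REST_API_ADD, // endpoint from the question above
        method: "POST",
        data: { data: "data3" }, // illustrative payload
        complete: function () {
            // Runs on both success and error
            $btn.prop('disabled', false);
        }
    });
});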

AJAX -- Multiple concurrent requests: Delay AJAX execution until certain calls have completed

I am currently working on web-based time tracking software. I'm developing in Grails, but this question is solely related to JavaScript and asynchronous requests.
The time tracking tool shall enable users to choose a day for the current month, create one or multiple activities for each day and save the entire day. Each activity must be assigned to a project and a contract.
Upon choosing "save", the partial day is saved to the database, the hours are calculated and a table is updated at the bottom of the page, showing an overview of the user's worked hours per month.
Now to my issue: there may be a lot of AJAX requests. Patient users might click the "create activity" button just once and wait until it is created. Others, however, might just keep clicking until something happens.
The main issue here is updating the view, although I also noticed some failed calls because of concurrent database transactions (especially when choosing "save" and "delete" sequentially). Any feedback on that issue -- requests not "waiting" for the same row to be ready again -- will be appreciated as well, yet this is not my question.
I have an updateTemplate(data, day) function, which is invoked on success of the respective AJAX calls in each of my functions saveRecord(), deleteRecord(), pasteRecords(), and makeEditable() (undo save). Here is an example AJAX call in jQuery:
$.ajax({
    type: "POST",
    url: "${g.createLink(controller:"controller", action:"action")}",
    data: requestJson,
    contentType: "application/json; charset=utf-8",
    async: true,
    success: function (data, textstatus) {
        updateTemplate(data["template"], tag);
        updateTable(data["table"]);
    }
});
In the controller action, a JSON object is rendered as a response, containing the keys template and table. Each key has a template rendered as a String assigned to it, using g.render.
Now, when I click on create repeatedly in very short intervals, some create (or other) actions are executed concurrently due to the asynchronous calls. The issue is that updateTemplate just renders data from the response; the data to render is collected in the create controller action. But the "last" request's action only finds the objects created by itself. I think this is because the create actions run concurrently.
I figure I'm either overcomplicating something or doing something essentially wrong in working with a page that refreshes dynamically. The only thing I found that helps is synchronous calls, which work, but the user experience was awful. What options do I have to make this work? Is this really it, or am I just looking at the wrong approach? How can I make this all more robust, so that impatient users are not able to break my code?
EDIT:
I know that I could block buttons or keyboard shortcuts, use synchronous calls or similar things to avoid those issues. However, I want to know if it is possible to solve it with multiple AJAX requests being submitted. So the user should be able to keep adding new activities, although they won't appear immediately. There is a spinner for feedback anyway. I just want to somehow make sure that before the "last" AJAX request gets fired, the database is up to date so that the controller action will respond with the up-to-date gsp template with the right objects.
With the help of this Stack Overflow answer, I found a way to ensure that the AJAX call -- in the JavaScript function executed last -- always responds with an up-to-date model. Basically, I put the JavaScript functions containing AJAX calls into a waiting queue if a "critical" AJAX request has been initiated before but not completed yet.
For that I define the function doCallAjaxBusyAwareFunction(callable), which checks whether the global variable Global.busy is true prior to executing the callable function. If it is true, the check is retried until Global.busy is false, at which point the function is finally executed -- collecting the data from the DOM -- and the AJAX request is fired.
Definition of the global variable:
var Global = {
    busy: false//,
    //additional Global scope variables
};
Definition of the function doCallAjaxBusyAwareFunction:
function doCallAjaxBusyAwareFunction(callable) {
    if (Global.busy == true) {
        console.log("Global.busy = " + Global.busy + ". Timeout set! Try again in 100ms!!");
        setTimeout(function () { doCallAjaxBusyAwareFunction(callable); }, 100);
    }
    else {
        console.log("Global.busy = " + Global.busy + ". Call function!!");
        callable();
    }
}
To flag a function containing ajax as critical, I let it set Global.busy = true at the very start and Global.busy = false on AJAX complete. Example call:
function xyz() {
    Global.busy = true;
    //collect ajax request parameters from DOM
    $.ajax({
        //desired ajax settings
        complete: function (data, status) { Global.busy = false; }
    });
}
Since Global.busy is set to true at the very beginning, the DOM cannot be manipulated -- e.g. by deletes -- while the function xyz collects DOM data. And after the function has executed, Global.busy remains true until the AJAX call completes.
Fire an ajax call from a "busy-aware" function:
doCallAjaxBusyAwareFunction(function () {
    //collect DOM data
    $.ajax({/*AJAX settings*/});
});
...or fire an AJAX call from a "busy-aware" function that is also marked critical itself (basically what I mainly use it for):
doCallAjaxBusyAwareFunction(function () {
    Global.busy = true;
    //collect DOM data
    $.ajax({
        //AJAX SETTINGS
        complete: function (data, status) { Global.busy = false; }
    });
});
Feedback is welcome, as are other options, especially if this approach is bad practice. I really hope somebody finds this post and evaluates it, since I don't know whether it should be done like this at all. I will leave this question unanswered for now.
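For comparison, one alternative to polling a busy flag with setTimeout is to serialize the calls through a promise chain; here is a minimal sketch assuming jQuery 1.8+ (the helper name enqueueAjax and the URLs are illustrative, not from the code above):
// Start the queue with an already-resolved promise
var queue = $.Deferred().resolve();

function enqueueAjax(settings) {
    // Append each new request to the tail of the chain,
    // so requests run strictly one after another
    queue = queue.then(function () {
        return $.ajax(settings);
    });
    return queue;
}

// Usage: these fire in order, never concurrently. Note that if one
// request fails, later success callbacks in the chain won't run.
enqueueAjax({ url: '/activity/save', type: 'POST', data: requestJson });
enqueueAjax({ url: '/activity/delete', type: 'POST' });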

Is it okay to use async: false if calling for small data?

I'm mainly asking this to know the best practice with regard to fetching small amounts of data from the server.
For one example, I'm using an AJAX (or SJAX, lol) call to check if there are new notifications for a user:
function checkNewNotifs() {
    $.ajax({
        url: '/Home/CheckNewNotifications',
        async: false,
        success: function (data) {
            if (data == 'True') {
                $('#alert-icon').css('color', '#FF4136');
            }
        }
    });
}
It gets the job done, but I'm wondering if there's a better way of achieving this?
To provide context, I'm mainly using ASP.NET MVC 4/5 at the moment.
Edit:
For the future AJAX-beginner readers like myself, the proper way of achieving something similar to this is through .done(). I haven't completely grasped the idea of AJAX yet, but a lot can be done through the following call:
function checkNewNotifs() {
    $.when(
        $.ajax({
            url: '/Home/CheckNewNotifications',
            success: function (data) {
                //do data manipulation and stuff.
            }
        })
    ).done(function () {
        //append to view
    });
}
tl;dr async: false = bad
The main reason why you should avoid synchronous requests is that they hang the UI.
Is there a reason why you use sync requests?
I think you should use async requests to check notifications, because if the request takes a bit more time than you expect, the user will not see any freezing of the UI.
This problem will be especially noticeable for users with a slow internet connection or a slow connection to your server (different country, or even a different continent).
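For instance, the notification check can be made asynchronous and simply polled on a timer; a minimal sketch (the 30-second interval is an arbitrary choice):
function checkNewNotifs() {
    $.ajax({ url: '/Home/CheckNewNotifications' }).done(function (data) {
        if (data === 'True') {
            $('#alert-icon').css('color', '#FF4136');
        }
    });
}
// Re-check periodically instead of blocking the UI
setInterval(checkNewNotifs, 30000);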

Make FB.api() calls synchronous

I am creating a fQuery API on top of the FB JavaScript SDK. Until now everything has worked fine, but I have gotten stuck on FB.api calls.
Actually, I am trying to load the Facebook user object, i.e. "/me", using the FB.api function.
function somefunc() {
    var r = fQuery.load(selector); //selector = "me"
    return r;
}

fQuery.load = function (selector) {
    fQuery.fn.response = "";
    return FB.api("/" + selector, function (response) {
        // we get response here.
    });
};
Is it possible to return the response, or can we make it a sync call? I have tried many ways to work around this but have not had any success.
Please provide suggestions.
If you think about it, you don't really want to make it synchronous. JavaScript is single threaded by nature; making something asynchronous synchronous would involve "freezing" the thread until the asynchronous call returns.
Even if you could do it, you don't want to, trust me.
Redesign your code to work with the asynchronous nature instead of fighting it. You will create better applications, have happier users, and become a better coder all at the same time.
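One common way to work with the asynchronous nature here is to have fQuery.load hand back a promise instead of trying to return the response directly; a minimal sketch using jQuery's $.Deferred (the error handling shown is an assumption, not part of the original code):
fQuery.load = function (selector) {
    var deferred = $.Deferred();
    FB.api("/" + selector, function (response) {
        if (!response || response.error) {
            deferred.reject(response); // Graph API reported a failure
        } else {
            deferred.resolve(response);
        }
    });
    return deferred.promise();
};

// Usage: consume the result when it arrives, not synchronously
fQuery.load("me").done(function (user) {
    console.log(user.name);
});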
As commented elsewhere, making a synchronous call is useful if you want to open a popup after a successful response, as browsers will often block popups that aren't the result of a direct user action.
You can do this by manually calling the Open Graph API with JavaScript (or jQuery as per the example below) rather than using the Facebook JS SDK.
e.g. to upload a photo via the Open Graph API and then prompt the user to add it as their profile picture using a popup, without the popup being blocked:
$.ajax({
    type: 'POST',
    url: 'https://graph.facebook.com/me/photos',
    async: false,
    data: {
        access_token: '[accessToken]', //received via response.authResponse.accessToken after login
        url: '[imageUrl]'
    },
    success: function (response) {
        if (response && !response.error) {
            window.open('http://www.facebook.com/photo.php?fbid=' + response.id + '&makeprofile=1');
        }
    }
});
You probably want to call FB.api in a for-loop iteration; if this is the case, the proper solution lies in the use of closures. Please read my answer here, given to another question.
My solution was to make a recursive call until I got what I needed from FB.api.
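A rough sketch of that recursive-retry idea (the retry count and delay are assumptions, not part of the original answer):
function loadMe(retriesLeft, callback) {
    FB.api("/me", function (response) {
        if (response && !response.error) {
            callback(response); // got what we need
        } else if (retriesLeft > 0) {
            // Try again after a short pause
            setTimeout(function () {
                loadMe(retriesLeft - 1, callback);
            }, 500);
        }
    });
}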
