I am developing a web application.
At some point, I have an input field where the user can enter a number. Right next to it, I would like to show the double of that number as an output. Of course, that can be done easily client-side using JavaScript in the following way (jQuery syntax):
$(document).on("change", "#input", function () {
    var x = $("#input").val();
    var y = 2 * x;
    $("#output").val(y);
});
However, for various reasons linked with the real application, I need to perform the computation server-side and show the result client-side after calling my API. My naive approach was to implement it in the following way (jQuery syntax):
$(document).on("change", "#input", function () {
    $("#output").val("Currently computing result");
    $.ajax({ ... }).done(function (result) {
        $("#output").val(result);
    });
});
Each time the user updates the field, an event gets triggered and an asynchronous AJAX query is sent. Each time a result comes back, the output field is updated. However, if the network is slow, or the server-side computations are multi-threaded and take random time, there is no guarantee that the results come back in the same order as the successive changes made by the user. In the end, it can happen that an inconsistent result is shown.
I expect that this is a typical situation. How can I solve it? Should I be using a front-end framework like Vue.js (despite the fact that I still want to host the computation server-side)? Do front-end frameworks generally handle this difficulty? Is there a simple way to handle it using plain Javascript? Thanks!
You can show a loader to prevent further user actions (AJAX calls) until the response to the current AJAX call has come back:
$(document).on("change", "#input", function () {
    // Show loader
    $("#output").val("Currently computing result");
    $.ajax({ ... }).done(function (result) {
        $("#output").val(result);
        // Hide loader
    });
});
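Alternatively, if you do not want to block the user, you can let the requests overlap and simply ignore stale responses. A minimal sketch using a request counter (the variable names are illustrative):
var latestRequestId = 0; // illustrative counter, incremented for every change

$(document).on("change", "#input", function () {
    var requestId = ++latestRequestId;       // tag this particular request
    $("#output").val("Currently computing result");
    $.ajax({ /* ... */ }).done(function (result) {
        if (requestId === latestRequestId) { // only the newest request may update the output
            $("#output").val(result);
        }
    });
});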
Related
I have a web application that has to perform the following task. For a chosen date range, it makes a GET request to a web service for each date in the range; this can take a while, and because I want to visualize the data later, all the calls are synchronous (the result of each request gets stored into an array). This retrieval takes a while (several seconds), which means the main thread "freezes."
What would be a good way to avoid this? (E.g. doing the retrieval in a separate thread and getting notified once it's done.)
Consider using promises.
They enable you to perform non-blocking calls to the API, which is basically what you are asking for.
EDIT: You can use $.when() specifically to be notified when all operations are done.
You should make your GET requests async and then visualize when all the requests have completed.
var get1 = $.get( /* ... */ );
var get2 = $.get( /* ... */ );
var get3 = $.get( /* ... */ );
$.when(get1, get2, get3).done(function (res1, res2, res3) {
    // do something with the responses
    visualize();
});
In fact, there is a simple solution. Let's implement a function which needs to be executed when all responses have arrived:
function onFinished(responses) {
    // Do something
}
Now, let's suppose you have a function which returns the dates as an array:
function getDates(range) {
    // Do something
}
Also, we need a getURL function, like this:
function getURL(date) {
    // Do something
}
Finally, let's suppose you have a variable called dateRange which has the range you will use as input in getDates. So we can do this:
var requestDates = getDates(dateRange);
var requestsPending = requestDates.length;
var responses = [];

requestDates.forEach(function (requestDate) {
    $.ajax({
        url: getURL(requestDate),
        method: "GET"
        // You might pass data as well here if needed, in the form of
        // data: yourobject,
    }).done(function (data, textStatus, jqXHR) {
        // Handle the response, parse it and store the result using responses.push()
    }).fail(function (jqXHR, textStatus, errorThrown) {
        // Handle failed requests
    }).always(function () {
        if (--requestsPending === 0) {
            onFinished(responses);
        }
    });
});
This will send an AJAX request for each date you need and wait for the responses asynchronously, so you effectively wait for the longest pending time rather than the sum of the pending times, which is a great optimization. It cannot be solved in a multithreaded fashion, as JavaScript is single-threaded, so you need to wait asynchronously for the answers; the requests won't wait for each other on the server. If you own the server as well, then you do not need to send a request for each date: implement a server-side API function that handles date ranges, so the client side sends a single request and waits for the answer.
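For example, if you control the server, the single-request variant could look roughly like this; the /api/range endpoint, the from/to parameter names and the assumption that dateRange exposes start and end are all illustrative, not an existing API:
// Sketch only: one request for the whole range instead of one request per date
$.ajax({
    url: "/api/range",                                  // assumed endpoint
    method: "GET",
    data: { from: dateRange.start, to: dateRange.end }  // assumed parameter names
}).done(function (responses) {
    onFinished(responses);  // reuse the handler defined above
}).fail(function (jqXHR, textStatus, errorThrown) {
    // Handle the failed request
});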
I am currently working on web-based time tracking software. I'm developing in Grails, but this question is solely related to JavaScript and asynchronous requests.
The time tracking tool shall enable users to choose a day for the current month, create one or multiple activities for each day and save the entire day. Each activity must be assigned to a project and a contract.
Upon choosing "save", the partial day is saved to the database, the hours are calculated and a table is updated at the bottom of the page, showing an overview of the user's worked hours per month.
Now to my issue: there may be a lot of AJAX requests. Patient users might click the "create activity" button just once and wait until it is created. Others, however, might just keep clicking until something happens.
The main issue here is updating the view, although I also noticed some failed calls because of concurrent database transactions (especially when choosing "save" and "delete" sequentially). Any feedback on that issue -- requests not "waiting" for the same row to be ready again -- will be appreciated as well, yet this is not my question.
I have an updateTemplate(data, day) function, which is invoked onSuccess of respective ajax calls in either of my functions saveRecord(), deleteRecord(), pasteRecords(), makeEditable() (undo save). Here is the example AJAX call in jquery:
$.ajax({
    type: "POST",
    url: "${g.createLink(controller:"controller", action:"action")}",
    data: requestJson,
    contentType: "application/json; charset=utf-8",
    async: true,
    success: function (data, textstatus) {
        updateTemplate(data["template"], tag);
        updateTable(data["table"]);
    }
});
In the controller action, a JSON object is rendered as a response, containing the keys template and table. Each key has a template rendered as a String assigned to it, using g.render.
Now, what happens when I click on create repeatedly in very short intervals: due to the asynchronous calls, some create (or other) actions are executed concurrently. The issue is that updateTemplate just renders data from the response; the data to render is collected in the create controller action, but the "last" request's action only finds the objects it created itself. I think this is because the create actions run concurrently.
I figure I'm either overcomplicating something or doing something essentially wrong when working with a page that refreshes dynamically. The only thing I found that helps is synchronous calls, which work, but the user experience was awful. What options do I have to make this work? Is this really it, or am I just looking at the wrong approach? How can I make this all more robust, so that impatient users are not able to break my code?
EDIT:
I know that I could block buttons or keyboard shortcuts, use synchronous calls or similar things to avoid those issues. However, I want to know if it is possible to solve it with multiple AJAX requests being submitted. So the user should be able to keep adding new activities, although they won't appear immediately. There is a spinner for feedback anyway. I just want to somehow make sure that before the "last" AJAX request gets fired, the database is up to date so that the controller action will respond with the up-to-date gsp template with the right objects.
With the help of this Stack Overflow answer, I found a way to ensure that the AJAX call -- in the JavaScript function executed last -- always responds with an up-to-date model. Basically, I put the JavaScript functions containing AJAX calls into a waiting queue if a "critical" AJAX request has been initiated before but not yet completed.
For that I define the function doCallAjaxBusyAwareFunction(callable), which checks whether the global variable Global.busy is true prior to executing the callable function. If it is true, the check is retried after a delay until Global.busy is false; only then is the function executed -- collecting the data from the DOM -- and the AJAX request fired.
Definition of the global variable:
var Global = {
    busy: false
    // additional global-scope variables
};
Definition of the function doCallAjaxBusyAwareFunction:
function doCallAjaxBusyAwareFunction(callable) {
    if (Global.busy === true) {
        console.log("Global.busy = " + Global.busy + ". Timeout set! Try again in 100ms!!");
        setTimeout(function () { doCallAjaxBusyAwareFunction(callable); }, 100);
    } else {
        console.log("Global.busy = " + Global.busy + ". Call function!!");
        callable();
    }
}
To flag a function containing ajax as critical, I let it set Global.busy = true at the very start and Global.busy = false on AJAX complete. Example call:
function xyz() {
    Global.busy = true;
    // collect ajax request parameters from DOM
    $.ajax({
        // desired ajax settings
        complete: function (data, status) { Global.busy = false; }
    });
}
Since Global.busy is set to true at the very beginning, the DOM cannot be manipulated (e.g. by deletes) while the function xyz collects DOM data. Even after xyz has returned, Global.busy remains true until the AJAX call completes.
Fire an ajax call from a "busy-aware" function:
doCallAjaxBusyAwareFunction(function () {
    // collect DOM data
    $.ajax({ /* AJAX settings */ });
});
....or fire an ajax call from a "busy-aware" function that is also marked critical itself (basically what I mainly use it for):
doCallAjaxBusyAwareFunction(function () {
    Global.busy = true;
    // collect DOM data
    $.ajax({
        // AJAX settings
        complete: function (data, status) { Global.busy = false; }
    });
});
Feedback is welcome and other options too, especially if this approach is bad practice. I really hope somebody finds this post and evaluates it, since I don't know if it should be done like that at all. I will leave this question unanswered for now.
When communicating with a server in JavaScript in my single-page browser application, I would like to provide a callback function that is always called after the server replies, regardless of whether the result was a success or some kind of error.
Two cases where I need this:
1) I want to disable a "save" button while waiting for the server's response, and enable it again after the server responds with an error or a success.
2) I have a polling mechanism where I want to prevent stacking of calls when the server for some reason is being slow to respond - I want to wait for one poll call to finish before making the next.
One solution I have right now involves making sure that two functions (success and error) get passed along as options in a long method chain, which feels like a fragile and cumbersome solution (pseudo-ish code):
function doCall() {
    framework123.callit({ success: myCallback, error: myCallback });
}

framework123.callit = function (options) {
    options = options || {};
    if (options.error) {
        var oldError = options.error;
        options.error = function (errorStuff) {
            // callit error stuff
            oldError(errorStuff);
        };
    } else {
        options.error = function () {
            // callit error stuff
        };
    }
    generalCallFunction(options);
};

function generalCallFunction(options) {
    // ... check success and error once again to get general success and error stuff in there,
    // plus add more options
    ajax(blah, blah, options);
}
I also have a backbone solution where I listen to the sync event plus an error callback, in similar ways as above.
I'm always scared that error or success functions get lost on the way, and the whole thing is hard to follow.
Any framework or pattern for making this stuff as easy as possible? Is it a weird thing to have general things that should always happen whether the result was an error or a success?
You can use jQuery.ajax({ /* details here... */ }).always(callback);
Or, in Backbone
// logic to create model here
model.fetch().always(callback);
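For the polling case (2), a minimal sketch that schedules the next poll only after the previous one has settled, whether it succeeded or failed (the URL and interval are placeholders):
var POLL_INTERVAL = 5000; // placeholder interval in milliseconds

function poll() {
    jQuery.ajax({ url: "/api/poll" })      // placeholder URL
        .always(function () {
            // Schedule the next poll only once this one has finished,
            // so slow responses can never cause calls to stack up.
            setTimeout(poll, POLL_INTERVAL);
        });
}

poll();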
What is a good way of saving form data without a submit button?
I have one idea; exemplary source code is below.
var delay = 1000,
    timeId,
    ajax,
    // fw is some framework
    form = fw.get('myform');

form.getFields().on('change', changeEventHandler);

function changeEventHandler() {
    clearTimeout(timeId);
    timeId = setTimeout(ajaxRequest, delay);
}

function ajaxRequest() {
    // What to do with the old ajax request? Abort it?
    ajax = fw.ajax({
        url: 'ololo',
        params: {
            data: form.getValues()
        }
    });
}
What should I do with the old AJAX request? Abort it?
Does anybody have other ideas?
I had a similar problem when I designed an interactive form without a save button.
First of all, it's not a good idea to save the data on every change. I used the blur event instead, so when the input loses focus, I check whether the value has changed (i.e. it was not just a focus and blur on the input); if it has changed, I disable the input and send an AJAX request. When the request returns, I enable the input once again (possibly displaying an error if the AJAX failed, etc., depending on your needs).
It's the easiest way to do an interactive form. This avoids the headache of multiple requests trying to modify the same value on the server side and the headache of monitoring all AJAX requests.
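A minimal sketch of that approach, assuming jQuery (the field selector and the /save endpoint are placeholders):
$("#myfield").on("blur", function () {              // placeholder selector
    var $field = $(this);
    var value = $field.val();
    if (value === $field.data("lastSaved")) {
        return;                                     // nothing changed, no request needed
    }
    $field.prop("disabled", true);                  // block edits while saving
    $.ajax({
        url: "/save",                               // placeholder endpoint
        method: "POST",
        data: { value: value }
    }).done(function () {
        $field.data("lastSaved", value);            // remember the last saved value
    }).fail(function () {
        // display an error here, depending on your needs
    }).always(function () {
        $field.prop("disabled", false);             // re-enable the input either way
    });
});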
I have the following kludgey code;
HTML
<input type="search" id="search_box" />
<div id="search_results"></div>
JS
var search_timeout,
    search_xhr;

$("#search_box").bind("textchange", function () {
    clearTimeout(search_timeout);
    search_xhr.abort();
    search_term = $(this).val();
    search_results = $("#search_results");
    if (search_term == "") {
        if (search_results.is(":visible"))
            search_results.stop().hide("blind", 200);
    } else {
        if (search_results.is(":hidden"))
            search_results.stop().show("blind", 200);
    }
    search_timeout = setTimeout(function () {
        search_xhr = $.post("search.php", {
            q: search_term
        }, function (data) {
            search_results.html(data);
        });
    }, 100);
});
(uses the textchange plugin by Zurb)
The problem I had with my original, simpler code was that it was horribly unresponsive. Results would appear seconds later, especially when typing slowly or when Backspace was used, etc.
I made all this, and the situation isn't much better. Requests pile up.
My original intention was to use .abort() to cancel whatever previous request is still running when the textchange event fires again (as per 446594). This doesn't work, as I get repeated errors like this in the console:
Uncaught TypeError: Cannot call method 'abort' of undefined
How can I make .abort() work in my case?
Furthermore, is this approach the best way to fetch 'realtime' search results? Much like Facebook's search bar, which gives results as the user types, and seems to be very quick on its feet.
You'd do well to put a small delay in before sending the request. If the user hits another key within 100ms (or some other time of your choosing) of the last one, there is no need to send the request in the first place.
When actually sending the request, you should check whether one is already active. If it is, cancel it.
e.g.
if (search_xhr) {
    search_xhr.abort();
}
Don't forget to reset that variable after a successful retrieval, e.g. search_xhr = null; (note that delete does not work on variables declared with var).
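Putting the delay, the abort check and the reset together, a sketch of the handler (keeping your variable names) could look like this:
var search_timeout,
    search_xhr;

$("#search_box").bind("textchange", function () {
    var search_term = $(this).val();
    clearTimeout(search_timeout);                 // restart the delay on every keystroke
    search_timeout = setTimeout(function () {
        if (search_xhr) {
            search_xhr.abort();                   // cancel the previous request, if any
        }
        search_xhr = $.post("search.php", { q: search_term }, function (data) {
            $("#search_results").html(data);
            search_xhr = null;                    // reset after a successful retrieval
        });
    }, 100);
});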