I make a $.post call sending an array of objects (the selected values from a checkbox tree) to my API every time the mouse leaves the div where the checkbox tree is located. The issue is that the user, by moving the mouse around, could leave and enter the div hundreds of times, causing useless $.post submissions without getting new data, since the sent content hasn't changed.
Here is how the code looks right now:
$("div.sidebar-wrapper")
    .mouseleave(function () {
        postPareto();
    });

function postPareto() {
    $.ajax({
        type: "POST",
        data: JSON.stringify(config_Array()),
        url: "api/prodotti/pareto/",
        contentType: "application/json",
        success: function (dati) {
            tablePareto(dati);
        }
    });
}
So my question is: is there a way to prevent the $.post from being submitted if the content hasn't changed, or should I just find another approach to submitting the checkbox selection? (As it's a checkbox tree, I chose to submit on mouseleave so the user has some time to decide what to check.)
In this case I would do the following (simple solution):
$("div.sidebar-wrapper")
    .mouseleave(function () {
        postPareto();
    });

// Save a reference to the previously sent value
let prevValue = null;

function postPareto() {
    let data = JSON.stringify(config_Array());
    // If it is the same value we can return and not perform the POST
    if (data === prevValue) {
        return;
    } else {
        // Assign the new value to prevValue
        prevValue = data;
    }
    $.ajax({
        type: "POST",
        data: data, // reuse the already-serialized payload instead of stringifying again
        url: "api/prodotti/pareto/",
        contentType: "application/json",
        success: function (dati) {
            tablePareto(dati);
        }
    });
}
It might be a good idea to look into RxJS; it's quite powerful for reactive sites. For example, in RxJS you could do:
const input = document.querySelector('div.sidebar-wrapper');
const observable = fromEvent(input, 'mouseleave');

observable
    .pipe(
        map(() => JSON.stringify(config_Array())), // serialize so the distinct comparison works by value
        distinctUntilChanged() // only emits items that are distinct by comparison from the previous item
    )
    .subscribe((resp) => {
        // This will only be called if the value changes
        console.log(resp); // resp is the serialized result of `config_Array()`
    });
You can check this article, which explains it a bit more in depth.
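If the concern is also the sheer frequency of mouseleave events, and not just duplicate payloads, a plain debounce works without RxJS. This is only a sketch; the debounce helper and the 300 ms wait are my own choices, not part of the original code:

```javascript
// Minimal debounce: the wrapped function only runs after `wait` ms have
// passed without another call, collapsing a burst of events into one call.
function debounce(fn, wait) {
    let timer = null;
    return function (...args) {
        clearTimeout(timer);
        timer = setTimeout(() => fn.apply(this, args), wait);
    };
}

// Usage sketch (assumes the postPareto function from above):
// $("div.sidebar-wrapper").mouseleave(debounce(postPareto, 300));
```

Combined with the prevValue check, this both limits how often the handler fires and skips requests whose payload has not changed.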
I have been writing many web apps with PHP, MySQL, jQuery and Bootstrap, and now it's time to address this problem: how do I write shorter AJAX queries (posting)?
If I want to write code that works and takes care of many problems, it's too long for every AJAX call.
Is there a better way, or some library/wrapper, that makes the code SHORTER and FASTER to write but still does at least all of the following?
I looked at the popular axios, but it seems even worse.
// Just an example; too complicated
var $btnStatusElem = $("#passwordreset").button('loading');
$.ajax({
    type: "POST",
    cache: false,
    url: "pwreset.php",
    data: postdata,
    success: function(data) {
        $btnStatusElem.button('reset');
        try {
            var datajson = JSON.parse(data);
        }
        catch (e) {
            alert('Unexpected server error');
            return false;
        }
        if (datajson['success'] == true) {
            // do the OK stuff
        } else {
            // show the error code, and stuff
            return false;
        }
    }, // success
    error: function(msg) {
        alert('ERROR');
        $('#passwordreset_result').html(msg);
    }
});
For my AJAX query, I want it to do these steps:
1. Disable the submit button while posting (and re-enable it after 15 seconds, not just leave it disabled until a page refresh)
2. Send JSON and expect JSON in return
3. If the server hits an error, it does NOT return JSON but an error page; that halts all JS execution unless I use try...catch, which is a pain to write each time
4. If the server returns a validation error or some other expected error, detect it and show it to the user
5. If all is OK, do the stuff
As with any refactoring, identify and isolate the repetitive code and pass in the unique bits. In this case, for example, you could isolate the ajax call and json parsing into a function and pass in the url, data, etc.
That function could return a promise that resolves/rejects as appropriate.
Given the doRequest function below (pseudocode, untested and would probably need a bit of tweaking for real-world use), you could then use it all over the place with fewer keystrokes:
doRequest('pwreset.php', postdata, button)
.then(result => {
// do something with the result
})
.catch(error => {
// deal with the error
});
or
try {
const result = await doRequest('pwreset.php', postdata);
// do something with result
}
catch (e) {
// handle error
}
All of the boilerplate stuff is isolated in doRequest.
async function doRequest(url, data, button, type = "POST") {
    return new Promise((fulfill, reject) => {
        $.ajax({
            type,
            url,
            data,
            cache: false,
            success: function(response) {
                if (button) button.button('reset'); // re-enable the button that was passed in
                let datajson;
                try {
                    datajson = JSON.parse(response);
                } catch (e) {
                    return reject(e);
                }
                return datajson['success'] == true ?
                    fulfill(datajson) :
                    reject(datajson);
            }, // success
            error: function(msg) {
                return reject(msg);
            }
        });
    });
}
As @mister-jojo says, you might also want to consider using the fetch API instead of jQuery; the same principle applies.
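For completeness, here is a rough sketch of the same wrapper on top of the fetch API. The names doFetchRequest and checkApiResult are hypothetical, and the {success: ...} response shape is the one from the question:

```javascript
// Sketch: fetch-based equivalent of doRequest (names are hypothetical).
async function doFetchRequest(url, data, type = "POST") {
    const resp = await fetch(url, {
        method: type,
        cache: "no-cache",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(data)
    });
    // fetch does not reject on HTTP error statuses, so check resp.ok manually
    if (!resp.ok) throw new Error("HTTP " + resp.status);
    // .json() throws on a non-JSON body, replacing the manual try/JSON.parse
    const datajson = await resp.json();
    return checkApiResult(datajson);
}

// Pure helper: resolve/reject semantics based on the server's success flag
function checkApiResult(datajson) {
    if (datajson && datajson.success === true) return datajson;
    throw datajson; // validation/expected errors end up in the caller's catch
}
```

The button-disable/re-enable step would stay in the caller or be passed in, exactly as in the jQuery version.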
Trying to solve a problem where my object has an old snapshot of itself and refuses to update when I replace it with new data using AJAX.
var user, status;
function getData(){
var userData = getUserData(),
orderStatus = getOrderStatus(),
allDone = $.when(userData, orderStatus);
allDone.then(function(data, data2){
status = parseInt(data2[0]);
user = JSON.parse(data[0]);
console.log("User", user);
});
}
function getUserData() {
return $.ajax({
type: "POST",
url: functionsPath,
data: {
action: 'getUserData'
}
});
}
function getOrderStatus() {
return $.ajax({
type: "POST",
url: functionsPath,
data: {
action: 'getOrderStatus'
}
})
}
//onclick event
function verify(){
console.log(user);
}
Here is a screenshot of the problem. This first one is from the console; I log this inside the getData() function. Take a closer look at user.content.page4.
The next image is taken from the network tab. As you can see, the data is different when it shouldn't be. There are "Symbol" properties with values. Why are these two different?
This is a problem when I try to use user within verify(), since it has the old snapshot of itself and doesn't contain my Symbol properties etc. It really feels like a console bug. The Symbols are listed in the database.
EDIT
I think I got closer to the problem. The result loses its "Symbol-1", "Symbol-2" and so on immediately when I parse it. So when I parse user, user.content.page4 loses the Symbol properties.
Before parsing I have them, but as soon as I parse the data I lose every Symbol property and its value.
So I have this situation:
renderer: function(value, grid, record) {
    var testAjax = function(callback) {
        Ext.Ajax.request({
            url: appConfig.baseUrl + '/api/users/' + record.getData().id + '/jobRoles',
            method: 'GET',
            success: function(result) {
                callback(result);
            }
        });
    };
    return testAjax(function(result) {
        try {
            result = JSON.parse(result.responseText);
        } catch (e) {
            return '';
        }
        result = result.data;
        var roles = _.map(result, function(jRole) {
            console.log(jRole);
            return jRole.name;
        }).join(',');
        console.log("Roles: ", roles);
        return roles;
    });
}
What I want to achieve: when I have to render a particular field, I make a call to my Loopback endpoint, retrieve some data about a relation, join it with a "," character, and return the resulting string in order to display it.
However, I think I have a problem with callbacks here, as I don't see the result at all; it's as if the function returns before the callback is called (thus showing nothing instead of what I retrieved from the server).
I tried to look here and there, and this is the best I came up with.
How can I return the "roles" variable to the parent function? How do I properly set up my callbacks?
Regards
You cannot and should not use the renderer with load operations and asynchronous callbacks. The renderer can be called dozens of times for the same record if you filter, sort, or just refresh the grid view. What you want to do is get all the information required for display in the grid in a single call; data you cannot get in that single call should not be shown in the grid. You don't want to call the endpoint 1000 times for 1000 records, because even if each call needs only 60 ms, that's a full minute.
That said, if you really have to, because you cannot change the endpoints and the roles have to be displayed, you can do as follows:
dataIndex: 'MyTempRoles',
renderer: function(value, grid, record) {
    if (value) return value; // show the loaded value if available
    else { // no value loaded -> load value
        Ext.Ajax.request({
            url: appConfig.baseUrl + '/api/users/' + record.getData().id + '/jobRoles',
            method: 'GET',
            success: function(result) {
                try {
                    result = JSON.parse(result.responseText);
                    result = result.data;
                    var roles = _.map(result, function(jRole) {
                        return jRole.name;
                    }).join(',');
                    // Put the loaded value into the record. This causes a grid
                    // row refresh, and thus a call to the renderer again.
                    record.set("MyTempRoles", roles || " ");
                } catch (e) {
                }
            }
        });
    }
}
This will call the backend in the first call to the renderer, and asynchronously fill the displayed record's temp variable. When the temp variable is filled, the renderer will then display the value from the temp variable automatically.
I am creating a React app that involves several calls to some webapi REST services to do my thing. Part of the app, is the approval flow of some requests. There is a specific role that can create these flows with a UI that consists of:
A table that lists the steps of the procedure sorted by cardinality (i.e. their order). Each step also shows its actor(s) and its status.
Buttons on each row to move it up/down
Buttons on each row to delete that row
A button to add a new step.
What I do is allow the user to make changes using JavaScript (mostly array operations), while populating an actionsBuffer array with the action and the respective data, e.g.:
this.actionsBuffer.push({
action: "ADD_STEP",
data: next
});
When the user is happy with the arrangement, she can press the Accept button, which iterates over the actionsBuffer array and executes the appropriate REST service, determined by the action field.
I know my description might seem too detailed, but I wanted you to have the context.
Question:
Since the calls are asynchronous, how can I guarantee that the actions will execute in this order?
Some code snippets:
This iterates over the buffer and calls determineAction:
onAccept: function (e) {
e.preventDefault();
var self = this;
//console.log("Gonna save:",JSON.stringify(this.state.workflow));
var ret=null;
// First we do actions in actionsBuffer
for(var i=0;i<this.actionsBuffer.length;i++)
{
ret = self.determineAction(this.actionsBuffer[i]);
if (ret==false)
break;
else
this.actionsBuffer.splice(i,1);
ret=null;
}
this.saveAll();
},
And determineAction (pardon the debugging console messages):
determineAction: function (action) {
var url="";
var verb="";
switch(action.action)
{
case "ADD_STEP":
delete action.data.ActorList;
url=this.props.server+"/workflows/"+this.props.workflowid+"/steps";
verb="POST";
break;
case "DELETE_STEP":
url=this.props.server+"/workflows/"+this.props.workflowid+"/delete/";
verb="POST";
break;
}
console.log("Going to call url:",url," with verb:",verb," and data:",action.data);
$.ajax({
type: verb,
url: url,
data: JSON.stringify(action.data),
processData:false,
contentType: 'application/json'
})
.success(function(data) {
return true;
//self.props.onclose(self.state.workflows.WorkflowId);
})
.error(function(jqXhr) {
console.log(jqXhr);
return false;
});
},
You are not waiting for determineAction to finish. Make it return a promise and wait for it where you call it. Your loop also has to be asynchronous. I've created an attempt that may not be exactly what you need, but it shows you the direction to move in.
onAccept: function (e) {
    e.preventDefault();
    var self = this;
    // Process the actions in actionsBuffer one at a time, in order
    var i = 0;
    function makeRequest() {
        self.determineAction(self.actionsBuffer[i]).success(function() {
            i++;
            if (i >= self.actionsBuffer.length) {
                self.saveAll();
            } else {
                makeRequest();
            }
        }).error(function() {
            self.saveAll();
        });
    }
    makeRequest();
},
determineAction: function (action) {
var url="";
var verb="";
switch(action.action)
{
case "ADD_STEP":
delete action.data.ActorList;
url=this.props.server+"/workflows/"+this.props.workflowid+"/steps";
verb="POST";
break;
case "DELETE_STEP":
url=this.props.server+"/workflows/"+this.props.workflowid+"/delete/";
verb="POST";
break;
}
console.log("Going to call url:",url," with verb:",verb," and data:",action.data);
return $.ajax({
type: verb,
url: url,
data: JSON.stringify(action.data),
processData:false,
contentType: 'application/json'
});
},
Rather than iterating over your array of actions synchronously with a for loop, treat it as a queue:
Take the first item from the queue and execute it.
When the asynchronous work finishes, take the next item from the queue.
Repeat until you've cleared the queue.
Here's a simple example.
function processActions(actionQueue) {
if(actionQueue.length == 0) return;
// take the first action from the queue
var action = actionQueue[0];
// assuming determineAction() returns a promise
determineAction(action)
.then(function() {
var remainingActions = actionQueue.slice(1);
// we know this action has completed, so we can pass
// the remaining actions to be processed
processActions(remainingActions);
});
}
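The same queue can also be drained with async/await, which makes the "one at a time, in order" guarantee explicit. This is a sketch; the handler parameter stands in for a determineAction that returns a promise, as in the answers above:

```javascript
// Sketch: process a queue of actions strictly sequentially.
// `handler` is assumed to return a promise (like determineAction above).
async function processActions(actionQueue, handler) {
    const results = [];
    for (const action of actionQueue) {
        // await ensures the next request only starts once this one finished
        results.push(await handler(action));
    }
    return results;
}
```

A caller would then do `processActions(this.actionsBuffer, determineAction).then(saveAll)`, with a catch for the failure case.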
Using Backbone.js we have an application in which, on a certain occasion, we need to send an AJAX POST to a client's web service.
However, the content to be posted is dynamic and is determined by a certain array.
For each item in the array we need to go fetch a piece of data.
After assembling the data, that aggregated object needs to be sent.
As of now, I have a sequential approach (one request at a time), though I feel this is not the best way.
var arrParams = [{id: 1, processed: false},{id: 7, processed: false},{id: 4, processed: false}];
function callback(data) {
    $.post()... // jquery ajax to post the data...
}
function fetchData(arr, data, callback) {
var currentId = _(arr).find(function(p){ return p.processed === false; }).id; // getting the ID of the first param that has processed on false...
// ajax call fetching the results for that parameter.
$.ajax({
url: 'http://mysuperwebservice.com',
type: 'GET',
dataType: 'json',
data: {id: currentId},
success: function(serviceData) {
data[currentId] = serviceData; // insert it into the data
_(arr).find(function(p){ return p.id === currentId; }).processed = true; // set this param in the array to 'being processed'.
// if more params not processed, call this function again, else continue to callback
if(_(arr).any(function(p){ return p.processed === false }))
{
fetchData(arr, data, callback);
}
else
{
callback(data);
}
},
error: function(){ /* not important fr now, ... */ }
});
}
fetchData(arrParams, {}, callback);
Isn't there a way to launch these calls asynchronously, in parallel, and execute the callback only when all results are in?
You have to use the jQuery $.Deferred object to sync them. Look at the Deferred docs.
You can use it this way:
$.when(
$.ajax({ url : 'url1' }),
$.ajax({ url : 'url2' }) // or even more calls
).done(done_callback).fail(fail_callback);
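Since your arrParams is dynamic, you can build the array of requests first and apply $.when to it. A sketch under the question's assumptions (jQuery available, one GET per id; fetchAll and aggregate are hypothetical names):

```javascript
// Sketch: fire all requests in parallel, aggregate when every one resolves.
function fetchAll(arrParams, callback) {
    var requests = arrParams.map(function (p) {
        return $.ajax({
            url: 'http://mysuperwebservice.com',
            type: 'GET',
            dataType: 'json',
            data: { id: p.id }
        });
    });
    // $.when resolves once ALL the deferreds resolve; each argument it
    // passes is one [data, textStatus, jqXHR] triple, in call order
    $.when.apply($, requests).done(function () {
        callback(aggregate(arrParams, arguments));
    });
}

// Pure helper: map each response body back to its parameter id
function aggregate(arrParams, responses) {
    var data = {};
    for (var i = 0; i < arrParams.length; i++) {
        data[arrParams[i].id] = responses[i][0];
    }
    return data;
}
```

This keeps the `data[id] = serviceData` shape from the question while letting the requests run concurrently.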
I would do something like this: make a function that, besides the parameters you pass to fetchData, also gets the index within arrParams, then call that function in a loop for every element. In the success handler, set processed to true on your element and check whether you're the last one by going through the array and seeing if all the rest are true as well.
A slight optimization is to keep a counter:
var todo = arrParams.length;
and in the success handler you do:
if (--todo == 0) {
callback(...)
}
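Put together, the counter pattern looks roughly like this. It's a sketch: fetchAllCounted is a hypothetical name, and requestOne stands in for the per-item $.ajax call (any async completion works the same way):

```javascript
// Sketch: run one async request per parameter, invoke callback once
// the last response arrives, regardless of completion order.
function fetchAllCounted(arrParams, requestOne, callback) {
    var todo = arrParams.length;
    var data = {};
    arrParams.forEach(function (p) {
        requestOne(p.id, function (serviceData) {
            data[p.id] = serviceData;
            // the last response in triggers the callback
            if (--todo === 0) {
                callback(data);
            }
        });
    });
}
```

Unlike the recursive approach, this fires everything in parallel and only synchronizes at the end.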