What is the "right" way to handle a response that arrives late - javascript

Let's say we have two buttons, each of which calls the following method:
var NUMBER_OF_IMAGE_REQUEST_RETRIES = 3;
var IMAGE_REQUEST_TIMEOUT = 3000;

processImage: function(image_data) {
    var main_response = $q.defer();
    var hash = getImageHash(image_data);
    var requestsCounter = -1;
    var requestImage = function() {
        // note: the timeout belongs in the config (third) argument of $http.post
        $http.post(apiUrl, {params: {data: hash}}, {timeout: IMAGE_REQUEST_TIMEOUT})
            .then(function(response) {
                return main_response.resolve(response.data);
            }, function(error) {
                if (++requestsCounter < NUMBER_OF_IMAGE_REQUEST_RETRIES) {
                    requestImage();
                } else {
                    return main_response.reject();
                }
            });
    };
    requestImage();
    return main_response.promise;
}
The method sends image-related data to the server; the server processes the data and then responds. Each time the user presses a different button, different image_data is sent to the server.
The problem:
The user presses button 1 and the method is called with image_data_1, then immediately presses button 2 and the method is called with image_data_2. The processImage function is called by another method, say doSomethingWithTheResponse, which only cares about the user's latest action. But image_data_2 is processed faster by the server, so the client receives the response for image_data_2 before the response for image_data_1, and therefore believes the image_data_1 response relates to the user's latest action, which is not the case. How can we ensure that the client always acts on the response that relates to the user's latest action?
Note: The hash is different for the different image_data requests.
I was thinking something like:
var oldhash = null;

processImage: function(image_data) {
    var main_response = $q.defer();
    var hash = getImageHash(image_data);
    oldhash = hash;
    var requestsCounter = -1;
    var requestImage = function(hash) {
        if (hash === oldhash) {
            $http.post(apiUrl, {params: {data: hash}}, {timeout: IMAGE_REQUEST_TIMEOUT})
                .then(function(response) {
                    return main_response.resolve(response.data);
                }, function(error) {
                    if (++requestsCounter < NUMBER_OF_IMAGE_REQUEST_RETRIES) {
                        requestImage(hash);
                    } else {
                        return main_response.reject();
                    }
                });
        } else {
            main_response.reject();
        }
    };
    requestImage(hash);
    return main_response.promise;
}
But I am not 100% sure that this is the right approach.

Simply disregard the previous requests.
You can create a repository of requests (an array or dictionary implementation is okay) and call .abort() on the previous ones whenever a new request is made -- that is, when you add the new one to your storage.
If you want a dictionary, there is a good example here (it tackles a different topic, though), but here is a modified snippet of that code adapted to your case:
var _pendingRequests = {};

function abortPendingRequests(key) {
    if (_pendingRequests[key]) {
        _pendingRequests[key].abort();
    }
}
The key can be, say, a category of your action. You can define named constants for it, or it can simply be the name of the button pressed. It can even be the URL of your request; it's completely up to you.
There is an excellent explanation of the whole concept here:
jquery abort() ajax request before sending another
https://stackoverflow.com/a/3313022/594992
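Since the question uses AngularJS, note that $http has no .abort(); the closest equivalent is its timeout config option, which accepts a promise that aborts the request when resolved. A minimal sketch building on the registry above (postWithAbort is a hypothetical helper; apiUrl and hash come from the question):
function postWithAbort(key, url, payload) {
    abortPendingRequests(key);                  // cancel the previous request for this key
    var canceler = $q.defer();
    _pendingRequests[key] = { abort: function() { canceler.resolve(); } };
    return $http.post(url, payload, { timeout: canceler.promise })
        .finally(function() { delete _pendingRequests[key]; });
}

// usage: only the latest click can ever deliver a response
// postWithAbort('processImage', apiUrl, { data: hash }).then(doSomethingWithTheResponse);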

If your UI allows multiple actions to be initiated while the processing of those actions is mutually exclusive, then you should probably use promises and track the active ones.
button1.addEventListener("click", function(evt) {
    startRunning( task1.start() );
});
button2.addEventListener("click", function(evt) {
    startRunning( task2.start() );
});
With a task runner like:
function startRunning( promise ) {
    while (runningTasks.length > 0) {
        cancel( runningTasks.shift() );   // shift() removes and returns the oldest entry
    }
    runningTasks.push( promise );
}
Your cancel function can come from anything that can deal with promises, like Angular's service.cancelRequest, or you can write your own code that takes the promise and smartly breaks off its operation.
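A minimal sketch of such hand-rolled cancellation, assuming plain promises: the wrapper does not abort the underlying work, it only cuts off its effects.
function cancellable(promise) {
    var cancelled = false;
    var wrapped = promise.then(function(result) {
        if (cancelled) { throw new Error("cancelled"); } // stale result, drop it
        return result;
    });
    wrapped.cancel = function() { cancelled = true; };
    return wrapped;
}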
Of course, if you're not using promises, then you probably want to start doing so, but if you absolutely can't, you can use a manager object like:
button1.addEventListener("click", function(evt) { task1(); });
button2.addEventListener("click", function(evt) { task2(); });
with
var manager = [];

function cancelAll() {
    while (manager.length > 0) {
        var cancelfn = manager.shift();   // shift() removes and returns the oldest entry
        cancelfn();
    }
    return true;
}
function task1() {
    var running = cancelAll();
    manager.push(function() { running = false; });
    asyncDo(something1, function(result) {
        if (!running) return;
        // do your real thing
    });
}

function task2() {
    var running = cancelAll();
    manager.push(function() { running = false; });
    asyncDo(something2, function(result) {
        if (!running) return;
        // do your real thing
    });
}
And you can put cancels on as many aspects as you need. If you need to cancel running XHRs, you might be able to do so; if you have multiple steps in your result handling, cut off at the start of each step, etc.
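A sketch of that per-step cutoff, reusing the manager array from above (steps is a hypothetical array of functions that each return a promise):
function runSteps(steps) {
    var running = true;
    manager.push(function() { running = false; });
    return steps.reduce(function(chain, step) {
        return chain.then(function(value) {
            if (!running) { throw new Error("cancelled"); } // cut off at each step start
            return step(value);
        });
    }, Promise.resolve());
}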

This sounds like an ideal use-case for promises. Basically, whenever a new request is made, you want to cancel any existing promises. I am not versed in AngularJS, but the following ng-specific links might prove useful:
Angularjs how to cancel resource promise when switching routes
Canceling A Promise In AngularJS
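If aborting the transport is more than you need, a simpler "latest wins" guard also answers the original question: bump a shared token on every call and drop any response that arrives for an older token. A sketch wrapping the question's processImage:
var latestToken = 0;

function processImageLatest(image_data) {
    var token = ++latestToken;                  // this call is now the latest action
    return processImage(image_data).then(function(data) {
        if (token !== latestToken) {
            return $q.reject('stale response'); // a newer request superseded this one
        }
        return data;
    });
}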


IndexedDB - During upgrade force abort, but with Promises

I used this post: IndexedDB: upgrade with promises?
And implemented the part here: https://stackoverflow.com/a/25565755/15778635
This works for what I need. The part I am having trouble with is this:
var newMigrationPromise = function (dbName, version, migration) {
    return newPromise(function (deferred) {
        var request = indexedDB.open(dbName, version);
        // NB: caller must ensure upgrade callback always called
        request.onupgradeneeded = function (event) {
            var db = request.result;
            newTransactionPromise(
                function () {
                    // ('transaction' below is never defined in this scope, and an
                    // IDBRequest fires "success", not "complete" -- see the answer below)
                    var syncUPStore = transaction.objectStore("syncUP");
                    var syncCountRequest = syncUPStore.count();
                    syncCountRequest.oncomplete = function (event) {
                        if (syncCountRequest.result > 0)
                            deferred.reject(syncCountRequest.result + " SyncUp Records exist, database upgrade aborted, keeping at current version.");
                        else {
                            // Good, continue with update
                            migration(db, request.transaction);
                            return request.transaction;
                        }
                    }
                })
                .then(function () { db.close(); })
                .then(deferred.resolve, deferred.reject);
        };
        request.onerror = function (ev) { deferred.reject(request.error); };
    });
};
I have a syncUP object store containing data that needs to be sent to the server when the user goes online. In this particular case the service worker is installing (because the user came online and a change was put on the server) and needs to know whether syncUP records exist before allowing the service worker to update. If they do exist, the install must be aborted until the store is empty.
The service worker abort works fine, and aborting the database upgrade works fine if I throw an error where var syncCountRequest = syncUPStore.count(); is.
My question:
How can I check whether there are records in the "syncUP" object store and still use the implementation mentioned above? I had considered moving the logic to another method, but I ran into the same issue of not knowing how to handle the reject/resolve. My knowledge of promises is OK, but not yet good enough to figure this out on my own.
A rushed example:
var request = indexedDB.open(...);
request.onupgradeneeded = function(event) {
    if (conditionShouldDoMigrationFromVersionXToNowIsTrue) {
        migrate(event.target.transaction); // the versionchange transaction lives on the open request
    }
};

function migrate(versionChangeTransaction) {
    var store = versionChangeTransaction.objectStore('x');
    var request = store.getAll();
    request.onsuccess = function(event) {
        var objects = event.target.result;
        for (var object of objects) {
            // do some mutation to the object
            object.x++;
            // write it back
            store.put(object);
        }
    };
}
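To answer the count question specifically, here is a sketch in the same style (assuming the asker's deferred and migration callback). The two details that matter: an IDBRequest fires "success" rather than "complete", and aborting the versionchange transaction keeps the database at its current version.
request.onupgradeneeded = function(event) {
    var db = request.result;
    var tx = request.transaction;               // the versionchange transaction
    var countRequest = tx.objectStore("syncUP").count();
    countRequest.onsuccess = function() {       // "success", not "complete"
        if (countRequest.result > 0) {
            tx.abort();                         // stay at the current version
            deferred.reject(countRequest.result + " SyncUp records exist, upgrade aborted.");
        } else {
            migration(db, tx);                  // good, continue with the update
        }
    };
    countRequest.onerror = function() { deferred.reject(countRequest.error); };
};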

AngularJS $q.all - wait between http calls

So I have a situation where I need to perform a bunch of HTTP calls and then, once they are all complete, continue on to the next step in the process.
Below is the code which does this, and it works fine.
However, I now need to wait a few seconds between each of the HTTP calls. Is there a way to pass in a timeout with my current setup, or will it involve a fair bit of refactoring?
I can post more code if need be. I have tried passing a timeout config variable into the HTTP call; however, the calls still get fired at the same time.
Any advice would be great.
Code
var allThings = array.map(function(object) {
    var singleThingPromise = getFile(object.id);
    return singleThingPromise;
});

$q.all(allThings).then(function() {
    deferred.resolve('Finished');
}, function(error) {
    deferred.reject(error);
});
Instead of using $q.all, you might want to perform the calls sequentially, each one starting on success of the previous, probably with the help of $timeout. You could build a recursive function.
Something like this:
function performSequentialCalls(index) {
    if (angular.isUndefined(array[index])) {
        return;
    }
    getFile(array[index].id).then(function() {
        $timeout(function() {
            performSequentialCalls(index + 1);
        }, 1000); // wait 1 sec after each call
    });
}
Inject the required services properly. This assumes array contains objects with ids, which you use to perform the API calls. It also assumes you are using $http; if you are using $resource, add $promise accordingly.
Hope that helps a bit!
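An alternative sketch (not from the answer above): build the same sequence with Array.prototype.reduce, so there is still a single promise to resolve when everything is done. It assumes the question's array, getFile, and deferred:
var done = array.reduce(function(previous, object) {
    return previous
        .then(function() { return getFile(object.id); })
        .then(function() { return $timeout(angular.noop, 1000); }); // 1 sec gap
}, $q.when());

done.then(function() { deferred.resolve('Finished'); }, deferred.reject);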
function getItemsWithDelay(index) {
    getFile(object[index].id).then(() => {
        setTimeout(() => {
            if (index + 1 >= object.length) { return; }   // stop past the last item
            getItemsWithDelay(index + 1);
        }, 5000);
    });
}
This way you can make the calls sequentially.
This is an awesome trick question to be asked in an interview. Anyway, I had a similar requirement and did some research on the internet; thanks to the reference https://codehandbook.org/understanding-settimeout-inside-for-loop-in-javascript
I was able to delay all the promise calls in AngularJS, and the same can be applied in plain JS syntax as well.
I needed to send tasks to a TTP API, and they requested that a delay be added between calls.
_sendTasks: function(taskMeta) {
    var defer = $q.defer();
    var promiseArray = [];
    const delayIncrement = 1000 * 5;
    let delay = 0;
    for (let i = 0; i < taskMeta.length; i++) {
        // using 'let' is VERY IMPORTANT here: with 'var' every iteration would
        // close over the same variable and send the same task in all http calls
        let requestTask = {
            "action": "SOME_ACTION",
            "userId": '',
            "sessionId": '',
        };
        // resolve with the $http promise so $q.all waits for the actual request
        // (new Promise could be replaced with $q - you can try that, I haven't tested it)
        promiseArray.push(new Promise((resolve) =>
            setTimeout(() => resolve($http.post(config.API_ROOT_URL + '/' + requestTask.action, requestTask)), delay)));
        delay += delayIncrement;
    }
    $q.all(promiseArray)
        .then(function(results) {
            // handle the results and resolve at the end
            defer.resolve(results);
        })
        .catch(error => {
            console.log(error);
            defer.reject("failed to execute");
        });
    return defer.promise;
}
Note: using the 'let' keyword in the FOR loop is VERY IMPORTANT; with 'var', the same task would be sent in all HTTP calls, because every callback would share one closed-over variable.
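A two-line illustration of that note: with var every callback sees the loop's final value, while let gives each iteration its own binding.
for (var i = 0; i < 3; i++) { setTimeout(function() { console.log(i); }, 0); } // logs 3 3 3
for (let j = 0; j < 3; j++) { setTimeout(function() { console.log(j); }, 0); } // logs 0 1 2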

How do I wait for a promise to fill and then return a generator function?

I know this is wrong, but essentially I want to:
connect to a db/orm via a promise,
wait on that promise to fulfill and get the models (the return value of the promise),
use the results to form a middleware generator function that places the models on the request.
I suspect that this isn't the best approach, so essentially I have two questions:
Should I rewrite my db/orm connect as a generator function? (I have a feeling that is more in line with koa style.)
Back to the original question (as I am sure I won't get a chance to rewrite all my business logic): how do I wait on a promise to fulfill and then return a generator function?
This is my poor attempt, which is not working. Honestly, I didn't expect it to, but I wanted to start by writing code, to have something to work with while figuring this out:
var connectImpl = function() {
    var qPromise = q.nbind(object.method, object);
    return qPromise;
}

var domainMiddlewareImpl = function() {
    let connectPromise = connectImpl()
    return connectPromise.then(function(models) {
        return function *(next) {
            this.request.models = models;
        }
    })
}

var app = koa()
app.use(domainMiddlewareImpl())
According to this, you can do the following:
var domainMiddlewareImpl = function() {
    return function *() {
        this.request.models = yield connectImpl();
    };
};
A context-sensitive answer based on the info provided by Hugo (thx):
var connectImpl = function() {
    var qPromise = q.nbind(object.method, object);
    return qPromise;
}

var domainMiddlewareImpl = function() {
    let models = null;
    return function *(next) {
        if (models == null) {
            // we only want one "connection"; if that sets up a pool, so be it
            models = yield connectImpl();
        }
        this.request.models = models.collections;
        this.request.connections = models.connections;
        yield next;
    };
};
In my example, connectImpl sets up domain models in an ORM (Waterline for now), connects to a database (pooled), and returns a promise for the ORM models and DB connections. I only want that to happen once; then, for every request through my Koa middleware, the objects are added to the request.

angularjs - $http reading json and wait for callback

I am trying to read data from a JSON file and wait until the data has been fetched into $scope.urls.content. So I wrote:
$scope.urls = { content: null };

$http.get('mock/plane_urls.json').success(function(thisData) {
    $scope.urls.content = thisData;
});
Now I am trying to write something like a callback, but it doesn't work. How can I do that? Is there a function for this? I am running out of ideas.
Do you mean something like this?
$http.get('mock/plane_urls.json').success(function(thisData) {
    $scope.urls.content = thisData;
    $scope.yourCallback();
});

$scope.yourCallback = function() {
    // your code
};
You want to work with promises and $resource.
Since $http itself returns a promise, all you have to do is chain onto its return value. Simple as that:
var promise = $http.get('mock/plane_urls.json').then(function(thisData) {
    // note: with .then the argument is the full response object, not just its data
    $scope.urls.content = thisData;
    return 'something';
});

// somewhere else in the code
promise.then(function(data) {
    // receives the data returned from the http handler
    console.log(data === "something");
});
I made a pretty simple fiddle here.
But if you need this info repeatedly, you should expose it through a service, so anyone can grab the result and process it, i.e.:
service('dataService', function($http) {
    var requestPromise = $http.get('mock/plane_urls.json').then(function(d) {
        return d.data;
    });
    this.getPlanesURL = function() {
        return requestPromise;
    };
});
// and anywhere in code where you need this info
dataService.getPlanesURL().then(function(planes) {
    // do something with planes URL
    $scope.urls.content = planes;
});
Just an important note: this service as mocked will cache and always return the same data. If you need to fetch this JSON many times, you should go with $resource.
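A sketch of that $resource variant (assuming ngResource is available and the same URL as above), which issues a fresh request on every call:
service('dataService', function($resource) {
    var planeUrls = $resource('mock/plane_urls.json');
    this.getPlanesURL = function() {
        return planeUrls.get().$promise;   // new GET each time, no caching
    };
});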

node-mysql timing

I have a recursive query like this (note: this is just an example):
var user = function(data) {
    this.minions = [];
    this.loadMinions = function() {
        _user = this;
        database.query('select * from users where owner=' + data.id, function(err, result, fields) {
            for (var m in result) {
                _user.minions[result[m].id] = new user(result[m]);
                _user.minions[result[m].id].loadMinions();
            }
        });
        console.log("loaded all minions");
    }
}

currentUser = new user(ID);
for (var m in currentUser.minions) {
    console.log("minion found!");
}
This doesn't work because the timing is all wrong: the code doesn't wait for the query.
I've tried this:
var MyQuery = function(QueryString) {
    var Data;
    var Done = false;
    database.query(QueryString, function(err, result, fields) {
        Data = result;
        Done = true;
    });
    while (Done != true) {};
    return Data;
}

var user = function(data) {
    this.minions = [];
    this.loadMinions = function() {
        _user = this;
        result = MyQuery('select * from users where owner=' + data.id);
        for (var m in result) {
            _user.minions[result[m].id] = new user(result[m]);
            _user.minions[result[m].id].loadMinions();
        }
        console.log("loaded all minions");
    }
}

currentUser = new user(ID);
for (var m in currentUser.minions) {
    console.log("minion found!");
}
but it just freezes on the while loop. Am I missing something?
The first hurdle to solving your problem is understanding that I/O in Node.js is asynchronous. Once you know how this applies to your problem, the recursive part will be much easier (especially if you use a flow control library like Async or Step).
Here is an example that does some of what you're trying to do (minus the recursion). Personally, I would avoid recursively loading a possibly unknown number/depth of records like that; instead, load them on demand, as in this example:
var User = function(data) {
    this.data = data;
    this.minions = null;
};

User.prototype.getMinions = function(primaryCallback) {
    var that = this; // scope handle
    if (this.minions) { // bypass the db query if results are cached
        return primaryCallback(null, this.minions);
    }
    // Callback invoked by database.query when it has the records
    var aCallback = function(error, results, fields) {
        if (error) {
            return primaryCallback(error);
        }
        // This is where you would put your recursive minion initialization.
        // The problem you are going to have is callback counting; using a library
        // like async or step would make this much, much easier
        that.minions = results; // bypass the db query after this
        primaryCallback(null, results);
    };
    database.query('SELECT * FROM users WHERE owner = ' + this.data.id, aCallback);
};

var user = new User(someData);
user.getMinions(function(error, minions) {
    if (error) {
        throw error;
    }
    // Inside the function invoked by primaryCallback(...)
    minions.forEach(function(minion) {
        console.log('found this minion:', minion);
    });
});
The biggest thing to note in this example is the callbacks. database.query(...) is asynchronous, and you don't want to tie up the event loop waiting for it to finish. This is solved by providing a callback, aCallback, to the query, which is executed when the results are ready. Once that callback fires, and after you perform whatever processing you want on the records, you can fire the primaryCallback with the final results.
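For the recursive part, a sketch of the callback counting that the comment above alludes to, using a plain counter instead of a library (it assumes each returned row is wrapped in a User before recursing; a real implementation would also guard against done firing more than once on error):
function loadMinionsRecursively(user, done) {
    user.getMinions(function(error, rows) {
        if (error) { return done(error); }
        user.minions = rows.map(function(row) { return new User(row); });
        var pending = user.minions.length;
        if (pending === 0) { return done(null, user); }    // leaf node, nothing below
        user.minions.forEach(function(minion) {
            loadMinionsRecursively(minion, function(err) {
                if (err) { return done(err); }
                if (--pending === 0) { done(null, user); } // every branch finished
            });
        });
    });
}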
Each Node.js process is single-threaded, so the line
while(Done != true){};
takes over the thread, and the callback that would have set Done to true never gets run, because the thread is blocked in an infinite loop.
You need to refactor your program so that code that depends on the results of the query is included within the callback itself. For example, make MyQuery take a callback argument:
MyQuery = function(QueryString, callback){
Then call the callback at the end of your database.query callback -- or even supply it directly as the database.query callback.
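A minimal sketch of that refactor, keeping the asker's names:
var MyQuery = function(QueryString, callback) {
    database.query(QueryString, callback);  // simply forward the async callback
};

this.loadMinions = function() {
    var _user = this;
    MyQuery('select * from users where owner=' + data.id, function(err, result, fields) {
        for (var m in result) {
            _user.minions[result[m].id] = new user(result[m]);
            _user.minions[result[m].id].loadMinions();
        }
        console.log("loaded all minions"); // now runs only after this query finishes
    });
};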
The freezing is, unfortunately, correct behaviour, as Node is single-threaded.
You need a scheduler package to fix this. Personally, I have been using fibers-promise for this kind of issue. You might want to look at it, at another promise library, or at async.
