We provide a dropdown at the top. Let's say it has options A, B, and C.
Whenever the user changes the dropdown option, a saga gets triggered which makes around 10 different web API calls (a map of calls which get executed in parallel).
We use the takeLatest helper in the saga watcher; a rough sketch of that setup is below.
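For reference, our watcher/worker setup looks roughly like this (the action type, endpoints, and names are simplified/illustrative):

import axios from "axios"
import { all, call, takeLatest } from "redux-saga/effects"

// Worker: fires the ~10 web API calls in parallel
function* fetchAllData(action) {
  yield all([
    call(axios.get, "/api/1", { params: { option: action.payload } }),
    call(axios.get, "/api/2", { params: { option: action.payload } }),
    // ...8 more calls
  ])
}

// Watcher: takeLatest cancels the previous fetchAllData run when the option changes again
export function* watchDropdownChange() {
  yield takeLatest("DROPDOWN_OPTION_CHANGED", fetchAllData)
}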
So if the user immediately changes the dropdown from A to B, only the calls made via B are consumed on the client end, as takeLatest aborts the saga that was triggered by A.
Now, the problem is that even though only B's calls are consumed, all of A's pending promise calls also run to completion.
Only the saga (triggered by A) gets aborted, not the API calls.
We can see that in the network tab.
So, if the user switches between A, B, and C rather quickly, we end up with around 30-40 calls to the server.
All of those run to completion even though we are only interested in the completion of the last 10 calls.
In Chrome DevTools, the last 10 calls sit queued or stalled until the earlier ones are done.
Isn't there a way to propagate the cancellation from the saga (at the takeLatest level) to axios, and thereby cancel the promises?
The axios docs talk about cancellation and cancel tokens, but I'm not clear on how to propagate the cancel in a redux-saga and axios environment.
How do I initiate the cancellation from the saga to axios?
import axios from "axios"
import { all, call, cancelled } from "redux-saga/effects"

// Worker saga (name is illustrative) that runs the parallel requests
function* fetchAllData() {
  // One cancel source shared by every request started in this saga run
  const cancelSource = axios.CancelToken.source()
  try {
    yield all([
      call(axios.get, "/api/1", { cancelToken: cancelSource.token }),
      call(axios.get, "/api/2", { cancelToken: cancelSource.token }),
      // ...
    ])
  } finally {
    // When takeLatest cancels this saga, propagate the cancellation to axios
    // so the in-flight HTTP requests are aborted as well
    if (yield cancelled()) {
      yield call(cancelSource.cancel)
    }
  }
}
If someone is interested in a working solution to this exact problem: I am the creator of a redux-saga addon library aimed specifically at this kind of problem - https://github.com/klis87/redux-saga-requests
It does this kind of thing automatically for you - if a saga is cancelled, the request is aborted automatically, so you don't even need to think about it.
The top answer helped me; however, I had to use this call:
yield call(cancelSource.cancel)
I'm having trouble understanding control flow with asynchronous programming in JS. I come from a classic OOP background, e.g. C++. Your program starts in the "main" -- top level -- function and it calls other functions to do stuff, but everything always comes back to that main function and it retains overall control. And each sub-function retains control of what it's doing even when it calls sub-functions. Ultimately the program ends when that main function ends. (That said, that's about as much as I remember of my C++ days, so answers with C++ analogies might not be helpful lol).
This makes control flow relatively easy. But I get how that's not designed to handle event-driven programming as needed on something like a web server. JavaScript (let's talk Node for now, not the browser), on the other hand, handles event-driven web servers with callbacks and promises with relative ease... apparently.
I think I've finally got my head around the idea that with event-driven programming the entry point of the app might do little more than set up a bunch of listeners and then get out of the way (effectively end itself). The listeners pick up all the action and respond.
But sometimes stuff still has to be synchronous, and this is where I keep getting unstuck.
With callbacks, promises, or async/await, we can effectively build synchronous chains of events, e.g. with Promises:
doSomething()
  .then(result => doSomethingElse(result))
  .then(newResult => doThirdThing(newResult))
  .then(finalResult => {
    console.log(`Got the final result: ${finalResult}`);
  })
  .catch(failureCallback);
Great. I've got a series of tasks I can do in order -- kinda like more traditional synchronous programming.
My question is: sometimes you need to deviate from the chain. Ask some questions and act differently depending on the answers. Perhaps conditionally there's some other function you need to call to get something else you need along the way. You can't continue without it. But what if it's an async function and all it's going to give me back is a promise? How do I get the actual result without the control flow running off and eloping with that function and never coming back?
Example:
I want to call an API in a database, get a record, do something with the data in that record, then write something back to the database. I can't do any of those steps without completing the previous step first. Let's assume there aren't any sync functions that can handle this API. No problem. A Promise chain (like the above) seems like a good solution.
But... Let's say when I call the database the first time, the authorization token I picked up earlier for it has expired and I have to get a new one. I don't know that until I make that first call. I don't want to get (or even test for the need for) a new auth token every time. I just want to be able to respond when a call fails because I need one.
Ok... In synchronous pseudo-code that might look something like this:
let Token = X
Step 1: Call the database(Token). Wait for the response.
Step 2: If response says need new token, then:
Token = syncFunctionThatGetsAndReturnsNewToken().
// here the program waits till that function is done and I've got my token.
Repeat Step 1
End if
Step 3: Do the rest of what I need to do.
But now we need to do it in JavaScript/Node with only async functions, so we use a promise (or callback) chain?
let Token = X
CallDatabase(Token)
.then(check if response says we need new token, and if so, get one)
.then(...
Wait a sec. That "if so, get one" is the part that's screwing me. All this asynchronicity in JS/node isn't going to wait around for that. That function is just going to "promise" me a new token sometime in the future. It's an IOU. Great. Can't call the database with an IOU. Well ok, I'd be happy to wait, but node and JS won't let me, because that's blocking.
That's it in a (well, ok, rather large) nutshell. What am I missing? How do I do something like the above with callbacks or Promises?
I'm sure there's a stupid "duh" moment in my near future here, thanks to one or more of you wonderful people. I look forward to it. 😉 Thanks in advance!
What you do with the .then call is attach a function which will run when the Promise resolves in a future task. The processing inside that function is itself synchronous, and can use any control flow you'd want:
getResponse()
  .then(response => {
    if (response.needsToken)
      return getNewToken().then(getResponse);
    return response;
  })
  .then(response => {
    /* runs with a valid response - either the token was still valid or it was renewed */
  })
If the token is expired, instead of the Promise returned by .then resolving immediately, a new asynchronous action gets started to retrieve a new token. Once that asynchronous action is done, it resolves (in a new task) the Promise it returns, and since that Promise was returned from the .then callback, this in turn resolves the outer Promise and the Promise chain continues.
Note that these Promise chains can get complicated very quickly, and with async functions this can be written more elegantly (though under the hood it is about the same):
// inside an async function
let response;
do {
  response = await getResponse();
  if (response.needsToken)
    await renewToken();
} while (response.needsToken);
First of all, I would recommend against using the then and catch methods to listen for Promise results. They tend to create deeply nested code which is hard to read and maintain.
I worked up a prototype for your case which makes use of async/await. It also features a mechanism to keep track of the attempts we make to authenticate to the database. If we reach the maximum number of attempts, it would be viable to send an emergency alert to an administrator etc. for notification purposes. This avoids an endless loop of trying to authenticate and instead helps you take proper action.
'use strict'

// maximum authentication attempts before giving up (value is illustrative)
const maxAttempts = 3;
var token;

async function getBooks() {
  // In case you are not using an ORM (Sequelize, TypeORM), I would suggest using
  // at least a query builder like Knex
  const query = generateQuery(options);
  const books = await executeQuery(query);
  return books;
}

async function executeQuery(query) {
  let attempts = 0;
  if (!token) {
    await getDbAuthToken();
  }
  while (attempts < maxAttempts) {
    try {
      attempts++;
      // call database
      // return result
    }
    catch (err) {
      // token expired
      if (err.code == 401) {
        await getDbAuthToken();
      }
      else {
        // not an auth error - don't retry, let the caller handle it
        throw err;
      }
    }
  }
  throw new Error('Critical error! After several attempts, authentication to the db failed. Take immediate steps to fix this.');
}

// This can be sync or async depending on
// how the auth token is retrieved
async function getDbAuthToken() {
}
I have a callback which needs to return true or false to trigger some external action. I need to make an API call inside this callback, so I need to get the state and dispatch some actions inside the callback. I don't know if I can use eventChannel, because this callback cannot be a generator, only a plain function. I need to do something like this:
zafClient.on('ticket.save', () => {
  // this is what I want to do, but yield only works inside a generator
  const state = yield select();
  yield put(actionWithApiCall())
  // I need to wait for the action to finish and then return something
  // based on the response from the API.
  // I know how to block a saga to wait for the dispatched action;
  // the problem is that I can't use yield here
  return somethingfromthestore;
});
Btw, this is the Zendesk API.
You're not going to be able to pass a generator function to that API. The workaround is to dispatch an action directly to the Redux store and then write a saga that listens for that action.
zafClient.on('ticket.save', () => reduxStore.dispatch(actionWithApiCall()))
You will have to make the Redux store exportable from where you create it, so that you can access it directly here.
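A rough sketch of the listening saga, assuming actionWithApiCall() produces actions of type 'ACTION_WITH_API_CALL' and that api.call() is an illustrative request helper:

import { call, put, select, takeEvery } from 'redux-saga/effects'

function* handleTicketSave() {
  const state = yield select()
  // make the API call (api.call is an illustrative helper) and put the result back into the store
  const response = yield call(api.call, state)
  yield put({ type: 'API_CALL_DONE', payload: response })
}

export function* watchTicketSave() {
  yield takeEvery('ACTION_WITH_API_CALL', handleTicketSave)
}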
One of the challenges is how to yield in the callback. How about importing dispatch? Isn't dispatch the non-generator equivalent of yield put?
If you are using React, doing so seems to be an option.
In the same saga that you use to specify the callback, you can subsequently listen for the action created in the dispatch/action creator callback. That's done by specifying another watchFor function (in my setup, any number of exported watchFor functions get added to the pool of saga watchers).
The paired saga worker can finalize whatever needs to happen to the data returned from the API call before using your final action creator to "document" the workflow by updating the store.
The other option might be to wrap the api call in an eventChannel (see: https://github.com/redux-saga/redux-saga/issues/1178).
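A rough sketch of that eventChannel approach might look like this (assuming zafClient.off() is available for unsubscribing, and reusing the question's actionWithApiCall); note it still can't return a value synchronously to zafClient:

import { eventChannel } from 'redux-saga'
import { call, put, take } from 'redux-saga/effects'

function createTicketSaveChannel(zafClient) {
  return eventChannel(emit => {
    const handler = () => emit('ticket.save')
    zafClient.on('ticket.save', handler)
    // unsubscribe function, invoked when the channel is closed
    return () => zafClient.off('ticket.save', handler)
  })
}

export function* watchTicketSaveChannel(zafClient) {
  const channel = yield call(createTicketSaveChannel, zafClient)
  while (true) {
    yield take(channel)
    yield put(actionWithApiCall())
  }
}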
Can someone explain when you would use a dispatch versus a commit?
I understand a commit triggers mutation, and a dispatch triggers an action.
However, isn't a dispatch also a type of action?
As you rightly said, $dispatch triggers an action, and commit triggers a mutation. Here is how you can use these concepts:
You always use $dispatch from your methods in routes / components. $dispatch sends a message to your vuex store to do some action. The action may be done anytime after the current tick, so that your frontend performance is not affected.
You never commit from any of your components / routes. It is done only from within an action, and only when you have some data to commit. Reason: commit is synchronous and may freeze your frontend till it is done.
Let's consider this case: you have to fetch some JSON data from the server. You need to do this asynchronously so that your user interface isn't unresponsive or frozen for a while. So you simply $dispatch an action and expect it to be done later. Your action takes up this task, loads the data from the server, and updates your state later.
If you need to know when an action is finished, so that you can display an ajax spinner till then, you may return a Promise as explained below (example: load current user):
Here is how you define the "loadCurrentUser" action:
actions: {
loadCurrentUser(context) {
// Return a promise so that calling method may show an AJAX spinner gif till this is done
return new Promise((resolve, reject) => {
// Load data from server
// Note: you cannot commit here, the data is not available yet
this.$http.get("/api/current-user").then(response => {
// The data is available now. Finally we can commit something
context.commit("saveCurrentUser", response.body) // ref: vue-resource docs
// Now resolve the promise
resolve()
}, response => {
// error in loading data
reject()
})
})
},
// More actions
}
In your mutations handler, you do all the commits originating from actions. Here is how you define the "saveCurrentUser" commit:
mutations: {
saveCurrentUser(state, data) {
Vue.set(state, "currentUser", data)
},
// More commit-handlers (mutations)
}
In your component, when it is created or mounted, you just call the action as shown below:
mounted: function() {
// This component just got created. Lets fetch some data here using an action
// TODO: show ajax spinner before dispatching this action
this.$store.dispatch("loadCurrentUser").then(response => {
console.log("Got some data, now lets show something in this component")
// TODO: stop the ajax spinner, loading is done at this point.
}, error => {
console.error("Got nothing from server. Prompt user to check internet connection and try again")
})
}
Returning a Promise as shown above is entirely optional and also a design decision not preferred by everyone. For a detailed discussion on whether to return a Promise or not, you may read the comments under this answer: https://stackoverflow.com/a/40167499/654825
I have a bootstrapped extension which interacts with the chrome part of Firefox (i.e. even before the content loads), and needs to query an SQLite database for some check. I would prefer a sync call. But, since a sync call is bad in terms of performance and can cause possible UI issues, I need to make an async DB call.
My use case is:
Make an async call to the database
Once completed, do further processing
Now, this can be easily handled by placing the 'further processing' part in the handleCompletion callback of the executeAsync function; for example:
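Roughly like this, where statement is the async storage statement and furtherProcessing() is an illustrative stand-in for that work:

statement.executeAsync({
  handleResult: function(aResultSet) {
    // read rows from the result set
  },
  handleError: function(aError) {
    // log the error; handleCompletion still runs afterwards
  },
  handleCompletion: function(aReason) {
    // runs once the async query finishes (whether it succeeded or failed),
    // so the 'further processing' can be kicked off here
    furtherProcessing();
  }
});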
But I want the 'further processing' to be done irrespective of whether this statement is executed, i.e. the DB lookup may or may not happen. If it doesn't happen, well and good, go ahead. If it does, I need to wait for it.
So I am using a flag-based strategy: I set a flag, handleCompletionCalled, to true in the handleError and handleCompletion callbacks.
In the 'further processing' part, I do:
while (!handleCompletionCalled) {
  // do nothing
}
// further processing
Is this a good strategy or can I do something better ( I don't really want to use Observers, etc. for this since I have many such cases in my entire extension and my code will be filled with Observers)?
Using a while loop to wait is a seriously Bad Idea™. If you do, the result will be that you hang the UI, or, at a minimum, drive CPU usage through the roof by rapidly running through your loop a large number of times as fast as possible.1
The point about asynchronous programming is that you start an action and then another function, a callback, is executed once the activity is completed, or fails. This either allows you to start multiple actions, or to relinquish processing to some other part of the overall code. In general, this callback should handle all activity that is dependent on the completion of the asynchronous action. The callback function, itself, does not have to include the code to do the other processing. After it has done what needs to happen in response to the async action completing, it can just call another function like doOtherProcessing().
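For instance (onAsyncActionComplete and doOtherProcessing are illustrative names):

function onAsyncActionComplete(result) {
  // handle whatever depends on the async result here...
  // ...then hand off to code that does not need to live inside the callback
  doOtherProcessing();
}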
If you launch multiple asynchronous actions, you can then wait for the completion of all of them by having a flag for each task and a single function that is called at the end of all the different callback functions, like:
function continueAfterAllDone(){
if(task1Done && task2Done && task3Done && task4Done) {
//do more processing
}else{
//Not done with everything, yet.
return;
}
}
This could be extended to an arbitrary number of tasks by using an array, or task queue, which the function then checks to see if all of those are completed rather than a hard-coded set of tasks.
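A minimal sketch of that array-based variant (the task names and helpers are illustrative):

var taskFlags = { task1: false, task2: false, task3: false };

// Each task's callback calls markDone() with its own name when it finishes
function markDone(taskName) {
  taskFlags[taskName] = true;
  continueAfterAllDone();
}

function continueAfterAllDone() {
  var allDone = Object.keys(taskFlags).every(function(name) {
    return taskFlags[name];
  });
  if (allDone) {
    // do more processing
  }
}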
Waiting:
If you are going to have another processing path which executes, but then must wait for the completion of the asynchronous action(s), you should have the wait performed by setting up a timer, or interval. You then yield the processor for a specified period of time until you check again to see if the conditions you need to proceed have occurred.
In a bootstrapped add-on, you will probably need to use the nsITimer interface to implement a timeout or interval timer. This is needed because at the time you are running your initialization code it is possible that no <window> exists (i.e. there may be no possibility to have access to a window.setTimeout()).
If you are going to implement a wait for some other task, you could do it something like:
const Cc = Components.classes;
const Ci = Components.interfaces;
var asyncTaskIsDone = false;
var otherProcessingDone = false;
// Define the timer here in case we want to cancel it somewhere else.
var taskTimeoutTimer;
function doStuffSpecificToResultsOfAsyncAction(){
//Do the other things specific to the Async action callback.
asyncTaskIsDone = true;
//Can either call doStuffAfterOtherTaskCompletesOrInterval() here,
// or wait for the timer to fire.
doStuffAfterBothAsyncAndOtherTaskCompletesOrInterval();
}
function doStuffAfterBothAsyncAndOtherTaskCompletesOrInterval(){
if(asyncTaskIsDone && otherProcessingDone){
if(taskTimeoutTimer && typeof taskTimeoutTimer.cancel === "function") {
taskTimeoutTimer.cancel();
}
//The task is done
}else{
//Tasks not done.
if(taskTimeoutTimer){
//The timer expired. Choose to either continue without one of the tasks
// being done, or set the timer again.
}
//}else{ //Use else if you don't want to keep waiting.
taskTimeoutTimer = setTimer(doStuffAfterBothAsyncAndOtherTaskCompletesOrInterval
,5000,false)
//}
}
}
function setTimer(callback,delay,isInterval){
//Set up the timeout (.TYPE_ONE_SHOT) or interval (.TYPE_REPEATING_SLACK).
let type = Ci.nsITimer.TYPE_ONE_SHOT
if(isInterval){
type = Ci.nsITimer.TYPE_REPEATING_SLACK
}
let timerCallback = {
notify: function notify() {
callback();
}
}
var timer = Cc["@mozilla.org/timer;1"].createInstance(Ci.nsITimer);
timer.initWithCallback(timerCallback,delay,type);
return timer;
}
function main(){
//Launch whatever the asynchronous action is that you are doing.
//The callback for that action is doStuffSpecificToResultsOfAsyncAction().
//Do 'other processing' which can be done without results from async task here.
otherProcessingDone = true;
doStuffAfterBothAsyncAndOtherTaskCompletesOrInterval();
}
Initialization code at Firefox startup:
The above code is modified from what I use for delaying some startup actions which do not have to be done prior to the Firefox UI being displayed.
In one of my add-ons, I have a reasonable amount of processing which should be done, but which is not absolutely necessary for the Firefox UI to be shown to the user. [See "Performance best practices in extensions".] Thus, in order to not delay the UI, I use a timer and a callback which is executed 5 seconds after Firefox has started. This allows the Firefox UI to feel more responsive to the user. The code for that is:
const Cc = Components.classes;
const Ci = Components.interfaces;
// Define the timer here in case we want to cancel it somewhere else.
var startupLaterTimer = Cc["@mozilla.org/timer;1"].createInstance(Ci.nsITimer);
function startupLater(){
//Tasks that should be done at startup, but which do not _NEED_ to be
// done prior to the Firefox UI being shown to the user.
}
function mainStartup(){
let timerCallback = {
notify: function notify() {
startupLater();
}
}
startupLaterTimer.initWithCallback(timerCallback, 5000, Ci.nsITimer.TYPE_ONE_SHOT);
}
Note that what is done in startupLater() does not, necessarily, include everything that is needed prior to the add-on being activated by the user for the first time. In my case, it is everything which must be done prior to the user pressing the add-on's UI button, or invoking it via the context menu. The timeout could/should be longer (e.g. 10s), but is 5s so I don't have to wait as long for testing while in development. Note that there are also one-time/startup tasks that can/should be done only after the user has pressed the add-on's UI button.
1. A general programming issue here: in some programming languages, if you never yield the processor from your main code, your callback may never be called. In that case, you will just lock up in the while loop and never exit.
I want to model the following async logic using redux:
User action triggers a chain of async API calls.
Any API call might return 401 status (login timed out)
If API responds with 401, display re-login popup
On successful re-login, reissue API call and continue
I am not sure where to put this logic. Actions don't know about other actions, they only have access to dispatch, so they can't stop and wait for them to complete. Reducers don't have access to dispatch, so I can't put it there… so where does it live? Custom middleware? store.listen? In a smart component?
I'm currently using redux-promise-middleware & redux-thunk. How would one best organise this type of flow – without requiring buy-in into something like redux-saga or redux-rx, etc?
Also, I'm not sure of the best way to transparently interrupt the API call to perform those other actions, i.e. the API call shouldn't trigger its completed or failed actions until after the optional login process completes.
It sounds to me like you'd want an action creator that generates a thunk, and to keep all that logic in the thunk. There's really no other good way to preserve the association between your suite of API calls and ensure that all the others are cancelled if one fails.
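For instance, the thunk action creator might be shaped roughly like this (fetchDashboardData is an illustrative name):

// Illustrative sketch: the whole suite of calls lives inside one thunk
const fetchDashboardData = () => (dispatch, getState) => {
  // fire the API calls, collect their promises, watch them with Promise.all,
  // and on a 401 cancel the rest and trigger re-login (details below)
};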
In that Thunk, you'd fire your API calls, and collect their promises:
const call1 = promiseGenerator1();
const call2 = promiseGenerator2();
const call3 = promiseGenerator3();
const allCallPromises = [call1, call2, call3];
Use an all() promise handler to monitor them:
const watcher = Promise.all(allCallPromises).then(allSuccess, anyFail);
Your fail handler will:
cancel the rest of the promises if any of them 401's. (Note, this requires a library like Bluebird that has cancellation semantics, or some other form of augmentation of your promise/request.)
dispatch an action or route-change to trigger the re-login window
const anyFail = (error) => {
  if (error.status === 401) {
    // cancel() assumes cancellable promises (e.g. Bluebird), as noted above
    allCallPromises.forEach((item) => { item.cancel(); });
    reLogin();
  }
}
Then, I'd be inclined to let your relogin component worry about re-firing that same complex action again, to issue all the calls.
However, should your suite of API calls be somehow variable or context-specific, you could cache on the store the ones you need, from inside the anyFail handler. Have a reducer where you can stash an actionPendingReLogin. Compose an action that will re-fire the same calls as last time, and then dispatch it:
dispatch(createAction('CACHE_RELOGIN_ACTION', actionObjectToSaveForLater));
(Or, just cache whatever action-creator you used.)
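A minimal sketch of that stash reducer, assuming a redux-actions-style createAction that puts the second argument on action.payload (the CLEAR_RELOGIN_ACTION type is illustrative):

function actionPendingReLogin(state = null, action) {
  switch (action.type) {
    case 'CACHE_RELOGIN_ACTION':
      // stash the action (or action creator) to re-fire after re-login
      return action.payload;
    case 'CLEAR_RELOGIN_ACTION':
      return null;
    default:
      return state;
  }
}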
Then, following successful relogin, you can:
const action = store.getState().actionPendingReLogin;
dispatch(action);
// or:
const actionCreator = store.getState().actionPendingReLogin;
dispatch(actionCreator());
Oh: and in your allSuccess handler you'd simply dispatch the results of the async calls.