Is it possible to wait for all actions in the ofType operator to be called before calling next? - javascript

I have an array of redux actions; when ALL of them have been dispatched I want to dispatch another action from my epic. The actions in the array are changeable and dynamic. What I'm hoping to find is the equivalent of the ofType operator working like this:
ofType(FETCH_USER_PROFILE_SUCCESS && FETCH_USER_PREFERENCES_SUCCESS)
When both actions have been called, another is called further down the pipe. Of course the ofType operator doesn't work like this. It only needs one of the actions to be called to continue.
I have read that combineLatest might be the solution here but I've had no luck. Below is what I imagine the solution could look like but obviously this doesn't work for me. Any suggestions would be appreciated.
const actions = ['FETCH_USER_PROFILE_SUCCESS', 'FETCH_USER_PREFERENCES_SUCCESS'];
const asTypes = actions.map((action) => ofType(action));

const userEpic = (action$, state$) =>
  action$.pipe(
    combineLatest(asTypes),
    mergeMap(() =>
      of({
        type: 'SET_USER_READY'
      })
    )
  );
Following on from replies below:
Just to clarify what I meant by dynamic: the array of actions might include more actions depending on a user's profile (e.g. 'FETCH_USER_POSTS_SUCCESS'), essentially an unknown number of actions depending on the user.
So, is it possible to generate this dynamically:
zip(
action$.pipe(ofType('FETCH_USER_PROFILE_SUCCESS')),
action$.pipe(ofType('FETCH_USER_PREFERENCES_SUCCESS')),
)
const actions = ['FETCH_USER_PROFILE_SUCCESS', 'FETCH_USER_PREFERENCES_SUCCESS', /* unknown number of more actions */];
const actionStreams = actions.map((action) => action$.pipe(ofType(action)));

zip(
  ...actionStreams
)

Why not make one general action, say executeAllActions, and in its payload pass an array containing references to all the actions you want, like:
const executeAllActions = {
  type: 'executeAllActions',
  actionsList: [ fetchCategories, fetchProducts, ...],
}
Then you listen for this action:
this.actions$.pipe(
  ofType(executeAllActions),
  concatMap(action => [ ...action.actionsList ])
)
where we use concatMap to return a list of actions that will all be dispatched.
I wish I could try this idea right now, but I think it should be possible.
If it works for you please let me know.
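In redux-observable terms (rather than NgRx), a rough sketch of the same idea could look like the following; the string action type and the assumption that actionsList holds ready-made action objects are mine, not part of the answer (ofType from redux-observable, from and concatMap from rxjs):

const executeAllEpic = (action$) =>
  action$.pipe(
    ofType('EXECUTE_ALL_ACTIONS'),
    // re-emit every action object carried in the payload, in order
    concatMap((action) => from(action.actionsList))
  );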

There are numerous ways of approaching this because some important details are missing, but I'll make some guesses :)
The biggest is whether the order in which they arrive matters. If it does not, using zip() is probably what you want. It will pair emissions 1:1 (actually for any number of observables, not just two, e.g. 1:1:1 for three, etc.)
Even in that case, there's the question of what you want to do if the rate at which you receive the actions varies between them. What happens if you receive FETCH_USER_PROFILE_SUCCESS twice as often as FETCH_USER_PREFERENCES_SUCCESS? Should you continue to buffer them? Do you drop them? Do you error?
If buffering is fine, or you can otherwise guarantee they will always arrive at the same rate, zip() works.
Now if you want to make sure they come in some particular sequence, things get more complicated still. You'd likely have a chain utilizing mergeMap, switchMap, exhaustMap, or concatMap. Which one depends, again, on what you want to do if the rate at which you receive the actions isn't always the same.
import { zip } from 'rxjs';
import { map, take, switchMap } from 'rxjs/operators';
import { ofType } from 'redux-observable';

// Pair up the emissions 1:1 but without regard to the sequence
// in which they are received i.e. they can be in either order
const userEpic = (action$, state$) =>
  zip(
    action$.pipe(ofType('FETCH_USER_PROFILE_SUCCESS')),
    action$.pipe(ofType('FETCH_USER_PREFERENCES_SUCCESS')),
    // ...etc more observables
  ).pipe(
    map(() => ({
      type: 'SET_USER_READY'
    }))
  );
// FETCH_USER_PROFILE_SUCCESS must come before FETCH_USER_PREFERENCES_SUCCESS
// and if we receive another FETCH_USER_PROFILE_SUCCESS before we get
// the FETCH_USER_PREFERENCES_SUCCESS, start over again (switchMap)
const userEpic = (action$, state$) =>
  action$.pipe(
    ofType('FETCH_USER_PROFILE_SUCCESS'),
    switchMap(() =>
      action$.pipe(
        ofType('FETCH_USER_PREFERENCES_SUCCESS'),
        take(1) // important! we only want to wait for one
      )
    ),
    map(() => ({
      type: 'SET_USER_READY'
    }))
  );
The actions in the array are changeable, dynamic
It's not clear how you want them to change, so I didn't include a mechanism for doing so.
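That said, if the list of types is only known at runtime (e.g. built from the user's profile), one hedged sketch is to map the array into per-type streams and spread them into zip. The take(1) is my assumption so that this particular epic fires once, after the first full set of successes has arrived:

const types = ['FETCH_USER_PROFILE_SUCCESS', 'FETCH_USER_PREFERENCES_SUCCESS' /* , ...more, built at runtime */];

const userEpic = (action$, state$) =>
  zip(
    ...types.map((type) => action$.pipe(ofType(type), take(1)))
  ).pipe(
    map(() => ({ type: 'SET_USER_READY' }))
  );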
--
There's also a similar question here: https://stackoverflow.com/a/45554647/1770633

Related

Redux - How to handle multiple actions of the same type at the same time

I have a list of songs, and every row has a like functionality. When I click the like button a LIKE_SONG action is triggered, then an async API request is made and we receive either a LIKE_SONG_SUCCESS action or a LIKE_SONG_FAILURE action with an error.
To handle different error/success/loading states I have a BaseStore that looks like this:
export type ActionType<T> = {
  type: string,
  payload?: T,
};

export type ActionStore = {
  actions: Array<ActionType<any>>,
  errorActions: Array<ActionType<any>>,
  successActions: Array<ActionType<any>>,
};
and I specify start/stop/error action creators that take any action as an argument and store them in the ActionStore. So in ALL of my sagas I have something like this:
yield put(startAction(someAction));
try {
  // do something
} catch (error) {
  yield put(errorAction(someAction, error));
} finally {
  yield put(stopAction(someAction));
}
Next, I've created some custom selectors for picking up the actions that are loading/failed/succeeded, e.g.:
export const checkLoadingSelector = (
  state: ActionStore,
  actionsToCheck: Array<string>
): boolean => {
  const { actions } = state;
  return actions.some((action) => {
    return actionsToCheck.includes(action.type);
  });
};
Which I'm using later in some component containers like this:
const mapStateToProps = ({ actionStore }) => {
  return {
    isSongLikeLoading: checkLoadingSelector(actionStore, [songActions.LIKE_SONG]),
  };
};
This works fine most of the time. The PROBLEM starts when there are multiple actions with the same name. Let's say I quickly click the like buttons of 5 different songs and trigger 5 LIKE_SONG actions. There are now 2 problems:
The isSongLikeLoading from mapStateToProps will stay TRUE until the last of those 5 requests finishes, which is not good.
When the first LIKE_SONG action completes, stopAction(someAction) is triggered, which removes a LIKE_SONG action from the ActionStore actions array. The problem again is that LIKE_SONG is not unique, and at that point we have 5 LIKE_SONG actions, so we can remove the wrong one, which will produce a wrong UI state.
I don't want a working code solution for this problem; that's not what I'm searching for. I want to start a discussion on how to properly design my stores/actions to easily handle (and distinguish) multiple actions of the same type triggered at the same time, to provide a good user experience in my app. Most of the articles I've read tackle really simple situations that do not have multiple actions of the same type at the same time, so I don't know whether I'm doing something completely wrong, whether this issue simply doesn't exist in other projects, or what the case is.
My current idea for solving this issue is to add a unique id to those actions.
export type ActionType<T> = {
  id: string, // unique id, generated probably using a uuid() method
  type: string,
  payload?: T,
};
There are however two problems with this approach:
I'm not sure if this id field is a valid field; going through the Redux docs I see that actions should only have {type, payload, meta, error} fields.
The checkLoadingSelector would not be as convenient to use anymore. Instead of the type, the user would need to pass a unique action id, which they would first need to get by dispatching some action like:
const mapDispatchToProps = (dispatch) => {
  return {
    /* Currently I return void here. In the solution I have in mind I would
       return a string (the unique id of the action); not sure if this is
       possible, however. Then in the component I would store the ongoing ids
       and base my loading state on those ids rather than on the action type
       (which is not unique). */
    likeSong: (songId): string => {
      dispatch(likeSongAction(songId));
    },
  };
};
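For illustration, the kind of tagged action creator I have in mind would look roughly like this (assuming some uuid() helper, and putting the id under meta to stay closer to the standard {type, payload, meta, error} shape):

const likeSongAction = (songId) => ({
  type: songActions.LIKE_SONG,
  payload: { songId },
  meta: { requestId: uuid() }, // unique per dispatch, so 5 parallel LIKE_SONGs stay distinguishable
});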
Would this be a good way? Do you have some thoughts about it?
Just use batch:
import { batch } from 'react-redux'

function myThunk() {
  return (dispatch, getState) => {
    // should only result in one combined re-render, not two
    batch(() => {
      dispatch(increment())
      dispatch(increment())
    })
  }
}
Your requirement is actually a good fit for redux-saga. I think there are 4 ways to solve it.
First, you can take only the latest request using takeLatest. (https://redux-saga.js.org/docs/api#takelatestpattern-saga-args)
Second, you can use takeLeading on the trigger action to handle only the leading request. (https://redux-saga.js.org/docs/api#takeleadingpattern-saga-args)
Third, you can fork all requests and wait for all of them to complete with fork. (https://redux-saga.js.org/docs/api#forkfn-args)
And lastly (this is the option I recommend), using spawn you can start completely independent requests detached from the saga tree. (https://redux-saga.js.org/docs/api#spawnfn-args)
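A rough sketch of the third option (fork everything, then wait); the likeSong worker, the endpoint, and the action shape are assumptions for illustration, not code from the question:

import { all, call, fork, join, put } from 'redux-saga/effects';

// hypothetical API call for a single like request
const likeSongApi = (songId) => fetch(`/api/songs/${songId}/like`, { method: 'POST' });

function* likeSong(songId) {
  yield call(likeSongApi, songId);
}

function* likeManySongs(action) {
  // fork one task per song so they run concurrently, then wait for all of them to finish
  const tasks = yield all(action.payload.songIds.map((id) => fork(likeSong, id)));
  yield join(tasks);
  yield put({ type: 'LIKE_SONGS_DONE' });
}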

Redux-Observable: modify state and trigger follow up action

I have the following scenario in redux-observable. I have a component which detects which backend to use and should set the backend URL used by the api-client. Both the client and URL are held in the global state object.
The order of execution should be:
1. check backend
2. on error replace backend URL held in state
3. trigger 3 actions to load resources using new backend state URL
What I did so far is, in step 1, access the state$ object from within my epic and modify the backend URL. This only half works: the state is updated, but the actions triggered in step 3 still see the old state and use the wrong backend.
What is the standard way to update state in between actions if you depend on the order of execution?
My API-Epic looks like this:
export const authenticate = (action$, state$) => action$.pipe(
  ofType(actions.API_AUTHENTICATE),
  mergeMap(action =>
    from(state$.value.apiState.apiClient.authenticate(state$.value.apiState.bearer)).pipe(
      map(bearer => apiActions.authenticatedSuccess(bearer))
    )
  )
)

export const authenticatedSuccess = (action$, state$) => action$.pipe(
  ofType(actions.API_AUTHENTICATED_SUCCESS),
  concatMap(action => concat(
    of(resourceActions.doLoadAResource()),
    of(resourceActions.doLoadOtherResource()),
    of(resourceActions.doLoadSomethingElse()))
  )
)
A common approach I've found users discussing on GitHub & StackOverflow is chaining multiple epics, much like what I believe your example tries to demonstrate. The first epic dispatches an action when it's "done". A reducer listens for this action and updates the store's state. A second epic (or many additional epics, if you want concurrent operations) listens for this same action and kicks off the next sequence of the workflow. The secondary epics run after the reducers and thus see the updated state. From the docs:
Epics run alongside the normal Redux dispatch channel, after the reducers have already received them...
I have found the chaining approach works well to decouple phases of a larger workflow. You may want the decoupling for design reasons (such as separation of concerns), to reuse smaller portions of the larger workflow, or to make smaller units for easier testing. It's an easy approach to implement when your epic is dispatching actions in between the different phases of the larger workflow.
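A minimal sketch of that chaining, with made-up action names (operators from rxjs / rxjs/operators, ofType from redux-observable):

// Epic 1: run the check; on failure, emit the action the reducer uses to replace the URL
const checkBackendEpic = (action$) => action$.pipe(
  ofType('CHECK_BACKEND'),
  mergeMap(() => of({ type: 'BACKEND_REPLACED' })) // reducer swaps in the fallback URL
);

// Epic 2: runs after the reducer has handled BACKEND_REPLACED, so it sees the new URL
const loadResourcesEpic = (action$, state$) => action$.pipe(
  ofType('BACKEND_REPLACED'),
  withLatestFrom(state$),
  map(([, state]) => ({ type: 'LOAD_RESOURCES', backend: state.apiState.backend }))
);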
However, keep in mind that state$ is an observable. You can use it to get the current value at any point in time -- including between dispatching different actions inside a single epic. For example, consider the following and assume our store keeps a simple counter:
export const workflow = (action$, state$) => action$.pipe(
  ofType(constants.START),
  withLatestFrom(state$),
  mergeMap(([action, state]) => // "state" is the value when the START action was dispatched
    concat(
      of(actions.increment()),
      state$.pipe(
        first(),
        map(state => // this new "state" is the _incremented_ value!
          actions.decrement()),
      ),
      defer(() => {
        const state = state$.value // this new "state" is now the _decremented_ value!
        return empty()
      }),
    ),
  ),
)
There are lots of ways to get the current state from the observable!
Regarding the following line of code in your example:
state$.value.apiState.apiClient.authenticate(state$.value.apiState.bearer)
First, passing an API client around using the state is not a common/recommended pattern. You may want to look at injecting the API client as a dependency to your epics (this makes unit testing much easier!). Second, it's not clear how the API client is getting the current backend URL from the state. Is it possible the API client is using a cached version of the state? If yes, you may want to refactor your authenticate method and pass in the current backend URL.
Here's an example that handles errors and incorporates the above:
/**
 * Let's assume the state looks like the following:
 * state: {
 *   apiState: {
 *     backend: "URL",
 *     bearer: "token"
 *   }
 * }
 */

// Note how the API client is injected as a dependency
export const authenticate = (action$, state$, { apiClient }) => action$.pipe(
  ofType(actions.API_AUTHENTICATE),
  withLatestFrom(state$),
  mergeMap(([action, state]) =>
    // Try to authenticate against the current backend URL
    from(apiClient.authenticate(state.apiState.backend, state.apiState.bearer)).pipe(
      // On success, dispatch an action to kick off the chained epic(s)
      map(bearer => apiActions.authenticatedSuccess(bearer)),
      // On failure, dispatch two actions:
      // 1) an action that replaces the backend URL in the state
      // 2) an action that restarts _this_ epic using the new/replaced backend URL
      catchError(error$ => of(apiActions.authenticatedFailed(), apiActions.authenticate()))
    )
  )
)
export const authenticatedSuccess = (action$, state$) => action$.pipe(
  ofType(actions.API_AUTHENTICATED_SUCCESS),
  ...
)
Additionally, keep in mind when chaining epics that constructs like concat will not wait for the chained epics to "finish". For example:
concat(
  of(resourceActions.doLoadAResource()),
  of(resourceActions.doLoadOtherResource()),
  of(resourceActions.doLoadSomethingElse())
)
If each of these doLoadXXX actions "starts" an epic, all three will likely run concurrently. Each action will be dispatched one after another, and each epic will "start" running one after another without waiting for the previous one to "finish". This is because epics never really complete; they're long-lived, never-ending streams. You will need to explicitly wait on some signal that identifies when doLoadAResource completes if you want doLoadOtherResource to run after doLoadAResource.
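If you do need that ordering, a rough sketch is to dispatch the first load and then wait for whatever "done" action your loadAResource epic emits before dispatching the rest; the LOAD_A_RESOURCE_SUCCESS name here is an assumption:

concat(
  of(resourceActions.doLoadAResource()),
  action$.pipe(
    ofType('LOAD_A_RESOURCE_SUCCESS'), // assumed signal that the first load finished
    take(1),
    mergeMap(() => concat(
      of(resourceActions.doLoadOtherResource()),
      of(resourceActions.doLoadSomethingElse())
    ))
  )
)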

RxJs operator with unconditional execution in the end of each iteration

Is there an Rx operator or composition to guarantee some logic executes last for each observable emission?
Let's assume the following context:
endless or continuous sequence
conditional logic like filter() to skip some emissions
some logic in the end of each iteration in a doAlways()-like operator
Please refer to numbered comments in the code-sample below
Notes:
finalize() would require the sequence to terminate (violates p.1)
iif() or a regular if inside switchMap() is an option, but it makes the code less readable
Code snippet to illustrate: step (3) should always execute, per iteration, last, i.e. we always want the start and finish logs in a doAlways()-like operator instead of tap()
import { of, interval } from 'rxjs';
import { tap, filter, switchMap } from 'rxjs/operators';

const dataService = { update: (z) => of(z /* posts data to back-end */) };

const sub = interval(1000).pipe( // <-- (1)
  tap(a => console.log('starting', a)),
  filter(b => b % 100 === 0), // <-- (2)
  switchMap(c => dataService.update(c)),
  tap(d => console.log('finishing', d)) // <-- (3) should always execute, even when (2) filters the value out
)
.subscribe(x => console.log(x));
No such operator exists, and that's because it can't exist. Once you've filtered out a value, the resulting observable simply doesn't emit it anymore. Any operator "downstream" simply doesn't know about its existence.
To illustrate, you included a switchMap to some service which depends on the emitted value. For obvious reasons that operator cannot logically be applied if there is no value to switch on.
You would have to "tag" each value instead of filtering it, and defer the filter until after the tap call, but even then scenarios like switching to another observable would require more detailed requirements.
Think of the observable as a conveyor belt on which you place items. Each operator is a room through which the belt leads. Inside each room a worker can decide what to do with each item: modify it, take it away, put new items in instead etc. However, each worker only sees what the conveyor belt brings along — they don't know what other rooms came before them or what has been done there.
To achieve what you want, the worker in the last room would have to know about an item they never received, which would require additional knowledge they don't have.
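A rough sketch of that "tag instead of filter" idea (my own illustration, not the asker's code): keep every item, mark whether it should hit the service, and let the final tap see everything:

interval(1000).pipe(
  tap(a => console.log('starting', a)),
  map(b => ({ value: b, keep: b % 100 === 0 })), // tag instead of dropping
  switchMap(x => x.keep ? dataService.update(x.value) : of(x.value)),
  tap(d => console.log('finishing', d)) // now runs for every emission
)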
Nope, what you ask for is not possible, because filtered notifications are never passed on. You would need a totally new type of notification that is not consumed by any operator other than the one you describe.
Now, here is an idea that is probably not recommended: you can misuse the error notification to skip some operators, but that will interfere with any other error handling, so it's not something you should do...
const sub = interval(1000).pipe(
  tap(a => console.log('starting', a)),
  mergeMap(b => b % 100 === 0 ? of(b) : throwError(b)),
  switchMap(c => dataService.update(c)),
  catchError(b => of(b)),
  tap(d => console.log('finishing', d))
)
Note that we don't use filter but map to either a next or an error notification depending on the condition. Errors will naturally be ignored by most operators and consumed by tap, subscribe or catchError. This is a way to put a tag on the item so the workers know they shouldn't touch it (in the analogy described by Ingo)
I would split it into 2 streams with partition, and then merge them back.
https://stackblitz.com/edit/rxjs-8z9jit?file=index.ts
import { of, interval, merge } from 'rxjs';
import { tap, switchMap, partition, share } from 'rxjs/operators';

const dataService = { update: z => of(z /* posts data to back-end */) };

const [sub, rest] = interval(1000).pipe(
  tap(a => console.log('starting', a)),
  share(),
  partition(b => b % 2 === 0)
);

merge(
  sub.pipe(
    switchMap(c => dataService.update(c)),
    tap(() => console.log('Ok'))
  ),
  rest
)
  .pipe(tap(d => console.log('finishing', d)))
  .subscribe(x => console.log(x));
A possible work-around is to eliminate the .filter() of course.
Given:
observable.pipe(
  filter(() => condition),
  switchMap(p => service.otherObservable(p)),
  tap(d => console.log('not triggered when condition is false'))
)
Can be rewritten as:
observable.pipe(
  switchMap(p =>
    (!condition) ?
      of(null) :
      service.otherObservable(p)),
  tap(d => console.log('always triggered'))
)

RxJS share parent observable among partitioned child observables

I'm coding a game in which the character can fire their weapon.
I want different things to happen when the player tries to fire, depending on whether they have ammo.
I reduced my issue down to the following code (by the way, I'm not sure why SO's snippet feature doesn't work, so I made a CodePen where you can try out my code).
const { from, merge } = rxjs;
const { partition, share, tap } = rxjs.operators;

let hasAmmo = true;

const [fire$, noAmmo$] = from([true]).pipe(
  share(),
  partition(() => hasAmmo),
);

merge(
  fire$.pipe(
    tap(() => {
      hasAmmo = false;
      console.log('boom');
    }),
  ),
  noAmmo$.pipe(
    tap(() => {
      console.log('bam');
    }),
  )
).subscribe({
  next: val => console.log('next', val),
  error: val => console.log('error', val),
  complete: val => console.log('complete', val),
});
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/6.3.3/rxjs.umd.js"></script>
When I run this code I get the following:
"boom"
"next" true
"bam"
"next" true
"complete" undefined
I don't understand why I get a "bam".
The first emission goes to fire$ (I get a "boom"), which makes sense because hasAmmo is true. But a side-effect of fire$ emitting is that the result of the partition condition changes, which I guess is what causes the "bam".
Am I not supposed to cause side-effects that affect partition()?
Or maybe is there an issue with the way I share() my parent observable? I may be wrong but I would intuitively think the fire$ and noAmmo$ internally subscribe to the parent in order to split it, in which case share() should work?
It actually works correctly. The confusion comes from the partition operator which is basically just two filter operators.
If you rewrite it without partition it looks like this:
const fire$ = from([true]).pipe(
  share(),
  filter(() => hasAmmo),
);

const noAmmo$ = from([true]).pipe(
  share(),
  filter(() => !hasAmmo),
);
Be aware that changing hasAmmo has no effect on partition itself. partition acts only when it receives a value from its source Observable.
When you later use merge() it makes two separate subscriptions to two completely different chains with two different from([true])s. This means that true is passed to both fire$ and noAmmo$.
So share() has no effect here. If you want to share the source, you'll have to wrap from before using it in fire$ and noAmmo$. If the source Observable is just from, it's unfortunately going to be even more confusing, because the initial emission will arrive only at the first subscriber (which is fire$) when later used in merge:
const shared$ = from([true]).pipe(
  share(),
);

const fire$ = shared$.pipe(...);
const noAmmo$ = shared$.pipe(...);
The last reason you're receiving both messages is that partition doesn't modify the value that goes through. It only decides which of the returned Observables will re-emit it.
By the way, rather avoid partition completely, because it's probably going to be deprecated, and just use filter, which is more obvious:
https://github.com/ReactiveX/rxjs/issues/3797
https://github.com/ReactiveX/rxjs/issues/3807

How to chain async actions and wait for the result without store.dispatch

I'm trying to write my INITIALIZE action, which should chain some async actions together in the following way:
Call the initialize action.
Call two async actions simultaneously.
Wait for the completion of above actions.
Run additional one action.
Finish initialization.
here is the redux flow that I expect
INITIALIZATION_STARTED => ASYNC_ACTION_A_STARTED AND ASYNC_ACTION_B_STARTED => ASYNC_ACTION_A_FINISHED AND ASYNC_ACTION_B_FINISHED => ASYNC_ACTION_C_STARTED => ASYNC_ACTION_C_FINISHED => INITIALIZATION_FINISHED
I managed to achieve that flow using store.dispatch inside my epic. I know that this is an anti-pattern and it will be removed in version 1.0.0, so I would like to know how I can do it using pure epics.
My working solution
export const initEpic = (action$: ActionsObservable<Action>, store) =>
  action$.filter(actions.initialization.started.match)
    .switchMap(action => (
      Observable.forkJoin(
        waitForActions(action$, actions.asyncA.done, actions.asyncB.done),
        Observable.of(
          store.dispatch(actions.asyncA.started(action.payload)),
          store.dispatch(actions.asyncB.started(action.payload)),
        )
      ).map(() => actions.asyncC.started(action.payload))
    )
  );

const waitForActions = (action$, ...reduxActions) => {
  const actionTypes = reduxActions.map(x => x.type);
  const obs = actionTypes.map(type => action$.ofType(type).take(1));
  return Observable.forkJoin(obs);
}
I have also been trying to use forkEpic from this comment, like so:
export const initEpic = (action$: ActionsObservable<Action>, store) =>
  action$.filter(actions.initialization.started.match).mergeMap(action =>
    forkEpic(loadTagsEpic, store, actions.asyncA.started(action.payload))
      .concat(
        forkEpic(loadBranchesEpic, store, actions.asyncB.started(action.payload))
      )
      .map(() => actions.asyncC.started(action.payload))
  );
but it doesn't dispatch the starting actions ASYNC_ACTION_A_STARTED and ASYNC_ACTION_B_STARTED.
Sounds like merge is perfect for this. You'll start listening for asyncA.done and asyncB.done and then while waiting you'll kick off the requests by emitting asyncA.started and asyncB.started. These two streams are merged together as one, so it happens in the correct order and the actions emitted by either are emitted by our epic without needing store.dispatch.
const initEpic = action$ =>
  action$.filter(actions.initialization.started.match)
    .switchMap(action => (
      Observable.merge(
        waitForActions(action$, actions.asyncA.done, actions.asyncB.done)
          .map(() => actions.asyncC.started(action.payload)),
        Observable.of(
          actions.asyncA.started(action.payload),
          actions.asyncB.started(action.payload),
        )
      )
    )
  );
Here is a JSBin demoing: https://jsbin.com/yonohop/edit?js,console
It doesn't do any of the ASYNC_ACTION_C_FINISHED and INITIALIZATION_FINISHED stuff because the code for that was not included in the question, so I'm not sure what it would have done. 😁
You might notice this is mostly a regular RxJS question where the items streaming through happen to be actions. This is really helpful, because it means you can ask the entire RxJS community for help if you frame the question as generic RxJS.
Note that I listened for done before starting; this is generally a best practice in case done is emitted synchronously after started. If you didn't listen first, you'd miss it. Since it's async here it doesn't matter, but it's still generally a best practice and helpful when you unit test.
