Need to combine two impure observables - javascript

I need to make 2 AJAX requests to the same endpoint, one returning filtered and one returning unfiltered data. Then I need to combine the results and use them both in processing.
loadUnfilteredData() {
  // remember status
  const { status } = this.service.filters;
  delete this.service.filters.status;
  this.service.saleCounts$()
    .subscribe((appCounts) =>
      this.processUnfilteredData(appCounts)
    );
  // restore status
  if (status) {
    this.service.filters.status = status;
  }
}

loadFilteredData() {
  this.service.saleCounts$()
    .subscribe((appCounts) =>
      this.processFilteredData(appCounts)
    );
}
The problem is that this.service.saleCounts$() is impure: instead of taking arguments, it just uses this.service.filters.
That's why I have to store the status, then delete it from the filter, then do the request, and then restore it (because the same filter is used by other requests).
So I can't just do combineLatest over two observables (because I need to restore).
Is there any workaround?
(P.S. I know the approach is disgusting, I know about state management and about pure functions. I just wanted to know whether there is a beautiful solution.)

I believe your constraints require that the two operations are run sequentially, one after the other, rather than in parallel, as is generally the case when we're using combineLatest.
To run two Observables sequentially, we can use switchMap (as an operator inside a pipe call in modern rxjs):
doFirstOperation()
  .pipe(
    switchMap(result => doSecondOperation())
  );
One potential issue with that is that you lose access to the result of doFirstOperation when you switchMap it to the result of doSecondOperation. To work around that, we can do something like:
doFirstOperation()
  .pipe(
    switchMap(firstResult =>
      doSecondOperation().pipe(
        map(secondResult => [firstResult, secondResult])
      )
    )
  );
i.e., use map to change the returned value of switchMap to be an array including both values.
Putting this together with your "disgusting" requirements for state management, you could use something like:
loadData() {
  const { status } = this.service.filters;
  delete this.service.filters.status;
  return this.service
    .saleCounts$() // unfiltered query (status removed above)
    .pipe(
      finalize(() => {
        // safety net: restore the filter even if the request errors
        if (status) {
          this.service.filters.status = status;
        }
      }),
      switchMap(unfilteredData => {
        // restore the filter before issuing the second, filtered query
        if (status) {
          this.service.filters.status = status;
        }
        return this.service
          .saleCounts$() // filtered query
          .pipe(map(filteredData => [unfilteredData, filteredData]));
      })
    )
    .subscribe(([unfilteredData, filteredData]) => {
      this.processFilteredData(filteredData);
      this.processUnfilteredData(unfilteredData);
    });
}
I'm not sure many people would categorize that as beautiful, but it does at least allow you to get results in a way that looks like you used combineLatest, while working around the constraints imposed by your impure method.

Related

Refactoring chained RxJs subscriptions

I have a piece of code that I need to refactor because it's a hell of chained subscriptions.
ngOnInit(): void {
  this.dossierService.getIdTree()
    .subscribe(idTree => {
      this.bootstrappingService.refreshObligations(idTree)
        .subscribe(() => {
          this.dossierPersonsService.retrieveDossierPersons(idTree)
            .subscribe(debtors => {
              this.retrieveObligations();
              this.debtors = debtors;
            });
        });
    });
}
The first call dossierService.getIdTree() retrieves idTree which is used by other services except obligationsService.retrieveObligations().
All service methods should be executed in the order they are executed now. But retrieveDossierPersons and retrieveObligations can be executed in parallel.
retrieveObligations() is a method that subscribes to another observable. This method is used in a few other methods.
I've refactored it and it seems to work. But did I refactor it in a proper way, or can my code be improved?
this.dossierService.getIdTree()
  .pipe(
    map(idTree => {
      this.idTree = idTree;
    }),
    switchMap(() => {
      return this.bootstrappingService.refreshObligations(this.idTree);
    }),
    switchMap(() => {
      return this.dossierPersonsService.retrieveDossierPersons(this.idTree);
    })
  )
  .subscribe(debtors => {
    this.retrieveObligations();
    this.debtors = debtors;
  });
Something like this (not syntax checked):
ngOnInit(): void {
  this.dossierService.getIdTree().pipe(
    switchMap(idTree =>
      this.bootstrappingService.refreshObligations(idTree).pipe(
        switchMap(() => this.dossierPersonsService.retrieveDossierPersons(idTree).pipe(
          tap(debtors => this.debtors = debtors)
        )),
        switchMap(() => this.retrieveObligations())
      )
    )
  ).subscribe();
}
Using a higher-order mapping operator (switchMap in this case) will ensure that the inner observables are subscribed to and unsubscribed from for you.
In this example, you don't need to separately store idTree because you have access to it down the chained pipes.
You could try something like:
ngOnInit(): void {
  const getIdTree$ = () => this.dossierService.getIdTree();
  const getObligations = idTree => this.bootstrappingService.refreshObligations(idTree);
  const getDossierPersons = idTree => this.dossierPersonsService.retrieveDossierPersons(idTree);

  getIdTree$().pipe(
    switchMap(idTree => forkJoin({
      obligations: getObligations(idTree),
      debtors: getDossierPersons(idTree),
    }))
  ).subscribe(({ obligations, debtors }) => {
    // this.retrieveObligations(); // seems like a duplicate of refreshObligations?
    this.debtors = debtors;
  });
}
Depending on the rest of the code and on the template, you might also want to avoid unwrapping debtors by employing the async pipe instead.
forkJoin will only complete when all of its streams have completed.
You might also want to employ some error handling by piping catchError to each inner observable.
Instead of forkJoin you might want to use merge or concat (they take a list of observables rather than an object) - this depends a lot on the logic and the UI. concat will preserve the sequence, merge will not - in both cases, data could be displayed cumulatively, as it arrives. With forkJoin, when one request gets stuck, the whole stream gets stuck, so you won't be able to display anything until all streams have completed.
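One possible shape for that per-stream error handling, reusing the helpers defined above (the fallback values here are placeholder assumptions, not from the question):
getIdTree$().pipe(
  switchMap(idTree => forkJoin({
    // catchError per inner stream, so one failure no longer kills the whole forkJoin
    obligations: getObligations(idTree).pipe(catchError(() => of(null))),
    debtors: getDossierPersons(idTree).pipe(catchError(() => of([]))),
  }))
).subscribe(({ obligations, debtors }) => {
  // a null / empty value here means that particular request failed
  this.debtors = debtors;
});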
You can use switchMap, or, to guarantee the order of execution, concatMap:
obs1$.pipe(
  switchMap(data1 => obs2$.pipe(
    switchMap(data2 => obs3$)
  ))
);

Observable map property based on another observable

Let's say I have an observable that emits employees.
$employees
I wish to manipulate a sub-property of each employee based on an Observable. For the example, let's say the display name.
Currently I'm performing the task like this:
const getDisplayName = (emp) => {}; // returns an observable

const mapFn = async (emp) => {
  emp.displayName = await getDisplayName(emp).toPromise();
  return emp;
}

$employees
  .pipe(mergeMap(mapFn));
I think my confusion is that, to my understanding, we have two streams: the root $employees and getDisplayName. My understanding is that with the various merge operators, the root value would be replaced by the value of the secondary stream, not merged with it.
Is there a better way to do this where I don't need to convert to a promise but can also just map a property of the employee?
Thanks.
You want to do:
$employees
  .pipe(mergeMap(employees => {
    return forkJoin(employees.map(emp => getDisplayName(emp).pipe(
      map(displayName => ({ ...emp, displayName }))
    )));
  }));
If you really want a broken-out mapFn:
const mapFn = employees => {
  return forkJoin(employees.map(emp => getDisplayName(emp).pipe(
    map(displayName => ({ ...emp, displayName }))
  )));
}

$employees
  .pipe(mergeMap(mapFn))
I always advise against mixing rxjs and async/await. They're different methods of handling async operations and don't play nicely together. mergeMap needs you to return an observable; forkJoin executes observables in parallel, so you map all your employees into their getDisplayName observables and then map each name into the original employee and return it.
EDIT: the above is if $employees is emitting an array of employees. If it's just a single employee, do:
const mapFn = emp => {
  return getDisplayName(emp).pipe(
    map(displayName => ({ ...emp, displayName }))
  );
}
However, if it's emitting a single employee (or an array of employees) multiple times, it's important to understand the implications of using mergeMap vs switchMap vs concatMap. Let me know if that is the case.
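Roughly, the differences look like this (a sketch, assuming $employees re-emits over time and mapFn is the single-employee version above):
// switchMap: a new emission cancels the in-flight getDisplayName lookup
$employees.pipe(switchMap(emp => mapFn(emp)));
// concatMap: lookups run one at a time, preserving emission order
$employees.pipe(concatMap(emp => mapFn(emp)));
// mergeMap: lookups run in parallel; results may arrive out of order
$employees.pipe(mergeMap(emp => mapFn(emp)));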
This should work:
const emp$ = from([1, 2, 3]);

function getName(emp: number): Observable<string> {
  return of(emp.toString());
}

const empNameTuple$ = emp$.pipe(mergeMap(emp => {
  return getName(emp).pipe(map(name => [emp, name]));
}));
Here I'm just returning a tuple of [number, string], but you can map it however you wish.

Angular subscribe within subscribe: data doesn't load at the same time within view

I know it is bad practice to call subscribe within subscribe, but I don't know how to handle it differently in my special case.
The code as it is now works, but my problem is that if I update my website, for example every second, parts of the table are loaded first and other parts are loaded afterwards (the content of the subscribe within my subscribe).
I have a service containing a function that returns an Observable of a list of files for different assets.
Within that function I request the filelist for each asset by calling another service and this service returns observables.
I then iterate over the elements of that list and build up my data structures to return them later on (AssetFilesTableItems).
Some files can be zip files and I want to get the contents of those files by subscribing to another service (extractZipService). To be able to get that correct data I need the name of the file which I got by requesting the filelist. I then add some data of the zip contents to my AssetFilesTableItems and return everything at the end.
The code of that function is as follows:
getAssetfilesData(assetIds: Array<string>, filter: RegExp, showConfig: boolean): Observable<AssetFilesTableItem[][]> {
  const data = assetIds.map(assetId => {
    // for each assetId
    return this.fileService.getFileList(assetId)
      .pipe(
        map((datasets: any) => {
          const result: AssetFilesTableItem[] = [];
          // iterate over each element
          datasets.forEach((element: AssetFilesTableItem) => {
            // apply regex filter to filename
            if (filter.test(element.name)) {
              this.logger.debug(`Filter ${filter} matches for element: ${element.name}`);
              // build up AssetFilesTableItem
              const assetFilesItem: AssetFilesTableItem = {
                name: element.name,
                type: element.type,
                asset: assetId
              };
              // save all keys of AssetFilesTableItem
              const assetFilesItemKeys = Object.keys(assetFilesItem);
              // if file is of type ZIP, extract 'config.json' from it if available
              if (showConfig && element.type.includes('zip')) {
                this.extractZipService.getJSONfromZip(assetId, element.name, 'config.json')
                  .subscribe((configJson: any) => {
                    const jsonContent = JSON.parse(configJson);
                    const entries = Object.entries(jsonContent);
                    entries.forEach((entry: any) => {
                      const key = entry[0];
                      const value = entry[1];
                      // only add new keys to AssetFilesTableItem
                      if (!assetFilesItemKeys.includes(key)) {
                        assetFilesItem[key] = value;
                      } else {
                        this.logger.error(`Key '${key}' of config.json is already in use and will not be displayed.`);
                      }
                    });
                  });
              }
              result.push(assetFilesItem);
            }
          });
          return result;
        })
      );
  });
  // return combined result of each assetId request
  return forkJoin(data);
}
I update my table using the following code within my component:
getValuesPeriodically(updateInterval: number) {
  this.pollingSubscription = interval(updateInterval)
    .subscribe(() => {
      this.getAssetfilesFromService();
    });
}

getAssetfilesFromService() {
  this.assetfilesService.getAssetfilesData(this.assetIds, this.filterRegEx, this.showConfig)
    .subscribe((assetFilesTables: any) => {
      this.assetFilesData = [].concat.apply([], assetFilesTables);
    });
}
Edit: I tried forkJoin, but as far as I understand it, it is used for running multiple requests in parallel. My extractZipService, though, depends on results that I get from my fileService. Also, I already have a forkJoin at the end, which should combine all of my fileList requests for the different assets. I don't understand why my view is not loaded at once then.
EDIT: The problem seems to be the subscribe to the extractZipService within the forEach of my fileService subscribe. It seems to finish after the fileService subscribe. I tried lots of things already, like switchMap, mergeMap and the solution suggested here, but no luck. I'm sure it's possible to make it work somehow, but I'm running out of ideas. Any help would be appreciated!
You are calling this.extractZipService.getJSONfromZip inside a loop, so the call runs asynchronously and your function inside map does not wait for its results. When the results do arrive, the items that are already bound to your view get refreshed, which is why parts of the table show up later.
To solve this you need to return the observable from this.extractZipService.getJSONfromZip and map the results, which will give you a collection of results, and then you forkJoin those results. (Not sure why you need the outer forkJoin, as these are just objects and not APIs which you need to call.)
this.logger.debug(`ConfigJson found for file '${element.name}': ${configJson}`);
const jsonContent = JSON.parse(configJson);
const entries = Object.entries(jsonContent);
entries.forEach((entry: any) => {
  // code
});
The complete code should look something along these lines:
getAssetfilesData(assetIds: Array<string>, filter: RegExp, showConfig: boolean): Observable<AssetFilesTableItem[][]> {
  const data = assetIds.map(assetId => {
    // for each assetId
    return this.fileService.getFileList(assetId)
      .pipe(
        concatMap((datasets: any) => {
          // one inner observable per element, run together
          return forkJoin(datasets.map((element: AssetFilesTableItem) =>
            this.extractZipService.getJSONfromZip(assetId, element.name, 'config.json')
          ));
        }),
        map((configJsons: any) => {
          // collect your results and return from here
          // return result
        })
      );
  });
  // return combined result of each assetId request
  return forkJoin(data);
}
I have created a Stackblitz (https://stackblitz.com/edit/nested-subscribe-solution) which works along the same lines. You need to use concatMap and forkJoin to get all the results.
Hope this helps.

Subscribe onComplete never completes with flatMap

I'm using Angular 6 with RxJS 6.2.2 and RxJS Compat 6.2.2.
I have code that calls my API service to load some records:
this.route.params
  .flatMap((params: Params) => {
    if (!params['id']) {
      return Observable.throwError('Id is not specified.');
    }
    this.itemId = params['id'];
    this.isEditMode = true;
    this.loadCategoryGroupCondition = new LoadCategoryGroupViewModel();
    this.loadCategoryGroupCondition.id = [this.itemId];
    this.loadCategoryGroupCondition.pagination = new Pagination();
    return this.categoryGroupService
      .loadCategoryGroup(this.loadCategoryGroupCondition);
  })
  .subscribe(
    (loadCategoryGroupResult: SearchResult<CategoryGroup>) => {
      console.log(loadCategoryGroupResult);
    },
    () => {},
    () => {
      console.log('Completed');
    });
The code above can print a list of my items returned from my API service. That means the next handler has been called.
But the complete handler is never fired.
What is wrong with my code?
Thank you,
As discussed, the flatMap operator does not itself complete its source observable. You are using this.route.params as your source observable, which is long-lived; it never completes by itself.
To get a complete notification you can use an operator such as take. It will re-emit the number of items you pass as a parameter and complete afterwards. For example, if you just want to receive the current route and are not interested in further notifications of your source observable, use take(1), like:
this.route.params
  .take(1)
  .flatMap((params: Params) => {
Also, note that the recommended way of doing this in RxJS 6+ is using pipeable operators. That would look like this:
this.route.params.pipe(
  first(),
  mergeMap((params: Params) => {
    ...
  })
)
I also replaced the operators with the newer recommended variants.

Queuing Actions in Redux

I've currently got a situation whereby I need Redux Actions to be run consecutively. I've taken a look at various middlewares, such as redux-promise, which seem to be fine if you know what the successive actions are at the point of the root (for lack of a better term) action being triggered.
Essentially, I'd like to maintain a queue of actions that can be added to at any point. Each object has an instance of this queue in its state and dependent actions can be enqueued, processed and dequeued accordingly. I have an implementation, but in doing so I'm accessing state in my action creators, which feels like an anti-pattern.
I'll try and give some context on use case and implementation.
Use Case
Suppose you want to create some lists and persist them on a server. On list creation, the server responds with an id for that list, which is used in subsequent API end points pertaining to the list:
http://my.api.com/v1.0/lists/ // POST returns some id
http://my.api.com/v1.0/lists/<id>/items // API end points include id
Imagine that the client wants to perform optimistic updates on these API endpoints, to enhance UX - nobody likes looking at spinners. So when you create a list, your new list instantly appears, with an option to add items:
+-------------+----------+
| List Name | Actions |
+-------------+----------+
| My New List | Add Item |
+-------------+----------+
Suppose that someone attempts to add an item before the response from the initial create call has made it back. The items API is dependent on the id, so we know we can't call it until we have that data. However, we might want to optimistically show the new item and enqueue a call to the items API so that it triggers once the create call is done.
A Potential Solution
The method I'm using to get around this currently is by giving each list an action queue - that is, a list of Redux actions that will be triggered in succession.
The reducer functionality for a list creation might look something like this:
case ADD_LIST:
  return {
    id: undefined, // To be filled on server response
    name: action.payload.name,
    actionQueue: []
  }
Then, in an action creator, we'd enqueue an action instead of directly triggering it:
export const createListItem = (name) => {
  return (dispatch) => {
    dispatch(addList(name)); // Optimistic action
    dispatch(enqueueListAction(name, backendCreateListAction(name)));
  }
}
For brevity, assume the backendCreateListAction function calls a fetch API, which dispatches messages to dequeue from the list on success/failure.
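Roughly, it might look like this (a sketch; listCreated is a hypothetical success action creator, and dequeueListAction is defined further down):
const backendCreateListAction = (name) => (dispatch) => {
  fetch('http://my.api.com/v1.0/lists/', {
    method: 'POST',
    body: JSON.stringify({ name })
  })
    .then(res => res.json())
    .then(({ id }) => {
      dispatch(listCreated(name, id)); // hypothetical: record the server id
      dispatch(dequeueListAction(name)); // advance the queue
    })
    .catch(() => dispatch(dequeueListAction(name))); // dequeue on failure too
};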
The Problem
What worries me here is the implementation of the enqueueListAction method. This is where I'm accessing state to govern the advancement of the queue. It looks something like this (ignore the matching on name - it actually uses a clientId in reality, but I'm trying to keep the example simple):
const enqueueListAction = (name, asyncAction) => {
  return (dispatch, getState) => {
    const state = getState();
    dispatch(enqueue(name, asyncAction));
    const thisList = state.lists.find((l) => {
      return l.name == name;
    });
    // If there's nothing in the queue then process immediately
    if (thisList.actionQueue.length === 0) {
      asyncAction(dispatch);
    }
  }
}
Here, assume that the enqueue method returns a plain action that inserts an async action into the list's actionQueue.
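A sketch of that plain action and the matching reducer case (the type name and the array-of-lists state shape are assumptions):
const enqueue = (name, asyncAction) => ({
  type: 'ENQUEUE_LIST_ACTION',
  payload: { name, asyncAction }
});

// in the lists reducer
case 'ENQUEUE_LIST_ACTION':
  return state.map(list =>
    list.name === action.payload.name
      ? { ...list, actionQueue: [...list.actionQueue, { asyncAction: action.payload.asyncAction }] }
      : list
  );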
The whole thing feels a bit against the grain, but I'm not sure if there's another way to go with it. Additionally, since I need to dispatch in my asyncActions, I need to pass the dispatch method down to them.
There is similar code in the method to dequeue from the list, which triggers the next action should one exist:
const dequeueListAction = (name) => {
  return (dispatch, getState) => {
    dispatch(dequeue(name));
    const state = getState();
    const thisList = state.lists.find((l) => {
      return l.name === name;
    });
    // Process next action if exists.
    if (thisList.actionQueue.length > 0) {
      thisList.actionQueue[0].asyncAction(dispatch);
    }
  }
}
Generally speaking, I can live with this, but I'm concerned that it's an anti-pattern and there might be a more concise, idiomatic way of doing this in Redux.
Any help is appreciated.
I have the perfect tool for what you are looking for. When you need a lot of control over Redux (especially around anything asynchronous) and you need Redux actions to happen sequentially, there is no better tool than Redux Saga. It is built on top of ES6 generators, giving you a lot of control since you can, in a sense, pause your code at certain points.
The action queue you describe is what is called a saga. And since it is created to work with Redux, these sagas can be triggered to run by dispatching from your components.
Since Sagas use generators you can also ensure with certainty that your dispatches occur in a specific order and only happen under certain conditions. Here is an example from their documentation and I will walk you through it to illustrate what I mean:
function* loginFlow() {
  while (true) {
    const { user, password } = yield take('LOGIN_REQUEST')
    const token = yield call(authorize, user, password)
    if (token) {
      yield call(Api.storeItem, { token })
      yield take('LOGOUT')
      yield call(Api.clearItem, 'token')
    }
  }
}
Alright, it looks a little confusing at first, but this saga defines the exact order in which a login sequence needs to happen. The infinite loop is allowed because of the nature of generators. When your code gets to a yield, it will stop at that line and wait. It will not continue to the next line until you tell it to. So look where it says yield take('LOGIN_REQUEST'). The saga will yield, or wait, at this point until you dispatch 'LOGIN_REQUEST', after which the saga will call the authorize method and go until the next yield. The next method is an asynchronous yield call(Api.storeItem, {token}), so it will not go to the next line until that code resolves.
Now, this is where the magic happens. The saga will stop again at yield take('LOGOUT') until you dispatch LOGOUT in your application. This is crucial since if you were to dispatch LOGIN_REQUEST again before LOGOUT, the login process would not be invoked. Now, if you dispatch LOGOUT it will loop back to the first yield and wait for the application to dispatch LOGIN_REQUEST again.
Redux Sagas are, by far, one of my favorite tools to use with Redux. It gives you so much control over your application and anyone reading your code will thank you since everything now reads one line at a time.
Have a look at this: https://github.com/gaearon/redux-thunk
The id alone shouldn't go through the reducer. In your action creator (thunk), fetch the list id first, and then() perform a second call to add the item to the list. After this, you can dispatch different actions based on whether or not the addition was successful.
You can dispatch multiple actions while doing this, to report when the server interaction has started and finished. This will allow you to show a message or a spinner, in case the operation is heavy and might take a while.
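A minimal sketch of such a thunk (the api helpers and action types here are hypothetical):
const createListThenAddItem = (name, itemText) => (dispatch) => {
  dispatch({ type: 'CREATE_LIST_STARTED', name }); // show a spinner or optimistic row
  return api.createList({ name })
    .then(({ id }) => {
      dispatch({ type: 'CREATE_LIST_SUCCEEDED', id, name });
      // the second call only runs once we have the real id
      return api.addItem(id, { text: itemText });
    })
    .then(item => dispatch({ type: 'ADD_ITEM_SUCCEEDED', item }))
    .catch(error => dispatch({ type: 'LIST_REQUEST_FAILED', error }));
};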
A more in-depth analysis can be found here: http://redux.js.org/docs/advanced/AsyncActions.html
All credit to Dan Abramov
I was facing a similar problem to yours. I needed a queue to guarantee that optimistic actions were committed (or eventually committed, in case of network problems) to the remote server in the same sequential order they were created, or rolled back if that's not possible. I found that Redux alone falls short for this, basically because I believe it was not designed for it, and doing it with promises alone can be a really hard problem to reason about, besides the fact that you need to manage your queue state somehow... IMHO.
I think @Pcriulan's suggestion of using redux-saga was a good one. At first sight, redux-saga doesn't provide anything to help you with this until you get to channels. This opens a door to deal with concurrency in ways other languages do, CSP specifically (see Go or Clojure's core.async for example), thanks to JS generators. There are even questions about why it's named after the Saga pattern and not CSP haha... anyway.
So here is how a saga could help you with your queue:
export default function* watchRequests() {
  while (true) {
    // 1- Create a channel for request actions
    const requestChan = yield actionChannel('ASYNC_ACTION');
    let resetChannel = false;
    while (!resetChannel) {
      // 2- take from the channel
      const action = yield take(requestChan);
      // 3- Note that we're using a blocking call
      resetChannel = yield call(handleRequest, action);
    }
  }
}
function* handleRequest({ asyncAction, payload }) {
  while (true) {
    try {
      // Perform action
      yield call(asyncAction, payload);
      return false;
    } catch (e) {
      if (e instanceof ConflictError) {
        // Could be a rollback or syncing again with server?
        yield put({ type: 'ROLLBACK', payload });
        // Store is out of consistency so
        // don't let waiting actions come through
        return true;
      } else if (e instanceof ConnectionError) {
        // try again
        yield call(delay, 2000);
      }
    }
  }
}
So the interesting part here is how the channel acts as a buffer (a queue) which keeps "listening" for incoming actions but won't proceed with future actions until it finishes with the current one. You might need to go over their documentation to grasp the code better, but I think it's worth it. The resetting-channel part may or may not work for your needs.
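For completeness, feeding the channel is just a matter of dispatching actions shaped like the ones handleRequest destructures (createBackendList is a hypothetical API-calling function):
store.dispatch({
  type: 'ASYNC_ACTION',
  asyncAction: createBackendList, // invoked by the saga via yield call(asyncAction, payload)
  payload: { name: 'My New List' }
});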
Hope it helps!
This is how I would tackle this problem:
Make sure each local list has a unique identifier. I'm not talking about the backend id here. Name is probably not enough to identify a list? An "optimistic" list not yet persisted should be uniquely identifiable, and a user may try to create 2 lists with the same name, even if it's an edge case.
On list creation, add a promise of backend id to a cache
CreatedListIdPromiseCache[localListId] = createBackendList({...}).then(list => list.id);
On item add, try to get the backend id from the Redux store. If it does not exist, then try to get it from CreatedListIdPromiseCache. The returned id must be async because CreatedListIdPromiseCache holds promises.
const getListIdPromise = (localListId, state) => {
  // Get id from already created list
  if (state.lists[localListId]) {
    return Promise.resolve(state.lists[localListId].id);
  }
  // Get id from pending list creations
  else if (CreatedListIdPromiseCache[localListId]) {
    return CreatedListIdPromiseCache[localListId];
  }
  // Unexpected error
  else {
    return Promise.reject(new Error("Unable to find backend list id for list with local id = " + localListId));
  }
}
Use this method in your addItem, so that your addItem will be delayed automatically until the backend id is available:
// Create item, but do not attempt creation until we are sure to get a backend id
const backendListItemPromise = getListIdPromise(localListId, reduxState).then(backendListId => {
  return createBackendListItem(backendListId, itemData);
});

// Provide user optimistic feedback even if the item is not yet added to the list
dispatch(addListItemOptimistic());
backendListItemPromise.then(
  backendListItem => dispatch(addListItemCommit()),
  error => dispatch(addListItemRollback())
);
You may want to clean the CreatedListIdPromiseCache, but it's probably not very important for most apps unless you have very strict memory usage requirements.
Another option would be to compute the backend id on the frontend, with something like a UUID. Your backend would just need to verify the uniqueness of this id. That way you would always have a valid backend id for all optimistically created lists, even if the backend hasn't replied yet.
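A sketch of that idea, assuming a uuid() helper (e.g. from the uuid package) and a hypothetical addListOptimistic action creator:
const id = uuid(); // generated client-side, valid immediately
dispatch(addListOptimistic({ id, name }));
fetch('http://my.api.com/v1.0/lists/', {
  method: 'POST',
  body: JSON.stringify({ id, name }) // the backend only has to verify uniqueness
});
// items can POST to /lists/<id>/items right away, without waiting for the response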
You don't have to deal with queuing actions. It will hide the data flow and make your app more tedious to debug.
I suggest you use some temporary ids when creating a list or an item, and then update those ids when you actually receive the real ones from the server.
Something like this maybe? (not tested, but you get the idea):
EDIT: I didn't understand at first that the items need to be automatically saved when the list is saved. I edited the createList action creator accordingly.
/* REDUCERS & ACTIONS */

// this "thunk" action creator is responsible for :
// - creating the temporary list item in the store with some
//   generated unique id
// - dispatching the action to tell the store that a temporary list
//   has been created (optimistic update)
// - triggering a POST request to save the list in the database
// - dispatching an action to tell the store the list is correctly
//   saved
// - triggering a POST request for saving items related to the old
//   list id and triggering the corresponding receiveCreatedItem
//   action
const createList = (name) => {
  const tempList = {
    id: uniqueId(),
    name
  }
  return (dispatch, getState) => {
    dispatch(tempListCreated(tempList))
    FakeListAPI
      .post(tempList)
      .then(list => {
        dispatch(receiveCreatedList(tempList.id, list))
        // when the list is saved we can now safely
        // save the related items since the API
        // certainly needs a real list ID to correctly
        // save an item (the reducer has already remapped
        // the items to the real list id at this point)
        const itemsToSave = Object.values(getState().items.byId).filter(item => item.listId === list.id)
        for (let tempItem of itemsToSave) {
          FakeListItemAPI
            .post(tempItem)
            .then(item => dispatch(receiveCreatedItem(tempItem.id, item)))
        }
      })
  }
}
const tempListCreated = (list) => ({
  type: 'TEMP_LIST_CREATED',
  payload: {
    list
  }
})

const receiveCreatedList = (oldId, list) => ({
  type: 'RECEIVE_CREATED_LIST',
  payload: {
    list
  },
  meta: {
    oldId
  }
})

const createItem = (name, listId) => {
  const tempItem = {
    id: uniqueId(),
    name,
    listId
  }
  return (dispatch) => {
    dispatch(tempItemCreated(tempItem))
  }
}

const tempItemCreated = (item) => ({
  type: 'TEMP_ITEM_CREATED',
  payload: {
    item
  }
})

const receiveCreatedItem = (oldId, item) => ({
  type: 'RECEIVE_CREATED_ITEM',
  payload: {
    item
  },
  meta: {
    oldId
  }
})
/* given this state shape :
state = {
  lists: {
    ids: [ 'list1ID', 'list2ID' ],
    byId: {
      'list1ID': {
        id: 'list1ID',
        name: 'list1'
      },
      'list2ID': {
        id: 'list2ID',
        name: 'list2'
      },
    }
    ...
  },
  items: {
    ids: [ 'item1ID', 'item2ID' ],
    byId: {
      'item1ID': {
        id: 'item1ID',
        name: 'item1',
        listId: 'list1ID'
      },
      'item2ID': {
        id: 'item2ID',
        name: 'item2',
        listId: 'list2ID'
      }
    }
  }
}
*/
// Here I'm using an immediately invoked function just
// to isolate the ids and byId variables and avoid duplicate
// declaration issues since we need them for both
// the lists and items reducers
const lists = (() => {
  const ids = (ids = [], action = {}) => {
    switch (action.type) {
      // when receiving the temporary list
      // we need to add the temporary id
      // in the ids list
      case 'TEMP_LIST_CREATED':
        return [...ids, action.payload.list.id]
      // when receiving the real list
      // we need to remove the old temporary id
      // and add the real id instead
      case 'RECEIVE_CREATED_LIST':
        return ids
          .filter(id => id !== action.meta.oldId)
          .concat([action.payload.list.id])
      default:
        return ids
    }
  }

  const byId = (byId = {}, action = {}) => {
    switch (action.type) {
      // same as above, when the temp list
      // gets created we store it indexed by
      // its temp id
      case 'TEMP_LIST_CREATED':
        return {
          ...byId,
          [action.payload.list.id]: action.payload.list
        }
      // when we receive the real list we first
      // need to remove the old one before
      // adding the real list
      case 'RECEIVE_CREATED_LIST': {
        const {
          [action.meta.oldId]: oldList,
          ...otherLists
        } = byId
        return {
          ...otherLists,
          [action.payload.list.id]: action.payload.list
        }
      }
      default:
        return byId
    }
  }

  return combineReducers({
    ids,
    byId
  })
})()
const items = (() => {
  const ids = (ids = [], action = {}) => {
    switch (action.type) {
      case 'TEMP_ITEM_CREATED':
        return [...ids, action.payload.item.id]
      case 'RECEIVE_CREATED_ITEM':
        return ids
          .filter(id => id !== action.meta.oldId)
          .concat([action.payload.item.id])
      default:
        return ids
    }
  }

  const byId = (byId = {}, action = {}) => {
    switch (action.type) {
      case 'TEMP_ITEM_CREATED':
        return {
          ...byId,
          [action.payload.item.id]: action.payload.item
        }
      case 'RECEIVE_CREATED_ITEM': {
        const {
          [action.meta.oldId]: oldItem,
          ...otherItems
        } = byId
        return {
          ...otherItems,
          [action.payload.item.id]: action.payload.item
        }
      }
      // when we receive a real list
      // we need to reappropriate all
      // the items that are referring to
      // the old listId to the new one
      case 'RECEIVE_CREATED_LIST': {
        const oldListId = action.meta.oldId
        const newListId = action.payload.list.id
        const _byId = {}
        for (let id of Object.keys(byId)) {
          let item = byId[id]
          _byId[id] = {
            ...item,
            listId: item.listId === oldListId ? newListId : item.listId
          }
        }
        return _byId
      }
      default:
        return byId
    }
  }

  return combineReducers({
    ids,
    byId
  })
})()
const reducer = combineReducers({
  lists,
  items
})
/* REDUCERS & ACTIONS */
