How to execute Observables sequentially and combine results - javascript

I have an issue with an Observable chain and can't find a solution.
I need to wait for the result from IndexedDB, pass it to the next request, and combine the two results in the last Observable.
Here is an example:
const first$ = Database.gelAll(store); // returns an Observable
first$.mergeMap((data) => {
  const uuids = data.map((v) => {
    return v.uuid;
  });
  // here I need to send a request with the 'uuids' array and combine the result
  // with the values from the first request
  const second$ = database.find(store, {'uuid': uuids}); // returns an Observable
});
Thanks for any advice.

If I understand you correctly, you're trying to make the end result of your observable an object containing both the result from Database.gelAll and the result from database.find. To do that, you just need to add a .map after the database.find and return it to the mergeMap.
Database.gelAll(store)
  .mergeMap((data) => {
    const uuids = data.map((v) => v.uuid);
    return database.find(store, {'uuid': uuids})
      .map((databaseResult) => {
        // Combine with data in some fashion
        return {
          firstData: data,
          secondData: databaseResult
        };
      });
  })
  .subscribe(objectWithBothFirstDataAndSecondData => {
    console.log(objectWithBothFirstDataAndSecondData);
  });
Also, consider whether .mergeMap is appropriate. If .gelAll only emits one value then it is fine, but in most cases .switchMap or .concatMap is a better choice than .mergeMap. With .mergeMap you have no guarantee of order; .switchMap ensures that only the latest inner observable is used, and .concatMap ensures that everything gets through, in the same order it was requested.
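The combine-then-emit shape above doesn't depend on RxJS specifically. Here is a minimal Promise-based sketch of the same flow, with `getAll` and `find` as hypothetical stubs standing in for the database calls:

```javascript
// Hypothetical stubs standing in for Database.gelAll / database.find.
const getAll = () => Promise.resolve([{ uuid: 'a' }, { uuid: 'b' }]);
const find = (query) =>
  Promise.resolve(query.uuid.map((u) => ({ uuid: u, detail: 'x' })));

// Same shape as the mergeMap + inner map chain: wait for the first
// result, use it to build the second request, then emit an object
// holding both results.
async function getCombined() {
  const data = await getAll();
  const uuids = data.map((v) => v.uuid);
  const databaseResult = await find({ uuid: uuids });
  return { firstData: data, secondData: databaseResult };
}

getCombined().then((both) =>
  console.log(both.firstData.length, both.secondData.length)); // 2 2
```

The inner `.map` in the RxJS version plays the role of the final `return` here: it closes over `data` so both values are available when the second call completes.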

Related

Rxjs stream of arrays to a single value and repeat at the end of the stream

I have an observable that fetches an array of items (32 each time) from an API and emits a new response until there are no items left to fetch.
I want to process that list of items one by one, starting as soon as I get the first batch, until I'm done with ALL items fetched.
When I'm done with the complete list, I want to repeat the process indefinitely.
Here's what I have so far:
_dataService
  .getItemsObservable()
  .pipe(
    switchMap((items) => {
      const itemList = items.map((i) => i.itemId);
      return of(itemList);
    }),
    concatMap((item) =>
      from(item).pipe(
        concatMap((item) => {
          // do something here
        })
      )
    ),
    repeat()
  )
  .subscribe()
Any idea on what I can do? Right now it loops over the first batch of items and ignores the rest.
repeat() won't call the service again; it will reuse the original observable. Try switchMap from a BehaviorSubject and make it emit after you have processed the values. Also, I'm really not sure why you would turn each item into an observable just to concatMap it; simply process the items after they are emitted.
const { of, BehaviorSubject, switchMap, delay } = rxjs;

const _dataService = {
  getItemsObservable: () => of(Array.from({ length: 32 }, () => Math.random()))
};

const bs$ = new BehaviorSubject();
bs$.pipe(
  delay(1000), // BehaviorSubjects are synchronous; without a delay this would cause a stack overflow
  switchMap(() => _dataService.getItemsObservable())
).subscribe(values => {
  values.forEach(val => {
    console.log('Doing stuff with ' + val);
  });
  bs$.next();
});
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/7.8.0/rxjs.umd.min.js" integrity="sha512-v0/YVjBcbjLN6scjmmJN+h86koeB7JhY4/2YeyA5l+rTdtKLv0VbDBNJ32rxJpsaW1QGMd1Z16lsLOSGI38Rbg==" crossorigin="anonymous" referrerpolicy="no-referrer"></script>
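If you don't need the BehaviorSubject machinery, the same "fetch a batch, process everything, ask again" loop can be sketched with plain async/await. The service here is a stub, and `maxRounds` is only added so the sketch terminates (the original answer loops indefinitely):

```javascript
// Stubbed service standing in for _dataService.getItemsObservable().
const _dataService = {
  getItems: () => Promise.resolve(Array.from({ length: 32 }, () => Math.random()))
};

// Fetch a batch, process every item, then fetch again.
// `maxRounds` only exists so this sketch terminates.
async function processInRounds(maxRounds = 3) {
  let processed = 0;
  for (let round = 0; round < maxRounds; round++) {
    const values = await _dataService.getItems();
    for (const val of values) {
      processed += 1; // "Doing stuff with" val
    }
  }
  return processed;
}

processInRounds().then((n) => console.log(n)); // 96 (3 rounds * 32 items)
```

The `await` before re-fetching is what the BehaviorSubject's `next()` call achieves in the RxJS version: the next fetch only starts after the previous batch is fully processed.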
I have an observable that fetches an array of items (32 each time) from an API and emits a new response until there are no items left to fetch.
Okay, I assume that is _dataService.getItemsObservable()?
I want to process said list of items
What does this mean? Process how? Let's assume you have some function called processItemById that processes an itemId and returns the processed item.
one by one as soon as I get the first batch until I'm done with ALL items fetched.
Sounds like you're turning an Observable<T[]> into an Observable<T>. You can use mergeMap (don't care about order) or concatMap (maintain order) to do this. Since you're just flattening an inner array, they'll be the same in this case.
_dataService.getItemsObservable().pipe(
  mergeMap(v => v),
  map(item => processItemById(item.itemId)),
  // Repeat here doesn't call `getItemsObservable()` again;
  // instead it re-subscribes to the observable that was returned.
  // Hopefully that's what you're counting on. It's not clear to me.
  repeat()
).subscribe(processedItemOutput => {
  // Do something with the output?
});
Any idea on what I can do?
From your explanation and code, it's not clear what you're trying to do. Maybe this helps.
Right now what happens is it will loop over the first batch of items and ignore the rest
This could happen for a number of reasons.
Tip 1
Using higher-order mapping operators with RxJS's `of` is a code smell. Just use a regular map instead.
For example:
concatMap(v => of(fn(v)))
// or
switchMap(v => of(fn(v)))
are the same as:
map(v => fn(v))
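The Promise equivalent of this tip: `.then(v => Promise.resolve(fn(v)))` is just `.then(fn)`, because the chain flattens the inner promise either way. A quick check:

```javascript
const fn = (v) => v * 2;

// Wrapping the result in an already-resolved promise...
const wrapped = Promise.resolve(21).then((v) => Promise.resolve(fn(v)));
// ...is the same as returning the value directly.
const plain = Promise.resolve(21).then(fn);

Promise.all([wrapped, plain]).then(([a, b]) => console.log(a === b)); // true
```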
Tip 2
I have no idea if this will help you, but you can generate a new observable on each subscribe by using the defer operator.
For example:
defer(() => _dataService.getItemsObservable()).pipe(
  mergeMap(v => v),
  map(item => processItemById(item.itemId)),
  repeat()
).subscribe(processedItemOutput => {
  // Do something with the output?
});
It looks like you want to get all records from a paginated API that won't tell you how many pages there are, which sounds like a job for the expand() operator; it is great for recursive calls.
import { of, EMPTY, expand, range, toArray, mergeMap, concat, map, takeLast } from 'rxjs';

const MAX_PAGES = 3;
const PER_PAGE = 32;

const fetchList = (offset: number) => {
  return Math.ceil(offset / PER_PAGE) >= MAX_PAGES ? of([]) : range(offset, PER_PAGE).pipe(toArray());
};

const fetchDetail = (id: number) => {
  return of(`response for ${id}`);
};

of([]) // Seed for `expand()`.
  .pipe(
    expand((acc, index) => fetchList(acc.length).pipe(
      mergeMap(list => { // Process each response array.
        console.log(`Response #${index}`, list);
        // When the response is empty we can stop the recursive calls.
        if (list.length === 0) {
          return EMPTY;
        }
        // Process the response and make a `fetchDetail()` call for each item.
        // `concat()` guarantees order and processes items one by one.
        return concat(...list.map(id => fetchDetail(id)))
          .pipe(
            toArray(),
            map(details => [...acc, ...details]), // Append the processed response to a single large array.
          );
      }),
    )),
    takeLast(1), // Only take the final array after all pages have been fetched.
  )
  .subscribe(console.log);
Working demo: https://stackblitz.com/edit/rxjs-yqd4kx?devtoolsheight=60&file=index.ts
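The same keep-fetching-until-empty recursion can be sketched without RxJS; a plain async loop plays the role of `expand()` here, with the page fetchers stubbed the same way as in the answer above:

```javascript
const MAX_PAGES = 3;
const PER_PAGE = 32;

// Stubbed paginated API: returns PER_PAGE ids per page, then an empty page.
const fetchList = (offset) =>
  Promise.resolve(
    offset / PER_PAGE >= MAX_PAGES
      ? []
      : Array.from({ length: PER_PAGE }, (_, i) => offset + i)
  );

const fetchDetail = (id) => Promise.resolve(`response for ${id}`);

// Keep requesting pages until an empty response arrives, fetching
// details sequentially for each page (like concat() does).
async function fetchAllDetails() {
  const acc = [];
  for (;;) {
    const list = await fetchList(acc.length);
    if (list.length === 0) return acc;
    for (const id of list) {
      acc.push(await fetchDetail(id));
    }
  }
}

fetchAllDetails().then((all) => console.log(all.length)); // 96
```

As in the RxJS version, the length of the accumulated array doubles as the next page offset, and the empty response is the termination condition.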

How can I call an API function in sequence for all of the elements of an array?

I am working with Angular using RxJs, and currently find it very challenging to solve this problem.
I have an array which contains some ids.
ids = [1,2,3,4]
Then I have an API that can be called with the id param, which deletes the item with the given id from the database:
this.controller.deleteItem(id)
I want to call this API on every id in the array.
These API calls should strictly happen one after another in a sequential pattern, like
this.controller.deleteItem(1) -> this.controller.deleteItem(2) etc.
After all of the api calls finished, I would like to fetch the data with:
this.controller.getData()
How can I solve this?
You can do it using the concat operator.
First, you need to turn the list of IDs into a list of observables by mapping each item of the array to its corresponding delete action:
const idsToDelete = [1, 2, 3];
const deleteTasks = idsToDelete.map(id => this.controller.deleteItem(id));
Then use concat to execute the tasks sequentially:
concat(...deleteTasks).subscribe((response) => {
  console.log('deleted', response);
});
For getting the data at the end (assuming that the getData method also returns an observable), you can append it after the deletes and only listen for the last response:
concat(...deleteTasks, this.controller.getData()).pipe(
  // tap((res) => console.log(res)),
  last()
).subscribe((dataAfterDelete) => {
  console.log(dataAfterDelete);
});
You can also split these calls if you want the chance to perform side effects between the delete operations and the data fetch, using the switchMap operator:
const sequentialDelete = concat(...deleteTasks);
sequentialDelete.pipe(
  tap((deleteItemResponse) => {
    console.log('after each item delete', deleteItemResponse);
  }),
  last(),
  tap(() => {
    console.log('after the last item was deleted');
    this.idsToDelete = [];
  }),
  switchMap(() => this.controller.getData())
).subscribe((dataAfterDelete) => {
  console.log(dataAfterDelete);
});
In a simple RxJS way (assuming that both controller functions return Observables):
concat(
  from(ids).pipe(concatMap(id => this.controller.deleteItem(id))),
  this.controller.getData()
)
NOTE: NOT SEQUENTIAL
forkJoin(
  ids.map<Observable<number>>((id: number) =>
    this.controller.deleteItem(id))
).subscribe(val => this.controller.getData())
forkJoin
When all observables complete, emit the last emitted value from each.
map your ids array into an array of Observables using this.controller.deleteItem(id), then pass that array to forkJoin and subscribe to it. forkJoin makes all the deleteItem calls at the same time and emits the output of each once all complete. When all succeed, you can call getData().
Note that
ids.forEach(async (id) => {
  await this.controller.deleteItem(id);
});
will NOT call your API sequentially: forEach ignores the promises returned by the async callback, so all the deletes start at once. A plain for...of loop inside an async function does run them one after another:
for (const id of ids) {
  await this.controller.deleteItem(id);
}
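A quick runnable check of that difference, with a hypothetical `deleteItem` stub that records when each call starts and finishes:

```javascript
// Stubbed API call: resolves on a later tick so interleaving is visible.
const deleteItem = (id, log) =>
  new Promise((resolve) => {
    log.push(`start ${id}`);
    setTimeout(() => { log.push(`done ${id}`); resolve(); }, 0);
  });

// for...of with await: each call waits for the previous one to finish.
async function deleteSequentially(ids) {
  const log = [];
  for (const id of ids) {
    await deleteItem(id, log);
  }
  return log;
}

deleteSequentially([1, 2]).then((l) => console.log(l.join(', ')));
// start 1, done 1, start 2, done 2
```

With `ids.forEach(async …)` instead, the log would begin `start 1, start 2`: both requests are in flight before either completes.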

Tree not being traversed in correct order when using chrome.storage.sync

I am trying to perform DFS traversal of a tree structure, stored as the mapping Root_Node_Value -> Array_of_Children. When I try to perform the traversal, storing the mapping as a plain object, the tree is traversed as expected.
// Correct DFS
const treeMapping = {'1221': ['1222', '1226']};

function closeTree(root) {
  const children = treeMapping[root] || [];
  children.forEach(function (child) { closeTree(child); });
  console.log("Closing: " + root);
}
The output for the above code, as expected, is: 1222, 1226, 1221
However, when I try to implement the same logic, using chrome.storage.sync to retrieve the mapping, the expected order is not followed.
// Incorrect DFS
function closeTree(root) {
  chrome.storage.sync.get(root.toString(), data => {
    const children = data[root] || [];
    children.forEach(function (child) { closeTree(child); });
    console.log("Closing: " + root);
  });
}
The code using chrome.storage.sync outputs 1221, 1222, 1226, even though the tree mapping stored in storage is identical. This is clearly incorrect, as the root value, 1221, should be printed last.
// Retrieved value using chrome.storage.sync.get('1221', ...)
{ '1221': [1222, 1226] }
What is the cause for this behavior and how can I fix it?
The chrome.storage API is asynchronous. An easy rule of thumb: if a method accepts a callback per the documentation, it runs asynchronously.
The simplest solution is to promisify closeTree and use Promise.all on children:
function closeTree(root) {
  return new Promise(resolve => {
    chrome.storage.sync.get(root.toString(), async data => {
      const children = data[root] || [];
      await Promise.all(children.map(closeTree));
      console.log("Closing: " + root);
      resolve();
    });
  });
}
P.S. You can optimize considerably by reading all children in one operation:
chrome.storage.sync.get(children, processAllChildren)
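A minimal sketch of the promisified traversal that runs outside an extension: `getChildren` is a hypothetical stub standing in for the callback-style `chrome.storage.sync.get`, and the traversal threads an accumulator so the close order is observable:

```javascript
// Stubbed async storage, standing in for chrome.storage.sync.get.
const treeMapping = { 1221: [1222, 1226] };
const getChildren = (root) =>
  new Promise((resolve) =>
    setTimeout(() => resolve(treeMapping[root] || []), 0));

// Post-order DFS: await all children before "closing" the root.
async function closeTree(root, order = []) {
  const children = await getChildren(root);
  await Promise.all(children.map((c) => closeTree(c, order)));
  order.push(root); // "Closing: " + root
  return order;
}

closeTree(1221).then((order) => console.log(order.join(', ')));
// 1222, 1226, 1221
```

The `await Promise.all(...)` before the push is what restores the post-order guarantee that the fire-and-forget callback version loses.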

Question about asynchronous JavaScript with Promise

Here I have a function that takes an array of strings containing the usernames of GitHub accounts. The function should return a promise that resolves to an array of user data. There should be one fetch request per user, and the requests shouldn't wait for each other, so that the data arrives as soon as possible. If there's no such user, the function should put null in the resulting array.
An example input would be ["iliakan", "remy", "no.such.users"], and the expected returned promise after resolving would give us [null, Object, Object], each Object being the data about a user.
Here is my attempt to solve this question.
function getUsers(names) {
  return new Promise(resolve => {
    const array = [];
    const url = "https://api.github.com/users/";
    const requests = names.map(name => {
      const endpoint = `${url}${name}`;
      return fetch(endpoint);
    });
    Promise.all(requests).then(responses => {
      responses.forEach(response => {
        if (response.status === 200) {
          response.json().then(data => {
            array.push(data);
          });
        } else {
          array.push(null);
        }
      });
      resolve(array);
    });
  });
}
It does work, i.e. it returns an array [null, Object, Object], and I thought it fulfilled the requirements stated above. However, after looking at it closely, I feel like I can't fully make sense of it.
My question is: look at where we resolve this array; it is resolved immediately after the forEach loop. What I don't understand is why it contains all three items when some of them are pushed into it asynchronously, after json() has finished. In the case where response.status === 200, the data resolved from json() is pushed into the array, and I would assume that json() operation takes some time. Since we didn't resolve the array after the json() operation finished, how come we still end up with all the data resolved from json()?
Promise.all(requests).then(responses => {
  responses.forEach(response => {
    if (response.status === 200) {
      response.json().then(data => {
        array.push(data); // <--- this should take some time
      });
    } else {
      array.push(null);
    }
  });
  resolve(array); // <--- resolve the array immediately after the `forEach` loop
});
It looks to me like the array we get should only have one null in it, since at the time it is resolved, .json() should not have finished.
You're right, the result is pushed later into the array.
Try to execute this:
const test = await getUsers(['Guerric-P']);
console.log(test.length);
You'll notice it displays 0. Before the results are pushed into the array, its length is 0. You probably think it works because you inspect the array in the console after the results have arrived.
You should do something like this:
function getUsers(names) {
  const url = "https://api.github.com/users/";
  const requests = names.map(name => {
    const endpoint = `${url}${name}`;
    return fetch(endpoint);
  });
  return Promise.all(requests).then(responses => Promise.all(responses.map(x => x.status === 200 ? x.json() : null)));
}
You should avoid using the Promise constructor directly. Here, we don't need to use it at all.
const url = "https://api.github.com/users/";

const getUsers = names =>
  Promise.all(names.map(name =>
    fetch(url + name).then(response =>
      response.status === 200 ? response.json() : null)));

getUsers(["iliakan", "remy", "no.such.users"]).then(console.log);
The Promise constructor should only be used when you're creating new kinds of asynchronous tasks. In this case, you don't need to use the Promise constructor because fetch already returns a promise.
You also don't need to maintain an array and push to it because Promise.all resolves to an array. Finally, you don't need to map over the result of Promise.all. You can transform the promises returned by fetch.
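To see the order-preserving, null-filling behavior without hitting the network, here is the same shape with `fetch` replaced by a hypothetical stub (a real run would call the GitHub API):

```javascript
// Stubbed fetch: known users resolve with status 200, unknown with 404.
const users = { iliakan: { name: 'Ilya' }, remy: { name: 'Remy' } };
const fetchStub = (url) => {
  const name = url.split('/').pop();
  return Promise.resolve(
    users[name]
      ? { status: 200, json: () => Promise.resolve(users[name]) }
      : { status: 404 }
  );
};

// Same structure as the answer: transform each fetch promise directly,
// so Promise.all preserves input order and fills in null for misses.
const getUsers = (names) =>
  Promise.all(names.map((name) =>
    fetchStub('https://api.github.com/users/' + name).then((response) =>
      response.status === 200 ? response.json() : null)));

getUsers(['iliakan', 'remy', 'no.such.users']).then((result) =>
  console.log(result.map((r) => r && r.name))); // [ 'Ilya', 'Remy', null ]
```

Because `Promise.all` keeps results in the order of its input array, no manual pushing is needed and the null lands exactly where the missing user was.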
The thing is that the json() operation is really quick, especially if the response data is small, so it just has time to execute. Also, objects in JavaScript are passed by reference, not by value, and an Array is an object, so independently of execution time the data still gets pushed into the same array, even after the promise was resolved.

Transform Observable<String[]> to Observable<DataType[]>

I have an api that returns me an Array<string> of ids, given an original id (one to many). I need to make an http request on each of these ids to get back the associated data from the api. I cannot figure out how to take the Observable<string[]> and map it to the Observable<DataType[]>.
I would like to keep the original observable and use operators to get the desired outcome if at all possible.
Piping the map operator doesn't work in this situation, because the only item in the observable is the array itself.
Here's some example code that is similar to the implementation I am attempting.
getIds = (originalId: string) => {
  return this.http.get<string[]>(url);
}

getDataFromIds = (originalId: string): Observable<DataType[]> => {
  const ids$ = this.getIds(originalId);
  // Make http calls for each of the items in the array.
  result = ids$.pipe();
  return result;
}
This is a use case for the switchMap operator, typically with forkJoin as your inner observable.
getIds = (originalId: string) => {
  return this.http.get<string[]>(url);
}

getDataFromIds = (originalId: string): Observable<DataType[]> => {
  const ids$ = this.getIds(originalId);
  // Map the array of ids into an array of Observable<DataType>,
  // forkJoin them, and switch into it.
  result = ids$.pipe(switchMap(ids => forkJoin(ids.map(id => this.getId(id)))));
  return result;
}
This assumes the getIds() call results in a list of string ids and that you have some getId() function that takes a string ID and returns an observable DataType.
You can try this:
ids$.pipe(
  switchMap(ids => // you can swap switchMap with any *Map operator: https://www.learnrxjs.io/operators/transformation/
    forkJoin(...ids.map(id => // you can swap forkJoin with any combination operator: https://www.learnrxjs.io/operators/combination/
      from(Promise.resolve({ id })).pipe(
        map(res => res.id),
        catchError(err => of(err)))))));
Imports for from and forkJoin should come from rxjs, while everything else is imported from rxjs/operators.
catchError will catch any thrown unhandled errors.
Demo:
https://stackblitz.com/edit/rxjs-xmmhyj
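For comparison, the switchMap + forkJoin shape maps onto Promises as "await the id list, then Promise.all the per-id calls". This sketch uses hypothetical stubs (`getIds`, `getById`) in place of the HTTP calls:

```javascript
// Stubs standing in for the HTTP calls.
const getIds = () => Promise.resolve(['a', 'b', 'c']);
const getById = (id) => Promise.resolve({ id, data: id.toUpperCase() });

// switchMap into forkJoin ~ chain into Promise.all:
// fetch the id list, then run one request per id in parallel
// and collect the results into a single array.
const getDataFromIds = () =>
  getIds().then((ids) => Promise.all(ids.map(getById)));

getDataFromIds().then((result) => console.log(result.length)); // 3
```

The difference in the Observable world is cancellation: switchMap drops the in-flight inner forkJoin if a new id list arrives, which plain Promises cannot do.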
