RxJs MergeMap and finalize not working as expected (as I thought) - javascript

I am having problems with finalize or complete using the code below. I want to set loading = false after everything is done. I tried to use finalize and complete, but they are called too early and multiple times. Any help? Thanks
displayPost(postDetail) {
  // some rendering stuff
  renderHTML(postDetail);
}

// returns a GraphQL post query object
getPostQuery(username, postId) {
  const query = {
    query: `{
      post(id: "${username}", postId: "${postId}") {
        id
        title
        content
      }
    }`
  };
  return query;
}
let loading = true;

// getUserHTTP() is an HTTP request
getUserHTTP(userId).pipe(
  switchMap(user => {
    const ids = [];
    // let's say the oldest 100 posts
    for (let i = 0; i < 100; i++) {
      ids.push(getPostQuery(user.username, i));
    }
    return from(ids);
  }),
  mergeMap(query => this.httpClient.post(url, query)),
  finalize(() => {
    // this is being called too early and multiple times
    loading = false;
  })
).subscribe({
  next: result => { /* doing stuff */ displayPost(result) },
  error: err => { /* handle errors */ },
  complete: () => { /* not working */ }
});

mergeMap triggers multiple parallel requests, each with its own stream path, so the finalize is triggered for each stream. Instead you could use forkJoin to trigger multiple requests in parallel; it only emits after all the individual streams complete.
If you're using toPromise() to convert the HTTP observable to a promise, avoid doing it; it isn't required here. You can also skip the from when using forkJoin.
Try the following
import { forkJoin } from 'rxjs';
import { switchMap, finalize } from 'rxjs/operators';

let loading = true;

getUserHTTP(userId).pipe(
  switchMap(user => {
    const reqs$ = [];
    for (let i = 0; i < 100; i++) { // let's say the oldest 100 posts
      reqs$.push(
        this.httpClient.post( // <-- do the HTTP request here
          url,
          getPostQuery(user.username, i)
        )
      );
    }
    return forkJoin(reqs$); // <-- return `forkJoin` here
  }),
  finalize(() => loading = false)
).subscribe({
  next: result => { /* doing stuff */ displayPost(result) },
  error: err => { /* handle errors */ },
  complete: () => { }
});
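Note that forkJoin emits a single array containing all the responses, so next fires once with that array rather than once per post. Assuming displayPost renders a single post, the subscriber would iterate over it:
next: results => { results.forEach(result => displayPost(result)) },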

Related

How to handle the cancelled request inside a for loop in Angular?

I have 5+ pages in my App. I have the following method on header component.
The aim is: I need to show a status in the header when the user clicks a particular button. If I navigate between pages slowly, the code below works fine. But if I navigate between pages very frequently, the requests get cancelled, because other pages call a different set of APIs.
async geneStatus() {
  for (const x of Object.keys(this.gene)) {
    const operationId = this.gene[x]['name'];
    let operArr;
    try {
      operArr = await this.fetchEachStatus(operationId);
    } catch (err) {
      continue;
    }
    if (operArr[0] && operArr[0] === 'error') {
      continue;
    }
    // Doing my logics
  }
}

fetchEachStatus(geneId): Promise<any[]> {
  return new Promise((resolve, reject) => {
    this.apiDataService.get(this.geneUrl + '/' + geneId).subscribe(
      (res) => {
        setTimeout(() => {
          resolve(res);
        }, 500);
      },
      err => {
        setTimeout(() => {
          resolve(['error']);
        }, 500);
      }
    );
  });
}
The problem is that if any one of the API calls gets cancelled, the for loop does not move on to the next elements. I need the loop to keep iterating even if one API call gets cancelled. How can I fix this issue? I am not sure where the problem is.
I see multiple issues. I think the conversion from observable to promise is not only unnecessary, but counter-productive. Using the observables directly would enable you to use RxJS functions and operators. We can use the forkJoin function to make multiple simultaneous requests and the catchError operator to mitigate the effects of potential errors.
Try the following
import { forkJoin, of, Observable } from 'rxjs';
import { map, catchError } from 'rxjs/operators';

geneStatus() {
  forkJoin(Object.keys(this.gene).map(key => this.fetchEachStatus(this.gene[key]['name']))).subscribe(
    res => {
      // res[0] - `{ success: true | false, geneId: geneId }` from `this.apiDataService.get(this.geneUrl + '/' + this.gene[0]['name'])`
      // res[1] - `{ success: true | false, geneId: geneId }` from `this.apiDataService.get(this.geneUrl + '/' + this.gene[1]['name'])`
      // ...
      const passedGeneIds = res.filter(item => item.success).map(item => item.geneId);
      // passedGeneIds = [`geneId`, `geneId`, ...] - list of passed gene IDs
      const failedGeneIds = res.filter(item => !item.success).map(item => item.geneId);
      // failedGeneIds = [`geneId`, `geneId`, ...] - list of failed gene IDs
      // some other logic
    },
    error => {
      // essentially will never be hit since all the errors return a response instead
    }
  );
}

fetchEachStatus(geneId): Observable<any> {
  return this.apiDataService.get(this.geneUrl + '/' + geneId).pipe(
    map(_ => ({ success: true, geneId: geneId })), // <-- map to return the `geneId`
    catchError(error => of({ success: false, geneId: geneId })) // <-- return an observable from `catchError`
  );
}
Now you need to remember that each time a button is clicked, multiple simultaneous requests are triggered. One solution to overcome this issue is to cancel all current requests before triggering a new set of requests. For that you could bind the buttons to emit on a central observable and trigger the requests using the switchMap operator piped to that observable, as sketched below.
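A minimal sketch of that idea, assuming an Angular component where statusClick$ is a hypothetical Subject fed by the button's click handler:
import { Subject, forkJoin } from 'rxjs';
import { switchMap } from 'rxjs/operators';

statusClick$ = new Subject();

ngOnInit() {
  this.statusClick$.pipe(
    // a new click makes switchMap unsubscribe from the previous forkJoin,
    // cancelling its in-flight HTTP requests before starting the new batch
    switchMap(() => forkJoin(
      Object.keys(this.gene).map(key => this.fetchEachStatus(this.gene[key]['name']))
    ))
  ).subscribe(res => {
    // same pass/fail handling as above
  });
}

onStatusButtonClick() {
  this.statusClick$.next();
}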

Continue on error in RxJs pipeable with mergeMap

I am doing some parallel HTTP GETs with the RxJS pipe and the mergeMap operator.
On the first failed request (let's imagine /urlnotexists throws a 404 error) it stops all other requests.
I want it to continue querying all remaining urls without calling the remaining mergeMaps for the failed request.
I tried to play with throwError and catchError from RxJS, but without success.
index.js
const { from } = require('rxjs');
const { mergeMap, scan } = require('rxjs/operators');

const request = {
  get: url => {
    return new Promise((resolve, reject) => {
      setTimeout(() => {
        if (url === '/urlnotexists') { return reject(new Error(url)); }
        return resolve(url);
      }, 1000);
    });
  }
};
(async function() {
  await from([
    '/urlexists',
    '/urlnotexists',
    '/urlexists2',
    '/urlexists3',
  ])
    .pipe(
      mergeMap(async url => {
        try {
          console.log('mergeMap 1:', url);
          const val = await request.get(url);
          return val;
        } catch(err) {
          console.log('err:', err.message);
          // a throw here prevents all remaining request.get() calls from being tried
        }
      }),
      mergeMap(async val => {
        // should not pass here if the previous request.get() failed
        console.log('mergeMap 2:', val);
        return val;
      }),
      scan((acc, val) => {
        // should not pass here if the previous request.get() failed
        acc.push(val);
        return acc;
      }, []),
    )
    .toPromise()
    .then(merged => {
      // should have merged /urlexists, /urlexists2 and /urlexists3
      // even if /urlnotexists failed
      console.log('merged:', merged);
    })
    .catch(err => {
      console.log('catched err:', err);
    });
})();
$ node index.js
mergeMap 1: /urlexists
mergeMap 1: /urlnotexists
mergeMap 1: /urlexists2
mergeMap 1: /urlexists3
err: /urlnotexists
mergeMap 2: /urlexists
mergeMap 2: undefined <- I didn't want this mergeMap to be called
mergeMap 2: /urlexists2
mergeMap 2: /urlexists3
merged: [ '/urlexists', undefined, '/urlexists2', '/urlexists3' ]
I expect to make concurrent GET requests and reduce their respective values into one object at the end.
But if some errors occur, I want them not to interrupt my pipe, only to be logged.
Any advice?
If you want to use RxJS you should add error handling with catchError and any additional tasks to a single request before you execute all your requests concurrently with forkJoin.
const { of, from, forkJoin } = rxjs;
const { catchError, tap } = rxjs.operators;

// your promise factory, unchanged (just shorter)
const request = {
  get: url => {
    return new Promise((resolve, reject) => setTimeout(
      () => url === '/urlnotexists' ? reject(new Error(url)) : resolve(url), 1000
    ));
  }
};

// a single rxjs request with error handling
const fetch$ = url => {
  console.log('before:', url);
  return from(request.get(url)).pipe(
    // add any additional operator that should be executed for each request here
    tap(val => console.log('after:', val)),
    catchError(error => {
      console.log('err:', error.message);
      return of(undefined);
    })
  );
};

// concurrently executed rxjs requests
forkJoin(["/urlexists", "/urlnotexists", "/urlexists2", "/urlexists3"].map(fetch$))
  .subscribe(merged => console.log("merged:", merged));

<script src="https://unpkg.com/@reactivex/rxjs@6.5.3/dist/global/rxjs.umd.js"></script>
If you are willing to forego RXJS and just solve with async/await it is very straightforward:
const urls = ['/urlexists', '/urlnotexists', '/urlexists2', '/urlexists3'];
const promises = urls.map(url => request.get(url));
const resolved = await Promise.allSettled(promises);

// print out errors
resolved.forEach((r, i) => {
  if (r.status === 'rejected') {
    console.log(`${urls[i]} failed: ${r.reason}`)
  }
});

// get the success results
const merged = resolved
  .filter(r => r.status === 'fulfilled')
  .map(r => r.value);
console.log('merged', merged);
This makes use of the Promise.allSettled helper method. If your environment does not have this method, you can implement it as shown in this answer.
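If Promise.allSettled is unavailable, a minimal stand-in can be built from Promise.all by mapping each promise to one that always resolves; a sketch covering only the { status, value } / { status, reason } shape used above:
function allSettled(promises) {
  return Promise.all(promises.map(p =>
    Promise.resolve(p).then(
      value => ({ status: 'fulfilled', value }),
      reason => ({ status: 'rejected', reason })
    )
  ));
}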

Recursive Promise.all with a snapshot in firebase

I have the following structure on my firebase database:
I need to get the values of the pin keys. For that I'm working with a recursive function like this:
let pins = [];
const normalize = (snapshot) => {
  snapshot.forEach(function(child) {
    if (child.val().pin) {
      pins.push(Promise.resolve(child.val().pin));
    }
    else normalize(child);
  });
  return Promise.all(pins);
}
And now, call the normalize function:
normalize(snapshot) // snapshot represents the data from the firebase db
  .then(p => {
    console.log(p); // [ 'mi-pin-agosto', 'mi-pin-julio' ]
  })
  .catch(err => {
    // handle error
  })
And it works, but when I debug that code, I see that return Promise.all(pins); gets called more than once. I need it to be called only once, after the forEach has completely finished. This matters for performance, because the actual snapshot data is much larger than what the image shows.
Any ideas?
To only use Promise.all once, you can have the recursive function as a function "inside" normalize:
const normalize = (snapshot) => {
  const process = x => {
    let ret = [];
    x.forEach(function(child) {
      if (child.val().pin) {
        ret.push(Promise.resolve(child.val().pin));
      } else {
        ret = ret.concat(process(child));
      }
    });
    return ret;
  };
  return Promise.all(process(snapshot));
}
This code also doesn't require a global array to store the results.
However, as there is nothing asynchronous about any of the code you are calling, you can dispense with the Promises inside normalize:
const normalize = (snapshot) => {
  let ret = [];
  snapshot.forEach(function(child) {
    if (child.val().pin) {
      ret.push(child.val().pin);
    } else {
      ret = ret.concat(normalize(child));
    }
  });
  return ret;
};
If you really have to use Promises for this code, you can simply
Promise.all(normalize(snapshot))
  .then(p => {
    console.log(p); // [ 'mi-pin-agosto', 'mi-pin-julio' ]
  })
  .catch(err => {
    // handle error
  })

fetch retry request (on failure)

I'm using the browser's native fetch API for network requests, along with the whatwg-fetch polyfill for unsupported browsers.
However, I need to retry in case the request fails. Now there is this npm package whatwg-fetch-retry I found, but they haven't explained how to use it in their docs. Can somebody help me with this or suggest an alternative?
From the fetch docs:
fetch('/users')
  .then(checkStatus)
  .then(parseJSON)
  .then(function(data) {
    console.log('succeeded', data)
  }).catch(function(error) {
    console.log('request failed', error)
  })
See that catch? It will trigger when fetch fails, and you can fetch again there.
Have a look at the Promise API.
Implementation example:
function wait(delay) {
  return new Promise((resolve) => setTimeout(resolve, delay));
}

function fetchRetry(url, delay, tries, fetchOptions = {}) {
  function onError(err) {
    const triesLeft = tries - 1;
    if (!triesLeft) {
      throw err;
    }
    return wait(delay).then(() => fetchRetry(url, delay, triesLeft, fetchOptions));
  }
  return fetch(url, fetchOptions).catch(onError);
}
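For example, to attempt a request up to three times, half a second apart:
fetchRetry('/users', 500, 3)
  .then(response => console.log('succeeded', response))
  .catch(error => console.log('request failed after 3 tries', error));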
Edit 1: as suggested by golopot, p-retry is a nice option.
Edit 2: simplified example code.
I recommend using some library for promise retry, for example p-retry.
Example:
const pRetry = require('p-retry')
const fetch = require('node-fetch')

async function fetchPage () {
  const response = await fetch('https://stackoverflow.com')
  // Abort retrying if the resource doesn't exist
  if (response.status === 404) {
    throw new pRetry.AbortError(response.statusText)
  }
  return response.blob()
}

;(async () => {
  console.log(await pRetry(fetchPage, {retries: 5}))
})()
I don't like recursion unless it is really necessary, and managing an exploding number of dependencies is also an issue. Here is another alternative in TypeScript, which is easy to translate to JavaScript.
interface retryPromiseOptions<T> {
  retryCatchIf?: (response: T) => boolean,
  retryIf?: (response: T) => boolean,
  retries?: number
}

function retryPromise<T>(promise: () => Promise<T>, options: retryPromiseOptions<T>) {
  const { retryIf = (_: T) => false, retryCatchIf = (_: T) => true, retries = 1 } = options
  let _promise = promise();
  for (var i = 1; i < retries; i++)
    _promise = _promise.catch((value) => retryCatchIf(value) ? promise() : Promise.reject(value))
      .then((value) => retryIf(value) ? promise() : value);
  return _promise;
}
And use it this way...
retryPromise(() => fetch(url), {
  retryIf: (response: Response) => true, // you could check before trying again
  retries: 5
}).then( ... my favorite things ... )
I wrote this for the fetch API in the browser, which does not reject on a 500, and I did not implement a wait. But, more importantly, the code shows how to use composition with promises to avoid recursion.
Javascript version:
function retryPromise(promise, options) {
  const { retryIf, retryCatchIf, retries } = { retryIf: () => false, retryCatchIf: () => true, retries: 1, ...options };
  let _promise = promise();
  for (var i = 1; i < retries; i++)
    _promise = _promise.catch((value) => retryCatchIf(value) ? promise() : Promise.reject(value))
      .then((value) => retryIf(value) ? promise() : value);
  return _promise;
}
Javascript usage:
retryPromise(() => fetch(url), {
  retryIf: (response) => true, // you could check before trying again
  retries: 5
}).then( ... my favorite things ... )
EDITS: Added js version, added retryCatchIf, fixed the loop start.
One can easily wrap fetch(...) in a loop and catch potential errors (fetch only rejects the returned promise on network errors and the like):
const RETRY_COUNT = 5;

async function fetchRetry(...args) {
  let count = RETRY_COUNT;
  while (count > 0) {
    try {
      return await fetch(...args);
    } catch (error) {
      // logging ?
    }
    // logging / waiting?
    count -= 1;
  }
  throw new Error(`Too many retries`);
}
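If you also want a pause between attempts, the marked spot is where it goes. A sketch with exponential backoff, assuming the wait() helper from the earlier answer is in scope:
async function fetchRetryBackoff(...args) {
  for (let attempt = 0; attempt < RETRY_COUNT; attempt++) {
    try {
      return await fetch(...args);
    } catch (error) {
      // on the last attempt, give up and surface the error
      if (attempt === RETRY_COUNT - 1) throw error;
    }
    // 1s, 2s, 4s, ... between attempts
    await wait(2 ** attempt * 1000);
  }
}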

RxJS 5.0 "do while" like mechanism

I'm trying to use RxJS for a simple short poll. It needs to make a request once every delay seconds to the location path on the server, ending once one of two conditions is reached: either the callback isComplete(data) returns true or it has tried the server more than maxTries times. Here's the basic code:
newShortPoll(path, maxTries, delay, isComplete) {
  return Observable.interval(delay)
    .take(maxTries)
    .flatMap((tryNumber) => http.get(path))
    .doWhile((data) => !isComplete(data));
}
However, doWhile doesn't exist in RxJS 5.0, so the condition that it can only try the server maxTries times works, thanks to the take() call, but the isComplete condition does not. How can I make it so the observable will next() values until isComplete returns true, at which point it will next() that value and complete()?
I should note that takeWhile() does not work for me here: it does not emit the last value, which is actually the most important one, since that's how we know it's done.
Thanks!
We can create a utility function to create a second Observable that emits every item that the inner Observable emits; however, we will call the onCompleted function once our condition is met:
function takeUntilInclusive(inner$, predicate) {
  return Rx.Observable.create(observer => {
    var subscription = inner$.subscribe(item => {
      observer.onNext(item);
      if (predicate(item)) {
        observer.onCompleted();
      }
    }, observer.onError, observer.onCompleted);
    return () => {
      subscription.dispose();
    };
  });
}
And here's a quick snippet using our new utility method:
const inner$ = Rx.Observable.range(0, 4);
const data$ = takeUntilInclusive(inner$, (x) => x > 2);
data$.subscribe(x => console.log(x));
// >> 0
// >> 1
// >> 2
// >> 3
This answer is based off: RX Observable.TakeWhile checks condition BEFORE each element but I need to perform the check after
You can achieve this by using retry and first operators.
// helper observable that can return incomplete/complete data or fail.
var server = Rx.Observable.create(function (observer) {
  var x = Math.random();
  if (x < 0.1) {
    observer.next(true);
  } else if (x < 0.5) {
    observer.error("error");
  } else {
    observer.next(false);
  }
  observer.complete();
  return function () {
  };
});

function isComplete(data) {
  return data;
}

var delay = 1000;
Rx.Observable.interval(delay)
  .switchMap(() => {
    return server
      .do((data) => {
        console.log('Server returned ' + data);
      }, () => {
        console.log('Server threw');
      })
      .retry(3);
  })
  .first((data) => isComplete(data))
  .subscribe(() => {
    console.log('Got completed value');
  }, () => {
    console.log('Got error');
  });
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/5.0.1/Rx.min.js"></script>
It's an old question, but I also had to poll an endpoint and arrived at this question. Here's my own doWhile operator I ended up creating:
import { pipe, from } from 'rxjs';
import { switchMap, takeWhile, filter, map } from 'rxjs/operators';

export function doWhile<T>(shouldContinue: (a: T) => boolean) {
  return pipe(
    switchMap((data: T) => from([
      { data, continue: true },
      { data, continue: shouldContinue(data), exclude: true }
    ])),
    takeWhile(message => message.continue),
    filter(message => !message.exclude),
    map(message => message.data)
  );
}
It's a little weird, but it works for me so far. You could use it with the take like you were trying.
I was googling to find a do-while behavior and found this question. Then I found out that takeWhile takes a second inclusive boolean param (added in RxJS 6.4), so maybe you can do:
takeWhile((data) => !isComplete(data), true)
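Applied to the poll from the question, a sketch in RxJS 6 pipeable style (assuming http.get returns an observable):
import { interval } from 'rxjs';
import { take, mergeMap, takeWhile } from 'rxjs/operators';

newShortPoll(path, maxTries, delay, isComplete) {
  return interval(delay).pipe(
    take(maxTries),
    mergeMap(() => http.get(path)),
    // the second argument (inclusive: true) re-emits the value that ends
    // the poll before completing, which is the value that matters here
    takeWhile(data => !isComplete(data), true)
  );
}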
