Retrying a polling service with n seconds delay - javascript

private pollSubscriptions: Subscription;
private defaultPollTime: number = 2000;

constructor(
  private http: HttpClient,
) {
  this.pollSubscriptions = new Subscription();
}

pollRequest<T>(
  url: string,
  updateStatus: any,
  pollWhileCondition: Function,
  onPollingSuccessCallback?: Function,
  pollTime = this.defaultPollTime
) {
  this.pollSubscriptions.add(timer(0, pollTime).pipe(
    switchMap(() => this.http.get<T>(url).pipe(
      catchError((error: any) => empty()))),
    tap(updateStatus),
    takeWhile(data => pollWhileCondition(data)))
    .subscribe());
}

ngOnDestroy(): void {
  this.pollSubscriptions.unsubscribe();
}
I am able to poll multiple URLs simultaneously. But how can I enhance the current functionality to meet the following requirements:
If a polled URL fails, how can I retry that URL with a delay of 3 (n) seconds, up to 3 times?
How can I add a distinct operator on the URLs being polled?
STILL NO SOLUTION
Thanks in advance

Hopefully this helps.
A couple of things:
You'll probably want to use a shared timer for your polling if you'd like all of your polls to happen at the same time, so I've added a pollWhen property below.
You're looking for retryWhen. It's a pretty tricky operator to understand, but basically it works like this: when you subscribe, it calls the function you pass it with an observable of potential errors. When an error is emitted, you're expected to map it into a "nexted" value to retry, an immediate completion (e.g. EMPTY) to complete quietly, or an error (e.g. throwError) to kill the observable with an error.
class Component {
  /** ticks every 10 seconds */
  pollWhen = timer(0, 10000)
    .pipe(share());

  private pollSubscriptions: Subscription;
  private defaultPollTime = 2000;

  constructor(
    private http: HttpClient,
  ) {
    this.pollSubscriptions = new Subscription();
  }

  pollRequest<T>(
    url: string,
    updateStatus: any,
    pollWhileCondition: Function,
    onPollingSuccessCallback?: Function,
    pollTime = this.defaultPollTime
  ) {
    this.pollSubscriptions.add(this.pollWhen.pipe(
      switchMap(() =>
        this.http.get<T>(url).pipe(
          // Setup retries
          retryWhen(errors =>
            errors.pipe(switchMap(
              // if more than 3 retries,
              // stop retrying quietly
              (_, i) => i < 3
                ? timer(1000)
                : EMPTY
            ))
          )
        )
      ),
      tap(updateStatus),
      takeWhile(pollWhileCondition)
    ).subscribe());
  }

  ngOnDestroy(): void {
    this.pollSubscriptions.unsubscribe();
  }
}
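As a standalone sketch (separate from the answer above), here is what the retryWhen block could look like if you want the 3-second delay and 3 attempts from the question, and want the last error to surface to the subscriber instead of completing quietly. This is an assumption-level example; timer and throwError are the standard RxJS creators, and retryThreeTimesWithDelay is a hypothetical name:

import { Observable, timer, throwError } from 'rxjs';
import { retryWhen, switchMap } from 'rxjs/operators';

// Retries the source up to 3 times, waiting 3 seconds between attempts,
// then rethrows the last error once the retries are exhausted.
const retryThreeTimesWithDelay = <T>(source: Observable<T>): Observable<T> =>
  source.pipe(
    retryWhen(errors =>
      errors.pipe(
        switchMap((error, i) => (i < 3 ? timer(3000) : throwError(error)))
      )
    )
  );

// Usage inside the poll: this.http.get<T>(url).pipe(retryThreeTimesWithDelay)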

To answer your first question, you can use the retryWhen operator in place of catchError.
For the second question, the method needs a bit of a rewrite: you can push the URLs through a Subject so that distinct() stores and emits only unseen URLs into the stream for processing.
var pollUrl = new rxjs.Subject()
const updateStatus = () => true
const pollWhileCondition = () => true
const http = url => {
  console.log('http call...', url)
  return rxjs.timer(1000).pipe(
    rxjs.operators.tap(() => {
      throw "http call error"
    })
  )
}
const distinctUrl = pollUrl.pipe(rxjs.operators.distinct())
distinctUrl.pipe(
  rxjs.operators.mergeMap(url => {
    return rxjs.timer(0, 2000).pipe(rxjs.operators.map(() => url))
  }),
  rxjs.operators.tap(() => console.log('xxx')),
  rxjs.operators.mergeMap(url => http(url).pipe(
    rxjs.operators.retry(3),
  )),
  rxjs.operators.catchError(() => rxjs.empty()),
  rxjs.operators.repeat()
).subscribe(() => {}, err => {
  console.warn(err)
}, () => console.log('complete'))
pollUrl.next('http://google.com')
setTimeout(() => pollUrl.next('http://twitter.com'), 7000)
http://jsfiddle.net/cy0nbs3x/1535/

Not the complete solution, but here I am able to achieve retrying 3 times with a delay of 3 seconds. I am still looking for how to make the active poll URLs distinct. Any help is much appreciated.
import { timer as observableTimer, Subscription, interval, of, concat, Observable } from 'rxjs';
import { takeWhile, tap, take, switchMap, repeat, retryWhen, scan, mapTo, expand, exhaustMap } from 'rxjs/operators';
import { HttpClient } from '@angular/common/http';
import { Injectable } from '@angular/core';

@Injectable()
export class ReportPollingService {
  private pollSubscriptions = new Subscription();
  private pollInterval = 2000; // polling interval in ms

  constructor(private http: HttpClient) {}

  pollRequest<T>(url: string, updateStatus: any, pollWhileCondition: Function) {
    if (this.pollSubscriptions.closed) {
      this.pollSubscriptions = new Subscription(); // re-open polling
    }
    const request$ = this.http.get<T>(url);
    const firstRequest$ = request$;
    const polling$ = interval(this.pollInterval).pipe(
      take(1),
      exhaustMap(() => request$),
      repeat()
    );
    this.pollSubscriptions.add(concat(firstRequest$, polling$).pipe(
      retryWhen(errors$ => {
        return errors$.pipe(
          scan(
            ({ errorCount, error }, err) => {
              return { errorCount: errorCount + 1, error: err };
            },
            { errorCount: 0, error: null }
          ),
          switchMap(({ errorCount, error }) => {
            if (errorCount >= 3) {
              throw error;
            }
            return observableTimer(3000);
          })
        );
      }),
    ).pipe(tap(updateStatus), takeWhile(data => pollWhileCondition(data))).subscribe());
  }

  stopPolling(): void {
    this.pollSubscriptions.unsubscribe();
  }
}
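For the remaining requirement (keeping the actively polled URLs distinct), a minimal sketch is to track live polls in a Set and skip duplicates. The activeUrls field and pollRequestDistinct method below are hypothetical additions to the ReportPollingService above, not tested code:

// Hypothetical members for the ReportPollingService class above.
private activeUrls = new Set<string>();

pollRequestDistinct<T>(url: string, updateStatus: any, pollWhileCondition: Function) {
  if (this.activeUrls.has(url)) {
    return; // this URL already has a live poll, do not start a second one
  }
  this.activeUrls.add(url);
  this.pollRequest<T>(url, updateStatus, (data: T) => {
    const keepPolling = pollWhileCondition(data);
    if (!keepPolling) {
      this.activeUrls.delete(url); // free the URL once its poll completes
    }
    return keepPolling;
  });
}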

Related

Firebase Firestore: globally detect when there are pending writes

I've got a simple requirement: show a warning to the user if they leave the browser window while a pending write is happening in Firestore, using a beforeunload listener:
window.addEventListener('beforeunload', e => {
if (NO_PENDING_SAVES) return;
e.preventDefault();
e.returnValue =
'Your changes are still being saved. Are you sure you want to leave?';
}, {
capture: true,
});
In Firestore, using the Web SDK, how do I detect globally whether there are pending saves? There is a waitForPendingWrites() method on the Firestore object, but it would require polling, and it's also asynchronous, so it won't work inside beforeunload.
Solved it (the async way) using the Angular framework and the dependency "@angular/fire": "^7.4.1":
export class FirebaseService {
  constructor(private db: AngularFirestore) {
  }

  pendingWrites(): Observable<any> {
    return defer(() => this.db.firestore.waitForPendingWrites())
  }
}
And then:
export class PendingWritesService {
  hasPendingWrites$: Observable<boolean>;

  constructor(
    private firebaseService: FirebaseService
  ) {
    this.hasPendingWrites$ = this.somethingPending$();
  }

  somethingPending$(): Observable<boolean> {
    const timeEmmiter = timer(1000);
    return race(timeEmmiter, this.firebaseService.pendingWrites()).pipe(
      switchMap((pendingWrites, noPendingWrites) => {
        const arePendingsWrites = 0;
        return of(pendingWrites === arePendingsWrites);
      })
    )
  }
}

EventSource stops receiving any updates from the server

I have a strange bug involving EventSource.
I have a server that is permanently sending events to the UI via EventSource. Until a few days ago, everything was working fine.
Recently there was an update, and the server now sends new data on some channels.
The thing is, for reasons I haven't yet found, EventSource sometimes stops working.
The connection is still open, the status of the request is still pending and not closed, and there are no errors in the console at all. The server is also still streaming events to the UI, but EventSource just doesn't update anymore. I also tried to hit the endpoint directly with a curl request, but it does not get any updates either until I refresh manually.
Here is my client code:
import { Injectable, HostListener, OnDestroy } from '@angular/core';
import { environment } from 'src/environments/environment';
@Injectable({
providedIn: 'root'
})
export class SSEService implements OnDestroy {
private eventSource: EventSource;
private callbacks: Map<string, (e: any) => void> = new Map<string, (e: any) => void>();
init(channel: string) {
if (this.eventSource) { return; }
console.log('SSE Channel Start');
this.eventSource = new EventSource(`${environment.SSE_URL}?channel=${channel}`);
}
protected callListener(cb: (d) => void): (e) => void {
const callee = ({data}): void => {
try {
const d = JSON.parse(data);
cb(d.message);
} catch (e) {
console.error(e, data);
}
};
return callee;
}
private addEventToEventSrc(event: string, callback: any, owner: string): void {
console.log(`Subscribed to ⇢ ${event} (owned by: ${owner})`);
const cb = this.callListener(callback);
this.eventSource.addEventListener(event, cb);
if (!this.callbacks.get(`${event}_${owner}`)) {
this.callbacks.set(`${event}_${owner}`, cb);
}
}
subscribe(event: string, callback: any, owner?: string): void {
if (!this.eventSource) { return; }
if (!owner) { owner = 'default'; }
if (!event) { event = 'message'; }
this.addEventToEventSrc(event, callback, owner);
}
unsubscribe(event: string, owner?: string): void {
if (!this.eventSource) { return; }
if (!owner) { owner = 'default'; }
if (!event) { event = 'message'; }
if (this.callbacks.get(`${event}_${owner}`)) {
console.log(`Unsubscribed to ⇢ ${event} (owned by: ${owner})`);
this.eventSource.removeEventListener(event, this.callbacks.get(`${event}_${owner}`));
}
this.callbacks.delete(`${event}_${owner}`);
}
@HostListener('window:beforeunload')
onBrowserClose() {
this.clearAll();
}
ngOnDestroy() {
this.clearAll();
}
clearAll() {
if (this.eventSource) {
console.log('SSE Channel Closed');
this.eventSource.close();
this.eventSource = null;
}
this.callbacks = new Map<string, (e: any) => void>();
}
}
I tried to log the data received and to add try/catch, but it never shows any error; it just stops getting updates.
The data sent from the server may be the issue, but the server does not stop streaming, so my EventSource should either crash or keep going.
Right now it is silently dropping all updates.
If you could give some idea of where the cause might be, I would be very grateful.
It turned out that my client implementation of EventSource was correct.
Server side, we had one server and a dispatcher; the server was sending a lot of updates on the same channel to the dispatcher with a very short delay between them. EventSource was treating all those messages as one single big message and stopped refreshing until it considered the message ended. That never happened, so it got stuck, endlessly trying to download the "open" message.
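For reference, the browser only dispatches an SSE message once it sees a blank line terminating the event, so a server that streams many updates without that delimiter looks exactly like this: an open connection that never delivers anything. Below is a minimal, hypothetical Node-style handler that frames each event correctly (matching the { message: ... } shape the client above parses); it is a sketch, not the actual server from this question:

import { createServer } from 'http';

createServer((req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });
  setInterval(() => {
    // Each event must end with an empty line ("\n\n"); without it the
    // browser keeps buffering and never fires the listener.
    res.write('event: message\n');
    res.write(`data: ${JSON.stringify({ message: { time: Date.now() } })}\n\n`);
  }, 1000);
}).listen(8080);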

How to implement a paging solution using RxJs?

What is the best way to implement a paging solution in RxJS?
I want to have an observable that emits new data based on an event (could be a click, or a JavaScript function call). For instance, if I have an observable retrieving data from a Web API, I don't want it to just keep on hammering away with HTTP requests; I only want that to happen on an event triggered by the subscriber. Scenarios could be infinite scroll (event triggered by scrolling), classic paging (event triggered by the user clicking on the next page), etc.
This is the solution I came up with, based on the comments, using RxJS 6.3.3:
export default class Test extends React.Component<any, any> {
private subscription: Subscription;
public componentDidMount() {
const client = new MyClient();
let source = client.get('/data');
const buttonNotifier = defer(() => {
console.log("waiting");
return fromEventPattern(
(handler) => { this.next = handler; }
).pipe(tap(() =>
console.log("sending more data")
));
});
const pagedData: Observable<{Value:any, NextLink:string}> = source.pipe(
expand(({ NextLink }) => NextLink ?
buttonNotifier.pipe(take(1), concatMap(() => client.get(NextLink))) :
empty()));
this.subscription = pagedData.subscribe(
result => {
this.setState({
Value: result.Value,
});
},
error => {
this.setState({
Error: `ERROR: ${error}`
});
},
() => {
this.setState({
Done: `DONE`
});
}
);
}
public componentWillUnmount() {
if (this.subscription) {
this.subscription.unsubscribe();
}
}
private next: Function;
public render(): React.ReactElement<any> {
return (<button onClick={()=> {this.next();}}>Next</button>);
}
}

RxJS: Batch requests and share response

Let's imagine I have a function fetchUser which takes a userId parameter and returns an observable of the user.
As I am calling this method often, I want to batch the ids and perform one request with multiple ids instead!
Here my troubles began...
I can't find a solution that avoids sharing an observable between the different calls of fetchUser.
import { Subject, from } from "rxjs"
import { bufferTime, mergeMap, map, toArray, filter, take, share } from "rxjs/operators"
const functionThatSimulateAFetch = (userIds: string[]) => from(userIds).pipe(
map((userId) => ({ id: userId, name: "George" })),
toArray(),
)
const userToFetch$ = new Subject<string>()
const fetchedUser$ = userToFetch$.pipe(
bufferTime(1000),
mergeMap((userIds) => functionThatSimulateAFetch(userIds)),
share(),
)
const fetchUser = (userId: string) => {
const observable = fetchedUser$.pipe(
map((users) => users.find((user) => user.id === userId)),
filter((user) => !!user),
take(1),
)
userToFetch$.next(userId)
return observable
}
But that's ugly, and it has multiple problems:
If I unsubscribe from the observable returned by fetchUser before the bufferTime timer has finished, it doesn't prevent the fetch of the user.
If I unsubscribe from all the observables returned by fetchUser before the fetch of the batch has finished, it doesn't cancel the request.
Error handling is more complex.
etc.
More generally: I don't know how to solve problems that require sharing resources using RxJS. It's difficult to find advanced examples of RxJS.
I think @Biggy is right.
This is the way I understand the problem and what you want to achieve:
There are different places in your app where you want to fetch users.
You do not want to fire a fetch request all the time; rather, you want to buffer them and send them at a certain interval of time, let's say 1 second.
You want to be able to cancel a certain buffer and avoid firing a request to fetch a batch of users for that 1 second interval.
At the same time, if somebody, let's call it Code at Position X, has asked for a user and just a few milliseconds later somebody else, i.e. Code at Position Y, cancels the entire batch of requests, then Code at Position X has to receive some sort of answer, let's say a null.
Moreover, you may want to be able to ask to fetch a user and then change your mind, if within the interval of the buffer time, and avoid this user being fetched (I am far from sure this is really something you want, but it seems to emerge from your question).
If this is all true, then you probably have to have some sort of queuing mechanism, as Buggy suggested.
Then there may be many implementations of such a mechanism.
What you have is good, but as with everything in RxJS, the devil is in the details.
Issues
The mergeMap flattening
mergeMap((userIds) => functionThatSimulateAFetch(userIds)),
This is where you first go wrong. By using mergeMap here, you make it impossible to tell apart the "stream of requests" from the "stream returned by a single request":
You make it nearly impossible to unsubscribe from an individual request (to cancel it).
You make it impossible to handle errors.
It falls apart if your inner observable emits more than once.
Rather, what you want is to emit individual BatchEvents via a normal map (producing an observable of observables), and switchMap/mergeMap those after the filtering.
Side effects when creating an observable & emitting before subscribing
userToFetch$.next(userId)
return observable
Don't do this. An observable by itself does not actually do anything: it's a "blueprint" for a sequence of actions that happen when you subscribe to it. By emitting here, you only trigger a batch action on observable creation, and you're in trouble if you get multiple or delayed subscriptions.
Rather, you want to create an observable with defer that emits to userToFetch$ on every subscription.
Even then, you'll want to subscribe to your observable before emitting to userToFetch$: if you aren't subscribed, your observable is not listening to the subject, and the event will be lost. You can do this in a defer-like observable.
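As a minimal sketch of that idea, reusing the userToFetch$ and fetchedUser$ names from the question, the defer variant could look roughly like this (the setTimeout only exists so the inner subscription is already listening before the subject receives the id); treat it as an illustration, not the answer's final solution:

import { defer } from 'rxjs';
import { map, filter, take } from 'rxjs/operators';

// Sketch only: defer runs this factory on every subscription, so each
// subscriber triggers its own batch entry instead of one at creation time.
const fetchUser = (userId: string) =>
  defer(() => {
    // Emit on the next tick, after the returned observable has been subscribed.
    setTimeout(() => userToFetch$.next(userId), 0);
    return fetchedUser$.pipe(
      map((users) => users.find((user) => user.id === userId)),
      filter((user) => !!user),
      take(1),
    );
  });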
Solution
Short, and not very different from your code, but structure it like this.
const BUFFER_TIME = 1000;

type BatchEvent = { keys: Set<string>, values: Observable<Users> };

/** The incoming keys */
const keySubject = new Subject<string>();

const requests: Observable<BatchEvent> =
  keySubject.asObservable().pipe(
    bufferTime(BUFFER_TIME),
    map(keys => fetchBatch(keys)),
    share(),
  );

/** Returns a single User from an ID. Batches the request */
function get(userId: string): Observable<User> {
  console.log("Creating observable for:", userId);
  // The money observable. See "defer":
  // triggers a new subject event on subscription
  const observable = new Observable<BatchEvent>(observer => {
    requests.subscribe(observer);
    // Emit *after* the subscription
    keySubject.next(userId);
  });
  return observable.pipe(
    first(v => v.keys.has(userId)),
    // There is only 1 item, so any *Map will do here
    switchMap(v => v.values),
    map(v => v[userId]),
  );
}

function fetchBatch(args: string[]): BatchEvent {
  const keys = new Set(args); // Do not batch duplicates
  // userService is whatever service performs the actual batched request
  const values = userService.get(Array.from(keys)).pipe(
    share(),
  );
  return { keys, values };
}
This does exactly what you were asking, including:
Errors are propagated to the recipients of the batch call, but nobody else
If everybody unsubscribes from a batch, the observable is canceled
If everybody unsubscribes from a batch before the request is even fired, it never fires
The observable behaves like HttpClient: subscribing to the observable fires a new (batched) request for data. Callers are free to pipe shareReplay or whatever though. So no surprises there.
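For example, a caller that wants to cache the result itself could, hypothetically, do:

import { shareReplay } from 'rxjs/operators';

// Hypothetical caller-side caching on top of the batched get() above.
const user$ = get('42').pipe(shareReplay(1));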
Here is a working stackblitz Angular demo: https://stackblitz.com/edit/angular-rxjs-batch-request
In particular, notice the behavior when you "toggle" the display: You’ll notice that re-subscribing to existing observables will fire new batch requests, and that those requests will cancel (or outright not fire) if you re-toggle fast enough.
Use case
In our project, we use this for Angular Tables, where each row needs to individually fetch additional data to render. This allows us to:
chunk all the requests for a "single page", without needing any special knowledge of pagination
Potentially fetch multiple pages at once if the user paginates fast
re-use existing results even if page size changes
Limitations
I would not add chunking or rate limiting to this. Because the source observable is a dumb bufferTime, you run into issues:
The "chunking" would happen before the deduping, so if you have 100 requests for a single userId, you would end up firing several requests with only 1 element each.
If you rate limit, you will not be able to inspect your queue, so you may end up with a very long queue containing multiple identical requests.
This is a pessimistic point of view, though. Fixing it would mean going all out with a stateful queue/batch mechanism, which is an order of magnitude more complex.
I'm not sure if this is the best way to solve this problem (at least it needs tests), but I will try to explain my point of view.
We have two queues: one for pending and one for future requests.
A result store helps deliver the response or error to subscribers.
Some kind of worker, driven by a schedule, takes a task from the queue and performs the request.
If I unsubscribe from the observable returned by fetchUser before the timer of bufferTime is finished, it doesn't prevent the fetch of the user.
Unsubscribing from fetchUser will clean up the request queue and the worker will do nothing.
If I unsubscribe from all the observables returned by fetchUser before the fetch of the batch is finished, it doesn't cancel the request.
The worker subscribes until isNothingRemain$ emits.
const functionThatSimulateAFetch = (userIds: string[]) => from(userIds).pipe(
map((userId) => ({ id: userId, name: "George" })),
toArray(),
tap(() => console.log('API_CALL', userIds)),
delay(200),
)
class Queue {
queue$ = new BehaviorSubject(new Map());
private get currentQueue() {
return new Map(this.queue$.getValue());
}
add(...ids) {
const newMap = ids.reduce((acc, id) => {
acc.set(id, (acc.get(id) || 0) + 1);
return acc;
}, this.currentQueue);
this.queue$.next(newMap);
};
addMap(idmap: Map<any, any>) {
const newMap = (Array.from(idmap.keys()))
.reduce((acc, id) => {
acc.set(id, (acc.get(id) || 0) + idmap.get(id));
return acc;
}, this.currentQueue);
this.queue$.next(newMap);
}
remove(...ids) {
const newMap = ids.reduce((acc, id) => {
acc.get(id) > 1 ? acc.set(id, acc.get(id) - 1) : acc.delete(id);
return acc;
}, this.currentQueue)
this.queue$.next(newMap);
};
removeMap(idmap: Map<any, any>) {
const newMap = (Array.from(idmap.keys()))
.reduce((acc, id) => {
acc.get(id) > idmap.get(id) ? acc.set(id, acc.get(id) - idmap.get(id)) : acc.delete(id);
return acc;
}, this.currentQueue)
this.queue$.next(newMap);
};
has(id) {
return this.queue$.getValue().has(id);
}
asObservable() {
return this.queue$.asObservable();
}
}
class Result {
result$ = new BehaviorSubject({ ids: new Map(), isError: null, value: null });
select(id) {
return this.result$.pipe(
filter(({ ids }) => ids.has(id)),
switchMap(({ isError, value }) => isError ? throwError(value) : of(value.find(x => x.id === id)))
)
}
add({ isError, value, ids }) {
this.result$.next({ ids, isError, value });
}
clear(){
this.result$.next({ ids: new Map(), isError: null, value: null });
}
}
const result = new Result();
const queueToSend = new Queue();
const queuePending = new Queue();
const doRequest = new Subject();
const fetchUser = (id: string) => {
return Observable.create(observer => {
queueToSend.add(id);
doRequest.next();
const subscription = result
.select(id)
.pipe(take(1))
.subscribe(observer);
// cleanup queue after got response or unsubscribe
return () => {
(queueToSend.has(id) ? queueToSend : queuePending).remove(id);
subscription.unsubscribe();
}
})
}
// some kind of worker that take task from queue and send requests
doRequest.asObservable().pipe(
auditTime(1000),
// clear outdated results
tap(()=>result.clear()),
withLatestFrom(queueToSend.asObservable()),
map(([_, queue]) => queue),
filter(ids => !!ids.size),
mergeMap(ids => {
// abort the request if it have no subscribers
const isNothingRemain$ = combineLatest(queueToSend.asObservable(), queuePending.asObservable()).pipe(
map(([queueToSendIds, queuePendingIds]) => Array.from(ids.keys()).some(k => queueToSendIds.has(k) || queuePendingIds.has(k))),
filter(hasSameKey => !hasSameKey)
)
// prevent to request the same ids if previous requst is not complete
queueToSend.removeMap(ids);
queuePending.addMap(ids);
return functionThatSimulateAFetch(Array.from(ids.keys())).pipe(
map(res => ({ isError: false, value: res, ids })),
takeUntil(isNothingRemain$),
catchError(error => of({ isError: true, value: error, ids }))
)
}),
).subscribe(res => result.add(res))
fetchUser('1').subscribe(console.log);
const subs = fetchUser('2').subscribe(console.log);
subs.unsubscribe();
fetchUser('3').subscribe(console.log);
setTimeout(() => {
const subs1 = fetchUser('10').subscribe(console.log);
subs1.unsubscribe();
const subs2 = fetchUser('11').subscribe(console.log);
subs2.unsubscribe();
}, 2000)
setTimeout(() => {
const subs1 = fetchUser('20').subscribe(console.log);
subs1.unsubscribe();
const subs21 = fetchUser('20').subscribe(console.log);
const subs22 = fetchUser('20').subscribe(console.log);
}, 4000)
// API_CALL
// ["1", "3"]
// {id: "1", name: "George"}
// {id: "3", name: "George"}
// API_CALL
// ["20"]
// {id: "20", name: "George"}
// {id: "20", name: "George"}
stackblitz example
FYI, I tried to create a generic batched task queue using the answers of @buggy & @picci:
import { Observable, Subject, BehaviorSubject, from, timer } from "rxjs"
import { catchError, share, mergeMap, map, filter, takeUntil, take, bufferTime, timeout, concatMap } from "rxjs/operators"
export interface Task<TInput> {
uid: number
input: TInput
}
interface ErroredTask<TInput> extends Task<TInput> {
error: any
}
interface SucceededTask<TInput, TOutput> extends Task<TInput> {
output: TOutput
}
export type FinishedTask<TInput, TOutput> = ErroredTask<TInput> | SucceededTask<TInput, TOutput>
const taskErrored = <TInput, TOutput>(
taskFinished: FinishedTask<TInput, TOutput>,
): taskFinished is ErroredTask<TInput> => !!(taskFinished as ErroredTask<TInput>).error
type BatchedWorker<TInput, TOutput> = (tasks: Array<Task<TInput>>) => Observable<FinishedTask<TInput, TOutput>>
export const createSimpleBatchedWorker = <TInput, TOutput>(
work: (inputs: TInput[]) => Observable<TOutput[]>,
workTimeout: number,
): BatchedWorker<TInput, TOutput> => (
tasks: Array<Task<TInput>>,
) => work(
tasks.map((task) => task.input),
).pipe(
mergeMap((outputs) => from(tasks.map((task, index) => ({
...task,
output: outputs[index],
})))),
timeout(workTimeout),
catchError((error) => from(tasks.map((task) => ({
...task,
error,
})))),
)
export const createBatchedTaskQueue = <TInput, TOutput>(
worker: BatchedWorker<TInput, TOutput>,
concurrencyLimit: number = 1,
batchTimeout: number = 0,
maxBatchSize: number = Number.POSITIVE_INFINITY,
) => {
const taskSubject = new Subject<Task<TInput>>()
const cancelTaskSubject = new BehaviorSubject<Set<number>>(new Set())
const cancelTask = (task: Task<TInput>) => {
const cancelledUids = cancelTaskSubject.getValue()
const newCancelledUids = new Set(cancelledUids)
newCancelledUids.add(task.uid)
cancelTaskSubject.next(newCancelledUids)
}
const output$: Observable<FinishedTask<TInput, TOutput>> = taskSubject.pipe(
bufferTime(batchTimeout, undefined, maxBatchSize),
map((tasks) => {
const cancelledUids = cancelTaskSubject.getValue()
return tasks.filter((task) => !cancelledUids.has(task.uid))
}),
filter((tasks) => tasks.length > 0),
mergeMap(
(tasks) => worker(tasks).pipe(
takeUntil(cancelTaskSubject.pipe(
filter((uids) => {
for (const task of tasks) {
if (!uids.has(task.uid)) {
return false
}
}
return true
}),
)),
),
undefined,
concurrencyLimit,
),
share(),
)
let nextUid = 0
return (input$: Observable<TInput>): Observable<TOutput> => input$.pipe(
concatMap((input) => new Observable<TOutput>((observer) => {
const task = {
uid: nextUid++,
input,
}
const subscription = output$.pipe(
filter((taskFinished) => taskFinished.uid === task.uid),
take(1),
map((taskFinished) => {
if (taskErrored(taskFinished)) {
throw taskFinished.error
}
return taskFinished.output
}),
).subscribe(observer)
subscription.add(
timer(0).subscribe(() => taskSubject.next(task)),
)
return () => {
subscription.unsubscribe()
cancelTask(task)
}
})),
)
}
With our example:
import { from, of } from "rxjs"
import { map, toArray } from "rxjs/operators"
import { createBatchedTaskQueue, createSimpleBatchedWorker } from "mmr/components/rxjs/batched-task-queue"
const functionThatSimulateAFetch = (userIds: string[]) => from(userIds).pipe(
map((userId) => ({ id: userId, name: "George" })),
toArray(),
)
const userFetchQueue = createBatchedTaskQueue(
createSimpleBatchedWorker(
functionThatSimulateAFetch,
10000,
),
)
const fetchUser = (userId: string) => {
return of(userId).pipe(
userFetchQueue,
)
}
I am open to any improvement suggestions

How to add a delay to an HTTP request using Angular? [duplicate]

I understand that Observable.debounce() can be used to process rapid-fire form input. As an HTTP GET also returns an Observable, I wonder if it is possible to debounce rapid HTTP requests. I tried debounceTime() but it did not appear to do anything.
public getStuff(p1, area:string, p2:string): Observable<number> {
return this.http.get(some_url)
.map(r => r.json())
.debounceTime(10000)
.catch(this.handleError);
};
debounceTime waits for a pause in the events and only handles the last one once that amount of time has passed without another emission.
It's useful in the context of inputs, but it should be applied to the observable that triggers the event, not to the one created for the HTTP request.
Here is an example of a control associated with an input that leverages the debounceTime operator:
@Component({
(...)
template: `
<input [ngFormControl]="ctrl"/>
`
})
export class MyComponent {
constructor() {
this.ctrl = new Control();
this.ctrl.valueChanges
.debounceTime(500)
.distinctUntilChanged()
.switchMap((value: string) => {
// Get data according to the filled value
return this.service.getData(entry);
})
.subscribe(data => {
// Update the linked list
this.list = data;
});
}
}
This article could also interest you:
https://jaxenter.com/reactive-programming-http-and-angular-2-124560.html (see section "Linking with user events")
Following the micronyks's comment, here is an additional link:
Everything is a stream: http://slides.com/robwormald/everything-is-a-stream (youtube: https://www.youtube.com/watch?v=UHI0AzD_WfY)
You have to transform the Subject into an HTTP observable with switchMap, like this:
observableObj$: Observable<any>;
subjectObj = new Subject();
ngOnInit() {
this.observableObj$ = this.subjectObj.pipe(
debounceTime(1000),
switchMap(() => {
...
return this.http.get(some_url).map(r => r.json());
}),
);
this.observableObj$.subscribe((data) => {
// result of http get...
...
});
}
getStuff() {
this.subjectObj.next();
}
In Angular 7:
import { Observable, of, timer } from 'rxjs';
import { catchError, retry, map, debounce } from 'rxjs/operators';
public getStuff(p1, area:string, p2:string): Observable<number> {
return this.http.get(some_url)
.pipe(
debounce(() => timer(10000)),
catchError(this.handleError)
);
};
