How does mergeAll work? - javascript

I am trying to figure out how mergeAll works and created some examples:
const clicks = Rx.Observable.interval(4000).map(()=> "first");
const higherOrder = clicks.map((ev) => Rx.Observable.interval(1000).map(() => "inner").take(10));
const firstOrder = higherOrder.mergeAll();
firstOrder.subscribe(x => console.log(x));
The output is always "inner" and "first" is never output. After calling mergeAll(), is the clicks observable no longer relevant?
One more example:
const input = document.getElementById("window");
const clicks = Rx.Observable.fromEvent(input, 'keyup').map(() => "Hello");
const interval = Rx.Observable.interval(4000);
const result = clicks.window(interval)
  .map(win => {
    return win.take(1);
  })
  .mergeAll(); // flatten the Observable-of-Observables
result.subscribe(x => console.log("Result " + x));
On subscribe, I get the result "Result Hello" from the outer observable, not the inner observable. What role does mergeAll play in this case?
Why is the win variable an Observable instance and not "Hello"?

After calling mergeAll(), is the clicks observable no longer relevant?
Correct. You map each individual click to a stream of "inner" events. mergeAll simply merges those streams together. The click event lives on in the resulting stream only faintly, as the point in time where a specific merged stream starts. It becomes a bit clearer this way:
const clicks$ = Rx.Observable.interval(1000);
const higherOrder$ = clicks$.map(click => Rx.Observable.interval(500)
  .map(counter => `${click}–${counter}`)
);
higherOrder$.mergeAll().subscribe(console.log);
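The timing semantics can be sketched in plain JavaScript, without RxJS (simulateMergeAll is a hypothetical helper written for this answer, not a real operator):

```javascript
// Toy model of the mergeAll timing described above. Each outer emission at
// time t starts an inner stream whose values are stamped relative to t;
// merging just means "one output stream, ordered by emission time".
function simulateMergeAll(outerPeriod, outerCount, innerPeriod, innerCount) {
  const events = [];
  for (let click = 0; click < outerCount; click++) {
    const start = (click + 1) * outerPeriod; // inner stream starts when the click fires
    for (let counter = 0; counter < innerCount; counter++) {
      events.push({ time: start + (counter + 1) * innerPeriod, value: `${click}-${counter}` });
    }
  }
  return events.sort((a, b) => a.time - b.time).map(e => e.value);
}

console.log(simulateMergeAll(1000, 2, 500, 3));
// → ['0-0', '0-1', '0-2', '1-0', '1-1', '1-2']
```

Note how the click index survives only inside the labels we attached; the output stream itself contains nothing but inner values.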
The documentation and its marble diagram might also help you understand.

You could also use switchMap. Note that its projection function must return an Observable, so pass the inner Observable through rather than calling console.log inside it:
const firstOrder = higherOrder.switchMap(inner => inner);
firstOrder.subscribe(value => console.log(value));
or, for the second example:
const result = clicks.window(interval).switchMap(win => win.take(1));
result.subscribe(value => console.log("Result " + value));
switch expects a stream of Observables; when it gets an Observable pushed onto its input stream, it unsubscribes from any previous Observable, subscribes to the new one, and then emits any values from that Observable onto its output stream.
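The unsubscribe-on-new-inner behavior can be sketched the same way in plain JavaScript (simulateSwitch is a hypothetical helper, not real RxJS):

```javascript
// Toy model of switch: each new inner stream cuts off the previous one,
// so emissions scheduled after the next inner arrives are dropped.
function simulateSwitch(innerStarts, innerPeriod, innerCount) {
  const out = [];
  innerStarts.forEach((start, i) => {
    const cutoff = innerStarts[i + 1] ?? Infinity; // next inner unsubscribes this one
    for (let n = 0; n < innerCount; n++) {
      const t = start + (n + 1) * innerPeriod;
      if (t < cutoff) out.push({ time: t, value: `${i}-${n}` });
    }
  });
  return out.sort((a, b) => a.time - b.time).map(e => e.value);
}

console.log(simulateSwitch([0, 1200], 500, 5));
// → ['0-0', '0-1', '1-0', '1-1', '1-2', '1-3', '1-4']
```

The first inner stream only gets two values out before the second one starts and replaces it; with mergeAll, all ten values would appear interleaved.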

Related

Add streams dynamically to combined stream (eg forkJoin)

My function (let's call it myFunction) is getting an array of streams (myFunction(streams: Observable<number>[])). Each of those streams produces values from 1 to 100, which act as a progress indicator. When it hits 100 it is done and completed. Now, when all of those observables are done, I want to emit a value. I could do it this way:
public myFunction(streams: Observable<number>[]) {
  forkJoin(streams).subscribe(_values => this.done$.emit());
}
This works fine, but imagine the following case:
myFunction gets called with 2 streams
one of those streams is done, second one is still progressing
myFunction gets called (again) with 3 more streams (2nd one from previous call is still progressing)
I'd like to somehow add those new streams from 3rd bullet to the "queue", which would result in having 5 streams in forkJoin (1 completed, 4 progressing).
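As a point of reference, forkJoin behaves much like Promise.all: it waits for every source to complete, then emits their last values once. A plain-JS analogy (fakeProgressStream is a hypothetical helper standing in for an upload's progress stream):

```javascript
// Promise.all analogy for forkJoin (a sketch, not RxJS): it fires once,
// after every source completes, with each source's final value.
function fakeProgressStream(finalValue, ms) {
  return new Promise(resolve => setTimeout(() => resolve(finalValue), ms));
}

Promise.all([fakeProgressStream(100, 10), fakeProgressStream(100, 30)])
  .then(lastValues => console.log('all done', lastValues)); // all done [ 100, 100 ]
```

This is also why adding new streams later is awkward: the set of sources is fixed at the moment forkJoin (or Promise.all) is called.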
I've tried multiple approaches but can't get it working anyhow... My latest approach was this:
private currentProgressObs: Observable<any> | null = null;
private currentProgressSub: Subscription | null = null;
public myFunction(progressStreams: Observable<number>[]) {
  const isUploading = this.currentProgressSub && !this.currentProgressSub.closed;
  const currentConcatObs = this.currentProgressObs?.pipe(concatAll());
  const currentStream = isUploading && this.currentProgressObs ? this.currentProgressObs : of([100]);

  if (this.currentProgressSub) {
    this.currentProgressSub.unsubscribe();
    this.currentProgressSub = null;
  }

  this.currentProgressObs = forkJoin([currentStream, ...progressStreams]);
  this.currentProgressSub = this.currentProgressObs.subscribe(
    _lastProgresses => {
      this._isUploading$.next(false); // <----- this is the event I want to emit when all progress is completed
      this.currentProgressSub?.unsubscribe();
      this.currentProgressSub = null;
      this.currentProgressObs = null;
    },
  );
}
The above code only works the first time. A second call to myFunction will never emit the event.
I also tried other ways. I've tried recursion with one global stream array, to which I can add streams while the subscription is still active, but... I failed. How can I achieve this? Which operators, and in what order, should I use? Why will or won't it work?
Here is my suggestion for your issue.
We will have two subjects: one to count the number of requests being processed (requestsInProgress) and one to manage the requests being processed (requestMerger).
So whenever we want to add a new request, we pass it to the requestMerger Subject.
Whenever we receive a new request in the requestMerger stream, we first increment the requestsInProgress counter and then merge the request itself into the source observable. While merging the new request/observable into the source, we also attach the finalize operator in order to track when the request has completed (reached 100); when we hit the completion criterion, we decrement the request counter with the decrementCounter function.
In order to emit a result, e.g. to notify someone else in the app about the state of the pending requests, we can subscribe to the requestsInProgress Subject.
You can test it out either here or in this stackBlitz
let { interval, Subject, BehaviorSubject } = rxjs;
let { mergeMap, map, takeWhile, finalize, first, distinctUntilChanged } = rxjs.operators;

// Imagine the next lines as a service
// Subject responsible for managing streams
let requestMerger = new Subject();
// Subject responsible for tracking streams in progress
let requestsInProgress = new BehaviorSubject(0);

function incrementCounter() {
  requestsInProgress.pipe(first()).subscribe(x => {
    requestsInProgress.next(x + 1);
  });
}

function decrementCounter() {
  requestsInProgress.pipe(first()).subscribe(x => {
    requestsInProgress.next(x - 1);
  });
}

// Adds a request to the requests being processed
function addRequest(req) {
  // takeWhile is used to complete the request when we have `value === 100`. If you are dealing with http requests, `takeWhile` might be redundant, because http requests complete by themselves (i.e. the finalize callback of the stream will be called even without the `takeWhile`, which will decrement the requestsInProgress counter)
  requestMerger.next(req.pipe(takeWhile(x => x < 100)));
}

// By subscribing to this stream you can determine whether all requests are processed or some are still pending
requestsInProgress
  .pipe(
    map(x => (x === 0 ? "Loaded" : "Loading")),
    distinctUntilChanged()
  )
  .subscribe(x => {
    console.log(x);
    document.getElementById("loadingState").innerHTML = x;
  });

// This Subject takes care of the requests that are in progress
requestMerger
  .pipe(
    mergeMap(x => {
      // when a new request is added (received from the requestMerger Subject), increment the requests-in-progress counter
      incrementCounter();
      return x.pipe(
        finalize(() => {
          // when a request has completed, decrement the requests-in-progress counter
          decrementCounter();
        })
      );
    })
  )
  .subscribe(x => {
    console.log(x);
  });
// End of fictional service

// Button that adds a request to be processed
document.getElementById("add-stream").addEventListener("click", () => {
  addRequest(interval(1000).pipe(map(x => x * 25)));
});
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/6.6.6/rxjs.umd.min.js"></script>
<div style="display:flex">
<button id="add-stream">Add stream</button>
<h5>Loading State: <span id="loadingState">false</span> </h5>
</div>
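Stripped of RxJS, the counter logic above amounts to: increment on arrival, decrement on completion, and report Loading/Loaded whenever the zero/non-zero state changes. A plain-JS sketch (makeRequestTracker is a hypothetical helper, not part of any library):

```javascript
// Toy model of the requestsInProgress + distinctUntilChanged pipeline above.
function makeRequestTracker(onStateChange) {
  let inProgress = 0;
  let lastState;
  const report = () => {
    const state = inProgress === 0 ? 'Loaded' : 'Loading';
    if (state !== lastState) { lastState = state; onStateChange(state); } // distinctUntilChanged
  };
  report(); // BehaviorSubject emits its initial value immediately
  return {
    start() { inProgress++; report(); },   // incrementCounter
    finish() { inProgress--; report(); },  // decrementCounter (finalize)
  };
}

const states = [];
const tracker = makeRequestTracker(s => states.push(s));
tracker.start(); tracker.start(); tracker.finish(); tracker.finish();
console.log(states); // ['Loaded', 'Loading', 'Loaded']
```

Overlapping requests collapse into a single Loading phase, which is exactly what the distinctUntilChanged in the stream version achieves.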
Your problem is that each time you call your function, you are creating a new observable. Your life would be much easier if all calls of your function pushed all upload jobs through the same stream.
You can achieve this using a Subject.
I would suggest you push single "Upload Jobs" through a simple subject and design an observable that emits the state of all upload jobs whenever anything changes: a simple class that offers a createJob() method to submit jobs, and a jobs$ observable to reference the state:
class UploadService {
  private jobs = new Subject<UploadJob>();

  public jobs$ = this.jobs.pipe(
    mergeMap(job => this.processJob(job)),
    scan((collection, job) => collection.set(job.id, job), new Map<string, UploadJob>()),
    map(jobsMap => Array.from(jobsMap.values()))
  );

  constructor() {
    this.jobs$.subscribe();
  }

  public createJob(id: string) {
    this.jobs.next({ id, progress: 0 });
  }

  private processJob(job: UploadJob) {
    // do work and return an observable that
    // emits the updated status of the UploadJob
  }
}
Let's break it down:
jobs is a simple subject, that we can push "jobs" through
createJob simply calls jobs.next() to push the new job through the stream
jobs$ is where all the magic happens. It receives each UploadJob and uses:
mergeMap to execute whatever function actually does the work (I called it processJob() for this example) and emits its values into the stream
scan is used to accumulate these UploadJob emissions into a Map (for ease of inserting or updating)
map is used to convert the map into an array (Map<string, UploadJob> => UploadJob[])
this.jobs$.subscribe() is called in the constructor of the class so that jobs will be processed
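The scan-into-a-Map step can be sketched without RxJS as a plain reduce over an array of emissions (the job objects here are made up for illustration; in the real stream they arrive over time):

```javascript
// Plain-JS model of the scan + map steps: later emissions for the same job id
// overwrite the earlier entry, and the Map keeps insertion order.
const emissions = [
  { id: 'a', progress: 0 },
  { id: 'b', progress: 0 },
  { id: 'a', progress: 100 }, // a later emission for job 'a' replaces its entry
];

const jobsMap = emissions.reduce(
  (collection, job) => collection.set(job.id, job),
  new Map()
);
const jobs = Array.from(jobsMap.values());

console.log(jobs); // [ { id: 'a', progress: 100 }, { id: 'b', progress: 0 } ]
```

This is why a Map is convenient here: inserting and updating are the same operation, and converting back to an array preserves the order jobs were first seen.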
Now, we can easily derive your isUploading and cumulativeProgress from this jobs$ observable like so:
public isUploading$ = this.jobs$.pipe(
  map(jobs => jobs.some(j => j.progress !== 100)),
  distinctUntilChanged()
);

public progress$ = this.jobs$.pipe(
  map(jobs => {
    const current = jobs.reduce((sum, j) => sum + j.progress, 0) / 100;
    const total = jobs.length || 1; // guard against division by zero
    return current / total;
  })
);
Here's a working StackBlitz demo.

RxJS withLatestFrom does not emit a value if the target has a share operator

const a$ = new BehaviorSubject(['a']).pipe(
  // no op.share() here
);
a$.subscribe((a) => {
console.log('a$', a)
})
const b$ = new BehaviorSubject([1]);
const c$ = b$.pipe(
op.withLatestFrom(a$)
)
c$.subscribe(c => {
console.log('c$ test', c) // you can find logs in console
})
But if I add a share to a$, the subscription of c$ fails to run:
const a$ = new BehaviorSubject(['a']).pipe(
op.share()
);
a$.subscribe((a) => {
console.log('a$', a)
})
const b$ = new BehaviorSubject([1]);
const c$ = b$.pipe(
op.withLatestFrom(a$)
)
c$.subscribe(c => {
console.log('c$ test', c) // logs here cannot be found in console
})
I cannot understand why this happens, and I do need a$ to be shared. Is there any solution for this?
share doesn't replay values to late subscribers, so you lose the replay functionality of your BehaviorSubject. The first subscription to a$ triggers a subscribe to the BehaviorSubject, and share forwards its value. When you subscribe to a$ a second time, share has already subscribed to the BehaviorSubject, so it won't get its value again and, as share has no replay functionality, it won't replay the value.
If your source is a BehaviorSubject and you don't use any other operators, you don't need to share it.
If your source is actually something else, you can use shareReplay(1) instead of share to replay its latest value to late subscribers.
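The difference can be sketched with a toy subject (this is not real RxJS internals, just the replay behavior in miniature; makeShared is a hypothetical helper):

```javascript
// share forwards values only as they arrive; shareReplay(1) also caches the
// latest value and hands it to late subscribers.
function makeShared({ replayLast }) {
  const subscribers = [];
  let last, hasValue = false;
  return {
    next(v) { last = v; hasValue = true; subscribers.forEach(fn => fn(v)); },
    subscribe(fn) {
      subscribers.push(fn);
      if (replayLast && hasValue) fn(last); // only the replaying variant does this
    },
  };
}

const viaShare = [];
const shared = makeShared({ replayLast: false });
shared.next('a');                         // value emitted before anyone was listening
shared.subscribe(v => viaShare.push(v));
console.log(viaShare);                    // [] -- the late subscriber missed it

const viaShareReplay = [];
const sharedReplay = makeShared({ replayLast: true });
sharedReplay.next('a');
sharedReplay.subscribe(v => viaShareReplay.push(v));
console.log(viaShareReplay);              // ['a'] -- the cached value is replayed
```

withLatestFrom never fires in the question's second snippet for exactly the first reason: its subscription to a$ arrives "late" and nothing is replayed to it.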

Delaying all items except specific one

Let's say I have a stream of actions. They're either Prompts, Responses (to prompts) or Effects. They come at irregular intervals, but assume a 1-second delay between each one.
On every PROMPT action I want to emit that action and a BEGIN action (let's say we want to show the message to the user for N seconds). All other items should be delayed by N seconds, after which the END action fires (hiding the message) and everything continues.
This is my code for it (for https://rxviz.com/):
const { interval, from, zip, timer } = Rx;
const { concatMap, delayWhen } = RxOperators;
const PROMPT = 'P';
const RESPONSE = 'R';
const EFFECT = 'E';
const BEGIN = '^';
const END = '&';
const convertAction = action => (action === PROMPT) ? [PROMPT, BEGIN, END] : [action];
// Just actions coming at regular intervals
const action$ = zip(
from([PROMPT, RESPONSE, EFFECT, PROMPT, RESPONSE, EFFECT, EFFECT, EFFECT]),
interval(1000),
(a, b) => a,
);
action$.pipe(
concatMap(action =>
from(convertAction(action)).pipe(
delayWhen(action => (action == END) ? timer(5000) : timer(0)),
),
),
);
What I really want is for the first RESPONSE action after a PROMPT not to be affected by the delay. If it comes before the END action, it should be shown right away. So, instead of
P^ &REP^ &REEE
I want to receive
P^ R &EP^R &EEE
How can I achieve it while keeping each RESPONSE after their corresponding PROMPT? Assume no events can come between PROMPT and RESPONSE.
If I understand it right, this is a very interesting problem to address with Observable streams. This is the way I would attack it.
First I would store in a constant actionDelayed$ the result of your original logic, i.e. a stream where we have introduced, after each PROMPT, the BEGIN and END actions separated by a delay.
const actionDelayed$ = action$.pipe(
concatMap(action =>
from(convertAction(action)).pipe(
delayWhen(action => (action == END) ? timer(5000) : timer(0)),
),
),
);
Then I would create 2 separate streams, response$ and promptDelayed$, containing only the RESPONSE actions before the delay was introduced and the PROMPT actions after the delay was introduced, like this
const response$ = action$.pipe(
filter(a => a == RESPONSE)
)
const promptDelayed$ = actionDelayed$.pipe(
filter(a => a == PROMPT)
)
With these 2 streams, I can create another stream of RESPONSE actions emitted just after the delayed PROMPT actions are emitted, like this
const responseN1AfterPromptN$ = zip(response$, promptDelayed$).pipe(
map(([r, p]) => r)
)
At this point I just have to remove all RESPONSE actions from actionDelayed$, like this
const actionNoResponseDelayed$ = actionDelayed$.pipe(
filter(a => a != RESPONSE)
)
and merge actionNoResponseDelayed$ with responseN1AfterPromptN$ to get the final stream.
The entire code, to be tried with rxviz, is this
const { interval, from, zip, timer, merge } = Rx;
const { concatMap, delayWhen, share, filter, map } = RxOperators;
const PROMPT = 'P';
const RESPONSE = 'R';
const EFFECT = 'E';
const BEGIN = '^';
const END = '&';
const convertAction = action => (action === PROMPT) ? [PROMPT, BEGIN, END] : [action];
// Just actions coming at regular intervals
const action$ = zip(
from([PROMPT, RESPONSE, EFFECT, PROMPT, RESPONSE, EFFECT, EFFECT, EFFECT]),
interval(1000),
(a, b) => a,
).pipe(share());
const actionDelayed$ = action$.pipe(
concatMap(action =>
from(convertAction(action)).pipe(
delayWhen(action => (action == END) ? timer(5000) : timer(0)),
),
),
share()
);
const response$ = action$.pipe(
filter(a => a == RESPONSE)
)
const promptDelayed$ = actionDelayed$.pipe(
filter(a => a == PROMPT)
)
const responseN1AfterPromptN$ = zip(response$, promptDelayed$).pipe(
map(([r, p]) => r)
)
const actionNoResponseDelayed$ = actionDelayed$.pipe(
filter(a => a != RESPONSE)
)
merge(actionNoResponseDelayed$, responseN1AfterPromptN$)
The use of the share operator while creating the action$ and actionDelayed$ streams allows us to avoid repeated subscriptions to these streams when creating the subsequent streams used in the solution.
It may not work this way because you're using concatMap. As you know, it waits for the inner observable to complete before starting to process (subscribe to) the pending ones. It internally uses a buffer: if an inner observable is still active (did not complete), the emitted value is added to that buffer. When the inner observable becomes inactive, the oldest value from the buffer is taken and a new inner observable is created from it, based on the provided callback function.
There is also delayWhen, which emits a complete notification after all of its pending observables complete:
// called when an inner observable sends a `next`/`complete` notification
const notify = () => {
// Notify the consumer.
subscriber.next(value);
// Ensure our inner subscription is cleaned up
// as soon as possible. Once the first `next` fires,
// we have no more use for this subscription.
durationSubscriber?.unsubscribe();
if (!closed) {
active--;
closed = true;
checkComplete();
}
};
checkComplete() will check if there is a need to send a complete notification to the main stream:
const checkComplete = () => isComplete && !active && subscriber.complete();
We've seen that active decreases in notify(). isComplete becomes true when the main source completes:
// this is the `complete` callback
() => {
isComplete = true;
checkComplete();
}
So, this is why it does not work this way:
the PROMPT action is used to create concatMap's first inner observable
the observable emits 3 consecutive actions [PROMPT, BEGIN, END]
the first 2 get timer(0), whereas the third one, END, gets timer(5000); notice that by this time, before even the PROMPT action got emitted (its timer(0) is still asynchronous), the isComplete variable is already set to true, because from() completes synchronously in this case
so there is a timer(5000) that keeps the inner obs. active; then a RESPONSE is emitted from the action$ stream, but since there is no room for it yet, it is added to the buffer, and an inner obs. will be created for it only when timer(5000) finally expires
A way to solve this might be to replace concatMap with mergeMap.
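The buffering described above can be modeled in a few lines of plain JavaScript (a toy concatMap, not the real implementation; makeConcatMapper and its callback shapes are made up for illustration):

```javascript
// Toy model of concatMap's queueing: while an inner observable is active,
// new source values wait in a buffer and are only projected once it completes.
function makeConcatMapper(project, onValue) {
  const buffer = [];
  let active = false;
  function drain() {
    if (active || buffer.length === 0) return;
    active = true;
    // project(value, next, complete)
    project(buffer.shift(), onValue, () => { active = false; drain(); });
  }
  return value => { buffer.push(value); drain(); };
}

const out = [];
let completeCurrent;
const push = makeConcatMapper(
  (value, next, complete) => { next(value + '-start'); completeCurrent = complete; },
  v => out.push(v)
);

push('A');         // A's inner observable becomes active immediately
push('B');         // B waits in the buffer: A's inner has not completed yet
console.log(out);  // ['A-start']

completeCurrent(); // A's inner completes (think: timer(5000) fired) -> B is dequeued
console.log(out);  // ['A-start', 'B-start']
```

In the question's terms, 'B' is the RESPONSE stuck behind the END action's timer; mergeMap has no such queue, so the RESPONSE would be projected immediately.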

RXJS Scan - withLatestFrom another observable

I'm trying to work out how to use scan to derive a new state whenever my input observable emits a new value, but I can't seem to get it working.
I want to output a new State every time the input$ observable emits a new value, but it should be derived from the current value of state$.
Can anyone suggest how I can fix this? I have a feeling I've got the wrong idea altogether :-)
My code looks something like this:
const stateReducer = (state$: Observable<State>, input$: Observable<Input>) => {
  state$ = state$.pipe( startWith(DEFAULT_STATE) );

  const foo$: Observable<State> = input$.pipe(
    filter((input) => isFoo(input)),
    withLatestFrom(state$),
    scan((acc, [input, state]) => {
      // returns derived state
    })
  );

  const bar$: Observable<State> = input$.pipe(
    filter((input) => isBar(input)),
    withLatestFrom(state$),
    scan((acc, [input, state]) => {
      // returns derived state
    })
  );

  return merge(
    foo$,
    bar$
  );
}
Since you want to use the result of an Observable as your seed value, switchMap will help.
switchMap docs
const bar$: Observable<State> = state$.pipe(
  switchMap((state) => {
    return input$.pipe(
      filter((input) => isBar(input)),
      scan((curState, input) => {
        // do some logic here
        return { ...curState, prop: 'new value' };
      }, state)
    );
  })
);
I've made a sample CodePen for a more complete solution at https://codepen.io/askmattcairns/pen/LYjEoZz?editors=0010.
More Details
switchMap means "switch to the stream returned in here". So this code is saying: once state$ emits a value, store it (as state), then wait for input$ to emit.
When we call scan, its seed value (the second argument of scan) is now the result of state$'s emitted value.
This will emit a new value any time input$ receives a new value.
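Without RxJS, the same idea looks like a reduce whose seed comes from the latest state (the shapes below are made up for illustration; in the real stream the inputs arrive over time):

```javascript
// Plain-JS model of scan with a seed taken from state$'s latest value.
const seedState = { total: 10 };  // latest value of state$
const inputs = [1, 2, 3];         // emissions of input$

// scan((curState, input) => ..., seedState) emits one derived state per input:
const emitted = [];
inputs.reduce((curState, input) => {
  const next = { ...curState, total: curState.total + input };
  emitted.push(next);
  return next;
}, seedState);

console.log(emitted.map(s => s.total)); // [11, 13, 16]
```

The key point is that the accumulator starts from seedState rather than from some hard-coded default, which is exactly what the switchMap arrangement achieves.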
Update to Code Sandbox
After digging in to your Code Sandbox, I better understand what the problem is. You can see my final output at https://codesandbox.io/s/elegant-chaum-2b794?file=/src/index.tsx.
When you initialize your 2 inner streams foo$ and bar$, they both reference state$ using withLatestFrom. Each time input$ emits, it still references the original value of state, using 0 as its starting total.

ReactiveX filtering on observable multiple times and merging

I have a problem creating the following observable.
I want it to receive a predefined array of values,
and I want to filter by different things and be able to work with these as individual observables.
Then, when it comes time to merge these filtered observables, I want to preserve the order from the original one.
//Not sure the share is necessary, just thought it would tie it all together
const input$ = Observable.from([0,1,0,1]).share();
const ones$ = input$.filter(n => n == 1);
const zeroes$ = input$.filter(n => n == 0);
const zeroesChanged$ = zeroes$.mapTo(2);
const onesChanged$ = ones$.mapTo(3);
const allValues$ = Observable.merge(onesChanged$,zeroesChanged$);
allValues$.subscribe(n => console.log(n));
//Outputs 3,3,2,2
//Expected output 3,2,3,2
EDIT: I am sorry I was not specific enough in my question.
I am using a library called cycleJS, which separates side effects into drivers.
So what I am doing in my cycle is this:
export function socketCycle({ SOCKETIO }) {
  const serverConnect$ = SOCKETIO.get('connect').map(serverDidConnect);
  const serverDisconnect$ = SOCKETIO.get('disconnect').map(serverDidDisconnect);
  const serverFailedToConnect$ = SOCKETIO.get('connect_failed').map(serverFailedToConnect);

  return { ACTION: Observable.merge(serverConnect$, serverDisconnect$, serverFailedToConnect$) };
}
Now my problem arose when I wanted to write a test for it. I tried the following, which worked in the wrong manner (using jest):
const inputConnect$ = Observable.from(['connect', 'disconnect', 'connect', 'disconnect']).share();
const expectedOutput$ = Observable.from([
  serverDidConnect(),
  serverDidDisconnect(),
  serverDidConnect(),
  serverDidDisconnect(),
]);
const socketIOMock = {
  get: (evt) => {
    if (evt === 'connect') {
      return inputConnect$.filter(s => s === 'connect');
    } else if (evt === 'disconnect') {
      return inputConnect$.filter(s => s === 'disconnect');
    }
    return Observable.empty();
  },
};
const { ACTION } = socketCycle({ SOCKETIO: socketIOMock });
Observable.zip(ACTION, expectedOutput$).subscribe(
  ([output, expectedOutput]) => { expect(output).toEqual(expectedOutput); },
  (error) => { expect(true).toBe(false); },
  () => { done(); },
);
Maybe there is another way I can go about testing it?
When a stream is partitioned, the timing guarantees between elements in different daughter streams are actually destroyed. In particular, even if connect events always come before disconnect events at the event source, the events of the connect Observable won't always come before their corresponding items in the disconnect Observable. At normal timescales this race condition is probably quite rare, but it is dangerous nonetheless, and this test shows the worst case.
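The reordering is easy to reproduce without RxJS: with synchronous sources, merge drains the first source completely before it even subscribes to the second (syncStream is a toy stand-in for Observable.from):

```javascript
// Sketch of why merging two filtered *synchronous* streams reorders values:
// a synchronous source replays everything during subscription, so merge's
// first source finishes before the second one gets a chance to emit.
function syncStream(values) {
  return { subscribe: next => values.forEach(next) };
}

const source = [0, 1, 0, 1];
const ones = syncStream(source.filter(n => n === 1).map(() => 3));
const zeroes = syncStream(source.filter(n => n === 0).map(() => 2));

const out = [];
// merge(ones, zeroes): `ones` emits everything synchronously first, then `zeroes`
ones.subscribe(n => out.push(n));
zeroes.subscribe(n => out.push(n));
console.log(out); // [3, 3, 2, 2] -- the original interleaving is gone
```

This matches the 3,3,2,2 output in the question: the interleaving existed only in the original array, and filtering into separate streams discarded it.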
The good news is that your function as shown is just a mapper, between events and results from handlers. If you can continue this model generally over event types, then you can even encode the mapping in a plain data structure, which benefits expressiveness:
const event_handlers = new Map([
  ['connect', serverDidConnect],
  ['disconnect', serverDidDisconnect],
  ['connect_failed', serverFailedToConnect]
]);

const ACTION = input$.map(evt => event_handlers.get(evt)(evt));
Caveat: if you were reducing over the daughter streams (or otherwise considering previous values, like with debounceTime), the refactor is not so straightforward, and would also depend on a new definition of "preserve order". Much of the time, it would still be feasible to reproduce with reduce + a more complicated accumulator.
The code below might give you the desired result, but IMHO there's no need to use RxJS to operate on arrays:
Rx.Observable.combineLatest(
  Rx.Observable.from([0, 0, 0]),
  Rx.Observable.from([1, 1, 1])
).flatMap(value => Rx.Observable.from(value))
  .subscribe(console.log)
