How can I create an eager observable from a hot observable? - javascript

AFAIK, observables are lazy:
import * as rxjs from 'rxjs'
import { filter, take, map } from 'rxjs/operators'

function awesomeOpators() {
  return take(1);
}

const numbers$ = new rxjs.Subject<number>();
const start$ = numbers$.pipe(
  awesomeOpators(),
);

numbers$.next(1);

start$.subscribe((val) => {
  // outputs 2
  console.log(val)
})

numbers$.next(2)
How can I rewrite awesomeOpators such that start$ starts with 1?
https://stackblitz.com/edit/rxjs-zqkz7r?file=main.ts

You can use a ReplaySubject instead:
const numbers$ = new rxjs.ReplaySubject<number>();
There's a bunch of other ways as well. You haven't told us which part you can or can't control, though, so this will do.
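For completeness, here is a minimal sketch (assuming RxJS 6+) of how the snippet above behaves once the plain Subject is swapped for a ReplaySubject:

import * as rxjs from 'rxjs'
import { take } from 'rxjs/operators'

const numbers$ = new rxjs.ReplaySubject<number>();
const start$ = numbers$.pipe(take(1));

numbers$.next(1); // buffered by the ReplaySubject

start$.subscribe((val) => {
  console.log(val); // outputs 1: the buffered value is replayed on subscription
});

numbers$.next(2); // not delivered to start$, since take(1) has already completed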

Related

React / ES6 - Efficiently update property of an object in an array

I am looking for the most efficient way to update a property of an object in an array using modern JavaScript. I am currently doing the following, but it is way too slow, so I'm looking for an approach that will speed things up. To put this in context, this code is used in a Redux Saga in a React app and is called on every keystroke* a user makes when writing code in an editor.
*OK, not EVERY keystroke. I do have debounce and throttling implemented; I just wanted to focus on the update, but I appreciate everyone catching this :)
function* updateCode({ payload: { code, selectedFile } }) {
  try {
    const tempFiles = stateFiles.filter(file => file.id !== selectedFile.id);
    const updatedFile = {
      ...selectedFile,
      content: code,
    };
    const newFiles = [...tempFiles, updatedFile];
  } catch (error) {}
}
The above works but is too slow.
I have also tried using splice, but I get an "Invariant Violation: A state mutation" error:
const index = stateFiles.findIndex(file => file.id === selectedFile.id);
const newFiles = Array.from(stateFiles.splice(index, 1, { ...selectedFile, content: code }));
You can use Array.prototype.map in order to construct your new array:
const newFiles = stateFiles.map(file => {
  if (file.id !== selectedFile.id) {
    return file;
  }
  return {
    ...selectedFile,
    content: code,
  };
});
Also, please consider using debouncing in order not to run your code on every keystroke.
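If you are on redux-saga v1 or later, a minimal sketch of debouncing at the watcher level could look like this (the 'UPDATE_CODE' action type is an assumption, not from the question):

// Minimal sketch, assuming redux-saga v1+; 'UPDATE_CODE' is a hypothetical action type.
import { debounce } from 'redux-saga/effects';

function* watchUpdateCode() {
  // Calls the updateCode saga above only after UPDATE_CODE actions
  // stop arriving for 300 ms.
  yield debounce(300, 'UPDATE_CODE', updateCode);
}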

Resetting ReplaySubject in RxJS 6

I have a filterable 'activity log' that's currently implemented using a ReplaySubject (since a few components use it and they might subscribe at different times).
When the user changes the filter settings, a new request is made; however, the results are appended to the ReplaySubject rather than replacing its contents.
I was wondering if there is any way to update the ReplaySubject to only send through the new items, using something like switchMap?
Otherwise, I might need to either use a BehaviorSubject that returns an array of all the activity entries, or recreate the ReplaySubject and notify consumers (probably via another observable) to unsubscribe and resubscribe to the new observable.
If you want to be able to reset a subject without having its subscribers explicitly unsubscribe and resubscribe, you could do something like this:
import { Observable, Subject } from "rxjs";
import { startWith, switchMap } from "rxjs/operators";

function resettable<T>(factory: () => Subject<T>): {
  observable: Observable<T>,
  reset(): void,
  subject: Subject<T>
} {
  const resetter = new Subject<any>();
  const source = new Subject<T>();
  let destination = factory();
  let subscription = source.subscribe(destination);
  return {
    observable: resetter.asObservable().pipe(
      startWith(null),
      switchMap(() => destination)
    ),
    reset: () => {
      subscription.unsubscribe();
      destination = factory();
      subscription = source.subscribe(destination);
      resetter.next();
    },
    subject: source
  };
}
resettable will return an object containing:
an observable to which subscribers to the re-settable subject should subscribe;
a subject upon which you'd call next, error or complete; and
a reset function that will reset the (inner) subject.
You'd use it like this:
import { ReplaySubject } from "rxjs";
const { observable, reset, subject } = resettable(() => new ReplaySubject(3));
observable.subscribe(value => console.log(`a${value}`)); // a1, a2, a3, a4, a5, a6
subject.next(1);
subject.next(2);
subject.next(3);
subject.next(4);
observable.subscribe(value => console.log(`b${value}`)); // b2, b3, b4, b5, b6
reset();
observable.subscribe(value => console.log(`c${value}`)); // c5, c6
subject.next(5);
subject.next(6);
Here is a class that uses the resettable factory posted above, so you can write:
const myReplaySubject = new ResettableReplaySubject<myType>()
import { ReplaySubject, Subject, Observable, SchedulerLike } from "rxjs";
import { startWith, switchMap } from "rxjs/operators";

export class ResettableReplaySubject<T> extends ReplaySubject<T> {
  reset: () => void;

  constructor(bufferSize?: number, windowTime?: number, scheduler?: SchedulerLike) {
    super(bufferSize, windowTime, scheduler);
    const resetable = this.resettable(() => new ReplaySubject<T>(bufferSize, windowTime, scheduler));
    Object.keys(resetable.subject).forEach(key => {
      this[key] = resetable.subject[key];
    });
    Object.keys(resetable.observable).forEach(key => {
      this[key] = resetable.observable[key];
    });
    this.reset = resetable.reset;
  }

  private resettable<T>(factory: () => Subject<T>): {
    observable: Observable<T>,
    reset(): void,
    subject: Subject<T>,
  } {
    const resetter = new Subject<any>();
    const source = new Subject<T>();
    let destination = factory();
    let subscription = source.subscribe(destination);
    return {
      observable: resetter.asObservable().pipe(
        startWith(null),
        switchMap(() => destination)
      ) as Observable<T>,
      reset: () => {
        subscription.unsubscribe();
        destination = factory();
        subscription = source.subscribe(destination);
        resetter.next();
      },
      subject: source,
    };
  }
}
I had kind of the same problem: one of my components subscribed to a ReplaySubject of a shared service. After navigating away and coming back, the former values were still delivered to the component.
Just completing the subject was not enough.
The solutions above seemed too complicated for this purpose, but I found another really simple solution: just complete the subject and assign a newly created one in the shared service, like so:
constructor() {
  this.selectedFeatures = new ReplaySubject()
  this.selectedFeaturesObservable$ = this.selectedFeatures.asObservable()
}

completeSelectedFeatures() {
  this.selectedFeatures.complete()
  this.selectedFeatures = new ReplaySubject()
  this.selectedFeaturesObservable$ = this.selectedFeatures.asObservable()
}
I also included the constructor of the shared service to show the types I used.
That way, any time I move away from my component, I just call that method on the shared service and get a fresh, empty ReplaySubject whenever I navigate back to the component that consumes the shared service's observable.
I call that method inside the ngOnDestroy Angular lifecycle hook:
ngOnDestroy() {
  console.log('unsubscribe')
  this.featureSub.unsubscribe()
  this.sharedDataService.completeSelectedFeatures()
}
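Put together, a minimal sketch of such a shared service (assuming Angular with RxJS 6+; class and member names here are illustrative, not the poster's actual code) might look like:

// Illustrative sketch only; assumes Angular + RxJS 6+.
import { Injectable } from '@angular/core';
import { Observable, ReplaySubject } from 'rxjs';

@Injectable({ providedIn: 'root' })
export class SharedDataService {
  private selectedFeatures = new ReplaySubject<any>();
  selectedFeaturesObservable$: Observable<any> = this.selectedFeatures.asObservable();

  addFeature(feature: any) {
    this.selectedFeatures.next(feature);
  }

  completeSelectedFeatures() {
    // Complete the old subject and expose a fresh, empty one to new subscribers.
    this.selectedFeatures.complete();
    this.selectedFeatures = new ReplaySubject<any>();
    this.selectedFeaturesObservable$ = this.selectedFeatures.asObservable();
  }
}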
The problem becomes easier if you can use the fact that the buffer consumes data from the original source, and that subscribers to buffered data can switch to the original source after receiving all the old values.
Eg.
import { Subject, ReplaySubject, concat } from 'rxjs'
import { takeUntil, switchMap } from 'rxjs/operators'

let data$ = new Subject<any>()          // Data source
let buffer$ = new ReplaySubject<any>()
let bs = data$.subscribe(buffer$)       // Buffer subscribes to data

// Observable that returns values until nearest reset
let getRepeater = () => {
  return concat(buffer$.pipe(
    takeUntil(data$), // Switch from buffer to original source when data comes in
  ), data$)
}
To clear, replace the buffer:
// Begin Buffer Clear Sequence
bs.unsubscribe()
buffer$.complete()
buffer$ = new ReplaySubject()
bs = data$.subscribe(buffer$)
buffObs.next(buffer$)
To make the code more functional, you can replace the function getRepeater() with a subject that reflects the latest reference
let buffObs = new ReplaySubject<ReplaySubject<any>>(1)
buffObs.next(buffer$)
let repeater$ = concat(buffObs.pipe(
takeUntil(data$),
switchMap((e) => e),
), data$)
The following demonstrates the behavior:
let data$ = new Subject<any>()
let buffer$ = new ReplaySubject<any>()
let bs = data$.subscribe(buffer$)
let buffObs = new ReplaySubject<ReplaySubject<any>>(1)
buffObs.next(buffer$)
let repeater$ = concat(buffObs.pipe(
takeUntil(data$),
switchMap((e) => e),
), data$)
// Begin Test
data$.next(1)
data$.next(2)
data$.next(3)
console.log('rep1 sub')
let r1 = repeater$.subscribe((e) => {
console.log('rep1 ' + e)
})
// Begin Buffer Clear Sequence
bs.unsubscribe()
buffer$.complete()
buffer$ = new ReplaySubject()
bs = data$.subscribe(buffer$)
buffObs.next(buffer$)
// End Buffer Clear Sequence
console.log('rep2 sub')
let r2 = repeater$.subscribe((e) => {
console.log('rep2 ' + e)
})
data$.next(4)
data$.next(5)
data$.next(6)
r1.unsubscribe()
r2.unsubscribe()
data$.next(7)
data$.next(8)
data$.next(9)
console.log('rep3 sub')
let r3 = repeater$.subscribe((e) => {
console.log('rep3 ' + e)
})
Outputs
rep1 sub
rep1 1
rep1 2
rep1 3
rep2 sub
rep1 4
rep2 4
rep1 5
rep2 5
rep1 6
rep2 6
rep3 sub
rep3 4
rep3 5
rep3 6
rep3 7
rep3 8
rep3 9
For certain situations (e.g. where everything is contained in one class), here's what I believe is a very concise solution, with few moving parts:
new subscribers will get the latest value from value$$, unless reset$$ has emitted since that value;
existing subscribers get each new item emitted to value$$.
import { Subject } from 'rxjs';
import { map, shareReplay, switchAll } from 'rxjs/operators';

const value$$ = new Subject();
const reset$$ = new Subject();

const value$ = reset$$.pipe(
  // can optionally startWith(null)
  map(() => value$$.pipe(shareReplay(1))), // create a new stream every time reset emits
  shareReplay(1), // this shares the latest cached value stream emitted
  switchAll(), // subscribe to the inner cached value stream
);
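A quick usage sketch of that approach, assuming the startWith(null) mentioned in the comment is enabled so the first cached stream exists before any value is pushed:

// Assumes the value$/value$$/reset$$ wiring above, with startWith(null) enabled.
value$.subscribe(v => console.log('a', v));
value$$.next(1);                               // a 1

value$.subscribe(v => console.log('b', v));    // b 1 (latest cached value is replayed)

reset$$.next();                                // drop the cache
value$.subscribe(v => console.log('c', v));    // nothing replayed

value$$.next(2);                               // a 2, b 2, c 2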

RxJS: Elegant way partition source Observable into 3 or more Observables

I have a socket connection that emits messages with an identifier. I would like to create a separate observable for each type of message. I have a few solutions but they all feel clunky or have possible performance issues. I'm relatively new to RxJS so I'm not aware of the possible traps I might be walking into.
My first instinct was to create a filtered observable for each type:
const receive_message = Rx.fromEvent(socket, 'data').pipe(share());
const message_type_a = receive_message.pipe(filter(message => message.type === 'a'));
const message_type_b = receive_message.pipe(filter(message => message.type === 'b'));
const message_type_c = receive_message.pipe(filter(message => message.type === 'c'));
const message_type_d = receive_message.pipe(filter(message => message.type === 'd'));
I think this would cause performance issues because it's performing this check for every message type every time any message comes in.
I thought about doing a multistage partition like this:
const receive_message = Rx.fromEvent(socket, 'data');
const [message_type_a, not_a] = receive_message.pipe(partition(message => message.type === 'a'));
const [message_type_b, not_b] = not_a.pipe(partition(message => message.type === 'b'));
const [message_type_c, message_type_d] = not_b.pipe(partition(message => message.type === 'c'));
This is awfully clunky and I'm not sure if it is any more performant than the filter solution.
Next I tried using subjects like so:
const message_type_a = new Rx.Subject();
const message_type_b = new Rx.Subject();
const message_type_c = new Rx.Subject();
const message_type_d = new Rx.Subject();
Rx.fromEvent(socket, 'data').subscribe(
  function (message) {
    switch (message.type) {
      case 'a':
        message_type_a.next(message);
        break;
      case 'b':
        message_type_b.next(message);
        break;
      case 'c':
        message_type_c.next(message);
        break;
      case 'd':
        message_type_d.next(message);
        break;
      default:
        console.log('Uh oh');
    }
  },
  console.log,
  function () {
    message_type_a.complete();
    message_type_b.complete();
    message_type_c.complete();
    message_type_d.complete();
  }
);
Again, this is clunky and whenever I'm using subjects I ask myself if this is the "Rx" way of doing things.
Ideally I would be able to do something like this:
const [
message_type_a,
message_type_b,
message_type_c,
message_type_d
] = Rx.fromEvent(socket, 'data').pipe(partitionMany(message => message.type));
Are there any elegant solutions out there or is my overall approach of splitting the source observable like this fundamentally flawed?
This is my first question so I hope I did a good job. Thanks in advance!
I changed your switch-case solution to a more performant one:
const message_type_a = new Rx.Subject();
const message_type_b = new Rx.Subject();
const message_type_c = new Rx.Subject();
const message_type_d = new Rx.Subject();

const subjects = {
  'a': message_type_a,
  'b': message_type_b,
  'c': message_type_c,
  'd': message_type_d
};

Rx.fromEvent(socket, 'data').pipe(
  tap(message => subjects[message.type].next(message))
).subscribe();
See https://www.npmjs.com/package/rx-splice.
We had the exact same situation, and in our case it was indeed a performance problem (measured using node --perf). I just created this package after reading your question, because sharing is caring. Let me know if it works for you!
Note that you want this only if executing the filter's selector function becomes a problem! As noted in the splice README:
Using only idiomatic RxJS code, one would use filter instead for the use case of splice. However, if you are writing high-performance code and this input$ Observable above (or, more likely, Subject) would be subscribed hundreds or thousands of times (X), then the selector function of filter(fn) would be called X times. This can - and actually did prove to - be the biggest performance bottleneck in our application, so we wrote splice, which executes its indexing selector only once for each emitted value.
To lend an answer to #frido's comment, the groupBy operator is what you want:
import { groupBy, mergeMap } from "rxjs/operators";

Rx.fromEvent(socket, 'data')
  .pipe(
    groupBy(message => message.type),
    // then probably...
    mergeMap(observableForASingleMessageType$ => {...})
  )
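A slightly fuller sketch of the groupBy route (assuming RxJS 6+; socket comes from the question, and the per-group handling here is purely illustrative):

// Sketch only; `socket` is the question's socket connection, not defined here.
import { fromEvent } from 'rxjs';
import { groupBy, mergeMap, map } from 'rxjs/operators';

fromEvent(socket, 'data').pipe(
  groupBy((message: any) => message.type),
  // Each group$ is a GroupedObservable keyed by message type.
  mergeMap(group$ => group$.pipe(
    map(message => ({ type: group$.key, message }))
  ))
).subscribe(({ type, message }) => console.log(type, message));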

Merge events from a changing list of Observables

I'm using rxjs.
I have a Browser that's responsible for a number of Page objects. Each page has an Observable<Event> that yields a stream of events.
Page objects are closed and opened at various times. I want to create one observable, called TheOneObservable that will merge all the events from all the currently active Page objects, and also merge in custom events from the Browser object itself.
Closing a Page means that the subscription to it should be closed so it doesn't prevent it from being GC'd.
My problem is that Pages can be closed at any time, which means that the number of Observables being merged is always changing. I've thought of using an Observable of Pages and using mergeMap, but there are problems with this. For example, a subscriber will only receive events of Pages that are opened after it subscribes.
Note that this question has been answered here for .NET, but using an ObservableCollection that isn't available in rxjs.
Here is some code to illustrate the problem:
class Page {
  private _events = new Subject<Event>();
  get events(): Observable<Event> {
    return this._events.asObservable();
  }
}

class Browser {
  pages = [] as Page[];
  private _ownEvents = new Subject<Event>();

  addPage(page: Page) {
    this.pages.push(page);
  }

  removePage(page: Page) {
    let ixPage = this.pages.indexOf(page);
    if (ixPage < 0) return;
    this.pages.splice(ixPage, 1);
  }

  get oneObservable() {
    // this won't work for aforementioned reasons
    return Observable.from(this.pages).mergeMap(x => x.events).merge(this._ownEvents);
  }
}
It's in TypeScript, but it should be understandable.
You can switchMap() on a Subject() linked to array changes, replacing oneObservable with a fresh one when the array changes.
pagesChanged = new Rx.Subject();

addPage(page: Page) {
  this.pages.push(page);
  this.pagesChanged.next();
}

removePage(page: Page) {
  let ixPage = this.pages.indexOf(page);
  if (ixPage < 0) return;
  this.pages.splice(ixPage, 1);
  this.pagesChanged.next();
}

get oneObservable() {
  return this.pagesChanged
    .switchMap(changeEvent =>
      Observable.from(this.pages).mergeMap(x => x.events).merge(this._ownEvents)
    );
}
Testing,
const page1 = { events: Rx.Observable.of('page1Event') }
const page2 = { events: Rx.Observable.of('page2Event') }

let pages = [];
const pagesChanged = new Rx.Subject();

const addPage = (page) => {
  pages.push(page);
  pagesChanged.next();
}

const removePage = (page) => {
  let ixPage = pages.indexOf(page);
  if (ixPage < 0) return;
  pages.splice(ixPage, 1);
  pagesChanged.next();
}

const _ownEvents = Rx.Observable.of('ownEvent')

const oneObservable =
  pagesChanged
    .switchMap(pp =>
      Rx.Observable.from(pages)
        .mergeMap(x => x.events)
        .merge(_ownEvents)
    )

oneObservable.subscribe(x => console.log('subscribe', x))

console.log('adding 1')
addPage(page1)
console.log('adding 2')
addPage(page2)
console.log('removing 1')
removePage(page1)
(The runnable snippet above targets the RxJS 5.5.6 UMD bundle.)
You will need to manage the subscriptions to the pages yourself and feed their events into the resulting subject yourself:
const theOneObservable$ = new Subject<Event>();

function openPage(page: Page): Subscription {
  return page.events$.subscribe(val => theOneObservable$.next(val));
}
Closing the page, i.e. calling unsubscribe on the returned subscription, will already do everything it has to do.
Note that theOneObservable$ is a hot observable here.
You can, of course, take this a bit further by writing your own observable type which encapsulates all of this API. In particular, this would allow you to unsubscribe all inner observables when it is being closed.
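For example, a minimal sketch of such an encapsulating type (RxJS 6 style; the class and method names are illustrative, not an established API) could be:

// Illustrative sketch only (RxJS 6+).
import { Observable, Subject, Subscription } from 'rxjs';

class PageEventHub {
  private subject = new Subject<Event>();
  private subscriptions = new Subscription();

  readonly events$: Observable<Event> = this.subject.asObservable();

  openPage(page: { events$: Observable<Event> }): Subscription {
    const sub = page.events$.subscribe(val => this.subject.next(val));
    this.subscriptions.add(sub);
    return sub; // unsubscribe this to "close" a single page
  }

  close(): void {
    // Tears down every remaining page subscription and completes the output.
    this.subscriptions.unsubscribe();
    this.subject.complete();
  }
}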
A slightly different approach is this:
const observables$ = new Subject<Observable<Event>>();
const theOneObservable$ = observables$.mergeMap(obs$ => obs$);
// Add a page's events; note that takeUntil takes care of the
// unsubscription process here.
observables$.next(page.events$.takeUntil(page.closed$));
This approach is superior in the sense that it will unsubscribe the inner observables automatically when the observable is unsubscribed.
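For reference, the same subject-of-observables idea written with RxJS 6 pipeable operators might look like this (page.events$ and page.closed$ are assumed, as in the answer above):

// Sketch only, assuming RxJS 6+.
import { Observable, Subject } from 'rxjs';
import { mergeAll, takeUntil } from 'rxjs/operators';

const observables$ = new Subject<Observable<Event>>();
const theOneObservable$ = observables$.pipe(mergeAll());

function openPage(page: { events$: Observable<Event>; closed$: Observable<unknown> }) {
  // takeUntil unsubscribes this page's inner stream when the page closes.
  observables$.next(page.events$.pipe(takeUntil(page.closed$)));
}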

ReactiveX filtering on observable multiple times and merging

I have a problem creating the following observable:
I want it to receive a predefined array of values;
I want to filter by different things, and be able to work with these as individual observables;
and then, when it comes time to merge these filtered observables, I want to preserve the order from the original one.
//Not sure the share is necessary, just thought it would tie it all together
const input$ = Observable.from([0,1,0,1]).share();
const ones$ = input$.filter(n => n == 1);
const zeroes$ = input$.filter(n => n == 0);
const zeroesChanged$ = zeroes$.mapTo(2);
const onesChanged$ = ones$.mapTo(3);
const allValues$ = Observable.merge(onesChanged$,zeroesChanged$);
allValues$.subscribe(n => console.log(n));
//Outputs 3,3,2,2
//Expected output 3,2,3,2
EDIT: I am sorry I was not specific enough in my question.
I am using a library called Cycle.js, which separates side effects into drivers.
So what I am doing in my cycle is this:
export function socketCycle({ SOCKETIO }) {
  const serverConnect$ = SOCKETIO.get('connect').map(serverDidConnect);
  const serverDisconnect$ = SOCKETIO.get('disconnect').map(serverDidDisconnect);
  const serverFailedToConnect$ = SOCKETIO.get('connect_failed').map(serverFailedToConnect);

  return { ACTION: Observable.merge(serverConnect$, serverDisconnect$, serverFailedToConnect$) };
}
Now my problem arose when I wanted to write a test for it. I tried the following, which emits in the wrong order (using Jest):
const inputConnect$ = Observable.from(['connect', 'disconnect', 'connect', 'disconnect']).share();
const expectedOutput$ = Observable.from([
  serverDidConnect(),
  serverDidDisconnect(),
  serverDidConnect(),
  serverDidDisconnect(),
]);

const socketIOMock = {
  get: (evt) => {
    if (evt === 'connect') {
      return inputConnect$.filter(s => s === 'connect');
    } else if (evt === 'disconnect') {
      return inputConnect$.filter(s => s === 'disconnect');
    }
    return Observable.empty();
  },
};

const { ACTION } = socketCycle({ SOCKETIO: socketIOMock });

Observable.zip(ACTION, expectedOutput$).subscribe(
  ([output, expectedOutput]) => { expect(output).toEqual(expectedOutput); },
  (error) => { expect(true).toBe(false); },
  () => { done(); },
);
Maybe there is another way I can go about testing it?
When a stream is partitioned, the timing guarantees between elements in the different daughter streams are destroyed. In particular, even if connect events always come before disconnect events at the event source, the events of the connect Observable won't always come before their corresponding items in the disconnect Observable. At normal timescales this race condition is probably quite rare, but dangerous nonetheless, and this test shows the worst case.
The good news is that your function as shown is just a mapper between events and results from handlers. If you can continue this model generally over event types, then you can even encode the mapping in a plain data structure, which benefits expressiveness:
const event_handlers = new Map([
  ['connect', serverDidConnect],
  ['disconnect', serverDidDisconnect],
  ['connect_failed', serverFailedToConnect]
]);

const ACTION = input$.map(evt => event_handlers.get(evt)(evt));
Caveat: if you were reducing over the daughter streams (or otherwise considering previous values, as with debounceTime), the refactor is not so straightforward and would also depend on a new definition of "preserve order". Much of the time it would still be feasible with reduce plus a more complicated accumulator.
The code below might give you the desired result, but there's no need to use RxJS to operate on an array, IMHO:
Rx.Observable.combineLatest(
  Rx.Observable.from([0,0,0]),
  Rx.Observable.from([1,1,1])
)
  .flatMap(value => Rx.Observable.from(value))
  .subscribe(console.log)
