Merge events from a changing list of Observables - javascript

I'm using rxjs.
I have a Browser that's responsible for a number of Page objects. Each page has an Observable<Event> that yields a stream of events.
Page objects are closed and opened at various times. I want to create one observable, called TheOneObservable that will merge all the events from all the currently active Page objects, and also merge in custom events from the Browser object itself.
Closing a Page means that the subscription to it should be disposed, so that it doesn't prevent the Page from being GC'd.
My problem is that Pages can be closed at any time, which means that the number of Observables being merged is always changing. I've thought of using an Observable of Pages and using mergeMap, but there are problems with this. For example, a subscriber will only receive events of Pages that are opened after it subscribes.
Note that this question has been answered here for .NET, but using an ObservableCollection that isn't available in rxjs.
Here is some code to illustrate the problem:
class Page {
    private _events = new Subject<Event>();

    get events(): Observable<Event> {
        return this._events.asObservable();
    }
}
class Browser {
    pages = [] as Page[];
    private _ownEvents = new Subject<Event>();

    addPage(page: Page) {
        this.pages.push(page);
    }

    removePage(page: Page) {
        let ixPage = this.pages.indexOf(page);
        if (ixPage < 0) return;
        this.pages.splice(ixPage, 1);
    }

    get oneObservable() {
        // this won't work, for the aforementioned reasons
        return Observable.from(this.pages).mergeMap(x => x.events).merge(this._ownEvents);
    }
}
It's in TypeScript, but it should be understandable.

You can switchMap() on a Subject linked to array changes, replacing the merged observable with a fresh one whenever the array changes.
pagesChanged = new Rx.Subject();

addPage(page: Page) {
    this.pages.push(page);
    this.pagesChanged.next();
}

removePage(page: Page) {
    let ixPage = this.pages.indexOf(page);
    if (ixPage < 0) return;
    this.pages.splice(ixPage, 1);
    this.pagesChanged.next();
}

get oneObservable() {
    return this.pagesChanged
        .switchMap(changeEvent =>
            Observable.from(this.pages).mergeMap(x => x.events).merge(this._ownEvents)
        );
}
Testing (this snippet was written against RxJS 5):
const page1 = { events: Rx.Observable.of('page1Event') };
const page2 = { events: Rx.Observable.of('page2Event') };
let pages = [];
const pagesChanged = new Rx.Subject();

const addPage = (page) => {
    pages.push(page);
    pagesChanged.next();
};

const removePage = (page) => {
    let ixPage = pages.indexOf(page);
    if (ixPage < 0) return;
    pages.splice(ixPage, 1);
    pagesChanged.next();
};

const _ownEvents = Rx.Observable.of('ownEvent');

const oneObservable =
    pagesChanged
        .switchMap(pp =>
            Rx.Observable.from(pages)
                .mergeMap(x => x.events)
                .merge(_ownEvents)
        );

oneObservable.subscribe(x => console.log('subscribe', x));

console.log('adding 1');
addPage(page1);
console.log('adding 2');
addPage(page2);
console.log('removing 1');
removePage(page1);

You will need to manage the subscriptions to the pages yourself and feed each page's events into the resulting subject:
const theOneObservable$ = new Subject<Event>();

function openPage(page: Page): Subscription {
    return page.events$.subscribe(val => theOneObservable$.next(val));
}
Closing the page, i.e. calling unsubscribe on the returned subscription, will already do everything it has to do.
Note that theOneObservable$ is a hot observable here.
You can, of course, take this a bit further by writing your own observable type which encapsulates all of this API. In particular, this would allow you to unsubscribe all inner observables when it is being closed.
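A minimal sketch of such a wrapper (hypothetical names; assuming RxJS 5 as in the rest of this question):

class MergingSubject<T> {
    private subject = new Subject<T>();
    private subs: Subscription[] = [];
    // consumers subscribe to this
    readonly observable: Observable<T> = this.subject.asObservable();

    // feed in another source; the returned Subscription closes just that source
    add(source: Observable<T>): Subscription {
        const sub = source.subscribe(val => this.subject.next(val));
        this.subs.push(sub);
        return sub;
    }

    // closing the wrapper unsubscribes all inner observables at once
    close() {
        this.subs.forEach(sub => sub.unsubscribe());
        this.subs = [];
        this.subject.complete();
    }
}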
A slightly different approach is this:
const observables$ = new Subject<Observable<Event>>();
const theOneObservable$ = observables$.mergeMap(obs$ => obs$);
// Add a page's events; note that takeUntil takes care of the
// unsubscription process here.
observables$.next(page.events$.takeUntil(page.closed$));
This approach is superior in the sense that it will unsubscribe the inner observables automatically when theOneObservable$ itself is unsubscribed.
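For instance, the Browser from the question could adopt it like this (a sketch, assuming each Page also exposes a closed$ notifier that fires when the page is closed):

class Browser {
    private _ownEvents = new Subject<Event>();
    private _pageStreams = new Subject<Observable<Event>>();
    // merges all current pages' events with the browser's own events
    oneObservable = this._pageStreams.mergeMap(obs$ => obs$).merge(this._ownEvents);

    addPage(page: Page) {
        // forwarding stops automatically once the page signals it was closed
        this._pageStreams.next(page.events.takeUntil(page.closed$));
    }
}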

Related

How to refactor global variables from MV2 to using chrome.storage in MV3 service worker?

To remove the global variables used in an MV2 background script when migrating to an MV3 service worker, all the guides I've found just give an example of replacing a single global variable with a few lines that set and then get it via chrome.storage, but it's still not clear to me how it can be used in a slightly more complicated scenario.
For instance:
const activatedTabs = [];
let lastActiveTabInfo;
chrome.tabs.onActivated.addListener((activeInfo) => {
    if (activatedTabs.length === 0) {
        activatedTabs.push(activeInfo.tabId);
        lastActiveTabInfo = activeInfo;
    }
});
How could the snippet above be refactored to use chrome.storage and remove the global variables?
The number of variables in the state doesn't change the approach:
read the state on the start of the script
save the state on change
For small data (1 MB total) use chrome.storage.session, which is kept in memory, i.e. it doesn't write to disk; otherwise use chrome.storage.local. Both can only store JSON-compatible types, i.e. string, number, boolean, null, and arrays/objects of such types. There's also IndexedDB for Blob or Uint8Array.
let activatedTabs;
let lastActiveTabInfo;
let busy = chrome.storage.session.get().then(data => {
    activatedTabs = data.activatedTabs || [];
    lastActiveTabInfo = data.lastActiveTabInfo;
    busy = null;
});
const saveState = () => chrome.storage.session.set({
    activatedTabs,
    lastActiveTabInfo,
});
chrome.tabs.onActivated.addListener(async info => {
    // wait for the stored state to be read before touching it
    if (busy) await busy;
    if (!activatedTabs.length) {
        activatedTabs.push(info.tabId);
        lastActiveTabInfo = info;
        await saveState();
    }
});
You can also maintain a single object with properties instead:
const state = {
    activatedTabs: [],
    lastActiveTabInfo: null,
};
const saveState = () => chrome.storage.session.set({ state });
let busy = chrome.storage.session.get('state').then(data => {
    Object.assign(state, data.state);
    busy = null;
});
chrome.tabs.onActivated.addListener(async info => {
    // wait for the stored state to be read before touching it
    if (busy) await busy;
    if (!state.activatedTabs.length) {
        state.activatedTabs.push(info.tabId);
        state.lastActiveTabInfo = info;
        await saveState();
    }
});
Note that if you subscribe to frequent events like tabs.onActivated, your service worker may restart hundreds of times a day, which wastes far more resources than keeping an idle persistent background page. The Chromium team ignores this problem, but you shouldn't; luckily, there's a way to reduce the number of restarts by prolonging the service worker's lifetime. You still need to read/save the state as shown.
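A sketch of one known keep-alive approach (not spelled out in the original answer; it relies on the behavior in recent Chrome versions where any extension API call resets the service worker's 30-second idle timer):

// assumption: periodically calling a cheap extension API keeps the
// MV3 service worker from being shut down while the browser is running
const keepAlive = () => setInterval(chrome.runtime.getPlatformInfo, 25000);
chrome.runtime.onStartup.addListener(keepAlive);
keepAlive();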

Firestore listener removes a message from pagination when adding a new message in React Native

I am trying to do Firestore reactive pagination. I know there are posts, comments, and articles saying that it's not possible, but anyway...
When I add a new message, it kicks out or "removes" the previous message.
Here's the main code. I'm paginating 4 messages at a time:
async getPaginatedRTLData(queryParams: TQueryParams, onChange: Function) {
    let collectionReference = collection(firestore, queryParams.pathToDataInCollection);
    let collectionReferenceQuery = this.modifyQueryByOperations(collectionReference, queryParams);
    // Turn the query into a snapshot listener to track changes
    const unsubscribe = onSnapshot(collectionReferenceQuery, (snapshot: QuerySnapshot) => {
        snapshot.docChanges().forEach((change: DocumentChange<DocumentData>) => {
            // Now save the data to format later
            let formattedData = this.storeData(change, queryParams);
            onChange(formattedData);
        });
    });
    this.unsubscriptions.push(unsubscribe);
}
For completeness, this is how I'm building my query:
let queryParams: TQueryParams = {
    limitResultCount: 4,
    uniqueKey: '_id',
    pathToDataInCollection: messagePath,
    orderBy: {
        docField: orderByKey,
        direction: orderBy
    }
}

modifyQueryByOperations(
    collectionReference: CollectionReference<DocumentData> = this.collectionReference,
    queryParams: TQueryParams) {
    // Extract query params
    let { orderBy, where: where_param, limitResultCount = PAGINATE } = queryParams;
    let queryCall: Query<DocumentData> = collectionReference;
    if (where_param) {
        let { searchByField, whereFilterOp, valueToMatch } = where_param;
        //collectionReferenceQuery = collectionReference.where(searchByField, whereFilterOp, valueToMatch)
        queryCall = query(queryCall, where(searchByField, whereFilterOp, valueToMatch));
    }
    if (orderBy) {
        let { docField, direction } = orderBy;
        //collectionReferenceQuery = collectionReference.orderBy(docField, direction)
        queryCall = query(queryCall, fs_orderBy(docField, direction));
    }
    if (limitResultCount) {
        //collectionReferenceQuery = collectionReference.limit(limitResultCount)
        queryCall = query(queryCall, limit(limitResultCount));
    }
    if (this.lastDocInSortedOrder) {
        //collectionReferenceQuery = collectionReference.startAt(this.lastDocInSortedOrder)
        queryCall = query(queryCall, startAt(this.lastDocInSortedOrder));
    }
    return queryCall;
}
See how the last message is removed when I add a new message to the collection. What's worse, it's not consistent. I debugged this, and Firestore is removing the message.
I almost feel like this is a bug in Firestore's handling of listeners.
As mentioned in the comments and confirmed by you, the problem you are facing occurs because the values of some of the fields your query filters on changed while the listener was still active, which makes the listener treat the document as a removed one.
This is proven by the fact that the records are not being deleted from Firestore itself; they are merely being excluded from the listener's result set.
This can be fixed by creating a better querying structure, separating the old data from new data incoming from the listener, which you mentioned you've already done in the comments as well.
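As an illustration (a sketch, not from the original answer), the listener callback above could distinguish the two cases by re-checking whether a 'removed' document still exists, using getDoc from the modular SDK:

const unsubscribe = onSnapshot(collectionReferenceQuery, (snapshot) => {
    snapshot.docChanges().forEach(async (change) => {
        if (change.type === 'removed') {
            // 'removed' only means the doc left this query's result window
            const latest = await getDoc(change.doc.ref);
            if (latest.exists()) return; // still in Firestore: keep it in the UI
        }
        onChange(this.storeData(change, queryParams));
    });
});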

Remove multiple firebase listeners at once

I'm creating an app with React Native and face the problem that I create multiple Firebase listeners throughout the app: listeners on different screens, to be precise, and also listeners that listen to the Firebase Realtime Database and others listening to Firestore.
What I want to accomplish is to kill all those listeners with one call, or if necessary with multiple lines but as compact as possible, and also from an entirely different screen where the listeners aren't even running; this is important.
I know that there is the possibility to use Firebase.goOffline(), but this only disconnects me from Firebase; it doesn't stop the listeners. As soon as I goOnline() again, the listeners are all back.
I didn't find any solution for this problem from Google etc., that's why I'm asking here now. I would appreciate any idea or approach for handling this type of behavior.
The following code samples show the listeners I included in my app; they are located in the same screen, but I have nearly identical ones in other screens.
Database listener:
const statusListener = () => {
    var partnerRef = firebase.database().ref(`users/${partnerId}/onlineState`);
    partnerRef.on('value', function(snapshot) {
        setPartnerState(snapshot.val());
    });
};
Firestore listener (this one is very long only because I filter the documents I retrieve from the listener):
const loadnewmessages = () => {
    firebase.firestore().collection("chatrooms").doc(`${chatId}`).collection(`${chatId}`)
        .orderBy("timestamp").limit(50).onSnapshot((snapshot) => {
            var newmessages = [];
            var deletedmesssages = [];
            snapshot.docChanges().forEach((change) => {
                if (change.type === "added") {
                    newmessages.push({
                        counter: change.doc.data().counter,
                        sender: change.doc.data().sender,
                        timestamp: change.doc.data().timestamp.toString(),
                        value: change.doc.data().value,
                        displayedTime: new Date(change.doc.data().displayedTime)
                    });
                }
                if (change.type === "removed") {
                    deletedmesssages.push({
                        counter: change.doc.data().counter,
                        sender: change.doc.data().sender,
                        timestamp: change.doc.data().timestamp.toString(),
                        value: change.doc.data().value,
                        displayedTime: new Date(change.doc.data().displayedTime)
                    });
                }
            });
            if (newmessages.length > 0) {
                setChatMessages(chatmessages => {
                    return chatmessages.concat(newmessages);
                });
            }
            if (deletedmesssages.length > 0) {
                setChatMessages(chatmessages => {
                    var modifythisarray = chatmessages;
                    let index = chatmessages.map(e => e.timestamp).indexOf(`${deletedmesssages[0].timestamp}`);
                    let pasttime = Date.now() - parseInt(modifythisarray[index].timestamp);
                    modifythisarray.splice(index, 1);
                    if (pasttime > 300000) {
                        return chatmessages;
                    } else {
                        return modifythisarray;
                    }
                });
                setRefreshFlatList(refreshFlatlist => {
                    //console.log("Current status of refresher: ", refreshFlatlist);
                    return !refreshFlatlist;
                });
            }
            newmessages = [];
            deletedmesssages = [];
        });
};
Both those listeners are called within a useEffect hook like this (the empty dependency array at the end makes sure they are called only once, not multiple times):
useEffect(() => {
    loadnewmessages();
    statusListener();
}, []);
All of the subscribe functions return an unsubscribe function:
const unSubscriptions = [];

// ...where you subscribe:
const unSub = document.onSnapshot(listener);
unSubscriptions.push(unSub);

// ...where you unsubscribe all:
function unSubAll() {
    unSubscriptions.forEach((unSub) => unSub());
    // Clear the array
    unSubscriptions.length = 0;
}
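For example, this could be wired into the hooks shown above (a sketch; handleSnapshot is a hypothetical callback standing in for the chat-message logic, and unSubscriptions/unSubAll live in a shared module so other screens can call unSubAll too):

useEffect(() => {
    const unSub = firebase.firestore()
        .collection("chatrooms").doc(`${chatId}`).collection(`${chatId}`)
        .orderBy("timestamp").limit(50)
        .onSnapshot(handleSnapshot); // hypothetical callback
    unSubscriptions.push(unSub);
    return unSubAll; // also cleans up when this screen unmounts
}, []);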

How to use firebase onSnapshot in chain using JavaScript?

I want to use firebase's onSnapshot function sequentially. A situation where I want to apply this is given below.
Scenario:
There are 2 collections in Firestore: Employees and Projects. In the Employees collection, the docs store the details of employees, along with the IDs of the Projects docs that each employee is working on. The Projects collection stores the details of projects.
Goal:
First, I have to fetch the data related to a specific employee from the Employees collection. From the fetched employee data I get the IDs of the projects he/she is working on, and from those IDs I need to fetch the project details. When any information related to a project or employee changes, the data on screen should change in real-time.
Issue:
I tried to write nested code, but it works in real-time only for employee data; it doesn't change when a project's details are updated. Something like this...
admin.auth().onAuthStateChanged(async () => {
    if (check_field(admin.auth().currentUser)) {
        await db.collection('Employees').doc(admin.auth().currentUser.uid).onSnapshot(snap => {
            ...
            let project_details = new Promise(resolve => {
                let projects = [];
                for (let i in snap.data().projects_list) {
                    db.collection('Projects').doc(snap.data().projects_list[i]).onSnapshot(prj_snap => {
                        let obj = prj_snap.data();
                        obj['doc_id'] = prj_snap.id;
                        projects.push(obj);
                    });
                }
                resolve(projects);
            });
            Promise.all([project_details]).then(items => {
                ...
                // UI update
            });
            ...
        });
    }
});
What is the correct way for doing this?
You're actually proposing a pretty complex dataflow scenario. I would approach this as a multi-step problem. Your goal is essentially:
If there is a user, listen in realtime for the list of project ids for that user.
For each project id, listen in realtime for details about that project.
(presumably) Clean up listeners that are no longer relevant.
So I would tackle it something like this:
let uid;
let employeeUnsub;
let projectIds = [];
let projectUnsubs = {};
let projectData = {};

const employeesRef = firebase.firestore().collection('Employees');
const projectsRef = firebase.firestore().collection('Projects');

firebase.auth().onAuthStateChanged(user => {
    // if there is already a listener but the user signs out or changes, unsubscribe
    if (employeeUnsub && (!user || user.uid !== uid)) {
        employeeUnsub();
    }
    if (user) {
        uid = user.uid;
        // subscribe to the employee data and trigger a listener update on changes
        employeeUnsub = employeesRef.doc(uid).onSnapshot(snap => {
            projectIds = snap.get('projects_list');
            updateProjectListeners();
        });
    }
});

function updateProjectListeners() {
    // get a list of projects already being listened to
    let existingListeners = Object.keys(projectUnsubs);
    for (const pid of existingListeners) {
        // unsubscribe and remove the listener/data if no longer in the list
        if (!projectIds.includes(pid)) {
            projectUnsubs[pid]();
            delete projectUnsubs[pid];
            delete projectData[pid];
            render();
        }
    }
    for (const pid of projectIds) {
        // if we're already listening, nothing to do so skip ahead
        if (projectUnsubs[pid]) { continue; }
        // subscribe to project data and trigger a render on change
        projectUnsubs[pid] = projectsRef.doc(pid).onSnapshot(snap => {
            projectData[pid] = snap.data();
            render();
        });
    }
}

function render() {
    let out = "<ul>\n";
    for (const pid of projectIds) {
        if (!projectData[pid]) {
            out += `<li class="loading">Loading...</li>\n`;
        } else {
            const project = projectData[pid];
            out += `<li>${project.name}</li>\n`;
        }
    }
    out += "</ul>\n";
    return out;
}
The above code does what you're talking about (and in this case the render() function just returns a string but you could do whatever you want to actually manipulate DOM / display data there).
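For example, one assumed way to display it (with a hypothetical #projects container):

// assuming a container element exists in the page
document.querySelector('#projects').innerHTML = render();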
It's a lengthy example, but you're talking about a pretty sophisticated concept of essentially joining realtime data dynamically as it changes. Hope this gives you some guidance on a way forward!

ReactiveX filtering on observable multiple times and merging

I have a problem creating the following observable:
It should receive a predefined array of values.
I want to filter it by different things and be able to work with these as individual observables.
And then, when it comes time to merge these filtered observables, I want to preserve the order of the original one.
// Not sure the share is necessary, just thought it would tie it all together
const input$ = Observable.from([0,1,0,1]).share();

const ones$ = input$.filter(n => n == 1);
const zeroes$ = input$.filter(n => n == 0);

const zeroesChanged$ = zeroes$.mapTo(2);
const onesChanged$ = ones$.mapTo(3);

const allValues$ = Observable.merge(onesChanged$, zeroesChanged$);

allValues$.subscribe(n => console.log(n));

// Outputs 3,3,2,2
// Expected output: 3,2,3,2
EDIT: I am sorry I was not specific enough in my question.
I am using a library called Cycle.js, which separates side effects into drivers.
So what I am doing in my cycle is this:
export function socketCycle({ SOCKETIO }) {
    const serverConnect$ = SOCKETIO.get('connect').map(serverDidConnect);
    const serverDisconnect$ = SOCKETIO.get('disconnect').map(serverDidDisconnect);
    const serverFailedToConnect$ = SOCKETIO.get('connect_failed').map(serverFailedToConnect);

    return { ACTION: Observable.merge(serverConnect$, serverDisconnect$, serverFailedToConnect$) };
}
Now my problem arose when I wanted to write a test for it. I tried the following, which behaves in the wrong manner (using Jest):
const inputConnect$ = Observable.from(['connect', 'disconnect', 'connect', 'disconnect']).share();
const expectedOutput$ = Observable.from([
    serverDidConnect(),
    serverDidDisconnect(),
    serverDidConnect(),
    serverDidDisconnect(),
]);

const socketIOMock = {
    get: (evt) => {
        if (evt === 'connect') {
            return inputConnect$.filter(s => s === 'connect');
        } else if (evt === 'disconnect') {
            return inputConnect$.filter(s => s === 'disconnect');
        }
        return Observable.empty();
    },
};

const { ACTION } = socketCycle({ SOCKETIO: socketIOMock });

Observable.zip(ACTION, expectedOutput$).subscribe(
    ([output, expectedOutput]) => { expect(output).toEqual(expectedOutput); },
    (error) => { expect(true).toBe(false); },
    () => { done(); },
);
Maybe there is another way I can go about testing it?
When a stream is partitioned, the timing guarantees between elements in different daughter streams are actually destroyed. In particular, even if connect events always come before disconnect events at the event source, the events of the connect Observable won't always come before their corresponding items in the disconnect Observable. At normal timescales this race condition is probably quite rare, but dangerous nonetheless, and this test shows the worst case.
The good news is that your function as shown is just a mapper between events and results from handlers. If you can continue this model generally over event types, then you can even encode the mapping in a plain data structure, which benefits expressiveness:
const event_handlers = new Map([
    ['connect', serverDidConnect],
    ['disconnect', serverDidDisconnect],
    ['connect_failed', serverFailedToConnect],
]);

// look up the handler for each event and emit its result
const ACTION = input$.map(evt => event_handlers.get(evt)());
Caveat: if you were reducing over the daughter streams (or otherwise considering previous values, like with debounceTime), the refactor is not so straightforward, and would also depend on a new definition of "preserve order". Much of the time, it would still be feasible to reproduce with reduce + a more complicated accumulator.
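For instance (a sketch, not from the original answer), a per-event-type count can be folded over the merged stream with scan, so the original ordering is never split apart:

// accumulator keyed by event type; the order of input$ is preserved
const counts$ = input$.scan((acc, evt) => ({
    ...acc,
    [evt]: (acc[evt] || 0) + 1,
}), {});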
The code below might give you the desired result, but there's no need to use RxJS to operate on an array, IMHO:
Rx.Observable.combineLatest(
    Rx.Observable.from([0,0,0]),
    Rx.Observable.from([1,1,1])
)
.flatMap(value => Rx.Observable.from(value))
.subscribe(console.log);
