RxJS: Batch requests and share response - javascript

Let's imagine I have a function fetchUser which takes a userId as parameter and returns an observable of the user.
Since I call this method often, I want to batch the ids and perform one request with multiple ids instead!
Here is where my troubles began...
I can't find a way to do that without sharing an observable between the different calls of fetchUser.
import { Subject, from } from "rxjs"
import { bufferTime, mergeMap, map, toArray, filter, take, share } from "rxjs/operators"
const functionThatSimulateAFetch = (userIds: string[]) => from(userIds).pipe(
  map((userId) => ({ id: userId, name: "George" })),
  toArray(),
)

const userToFetch$ = new Subject<string>()
const fetchedUser$ = userToFetch$.pipe(
  bufferTime(1000),
  mergeMap((userIds) => functionThatSimulateAFetch(userIds)),
  share(),
)

const fetchUser = (userId: string) => {
  const observable = fetchedUser$.pipe(
    map((users) => users.find((user) => user.id === userId)),
    filter((user) => !!user),
    take(1),
  )
  userToFetch$.next(userId)
  return observable
}
But that's ugly and it has multiple problems:
If I unsubscribe from the observable returned by fetchUser before the bufferTime timer fires, it doesn't prevent the fetch of the user.
If I unsubscribe from all the observables returned by fetchUser before the batch fetch completes, it doesn't cancel the request.
Error handling is more complex.
etc.
More generally: I don't know how to solve problems that require sharing resources using RxJS. It's difficult to find advanced RxJS examples.

I think #Biggy is right.
This is the way I understand the problem and what you want to achieve:
There are different places in your app where you want to fetch users.
You do not want to fire a fetch request every time; rather, you want to buffer the requests and send them at a certain interval of time, let's say 1 second.
You want to be able to cancel a certain buffer and avoid firing a request to fetch a batch of users for that 1-second interval.
At the same time, if somebody, let's call it Code at Position X, has asked for a User and just a few milliseconds later somebody else, i.e. Code at Position Y, cancels the entire batch of requests, then Code at Position X has to receive some sort of answer, let's say a null.
Moreover, you may want to be able to ask to fetch a User and then change your mind within the buffer interval and avoid this User being fetched (I am far from sure this is really something you want, but it seems to emerge from your question).
If this is all true, then you probably need some sort of queuing mechanism, as Buggy suggested.
There may be many implementations of such a mechanism.

What you have is good, but as with everything RxJS, the devil is in the details.
Issues
The mergeMap
mergeMap((userIds) => functionThatSimulateAFetch(userIds)),
This is where you first go wrong. By using a mergeMap here, you are making it impossible to tell apart the "stream of requests" from the "stream returned by a single request":
You are making it near impossible to unsubscribe from an individual request (to cancel it)
You are making it impossible to handle errors
It falls apart if your inner observable emits more than once.
Rather, what you want is to emit individual BatchEvents, via a normal map (producing an observable of observables), and switchMap/mergeMap those after the filtering.
Side effects when creating an observable & Emitting before subscribing
userToFetch$.next(userId)
return observable
Don’t do this. An observable by itself does not actually do anything. It’s a "blueprint" for a sequence of actions to happen when you subscribe to it. By doing this, you’ll only create a batch action on observable creation, but you’re screwed if you get multiple or delayed subscriptions.
Rather, you want to create an observable from defer that emits to userToFetch$ on every subscription.
Even then you’ll want to subscribe to your observable before emitting to userToFetch: If you aren’t subscribed, your observable is not listening to the subject, and the event will be lost. You can do this in a defer-like observable.
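For illustration, here is a minimal sketch of such a defer-like wrapper, reusing the userToFetch$ and fetchedUser$ streams from the question. The microtask trick is just one way to emit after the inner subscription is in place; the full solution below inlines the same idea with new Observable instead:
import { defer } from "rxjs";

const requestUser = (userId: string) =>
  defer(() => {
    // Schedule the emission on a microtask so it happens *after*
    // defer has subscribed to the returned fetchedUser$.
    Promise.resolve().then(() => userToFetch$.next(userId));
    return fetchedUser$;
  });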
Solution
Short, and not very different from your code, but structure it like this.
const BUFFER_TIME = 1000;

type BatchEvent = { keys: Set<string>, values: Observable<Users> };

/** The incoming keys */
const keySubject = new Subject<string>();

const requests: Observable<{ keys: Set<string>, values: Observable<Users> }> =
  this.keySubject.asObservable().pipe(
    bufferTime(BUFFER_TIME),
    map(keys => this.fetchBatch(keys)),
    share(),
  );

/** Returns a single User from an ID. Batches the request */
function get(userId: string): Observable<User> {
  console.log("Creating observable for:", userId);
  // The money observable. See "defer":
  // triggers a new subject event on subscription
  const observable = new Observable<BatchEvent>(observer => {
    this.requests.subscribe(observer);
    // Emit *after* the subscription
    this.keySubject.next(userId);
  });
  return observable.pipe(
    first(v => v.keys.has(userId)),
    // There is only 1 item, so any *Map will do here
    switchMap(v => v.values),
    map(v => v[userId]),
  );
}

function fetchBatch(args: string[]): BatchEvent {
  const keys = new Set(args); // Do not batch duplicates
  const values = this.userService.get(Array.from(keys)).pipe(
    share(),
  );
  return { keys, values };
}
This does exactly what you were asking, including:
Errors are propagated to the recipients of the batch call, but nobody else
If everybody unsubscribes from a batch, the observable is canceled
If everybody unsubscribes from a batch before the request is even fired, it never fires
The observable behaves like HttpClient: subscribing to the observable fires a new (batched) request for data. Callers are free to pipe shareReplay or whatever though. So no surprises there.
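For example, a caller that wants to keep the last result instead of re-firing a batch on later subscriptions could do something like this (hypothetical usage of the get function above):
import { shareReplay } from "rxjs/operators";

// Without shareReplay, every new subscription to user$ queues a new batched request;
// shareReplay(1) caches the latest result for later subscribers.
const user$ = get("42").pipe(shareReplay(1));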
Here is a working stackblitz Angular demo: https://stackblitz.com/edit/angular-rxjs-batch-request
In particular, notice the behavior when you "toggle" the display: You’ll notice that re-subscribing to existing observables will fire new batch requests, and that those requests will cancel (or outright not fire) if you re-toggle fast enough.
Use case
In our project, we use this for Angular Tables, where each row needs to individually fetch additional data to render. This allows us to:
chunk all the requests for a "single page", without needing any special knowledge of pagination
Potentially fetch multiple pages at once if the user paginates fast
re-use existing results even if page size changes
Limitations
I would not add chunking or rate limiting into this. Because the source observable is a dumb bufferTime, you run into issues:
The "chunking" will happen before the deduping. So if you have 100 requests for a single userId, you’ll end up firing several requests with only 1 element each.
If you rate limit, you’ll not be able to inspect your queue. So you may end up with a very long queue containing multiple identical requests.
This is a pessimistic point of view though. Fixing it would mean going full out with a stateful queue/batch mechanism, which is an order of magnitude more complex.

I'm not sure if this is the best way to solve this problem (at least it needs tests), but I will try to explain my point of view.
We have two queues: one for requests waiting to be sent and one for pending (in-flight) requests.
A Result helper delivers the response/error to subscribers.
Some kind of worker, driven by a schedule, takes tasks from the queue and performs the request.
"If I unsubscribe from the observable returned by fetchUser before the timer of bufferTime is finished, it doesn't prevent the fetch of the user."
Unsubscribing from fetchUser cleans up the request queue, so the worker will do nothing.
"If I unsubscribe from all the observables returned by fetchUser before the fetch of the batch is finished, it doesn't cancel the request."
The worker stays subscribed only until isNothingRemain$ fires.
const functionThatSimulateAFetch = (userIds: string[]) => from(userIds).pipe(
  map((userId) => ({ id: userId, name: "George" })),
  toArray(),
  tap(() => console.log('API_CALL', userIds)),
  delay(200),
)
class Queue {
queue$ = new BehaviorSubject(new Map());
private get currentQueue() {
return new Map(this.queue$.getValue());
}
add(...ids) {
const newMap = ids.reduce((acc, id) => {
acc.set(id, (acc.get(id) || 0) + 1);
return acc;
}, this.currentQueue);
this.queue$.next(newMap);
};
addMap(idmap: Map<any, any>) {
const newMap = (Array.from(idmap.keys()))
.reduce((acc, id) => {
acc.set(id, (acc.get(id) || 0) + idmap.get(id));
return acc;
}, this.currentQueue);
this.queue$.next(newMap);
}
remove(...ids) {
const newMap = ids.reduce((acc, id) => {
acc.get(id) > 1 ? acc.set(id, acc.get(id) - 1) : acc.delete(id);
return acc;
}, this.currentQueue)
this.queue$.next(newMap);
};
removeMap(idmap: Map<any, any>) {
const newMap = (Array.from(idmap.keys()))
.reduce((acc, id) => {
acc.get(id) > idmap.get(id) ? acc.set(id, acc.get(id) - idmap.get(id)) : acc.delete(id);
return acc;
}, this.currentQueue)
this.queue$.next(newMap);
};
has(id) {
return this.queue$.getValue().has(id);
}
asObservable() {
return this.queue$.asObservable();
}
}
class Result {
result$ = new BehaviorSubject({ ids: new Map(), isError: null, value: null });
select(id) {
return this.result$.pipe(
filter(({ ids }) => ids.has(id)),
switchMap(({ isError, value }) => isError ? throwError(value) : of(value.find(x => x.id === id)))
)
}
add({ isError, value, ids }) {
this.result$.next({ ids, isError, value });
}
clear(){
this.result$.next({ ids: new Map(), isError: null, value: null });
}
}
const result = new Result();
const queueToSend = new Queue();
const queuePending = new Queue();
const doRequest = new Subject();
const fetchUser = (id: string) => {
return Observable.create(observer => {
queueToSend.add(id);
doRequest.next();
const subscription = result
.select(id)
.pipe(take(1))
.subscribe(observer);
// clean up the queue after a response is received or on unsubscribe
return () => {
(queueToSend.has(id) ? queueToSend : queuePending).remove(id);
subscription.unsubscribe();
}
})
}
// some kind of worker that takes tasks from the queue and sends requests
doRequest.asObservable().pipe(
auditTime(1000),
// clear outdated results
tap(()=>result.clear()),
withLatestFrom(queueToSend.asObservable()),
map(([_, queue]) => queue),
filter(ids => !!ids.size),
mergeMap(ids => {
// abort the request if it has no subscribers
const isNothingRemain$ = combineLatest(queueToSend.asObservable(), queuePending.asObservable()).pipe(
map(([queueToSendIds, queuePendingIds]) => Array.from(ids.keys()).some(k => queueToSendIds.has(k) || queuePendingIds.has(k))),
filter(hasSameKey => !hasSameKey)
)
// prevent requesting the same ids if a previous request is not complete
queueToSend.removeMap(ids);
queuePending.addMap(ids);
return functionThatSimulateAFetch(Array.from(ids.keys())).pipe(
map(res => ({ isError: false, value: res, ids })),
takeUntil(isNothingRemain$),
catchError(error => of({ isError: true, value: error, ids }))
)
}),
).subscribe(res => result.add(res))
fetchUser('1').subscribe(console.log);
const subs = fetchUser('2').subscribe(console.log);
subs.unsubscribe();
fetchUser('3').subscribe(console.log);
setTimeout(() => {
const subs1 = fetchUser('10').subscribe(console.log);
subs1.unsubscribe();
const subs2 = fetchUser('11').subscribe(console.log);
subs2.unsubscribe();
}, 2000)
setTimeout(() => {
const subs1 = fetchUser('20').subscribe(console.log);
subs1.unsubscribe();
const subs21 = fetchUser('20').subscribe(console.log);
const subs22 = fetchUser('20').subscribe(console.log);
}, 4000)
// API_CALL
// ["1", "3"]
// {id: "1", name: "George"}
// {id: "3", name: "George"}
// API_CALL
// ["20"]
// {id: "20", name: "George"}
// {id: "20", name: "George"}
stackblitz example

FYI, I tried to create a generic batched task queue using the answers of
#buggy & #picci:
import { Observable, Subject, BehaviorSubject, from, timer } from "rxjs"
import { catchError, share, mergeMap, map, filter, takeUntil, take, bufferTime, timeout, concatMap } from "rxjs/operators"
export interface Task<TInput> {
uid: number
input: TInput
}
interface ErroredTask<TInput> extends Task<TInput> {
error: any
}
interface SucceededTask<TInput, TOutput> extends Task<TInput> {
output: TOutput
}
export type FinishedTask<TInput, TOutput> = ErroredTask<TInput> | SucceededTask<TInput, TOutput>
const taskErrored = <TInput, TOutput>(
taskFinished: FinishedTask<TInput, TOutput>,
): taskFinished is ErroredTask<TInput> => !!(taskFinished as ErroredTask<TInput>).error
type BatchedWorker<TInput, TOutput> = (tasks: Array<Task<TInput>>) => Observable<FinishedTask<TInput, TOutput>>
export const createSimpleBatchedWorker = <TInput, TOutput>(
work: (inputs: TInput[]) => Observable<TOutput[]>,
workTimeout: number,
): BatchedWorker<TInput, TOutput> => (
tasks: Array<Task<TInput>>,
) => work(
tasks.map((task) => task.input),
).pipe(
mergeMap((outputs) => from(tasks.map((task, index) => ({
...task,
output: outputs[index],
})))),
timeout(workTimeout),
catchError((error) => from(tasks.map((task) => ({
...task,
error,
})))),
)
export const createBatchedTaskQueue = <TInput, TOutput>(
worker: BatchedWorker<TInput, TOutput>,
concurrencyLimit: number = 1,
batchTimeout: number = 0,
maxBatchSize: number = Number.POSITIVE_INFINITY,
) => {
const taskSubject = new Subject<Task<TInput>>()
const cancelTaskSubject = new BehaviorSubject<Set<number>>(new Set())
const cancelTask = (task: Task<TInput>) => {
const cancelledUids = cancelTaskSubject.getValue()
const newCancelledUids = new Set(cancelledUids)
newCancelledUids.add(task.uid)
cancelTaskSubject.next(newCancelledUids)
}
const output$: Observable<FinishedTask<TInput, TOutput>> = taskSubject.pipe(
bufferTime(batchTimeout, undefined, maxBatchSize),
map((tasks) => {
const cancelledUids = cancelTaskSubject.getValue()
return tasks.filter((task) => !cancelledUids.has(task.uid))
}),
filter((tasks) => tasks.length > 0),
mergeMap(
(tasks) => worker(tasks).pipe(
takeUntil(cancelTaskSubject.pipe(
filter((uids) => {
for (const task of tasks) {
if (!uids.has(task.uid)) {
return false
}
}
return true
}),
)),
),
undefined,
concurrencyLimit,
),
share(),
)
let nextUid = 0
return (input$: Observable<TInput>): Observable<TOutput> => input$.pipe(
concatMap((input) => new Observable<TOutput>((observer) => {
const task = {
uid: nextUid++,
input,
}
const subscription = output$.pipe(
filter((taskFinished) => taskFinished.uid === task.uid),
take(1),
map((taskFinished) => {
if (taskErrored(taskFinished)) {
throw taskFinished.error
}
return taskFinished.output
}),
).subscribe(observer)
subscription.add(
timer(0).subscribe(() => taskSubject.next(task)),
)
return () => {
subscription.unsubscribe()
cancelTask(task)
}
})),
)
}
With our example:
import { from, of } from "rxjs"
import { map, toArray } from "rxjs/operators"
import { createBatchedTaskQueue, createSimpleBatchedWorker } from "mmr/components/rxjs/batched-task-queue"
const functionThatSimulateAFetch = (userIds: string[]) => from(userIds).pipe(
map((userId) => ({ id: userId, name: "George" })),
toArray(),
)
const userFetchQueue = createBatchedTaskQueue(
createSimpleBatchedWorker(
functionThatSimulateAFetch,
10000,
),
)
const fetchUser = (userId: string) => {
  // use `of` so the whole id is emitted as one value
  // (`from` on a string would emit it character by character)
  return of(userId).pipe(
    userFetchQueue,
  )
}
I am open to any improvement suggestions

Related

How to get data from 2 collections in firebase at a time? (similar to aggregate lookup in MongoDB) [duplicate]

I have a Cloud Firestore DB with the following structure:
users
  [uid]
    name: "Test User"
posts
  [id]
    content: "Just some test post."
    timestamp: (Dec. 22, 2017)
    uid: [uid]
There is more data present in the actual DB, the above just illustrates the collection/document/field structure.
I have a view in my web app where I'm displaying posts and would like to display the name of the user who posted. I'm using the below query to fetch the posts:
let loadedPosts = {};
posts = db.collection('posts')
.orderBy('timestamp', 'desc')
.limit(3);
posts.get()
.then((docSnaps) => {
const postDocs = docSnaps.docs;
for (let i in postDocs) {
loadedPosts[postDocs[i].id] = postDocs[i].data();
}
});
// Render loadedPosts later
What I want to do is query the user object by the uid stored in the post's uid field, and add the user's name field into the corresponding loadedPosts object. If I was only loading one post at a time this would be no problem, just wait for the query to come back with an object and in the .then() function make another query to the user document, and so on.
However because I'm getting multiple post documents at once, I'm having a hard time figuring out how to map the correct user to the correct post after calling .get() on each post's user/[uid] document due to the asynchronous way they return.
Can anyone think of an elegant solution to this issue?
It seems fairly simple to me:
let loadedPosts = {};
posts = db.collection('posts')
.orderBy('timestamp', 'desc')
.limit(3);
posts.get()
.then((docSnaps) => {
docSnaps.forEach((doc) => {
loadedPosts[doc.id] = doc.data();
db.collection('users').doc(doc.data().uid).get().then((userDoc) => {
loadedPosts[doc.id].userName = userDoc.data().name;
});
})
});
If you want to prevent loading a user multiple times, you can cache the user data client side. In that case I'd recommend factoring the user-loading code into a helper function. But it'll be a variation of the above.
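A rough sketch of that variation, with a hypothetical getUser helper that caches the user promises by uid:
const userCache = {};

// Hypothetical helper: fetches a user document once per uid and reuses the promise.
function getUser(uid) {
  if (!userCache[uid]) {
    userCache[uid] = db.collection('users').doc(uid).get()
      .then((userDoc) => userDoc.data());
  }
  return userCache[uid];
}

posts.get().then((docSnaps) => {
  docSnaps.forEach((doc) => {
    loadedPosts[doc.id] = doc.data();
    getUser(doc.data().uid).then((user) => {
      loadedPosts[doc.id].userName = user.name;
    });
  });
});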
I would do one call for the user docs and then the posts call that is needed.
let users = {} ;
let loadedPosts = {};
db.collection('users').get().then((results) => {
results.forEach((doc) => {
users[doc.id] = doc.data();
});
posts = db.collection('posts').orderBy('timestamp', 'desc').limit(3);
posts.get().then((docSnaps) => {
docSnaps.forEach((doc) => {
loadedPosts[doc.id] = doc.data();
loadedPosts[doc.id].userName = users[doc.data().uid].name;
});
});
});
After trying multiple solutions I got it done with the RxJS combineLatest and take operators. Using the map function we can combine the results.
It might not be an optimal solution, but it solves your problem.
combineLatest(
  this.firestore.collection('Collection1').snapshotChanges(),
  this.firestore.collection('Collection2').snapshotChanges(),
  // In collection 2 we have documents with a reference id to collection 1
).pipe(
  take(1),
).subscribe(
  ([dataFromCollection1, dataFromCollection2]) => {
    this.dataofCollection1 = dataFromCollection1.map((data) => {
      return {
        id: data.payload.doc.id,
        ...data.payload.doc.data() as {},
      } as IdataFromCollection1;
    });
    this.dataofCollection2 = dataFromCollection2.map((data2) => {
      return {
        id: data2.payload.doc.id,
        ...data2.payload.doc.data() as {},
      } as IdataFromCollection2;
    });
    console.log(this.dataofCollection2, 'all feeess');
    const mergeDataFromCollection =
      this.dataofCollection1.map(itm => ({
        payment: [this.dataofCollection2.find((item) => (item.RefId === itm.id))],
        ...itm,
      }));
    console.log(mergeDataFromCollection, 'all data');
  },
);
My solution is below.
Concept: You know the user id you want to get information for. In your posts list, you can request the user document and save the promise in your post item; after the promise resolves you get the user information.
Note: I have not tested the code below, but it is a simplified version of my code.
let posts: Observable<{}[]>; // you can display in HTML directly with | async tag
this.posts = this.listenPosts()
.map( posts => {
posts.forEach( post => {
post.promise = this.getUserDoc( post.uid )
.then( (doc: DocumentSnapshot) => {
post.userName = doc.data().name;
});
}); // end forEach
return posts;
});
// normally, i keep in provider
listenPosts(): Observable<any> {
let fsPath = 'posts';
return this.afDb.collection( fsPath ).valueChanges();
}
// to get the document according the user uid
getUserDoc( uid: string ): Promise<any> {
let fsPath = 'users/' + uid;
return this.afDb.doc( fsPath ).ref.get();
}
Note: afDb: AngularFirestore is initialized in the constructor (by the AngularFire lib)
If you want to join observables instead of promises, use combineLatest. Here is an example joining a user document to a post document:
getPosts(): Observable<Post[]> {
let data: any;
return this.afs.collection<Post>('posts').valueChanges().pipe(
switchMap((r: any[]) => {
data = r;
const docs = r.map(
(d: any) => this.afs.doc<any>(`users/${d.user}`).valueChanges()
);
return combineLatest(docs).pipe(
map((arr: any) => arr.reduce((acc: any, cur: any) => [acc].concat(cur)))
);
}),
map((d: any) => {
let i = 0;
return d.map(
(doc: any) => {
const t = { ...data[i], user: doc };
++i;
return t;
}
);
})
);
}
This example joins each document in a collection, but you could simplify this if you wanted to just join one single document to another.
This assumes your post document has a user variable with the userId of the document.
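For the single-document case, a minimal sketch (assuming the same afs service and the same user field on the post document) could look like this:
getPost(id: string): Observable<any> {
  return this.afs.doc<Post>(`posts/${id}`).valueChanges().pipe(
    switchMap((post: any) =>
      this.afs.doc<any>(`users/${post.user}`).valueChanges().pipe(
        // attach the joined user document to the post
        map((user: any) => ({ ...post, user }))
      )
    )
  );
}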

How to put a dynamic data from firestore in the function where() and also use the snap.size to count the total query to be passed in a graph?

I have this data from firestore and I wanted to retrieve it dynamically with a where() but this is the error I'm getting:
TypeError: vaccines is not a function
The user collection:
Below are the codes:
const Vaccine = () => {
const [vaccines, setVaccines] = useState([]);
useEffect(() => {
const unsubscribe = firestore
.collection("vaccines")
.onSnapshot((snapshot) => {
const arr = [];
snapshot.forEach((doc) =>
arr.push({
...doc.data(),
id: doc.id,
})
);
setVaccines(arr);
});
return () => {
unsubscribe();
};
}, []);
Preface
As highlighted in the comments on the original question, this query structure is not advised as it requires read access to sensitive user data under /users that includes private medical data.
DO NOT USE THIS CODE IN A PRODUCTION/COMMERCIAL ENVIRONMENT. Failure to heed this warning will lead to someone suing you for breaches of privacy regulations.
It is only suitable for a school project (although I would fail a student for such a security hole) or a proof of concept using mocked data. The code included below is provided for educational purposes, to solve your specific query and to show strategies for handling dynamic queries in React.
From a performance standpoint, in the worst case scenario (a cache miss), you will be billed one read, for every user with at least one dose of any vaccine, on every refresh, for every viewing user. Even though your code doesn't use the contents of any user document, your code must download all of this data too because the Client SDKs do not support the select() operator.
For better security and performance, perform this logic server-side (e.g. Cloud Function, a script on your own computer, etc) and save the results to a single document that can be reused by all users. This will allow you to properly tighten access to /users. It also significantly simplifies the code you need to display the graphs and live statistics on the client-side.
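As a rough sketch of that server-side approach (assuming the Firebase Admin SDK, a scheduled Cloud Function, and a hypothetical stats/vaccines document; adapt field names to your schema):
// functions/index.js (sketch, not production-ready)
const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();

exports.aggregateVaccineStats = functions.pubsub
  .schedule("every 60 minutes")
  .onRun(async () => {
    const db = admin.firestore();
    const vaccinesSnap = await db.collection("vaccines").get();
    const stats = {};
    for (const doc of vaccinesSnap.docs) {
      const vaccine = doc.data().vaccine;
      const [oneDose, twoDoses] = await Promise.all([
        db.collection("users")
          .where("doses.selectedVaccine", "==", vaccine)
          .where("doses.dose1", "==", true)
          .where("doses.dose2", "==", false)
          .get(),
        db.collection("users")
          .where("doses.selectedVaccine", "==", vaccine)
          .where("doses.dose1", "==", true)
          .where("doses.dose2", "==", true)
          .get(),
      ]);
      stats[vaccine] = { withOneDose: oneDose.size, withTwoDoses: twoDoses.size };
    }
    // Clients read this single aggregated document instead of querying /users directly.
    await db.doc("stats/vaccines").set(stats);
  });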
useEffect
As stated by the React documentation on the Rules of hooks:
Only Call Hooks at the Top Level
Don’t call Hooks inside loops, conditions, or nested functions. Instead, always use Hooks at the top level of your React function, before any early returns. By following this rule, you ensure that Hooks are called in the same order each time a component renders. That’s what allows React to correctly preserve the state of Hooks between multiple useState and useEffect calls.
The documentation further elaborates that React relies on the order in which Hooks are called, which means that you can't have hook definitions behind conditional logic where their order and quantity changes between renders. If your hooks rely on some conditional logic, it must be defined inside of the hook's declaration.
As an example, if you have an effect that relies on other data, with this logic:
const [userProfile, setUserProfile] = useState();
const [userPosts, setUserPosts] = useState(null);
useEffect(() => {
// get user profile data and store in userProfile
}, []);
if (userProfile) {
useEffect(() => {
// get user post list and store in userPosts
}, [userProfile]);
}
you need to instead use:
const [userProfile, setUserProfile] = useState();
const [userPosts, setUserPosts] = useState(null);
useEffect(() => {
// get user profile data and store in userProfile
}, []);
useEffect(() => {
if (!userProfile) {
// not ready yet/signed out
setUserPosts(null);
return;
}
// get user post list and store in userPosts
}, [userProfile]);
Similarly, for arrays:
someArray && someArray.forEach((entry) => {
useEffect(() => {
// do something with entry to define the effect
}, /* variable change hooks */);
});
should instead be:
useEffect(() => {
if (!someArray) {
// not ready yet
return;
}
const cleanupFunctions = [];
someArray.forEach((entry) => {
// do something with entry to define an effect
cleanupFunctions.push(() => {
// clean up the effect
});
});
// return function to cleanup the effects created here
return () => {
cleanupFunctions.forEach(cleanup => cleanup());
}
}, /* variable change hooks */);
Because this looks a lot like lifecycle management, you are actually better off replacing it with nested components rather than using hooks, like so:
return (
<> // tip: React.Fragment shorthand (used for multiple top-level elements)
{
someArray && someArray
.map(entry => {
return <Entry key={entry.key} data={entry.data} />
})
}
</>
);
Adapting to your code
Note: The code here doesn't use onSnapshot for the statistics because it would cause a rerender every time a new user is added to the database.
const getVaccineStats = (vaccineName) => {
const baseQuery = firestore
.collection("users")
.where("doses.selectedVaccine", "==", vaccineName);
const oneDoseQueryPromise = baseQuery
.where("doses.dose1", "==", true)
.where("doses.dose2", "==", false)
.get()
.then(querySnapshot => querySnapshot.size);
const twoDoseQueryPromise = baseQuery
.where("doses.dose1", "==", true)
.where("doses.dose2", "==", true)
.get()
.then(querySnapshot => querySnapshot.size);
return Promise.all([oneDoseQueryPromise, twoDoseQueryPromise])
.then(([oneDoseCount, twoDoseCount]) => ({ // tip: used "destructuring syntax" instead of `results[0]` and `results[1]`
withOneDose: oneDoseCount,
withTwoDoses: twoDoseCount
}));
};
const Vaccine = () => {
const [vaccines, setVaccines] = useState();
const [vaccineStatsArr, setVaccineStatsArr] = useState([]);
// Purpose: Collect vaccine definitions and store in `vaccines`
useEffect(() => {
return firestore // tip: you can return the unsubscribe function from `onSnapshot` directly
.collection("vaccines")
.onSnapshot({ // tip: using the Observer-like syntax allows you to handle errors
next: (querySnapshot) => {
const vaccineData = []; // tip: renamed `arr` to indicate what the data contains
querySnapshot.forEach((doc) =>
vaccineData.push({
...doc.data(),
id: doc.id,
})
);
setVaccines(vaccineData);
},
error: (err) => {
// TODO: Handle database errors (e.g. no permission, no connection)
}
});
}, []);
// Purpose: For each vaccine definition, fetch relevant statistics
// and store in `vaccineStatsArr`
useEffect(() => {
if (!vaccines || vaccines.length === 0) {
return; // no definitions ready, exit early
}
const getVaccineStatsPromises = vaccines
.map(({ vaccine }) => getVaccineStats(vaccine).then((stats) => [vaccine, stats]));
// tip: used "destructuring syntax" on the above line
// (same as `.map(vaccineInfo => getVaccineStats(vaccineInfo.vaccine).then((stats) => [vaccineInfo.vaccine, stats]))`)
let unsubscribed = false;
Promise.all(getVaccineStatsPromises)
.then(newVaccineStatsArr => {
if (unsubscribed) return; // unsubscribed? do nothing
setVaccineStatsArr(newVaccineStatsArr);
})
.catch(err => {
if (unsubscribed) return; // unsubscribed? do nothing
// TODO: handle errors
});
return () => unsubscribed = true;
}, [vaccines]);
if (!vaccines) // not ready? hide element
return null;
if (vaccines.length === 0) // no vaccines found? show error
return (<span class="error">No vaccines found in database</span>);
if (vaccineStatsArr.length === 0) // no stats yet? show loading message
return (<span>Loading statistics...</span>);
return (<> // tip: React.Fragment shorthand
{
vaccineStatsArr.map(([name, stats]) => {
// this is an example component, find something suitable
// the `key` property is required
return (<BarGraph
key={name}
title={`${name} Statistics`}
columns={["One Dose", "Two Doses"]}
data={[stats.withOneDose, stats.withTwoDoses]}
/>);
})
}
</>);
};
export default Vaccine;
Live Statistics
If you want your graphs to be updated live, you need "zip together" the two snapshot listeners into one, similar to the rxjs combineLatest operator. Here is an example implementation of this:
const onVaccineStatsSnapshot = (vaccine, observerOrSnapshotCallback, errorCallback = undefined) => {
const observer = typeof observerOrSnapshotCallback === 'function'
? { next: observerOrSnapshotCallback, error: errorCallback }
: observerOrSnapshotCallback;
const baseQuery = firestore
.collection("users")
.where("doses.selectedVaccine", "==", vaccine);
let latestWithOneDose,
latestWithTwoDoses,
oneDoseReady = false,
twoDosesReady = false;
const fireNext = () => {
// don't actually fire event until both counts have come in
if (oneDoseReady && twoDosesReady) {
observer.next({
withOneDose: latestWithOneDose,
withTwoDoses: latestWithTwoDoses
});
}
};
const fireError = observer.error || ((err) => console.error(err));
const oneDoseUnsubscribe = baseQuery
.where("doses.dose1", "==", true)
.where("doses.dose2", "==", false)
.onSnapshot({
next: (querySnapshot) => {
latestWithOneDose = querySnapshot.size;
oneDoseReady = true;
fireNext();
},
error: fireError
});
const twoDoseUnsubscribe = baseQuery
.where("doses.dose1", "==", true)
.where("doses.dose2", "==", true)
.onSnapshot({
next: (querySnapshot) => {
latestWithTwoDoses = querySnapshot.size;
twoDosesReady = true;
fireNext();
},
error: fireError
});
return () => {
oneDoseUnsubscribe();
twoDoseUnsubscribe();
};
}
You could rewrite the above function to make use of useState, but this would unnecessarily cause components to rerender when they don't need to.
Usage (direct):
const unsubscribe = onVaccineStatsSnapshot(vaccineName, {
next: (statsSnapshot) => {
// do something with { withOneDose, withTwoDoses } object
},
error: (err) => {
// TODO: error handling
}
});
or
const unsubscribe = onVaccineStatsSnapshot(vaccineName, (statsSnapshot) => {
// do something with { withOneDose, withTwoDoses } object
});
Usage (as a component):
const VaccineStatsGraph = (vaccineName) => {
const [stats, setStats] = useState(null);
useEffect(() => onVaccineStatsSnapshot(vaccineName, {
next: (newStats) => setStats(newStats),
error: (err) => {
// TODO: Handle errors
}
}), [vaccineName]);
if (!stats)
return (<span>Loading graph for {vaccineName}...</span>);
return (
<BarGraph
title={`${vaccineName} Statistics`}
columns={["One Dose", "Two Doses"]}
data={[stats.withOneDose, stats.withTwoDoses]}
/>
);
}
vaccines is an array and not a function. You are trying to run a map on vaccines. Try refactoring your code to this:
vaccines &&
vaccines.map((v, index) => {
// ...
})
Also do check: How to call an async function inside a UseEffect() in React?
Here is the code that works for you:
function DatafromFB() {
const [users, setUsers] = useState({});
useEffect(()=>{
const fetchVaccine = async () => {
try {
const docs = await db.collection("vaccines").get();
docs.forEach((doc) => {
doc.data().vaccineDetails
.forEach(vaccineData=>{
fetchUsers(vaccineData.vaccine)
})
})
} catch (error) {
console.log("error", error);
}
}
const fetchUsers = async (vaccine)=>{
try {
const docs = await db.collection("users")
.where("doses.selectedVaccine", "==", vaccine).get();
docs.forEach(doc=>{
console.log(doc.data())
setUsers(doc.data());
})
}catch(error){
console.log("error", error);
}
}
fetchVaccine();
},[])
return (
<div>
<h1>{users?.doses?.selectedVaccine}</h1>
</div>
)
}
export default DatafromFB
What is ${index.vaccine}? I think it must be v.vaccine.
Also, setSize(snap.size) will set the overall count, not a vaccine-specific one.

Properly handling state and WebSockets in a functional way

This is part of a larger project, where I am rewriting the imperative code of my multiplayer game to be functional. Right now, I am more or less at the start, trying to think at a high level about the structure. Here is what I had in mind:
The part that bridges the imperative-functional gap is this class:
class MessageStream<T> {
private readonly actions: T[] = []
private popResolvers: ((value: T) => void)[] = []
push(value: T) {
this.actions.push(value)
if (this.popResolvers.length > 0) {
this.popResolvers.shift()(this.actions.shift())
}
}
pop(): Promise<T> {
return new Promise<T>((resolve) => {
if (this.actions.length > 0 && this.popResolvers.length === 0) {
resolve(this.actions.shift())
} else {
this.popResolvers.push(resolve)
}
})
}
}
I start the web server like this:
const server = new WebServer.Server({ port: 3000 })
const messageStream = new MessageStream<string>()
const initialState: State = { letters: "a" }
server.on("connection", (socket) => {
socket.on("message", (data) => {
try {
messageStream.push(data.toString())
} catch (error) {
console.log("[Error index.ts]", error)
}
})
})
This is where the state updating logic would go:
const handle = async (stream: MessageStream<string>, state: State) => {
const action = await stream.pop()
// do some other thinking to update the state, such as:
handle(stream, { letters: state.letters + action })
}
And this is how we link them:
handle(messageStream, initialState)
Is this the right way to go about writing an FRP-driven server for a multiplayer game? Is the "architecture" here correct? Is there a way to avoid MessageStream, or is that the right way to do this? The recursion in handle seems a little suspicious (feels like the stack can get huge), but I don't see any other way to avoid a global state.
The other confusing part here is that this structure forces all operations to be processed in sequence. Nothing can happen concurrently. But what if processing a message was an intensive operation that I wanted to parallelize, and I only wanted to synchronize updating the state? How, structurally, would one go about doing that?
Edit
Here is a rewrite of handle to be more like reduce:
const reduce = async (
stream: MessageStream<Action>,
callbackfn: (
state: State,
action: Action
) => { newState: State; sideEffects: SideEffect[] },
initialState: State,
executeSideEffect: (sideEffect: SideEffect) => void
) => {
let state = initialState
while (true) {
const action = await stream.pop()
const { newState, sideEffects } = callbackfn(state, action)
sideEffects.map(executeSideEffect)
state = newState
}
}
// example usage
reduce(
messageStream,
(s, a) => ({
newState: { letters: s.letters + a.addLetter },
sideEffects: [
{
type: SideEffectType.NotifyUser,
info: `Got your message, ${a.addLetter}!`,
},
],
}),
{ letters: "a" },
(e) => console.log(e)
)
Then, in socket.on("message"), we push the action to messageStream like above. This way, all the non-pure behavior is contained to the top level of the reduce function. However, one potential anti-pattern is that reduce never returns anything. Is that bad?

Cloud Firestore: Query two collection [duplicate]

How to join multiple documents in a Cloud Firestore query?
