getEnrolledPlayers should fetch an array of 'player' objects from the database and then pass it to the matchMaking function. However, it doesn't get passed correctly.
I tried adding observables and playing around with subscriptions:
initializeEvent(eventId: string) {
  const enrolledPlayers: PlayerStat[] = [];
  this.getEnrolledPlayers(eventId)
    .subscribe((playerIds: string[]) => {
      for (const playerId of playerIds) {
        this.dataService.fetchSinglePlayer(playerId)
          .subscribe((playerStat: PlayerStat) => enrolledPlayers.push(playerStat));
      }
      this.matchMaking(enrolledPlayers);
    });
}
When I call this series of asynchronous functions, enrolledPlayers is eventually populated correctly (an array of 7 elements), but it is not passed to the matchMaking() function correctly. I assume this is because of the asynchronous execution order.
Yes. It's definitely a timing issue: the inner subscriptions resolve their values after matchMaking has already been called.
I'd suggest using forkJoin and waiting until all the values have resolved before calling matchMaking.
Give this a try:
initializeEvent(eventId: string) {
  this.getEnrolledPlayers(eventId)
    .subscribe((playerIds: string[]) => {
      // one Observable per player, resolved together by forkJoin
      const playerInfos$ = playerIds.map(playerId => this.dataService.fetchSinglePlayer(playerId));
      forkJoin(playerInfos$)
        .subscribe((enrolledPlayers: PlayerStat[]) => this.matchMaking(enrolledPlayers));
    });
}
Or with a single subscribe:
initializeEvent(eventId: string) {
  this.getEnrolledPlayers(eventId)
    .take(1)
    .switchMap((playerIds: string[]) => {
      const playerInfos$ = playerIds.map(playerId => this.dataService.fetchSinglePlayer(playerId).take(1));
      return forkJoin(...playerInfos$);
    })
    .do((enrolledPlayers: PlayerStat[]) => this.matchMaking(enrolledPlayers))
    .subscribe();
}
This is the nested-subscribe anti-pattern; you should never nest subscribes. Here is how it should look using higher-order operators:
initializeEvent(eventId: string) {
  this.getEnrolledPlayers(eventId)
    .pipe(
      switchMap(playerIds =>
        forkJoin(playerIds.map(playerId => this.dataService.fetchSinglePlayer(playerId)))
      )
    ).subscribe((enrolledPlayers) =>
      this.matchMaking(enrolledPlayers)
    );
}
Use switchMap to switch into a new observable, then forkJoin to run many observables in parallel.
Related
type Movie = {id: string};
type FullMovie = {id: string, picture: string};
I have a url that returns an array of type Movie:
http.get(url).subscribe(res: Movie[])
I use http.get(movie.id) for each movie in the array returning a FullMovie:
http.get(movie.id).subscribe(res: FullMovie)
so in essence I want to create a method that returns a stream of FullMovie objects, as the requests resolve: getAll = (url): Observable<FullMovie>
getAll = (url): Observable<FullMovie> => {
  return http.get(url)
    // must pipe the array into a stream of FullMovies, but not a stream of FullMovie Observables.
    // I don't want to subscribe to each of the returned FullMovies
    // something like
    .pipe(// map(array => array.forEach(movie => return http.get(movie.id))))
}
At the moment I have the following solution, which works, but I want a more concise one:
private getFull = (queryGroup: string): Observable<TMDBMovie> =>
  new Observable<TMDBMovie>((observer) => {
    // get movie array
    this.httpGet(queryGroup).subscribe((movies) => {
      let j = 0;
      const len = movies.length;
      if (len === 0) return observer.complete();

      // complete the outer observer once every inner request has settled
      const complete = () => {
        if (++j === len) observer.complete();
      };

      // loop through the elements
      movies.forEach((movie) => {
        this.getById(movie.id).subscribe(
          (res) => { observer.next(res); complete(); },
          () => complete()
        );
      });
    });
  });
EDIT:
This works
newGetFull = (queryGroup: string) =>
  this.httpGet(queryGroup)
    .pipe(concatMap((arr) => from(arr)))
    .pipe(
      mergeMap((movie) => this.getById(movie.id).pipe(catchError(() => of())))
    );
You may want to try something along these lines:
getAll = (url): Observable<FullMovie> => {
  return http.get(url)
    .pipe(
      // turn the array Movie[] into a stream of Movie, i.e. an Observable<Movie>
      concatMap(arrayOfMovies => from(arrayOfMovies)),
      // then use mergeMap to "flatten" the various Observable<FullMovie> that you get calling http.get(movie.id)
      // in other words, with mergeMap you turn a stream of Observables into a stream of the results emitted as each Observable resolves
      mergeMap(movie => http.get(movie.id))
    );
}
Consider that with mergeMap as above you have no guarantee that the final stream preserves the order of the Movie array returned by the first call. Each http.get(movie.id) can take a different amount of time to return, so the emission order is not guaranteed.
If you need to guarantee the order, use concatMap rather than mergeMap for the second flattening as well (concatMap is effectively mergeMap with concurrency set to 1); see the sketch below.
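A minimal sketch of that order-preserving variant, reusing the names from the snippet above (the getAllInOrder name is just for illustration):
getAllInOrder = (url): Observable<FullMovie> => {
  return http.get(url)
    .pipe(
      // turn the array Movie[] into a stream of Movie
      concatMap(arrayOfMovies => from(arrayOfMovies)),
      // concatMap waits for each inner http.get(movie.id) to complete before
      // subscribing to the next one, so FullMovie objects are emitted in the
      // original array order (at the cost of serializing the requests)
      concatMap(movie => http.get(movie.id))
    );
}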
If you want all the http.get(movie.id) calls to complete before emitting the result, then use forkJoin rather than mergeMap, like this (note that the method now returns a single array of FullMovie objects):
getAll = (url): Observable<FullMovie[]> => {
  return http.get(url)
    .pipe(
      // turn the array Movie[] into an array of Observable<FullMovie>
      map(arrayOfMovies => arrayOfMovies.map(movie => http.get(movie.id))),
      // then use forkJoin to resolve all the Observables in parallel;
      // the value notified by forkJoin is an array of FullMovie objects
      concatMap(arrayOfObservables => forkJoin(arrayOfObservables))
    );
}
I have a use case where I have to call $http.post(request) on batches of the input data.
For this, I created an array of requests. For each of them, I need to get the response of $http.post(), append it to an existing array and pass it to a rendering function. I have to make the next call only when the previous one completes and since $http.post() returns a promise (according to this), I am trying to do this using the reduce function.
function callToHttpPost(request) {
  return $http.post('someurl', request);
}

function outerFunc($scope, someId) {
  let completeArray = [];
  let arrayOfRequests = getRequestInBatches();
  arrayOfRequests.reduce((promiseChain, currentRequest) => {
    console.log(promiseChain);
    return promiseChain.then((previousResponse) => {
      completeArray.push.apply(completeArray, previousResponse.data);
      render($scope, completeArray, someId);
      return callToHttpPost(currentRequest);
    });
  }, Promise.resolve()).catch(e => errorHandler($scope, e, someId));
}
(I have referred to MDN and this answer.)
But this gives me TypeError: previousResponse is undefined. The log statement shows the first promise as resolved (since it is the initial value passed to the reduce function), but the other promises show up as rejected due to this error. How can I resolve this?
Using vanilla JavaScript
If the outerFunc function can be used in an async context (which it looks like it can, given that it returns nothing and the results are passed to the render function as they are built up), you could clean that right up, paring it down to:
async function outerFunc ($scope, someId) {
  const completeArray = [];
  try {
    for (const request of getRequestInBatches()) {
      const { data } = await callToHttpPost(request);
      completeArray.push(...data);
      render($scope, completeArray, someId);
    }
  } catch (e) {
    errorHandler($scope, e, someId);
  }
}
The sequential nature will be enforced by the async/await keywords.
Using RxJS
If you're able to add a dependency on RxJS, you can change the function to:
import { from } from 'rxjs';
import { concatMap, scan } from 'rxjs/operators';

function outerFunc ($scope, someId) {
  from(getRequestInBatches()).pipe(
    concatMap(callToHttpPost),
    scan((completeArray, { data }) => completeArray.concat(...data), [])
  ).subscribe(
    completeArray => render($scope, completeArray, someId),
    e => errorHandler($scope, e, someId)
  );
}
which revolves around the use of Observable instead of Promise. In this version, the sequential nature is enforced by the concatMap operator, and the complete array of results is reduced and emitted while being built up by the scan operator.
The error was in the initial value. In the first iteration of the reduce function, the initial Promise.resolve() resolves with undefined, and that is what is passed as previousResponse. Passing Promise.resolve({ data: [] }) as the initialValue to the reduce function solved the issue.
arrayOfRequests.reduce((promiseChain, currentRequest) => {
  console.log(promiseChain);
  return promiseChain.then((previousResponse) => {
    completeArray.push.apply(completeArray, previousResponse.data);
    render($scope, completeArray, someId);
    return callToHttpPost(currentRequest);
  });
}, Promise.resolve({ data: [] }))
  .then(response => {
    // append the data from the final request as well
    completeArray.push.apply(completeArray, response.data);
    render($scope, completeArray, someId);
    displaySuccessNotification();
  })
  .catch(e => errorHandler($scope, e, someId));
(edited to handle the final response)
Please see the code below for reference.
export const listenToHotels = (hotelIds: string[]): Observable<Hotel[]> => {
  return new Observable<Hotel[]>((observer) => {
    const hotels: any = [];
    hotelIds.forEach((hotelId) => {
      let roomingList: any;
      return FirestoreCollectionReference.Hotels()
        .doc(hotelId)
        .onSnapshot(
          (doc) => {
            roomingList = {
              hotelId: doc.id,
              ...doc.data()
            } as Hotel;
            console.log(`roomingList`, roomingList);
            hotels.push({
              ...roomingList
            });
          },
          (error) => observer.error(error)
        );
    });
    // Check for error handling
    observer.next(hotels);
    console.log('hotels', hotels);
  });
};
As you can see, I am running a forEach over the hotelId array and executing a Firestore listener inside it. I want to save each response and push it into the hotels array, but I get an "object not extensible" error.
The problem is that observer.next(hotels) and console.log('hotels', hotels) run first, because the listener callbacks are executed at a later stage.
Please let me know how I can resolve this issue.
I think you could use map instead of forEach, because you can use await with map.
Example of using map and await: https://flaviocopes.com/javascript-async-await-array-map/
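A rough sketch of what that could look like, assuming a one-shot .get() read is acceptable instead of the onSnapshot listener (the loadHotels name is just for illustration):
const loadHotels = async (hotelIds: string[]): Promise<Hotel[]> => {
  // start all reads, then wait for every one of them to resolve
  const snapshots = await Promise.all(
    hotelIds.map((hotelId) => FirestoreCollectionReference.Hotels().doc(hotelId).get())
  );
  // build the Hotel array only after all reads have completed
  return snapshots.map((doc) => ({ hotelId: doc.id, ...doc.data() } as Hotel));
};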
Check out forkJoin. It takes an array of Observables and subscribes to them at the same time.
This example is written in Angular but the operators should be identical as long as you're using a current version of rxjs.
I'm starting with an array of User objects (users.mock.ts)
Each Object is then mapped to a single Observable (mapCalls)
I now have an array of Observables that will make HTTP calls. These calls will not be made until subscribed to
forkJoin is then added as a wrapper to the array
You subscribe to that forkJoin. All of the objects will make the call.
When the calls have completed, the logic inside of subscribe() will be run (see the sketch below)
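As a minimal sketch of the generic pattern described above (USERS, fetchUser and User are illustrative placeholders, not the actual mock):
import { forkJoin, Observable } from 'rxjs';

// hypothetical: an array of User objects and a service call per user
const calls: Observable<User>[] = USERS.map(user => fetchUser(user.id)); // no HTTP yet
forkJoin(calls).subscribe(fullUsers => {
  // runs once, after every call has completed, with results in the same order as calls
  console.log(fullUsers);
});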
So in your case:
Map the Hotel Ids to a variable calls. Each item is the Observable logic you posted
You would then run forkJoin(calls).subscribe() to make all the calls
export const listenToHotels = (hotelIds: string[]): Observable<Hotel[]> => {
  const hotelsObservable = hotelIds.map((hotelId) => {
    let roomingList: any;
    return new Observable<Hotel>((observerInside) => {
      FirestoreCollectionReference.Hotels()
        .doc(hotelId)
        .onSnapshot(
          (doc) => {
            roomingList = { hotelId: doc.id, ...doc.data() } as Hotel;
            observerInside.next(roomingList);
          },
          (error) => observerInside.error(error)
        );
    });
  });

  const combinedObservable = combineLatest(hotelsObservable);
  return combinedObservable;
  // Check for error handling
};
The issue was how I was handling the chained observables (promise and listener callbacks run later than normal synchronous code because they are queued as microtasks first). They need to be combined using zip or combineLatest, and the latter is much better in this use case because we need the latest value from each observable.
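For reference, a tiny illustration of the difference, using hypothetical synchronous sources rather than the Firestore code above:
import { combineLatest, of, zip } from 'rxjs';

const a$ = of(1, 2);   // emits 1, then 2, then completes
const b$ = of('x');    // emits a single value, then completes

// zip pairs emissions by index, so the second value of a$ is never paired
zip(a$, b$).subscribe(pair => console.log('zip', pair));                      // [1, 'x']

// combineLatest emits the latest value from each source once all have emitted
combineLatest([a$, b$]).subscribe(latest => console.log('latest', latest));   // [2, 'x']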
After some research here I found out that the best way to make multiple requests in RxJS is flatMap(). However, I can't get it to work when there are multiple requests inside the flatMap() method as well. The output in subscribe() is observables instead of values.
this.companyService.getSuggestions(reference)
  .pipe(
    flatMap((suggestions: string[]) => {
      let infos = [];
      suggestions.forEach(suggestion => {
        infos.push(this.companyService.getInformation(suggestion));
      });
      return of(infos);
    }),
  ).subscribe((val) => {
    console.log('subscribe', val); // Output here is an array of observables instead of values
  });
flatMap only flattens the Observable you return from it; of(infos) emits the array of inner Observables as a single value, which is why you see Observables in subscribe().
I would convert the suggestions string[] into a stream and do a flatMap over that.
import { from } from 'rxjs';
import { flatMap } from 'rxjs/operators';

const suggestionsRequest$ = this.companyService.getSuggestions(reference);

// convert to a stream of individual suggestions
const suggestions$ = suggestionsRequest$.pipe(
  flatMap((suggestions: string[]) => from(suggestions))
);

// get all infos
const infos$ = suggestions$.pipe(
  flatMap((suggestion: string) => this.companyService.getInformation(suggestion))
);
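Subscribing to infos$ then yields one resolved value per suggestion instead of an array of Observables:
infos$.subscribe((info) => {
  console.log('subscribe', info); // an actual information object, one per suggestion
});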
We have the following stream.
const recorders = imongo.listCollections('recorders')
  .flatMapConcat(names => {
    const recorders = names
      .map(entry => entry.name)
      .filter(entry => !_.contains(
        ['recorders.starts',
         'recorders.sources',
         'system.indexes',
         'system.users'],
        entry));
    console.log(recorders);
    return Rx.Observable.fromArray(recorders);
  });
recorders.isEmpty()
  .subscribe(
    empty => {
      if (empty) {
        logger.warn('No recorders found.');
      }
    },
    () => {}
  );

recorders.flatMapConcat(createRecorderIntervals)
  .finally(() => process.exit(0))
  .subscribe(
    () => {},
    e => logger.error('Error while updating: %s', e, {}),
    () => logger.info('Finished syncing all recorders')
  );
If the stream is empty, we don't want to call createRecorderIntervals. The above piece of code works. However, checking whether the stream is empty causes the console.log to be executed twice. Why is this happening? Can I fix it somehow?
EDIT: So, I went the following way, after rethinking it thanks to #Martin's answer
const recorders = imongo.listCollections('recorders')
  .flatMapConcat(names => {
    const recorders = names
      .map(entry => entry.name)
      .filter(entry => !_.contains(
        ['recorders.starts',
         'recorders.sources',
         'system.indexes',
         'system.users'],
        entry));
    if (!recorders.length) {
      logger.warn('No recorders found.');
      return Rx.Observable.empty();
    }
    return Rx.Observable.fromArray(recorders);
  })
  .flatMapConcat(createRecorderIntervals)
  .finally(() => scheduleNextRun())
  .subscribe(
    () => {},
    e => logger.error('Error while updating: %s', e, {}),
    () => logger.info('Finished syncing all recorders')
  );
When you call the subscribe() method on an Observable, it causes the entire chain of operators to be created and executed, which in your case calls imongo.listCollections('recorders') twice, once per subscription.
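As a minimal, standalone illustration (in the same RxJS 4 style, not your actual code) of why a cold Observable re-runs its producer for every subscriber:
const source = Rx.Observable.create(observer => {
  console.log('producer ran');   // logged once per subscribe()
  observer.onNext('value');
  observer.onCompleted();
});

source.subscribe(() => {});  // logs "producer ran"
source.subscribe(() => {});  // logs "producer ran" again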
You can insert an operator before calling flatMapConcat(createRecorderIntervals) that checks whether the result is empty. I have one particular operator in mind, but there might be others that suit your needs even better:
takeWhile() - takes a predicate as an argument and emits onComplete when it returns false.
Then your code would look like the following:
const recorders = imongo.listCollections('recorders')
  .flatMapConcat(names => {
    ...
    return Rx.Observable.fromArray(recorders);
  })
  .takeWhile(function(result) {
    // condition
  })
  .flatMapConcat(createRecorderIntervals)
  .finally(() => process.exit(0))
  .subscribe(...);
I don't know what exactly your code does but I hope you get the idea.
Edit: If you want to be notified when the entire Observable is empty, there are multiple ways:
do() operator and a custom Observer object. You write a custom Observer and plug it in with the do() operator before .flatMapConcat(createRecorderIntervals). This object counts how many times its next callback was called, and when the preceding Observable completes you can tell whether there was at least one result or none at all (a rough sketch follows below).
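A rough, untested sketch of that idea, counting with a plain closure variable instead of a full custom Observer object and reusing the names from your code:
let count = 0;

const counted = recorders
  .do(() => count++);   // increments for every recorder name that passes through

counted
  .flatMapConcat(createRecorderIntervals)
  .finally(() => process.exit(0))
  .subscribe(
    () => {},
    e => logger.error('Error while updating: %s', e, {}),
    () => {
      if (count === 0) {
        logger.warn('No recorders found.');
      }
      logger.info('Finished syncing all recorders');
    }
  );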
create a ConnectableObservable. This one is maybe the most similar to what you were doing at the beginning. You turn your recorders into a ConnectableObservable using the publish() operator. Then you can subscribe multiple Observers without triggering the operator chain. When all your Observers are subscribed, you call connect() and it will emit values to all of them:
var published = recorders.publish();
published.subscribe(createObserver('SourceA'));
published.subscribe(createObserver('SourceB'));
// Connect the source
var connection = published.connect();
In your case, you'd create two Subjects (because they act as both Observable and Observer) and chain one of them with isEmpty() and the other with flatMapConcat(); a rough sketch follows below. See the doc for more info: http://reactivex.io/documentation/operators/connect.html
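A rough, untested sketch of that idea, wiring both Subjects to a single publish()/connect() run of the recorders source (the Subject names are illustrative):
const published = recorders.publish();

const emptyCheck$ = new Rx.Subject();
const intervals$ = new Rx.Subject();

emptyCheck$.isEmpty()
  .subscribe(empty => {
    if (empty) {
      logger.warn('No recorders found.');
    }
  });

intervals$.flatMapConcat(createRecorderIntervals)
  .finally(() => process.exit(0))
  .subscribe(
    () => {},
    e => logger.error('Error while updating: %s', e, {}),
    () => logger.info('Finished syncing all recorders')
  );

// Both Subjects observe the same single run of the underlying query.
published.subscribe(emptyCheck$);
published.subscribe(intervals$);
published.connect();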
I think the first option is actually easier for you.