I have an array of objects. For each object I need to trigger an asynchronous request (an HTTP call). But I only want a certain maximum number of requests running at the same time. Also, it would be nice (but not necessary) to have a single synchronization point after all requests have finished, to execute some code.
I've tried suggestions from:
Limit number of requests at a time with RxJS
How to limit the concurrency of flatMap?
Fire async request in parallel but get result in order using rxjs
and many more... I even tried making my own operators.
Either the answers on those pages are too old to work with my code or I can't figure out how to put everything together so all types fit nicely.
This is what I have so far:
for (const obj of objects) {
  this.myService.updateObject(obj).subscribe(value => {
    this.anotherService.set(obj);
  });
}
EDIT 1:
Ok, I think we're getting there! With the answers from Julius and pschild (both seem to work equally well) I managed to limit the number of requests. But now it only fires the first batch of 4 and never fires the rest. So now I have:
const concurrentRequests = 4;
from(objects)
  .pipe(
    mergeMap(obj => this.myService.updateObject(obj), concurrentRequests),
    tap(result => this.anotherService.set(result))
  ).subscribe();
Am I doing something wrong with the subscribe()?
Btw: mergeMap with a resultSelector parameter is deprecated, so I used mergeMap without it.
Also, the obj from the mergeMap is not visible inside the tap, so I had to use tap's own parameter.
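If you do need the original obj alongside the response later in the pipe, one way (just a sketch, reusing the service names from above; map comes from 'rxjs/operators') is to map the inner observable so each emission carries both:
const concurrentRequests = 4;
from(objects)
  .pipe(
    mergeMap(
      obj => this.myService.updateObject(obj).pipe(
        // pair each response with the object that produced it
        map(value => ({ obj, value }))
      ),
      concurrentRequests
    ),
    tap(({ obj, value }) => this.anotherService.set(obj))
  )
  .subscribe();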
EDIT 2:
Make sure your observables complete! (It cost me a whole day.)
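For example (just a sketch, reusing the service call from above): if updateObject is backed by a Subject that never completes, mergeMap never frees up the concurrency slot and finalize never fires. Forcing the inner observable to complete, e.g. with take(1) or first(), avoids that:
mergeMap(
  obj => this.myService.updateObject(obj).pipe(take(1)), // force the inner observable to complete
  concurrentRequests
)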
You can use the third parameter of mergeMap to limit the number of concurrent inner subscriptions. Use finalize to execute something after all requests have finished:
const concurrentRequests = 5;
from(objects)
  .pipe(
    mergeMap(obj => this.myService.updateObject(obj), concurrentRequests),
    tap(res => this.anotherService.set(res)),
    finalize(() => console.log('Sequence complete'))
  ).subscribe();
See the example on Stackblitz.
from(objects).pipe(
  bufferCount(10),
  concatMap(objs => forkJoin(objs.map(obj =>
    this.myService.updateObject(obj).pipe(
      tap(value => this.anotherService.set(obj))
    )))),
  finalize(() => console.log('all requests are done'))
).subscribe();
The code is not tested, but you get the idea. Let me know if there is an error or if you need more explanation.
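If you also want a single synchronization point that receives all of the results at once (as mentioned in the question), one variation (untested sketch, same service names as above; toArray comes from 'rxjs/operators') is to collect the responses:
const concurrentRequests = 4;
from(objects).pipe(
  mergeMap(obj => this.myService.updateObject(obj), concurrentRequests),
  toArray() // emits a single array with all responses once every inner observable has completed
).subscribe(allResults => {
  // single synchronization point
  console.log('all requests are done', allResults);
});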
I had the same issue once, when I tried to load multiple images from a server. I had to send the HTTP requests one after another. I achieved the desired outcome using an awaited promise. Here is the sample code:
async ngOnInit() {
  for (const number of this.numbers) {
    await new Promise(resolve => {
      this.http.get(`https://jsonplaceholder.typicode.com/todos/${number}`).subscribe(
        data => {
          this.responses.push(data);
          console.log(data);
          resolve();
        }
      );
    });
  }
}
The main idea here is to resolve the promise once you get the response.
With this technique you can add custom logic to execute a method once all the requests have finished.
Here is the stackblitz. Open up the console to see it in action. :)
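As a side note, the same idea reads a bit more directly if you convert each observable to a promise instead of wrapping subscribe yourself (sketch; assumes RxJS 7+, where firstValueFrom is available and imported from 'rxjs'):
async ngOnInit() {
  for (const number of this.numbers) {
    // firstValueFrom resolves with the first emission and then unsubscribes
    const data = await firstValueFrom(
      this.http.get(`https://jsonplaceholder.typicode.com/todos/${number}`)
    );
    this.responses.push(data);
    console.log(data);
  }
  // everything below runs only after all requests have finished
  console.log('all requests done');
}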
Hi, I am having some very strange behaviour.
I am iterating over some documents and setting up promises so that as each document is fetched the UI is updated.
However, while the promises are independent, Firestore / AngularFire seems to wait for all of them.
Example:
for (const event of events) {
  this.eventService.getEventActivitiesAndSomeStreams(this.user,
    event.getID(),
    [DataLatitudeDegrees.type, DataLongitudeDegrees.type])
    .pipe(take(1)).toPromise().then((fullEvent) => {
      this.logger.info(`Promise completed`);
    });
}
One would expect that, as the data slowly comes in, each promise would log "Promise completed" one by one.
However, they are all printed at once. The promises don't seem to resolve one by one but "all at once": there is a long wait until the first console log appears, and then all of the promises log at the same time.
So if I had a progress bar I would expect it to increase little by little, but instead it jumps to the end all at once.
The inner call, this.eventService.getEventActivitiesAndSomeStreams, looks like this:
return this.afs
  .collection('users')
  .doc(userID)
  .collection('events')
  .doc(eventID)
  .collection('activities')
  .doc(activityID)
  .collection('streams', ((ref) => {
    return ref.where('type', 'in', typesBatch);
  }))
  .get()
  .pipe(map((documentSnapshots) => {
    return documentSnapshots.docs.reduce((streamArray: StreamInterface[], documentSnapshot) => {
      streamArray.push(this.processStreamDocumentSnapshot(documentSnapshot)); // Just creates a class instance from the JSON object returned by Firestore
      return streamArray;
    }, []);
  }));
Now, if I put an await inside the for loop, this of course works as it should, with the promises completing one by one, but then it takes a lot of time.
I also tried not using AngularFire and using the native JS SDK instead, with the same effect.
I suspect IndexedDB or some other Firebase logic is causing this.
What am I missing here, and how can I get the desired behaviour, if that's possible?
You could reproduce this with a "users" -> "events" -> "something" Firestore collection structure, where each "user" has, let's say, 500 "events" and each of those events has 2 more docs.
So get all the events for the user and, in a for loop, create for each one a promise that returns the 2 "something" documents.
This behaviour is pretty much expected and has nothing at all to do with Firebase. You're iterating over an array and sending out requests. There is no waiting or delay between items, so the for loop (without await statements) finishes in an imperceptibly small amount of time, which means all of the requests are sent out within milliseconds of each other, i.e. basically at the same time. So their responses should be expected to arrive at basically the same time as well.
You've stated that you don't want to use await statements and iterate one by one, so it's hard to know exactly what you do want or expect to happen. Maybe you want the requests spaced 0.5 seconds apart? If so, you need to write that logic:
timer(0, 500).pipe( // put whatever ms spacing between requests you want here
  take(events.length),
  switchMap(i => {
    // note: switchMap cancels an in-flight request if the next timer tick arrives first;
    // use concatMap or mergeMap instead if a request may take longer than the spacing
    return this.eventService.getEventActivitiesAndSomeStreams(this.user,
      events[i].getID(),
      [DataLatitudeDegrees.type, DataLongitudeDegrees.type]).pipe(take(1));
  })
).subscribe(fullEvent => {
  this.logger.info(`Promise completed`);
});
(I removed the promises because I don't see why they're being used in the first place, and this kind of control is easier with RxJS, in my opinion.)
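If the real goal is just to see progress tick up as each document arrives (rather than enforcing a fixed delay), another option, sketched here reusing the question's service call, is to keep the requests concurrent but limit how many run at once, so completions trickle in:
from(events).pipe(
  mergeMap(event =>
    this.eventService.getEventActivitiesAndSomeStreams(this.user,
      event.getID(),
      [DataLatitudeDegrees.type, DataLongitudeDegrees.type]).pipe(take(1)),
    5) // at most 5 requests in flight at a time; completions arrive one by one
).subscribe(fullEvent => {
  this.logger.info(`Promise completed`);
});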
I have an array and I need to make a single request for each entry in it. The problem I found difficult to solve was scheduling the calls: only when one request finishes should the next request start. I was looking for an RxJS queue, but I couldn't find a simple solution.
Example:
function makeRequest(body): Observable<any> {
  return someAsyncRequest(body);
}

array.forEach((entry) => {
  makeRequest(entry);
});

// This is just an example of how the setup looks. This code does not work.
// What I need is a queue-like feature in RxJS to append requests and wait until the previous one has finished.
You have quite a few options, but I suppose concat or forkJoin fits best. concat calls the second API only after the previous one completes, while forkJoin waits for all of them to complete but only emits if none of them errors. If any of them errors, it will not return anything.
Example with concat:
concat(...array.map(entry => makeRequest(entry))).subscribe();
p.s. import concat as static operator:
import { concat } from 'rxjs'
If you want things to go out one by one, but still receive all the results at once, I recommend concat -> reduce. It looks like this:
concat(...array.map(entry => makeRequest(entry))).pipe(
  reduce((completed, curResponse) => completed.concat([curResponse]), [])
).subscribe(allResponses => console.log(allResponses));
This structure achieves the single emission that forkJoin gives you, but makes the requests one by one and then gathers the results once all of them complete. If you want the results one by one as they complete, then plain concat gets the job done, as shown by others.
forkJoin can meet your requirement.
forkJoin waits for each HTTP request to complete, groups all the observables returned by each HTTP call into a single observable array, and finally returns that observable array.
public requestDataFromMultipleSources(): Observable<any[]> {
  let response1 = this.http.get(requestUrl1);
  let response2 = this.http.get(requestUrl2);
  let response3 = this.http.get(requestUrl3);
  return forkJoin([response1, response2, response3]);
}
Subscribe to the single observable array and save the responses separately:
this.dataService.requestDataFromMultipleSources().subscribe(responseList => {
  this.responseData1 = responseList[0];
  this.responseData2 = responseList[1];
  this.responseData3 = responseList[2];
});
See MDN's Promise.all; you can do this even without observables:
Promise.all(array.map(item => makeRequest(item))).then(values => {
  // your logic with the received data
});
componentDidMount() {
  Promise.all([OfferCreationActions.resetOffer()]).then(() => {
    OfferCreationActions.updateOfferCreation('viewOnly', true);
    OfferCreationActions.updateOfferCreation('loadingOffer', true);
    Promise.all([
      OfferCreationActions.loadBarcodes(),
      OfferCreationActions.loadBrandBarcodes(),
      OfferCreationActions.loadBrands(),
      OfferCreationActions.loadPayers(),
      OfferCreationActions.loadSegments(),
      OfferCreationActions.loadTactics(),
      OfferCreationActions.loadStates(),
    ]).then(() => {
      // let state settle before loading
      setTimeout(() => {
        OfferCreationActions.loadOffer(this.props.match.params.offerId);
      }, 1500);
    });
  });
}
I'm working on a React app that needs to preload some data into state and then load a larger object that references the preloaded data to map some fields. I've run into a race condition in which some of the data from the promise chain is still being processed when I try to do the mapping. I added the timeout yesterday, but that doesn't feel like the best solution to me. I'm still fairly new to React, and we are using Reflux as the store (if that makes a difference). Is there a better way to ensure all the data from the promises is reflected in the state before making the call? Should I hook into shouldComponentUpdate and check all of the fields individually?
There is a fundamental flaw in the way this is implemented: you are breaking the principle of unidirectional data flow. Here are a few suggestions to fix it.
Do your side-effect handling inside a separate overarching function.
Handling a promise race condition is a side effect (something outside React's unidirectional flow), so this is not a problem linked to React itself. As a first step, have componentDidMount delegate this race-condition logic to a separate action, possibly inside resetOfferOverall(), which is essentially what is happening here anyway.
Manage the promise inside the action and dispatch payloads to the store.
In your code you are guaranteed that the "then" will execute after the promise resolves. However, the two calls to updateOfferCreation do not fall under this contract, since they are outside the Promise.all. Maybe they also need to go inside the big Promise.all section, or maybe they need to complete before running it. Just recheck this!
resetOfferOverall() {
  Promise.all([OfferCreationActions.resetOffer()]).then(() => {
    // These are not guaranteed to be completed before the next "then" section!
    OfferCreationActions.updateOfferCreation('viewOnly', true);
    OfferCreationActions.updateOfferCreation('loadingOffer', true);
    Promise.all([
      OfferCreationActions.loadBarcodes(),
      OfferCreationActions.loadBrandBarcodes(),
      OfferCreationActions.loadBrands(),
      OfferCreationActions.loadPayers(),
      OfferCreationActions.loadSegments(),
      OfferCreationActions.loadTactics(),
      OfferCreationActions.loadStates(),
    ]).then(() => {
      OfferCreationActions.loadOffer(offerId);
    });
  });
}
If you want these sections to be completed before getting into that big Promise.all, change your code as follows:
resetOfferOverall() {
  Promise.all([OfferCreationActions.resetOffer()]).then(async () => {
    await OfferCreationActions.updateOfferCreation('viewOnly', true);
    await OfferCreationActions.updateOfferCreation('loadingOffer', true);
    // await stops code execution until the above async calls have completed
    Promise.all([
      OfferCreationActions.loadBarcodes(),
      OfferCreationActions.loadBrandBarcodes(),
      OfferCreationActions.loadBrands(),
      OfferCreationActions.loadPayers(),
      OfferCreationActions.loadSegments(),
      OfferCreationActions.loadTactics(),
      OfferCreationActions.loadStates(),
    ]).then(() => {
      // Now this call will not run until everything above has resolved!
      OfferCreationActions.loadOffer(offerId);
    });
  });
}
Make sure the actions you are waiting on return a promise.
Whatever promises you wait on, if you do not actually return the relevant promise from within the call itself, your code will not work properly. Let's consider the loadBarcodes action and assume you use axios to fetch data:
loadBarcodes() {
  // This "return" right here is vital to get your promises to behave properly.
  // If you did not return this promise, the call would resolve immediately.
  return axios.get('https://localhost:8080/api/barcodes/').then((response) => {
    // DISPATCH_TO_STORE
  });
}
In your component, watch the relevant store and show a loader until the payload has been loaded into the store.
As you can see, by relying on a store update to show the data we do not break the unidirectional data flow.
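A minimal sketch of that last point (the component and store field names here are hypothetical, not from the original code): the component renders a loader while the store still reports loading, and the real content only once the payload has been dispatched:
render() {
  // "loadingOffer" stands for whatever flag your store exposes while the requests are in flight
  if (this.state.loadingOffer) {
    return <Spinner />; // hypothetical loader component
  }
  return <OfferDetails offer={this.state.offer} />; // hypothetical presentation component
}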
I've got a list of URLs I need to request from an API; however, in order to avoid causing a lot of load, I would ideally like to perform these requests with a gap of x seconds between them. Once all the requests are completed, certain logic that doesn't matter here follows.
There are many ways to go about it; I've implemented a couple:
A) Using a recursive function that goes over an array holding all the URLs and calls itself when each request is done and a timeout has elapsed.
B) Setting timeouts for every request in a loop with incremental delays and returning promises which, upon resolution via Promise.all, execute the rest of the logic (see the sketch below).
Both of these work. However, what would you say is the recommended way to go about this? This is more of an academic question, and as I'm doing this to learn, I would rather avoid using a library that abstracts away the interesting parts.
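For reference, option B might look roughly like this (just a sketch; makeRequest stands in for whatever actually performs the API call):
const gapMs = 1000; // x seconds between request starts

const promises = urls.map((url, i) =>
  new Promise(resolve => setTimeout(resolve, i * gapMs)) // incremental delay per request
    .then(() => makeRequest(url))
);

Promise.all(promises).then(results => {
  // all requests are done; run the follow-up logic here
});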
Your solutions are almost identical. Though I would choose a slightly different approach: I would make an initial promise and a sleep promise function, then chain them together.
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

ApiCall()
  .then(() => sleep(1000))
  .then(() => nextApiCall())...
Or a more modular version:
var promise = Promise.resolve();
myApiCalls.forEach(call => {
  promise = promise.then(call).then(() => sleep(1000));
});
In the end, go with what you understand, what makes the most sense to you, and what you will still understand in a month. The solution you can read best is your preferred solution; performance won't matter here.
You could use something like this to throttle per period.
If you want all URLs to be processed even when some fail, you could catch the failed ones and pick them out of the result.
The code would look something like this:
const Fail = function(details){ this.details = details; };
const twoPerSecond = throttlePeriod(2, 1000);
const urls = ["http://url1", "http://url2", ..."http://url100"];
Promise.all( // even though 100 promises are created, only 2 per second will be started
  urls.map(
    (url) =>
      // pass the fetch function to twoPerSecond; twoPerSecond will return a promise
      // immediately but will not start the fetch until there is an available time slot
      twoPerSecond(fetch)(url)
        .catch(e => new Fail([e, url]))
  )
)
.then(
  results => {
    const failed = results.filter(result => (result && result.constructor) === Fail);
    const succeeded = results.filter(result => (result && result.constructor) !== Fail);
  }
);
What is the best way to determine whether the subscriber has finished executing, or better yet, to return something and catch it upstream? For example:
this._subscriptions.push(this._client
  .getCommandStream(this._command) // Returns an IObservable from a Subject stream
  .subscribe(msg => {
    // Do some processing, maybe some promise stuff
    http.request(url).then(
      // some more stuff
    );
  }));
What's the best way to know that the subscription has finished? I've implemented it as follows:
this._subscriptions.push(this._client
  .getCommandStream(this._command)
  .subscribe(msg => {
    // Do some processing, maybe some promise stuff
    http.request(url).then(re => {
      // some more stuff
      msg.done();
    }).catch(err => msg.done(err));
  }));
i.e. I added a done method to the object being passed in, to flag when processing has finished. The issue with that is that I'll have to call done in every promise or catch block, which I find a little too exhaustive. Is there a cleaner and more automated way of doing this?
I think the examples I've given are not good enough. This implementation is using Rx to build an internal messaging bus. getCommandStream actually returns a read-only channel (as an Observable) to get commands and process them. The processing could be an HTTP request followed by many other things, or just an if statement.
this._client
  .getCommandStream(this._command) // Returns an IObservable from a Subject stream
  .subscribe(msg => {
    // Do some processing, maybe some promise stuff
    http.request(url).then(() => {
      // some more stuff
    }).then(() => {
      // Here I want to do some file IO
      if (x) {
        file.read('path', (content) => {
          msg.reply(content);
          msg.done();
        });
      } else {
        // Or maybe not do file IO, or maybe even do some image processing
        msg.reply("pong");
        msg.done();
      }
    });
  });
I feel like this is a fine usage of the Observable pattern, as this is exactly a sequence of commands coming in that this logic wants to act on. The question is: notice msg.done() being called all over the place. I want to know the best way to limit that call and to know when the entire thing is done. Another option is to wrap it all in a Promise, but then again, what's the difference between resolve and msg.done()?
Actually, making another asynchronous request inside subscribe() isn't recommended, because it just makes things more complicated, and using Rx in this way doesn't help make your code more understandable.
Since you need to make a request to a remote service that returns a Promise, you can merge it into the chain:
this._subscriptions.push(this._client
  .getCommandStream(this._command)
  .concatMap(msg => http.request(url))
  .subscribe(...));
Also the 3rd parameter to subscribe is a callback that is called when the source Observable completes.
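For example (sketch, continuing the chain above with the three subscribe callbacks):
this._subscriptions.push(this._client
  .getCommandStream(this._command)
  .concatMap(msg => http.request(url))
  .subscribe(
    result => { /* next: one emission per processed command */ },
    err => { /* error */ },
    () => console.log('command stream completed') // fires only when getCommandStream completes
  ));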
You can also add your own teardown logic when the chain is being disposed. This is called after the complete callback in subscribe(...) is called:
// keep a reference to the subscription itself (Array.push returns the new length, not the subscription)
const subscription = this._client
  ...
  .subscribe(...);

subscription.add(() => doWhatever());
this._subscriptions.push(subscription);
Btw, this is equivalent to using the finally() operator.
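For example (sketch; in RxJS 5 the instance operator is called finally(), and it was renamed finalize() as a pipeable operator in RxJS 6):
this._client
  .getCommandStream(this._command)
  .concatMap(msg => http.request(url))
  .finally(() => doWhatever()) // runs on complete, error, or unsubscribe
  .subscribe(...);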
As per the RxJS subscribe method documentation, the last argument is the completed (onCompleted) callback:
var source = Rx.Observable.range(0, 3);

var subscription = source.subscribe(
  function (x) {
    console.log('Next: %s', x);
  },
  function (err) {
    console.log('Error: %s', err);
  },
  function () {
    console.log('Completed');
  });
Please refer to this documentation:
https://github.com/Reactive-Extensions/RxJS/blob/master/doc/api/core/operators/subscribe.md