I am trying to operate on all children at a Firebase Realtime Database location, but I am running into a problem that I cannot solve. Using the "child_added" event to download the data invokes the callback once for every child at that location. Normally, using promises in this kind of situation would be fine; however, because the same callback is invoked multiple times, my function ends up continuing after on('child_added') fires once and misses all the other calls. I have no idea how to remedy this; all suggestions are appreciated!
Don't use child_added, because it never stops listening, and your function must terminate as soon as possible. Instead, use once('value') to fetch all the data, and use its returned promise to continue when the snapshot is available. It might look similar to this sample, among many others in that repo.
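A minimal sketch of that approach, with placeholder paths and a placeholder export name (not from the original question):

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.processItems = functions.database.ref('/some/trigger').onWrite((change, context) => {
  // once('value') reads the current data a single time and returns a promise;
  // returning that promise tells Cloud Functions when the work is finished.
  return admin.database().ref('/items').once('value').then((snapshot) => {
    snapshot.forEach((child) => {
      console.log(child.key, child.val());
    });
  });
});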
You can still use promises, but instead of finishing the function after a single promise resolves, wait for all of them.
I'm not sure what your Cloud Function looks like, but let's assume it's a database trigger:
functions.database.ref('/some/trigger').onWrite(async (change, context) => {
  // use once('value') here: on('child_added') would keep listening, and its
  // callbacks would only fire after Promise.all had already been returned
  const snapshot = await admin.database().ref('/some_child').once('value');
  const promises = [];
  snapshot.forEach((snap) => {
    const pushRef = admin.database().ref('/some_path').push(snap.val()); // Some pretend async operation
    promises.push(pushRef);
  });
  return Promise.all(promises);
});
In this example I'm listening to /some/trigger, then fetching all the children at the path /some_child in one go.
I'll then save each child into a new object under /some_path.
Each promise is pushed into an array, and Promise.all causes the function to wait for all of the promises (writes) to resolve.
I have a cloud function in Firebase that, among a chain of promise invocations, ends with a call to this function:
function sendEmail() {
  return new Promise((accept) => {
    const Email = require('email-templates');
    const email = new Email({...});
    email.send({...}).then(() => {
      console.log('Email sent');
    }).catch((e) => {
      console.error(e);
    });
    accept();
  });
}
I am well aware that email.send() returns a promise. There's a problem with this approach, however: if I were to change the function to:
function sendEmail() {
  const Email = require('email-templates');
  const email = new Email({...});
  return email.send({...});
}
It usually results in the UI hanging for a significant amount of time (10+ seconds), because the time it takes for the promise to resolve equals the time it takes for the email to send.
That's why I figured the first approach would be better: just call email.send() without waiting for it, since it'll send the email eventually, and return a response to the client whether or not the email has finished its round trip.
The first approach is giving me problems. The cloud function finishes execution much faster, and thus ends up being a better experience for the user; however, the email doesn't send for another 15+ minutes.
I am considering another approach where we have a separate cloud function hook that handles the email sending, but I wanted to ask StackOverflow first.
I think there are two aspects being mixed here.
One side of the question deals with promises in the context of Cloud Functions. Promises in Cloud Functions need to be resolved before you call res.send(), because right after that call the function will be shut down, and there's no guarantee that unresolved promises will complete before the function instance is terminated; see this question. You might as well never call res.send() and instead return the result of a promise, as shown in the Firebase documentation. The key here is to ensure the promise is resolved properly, for example using an idiom like return myPromise().then(console.log); which forces the promise resolution.
Separately, as Bergi pointed out in the comments, the first snippet uses an anti-pattern with promises, and the second one is far more concise and clear. If you're experiencing a delay in the UI, it's likely that execution is frozen waiting for the function's response, and you might consider whether this could be avoided in your particular use case.
All that said, your last idea of creating a separate function to deal with the email send process would also likely reduce the response time, and could even make more sense from a separation-of-concerns point of view. To go this route, I would suggest sending a Pub/Sub message from the main function so that a second one sends the email. Moreover, Pub/Sub-triggered functions let you configure retry policies, which may be useful to ensure the mail is sent even in the face of transient errors. This approach is also suggested in the question linked above.
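A rough sketch of that split, assuming a recent @google-cloud/pubsub client; the topic name, payload shape, addresses, and template name are made up for illustration, not from the question:

const functions = require('firebase-functions');
const { PubSub } = require('@google-cloud/pubsub');
const pubsub = new PubSub();

// In the main function: publish a message and return immediately, so the
// client is not kept waiting on the SMTP round trip.
function queueEmail(payload) {
  return pubsub.topic('send-email').publishMessage({ json: payload });
}

// A second function does the slow work. Retries can be enabled for this
// function at deploy time, so transient send failures get retried.
exports.sendQueuedEmail = functions.pubsub.topic('send-email').onPublish((message) => {
  const Email = require('email-templates');
  const email = new Email({ message: { from: 'noreply@example.com' } });
  return email.send({ template: 'welcome', message: { to: message.json.to } });
});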
Hi, I am seeing some very strange behaviour.
I am iterating over some documents and creating a promise for each one, so that as each document is fetched the UI is updated.
However, although the promises are independent, Firestore / AngularFire seems to wait for all of them.
Example:
for (const event of events) {
  this.eventService.getEventActivitiesAndSomeStreams(this.user,
    event.getID(),
    [DataLatitudeDegrees.type, DataLongitudeDegrees.type])
    .pipe(take(1)).toPromise().then((fullEvent) => {
      this.logger.info(`Promise completed`);
    });
}
One would expect each promise to complete as its data arrives, printing "Promise completed" one at a time. However, they are all printed at once: there is a long wait until the first console log appears, and then every promise prints it together. So if I had a progress bar, I would expect it to advance little by little, but instead it jumps to completion all at once.
The inner call, this.eventService.getEventActivitiesAndSomeStreams, looks like this:
return this.afs
  .collection('users')
  .doc(userID)
  .collection('events')
  .doc(eventID)
  .collection('activities')
  .doc(activityID)
  .collection('streams', ((ref) => {
    return ref.where('type', 'in', typesBatch);
  }))
  .get()
  .pipe(map((documentSnapshots) => {
    return documentSnapshots.docs.reduce((streamArray: StreamInterface[], documentSnapshot) => {
      streamArray.push(this.processStreamDocumentSnapshot(documentSnapshot)); // Just wraps the JSON object returned by Firestore in a class instance
      return streamArray;
    }, []);
  }));
Now, if I put an await inside the for loop, this works as expected, completing the promises one by one, but then it takes a lot of time.
I also tried to not use AngularFire and use the native JS SDK with the same effect.
I suspect that IndexedDB or some other Firebase internals could be causing this.
What am I missing here, and how can I have the desired behaviour if possible?
You could repro this with a "users" -> "events" -> "something" Firestore collection structure, where each "user" has, let's say, 500 "events" and each of those events has 2 more docs.
So get all the events for the user, and for each one create a promise (inside a for loop) that will return the 2 "something" documents.
This behavior is pretty expected and has nothing at all to do with Firebase. You're iterating over an array and sending out requests. There is no waiting or delay between items, so the for loop (without await statements) finishes in an imperceptibly small amount of time, which means all of the requests are sent out within milliseconds of each other, or basically at the same time. So their responses should be expected to arrive at basically the same time as well.
You've stated that you don't want to use await statements and iterate one by one, so it's tough to know exactly what you do want or expect to happen. Maybe you want them spaced 0.5 seconds apart? If so, you need to write that logic:
import { timer } from 'rxjs';
import { take, mergeMap } from 'rxjs/operators';

timer(0, 500).pipe( // put whatever ms time between requests you want here
  take(events.length),
  // mergeMap (rather than switchMap) so an in-flight request isn't
  // cancelled when the next timer tick fires
  mergeMap(i => {
    return this.eventService.getEventActivitiesAndSomeStreams(this.user,
      events[i].getID(),
      [DataLatitudeDegrees.type, DataLongitudeDegrees.type]).pipe(take(1));
  })
).subscribe(fullEvent => {
  this.logger.info(`Promise completed`);
});
(I removed the promises because I don't see why they're being used in the first place, and this kind of control is easier with RxJS, IMO.)
With the Firebase Web SDK you can do:
commentsRef.on('child_added', function(data) {
  addCommentElement(postElement, data.key, data.val().text, data.val().author);
});
However, I wonder if it's possible to get a Promise back instead of using the callback above, just like the following works:
this.productRef.once('value') // Attach listener
  .then(result => {
    console.log(result.val())
  })
Thanks for the awesome work!
Promises represent a single asynchronous task that completes exactly once: you start something, it either resolves or rejects, and then that's it.
The child_added event, on the other hand, fires zero or more times: once for each existing child, and again whenever a new child is added. That doesn't fit a Promise, so it acts as a stream of events rather than a single asynchronous task.
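That said, if you only care about the first child, once() does return a promise in the Web SDK. A small sketch, reusing the names from the question above:

// once('child_added') resolves with the first child and then detaches,
// which is why it can be a promise; it never sees later children.
commentsRef.once('child_added').then(function(data) {
  addCommentElement(postElement, data.key, data.val().text, data.val().author);
});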
I'm working with AngularJS 1.5 (I'm a real beginner with JS) on a user profile view where the user can change a lot of their data.
In this view the data is split into several sections, and saving all of it means making several calls to the server (due to the design of the app). The problem is that when the user modifies their data, they may only modify part of it, and when they push the save button I don't want to call all the methods, only the necessary ones.
I've programmed a way to detect the changes in the data blocks when the user pushes the save button, so sometimes the controller makes one call and in other cases two or three. The problem is that these calls (made with the $resource library) execute asynchronously, and I would like to control this better.
I would like to do the following: store all the calls in a list or array without executing them, and afterwards execute them all at (more or less) the same time. If any of them fails I would like to show a generic error message to the user, but internally log which call failed; likewise, show a single success message only when all the calls have finished successfully (not one message per successful call).
I don't know how to do this. Some colleagues tell me that maybe I need the $q AngularJS service, or to store the promises that $resource returns and execute them all afterwards (I've tried this without success), or to work with plain JS promises.
Can anyone give me any ideas?
Finally, I resolved my problem using $q. At first I wanted to store the calls without executing them (I thought that would be better), but in the end I found that just storing the promises of those calls was enough for my purposes. So, this is a skeleton of the solution I ended up with:
At the beginning of the controller
var promises = [];
Everywhere I need to make a controlled call inside the save-user-data function:
var deferred = $q.defer();
var promise = vm.teacher.$update(
  function () { // Success
    deferred.resolve('Success updating the teacher.');
  },
  function (error) { // Fail
    deferred.reject('Error updating the teacher, error: ' + error);
  });
promises.push(deferred.promise);
...
... vm.otherItems.$update ...
...
And at the end of this function, something like this:
$q.all(promises).then(
  function (value) {
    console.log('Resolving all promises, SUCCESS, value: ');
    console.log(value);
    toastService.showToast('Success updating the teacher.');
    // Clear the old promises for the next iteration (if any)
    promises = [];
  },
  function (reason) {
    console.log('Resolving all promises, FAIL, reason: ');
    console.log(reason);
    toastService.showToast('Fail updating the teacher.');
  }
);
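(As a side note: $resource instance actions like $update should return a promise directly, so the deferred above could probably be dropped. A possible simplification:)

// $update() itself returns a promise, so it can be pushed directly;
// success/error messages are mapped onto resolution and rejection.
promises.push(vm.teacher.$update().then(
  function () { return 'Success updating the teacher.'; },
  function (error) { return $q.reject('Error updating the teacher, error: ' + error); }
));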
Thanks for the help!
Recently I made a web scraper in Node.js using promises. I created a Promise for each URL I wanted to scrape and then used the all method:
var fetchUrlArray = [];
for (...) {
  var mPromise = new Promise(function (resolve, reject) {
    http.get(...); // resolve or reject based on the response
  });
  fetchUrlArray.push(mPromise);
}
Promise.all(fetchUrlArray).then(...);
There were thousands of URLs, but only a few of them timed out. I got the impression that it was handling 5 promises in parallel at a time.
My question is: how exactly does Promise.all() work? Does it:
call each promise one by one, switching to the next only once the previous one has resolved;
or process the promises in batches of a few at a time from the array;
or fire off all the promises at once?
What is the best way to solve this problem in Node.js? Because as it stands, I can solve this problem way faster in Java/C#.
What you pass Promise.all() is an array of promises. It knows absolutely nothing about what is behind those promises. All it knows is that those promises will get resolved or rejected sometime in the future and it will create a new master promise that follows the sum of all the promises you passed it. This is one of the nice things about promises. They are an abstraction that lets you coordinate any type of action (usually asynchronous) without regard for what type of action it is. As such, promises have literally nothing to do with the actual action. All they do is monitor the completion or error of the action and report that back to those agents following the promise. Other code actually runs the action.
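A tiny illustration of that separation: the work below starts the moment the promises are created, and Promise.all() merely observes it.

// The timers begin immediately, before Promise.all() is ever called;
// all() just waits for both and collects the results in order.
const p1 = new Promise((resolve) => setTimeout(resolve, 100, 'a'));
const p2 = new Promise((resolve) => setTimeout(resolve, 200, 'b'));
Promise.all([p1, p2]).then((values) => console.log(values)); // ['a', 'b'] after ~200ms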
In your particular case, you are immediately calling http.get() in a tight loop and your code (nothing to do with promises) is launching a zillion http.get() operations at once. Those will get fired as fast as the underlying transport can do them (likely subject to connection limits).
If you want them to be launched serially or in batches of say 10 at a time, then you have to code it that way yourself. Promises have nothing to do with that.
You could use promises to help you launch them serially or in batches, but either way it would take extra code on your part to make that happen.
The Async library is specifically built for running things in parallel, but with a maximum number in flight at any given time, because this is a common scheme where you either have connection limits on your end or you don't want to overwhelm the receiving server. You may be interested in its parallelLimit method, which does exactly that.
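If you'd rather not pull in a dependency, here's a rough sketch of that "maximum in flight" idea with plain promises; runWithLimit, urls, and fetchUrl are made-up names standing in for your own list and request function:

// Runs task(item) for every item, keeping at most `limit` tasks in flight.
function runWithLimit(items, limit, task) {
  var results = new Array(items.length);
  var next = 0;
  function worker() {
    var i = next++;
    if (i >= items.length) return Promise.resolve();
    return Promise.resolve(task(items[i])).then(function (result) {
      results[i] = result;
      return worker(); // this worker moves on to the next pending item
    });
  }
  var workers = [];
  for (var w = 0; w < Math.min(limit, items.length); w++) workers.push(worker());
  return Promise.all(workers).then(function () { return results; });
}

// usage: runWithLimit(urls, 10, fetchUrl).then(function (all) { console.log(all.length); });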
I would do it like this
Personally, I'm not a big fan of Promises. I think the API is extremely verbose and the resulting code is very hard to read. The method defined below results in very flat code and it's much easier to immediately understand what's going on. At least IMO.
Here's a little thing I created for an answer to this question
// void asyncForEach(Array arr, Function iterator, Function callback)
//   * iterator(item, done) - done can be called with an err to short-circuit to callback
//   * callback(err) - receives an error if an iterator sent one
function asyncForEach(arr, iterator, callback) {
  // create a cloned queue of arr
  var queue = arr.slice(0);
  // create a recursive iterator
  function next(err) {
    // if there's an error, bubble to callback
    if (err) return callback(err);
    // if the queue is empty, call the callback with no error
    if (queue.length === 0) return callback(null);
    // call the iterator with our next task
    // we pass `next` here so the task can let us know when to move on
    iterator(queue.shift(), next);
  }
  // start the loop
  next();
}
You can use it like this:
var http = require("http");

var urls = [
  "http://example.com/cat",
  "http://example.com/hat",
  "http://example.com/wat"
];

function eachUrl(url, done) {
  http.get(url, function (res) {
    // do something with res
    done();
  }).on("error", function (err) {
    done(err);
  });
}

function urlsDone(err) {
  if (err) throw err;
  console.log("done getting all urls");
}

asyncForEach(urls, eachUrl, urlsDone);
Benefits of this
no external dependencies or beta APIs
reusable on any array you want to perform async tasks on
non-blocking, just as you've come to expect with node
could be easily adapted for parallel processing
by writing your own utility, you better understand how this kind of thing works
If you just want to grab a module to help you, look into async and the async.eachSeries method.
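For completeness, a sketch of what that would look like with the async module, reusing the urls, eachUrl, and urlsDone definitions above:

// npm install async
var async = require("async");
// eachSeries runs eachUrl for one url at a time, then calls urlsDone
async.eachSeries(urls, eachUrl, urlsDone);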
First, a clarification: a promise represents the future result of a computation, nothing else. It does not represent the task or computation itself, which means it cannot be "called" or "fired".
Your script does create all those thousands of promises immediately, and each of those creations does call http.get immediately. I would suspect that the http library (or something it depends on) has a connection pool with a limit on how many requests can run in parallel, and implicitly defers the rest (older versions of Node.js, for example, capped http.Agent at 5 concurrent sockets per host by default, which would match the behaviour you observed).
Promise.all does no "processing" itself: it is not responsible for starting the tasks or resolving the passed promises. It only listens to them, checks whether they have all settled, and returns a promise for that eventual result.