In the following code,
Promise.allSettled( [ entry_save(), save_state(), get_HTML() ] ).then( ... );
promises entry_save and save_state are both readwrite database transactions and get_HTML is readonly. The two readwrite transactions could be combined, but doing so complicates the undo/redo chain that is maintained, and it ties the success and rollback of the two together, which is undesired.
The entry_save transaction needs to write before the save_state transaction. Before moving entry_save into the Promise.allSettled, that is how it worked, because the entry_save transaction was created before those of the others. This MDN article explains how the order in which requests are performed is based upon when the transactions are created, independently of the order in which the requests are made.
My question is: does the synchronous code of each promise run in the order in which the promises are placed in the array, such that placing entry_save first will always result in its transaction being created first, guaranteeing that its database requests will be performed first?
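As a quick sanity check on one premise here, this minimal sketch (unrelated to my real code) suggests that the executor passed to a Promise constructor runs synchronously:

```javascript
const log = [];
const p = new Promise(resolve => {
  log.push('executor ran');      // runs immediately, during construction
  resolve();
});
log.push('after constructor');
console.log(log);                // ['executor ran', 'after constructor']
```

If that holds generally, the transactions would be created in array order, before any asynchronous work begins.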
Although it works and is quick enough, I'd prefer to not do this:
entry_save().then( () => { Promise.allSettled( [ save_state(), get_HTML() ] ) } ).then( ... );
If it matters, that's not exactly the way it is written; it's more consistent with:
entry_save().then( intermediate ); where intermediate invokes the Promise.allSettled.
Thank you.
To clarify a bit, below is the example given in the MDN document cited above.
var trans1 = db.transaction("foo", "readwrite");
var trans2 = db.transaction("foo", "readwrite");
var objectStore2 = trans2.objectStore("foo");
var objectStore1 = trans1.objectStore("foo")
objectStore2.put("2", "key");
objectStore1.put("1", "key");
After the code is executed the object store should contain the value "2", since trans2 should run after trans1.
If entry_save creates trans1 and save_state creates trans2, all in the synchronous code of the functions (that is, not within an onsuccess or onerror handler of a database request or similar), will the MDN example not hold?
Thus, where @jfriend00 writes,
The functions are called in the order they are placed in the array,
but that only determines the order in which the asynchronous
operations are started.
will the timing of the write requests still be ordered by the creation of the transactions, since the transactions are created in the synchronous code before any asynchronous work can commence?
I'd like to test it but I'm not sure how. If two nearly identical promises are used in a Promise.allSettled, how can the write request of the first created transaction be delayed such that it takes place after the write request of the second created transaction, to test if it will be written first? A setTimeout should terminate the transaction. Perhaps a long-running synchronous loop placed before the request.
The code at the very end of this question may illustrate more precisely what I am trying to ask. It takes the MDN example from the article cited above and spreads it across two promises placed in a Promise.allSettled, both of which attempt to write to the same object store from within the onsuccess event of a get request.
The question was whether the same principle from the article (the first transaction created writes before the second transaction created, regardless of the order in which the requests are made) still holds in this setup. Since the synchronous portions of the promises will process in the order the promises are placed in the array, the transaction in promise p_1 will be created before that of p_2. However, the put request in the onsuccess event of the get request in p_1 is delayed by the loop building a large string. The question is: will p_1 still write before p_2?
In experimenting with this, I cannot get p_2 to write before p_1. Thus, it appears that the MDN example applies even in this type of setup. However, I cannot be sure why, because I don't understand how the JS code is really interpreted/processed.
For example, why can the req.onsuccess function be defined after the request is made? I asked that question some time ago but still don't know enough to be sure that it doesn't affect the way I attempted to add a delay here. I know that it won't work the other way around; my point is that I'm not sure how the browser handles that synchronous loop before the put request is made in p_1, so I can't know for sure that this example demonstrates the MDN article ALWAYS holds in this setup. However, I can observe that the requests take longer to complete as the number of loop iterations increases; and, in every case I have observed, p_1 always writes before p_2. The only way p_2 writes before p_1 is if p_1 doesn't write at all because the string takes up too much memory, causing the transaction in p_1 to be aborted.
That being said, and returning to the fuller setup of my question (three promises in the array of the Promise.allSettled, compared to requiring entry_save to complete before commencing a Promise.allSettled on the two remaining promises), in the full code of my project, for reasons I am not sure of, the latter is quicker: waiting for entry_save to complete is quicker than including it in the Promise.allSettled.
I was expecting it to be the other way around. The only reason I can think of at this point is that, since entry_save and save_state both write to the same object store, perhaps whatever the browser does that is equivalent to locking the object store until the first transaction (the one in entry_save) completes, and then removing the lock, takes longer than simply requiring that entry_save complete before the Promise.allSettled commences, which avoids the contention. I thought that everything would be ready "in advance", just waiting for the two put requests to take place in transaction order. They did take place in order, but more slowly, or at least not as quickly as using:
entry_save().then( () => { Promise.allSettled( [ save_state(), get_HTML() ] ) } ).then( ... );
instead of:
Promise.allSettled( [ entry_save(), save_state(), get_HTML() ] ).then( ... );
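For clarity, the quicker sequenced form is structurally equivalent to this async/await sketch (with hypothetical stand-in functions, not my real ones):

```javascript
// Hypothetical stand-ins for the real database functions:
const entry_save = async () => 'entry saved';
const save_state = async () => 'state saved';
const get_HTML   = async () => 'html';

// Sequenced form: entry_save completes before the other two start.
async function save_all() {
  await entry_save();
  return Promise.allSettled([ save_state(), get_HTML() ]);
}
```

Only the shape matters here; the real functions return promises that wrap IndexedDB transactions.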
function p_all() { Promise.allSettled( [ p_1(), p_2() ] ); }
function p_1()
{
return new Promise( ( resolve, reject ) =>
{
let T = DB.transaction( [ 'os_1', 'os_2' ], 'readwrite' ),
q = T.objectStore( 'os_1' ),
u = T.objectStore( 'os_2' ),
req = q.get( 1 );
T.oncomplete = resolve;
T.onerror = () => reject( T.error );
req.onsuccess = () =>
{
let i, t = '', r = req.result;
// Long synchronous loop to delay the put request in this transaction.
for ( i = 1; i < 10000000; i++ ) t = t + 'This is a string';
r.n = 'p1';
u.put( r );
console.log( r );
};
}); }
function p_2()
{
return new Promise( ( resolve, reject ) =>
{
let T = DB.transaction( [ 'os_1', 'os_2' ], 'readwrite' ),
q = T.objectStore( 'os_1' ),
u = T.objectStore( 'os_2' ),
req = q.get( 1 );
T.oncomplete = resolve;
T.onerror = () => reject( T.error );
req.onsuccess = () =>
{
let r = req.result;
r.n = 'p2';
u.put( r );
console.log( r );
};
}); }
IndexedDB will maintain the order of the transactions in the order they were created, except when those transactions do not overlap (e.g. do not involve the same store out of the set of stores each one involves). This is pretty much regardless of what you do at the higher promise layer.
At the same time, it may be unwise to rely on that behavior, because it is implicit and a bit confusing, so it is reasonable to linearize with promises. The only real catch is when you need maximum performance, which I doubt applies here.
see https://www.w3.org/TR/IndexedDB-2/#transaction-construct
see are indexeddb/localforage reads resolved from a synchronous buffer?
Moreover, a promise's executor begins running synchronously at the time the promise is created; it just does not necessarily settle at that time, it settles eventually rather than immediately. That means the IndexedDB calls happen in the order you create the promises that wrap them, which means the ordering relies on the order in which you create the transactions.
This holds regardless of which promise wins the race, and regardless of using Promise.all.
Also, just FYI, Promise.all retains the array order even if the promises complete out of order, but do not let that throw you off.
When you do this:
Promise.allSettled( [ entry_save(), save_state(), get_HTML() ] ).then(...)
It's equivalent to this:
const p1 = entry_save();
const p2 = save_state();
const p3 = get_HTML();
Promise.allSettled([p1, p2, p3]).then(...);
So, the individual function calls you issue, such as save_state(), are STARTED in the order specified. But each of those calls is asynchronous, so the internal order of what happens before something else really depends upon what they do inside, as they can all be in flight at the same time and parts of their execution can be interleaved in an indeterminate order.
Imagine that entry_save() actually consists of multiple asynchronous pieces such as first reading some data from disk, then modifying the data, then writing it to the database. It would call the first asynchronous operation to read some data from disk and then immediately return a promise. Then, save_state() would get to start executing. If save_state() just immediately issued a write to the database, then it very well may write to the database before entry_save() writes to the database. In fact, the sequencing of the two database writes is indeterminate and racy.
If you need entry_save() to complete before save_state(), then the above is NOT the way to code it at all. Your code is not guaranteeing that all of entry_save() is done before any of save_state() runs.
Instead, you SHOULD do what you seem to already know:
entry_save().then( () => { Promise.allSettled( [ save_state(), get_HTML() ] ) } ).then( ... );
Only that guarantees that entry_save() will complete before save_state() gets to run. And, this assumes that you're perfectly OK with save_state() and get_HTML() running concurrently and in an unpredictable order.
My question is: does the synchronous code of each promise run in the order in which the promises are placed in the array, such that placing entry_save first will always result in its transaction being created first, guaranteeing that its database requests will be performed first?
The functions are called in the order they are placed in the array, but that only determines the order in which the asynchronous operations are started. After that, they are all in flight at the same time, and the internal timing between them depends upon how long their individual asynchronous operations take and what those asynchronous operations do. If order matters, you can't just put them all in an indeterminate race. That's called a "race condition". Instead, you would need to structure your code to guarantee that the desired operation goes first, before the ones that need to execute after it.
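A minimal sketch (with placeholder task functions, not your real ones) illustrating that split between "started in order" and "interleaved afterwards":

```javascript
const order = [];

function task(name) {
  order.push(name + ':sync');        // synchronous part runs during the call itself
  return Promise.resolve().then(() => {
    order.push(name + ':async');     // asynchronous part runs later
  });
}

Promise.allSettled([ task('a'), task('b') ]).then(() => {
  console.log(order);  // ['a:sync', 'b:sync', 'a:async', 'b:async']
});
```

Both synchronous parts run to completion, in array order, before any asynchronous part runs; with real I/O the asynchronous parts could complete in any order.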
I have an array of objects. For each object I need to trigger an asynchronous request (an HTTP call). But I only want to have a certain maximum number of requests running at the same time. Also, it would be nice (but not necessary) if I could have one single synchronization point after all requests have finished, to execute some code.
I've tried suggestions from:
Limit number of requests at a time with RxJS
How to limit the concurrency of flatMap?
Fire async request in parallel but get result in order using rxjs
and many more... I even tried making my own operators.
Either the answers on those pages are too old to work with my code or I can't figure out how to put everything together so all types fit nicely.
This is what I have so far:
for (const obj of objects) {
this.myService.updateObject(obj).subscribe(value => {
this.anotherService.set(obj);
});
}
EDIT 1:
Ok, I think we're getting there! With the answers of Julius and pschild (both seem to work equally) I managed to limit the number of requests. But now it will only fire the first batch of 4 and never fire the rest. So now I have:
const concurrentRequests = 4;
from(objects)
.pipe(
mergeMap(obj => this.myService.updateObject(obj), concurrentRequests),
tap(result => this.anotherService.set(result))
).subscribe();
Am I doing something wrong with the subscribe()?
Btw: the resultSelector parameter of mergeMap is deprecated, so I used mergeMap without it.
Also, the obj of the mergeMap is not visible in the tap, so I had to use tap's parameter
EDIT 2:
Make sure your observers complete! (It cost me a whole day)
You can use the third parameter of mergeMap to limit the number of concurrent inner subscriptions. Use finalize to execute something after all requests finished:
const concurrentRequests = 5;
from(objects)
  .pipe(
    mergeMap(obj => this.myService.updateObject(obj), concurrentRequests),
    tap(res => this.anotherService.set(res)),
    finalize(() => console.log('Sequence complete'))
  )
  .subscribe();
See the example on Stackblitz.
from(objects).pipe(
  bufferCount(10),
  concatMap(objs => forkJoin(objs.map(obj =>
    this.myService.updateObject(obj).pipe(
      tap(value => this.anotherService.set(obj))
    )
  ))),
  finalize(() => console.log('all requests are done'))
).subscribe();
The code is not tested, but you get the idea. Let me know if there is any error or an explanation is needed.
I had the same issue once, when I tried to load multiple images from a server and had to send the HTTP requests one after another. I achieved the desired outcome using an awaited promise. Here is the sample code:
async ngOnInit() {
for (const number of this.numbers) {
await new Promise(resolve => {
this.http.get(`https://jsonplaceholder.typicode.com/todos/${number}`).subscribe(
data => {
this.responses.push(data);
console.log(data);
resolve();
}
);
});
}
}
The main idea here is to resolve the promise once you get the response.
With this technique you can come up with custom logic to execute one method once all the requests have finished.
Here is the stackblitz. Open up the console to see it in action. :)
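The same sequential pattern, stripped down to plain promises (a sketch with a stubbed request function standing in for the HTTP call):

```javascript
// Stub standing in for an HTTP call: resolves with the given value after a delay.
const fakeRequest = value =>
  new Promise(resolve => setTimeout(() => resolve(value), 10));

async function fetchSequentially(items) {
  const responses = [];
  for (const item of items) {
    // Each request is awaited before the next one starts.
    responses.push(await fakeRequest(item));
  }
  return responses;
}
```

Because the loop awaits each promise, only one request is in flight at a time, and the returned promise resolves once all of them have finished.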
I am implementing a search input in my React app which may take several seconds to get a response from the server, depending on the size of the result set. The issue I am having right now is that if the user searches for a term which has a large result set and then searches for a term with a small result set straight after, the first search returns after the last search and overwrites the data, therefore displaying incorrect results in the table.
Currently I have a search component which calls an async function in its parent (this function can be called from several child components, Search being one of them). Inside the async function I have an await call to the service with the search query. Once that returns the results are passed to a function which updates some state.
I have read about cancel tokens but i'm not totally sure how to implement this. When the search component makes the initial call, there will be a promise which will be pending until the result is received. When I search again, how would I be able to ignore the result of the first promise? Each time I search, would I store the promise in a field of the component and somehow check this field in future searches?
I read many possible solutions to this online. I am using fetch-retry to handle the API call and would rather not use a library such as Bluebird or axios. The main part I don't understand is how to have my promise not resolve if a newer promise has been created in the meantime.
Hope I explained this well enough!
Thanks
When I search again, how would I be able to ignore the result of the first promise?
You probably don't want that, as you are wasting some bandwidth if you make a request and ignore its result. Instead you should cancel the underlying request (not the promise; promises can't be cancelled directly).
To do so you could keep an AbortController for each search:
this.controller = new AbortController();
Then launch all fetch requests as:
fetch(url, { signal: this.controller.signal })
.then(res => res.json())
.then(res => this.setState({ /*...*/ }))
.catch(() => /*..*/)
Now if you restart the search just do:
this.controller.abort();
this.controller = new AbortController();
and do the fetching again.
Read on
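If you do decide to simply ignore stale results rather than aborting the request, a common sketch (a hypothetical helper, not from any library) tags each call with an incrementing id and drops results from superseded calls:

```javascript
// Wrap an async function so that only the most recent call's result is kept;
// calls that were superseded by a newer call resolve to null instead.
function makeLatestOnly(fn) {
  let latest = 0;
  return async (...args) => {
    const id = ++latest;                    // tag this call
    const result = await fn(...args);
    return id === latest ? result : null;  // a newer call started meanwhile
  };
}
```

In the component, you would wrap the search function once and ignore null results in setState; unlike AbortController, this still lets the stale request run to completion.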
I've got a list of URLs I need to request from an API; however, in order to avoid causing a lot of load, I would ideally like to perform these requests with a gap of x seconds. Once all the requests are completed, certain logic that doesn't matter here follows.
There are many ways to go about it; I've implemented a couple:
A) Using a recursive function that goes over an array holding all the URLs and calls itself when each request is done and a timeout has elapsed
B) Setting timeouts for every request in a loop with incremental delays and returning promises which, upon resolution using Promise.all, execute the rest of the logic, and so on.
These both work. However, what would you say is the recommended way to go about this? This is more of an academic type of question, and as I'm doing this to learn, I would rather avoid using a library that abstracts away the juice.
Your solutions are almost identical, though I would choose a slightly different approach: I would make an initial promise and a sleep promise function, then chain them together.
function sleep(ms){
return new Promise(resolve => setTimeout(resolve, ms));
}
ApiCall()
.then(() => sleep(1000))
.then(() => nextApiCall())...
Or a more modular version:
var promise = Promise.resolve()
myApiCalls.forEach(call => {
promise = promise.then(call).then(() => sleep(1000))
})
In the end, go with what you understand, what makes the most sense to you, and what you will still understand in a month. The one you can read best is your preferred solution; performance won't matter here.
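The same chaining idea written with async/await (a sketch; the ApiCall names above are placeholders, so here the calls are passed in as an array):

```javascript
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

// Run the calls one after another, with a gap between them.
async function runWithGap(calls, gapMs) {
  const results = [];
  for (const call of calls) {
    results.push(await call());
    await sleep(gapMs);
  }
  return results;  // single synchronization point: resolves after the last call
}
```

The returned promise doubles as the "everything finished" hook, so the follow-up logic can just be chained or awaited on it.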
You could use something like this to throttle per period.
If you want all urls to be processed even when some fail you could catch the failed ones and pick them out in the result.
Code would look something like this:
const Fail = function(details){ this.details = details; };
const twoPerSecond = throttlePeriod(2, 1000);
const urls = ["http://url1", "http://url2", /* ... */ "http://url100"];
Promise.all( // even though 100 promises are created, only 2 per second will be started
  urls.map(
    (url) =>
      // pass the fetch function to twoPerSecond; twoPerSecond returns a promise
      // immediately but will not start the fetch until there is an available time slot
      twoPerSecond(fetch)(url)
        .catch(e => new Fail([e, url]))
  )
)
.then(
  results => {
    const failed = results.filter(result => (result && result.constructor) === Fail);
    const succeeded = results.filter(result => (result && result.constructor) !== Fail);
  }
);
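For reference, a minimal sketch of what such a throttlePeriod helper could look like (a hypothetical implementation; the linked one may differ). A call occupies a slot, and the slot frees up one period after the call starts:

```javascript
// At most `count` wrapped calls are started per `periodMs` window.
function throttlePeriod(count, periodMs) {
  let free = count;
  const queue = [];
  const release = () => {
    free++;
    if (queue.length > 0) {
      free--;
      queue.shift()();   // start the next queued call
    }
  };
  return fn => (...args) =>
    new Promise(resolve => {
      const start = () => {
        setTimeout(release, periodMs);  // the slot frees up one period later
        resolve(fn(...args));
      };
      if (free > 0) {
        free--;
        start();
      } else {
        queue.push(start);
      }
    });
}
```

Queued calls keep their original order, and each caller still gets back a promise immediately, which is what lets the Promise.all pattern above work unchanged.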
I am trying to operate on all children in a Firebase Realtime Database directory, but I am running into a problem that I cannot solve. Using the "child_added" trigger to download the data calls the function for every child in the directory. Normally, using promises in this kind of situation would be fine; however, because the same callback is called multiple times, my function ends up continuing after the on('child_added') callback has run once, missing all the other calls. I have no idea how to remedy this; all suggestions are appreciated!
Don't use child_added, because it never stops listening, and your function must terminate as soon as possible. Instead, use once('value') to fetch all the data, and make use of its returned promise to continue when its snapshot is available. It might be similar to this sample, in addition to many others in that repo.
You can still use promises, but instead of ending the function after one promise is returned, wait for all of the promises.
I'm not sure what your Cloud Function looks like, but let's assume it's a database trigger
functions.database.ref('/some/trigger').onWrite(async event => {
  const promises = [];
  admin.database().ref('/some_child').on('child_added', snapshot => {
    const pushRef = admin.database().ref('/some_path').push(snapshot.val()); // some pretend async operation
    promises.push(pushRef);
  });
  return Promise.all(promises);
});
In this example I'm listening to /some/trigger then getting all the children at the path /some_child.
I'll then save each child into a new object under /some_path.
Each promise is pushed into an array and the Promise.all will cause the function to wait for all promises (writes) to resolve.