So I have a problem with JavaScript Promises. I'm using the native implementation for the sake of reducing dependencies.
Here's an illustrative example of what I need.
I need to retrieve lists of books, book authors and purchases.
I also need an author profile for each of the authors. Once I have all of that, I need to assemble a nice set of authors with their books and a purchase list for each of the books.
The lists and the profiles are separate JSON API calls. The only dependency is that I need the list of authors before I can request the author profiles.
I've solved this with Promises.
I use Promise.all to get 3 API JSON requests for: authors, books and purchases.
I use yet another Promise.all to get all the profiles for each of the authors I get (I loop through the list, map urls for each profile and send a batch of requests in parallel).
I run the profile request batch as soon as I get the list of authors, i.e. in the then handler of the author-list promise.
Now, the problem:
To be sure that all promises (the 3 lists and all the profiles) are done before I assemble the library set, I would need to send the profile request batch once all the lists are done, i.e. in the then handler of the first Promise.all.
But the lists of books and purchases take much longer than the list of authors, and I don't want to wait for all of those before sending the batch of profile requests, so I send it in the then handler of the author-list promise, so these start as soon as I have the info.
However, a nested Promise.all does not count towards its parent Promise.all's then handler, so since my final function is in the then of the top-level Promise.all, it may (and sometimes does) fire before the nested Promise.all has finished retrieving all the author profiles.
I need to start all of the requests as soon as possible. Only the batch of per-author requests depends on another promise completing, so only it needs to wait; all the others should start independently.
Pseudo code
Promise.all([
  requestBooks().then(() => {}),
  requestAuthors().then(() => {
    GenerateArrayOfAuthorUris();
    // now send a promisified batch of per-author requests
    Promise.all(
      arrayOfPerAuthorRequests // one request per author URI
    ).then(() => {
      // If I run this here, I may not have the top-level requests done
      runCalculationsPerAuthorForAllAuthorsBooksPurchasesReceived();
    });
  }),
  requestPurchases().then(() => {}),
]).then(() => {
  // this fires when the 3 top-level requests are done, but won't wait
  // for the nested Promise.all with the per-author requests
  runCalculationsPerAuthorForAllAuthorsBooksPurchasesReceived();
});
If I do it this way instead, I waste precious time waiting on requests I don't need, just to start the per-author requests:
Promise.all([
  requestBooks().then(() => {}),
  requestAuthors().then(() => {
    GenerateArrayOfAuthorUris();
  }),
  requestPurchases().then(() => {}),
]).then(() => {
  // now send a promisified batch of per-author requests
  Promise.all(
    arrayOfPerAuthorRequests // one request per author URI
  ).then(() => {
    // all requests are done here, but the per-author batch started
    // later than it had to
    runCalculationsPerAuthorForAllAuthorsBooksPurchasesReceived();
  });
});
Hopefully this clarifies what I need.
Thank you.
This is the code sample: https://jsbin.com/qizohasofa/edit?js,console
As you were told in the comments, you didn't return anything from your functions, so then didn't know what inner promises to wait for.
function getJokeCategories() {
return Promise.all([
// ^^^^^^
pgetJSON("http://api.icndb.com/categories"),
pgetJSON("http://api.icndb.com/jokes/count").then(function(data) {
var jokesToGet = [];
for (var i=0; i<data; i++){
jokesToGet.push("http://api.icndb.com/jokes/"+i);
}
return Promise.all(jokesToGet.map(function(jk) {
// ^^^^^^
return pgetJSON(jk).then(function(joke) {
// ^^^^^^
console.log(jk + " just returned", joke);
return joke;
// ^^^^^^
});
})).then(function(jokes) {
console.log("All jokes returned. This does happen only when all jokes are retrieved.");
return {count:data, jokes:jokes};
// ^^^^^^
});
})
]);
}
getJokeCategories().then(function(result) {
console.log(result, "This does happen at the very end when joke count, joke categories and all jokes are returned.");
});
There is one scenario where I make multiple update calls to the same endpoint with different bodies inside forkJoin(), and the server returns:
"Transaction (Process ID 92) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction."
So the API team is asking us to send the requests one after another. For simplicity, and as recommended by my TL, I'm using forkJoin([]) rather than a loop.
Can we configure forkJoin() to call the array of Observable API endpoints one after another instead of all at once? Please help.
let url = 'https://jsonplaceholder.typicode.com/users/';
if (this.inviteUpdateBatch.length) {
  forkJoin([
    this.http.putAsync(`${url}`, data1),
    this.http.putAsync(`${url}`, data2)
  ]).subscribe(() => {});
}
You can simply use the concat rxjs operator. For example:
import { concat, of } from 'rxjs';

const e = of(1);
const s = of(2);
concat(e, s).subscribe(val => {
  console.log('val', val);
});
This will execute the requests one after the other.
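If you ever need the same one-after-another behavior with plain promises instead of rxjs, it can be sketched like this (runSequentially and the stand-in requests below are illustrative, not part of rxjs):

```javascript
// Run an array of request factories strictly one after another and
// collect the results. Each request starts only after the previous
// one has resolved.
async function runSequentially(factories) {
  const results = [];
  for (const factory of factories) {
    results.push(await factory());
  }
  return results;
}

// Usage with stand-in "requests":
runSequentially([
  () => Promise.resolve(1),
  () => Promise.resolve(2),
]).then((vals) => console.log(vals)); // [1, 2]
```

Passing factories (functions that create the promise) rather than promises themselves is what guarantees sequencing, since a promise starts its work the moment it is created.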
So I need to implement an "expensive" API endpoint. Basically, the user/client needs to be able to create a "group" of existing users.
This "create group" API needs to check that each user fulfills the criteria: all users in the same group need to be from the same region, the same gender, within an age group, etc. This operation can be quite expensive, especially since there is no limit on how many users can be in one group, so it's possible that the client requests a group of 1000 users, for example.
My idea is that the endpoint will just create an entry in the database and mark the "group" as pending while the checking process is still happening; after it's completed, it will update the group status to "completed" or "error" with an error message, and the client will need to periodically fetch the status while it's still pending.
My implementation idea is something along this line
const createGroup = async (req, res) => {
const { ownerUserId, userIds } = req.body;
// This will create database entry of group with "pending" status and return the primary key
const groupId = await insertGroup(ownerUserId, 'pending');
// This is an expensive function which will do checking over the network, and would take 0.5s per user id for example
// I would like this to keep running after this API endpoint send the response to client
checkUser(userIds)
.then((isUserIdsValid) => {
if (isUserIdsValid) {
updateGroup(groupId, 'success');
} else {
updateGroup(groupId, 'error');
}
})
.catch((err) => {
console.error(err);
updateGroup(groupId, 'error');
});
// The client will receive a groupId to check periodically whether its ready via separate API
res.status(200).json({ groupId });
};
My question is, is it a good idea to do this? Am I missing something important that I should consider?
Yes, this is the standard approach to long-running operations. Instead of offering a createGroup API that creates and returns a group, think of it as having an addGroupCreationJob API that creates and returns a job.
Instead of polling (periodically fetching the status to check whether it's still pending), you can use a notification API (events via websocket, SSE, webhooks etc) and even subscribe to the progress of processing. But sure, a check-status API (via GET request on the job identifier) is the lowest common denominator that all kinds of clients will be able to use.
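The check-status loop on the client side can be sketched like this (fetchStatus is a hypothetical function that GETs the job status for a group id; the interval is arbitrary):

```javascript
// Poll the job status until it leaves the "pending" state, then
// return the final status ('success' or 'error').
async function waitForJob(groupId, fetchStatus, intervalMs = 2000) {
  for (;;) {
    const { status } = await fetchStatus(groupId);
    if (status !== 'pending') {
      return status;
    }
    // Wait before asking again so we don't hammer the API.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```

A real client would also want a timeout or maximum attempt count so a lost job doesn't poll forever.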
Did I not consider something important?
Failure handling gets much more complicated. Since you no longer create the group in a single transaction, you might find your application left in an intermediate state, e.g. when the service crashed (due to something unrelated) during the checkUser() call. You'll need something to ensure that there are no pending groups in your database for which no actual creation process is running. You'll need to give users the ability to retry a job: will insertGroup work if there already is a group with the same identifier in the error state? If you separate the groups and the jobs into independent entities, do you need to ensure that no two pending jobs are trying to create the same group? Last but not least, you might want to allow users to cancel a currently running job.
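One way to catch pending groups whose worker died is a reconciliation sweep at startup. A minimal sketch, assuming hypothetical data-access helpers findGroupsByStatus(status) and updateGroup(id, status) in the spirit of the question's code:

```javascript
// Any group stuck in "pending" longer than the cutoff most likely lost
// its worker (e.g. the process crashed mid-check), so fail it.
const STALE_AFTER_MS = 10 * 60 * 1000; // arbitrary: >10 min pending is stale

async function failStalePendingGroups(findGroupsByStatus, updateGroup, now = Date.now()) {
  const pending = await findGroupsByStatus('pending');
  const stale = pending.filter((group) => now - group.createdAt > STALE_AFTER_MS);
  await Promise.all(stale.map((group) => updateGroup(group.id, 'error')));
  return stale.length; // how many groups were cleaned up
}
```

Running this on service start (or periodically) ensures clients polling a dead job eventually see an error instead of an eternal "pending".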
Our Node.js backend application has more than 200K users, and every week we run some controls/calculations based on the users. The operation is similar to this:
const users = await dbService.user.findMany({}); // Array of 200K+ users
for (let index = 0; index < users.length; index++) {
const user = users[index];
const result = await checkUserSubscriptions({ id: user.id });
// do operations according to result, mostly updateUser in database.
}
During the process, most other requests are not executed or have to wait until this operation finishes. For example, when a user tries to log in, they wait until this loop ends or they see a big delay. Because this operation does not need to be instant and can be slow or delayed, I need something that does not block our main event loop. What can I do to ensure this?
What you are trying to do is a background job, and it shouldn't run in the same process as your web service. The memory usage of processing 200K users will impact your web service's performance. Node.js is single-threaded, and it doesn't matter if you try to run 200K async tasks: your web service's performance will be affected.
You can look at worker_threads, which are useful for performing CPU-intensive JavaScript operations. There are other packages that can help you; take a look at bull.
It is bad practice to place an await inside a loop because it makes the code sequential and you lose the opportunity for parallelism: each loop iteration has to wait for the previous one to complete before being triggered.
You should use Promise.all:
const users = await dbService.user.findMany({}); // Array of 200K+ users
const results = await Promise.all(users.map((user) => {
return checkUserSubscriptions({ id: user.id });
}))
// results is an array of returned values from checkUserSubscriptions
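That said, firing 200K calls at once may itself overwhelm the database or a downstream service, so a middle ground is to process in chunks. A sketch (the chunk size is arbitrary; processInChunks is an illustrative helper, not a library function):

```javascript
// Run fn over items in chunks: each chunk runs in parallel via
// Promise.all, while the chunks themselves run one after another.
async function processInChunks(items, chunkSize, fn) {
  const results = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    const chunk = items.slice(i, i + chunkSize);
    results.push(...(await Promise.all(chunk.map(fn))));
  }
  return results;
}

// Usage with a stand-in for checkUserSubscriptions:
processInChunks([1, 2, 3, 4, 5], 2, async (id) => id * 10).then((results) => {
  console.log(results); // [10, 20, 30, 40, 50]
});
```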
I have two API URLs to hit. One known to be fast (~50-100ms). One known to be slow (~1s). I use the results of these to display product choices to the user. Currently I await-download one, then do the second. That's essentially sequential, and because of it the 50-100ms gets added on top of the already-slow second hit.
I would like to:
Send both requests at once
Start processing data as soon as one request comes back
Wait for both requests before moving on from there.
I've seen the example Axios gives...
axios.all([getUserAccount(), getUserPermissions()])
.then(axios.spread(function (acct, perms) {
// Both requests are now complete
}));
But this appears to wait for both URLs to complete. That would still be marginally faster, but I want the data from my 50ms API hit to start showing as soon as it's ready.
For sure you can chain additional .thens to the promises returned by axios:
Promise.all([
getUserAccount()
.then(processAccount),
getUserPermissions()
.then(processPermissions)
]).then(([userAccount, permissions]) => {
//...
});
whereas processAccount and processPermissions are functions that take the axios response object as an argument and return the wanted results.
For sure you can also add multiple .thens to the same promise:
const account = getUserAccount();
const permissions = getUserPermissions();
// Show permissions when ready
permissions.then(processPermissions);
Promise.all([account, permissions])
.then(([account, permissions]) => {
// Do stuff when both are ready
});
I replaced axios.all with Promise.all; I don't know why axios provides that helper, as JS has a native implementation for it. I tried consulting the docs ... but they don't even document that API.
I'm trying to use callbacks to get rid of the synchronous ajax calls in my code but I can't figure out how this could work. I'm using the spotify API to get all the artists in a playlist then perform tasks based on that information. The basic logic of the code is:
Get the user's playlist selections
Populate an array with the artist ids in those playlists
Make more ajax calls based on the array.
Use the array from step 3 to do another task.
The problem is that step 4 will run before steps 2 and 3 if I don't make steps 2 and 3 synchronous. But I can't just call step 3 at the end of step 2, and step 4 at the end of step 3, because both occur inside a while loop. I can't figure out a solution to this.
The calling function
This while loop goes through all a user's selections in a multiple selection box and calls the ajax function to append the data.
var artistArray = [];
var i = 0;
while (artistUrls[i] != null) {
  getArtists(artistArray, artistUrls[i]);
  i++;
}
doSomethingWithArtistArray(artistArray);
doAnotherThingWithArray(artistArray);
The ajax function
Uses ajax calls to get the artist information and append it to an array
function getArtists(artistArray, url) {
  if (url == null) {
    return;
  }
  $.ajax({
    async: false,
    url: url,
    headers: {
      'Authorization': 'Bearer ' + access_token
    },
    error: function() {
      console.log("Something went wrong with " + url);
    },
    success: function(tracks) {
      getArtists_Append(artists, frequencyArray, tracks); // Uses a while loop to append all the artist information to artistArray
    }
  });
  // My idea was to call doSomethingWithArtistArray here but that's not working because there might be more calls to make.
  console.log("finished getting artists");
}
The append function:
function getArtists_Append(artists, frequencyArray, tracks) {
  // while loop that populates the array
}
The problem is that you are treating your Ajax requests as if they were synchronous, when they are asynchronous (and you should do it like that to prevent blocking the browser).
The best approach is to:
In the specific case of fetching multiple artists from Spotify, use the endpoint for getting several artists. This will reduce the amount of requests you need to make to the Spotify's Web API.
If using callback functions, you will make an Ajax request. Then in its callback you will check if you need to make another Ajax request with the next chunk. If you don't need to make any other request because you are done, then call your next function, in this case doSomethingWithArtistArray.
If you are using Promises, then use Promise.all() passing an array of promises, where each promise wraps an Ajax request. This is useful when you already know what requests you need to make, and don't need the response from a request to figure out the next request to be made.
Have a look at the Code Examples section on the Spotify Developer Site to see some open source sites using the Web API.
For instance, you can see how the 2nd alternative is applied in Sort Your Music when getting playlists tracks. The function will make a request to the next chunk if there are more tracks to fetch, otherwise it won't.
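That chunk-by-chunk pattern can be sketched like this (getChunk below is a stub standing in for the real Ajax call; names are illustrative):

```javascript
// Stub standing in for an asynchronous Ajax request that returns a
// couple of fake artists per URL.
function getChunk(url, callback) {
  setTimeout(function() {
    callback([url + '-artist1', url + '-artist2']);
  }, 0);
}

// Fetch one chunk; when its callback fires, recurse into the next one.
// Only when no URLs are left does the final continuation run.
function fetchAllArtists(urls, results, done) {
  if (urls.length === 0) {
    done(results);
    return;
  }
  getChunk(urls[0], function(chunkArtists) {
    fetchAllArtists(urls.slice(1), results.concat(chunkArtists), done);
  });
}

// Usage: doSomethingWithArtistArray-style work goes in the continuation.
fetchAllArtists(['u1', 'u2'], [], function(artists) {
  console.log(artists.length); // 4
});
```

The recursion replaces the while loop: no request is "in flight" while another is being issued, and the continuation runs exactly once, after the last chunk.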
For the 3rd alternative, since you are using jQuery you could use $.when to work with promises. Check out this example. If you like the idea of promises and plan to make other requests to the Web API, I would recommend using a wrapper like Spotify Web API JS (shameless self-promotion). With that you could simply do:
var api = new SpotifyWebApi();
var promises = [];
promises.push(api.getArtists(['id1', 'id2', 'id3', 'id4', 'id5']));
promises.push(api.getArtists(['id10', 'id11', 'id12', 'id13', 'id14']));
Promise.all(promises).then(function(data) {
// data contains the result of the promises (ajax requests)
// do something with it
});