I am attempting to make sequential POST requests with Angular, changing only the request body each time. The reason is that I have a REST API I am calling to create users, but it takes some time to return. I basically want to send the requests in batches, calling the same endpoint with a different request body each time. I have seen other questions about calling functions sequentially, but they always seem to involve a set number of functions that do different things. I just can't wrap my brain around the obvious recursion here.
So far I have this function that returns the promise, but I don't understand how to write the recursion that calls it to go through all of $scope.csvResults.
$scope.importUsersPromise = function(currentIndex, step) {
    var nextInput = $scope.csvResults.slice(currentIndex, currentIndex + step);
    var requestBodyUsers = {
        "mode": "SCRIPT",
        "inputParams": [
            JSON.stringify(nextInput)
        ]
    };
    return $http({
        method: 'POST',
        url: api_url + "v1/serverAction/",
        headers: {
            "Authorization": "user",
            "Content-Type": "application/json"
        },
        data: requestBodyUsers // $http expects the request body under the `data` key
    });
};
Say you have an array users with all the different request bodies. Then you do something like this:
var users = [/* your request bodies */];
var errors = [/* will store the errors */];
// make the first api call
var promise = apiCall(users[0]);
// use .reduce to chain all requests
promise = users.slice(1).reduce(function(promise, user){
    return promise.then(apiCall.bind(null, user));
}, promise);
promise
    .then(function(){
        // do something when all users are inserted
    })
    .finally(function(){
        // do something when all requests are done
        // even if some of them have failed
        console.log(errors);
    });

function apiCall(user) {
    return $http({ /* ... */ });
}
You have to keep in mind that if one of the requests fails, the chain is broken and the following requests will not be sent. If you want to send them anyway, you should use .finally and optionally .catch to collect the errors:
// use .reduce to chain all requests
promise = users.slice(1).reduce(function(promise, user){
    return promise
        .catch(err => errors.push(err))      // fail handler (optional)
        .finally(apiCall.bind(null, user));  // always make the next api call
}, promise);
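The recursion the question asks about works the same way: the base case is running off the end of the array, and the recursive step chains the next call inside .then. A minimal self-contained sketch with plain promises (apiCall is simulated here; in the real code it would be the $http request above):

```javascript
// Simulated apiCall; in the real code this would be the $http request.
function apiCall(user) {
  return Promise.resolve('created ' + user);
}

// Process `users` one at a time by recursion: each call waits for the
// previous request to finish before issuing the next.
function importSequentially(users, index, results) {
  results = results || [];
  if (index >= users.length) {
    return Promise.resolve(results); // base case: everything sent
  }
  return apiCall(users[index]).then(function (res) {
    results.push(res);
    return importSequentially(users, index + 1, results); // recurse on the next user
  });
}

importSequentially(['alice', 'bob'], 0).then(function (results) {
  console.log(results); // [ 'created alice', 'created bob' ]
});
```

The reduce version and this recursive version build the same promise chain; reduce just avoids the explicit index bookkeeping.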
It's a good idea to check Angular's documentation if you haven't already ;)
You can put the whole request set in a requestArray.
Please check out this link.
I'm trying to understand using promises with Google Cloud Functions a bit better. I just learned about the finally method on promises, which is called after all promises in the chain are fully resolved or rejected. In an HTTP function, is it good practice to put response.send() inside of the finally method?
The code below uses request-promise-native for the HTTP request. In the first .then() I call parseSchedule, which uses the cheerio web scraping API to loop through some data on a website and add it to the scheduledGames array (synchronously, I think).
I return from that and then log that data to the console in writeDB, but one thing I noticed is that I see response.send() log 'execution finished' before I see the data from scheduledGames in the log. Is that correct?
Should I be using the finally block like this?
Thanks,
const options = {
    uri: 'https://www.cbssports.com/nba/schedule/' + urlDate,
    Connection: 'keep-alive',
    transform: function (body) {
        return cheerio.load(body);
    }
};

return request(options)
    .then(parseSchedule)
    .then(writeSchedule)
    .catch((err) => console.log("there was an error: " + err))
    .finally(res.send("execution finished"));

function parseSchedule($) {
    const scheduledGames = [];
    $('tbody').children('tr').each((i, element) => {
        const gameTime = $(element).children('td').eq(2).find('a').text();
        const scheduledGame = { gameTime: gameTime };
        scheduledGames.push(scheduledGame);
    });
    return scheduledGames;
}

function writeDB(scheduledGames) {
    console.log(scheduledGames);
}
}
It typically makes more sense to send a success response at the point in the promise chain where everything has succeeded, and to send an error response in a catch handler. If you do those two things, it doesn't make sense to use finally at all, since success and error are the only two cases you really need to handle. Unless you have some special case, stick to just success and error. (Note also that .finally(res.send("execution finished")) invokes res.send immediately while the chain is being built, which is why you see "execution finished" logged first; finally expects a function argument.)
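A sketch of that advice with hypothetical names: doWork stands in for the request(options).then(parseSchedule) chain, and res is a minimal fake response object so the example is self-contained.

```javascript
// `doWork` is a stand-in for the real request/parse chain, and `res` is a
// minimal fake response object, so this sketch runs on its own.
function doWork(input) {
  return input
    ? Promise.resolve('parsed ' + input)
    : Promise.reject(new Error('no input'));
}

function handle(input, res) {
  return doWork(input)
    .then(function (result) { res.send(result); })                 // success: send here
    .catch(function (err) { res.send('error: ' + err.message); }); // error: send here
}

// Demonstration with the fake response object:
var sent = [];
var res = { send: function (msg) { sent.push(msg); } };
handle('schedule', res).then(function () {
  console.log(sent); // [ 'parsed schedule' ]
});
```

Exactly one send happens per request, and only after the work has actually settled, which is the ordering the question was missing.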
I have a dataGrid from Syncfusion with a column of checkboxes. When I press a button, the code reads all of the selected rows, creates an array, and loops until the process has finished.
this.selectedRecords = this.$refs.grid.ej2Instances.getSelectedRecords();
this.selectedRecords.forEach(function(arg, index) {
    // call HTTP API request with axios
    // get the return values and store them in the database
});
I could have 100+ rows selected and I need to be able to tell when all of the API calls are finished.
I have slowed down my calls so I only have a maximum of 10 calls per second using
axios.interceptors.request.use(
    function(config) {
        setTimeout(console.log("here request interceptor"), 100);
        return config;
    },
    function(error) {
        return Promise.reject(error);
    }
);
And I have tried
if (self.selectedRecords.length - 1 === index) {
    alert("Done");
}
but since there is not a guarantee that the rows are processed in order it can call "Done" too early.
I hope I've given you enough code to understand my problem without giving you too much to make it sloppy.
If I've understood correctly then you should just need to gather up the promises in an array and then use Promise.all to wait for them to all complete:
var requests = this.selectedRecords.map(function(arg, index) {
    return axios.get(/* request details here */);
});

Promise.all(requests).then(function() {
    console.log('Done');
});
If you need to process the individual requests using a then that's fine, just chain it on the end of the axios.get call:
return axios.get(/* request details here */)
    .then(function(response) {
        // Handle response
    });
Update:
Request interceptors can return promises, which will be necessary if you want to hold up the execution of the request:
http.interceptors.request.use(function (config) {
    return new Promise(function (resolve) {
        setTimeout(function () {
            resolve(config);
        }, 5000);
    });
});
Note that the example above is not performing proper throttling, it's merely delaying the request. It is purely to illustrate how promises can be used with interceptors. You haven't included the real interceptor in the question so I can't be more specific.
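One way to get actual spacing between requests is to serialize the delays by chaining every request onto a shared promise. This is a sketch, not part of the axios API: makeSpacer is a hypothetical helper, and the demonstration runs without axios at all.

```javascript
// Hypothetical helper (not part of axios): serializes delays so each
// request is released at least `interval` ms after the previous one.
function makeSpacer(interval) {
  var last = Promise.resolve();
  return function space(config) {
    var ready = last.then(function () { return config; });
    // the next caller waits for this release plus the interval
    last = ready.then(function () {
      return new Promise(function (resolve) { setTimeout(resolve, interval); });
    });
    return ready;
  };
}

// With axios you would register it as a request interceptor
// (assumed instance `http`):
// http.interceptors.request.use(makeSpacer(100));

// Demonstration without axios: configs come back in order, spaced apart.
var space = makeSpacer(50);
Promise.all([space({ id: 1 }), space({ id: 2 }), space({ id: 3 })])
  .then(function (configs) {
    console.log(configs.map(function (c) { return c.id; })); // [ 1, 2, 3 ]
  });
```

This spaces requests out rather than capping concurrency; for a true "N in flight at once" limit you would need a small queue instead.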
I'm trying to wrap my head around done() so I can test an asynchronous web service call. The documentation from Jasmine and others doesn't make sense to me: I can't see where the code goes, where the test for completion goes, or where the actual Jasmine tests go. They don't even use any asynchronous calls.
This question previously asked about runs() and waitsFor(), they've been deprecated in 2.0, apparently because "It's much simpler." using done()! In any case, on my version 2.6, using done() anywhere in the code brings up a reference error that it isn't defined, so I'm really not sure what's going on.
This is the code I'd like to be able to adapt. The call is request.send(url, data, params); and the code is done when response !== null.
it('should be able to invoke a web service with POST as the HTTP method', function() {
    // Data
    let data = {name: 'StackOverflow'};
    // POST
    let params = {method: 'POST'};
    // This call is asynchronous
    // The URL just echoes back the get and post parameters
    request.send(url, data, params);
    // Need to wait for response !== null
    // ...
    // Can now resume tests
    expect(response.post.name).toEqual('StackOverflow');
});
If anyone can help with how to reorganize this to work with done(), it would be much appreciated.
You need to move your function that handles the response into the it body. Things get reorganized a bit:
// *** Pass "done" in the function, Jasmine automagically knows what to do with it
it('should be able to invoke a web service with POST as the HTTP method', function(done) {
    // *** Local callback to use here
    let localCallback = function(/* arguments */) {
        // *** Local copy of response
        let response = null;
        // *** Copy your stuff into response
        ...
        // *** Moved your tests here
        // Can now resume tests
        expect(response.post.name).toEqual('StackOverflow');
        // *** Let Jasmine know this "it" is finished
        done();
    };
    // *** Do your setup here to use the local callback above
    let request = ...(localCallback);
    // Data
    let data = {name: 'StackOverflow'};
    // POST
    let params = {method: 'POST'};
    // This call is asynchronous
    // The URL just echoes back the get and post parameters
    request.send(url, data, params);
});
If your request is promise based you would do something like this:
it('should be able to invoke a web service with POST as the HTTP method',
    function(done) {
        // Data
        let data = {name: 'StackOverflow'};
        // POST
        let params = {method: 'POST'};
        // This call is asynchronous
        // The URL just echoes back the get and post parameters
        request.send(url, data, params).then(
            function(response) {
                expect(response.post.name).toEqual('StackOverflow');
                done();
            }
        ).catch(
            function(err) {
                done(new Error(err));
            }
        );
    }
);
This will only run the expect when the data is returned.
done is called with no params to indicate that your code is finished.
done is called with an Error object to indicate that something failed. The message of the Error object should tell what failed.
I've created a data service that gets data sets from an API, but I'd like to have it first cache it locally and check if the same data is already available (nevermind the stale data factor... I'll deal with that next). Here's my code:
getData(url, use_cache = true) {
    // Http Fetch Client to retrieve data (GET)
    let cache_index = this.cache.findIndex(r => { return r.url === url; });
    if ((use_cache) && (cache_index > -1) && (this.cache[cache_index].data.length)) {
        // Use cached data (available)
        console.log("Found cached data!", this.cache[cache_index].data);
        //
        // I think this next line is the problem... need to return a promise???
        //
        return this.cache[cache_index].data;
    } else {
        console.log("Retrieving records from " + url);
        return this.httpClient.fetch(url, {
            credentials: 'include'
        }).then(response => {
            // Old statement was simple...
            // return response.json();
            // New method seems to be working because it's saving the data into the cache
            return response.json().then(result => {
                this.cache.push({'url': url, 'data': result});
                // Not sure why I need this next line, but I do.
                return result;
            });
        });
    }
}
It works fine to retrieve the data the first time, and even on the second call I can see (from the console log) that it finds the correct cached data, but I'm getting an error that I believe is related to promises, which are not in my area of expertise yet.
Error message:
ERROR [app-router] TypeError: this.core.getData(...).then is not a function
This error is actually in my viewmodel's caller, which looks like this:
getAccounts() {
    this.core.getData('/accounting/account/all').then(response => {
        this.accounts = response;
    });
}
I guess since when the data is cached, instead of returning a promise it's actually returning the data, and there's no .then method on the raw data.
I suspect I need to either create a fake promise (even though it's not an async transaction) to return when the data is cached or improve the way I'm calling this method from my data service (or returning the data).
Any ideas on how to fix this current problem? Any free advice on this whole topic as it relates to Aurelia?
I guess since when the data is cached, instead of returning a promise it's actually returning the data, and there's no .then method on the raw data.
Yes.
I suspect I need to either create a fake promise (even though it's not an async transaction) to return when the data is cached
Possible (using Promise.resolve), but no.
…or improve the way I'm calling this method from my data service (or returning the data).
No, for sure you shouldn't need that.
Instead, there's a much simpler solution: cache the promise object itself, and return the same promise from every call for that url!
getData(url, use_cache = true) {
    // Http Fetch Client to retrieve data (GET)
    if (use_cache && url in this.cache)
        return this.cache[url];
    else
        return this.cache[url] = this.httpClient.fetch(url, {
            credentials: 'include'
        }).then(response => response.json());
}
This has the additional benefit that you'll never have two parallel requests for the same resource - the request itself is cached, not only the arrived result. The only drawback is that you also cache errors, if you want to avoid that and retry on subsequent calls then you have to drop the cache on rejections.
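A sketch of that drop-on-rejection variant, with hypothetical names: fetchJson stands in for this.httpClient.fetch(url, ...).then(r => r.json()), and the flaky fetcher simulates a request that fails once and then succeeds.

```javascript
// Hypothetical wrapper: same promise-caching idea, but evict the cache
// entry when the request rejects so a later call can retry.
function makeCachedGetter(fetchJson) {
  var cache = {};
  return function getData(url) {
    if (url in cache) return cache[url];
    return cache[url] = fetchJson(url).catch(function (err) {
      delete cache[url]; // drop the failed entry so the next call retries
      throw err;         // re-throw so callers still see the failure
    });
  };
}

// Demonstration: the first request fails and is evicted, the retry succeeds.
var calls = 0;
function flaky(url) {
  calls += 1;
  return calls === 1
    ? Promise.reject(new Error('boom'))
    : Promise.resolve({ url: url, data: [1, 2, 3] });
}

var getData = makeCachedGetter(flaky);
getData('/accounting/account/all')
  .catch(function () { return getData('/accounting/account/all'); })
  .then(function (result) {
    console.log(result.data, calls); // [ 1, 2, 3 ] 2
  });
```

Because the eviction runs before the rejection reaches the caller, a retry in the caller's .catch already sees an empty cache slot.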
I have a method called getFormattedURI(uri) which takes in a URI and parses it for contents like the name of the website and its queries. This method sends a request to the URI and does things with the response. However, I want all the code after getFormattedURI(uri) to wait until that method has completed, because I use its return value in the following code. It is something like this:
function getFormattedURI(uri) {
    request.get(uri).end((err, res) => {
        // (using superagent request) do stuff with res and return parsed uri
    });
}
...
let x = getFormattedURI('www.google.com');
//do stuff with x like index it into Elasticsearch
I want my code to wait until I get a response from the request, and not go past the function call immediately. How do I achieve this?
It's considered very bad practice to block (synchronously) the JS thread, and your case is very easy and straightforward to handle with callbacks.
And if you want to get fancy, you can use promises (provided the JS engine supports promises, or else use a third-party library).
Sample using promises:
function getFormattedURI(uri) {
    return new Promise(function(resolve, reject) {
        request.get(uri).end((err, res) => {
            // (using superagent request) do stuff with res and return parsed uri
            if (err) return reject(err);
            resolve(res);
        });
    });
}

getFormattedURI('www.google.com').then(function(res) {
    // do stuff with res, e.g. index it into Elasticsearch
});
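If the environment supports it (Node 8+), the same promise can also be consumed with async/await, which reads closest to the "wait here" style the question asks for. A self-contained sketch with a simulated getFormattedURI standing in for the superagent-based one:

```javascript
// Simulated stand-in for the real superagent-based function above.
function getFormattedURI(uri) {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve('parsed:' + uri); }, 10);
  });
}

async function main() {
  // Execution pauses here until the promise settles,
  // without blocking the JS thread.
  let x = await getFormattedURI('www.google.com');
  console.log(x); // parsed:www.google.com
  return x;
}

main();
```

The await only suspends the async function itself; other work on the event loop keeps running, which is what makes this acceptable where a synchronous wait is not.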