I have an apollo-server setup in my backend. I want to make a POST request to an API inside one of my resolvers and then return that API's response to my client. The problem I'm having is that the return statement runs before the API response comes back. Here is my code sample below.
module.exports = {
Mutation: {
checkFace: async () => {
console.log("Checking.....");
let confidence;
var parameters = {
image_url1: "link to image 1",
image_url2: "link to image 2",
};
facepp.post("/compare", parameters, function (err, res) {
if (!err) {
confidence = res.confidence;
} else {
console.log(err);
}
});
return confidence
},
},
};
That's because facepp.post runs asynchronously. Your checkFace resolver is correctly declared async, but the callback you pass to facepp.post can't make the resolver wait; checkFace returns confidence before the callback has ever run. You need to await the response before returning it, which for a callback-style API means wrapping the call in a Promise.
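A minimal sketch of how that can look, assuming facepp.post follows the usual Node-style (err, res) callback signature shown in the question; wrap the callback in a Promise and await it inside the resolver:
module.exports = {
  Mutation: {
    checkFace: async () => {
      console.log("Checking.....");
      const parameters = {
        image_url1: "link to image 1",
        image_url2: "link to image 2",
      };
      // Wrap the callback-based client in a Promise so the resolver can await the result.
      const confidence = await new Promise((resolve, reject) => {
        facepp.post("/compare", parameters, (err, res) => {
          if (err) return reject(err);
          resolve(res.confidence);
        });
      });
      return confidence;
    },
  },
};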
Also, when you use the function, make sure you wait for it to finish, so call it with
let someResponse = await Mutation.checkFace();
console.log(someResponse);
or
Mutation.checkFace().then(response => {
console.log(response);
});
Whichever you prefer depending on your situation.
https://nodejs.dev/learn/modern-asynchronous-javascript-with-async-and-await
Related
I am using LoopBack on the server side of my application. To fetch and validate data from the database I'm using the findOne method, which takes a callback function. I want the callback to run as soon as findOne finishes. The code I have written works, but I want to avoid using async/await. Is there any alternative?
What I tried
const fs = require("fs"); // used later to serve error.html
async function validId(req) {
const filter = {
where: {
ID: req.id,
}
};
//
const result = await model.findOne(filter);
if (result) {
return true;
} else {
return false;
}
}
module.exports = function () {
return async function validateTenant(req, res, next) {
var id = false;
if (req.url.includes("XYZ")) {
id = await validId(req)
}
//
if (id || !req.url.includes("XYZ")) {
next();
} else {
res.writeHead(404, { "Content-Type": "text/html" });
var html = fs.readFileSync(
"error.html"
);
res.end(html);
}
};
};
You could use the .then() function of the promise:
model.findOne(filter).then((result)=>{
//execute the rest of the function that needs to run after the findOne.
});
// The code will continue to execute while model.findOne is doing its thing.
But if you want to wait for findOne to give a result without using await or .then, that is not possible unless you write a wrapper around findOne or your database package offers a synchronous findOne.
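For example, here is a rough sketch of the middleware from the question rewritten with .then so that no async/await is needed. It assumes model.findOne returns a promise when called without a callback (LoopBack supports this) and reuses the same model, fs, and route check as the question:
module.exports = function () {
  return function validateTenant(req, res, next) {
    // Routes that don't need validation continue immediately.
    if (!req.url.includes("XYZ")) {
      return next();
    }
    const filter = { where: { ID: req.id } };
    const sendError = () => {
      res.writeHead(404, { "Content-Type": "text/html" });
      res.end(fs.readFileSync("error.html"));
    };
    model
      .findOne(filter)
      .then((result) => (result ? next() : sendError()))
      .catch(sendError);
  };
};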
I am calling one API. In the .then section of that call, I am calling another API; the output of the first API is passed to the second one.
await axios
.post(process.env + '/certificates/upload', {
"name": "Shruti"
})
.then((response: any) => {
filenames = JSON.stringify(response.data.office);
axios // Not able to write async here
.patch(process.env + "/certificates/", {
file: filenames
})
.then(function(response: any) {
alert(' Record updated successfully');
})
.catch((err: any) => {
alert('Error in updating the record');
});
});
I am not able to use await in the second API call. Where should I put async so I can use await there? The first call works properly. Also, is there a better way to make consecutive API calls and pass the output of the first call to the second?
Find the function that contains the statement you wish to await. Add async to the beginning of that function.
More generally, avoid mixing async/await with chained .then/.catch.
I think this is what you want:
try {
let response1 = await axios.post(
process.env + '/certificates/upload',
{ name: "Shruti" }
)
let filenames = JSON.stringify(response1.data.office)
await axios.patch(
process.env + "/certificates/",
{ file: filenames }
)
alert(`Update succeeded`)
} catch( error ) {
alert(`Update failed`)
}
I'm trying to limit the number of requests I send to an API.
I'm using Limiter and it's working just like I need. The only issue is that I can't find a way to use it with await (I need all the responses before rendering my page).
Can someone give me a hand with it?
By the way, the logged response is a boolean.
const RateLimiter = require('limiter').RateLimiter;
const limiter = new RateLimiter(50, 5000)
for (let i = 0; i < arrayOfOrders.length; i++) {
const response = limiter.removeTokens(1, async (err, remainingRequests) => {
console.log('request')
return await CoreServices.load('updateOrder', {
"OrderNumber": arrayOfOrders[i],
"WorkFlowID": status
})
})
console.log('response', response)
}
console.log('needs to log after all the request');
This is what gets logged:
response true
response true
response false
needs to log after all the request
request
request
request
...
Promisifying .removeTokens will help; see if this code works:
const RateLimiter = require('limiter').RateLimiter;
const limiter = new RateLimiter(50, 5000);
const tokenPromise = n => new Promise((resolve, reject) => {
limiter.removeTokens(n, (err, remainingRequests) => {
if (err) {
reject(err);
} else {
resolve(remainingRequests);
}
});
});
(async() => { // this line required only if this code is top level, otherwise use in an `async function`
const results = await Promise.all(arrayOfOrders.map(async (order) => {
await tokenPromise(1);
console.log('request');
return CoreServices.load('updateOrder', {
"OrderNumber": order,
"WorkFlowID": status
});
}));
console.log('needs to log after all the request');
})(); // this line required only if this code is top level, otherwise use in an `async function`
explanation
Firstly:
const tokenPromise = n => new Promise((resolve, reject) => {
limiter.removeTokens(n, (err, remainingRequests) => {
if (err) {
reject(err);
} else {
resolve(remainingRequests);
}
});
});
promisifies limiter.removeTokens so it can be used with async/await. In Node.js you could use the built-in util.promisify, though lately I've had too many instances where that fails, so a manual promisification works just as well.
Now the code is easy: you can use arrayOfOrders.map rather than a for loop to create an array of promises that run in parallel as much as the rate limiting allows (the rate limiting happens inside the map callback).
await Promise.all(...) will wait until all the CoreServices.load calls have completed (or one has failed; you could use await Promise.allSettled(...) instead if you want).
The code in the map callback is tagged async so:
await tokenPromise(1);
will wait until the removeTokens callback has been called, and only then is the request
return CoreServices.load
made.
Note: this was originally return await CoreServices.load, but the await is redundant; return await somePromise in an async function behaves the same as return somePromise, so adjust your code too.
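As a small standalone illustration of that equivalence:
// Both async functions resolve to the same value; the extra await changes nothing here.
async function withAwait() {
  return await Promise.resolve(42);
}

async function withoutAwait() {
  return Promise.resolve(42);
}

withAwait().then(console.log);    // 42
withoutAwait().then(console.log); // 42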
I am making calls to the Bitbucket API to get all the files in a repo. I have reached a point where I can get the list of all the folders in the repo, make the first API call to all the root folders in parallel, and get the list of the first 1000 files for each folder.
But the problem is that the Bitbucket API can give me only 1000 files per folder at a time.
I need to append a query param &start=nextPageStart and make the call again, until it is null and isLastPage is true for each API call. How can I achieve that with the code below?
I get nextPageStart from the first call to the API. See the API response below.
Below is the code that I have so far.
Any help or guidance is appreciated.
Response from the individual API call made per folder:
{
"values": [
"/src/js/abc.js",
"/src/js/efg.js",
"/src/js/ffg.js",
...
],
"size": 1000,
"isLastPage": false,
"start": 0,
"limit": 1000,
"nextPageStart": 1000
}
The function where I make asynchronous calls to get the list of files:
export function getFilesList() {
const foldersURL: any[] = [];
getFoldersFromRepo().then((response) => {
const values = response.values;
values.forEach((value: any) => {
//creating API URL for each folder in the repo
const URL = 'https://bitbucket.abc.com/stash/rest/api/latest/projects/'
+ value.project.key + '/repos/' + value.slug + '/files?limit=1000';
foldersURL.push(URL);
});
return foldersURL;
}).then((res) => {
// console.log('Calling all the URLS in parallel');
async.map(res, (link, callback) => {
const options = {
url: link,
auth: {
password: 'password',
username: 'username',
},
};
request(options, (error, response, body) => {
// TODO: How do I make the get call again so that i can paginate and append the response to the body till the last page.
callback(error, body);
});
}, (err, results) => {
console.log('In err, results function');
if (err) {
return console.log(err);
}
//Consolidated results after all API calls.
console.log('results', results);
});
})
.catch((error) => error);
}
I was able to get it working by creating a function with a callback.
export function getFilesList() {
const foldersURL: any[] = [];
getFoldersFromRepo().then((response) => {
const values = response.values;
values.forEach((value: any) => {
//creating API URL for each folder in the repo
const URL = 'https://bitbucket.abc.com/stash/rest/api/latest/projects/'
+ value.project.key + '/repos/' + value.slug + '/files?limit=1000';
foldersURL.push(URL);
});
return foldersURL;
}).then((res) => {
// console.log('Calling all the URLS in parallel');
async.map(res, (link, callback) => {
const options = {
url: link,
auth: {
password: 'password',
username: 'username',
},
};
const myarray = [];
// This function will consolidate response till the last Page per API.
consolidatePaginatedResponse(options, link, myarray, callback);
}, (err, results) => {
console.log('In err, results function');
if (err) {
return console.log(err);
}
//Consolidated results after all API calls.
console.log('results', results);
});
})
.catch((error) => error);
}
function consolidatePaginatedResponse(options, link, myarray, callback) {
request(options, (error, response, body) => {
const content = JSON.parse(body);
content.link = options.url;
myarray.push(content);
if (content.isLastPage === false) {
options.url = link + '&start=' + content.nextPageStart;
consolidatePaginatedResponse(options, link, myarray, callback);
} else {
// Final response after consolidation per API
callback(error, JSON.stringify(myarray));
}
});
}
I think the best way is to wrap it in an old-school for loop (forEach doesn't work with async, since it's synchronous and it would cause all the requests to be spawned at the same time).
What I understood is that you do some sort of bootstrapping query where you get the values array, and then you should iterate over the pages. Here is some code; I didn't fully grasp the APIs, so I'll give a simplified (and hopefully readable) answer that you should be able to adapt:
export async function getFilesList() {
logger.info(`Fetching all the available values ...`);
await getFoldersFromRepo().then( async values => {
logger.info("... Folders values fetched.");
for (let i = 0; ; i++ ) {
logger.info( `Working on page ${i}`);
try {
// if you are using TypeScript, the result is not the promise but the succeeded value already
const pageResult: PageResult = await yourPagePromise(i);
if (pageResult.isLastPage) {
break;
}
} catch(err) {
console.error(`Error on page ${i}`, err);
break;
}
}
logger.info("Done.");
});
logger.info(`All finished!`);
}
The logic behind it is that getFoldersFromRepo() first returns a promise which yields the values, and then I sequentially iterate over all available pages through the yourPagePromise function (which also returns a promise). The async/await construct allows you to write more readable code rather than a waterfall of then() calls.
I'm not sure it respects your API's specs, but it's the logic you can use as a foundation! ^^
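For illustration, a hypothetical yourPagePromise could wrap the same request call used in the question; folderUrl and options here are placeholders standing in for the per-folder URL and auth options built in the question's code:
// Hypothetical helper for the loop above: page i starts at offset i * 1000,
// since each page holds up to 1000 files.
function yourPagePromise(i) {
  return new Promise((resolve, reject) => {
    const url = folderUrl + '&start=' + i * 1000;
    request({ ...options, url }, (error, response, body) => {
      if (error) {
        return reject(error);
      }
      // The parsed body carries values, isLastPage and nextPageStart,
      // as in the sample response shown in the question.
      resolve(JSON.parse(body));
    });
  });
}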
I have a Meteor method that wraps around an http.get. I am trying to return the results from that http.get into the method's return so that I can use the results when I call the method.
I can't make it work though.
Here's my code:
(In shared folder)
Meteor.methods({
getWeather: function(zip) {
console.log('getting weather');
var credentials = {
client_id: "string",
client_secret: "otherstring"
}
var zipcode = zip;
var weatherUrl = "http://api.aerisapi.com/places/postalcodes/" + zipcode + "?client_id=" + credentials.client_id + "&client_secret=" + credentials.client_secret;
weather = Meteor.http.get(weatherUrl, function (error, result) {
if(error) {
console.log('http get FAILED!');
}
else {
console.log('http get SUCCES');
if (result.statusCode === 200) {
console.log('Status code = 200!');
console.log(result.content);
return result.content;
}
}
});
return weather;
}
});
For some reason, this does not return the results even though they exist and the http call works: console.log(result.content); does indeed log the results.
(Client folder)
Meteor.call('getWeather', somezipcode, function(error, results) {
if (error)
return alert(error.reason);
Session.set('weatherResults', results);
});
Of course here, the session variable ends up being empty.
(Note that this part of the code seems to be fine as it returned appropriately if I hard coded the return with some dummy string in the method.)
Help?
In your example Meteor.http.get is executed asynchronously.
See docs:
HTTP.call(method, url [, options] [, asyncCallback])
On the server, this function can be run either synchronously or
asynchronously. If the callback is omitted, it runs synchronously and
the results are returned once the request completes successfully. If
the request was not successful, an error is thrown
Switch to synchronous mode by removing asyncCallback:
try {
var result = HTTP.get( weatherUrl );
var weather = result.content;
} catch(e) {
console.log( "Cannot get weather data...", e );
}
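Put into the context of the method from the question, that could look roughly like this (same URL construction as the question; the error handling is only a sketch):
Meteor.methods({
  getWeather: function (zip) {
    var credentials = {
      client_id: "string",
      client_secret: "otherstring"
    };
    var weatherUrl = "http://api.aerisapi.com/places/postalcodes/" + zip +
      "?client_id=" + credentials.client_id +
      "&client_secret=" + credentials.client_secret;
    try {
      // No callback passed, so on the server this call blocks until the response arrives.
      var result = HTTP.get(weatherUrl);
      return result.content;
    } catch (e) {
      console.log("Cannot get weather data...", e);
      return null;
    }
  }
});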
Kuba Wyrobek is correct, but you can also still call HTTP.get asynchronously and use a future to stop the method from returning until the get has responded:
var Future = Npm.require('fibers/future');
Meteor.methods({
getWeather: function(zip) {
console.log('getting weather');
var weather = new Future();
var credentials = {
client_id: "string",
client_secret: "otherstring"
}
var zipcode = zip;
var weatherUrl = "http://api.aerisapi.com/places/postalcodes/" + zipcode + "?client_id=" + credentials.client_id + "&client_secret=" + credentials.client_secret;
HTTP.get(weatherUrl, function (error, result) {
if(error) {
console.log('http get FAILED!');
weather.throw(error);
}
else {
console.log('http get SUCCES');
if (result.statusCode === 200) {
console.log('Status code = 200!');
console.log(result.content);
weather.return(result);
}
}
});
return weather.wait(); // blocks until weather.return()/weather.throw() is called, then returns the value
}
});
There's not much advantage to this approach over a synchronous get in this case. But if you're ever doing something on the server that benefits from an HTTP call running asynchronously (and thus not blocking the rest of the code in your method), while you still need to wait for that call to return before the method can, then this is the right solution. One example is when you need to execute multiple independent gets, which would otherwise have to wait for each other one by one if executed synchronously.
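A rough sketch of that parallel case, using the same fibers/future pattern as above (url1 and url2 are just placeholders):
var Future = Npm.require('fibers/future');

// Start both HTTP calls immediately; neither waits for the other.
var first = new Future();
var second = new Future();

HTTP.get(url1, function (error, result) {
  error ? first.throw(error) : first.return(result);
});
HTTP.get(url2, function (error, result) {
  error ? second.throw(error) : second.return(result);
});

// Only now does the fiber block, until both responses have arrived.
var result1 = first.wait();
var result2 = second.wait();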
Sometimes asynchronous calls are preferable. You can use async/await syntax for that, and you need to promisify HTTP.get.
import { Meteor } from 'meteor/meteor';
import { HTTP } from 'meteor/http';
const httpGetAsync = (url, options) =>
new Promise((resolve, reject) => {
HTTP.get(url, options, (err, result) => {
if (err) {
reject(err);
} else {
resolve(result);
}
});
});
Meteor.methods({
async 'test'({ url, options }) {
try {
const response = await httpGetAsync(url, options);
return response;
} catch (ex) {
throw new Meteor.Error('some-error', 'An error has happened');
}
},
});
Notice that the Meteor test method is marked as async. This allows using the await operator inside it with calls that return a Promise. The lines following an await won't be executed until the returned promise is resolved. In case the promise is rejected, the catch block will be executed.
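For completeness, a sketch of how the client might call this method; the url and options values here are just placeholders:
// Client-side call: Meteor delivers either the resolved value or a Meteor.Error.
Meteor.call('test', { url: 'https://example.com/api', options: {} }, (error, response) => {
  if (error) {
    console.log('Method failed:', error.reason);
  } else {
    console.log('Status code:', response.statusCode);
  }
});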