I want to send back two S3 pre-signed URLs, one for each key in the [user.idKey, user.selfieKey] array, from my Express route.
I know that S3 is successfully generating the pre-signed URLs, because they log to the console when I console.log(url) in the callback.
I also tried using "await" in front of the getSignedUrl method, but I don't think that works for S3?
Any idea what I am doing wrong?
Thank you!
router.get(`/api/verification/load`, auth, async (req, res) => {
try {
const user = await User.findOne({ GETS A USER })
let urlArray = []
const keyArray = [user.idKey, user.selfieKey]
for (const key in keyArray) {
s3VerificationBucket.getSignedUrl(
"getObject",
{
Bucket: "app-verification",
Key: key,
Expires: 30,
},
(err, url) => urlArray.push(url) // when console.log(url) it logs urls
)
}
if (urlArray.length === 0) {
console.log("URL ARRAY EMPTY") -> RETURNS "URL ARRAY EMPTY"
}
const idUrl = urlArray[0]
const selfieUrl = urlArray[1]
res.send({ user, idUrl, selfieUrl })
} catch (err) {
res.status(500).send()
}
})
You are sending your response synchronously. You are signing your links asynchronously. So the array is populated AFTER you have already responded.
Instead, use s3VerificationBucket.getSignedUrlPromise(...) to get a promise for each operation (getSignedUrl itself takes a callback rather than returning a request you can call .promise() on).
Use Promise.all(...) to wait on the result of all operations before responding.
Only after you have processed all async work should you res.send(...).
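Applied to your route, a minimal sketch could look like the following. It assumes s3VerificationBucket is an AWS SDK v2 S3 client (which exposes getSignedUrlPromise), and it iterates the key values with map rather than for...in, which would give you the array indices "0" and "1" instead of the actual keys:
router.get(`/api/verification/load`, auth, async (req, res) => {
  try {
    const user = await User.findOne({ GETS A USER })
    const keyArray = [user.idKey, user.selfieKey]

    // Build one promise per key and wait for all of them before responding
    const urlArray = await Promise.all(
      keyArray.map((key) =>
        s3VerificationBucket.getSignedUrlPromise("getObject", {
          Bucket: "app-verification",
          Key: key,
          Expires: 30,
        })
      )
    )

    const [idUrl, selfieUrl] = urlArray
    res.send({ user, idUrl, selfieUrl })
  } catch (err) {
    res.status(500).send()
  }
})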
I have a route to check if a user is logged in. It works well, but I don't understand what the problem is if I create a second route just below it that calls the first one to do the same thing. It seems like I can't access the cookie anymore in the second route, but I don't know why. Thanks for your help!
// This route works:
router.get('/loggedin', async (req, res) => {
try {
const token = req.cookies.jwt;
console.log("token : " + token) // Token is correct here in loggedin route, but is undefined if I use the route below
const decodedToken = jwt.verify(token, process.env.JWT_SECRET);
if (decodedToken) {
res.send(true);
}
else {
res.send(false);
}
}
catch (err) {
res.status(500).send(false);
}
});
// This route calls the route above and doesn't work
router.get('/loggedinbyanotherway', async (req, res) => {
const checking = await fetch(`${process.env.API_URL}:${process.env.PORT || 3000}/loggedin`)
console.log(checking.ok) // Returns false
const data = await checking.json()
console.log(data) // Returns false
res.send(data)
});
Your fetch request isn't providing any cookies, so how could the code handling the request read any cookies?
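If you really did want to call the route over HTTP, you would have to forward the incoming cookie header yourself, along these lines (a sketch against the fetch call you already have):
const checking = await fetch(`${process.env.API_URL}:${process.env.PORT || 3000}/loggedin`, {
headers: { cookie: req.headers.cookie } // pass the caller's cookies along to the inner request
})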
More to the point... This entire operation is unnecessary. Why make an HTTP request to the application you're already using? Instead, extract the functionality you want into a common function and just call that function from both routes. For example:
const isLoggedIn = (req) => {
const token = req.cookies.jwt;
const decodedToken = jwt.verify(token, process.env.JWT_SECRET);
if (decodedToken) {
return true;
} else {
return false;
}
};
router.get('/loggedin', async (req, res) => {
try {
res.send(isLoggedIn(req));
}
catch (err) {
res.status(500).send(false);
}
});
router.get('/loggedinbyanotherway', async (req, res) => {
const checking = isLoggedIn(req);
res.send(checking);
});
In the example it's not really clear why you need the second route or what else it offers, but I can only assume it's just a placeholder for some additional functionality you plan to add.
Either way, the point is that the application doesn't need to make an entire HTTP request to itself, since you're already in that application and have access to the same logic.
My apollo-server setup uses the graphql-upload package, which adds file upload support to GraphQL endpoints, but it only documents uploading a single file and we need multiple file uploads. I get the streams as an array, but when I createReadStream each one and pipe them all into the same cloudinary uploader variable, it only uploads the last created stream rather than each of them.
Code
// graphql resolver
const post = async (_, { post }, { isAuthenticated, user }) => {
if (!isAuthenticated) throw new AuthenticationError("User unauthorized");
const files = await Promise.all(post.files);
let file_urls = [];
const _uploadableFiles = cloudinary.uploader.upload_stream({ folder: "post_files" },
(err, result) => {
console.log("err:", err);
console.log("result:", result);
if (err) throw err;
file_urls.push({
url: result.secure_url,
public_id: result.public_id,
file_type: result.metadata,
});
return result;
}
);
files.forEach(async (file) => await file.createReadStream().pipe(_uploadableFiles));
.... other db related stuff
}
After that, I get the secure_url of the uploaded files, which is returned by the cloudinary upload_stream function's callback. But it only gives me the properties of one stream, the last one of them all. Please help me with this. Is there any way to pipe multiple streams?
Instead of making one const upload stream, make a factory function that returns a fresh upload stream on each call for piping.
Use Array map so that you get an array of promises you can pass to Promise.all.
Each file then gets uploaded to its own upload stream, appending the generated file URL info to file_urls in the success callback; when all of them are done, Promise.all resolves and the code can resume with the other db-related stuff.
const post = async (_, { post }, { isAuthenticated, user }) => {
  if (!isAuthenticated) throw new AuthenticationError("User unauthorized");
  const files = await Promise.all(post.files);
  let file_urls = [];

  // Factory: a fresh Cloudinary upload stream per file, wired to the
  // resolve/reject of the promise wrapping that file's upload.
  function createUploader(resolve, reject) {
    return cloudinary.uploader.upload_stream({ folder: "post_files" },
      (err, result) => {
        if (err) return reject(err);
        file_urls.push({
          url: result.secure_url,
          public_id: result.public_id,
          file_type: result.metadata,
        });
        resolve(result);
      }
    );
  }

  // map instead of forEach: one promise per file, resolved in the upload
  // callback, so Promise.all settles only after every upload has finished.
  await Promise.all(
    files.map(
      (file) =>
        new Promise((resolve, reject) =>
          file.createReadStream().pipe(createUploader(resolve, reject))
        )
    )
  );

  //.... other db related stuff
}
I have an object with two embedded arrays of objects that seem to me to be almost identical (as seen in my database). But when I try to access one of the arrays in frontend javascript, it's apparently empty; the other is not (as seen when I log it to the browser console).
The objects in the arrays are almost exactly the same. I am concerned that when I push a new object onto the 'stakeholders' array, the asynchronous function is not completing before the page loads again, but I am using async/await in that function before returning the response:
addStakeholder = async (req, res, next) => {
...
project.stakeholders.push(stakeholder)
await project.save()
res.status(200).json({
status: 'success',
project: project
Could anyone please tell me what I am likely doing wrong here?
EDIT: Sorry, I'll try to add some more detail. On the form submission there is this.....
createStakeholderForm.addEventListener('submit', async (e) => {
// getting properties etc, this all works
await createStakeholder({ stakeholders, project })
window.setTimeout(() => {
location.reload()
}, 1000)
})
which passes it to this axios function....
createStakeholder = async (data) => {
try {
const url = `http://127.0.0.1:3000/stakeholder`
const res = await axios({
method: 'POST',
url: url,
data: data
})
if (res.data.status === 'success') {
showAlert('success', `Stakeholder created`)
}
} catch (err) {
showAlert('error', err.response.data.message)
}
}
and that route posts to this function.....
addStakeholder = async (req, res, next) => {
const query = { _id: req.body.project }
const project = await Project.findById(query)
const stakeholder = req.body.stakeholders
project.stakeholders.push(stakeholder)
await project.save()
res.status(200).json({
status: 'success',
data: {
data: project
}
})
}
While it's not obvious from your code what is wrong, the debugging path is, fortunately.
Start tracing the wire. It sounds like things are saving in the database correctly but are not reaching the frontend. I would console.log in your backend code at the call site that queries the database. Confirm it's what you expect. Assuming that works, add another console.log downstream, and keep doing that until the stakeholder data vanishes. This exercise will show you where in the code the stakeholders are getting dropped.
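For example, the first probe could go right after the query in your addStakeholder handler (names taken from your posted code), with the next one just before the res.status(200).json(...) call, and then onward into the frontend handler:
const project = await Project.findById(query)
console.log('stakeholders from the DB:', project.stakeholders) // if this is empty, the data is lost before the response is even built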
I'm trying to make a simple API that calls another API that will return some information. The thing is, in order to connect to the second API, I need to attach query parameters to it.
So what I've tried to do so far is to use an axios.get to fetch the API. If I didn't need to add queries on top of that, this would be really simple, but I'm having a really hard time figuring out how to attach queries to my request.
I've created an object that pulled the original query from my end and then used JSON.stringify to turn the object I made into JSON. Then, from my understanding of Axios, you can attach params by separating them from the URL with a comma.
On line 6, I wasn't sure if the variable would carry over, but I definitely can't have the tag var turned into the string "tag", so that's why I left it with the curly brackets and the back ticks. If that's wrong, then please correct me as to how to do it properly.
The var tag is the name of the query parameter that I extracted on my end. That tag is what needs to be passed along to the Axios GET request.
app.get('/api/posts', async (req, res) => {
try {
const url = 'https://myurl.com/blah/blah';
let tag = req.query.tag;
objParam = {
tag: `${tag}`
};
jsonParam = JSON.stringify(objParam);
let response = await axios.get(url, jsonParam);
res.json(response);
} catch (err) {
res.send(err);
}
});
response is SUPPOSED to end up as the JSON returned by the API I'm making the request to.
What I'm actually getting is an Error 400, which makes me think that somehow the URL Axios is hitting and the params aren't lining up. (Is there a way to check where the Axios request is going? If I could see the actual URL that Axios is firing off to, it would help me fix my problem.)
Ideally, this is the flow that I want to achieve. Something is wrong with it but I'm not quite sure where the error is.
-> I make a request to MY api, using the query "science" for example
-> Through my API, Axios makes a GET request to:
https://myurl.com/blah/blah?tag=science
-> I get a response with the JSON from the GET request
-> my API displays the JSON file
After looking at Axios' README, it looks like the second argument needs to be a config object with a params key. You can try:
app.get('/api/posts', async (req, res, next) => {
try {
const url = 'https://myurl.com/blah/blah';
const options = {
params: { tag: req.query.tag }
};
const response = await axios.get(url, options);
res.json(response.data);
} catch (err) {
// Be sure to call next() if you aren't handling the error.
next(err);
}
});
If the above method does not work, you can look into query-string.
const querystring = require('query-string');
app.get('/api/posts', async (req, res, next) => {
try {
const url = 'https://myurl.com/blah/blah?' +
querystring.stringify({ tag: req.query.tag });
const response = await axios.get(url);
res.json(response.data);
} catch (err) {
next(err);
}
});
Responding to your comment, yes, you can combine multiple Axios responses. For example, if I am expecting an object literal to be my response.data, I can do:
const response1 = await axios.get(url1)
const response2 = await axios.get(url2)
const response3 = await axios.get(url3)
const combined = [
{ ...response1.data },
{ ...response2.data },
{ ...response3.data }
]
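If the three requests don't depend on each other, you could also fire them off concurrently and combine the results the same way; a small sketch using the same url1/url2/url3 as above:
const [response1, response2, response3] = await Promise.all([
  axios.get(url1),
  axios.get(url2),
  axios.get(url3),
])
const combined = [{ ...response1.data }, { ...response2.data }, { ...response3.data }]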
I need to request X products from another server and I need to wait for that execution to finish before proceeding and saving the order in the database.
Let's say I receive via POST an array of product IDs that I need to add to the order, e.g.:
JSON FILE:
{
"order_products":[1,2,3,4]
}
Here's a code sample:
//Express module
var router = require('express').Router();
//HTTP Request module
var client = require('request');
//Util that saves the URLs of the other databases
var productURL = require('../utils/product/productURL');
//Builds a product object given a JSON
var productBuilder = require('../utils/product/productBuilder');
router.post('/', (req, res) => {
//Instantiate a new order
var orderInstance = new order({
date: Date.now
});
//Query the products in the other server and add them to the order
req.body.order_products.forEach(id => {
client.get(productURL.HTTPS + id, { json: true }, (err, res, JSONProduct) => {
var product = productBuilder.build(JSONProduct);
orderInstance.order_products.push(product);
});
});
//Save the order in the database
orderInstance.save(....);
//Send response
res.status(201).json(orderInstance);
});
The problem here is that while the loop is still executing, the response is sent (201) and the orderInstance is saved without any product. If I console.log the products they only appear after the orderInstance is saved.
I've tried implementing callbacks to fix this issue, but with no success. I'd appreciate it if anyone could lend me a hand here! Thanks in advance!
forEach runs synchronously - when the forEach ends, the client.get requests may have all been sent out, but the responses surely haven't come back yet. You need to convert each request into a Promise, and then call Promise.all on an array of those Promises. The Promise.all will resolve once all responses have come back. For example:
const allPromises = req.body.order_products.map(id => new Promise((resolve, reject) => {
client.get(productURL.HTTPS + id, { json: true }, (err, res, JSONProduct) => {
if (err) reject(err);
else resolve(productBuilder.build(JSONProduct));
});
}));
Promise.all(allPromises)
.then((newProducts) => {
orderInstance.order_products.push(...newProducts);
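// persist the order here (e.g. orderInstance.save(...) as in your original code) before sending it back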
res.status(201).json(orderInstance);
})
.catch((err) => {
// handle errors
});