Cloud Storage upload through Cloud Functions giving Error: Not Found

I'm using React with CKEditor 5 on the front end. I'm trying to upload an image to a Cloud Storage bucket when the user adds an image to the editor's content. I'm uploading to Cloud Storage from a Cloud Function, but I'm getting Error: Not Found, and I'm not really sure what to do with it.
Here's my code:
module.exports = async (req) => {
  const tmpFilePath = path.join(os.tmpdir(), 'tmpImage.png')
  fs.writeFile(tmpFilePath, req.body, (err) => {
    if (err) throw err
    fs.readFile(tmpFilePath, (err, data) => {
      console.log('Error: ', err)
      console.log('Data: ', data)
    })
  })
  try {
    await storage.bucket('bucketUrl').upload(tmpFilePath, {
      gzip: true,
      metadata: {
        cacheControl: 'public, max-age=31536000',
      }
    })
    return 'Something'
  } catch (err) {
    console.log('----- ERROR START -----')
    console.log(err)
    console.log('----- ERROR END -----')
    return { error: err }
  }
}
Error:
Error: Not Found
    at new ApiError (/Users/garrettlove/development/devgrub/functions/node_modules/@google-cloud/storage/node_modules/@google-cloud/common/build/src/util.js:59:15)
    at Util.parseHttpRespMessage (/Users/garrettlove/development/devgrub/functions/node_modules/@google-cloud/storage/node_modules/@google-cloud/common/build/src/util.js:161:41)
    at Util.handleResp (/Users/garrettlove/development/devgrub/functions/node_modules/@google-cloud/storage/node_modules/@google-cloud/common/build/src/util.js:135:76)
    at retryRequest (/Users/garrettlove/development/devgrub/functions/node_modules/@google-cloud/storage/node_modules/@google-cloud/common/build/src/util.js:434:22)
    at onResponse (/Users/garrettlove/development/devgrub/functions/node_modules/retry-request/index.js:206:7)
    at res.text.then.text (/Users/garrettlove/development/devgrub/functions/node_modules/@google-cloud/storage/node_modules/teeny-request/build/src/index.js:150:17)
    at process._tickCallback (internal/process/next_tick.js:68:7)
If you want to see the rest of the error let me know, it's a little much to post.

The code is attempting to upload the file before it has been written to disk. fs.writeFile is asynchronous and returns immediately, before the write is complete. According to the fs.writeFile API documentation:
When file is a filename, asynchronously writes data to the file, replacing the file if it already exists. data can be a string or a buffer.
If you want to wait until the file is fully written, consider one of the following (a sketch of the promise-based option follows this list):
Uploading the file only after the write is complete, by putting the upload code inside the callback function that you pass to fs.writeFile.
Using the promise-based version of writeFile (fs.promises.writeFile) instead of the callback version, and chaining on the returned promise.
Using fs.writeFileSync to write synchronously.
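A minimal sketch of the promise-based approach, assuming storage is an already-initialized @google-cloud/storage client and 'my-bucket' is a placeholder for your real bucket name:

const fs = require('fs').promises;
const os = require('os');
const path = require('path');

module.exports = async (req) => {
  const tmpFilePath = path.join(os.tmpdir(), 'tmpImage.png');
  // Wait for the write to finish before starting the upload.
  await fs.writeFile(tmpFilePath, req.body);
  // 'my-bucket' is a placeholder; use your actual bucket name.
  await storage.bucket('my-bucket').upload(tmpFilePath, {
    gzip: true,
    metadata: { cacheControl: 'public, max-age=31536000' },
  });
  return 'Something';
};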

Related

Node.js file gets deleted before sending response (res.send)

Here the problem is that the file gets deleted from the server before the response is sent to the client, which results in an error that there is no image at this path.
// Here some code
res.status(200).sendFile(path.join(__dirname, "../../image", `/${response.id}.png`));
// Delete image from server
fs.unlink(imagePath, function (err) {
  if (err) throw err;
  console.log('File deleted!');
});
You will need to monitor the callback to res.sendFile() so you know when the sending is actually done before you can safely delete your file.
res.sendFile() is asynchronous so it returns before the job is done, thus you were deleting the file before res.sendFile() was done. Use the callback to know when it's actually done.
let fname = path.join(__dirname, "../../image", `/${response.id}.png`);
res.status(200).sendFile(fname, err => {
  if (err) {
    console.log(err);
    res.sendStatus(500);
  }
  fs.unlink(fname, function (err) {
    // log any error
    if (err) {
      console.log(err);
    }
  });
});
Note: When you pass a callback to res.sendFile(), you have to manually handle an error condition and send an appropriate error response yourself (or retry, send alternative content, etc.).
I'm also wondering why you're sending one filename and attempting to delete a different one. Wouldn't it make sense to use the same local variable for the filename you're sending and the one you're deleting?

Problems Downloading files using Dropbox JavaScript SDK

I need to figure out where my files are downloading when I use the filesDownload(). I don't see an argument for file destination. Here's my code:
require('isomorphic-fetch');
var Dropbox = require('dropbox').Dropbox;
var dbx = new Dropbox({ accessToken: 'accessToken', fetch });
dbx.filesDownload({ path: 'filepath' })
  .then(function (response) {
    console.log(response);
  })
  .catch(function (error) {
    console.log(error);
  });
I'm getting a successful callback when I run the code but I don't see the file anywhere.
I need to know where my files are downloading to and how to specify the file destination in my function.
Thanks,
Gerald
I've used the function as described in the SDK's documentation (http://dropbox.github.io/dropbox-sdk-js/Dropbox.html#filesDownload__anchor) but I have no idea where my file goes.
Expected Result: Files are downloaded from Dropbox to the path that I have designated.
Actual Results: I get a successful callback from Dropbox, but I cannot find the downloaded files.
In Node.js, the Dropbox API v2 JavaScript SDK download-style methods return the file data in the fileBinary property of the object they pass to the callback (which is response in your code).
You can find an example of that here:
https://github.com/dropbox/dropbox-sdk-js/blob/master/examples/javascript/node/download.js#L20
So, you should be able to access the data as response.fileBinary. It doesn't automatically save it to the local filesystem for you, but you can then do so if you want.
You need to use the fs module to save the binary data to a file.
const fs = require('fs');

dbx.filesDownload({ path: YourfilePath })
  .then(function (response) {
    console.log(response.media_info);
    fs.writeFile(response.name, response.fileBinary, 'binary', function (err) {
      if (err) { throw err; }
      console.log('File: ' + response.name + ' saved.');
    });
  })
  .catch(function (error) {
    console.error(error);
  });

Catch AWS S3 Get Object Stream Errors Node.js

I'm trying to build an Express server that will send items in an S3 bucket to the client using Node.js and Express.
I found the following code on the AWS documentation.
var s3 = new AWS.S3({apiVersion: '2006-03-01'});
var params = {Bucket: 'myBucket', Key: 'myImageFile.jpg'};
var file = require('fs').createWriteStream('/path/to/file.jpg');
s3.getObject(params).createReadStream().pipe(file);
I have changed it slightly to the following:
app.get("/", (req, res) => {
  const params = {
    Bucket: env.s3ImageBucket,
    Key: "images/profile/abc"
  };
  s3.getObject(params).createReadStream().pipe(res);
});
I believe this should work fine. The problem I'm running into is when the file doesn't exist or S3 returns some type of error. The application crashes and I get the following error:
NoSuchKey: The specified key does not exist
My question is, how can I catch or handle this error? I have tried a few things, such as wrapping that s3.getObject line in a try/catch block, none of which have worked.
How can I catch an error and handle it my own way?
I suppose you can catch the error by listening for the stream's error event first.
s3.getObject(params)
  .createReadStream()
  .on('error', (e) => {
    // NoSuchKey & others
  })
  .pipe(res)
  .on('data', (data) => {
    // data
  })
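If you also want the client to receive a clean error response rather than a dropped connection, here is a hedged sketch along the same lines (the 404/500 mapping is an assumption, not part of the original answer):

app.get("/", (req, res) => {
  const params = { Bucket: env.s3ImageBucket, Key: "images/profile/abc" };
  s3.getObject(params)
    .createReadStream()
    .on('error', (e) => {
      // Assumption: map a missing key to 404, everything else to 500.
      // Only send a status if nothing has been streamed to the client yet.
      if (!res.headersSent) {
        res.sendStatus(e.code === 'NoSuchKey' ? 404 : 500);
      }
    })
    .pipe(res);
});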

Error uploading to Cloud Storage using a Cloud Function

I am trying to upload files to google cloud storage using a cloud function which is triggered by HTTP. However when the cloud function sends the file to be uploaded I often (although not always) get the following error
ERROR uploading to storage: { ApiError: Anonymous caller does not have storage.objects.create access to bucket_name/folder/test.jpg.
I am not sure why this error occurs, or why it only happens some of the time.
Here is the code:
const storage = require('@google-cloud/storage')();

function uploadToStorage(filepath, folder, filename) {
  const options = {
    destination: bucket.file(`${folder}/${filename}`),
    public: false,
    resumable: false
  };
  storage
    .bucket(BUCKET_NAME)
    .upload(filepath, options)
    .then(function () {
      console.log(`${filename} uploaded to ${BUCKET_NAME}`);
    })
    .catch((err) => {
      console.error('ERROR uploading to storage: ', err);
    });
}
Thanks
I had the same error after adding a return statement at the end of my function that performed file deletes on storage objects. This is what I was doing:
Make a database call to get some data
Once that request comes back, delete some files out of cloud storage (GCS)
The code structurally looked like this:
deleteStuffOutStorage() {
  admin.firestore().doc(`My-doc-ref`).get()
    .then(snapshot => {
      // Do the deleting here {Interacting with GCS}
      return deleteFile(snapshot.data().path); // Deletes file
    })
    .then(success => {
      // Worked
    })
    .catch(error => {
      // Error = ApiError: Anonymous caller does not have storage.objects...
    })
  return; // This statement was creating the problems
}
When I removed the return statement, I no longer got the error. I thought that in my case it might have something to do with the firebase-admin object instance (or at least its GCS auth token) being deallocated and re-allocated between the asynchronous operations (steps 1 and 2 above).
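For what it's worth, a more common explanation is that Cloud Functions only guarantees to keep the instance alive for async work whose promise you return; returning early lets the runtime tear the instance down mid-operation, which might explain intermittent errors like this. A minimal sketch of that pattern, keeping the names from the snippet above:

deleteStuffOutStorage() {
  // Return the promise chain so the Cloud Functions runtime waits for
  // the Firestore read and the GCS delete to finish before tearing the
  // instance down.
  return admin.firestore().doc(`My-doc-ref`).get()
    .then(snapshot => deleteFile(snapshot.data().path))
    .catch(error => console.error(error));
}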
All FCF instances should have access to GCS via a service account that is auto-generated. You can confirm this in the GCP console : https://console.cloud.google.com/iam-admin/serviceaccounts/
From the code snippet you posted I can't see anything that would cause the same issue I was getting, but maybe have a think about any time-based events that could cause this behaviour. That may explain the inconsistent behaviour you allude to.
Hope that's some sort of help.

File is deleted before it's used in Node.js

I'm new to Node.js and I'm trying to do the following:
function createPasswordfile(content) {
  fs.writeFile(passwordFileName, content, function (err) {
    if (err) {
      console.log("Failed on creating the file " + err)
    }
  });
  fs.chmodSync(passwordFileName, '400');
}

function deletePasswordFile() {
  fs.chmodSync(passwordFileName, '777');
  fs.unlink(passwordFileName, function (err) {
    if (err) throw err;
    console.log('successfully deleted');
  });
}
and there are three statements which call these functions:
createPasswordfile(password)
someOtherFunction() // which needs the created password file
deletePasswordFile()
The problem I'm facing is when I add the deletePasswordFile() method call, I get error like this:
Failed on creating the file Error: EACCES, open 'password.txt'
successfully deleted
Since it's non-blocking, I guess the deletePasswordFile function deletes the file before the other function makes use of it.
If deletePasswordFile is commented out, things work fine.
How should I prevent this?
writeFile is asynchronous, so it's possible the file is still being written when you try and delete it.
Try changing to writeFileSync.
fs.writeFileSync(passwordFileName, content);
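A minimal sketch of the fully synchronous flow, reusing the names from the question (and assuming someOtherFunction is itself synchronous):

function createPasswordfile(content) {
  // Blocks until the file is written and its mode is set.
  fs.writeFileSync(passwordFileName, content);
  fs.chmodSync(passwordFileName, '400');
}

function deletePasswordFile() {
  fs.chmodSync(passwordFileName, '777');
  // Blocks until the file is removed.
  fs.unlinkSync(passwordFileName);
}

createPasswordfile(password);
someOtherFunction(); // assumed synchronous; it needs the password file
deletePasswordFile();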
