I'm trying to download an image from a URL and then upload it to my Firebase Cloud Storage.
This is the code I'm using:
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
const download = require('image-downloader');
const tmp = require('tmp');

export const downloadFunction = functions.https.onCall(async (data, context) => {
    var bucket = admin.storage().bucket();
    await tmp.dir(async function _tempDirCreated(err: any, path: any) {
        if (err) throw err;
        const options = {
            url: 'theUrlIWantToPutInTheStorage',
            dest: path,
        };
        console.log('Dir: ', path);
        await download.image(options)
            .then(async () => {
                console.log('Saved');
                await bucket.upload(path, {
                    destination: "testfolder/test.jpg",
                    metadata: "metadata",
                });
            })
            .catch((err2: any) => console.error(err2));
    });
});
But in the Firebase console logs I get this error:
{ Error: EISDIR: illegal operation on a directory, read errno: -21, code: 'EISDIR', syscall: 'read' }
What am I doing wrong?
Thanks in advance!
The path that you provide to the upload method should be a file, not a directory.
upload(pathString, [options], [callback]) → {Promise.<UploadResponse>}
Upload a file to the bucket. This is a convenience method that wraps File#createWriteStream.
Example:
const options = {
  destination: 'new-image.png',
  resumable: true,
  validation: 'crc32c',
  metadata: {
    metadata: {
      event: 'Fall trip to the zoo'
    }
  }
};

bucket.upload('local-image.png', options, function(err, file) {
  // Your bucket now contains:
  // - "new-image.png" (with the contents of `local-image.png`)
  // `file` is an instance of a File object that refers to your new file.
});
https://googleapis.dev/nodejs/storage/latest/Bucket.html
node version: 14.20.0
jest version: 28.1.1
ts-jest version: 28.0.5
I have the following function, which is to be tested:
uploadFile.ts
import * as fs from 'fs';
// logger is assumed to be imported/configured elsewhere in the module

export async function uploadFile(containerName, customerId, fileName, filePath, logCtx) {
    fs.stat(filePath, (err, stats) => {
        if (err) {
            logger.error({ err, ...logCtx }, `File doesn't exist. File path: ${filePath}`);
        } else {
            logger.info(logCtx, `File size: ${stats.size}`);
            if (stats.size < 1) {
                logger.error({ err, ...logCtx }, `Byte size is ${stats.size} - File is empty. File path: ${filePath}`);
                throw new Error("File is empty.");
            }
        }
    });
    // do some other things
}
The uploadFile function uses the fs module, and I decided to mock it since we don't need to do anything with real files for testing.
uploadFile.test.js
// mock fs package
const fs = { stat: jest.fn(), createReadStream: jest.fn() };
jest.mock("fs", () => {
  return fs;
});

it("should call fs.stat once", async () => {
  // Arrange
  const { uploadFile } = require("../../../lib/global/uploadFile");
  const containerName = "qz";
  const customerId = "w3";
  const fileName = "none.pdf";
  const filePath = "src/services/none.pdf";
  const logCtx = {};

  // Act
  await uploadFile(containerName, customerId, fileName, filePath, logCtx);

  // Assert
  expect(fs.stat).toBeCalledWith(filePath, expect.any(Function));
  expect(fs.stat).toBeCalledTimes(1);
});
When running the above test file, the test case fails with the following error:
● should call fs.stat once
require-at: stat 'C:\guardian group\c360-accounts' failed: Fs.statSync is not a function
at makeIt (node_modules/require-at/require-at.js:19:15)
at requireAt (node_modules/require-at/require-at.js:35:10)
at Object.<anonymous> (node_modules/mongoose/node_modules/mongodb/node_modules/optional-require/src/index.ts:318:64)
at Object.<anonymous> (node_modules/mongoose/node_modules/mongodb/node_modules/optional-require/index.js:8:13)
I changed the mock of "fs" to the following, hoping it would resolve the error, but it shows a different error instead:
const fs = { stat: jest.fn(), createReadStream: jest.fn(), statSync: jest.fn() };
The error
● should call fs.stat once
require-at: not a directory: 'C:\guardian group\c360-accounts'
at makeIt (node_modules/require-at/require-at.js:23:28)
at makeIt (node_modules/require-at/require-at.js:24:16)
at requireAt (node_modules/require-at/require-at.js:35:10)
at Object.<anonymous> (node_modules/mongoose/node_modules/mongodb/node_modules/optional-require/src/index.ts:318:64)
at Object.<anonymous> (node_modules/mongoose/node_modules/mongodb/node_modules/optional-require/index.js:8:13)
What's happening here? Am I missing something?
Thanks in advance
I am using the source code from a security rules tutorial to attempt integration testing with Jest for my JavaScript async function async_create_post, used by my Firebase HTTP function create_post. The files involved have the following directory structure:
Testing file: root/tests/handlers/posts.test.js
File to be tested: root/functions/handlers/posts.js
Helper code from the tutorial: root/tests/rules/helpers.js
And here is the source code that is involved:
posts.test.js
const { setup, teardown } = require("../rules/helpers");
const {
  async_get_all_undeleted_posts,
  async_get_post,
  async_delete_post,
  async_create_post
} = require("../../functions/handlers/posts");

describe("Post Creation", () => {
  afterEach(async () => {
    await teardown();
  });

  test("should create a post", async () => {
    const db = await setup();
    const malloryUID = "non-existent uid";
    const firstPost = {
      body: "First post from Mallory",
      author_id: malloryUID,
      images: ["url1", "url2"]
    };
    const before_post_snapshot = await db.collection("posts").get();
    expect(before_post_snapshot.docs.length).toBe(0);
    await async_create_post(firstPost); // fails at this point: expected to create a new post, but an error is thrown instead
    const after_post_snapshot = await db.collection("posts").get();
    expect(after_post_snapshot.docs.length).toBe(1);
  });
});
posts.js
const { admin, db } = require('../util/admin');
// admin.initializeApp(config); // my credentials
// const db = admin.firestore();
const { uuid } = require("uuidv4");
const {
  success_response,
  error_response
} = require("../util/validators");

exports.async_create_post = async (data, context) => {
  try {
    const images = [];
    data.images.forEach((url) => {
      images.push({
        uid: uuid(),
        url: url
      });
    });
    const postRecord = {
      body: data.body,
      images: images,
      last_updated: admin.firestore.FieldValue.serverTimestamp(),
      like_count: 0,
      comment_count: 0,
      deleted: false,
      author_id: data.author_id
    };
    const generatedToken = uuid();
    await db
      .collection("posts")
      .doc(generatedToken)
      .set(postRecord);
    // return success_response();
    return success_response(generatedToken);
  } catch (error) {
    console.log("Error in creation of post", error);
    return error_response(error);
  }
};
When I run the test in the WebStorm IDE, with one terminal running firebase emulators:start, I get the following error message:
console.log
  Error in creation of post TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string. Received an instance of Object
      at validateString (internal/validators.js:120:11)
      at Object.basename (path.js:1156:5)
      at GrpcClient.loadProto (/Users/isaac/Desktop/project/functions/node_modules/google-gax/src/grpc.ts:166:23)
      at new FirestoreClient (/Users/isaac/Desktop/project/functions/node_modules/@google-cloud/firestore/build/src/v1/firestore_client.js:118:38)
      at ClientPool.clientFactory (/Users/isaac/Desktop/project/functions/node_modules/@google-cloud/firestore/build/src/index.js:330:26)
      at ClientPool.acquire (/Users/isaac/Desktop/project/functions/node_modules/@google-cloud/firestore/build/src/pool.js:87:35)
      at ClientPool.run (/Users/isaac/Desktop/project/functions/node_modules/@google-cloud/firestore/build/src/pool.js:164:29)
      at Firestore.request (/Users/isaac/Desktop/project/functions/node_modules/@google-cloud/firestore/build/src/index.js:961:33)
      at WriteBatch.commit_ (/Users/isaac/Desktop/project/functions/node_modules/@google-cloud/firestore/build/src/write-batch.js:485:48)
      at exports.async_create_post (/Users/isaac/Desktop/project/functions/handlers/posts.js:36:5) {
    code: 'ERR_INVALID_ARG_TYPE'
  }
      at exports.async_create_post (/Users/isaac/Desktop/project/functions/handlers/posts.js:44:13)
Error: expect(received).toBe(expected) // Object.is equality
Expected: 1
Received: 0
<Click to see difference>
at Object.<anonymous> (/Users/isaac/Desktop/project/tests/handlers/posts.test.js:59:45)
The Error in creation of post line comes from the console.log("Error in creation of post", error); in posts.js, so that is the error shown in the title of this post.
I want to know why calling async_create_post from posts.test.js causes this error and does not populate my database with an additional record, as expected. Do inform me if more information is required to solve the problem.
Here are some code snippets that may give more context.
helpers.js [Copied from the repository]
const firebase = require("@firebase/testing");
const fs = require("fs");

module.exports.setup = async (auth, data) => {
  const projectId = `rules-spec-${Date.now()}`;
  const app = firebase.initializeTestApp({
    projectId,
    auth
  });
  const db = app.firestore();

  // Apply the test rules so we can write documents
  await firebase.loadFirestoreRules({
    projectId,
    rules: fs.readFileSync("firestore-test.rules", "utf8")
  });

  // write mock documents if any
  if (data) {
    for (const key in data) {
      const ref = db.doc(key); // This means the key should point directly to a document
      await ref.set(data[key]);
    }
  }

  // Apply the actual rules for the project
  await firebase.loadFirestoreRules({
    projectId,
    rules: fs.readFileSync("firestore.rules", "utf8")
  });

  return db;
  // return firebase;
};

module.exports.teardown = async () => {
  // Delete all apps currently running in the firebase simulated environment
  await Promise.all(firebase.apps().map(app => app.delete()));
};
// Add extensions onto the expect method
expect.extend({
  async toAllow(testPromise) {
    let pass = false;
    try {
      await firebase.assertSucceeds(testPromise);
      pass = true;
    } catch (error) {
      // log error to see which rules caused the test to fail
      console.log(error);
    }
    return {
      pass,
      message: () =>
        "Expected Firebase operation to be allowed, but it was denied"
    };
  }
});

expect.extend({
  async toDeny(testPromise) {
    let pass = false;
    try {
      await firebase.assertFails(testPromise);
      pass = true;
    } catch (error) {
      // log error to see which rules caused the test to fail
      console.log(error);
    }
    return {
      pass,
      message: () =>
        "Expected Firebase operation to be denied, but it was allowed"
    };
  }
});
index.js
const functions = require('firebase-functions');
const {
  async_get_all_undeleted_posts,
  async_get_post,
  async_delete_post,
  async_create_post
} = require('./handlers/posts');

exports.create_post = functions.https.onCall(async_create_post);
The error message means that a method of the path module (like path.join or path.basename) expects one of its arguments to be a string but got something else.
I found the offending line by binary search: commenting out parts of the program until the error was gone.
Maybe one of your modules uses path and you supply the wrong arguments.
Documentation is extremely frustrating.
I'm using the upload widget to try to allow users to upload multiple pictures for their profile. I can't use unsigned uploads because of the potential for abuse.
I would much rather upload the file through the upload widget than through the server, as it seems like it should be so simple.
I've pieced together what I think should work, but it is still saying: Upload preset must be whitelisted for unsigned uploads.
Server:
// grab a current UNIX timestamp
const millisecondsToSeconds = 1000;
const timestamp = Math.round(Date.now() / millisecondsToSeconds);

// generate the signature using the current timestamp and any other desired Cloudinary params
const signature = cloudinaryV2.utils.api_sign_request({ timestamp }, CLOUDINARY_SECRET_KEY);

// craft a signature payload to send to the client (timestamp and signature required)
return signature;
also tried
return {
  signature,
  timestamp,
};
also tried
const signature = cloudinaryV2.utils.api_sign_request(
  data.params_to_sign,
  CLOUDINARY_SECRET_KEY,
);
Client:
const generateSignature = async (callback: Function, params_to_sign: object): Promise<void> => {
  try {
    const signature = await generateSignatureCF({ slug: 'xxxx' });
    // also tried { slug: 'xxxx', params_to_sign }
    callback(signature);
  } catch (err) {
    console.log(err);
  }
};

cloudinary.openUploadWidget(
  {
    cloudName: 'xxx',
    uploadPreset: 'xxxx',
    sources: ['local', 'url', 'facebook', 'dropbox', 'google_photos'],
    folder: 'xxxx',
    apiKey: ENV.CLOUDINARY_PUBLIC_KEY,
    uploadSignature: generateSignature,
  },
  function(error, result) {
    console.log(error);
  },
);
Let's all take a moment to point out how horrible Cloudinary's documentation is. It's easily the worst I've ever seen. Nightmare fuel.
Now that I've got that off my chest... I really needed to be able to do this, and I spent way too long banging my head against walls for what should be extremely simple. Here it is...
Server (Node.js)
You'll need an endpoint that returns a signature-timestamp pair to the frontend:
import cloudinary from 'cloudinary'

export async function createImageUpload() {
  const timestamp = new Date().getTime()
  const signature = await cloudinary.utils.api_sign_request(
    {
      timestamp,
    },
    process.env.CLOUDINARY_SECRET
  )
  return { timestamp, signature }
}
Client (Browser)
The client makes a request to the server for a signature-timestamp pair and then uses that to upload a file. The file used in the example should come from an <input type='file'/> change event etc.
const CLOUD_NAME = process.env.CLOUDINARY_CLOUD_NAME
const API_KEY = process.env.CLOUDINARY_API_KEY

async function uploadImage(file) {
  const { signature, timestamp } = await api.post('/image-upload')

  const form = new FormData()
  form.append('file', file)

  const res = await fetch(
    `https://api.cloudinary.com/v1_1/${CLOUD_NAME}/image/upload?api_key=${API_KEY}&timestamp=${timestamp}&signature=${signature}`,
    {
      method: 'POST',
      body: form,
    }
  )
  const data = await res.json()
  return data.secure_url
}
That's it. That's all it takes. If only Cloudinary had this in their docs.
Man. I hate my life. I finally figured it out. It literally took me beautifying the upload widget JS to understand that the return of the function should be a string instead of an object, even though the docs make it seem otherwise.
Here is how to implement a signed upload with a Firebase Cloud Function
import * as functions from 'firebase-functions';
import cloudinary from 'cloudinary';

const CLOUDINARY_SECRET_KEY = functions.config().cloudinary.key;
const cloudinaryV2 = cloudinary.v2;

module.exports.main = functions.https.onCall(async (data, context: CallableContext) => {
  // Checking that the user is authenticated.
  if (!context.auth) {
    // Throwing an HttpsError so that the client gets the error details.
    throw new functions.https.HttpsError(
      'failed-precondition',
      'The function must be called while authenticated.',
    );
  }
  try {
    return cloudinaryV2.utils.api_sign_request(data.params_to_sign, CLOUDINARY_SECRET_KEY);
  } catch (error) {
    throw new functions.https.HttpsError('failed-precondition', error.message);
  }
});

// CLIENT
const uploadWidget = () => {
  const generateSignature = async (callback: Function, params_to_sign: object): Promise<void> => {
    try {
      const signature = await generateImageUploadSignatureCF({ params_to_sign });
      callback(signature.data);
    } catch (err) {
      console.log(err);
    }
  };

  cloudinary.openUploadWidget(
    {
      cloudName: 'xxxxxx',
      uploadSignature: generateSignature,
      apiKey: ENV.CLOUDINARY_PUBLIC_KEY,
    },
    function(error, result) {
      console.log(error);
    },
  );
};
I'm experiencing this timeout when trying to use Electron to convert an HTML file to PDF. I'm running this JS app through Node.
{ Error: Worker Timeout, the worker process does not respond after 10000 ms
    at Timeout._onTimeout (C:\Users\Owner\Desktop\code\PDF-Profile-Generator\node_modules\electron-workers\lib\ElectronManager.js:377:21)
    at ontimeout (timers.js:436:11)
    at tryOnTimeout (timers.js:300:5)
    at listOnTimeout (timers.js:263:5)
    at Timer.processTimers (timers.js:223:10)
  workerTimeout: true,
  message: 'Worker Timeout, the worker process does not respond after 10000 ms',
  electronTimeout: true }
I do not know much about Electron, and I have not been able to do much to debug it. The JS code is meant to generate an HTML file based on user input pulled from a GitHub profile; that HTML file then needs to be converted to a PDF file.
My JS code is as follows:
const fs = require("fs");
const convertapi = require('convertapi')('tTi0uXTS08ennqBS');
const path = require("path");
const generate = require("./generateHTML");
const inquirer = require("inquirer");
const axios = require("axios");

const questions = ["What is your Github user name?", "Pick your favorite color?"];

function writeToFile(fileName, data) {
  return fs.writeFileSync(path.join(process.cwd(), fileName), data);
}

function promptUser() {
  return inquirer.prompt([
    {
      type: "input",
      name: "username",
      message: questions[0]
    },
    {
      type: "list",
      name: "colorchoice",
      choices: ["green", "blue", "pink", "red"],
      message: questions[1]
    }
  ]);
}

function init() {
  promptUser()
    .then(function ({ username, colorchoice }) {
      const color = colorchoice;
      const queryUrl = `https://api.github.com/users/${username}`;
      let html;
      axios
        .get(queryUrl)
        .then(function (res) {
          res.data.color = color;
          const starArray = res.data.starred_url.split(",");
          res.data.stars = starArray.length;
          console.log(res);
          html = generate(res.data);
          console.log(html);
          writeToFile("profile.html", html);
        });

      var convertFactory = require('electron-html-to');
      var conversion = convertFactory({
        converterPath: convertFactory.converters.PDF
      });

      conversion({ file: './profile.html' }, function (err, result) {
        if (err) {
          return console.error(err);
        }
        console.log(result.numberOfPages);
        console.log(result.logs);
        result.stream.pipe(fs.createWriteStream(__dirname + '/profile.pdf'));
        conversion.kill(); // necessary if you use the electron-server strategy, see below for details
      });

      // convertapi.convert('pdf', { File: './profile.html' })
      //   .then(function (result) {
      //     // get converted file url
      //     console.log("Converted file url: " + result.file.url);
      //     // save to file
      //     return result.file.save(__dirname + "/profile.pdf");
      //   })
      //   .then(function (file) {
      //     console.log("File saved: " + file);
      //   });
    });
}

init();
I had a similar problem. I had installed multiple Electron packages (electron, electron-html-to, electron-prebuilt), and the problem was resolved when I deleted the older ones from package.json so only one was left. The assumption is that they were interfering with each other.
So check the installed versions of Electron, because the problem might be there rather than in your code.
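Concretely, the cleanup means keeping a single Electron runtime in package.json; electron-prebuilt in particular is the deprecated predecessor of the electron package and should be removed if both appear. A sketch of the resulting dependencies (version numbers illustrative):

```json
{
  "dependencies": {
    "electron": "^5.0.0",
    "electron-html-to": "^2.6.0"
  }
}
```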
So I've been working with GraphQL uploads, and before stating my problem, here's an overview of the tech stack I am using:
Backend: Mongoose, Express, Apollo, GraphQL
Frontend: VueJS, Apollo, GraphQL
I'm using Apollo Upload Client to send the upload files to the server side from the client. Since I am sending a list of files of the scalar type Upload from the client, I am receiving a list of promises that need to be resolved. On using Promise.all() I am getting the following error (which, weirdly, I wasn't getting before, and I don't know why). If I upload more than one file, the first file just gets lost somewhere and only the second file uploads. But this doesn't happen every time; maybe I am not resolving or handling the promises properly. Note that I also have to save the file name in MongoDB through Mongoose.
{ BadRequestError: Request disconnected during file upload stream parsing.
    at IncomingMessage.request.once (F:\repos\pushbox\node_modules\graphql-upload\lib\processRequest.js:245:35)
    at Object.onceWrapper (events.js:285:13)
    at IncomingMessage.emit (events.js:197:13)
    at resOnFinish (_http_server.js:583:7)
    at ServerResponse.emit (events.js:202:15)
    at onFinish (_http_outgoing.js:683:10)
    at processTicksAndRejections (internal/process/next_tick.js:74:9)
  message: 'Request disconnected during file upload stream parsing.',
  expose: true,
  statusCode: 499,
  status: 499 }
I have an HTML file input tag that takes multiple files and the mutation I use is:
async uploadFiles() {
  // Check if input tag is empty
  if (this.files.length === 0) {
    this.uploadErrorAlert = true;
    return;
  }

  // Mutation
  this.isUploading = true;
  await this.$apollo.mutate({
    mutation: UPLOAD_FILES,
    variables: {
      files: this.files,
      id: this.selectedCard.id,
    },
  })
    .then(() => {
      // clear files from the input tag
      this.files = '';
      this.$refs.selectedFiles.value = '';
      this.isUploading = false;
    })
    .catch((err) => {
      console.error(err);
    });
},
And finally, the resolver on the server is:
/**
 * Uploads files sent on disk and saves
 * the file names in the DB
 *
 * @param {Object} attachments - List of files for a card
 *
 * @return {Boolean} - true if upload is successful
 */
uploadFiles: async (_, attachments, { controllers }) => {
  Promise.all(attachments.files.map(async (file) => {
    const { createReadStream, filename } = await file;
    const stream = createReadStream();
    /**
     * We need unique names for every file being uploaded,
     * so we use the ID generated by MongoDB and concat it
     * to the filename sent by the user.
     *
     * Therefore we instantiate an attachment object to get an ID
     */
    const attachment = await controllers.attachment.add({ id: attachments.id, file: '' });
    const newFileName = `${attachment.id}_${filename}`;
    const path = `${process.env.UPLOAD_DIR}/${newFileName}`;
    await controllers.attachment.update({
      id: attachment.id,
      file: newFileName,
    });
    console.log(`reached for ${path}`);
    // Attempting to save file in server
    return new Promise((resolve, reject) => stream
      .pipe(createWriteStream(path))
      .on('finish', () => resolve())
      .on('error', (error) => {
        console.log('dude?');
        if (stream.truncated) {
          // Delete the truncated file
          unlinkSync(path);
        }
        reject(error);
      }));
  })).then(() => {
    pubsub.publish(ATTACHMENTS_ADDED, { attachmentsChanged: controllers.attachment.getAll() });
  }).catch((err) => {
    console.log(err);
  });
},
Any help would be appreciated!
Okay, so I don't know how I missed it, but this issue is the solution! It is on the GitHub issue forum of the module I am using.
The problem is solved by using await before the Promise.all() call. The code inside the uploadFiles resolver now looks like:
await Promise.all(attachments.files.map(async (file) => {
  const { createReadStream, filename } = await file;
  const stream = createReadStream();
  /**
   * We need unique names for every file being uploaded,
   * so we use the ID generated by MongoDB and concat it
   * to the filename sent by the user.
   *
   * Therefore we instantiate an attachment object to get an ID
   */
  const attachment = await controllers.attachment.add({ id: attachments.id, file: '' });
  const newFileName = `${attachment.id}_${filename}`;
  const path = `${process.env.UPLOAD_DIR}/${newFileName}`;
  await controllers.attachment.update({
    id: attachment.id,
    file: newFileName,
  });
  console.log(`reached for ${path}`);
  // Attempting to save file in server
  return new Promise((resolve, reject) => stream
    .pipe(createWriteStream(path))
    .on('finish', () => resolve())
    .on('error', (error) => {
      console.log('dude?');
      if (stream.truncated) {
        // Delete the truncated file
        unlinkSync(path);
      }
      reject(error);
    }));
})).then(() => {
  pubsub.publish(ATTACHMENTS_ADDED, { attachmentsChanged: controllers.attachment.getAll() });
}).catch((err) => {
  console.log(err);
});
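The difference is easy to see in isolation: without the await, the async resolver resolves while the upload streams are still being consumed, Express ends the request, and graphql-upload reports the disconnect. A minimal sketch (timings and names are illustrative):

```javascript
// Minimal sketch of why the await matters: without it, the enclosing
// async function returns before the mapped work finishes, just like the
// resolver returned before the upload streams were drained.
const makeTasks = () => [10, 20].map(
  (ms) => new Promise((resolve) => setTimeout(resolve, ms))
);

async function withoutAwait(state) {
  Promise.all(makeTasks()).then(() => { state.done = true; }); // fire-and-forget
}

async function withAwait(state) {
  await Promise.all(makeTasks()).then(() => { state.done = true; });
}

(async () => {
  const a = { done: false };
  await withoutAwait(a);
  console.log(a.done); // false: the handler returned before the work did

  const b = { done: false };
  await withAwait(b);
  console.log(b.done); // true: the handler waited for every promise
})();
```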