Create File or Blob from local path in JavaScript

I have searched extensively on Stack Overflow but haven't been able to find an answer that fits my needs.
I am trying to upload a number of files to Firebase Storage, which requires a File or Blob object.
var file = ... // use the Blob or File API
ref.put(file).then(function(snapshot) {
  console.log('Uploaded a blob or file!');
});
I have a folder in my project with all the files I want to upload, and I'm trying to create such objects with their paths. However, none of my attempts have worked.
I tried importing the file:
let file = require('./Images/imagename.jpg');
and I researched using 'fs', the File API and other options, but none seem to have a way for me to get the file into an object using only the path.
In short: is there any simple way to get the object from a local path?

Here is how you can upload a file from the drive to Firebase Storage:
let bucket = admin.storage().bucket();
let uploadRes = await bucket.upload(filePath, options);
You can find a description of the options in the Google Cloud Storage docs.
You will most likely create a key in the Google Cloud console with the right permissions and export it as a JSON file. You will find this in the Google Cloud Platform console -> IAM & admin -> Service accounts -> Create service account (create it with a key). Once you have exported the JSON file, set the environment variable like this:
export GOOGLE_APPLICATION_CREDENTIALS='./the_exported_file.json'
PLEASE STORE THIS FILE SECURELY ON YOUR SERVER AS IT HAS READ AND WRITE ACCESS!
Below is a full example of an upload function that also saves the file under its hash as the file name.
import admin from "firebase-admin";
import path from "path";
const sha256File = require("sha256-file");

const firebaseConfig = {
  apiKey: "...",
  authDomain: "abc.firebaseapp.com",
  databaseURL: "https://abc.firebaseio.com",
  projectId: "abc",
  storageBucket: "abc.appspot.com",
  messagingSenderId: "123",
  appId: "1:123:web:xxx",
  measurementId: "G-XXX"
};

admin.initializeApp(firebaseConfig);

async function uploadFile(filePath: string, uploadFolder: string, contentType: string,
                          hash: string | undefined = undefined): Promise<string> {
  if (!hash) {
    hash = await sha256File(filePath);
  }
  const bucket = admin.storage().bucket();
  const ext = path.extname(filePath);
  const uploadPath = uploadFolder + hash + ext;
  const options = { destination: uploadPath };
  console.debug("starting upload");
  const uploadRes = await bucket.upload(filePath, options);
  console.debug("finished upload");
  const newMetadata = {
    contentType: contentType
  };
  if (uploadRes) {
    const file = uploadRes[0];
    // the updated metadata is returned in the promise
    await file.setMetadata(newMetadata).catch(function (error) {
      console.error(error);
    });
  }
  return uploadPath;
}

This should work in recent versions of Node.js (the Blob class has been exported from the buffer module since Node 15.7):
import fs from "fs";
import { Blob } from "buffer";
let buffer = fs.readFileSync("./your_file_name");
let blob = new Blob([buffer]);

First of all, tell me which technology you're using for the front end.
If you are using Angular, you just need to capture the $event on each change event.
Then you need to create a FormData object.
Pass that object to Node, and on the Node side use multer for storing the file.
If you need a demo, let me know; I can help you.
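For reference, the client side of the flow described above could look roughly like this. The endpoint /upload and the helper name buildUploadRequest are assumptions; on the Node side, a multer middleware would read the same field name:

```javascript
// Sketch of the browser-side half: wrap the selected file in FormData
// and build the fetch options for a POST to a hypothetical /upload route.
function buildUploadRequest(file, fieldName = "myFile") {
  const formData = new FormData();
  formData.append(fieldName, file); // multer finds the file by this field name
  return { method: "POST", body: formData };
}

// usage (sketch): fetch("/upload", buildUploadRequest(selectedFile));
```

On the server, `multer({ dest: 'uploads/' }).single('myFile')` would then expose the stored file as `req.file`.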

Related

How to upload images to firebase without using an HTML form?

How do I upload images to Firebase without an HTML form? I need to use code only.
I have tried some approaches myself, but the files fail to preview; I guess they're corrupted.
I'm developing my app in React and I need a way to upload images to Firebase without an HTML form.
I tried:
await uploadBytes(storageRef, '../images/image.jpg')
I also tried:
const metadata = { contentType: 'image/jpeg' }
and also
const metadata = {
  contentType: 'image/svg+xml'
}
await uploadBytes(storageRef, '../images/image.jpg', metadata)
There is no way to upload a file to Cloud Storage for Firebase with just a local path as you do here uploadBytes(storageRef, '../images/image.jpg').
If you pass a string as the second argument, you will have to call uploadString and the second argument has to be the base64 encoded data that you want to upload.
If you want to upload a file based on its path, you will have to either create a File reference to that file, or read its data into a Blob and then pass that to uploadBytes.
All of these are covered in the Firebase documentation on uploading data, so I recommend keeping that handy.
You can use the Firebase SDK for Cloud Storage together with the FileReader API in JavaScript:
import { getStorage, ref, uploadBytes } from "firebase/storage";

const storage = getStorage();

const handleFileUpload = (file) => {
  const storageRef = ref(storage, "images/" + file.name);
  const metadata = {
    contentType: file.type,
  };
  const reader = new FileReader();
  reader.onload = async (event) => {
    const buffer = event.target.result;
    await uploadBytes(storageRef, buffer, metadata);
  };
  reader.readAsArrayBuffer(file);
};

Consuming BigQuery API Client through middleware and locating secrets.json files

I want to use a secrets folder to store my BigQuery API client secrets JSON and use the BigQuery client library for Node to make queries against the database. But the documentation only shows how to load credentials from a JSON file, not how to pass them directly.
I am using TypeScript to query a table and append rows. In Next.js, I am a little bit confused about how to access the path of a secrets folder that is not exposed to the client side and stays available to my middleware.
Here's the function for the same:
import type { NextApiHandler } from 'next';
import axios from 'axios';
import { BigQuery } from '@google-cloud/bigquery';
import bqSecrets from '../../../bigquery/keys.json';

const options = {
  keyFilename: '../../../secrets/keys.json',
  projectId: 'XXXXXXXXXXXXXXXXXXXXXX',
};

const bigquery = new BigQuery(options);
const submitUserData: NextApiHandler = async (request, response) => {
  const { geoData, googleData, post, userRole } = request.body;
  if (!post) response.json({ error: false, msg: 'Not Required' });
  else {
    delete googleData.isAuthenticated;
    try {
      const rep = await bigquery
        .dataset('stackconnect')
        .table('googleoAuth')
        .insert([requestPayload]);
      console.log(rep);
      response.json({ error: false, msg: 'Success' });
    } catch (e) {
      console.log('Error = ', e);
      response.json({ error: true, msg: 'Error' });
    }
  }
};
But this throws an error saying: File not Found. Can someone help me figure out how to locate the file in Next.js, or how to pass the JSON data directly to the BigQuery Node client?
Additionally, I want to know: in Next.js, which is the ideal place to store secrets that are not exposed to the client side?
After digging into the source code, I realized that there's an attribute called credentials, and so I was able to make it work by doing the following modification:
import bqSecrets from '../../../bigquery/keys.json';

const options = {
  credentials: bqSecrets,
  projectId: 'project_id',
};

const bigquery = new BigQuery(options);
I hope this helps anyone who's looking to inject the JSON credentials directly into the BigQuery Node client.
Additionally, I am still not very sure about Next.js's file structure, as I still don't know whether any assets other than those placed under the public folder are exposed to the client side.

Upload files to firebase storage via function

I am trying to allow the user to select a file and pass it to a Firebase function, which stores it in Cloud Storage.
I am uploading the file in the React client like the following:
const formData = new FormData();
formData.append("myFile", aFile);
const aRequestObject = {
  method: "POST",
  body: formData,
};
const response = await fetch(aUrl, aRequestObject);
Then I have a serverless function like the following, where I want to save this file to Cloud Storage.
import firebase from "firebase";
import "firebase/storage";
import { config } from "./Config";

firebase.initializeApp(config);

const file = request.body.myFile;
const ref = firebase.storage().ref().child(file.name);
ref.put(file).then(() => {
  console.log("Uploaded file", file.name);
});
I have tried several variations from the Firebase documentation. All the examples I have found upload directly to Storage from the client, as opposed to passing the file to a function, extracting it from the request, and then saving it to Storage. I am looking for a simple example of this, or a link to where someone has done this scenario.
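One way to sketch the server half with the Admin SDK (rather than the client-side ref.put()): once the function has the file's bytes as a Buffer, it can write them to the default bucket with File#save. The helper names below are assumptions, and parsing the multipart body (e.g. with busboy) is left out:

```javascript
// Build a safe destination path inside an uploads/ folder,
// stripping any directory components from the client-supplied name.
function destinationFor(fileName, folder = "uploads/") {
  return folder + fileName.replace(/^.*[\\/]/, "");
}

// Save raw bytes to the default bucket (assumes admin.initializeApp() ran).
async function saveUpload(buffer, fileName) {
  const admin = require("firebase-admin"); // required lazily for this sketch
  const file = admin.storage().bucket().file(destinationFor(fileName));
  await file.save(buffer, { resumable: false });
  return file.name;
}
```

Stripping path components from the client-supplied file name avoids writing outside the intended folder.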

firebase storage: storage.ref is not a function

I would like to use the storage service from Firebase with a Node.js API (hosted on Firebase Functions) to allow users to upload their avatars.
So I read the doc from https://firebase.google.com/docs/storage/web/start
and I do:
admin.js
const admin = require('firebase-admin');
const config = require('./config.js');
admin.initializeApp(config);
const db = admin.firestore();
const storage = admin.storage();
module.exports = { admin, db, storage };
user.js
const { admin, db, storage } = require('../util/admin');

exports.postAvatar = async (request, response) => {
  const storageRef = storage.ref();
};
But I have the following error: storage.ref is not a function.
Is something missing from the documentation?
The console.log of the storage const is:
Storage {
  INTERNAL: StorageInternals {},
  storageClient: Storage {...},
  appInternal: FirebaseApp {...}
}
admin.storage() returns a Storage object. If you want to use it to refer to a file in your default storage bucket, you should use its bucket() method with no parameters, and it will give you a Bucket object from the Google Cloud nodejs SDK.
There are no methods called ref() anywhere in that SDK; it's not much like the JavaScript web client SDK. You will have to learn a different but similar API to work with content using the Cloud Storage Node SDK. The Admin SDK essentially just wraps this API.
const file = storage.bucket().file('/path/to/file');
Try the code below.
const bucket = storage.bucket() // you can also pass your bucket id from config
const file = bucket.file('path/to/file')
Also check this answer: TypeError: firebase.storage is not a function.

How to restrict re-uploading after a file has been uploaded with a signed URL in Google Cloud Storage in Node.js?

I am able to create a signed URL for uploading a file to Google Cloud Storage with the help of the example given at
https://github.com/googleapis/nodejs-storage/blob/master/samples/generateV4UploadSignedUrl.js
var { Storage } = require('@google-cloud/storage');

var storage = new Storage({
  projectId: "projectId",
  credentials: {
    client_email: "clientEmail",
    private_key: "privateKey"
  }
});

var generateUploadSignedUrl = async function (bucketName, remoteFilename, expires) {
  const options = {
    version: 'v4',
    action: 'write',
    expires: expires,
    contentType: 'application/octet-stream',
  };
  // getSignedUrl resolves to an array; the URL is its first element
  var [url] = await storage.bucket(bucketName).file(remoteFilename).getSignedUrl(options);
  return url;
};
I am able to use the signed URL for uploading a file.
But I want to add a restriction so that, after uploading, the file cannot be uploaded again. How can I add such a policy?
I looked at the policy document but could not find a relevant condition for restricting re-uploads.
This is not possible.
If you use gsutil, you can use the -n flag. As the documentation says, "When specified, existing files or objects at the destination will not be overwritten. Any items that are skipped by this option will be reported as being skipped".
With the client library, on the other hand, there is nothing like that. Nevertheless, you can first check whether the file exists and, if not, upload it.
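With the Node client library, that check-then-upload flow could be sketched like this (uploadOnce is a hypothetical name; note the check is not atomic, so two concurrent uploaders could still race):

```javascript
// Skip the upload when the object already exists in the bucket.
// `bucket` is a Bucket from @google-cloud/storage, e.g. storage.bucket(name).
async function uploadOnce(bucket, remoteFilename, localPath) {
  const file = bucket.file(remoteFilename);
  const [exists] = await file.exists();
  if (exists) return false; // already there: report as skipped
  await bucket.upload(localPath, { destination: remoteFilename });
  return true;
}
```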
