How to upload folders to GCS from Node.js using client library - javascript

To upload files to Google Cloud Storage (GCS) from Node.js, there is a documented process like this:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function uploadFile() {
  const bucket = storage.bucket("bt-name");
  const res = await bucket.upload("./output_images/dir/file.png", {
    public: true
  });
}
But I want to upload an entire directory to GCS in said bucket bt-name. When I try to do so, it throws a promise rejection error: EISDIR: illegal operation on a directory, read.

Uploading a whole folder is not a built-in feature of the API or client library. If you would like to upload a whole folder, your code needs to iterate through all of the folder's files in a loop and upload them one at a time (a sketch follows below), or you can use the recursive upload option of the command-line utility gsutil.
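For illustration, here is a minimal sketch of that loop, reusing the bt-name bucket and the output_images folder from the question; the destination naming and options are assumptions to adapt:

const fs = require('fs');
const path = require('path');
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
const bucket = storage.bucket('bt-name');

// Walk the directory recursively and upload each regular file one at a time.
async function uploadDirectory(dir) {
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const fullPath = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      await uploadDirectory(fullPath); // recurse into subfolders
    } else {
      // `destination` mirrors the local relative path inside the bucket.
      await bucket.upload(fullPath, { destination: fullPath, public: true });
    }
  }
}

uploadDirectory('./output_images').catch(console.error);

With gsutil, the equivalent recursive upload is a single command: gsutil -m cp -r ./output_images gs://bt-name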

Related

Firebase admin when uploading files to storage through url: Error: ENOENT: no such file or directory

So I am trying to make a music app where people can upload music. First, the client takes the file and changes it to an object URL like this:
const track_src = URL.createObjectURL(track);
data.track_src = track_src;
await Req.post("/api/u/music/upload", data)
After that, the server receives the data and the object URL and uploads it to Firebase storage:
// track_src is the object URL
await st.bucket(sid).upload(track_src, {
  gzip: true,
  metadata: {
    cacheControl: 'public, max-age=31536000',
  },
})
But I get an error that says:
Error: ENOENT: no such file or directory, stat 'E:\Server\blob:http:\localhost:3000\91e53bb5-abf2-46b4-bd0c-268b242e93f3'
What you are trying to do is not possible. There are basically two methods of uploading files in Storage; you either have to:
Use the bucket().upload() method, which accepts the path to your local file as the pathString parameter, so you need the actual file for this, not a URL. This is the ideal option if you have the file stored locally, and given the information you shared, this might be the way to go for you. You can look at this answer for more information.
Use bucket().file() to create an empty file in Storage and then use the file.createWriteStream() method to get a stream that can write the file's content. This can be a valid solution if you have the file in memory (a sketch of this option follows below).
I would suggest you take a look at this documentation for the methods offered by the bucket class and this documentation for the methods offered by the file class.
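For illustration, here is a minimal sketch of the second option, assuming you are inside an async function and the track's contents are already in memory in a Buffer called trackBuffer; st, sid, and the metadata are reused from the question, and the destination path is hypothetical:

// Create a file handle in the bucket, then stream the in-memory buffer into it.
const file = st.bucket(sid).file('tracks/my-track.mp3'); // hypothetical destination path

await new Promise((resolve, reject) => {
  file.createWriteStream({
    gzip: true,
    metadata: {
      cacheControl: 'public, max-age=31536000',
    },
  })
    .on('error', reject)
    .on('finish', resolve)
    .end(trackBuffer); // write the contents and close the stream
});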

How to Use "file-type" NPM Module Client-Side?

I'm attempting to use the "file-type" NPM module (which I have working on the server) client side to validate mime type prior to a file upload to an S3 bucket.
The readme for the module includes an example of using it in the browser:
const FileType = require('file-type/browser');

const url = 'https://upload.wikimedia.org/wikipedia/en/a/a9/Example.jpg';

(async () => {
  const response = await fetch(url);
  const fileType = await FileType.fromStream(response.body);
  console.log(fileType);
  //=> {ext: 'jpg', mime: 'image/jpeg'}
})();
That obviously won't work directly in the browser due to the "require", and if I link to the NPM file directly, I get:
Uncaught ReferenceError: require is not defined
So I've tried using Webpack (with which I'm not well versed at all, but have followed the official Webpack tutorial, and a few other tutorials) to create a "main.js" file and then accessed that via a script tag.
For example, running this through Webpack:
import * as mimeFromBuffer from 'file-type/browser.js';

export function mimeTyping(mimeValidationBytes) {
  (async () => {
    const fileType = await mimeFromBuffer(mimeValidationBytes);
    console.log(fileType);
    //=> {ext: 'jpg', mime: 'image/jpeg'}
  })();
  return;
}
Results in this when I call the mimeTyping client side:
Uncaught ReferenceError: mimeTyping is not defined
I've tried Vinay's answer from this question in the browser:
import fileType from 'file-type';

const blob = file.slice(0, fileType.minimumBytes);
const reader = new FileReader();

reader.onloadend = function (e) {
  if (e.target.readyState !== FileReader.DONE) {
    return;
  }
  const bytes = new Uint8Array(e.target.result);
  const { ext, mime } = fileType.fromBuffer(bytes);
  // ext is the desired extension and mime is the mimetype
};

reader.readAsArrayBuffer(blob);
Which gets me this:
Uncaught SyntaxError: Cannot use import statement outside a module
I've researched that error a bit, with a common solution being to add "type":"module" to package.json. This did not help (error was unchanged).
I found a similar question in the module's Github repository, which suggests:
Although this module is primarily designed for Node.js, using it in the
browser is possible as well. That is indeed what 'file-type/browser' is
intended for. It provides the right dependencies and right functions
to the JavaScript module bundler. Some dependencies, which are
typically present in a Node.js environment but are missing in a
browser environment, you may need to pass (polyfill) to your module
bundler. Although in some cases it may be a bit tricky to configure,
note that this is a pretty common task handled by a module bundler.
I'm struggling to understand what next steps that suggests.
Another comment at the bottom of that thread indicates the author had success with:
import { fromBuffer } from 'file-type/core';
...
const buffer = await uploadedFile.arrayBuffer();
const types = await fromBuffer(buffer);
I'm not sure how to implement that, and what other code I need (I'm guessing this gets passed to Webpack, and probably requires an export statement, and then an import on the client side?)
I've tried passing this into Webpack:
const fileType = require('file-type/browser');
module.exports = fileType;
But linking to the output file again gets:
Uncaught SyntaxError: Cannot use import statement outside a module
I think I understand conceptually what I need to do: pass the NPM module to Webpack, which in turn parses it and finds any dependencies, gets those dependencies, and creates a JavaScript file I can use client side. It seems I'm doing something in there wrong.
I've spent days trying to understand how to use NPM modules client side (still quite foggy in my mind) and trying all sorts of variants on the above code - would really appreciate some guidance (first time posting a question here - please go easy on me!).
Thank you!
Edit: I don't think this is a duplicate - I did review How to import JS library from node_modules, Meteor Npm-module client-side?, how to use node.js module system on the clientside, How to call a server-side NodeJS function using client-side JavaScript and Node Js application in client machine but no suggestion I tried in any of those seemed to help.
Finally got this working. In case anyone else is stuck on this, here's an explanation (apologies for the lack of brevity - probably this should be a blog post...).
To flesh out the use case a bit further, I'm using Uppy to allow users to upload files to an AWS S3 bucket. The way this works is that, when the user uploads a file, Uppy makes a call to my server where an AWS pre-signed URL is generated and passed back to the client. The client then uses that pre-signed URL to upload the file directly to the S3 bucket, bypassing the server, such that the file doesn't pass through the server at any point.
The problem I was attempting to solve was that files missing an extension ended up uploaded with the content / MIME type set as "application/octet", because it seems the browser, Uppy, and S3 all rely on the file extension to decide the file type (rather than parsing the so-called "magic bytes" of the file), and if the file extension is missing, AWS defaults to "application/octet". This causes issues when users attempt to open the file, as they are not handled correctly (i.e. a png file without an extension and with an "application/octet" content / MIME type opens a download dialog rather than being previewed, etc.). I also want to validate the MIME type / file type in cases even where the extension exists so that I can exclude certain types of files, and so the files get handled appropriately when they are later downloaded (where the MIME type will again be validated) if an incorrect file extension is used.
I use the "file-type" NPM module to determine the mimetype server side, and that's straight forward enough, but changing the file's content type / MIME type when generating the AWS pre-signed URL is not enough to fix the problem - it still gets uploaded as "application/octet". I wanted to use the same module client side so we get the exact same results on the client as on the server, and needed in any case to determine the MIME type and set it accordingly pre-upload but post-pre-signed URL. I had no idea how to do this (i.e. use "file-type" client side - the meat of my question).
I finally gave up on Webpack - nothing I tried worked. So I switched to Browserify, and the sample browser code at the "file-type" repository worked at once! So then I was left trying to figure out how to pass a function through Browserify to use in the client side code.
This proved impossible for me - I couldn't figure out how to pass the asynchronous IIFE through to my code. So instead, I moved my Uppy code into the code I pass to Browserify:
// Get the dependency, which we've added to node via "npm install file-type":
const FileType = require('file-type/browser');

// When the user adds a file for upload to Uppy...
uppy.on('file-added', (file) => {

  // 1. Create a FileReader:
  const filereader = new FileReader();

  filereader.onloadend = function (evt) {
    // 4. Once the FileReader has successfully finished reading the file...
    if (evt.target.readyState === FileReader.DONE) {
      // Get the unsigned 8-bit int array (i.e. Uint8Array) of the 600 bytes (this looks like '[119,80,78,71...]'):
      const uint = new Uint8Array(evt.target.result);
      // Determine the mime type using the "file-type" NPM package:
      (async () => {
        // Pass in our 600 bytes ("uint"):
        const fileType = await FileType.fromBuffer(uint);
        console.log(fileType); // outputs => {ext: 'jpg', mime: 'image/jpeg'}
        // Do some validation here...
        //
        // Assign the results to the file for upload - we're done!:
        file.extension = fileType.ext;
        file.meta.type = fileType.mime;
        file.type = fileType.mime;
      })();
    }
  };

  // 2. Grab the first 600 bytes of the file for mime type analysis - most mime
  // types use the first few bytes, but some (looking at you, Microsoft...) start at the
  // 513th byte; ISO CD files start at the 32,770th byte, so we'll ignore those rather than
  // send that much data for each file - users of this system certainly aren't expected to
  // upload ISO CD files! Also, some .zip files may start their identifying bytes at the
  // 29,153rd byte - we ignore those too (mostly, .zip files start at the first, 31st, or 527th byte).
  const blob = file.data.slice(0, 600);

  // 3. Start reading those 600 bytes...continues above at 'filereader.onloadend':
  filereader.readAsArrayBuffer(blob);
});
That all goes into a file I call "index.js" and then, having installed Browserify at the command line via "npm install -g browserify", I use this at the command line to create the file ("main.js") I link to in my client side code:
browserify index.js -o main.js

Load local file using Netlify functions

I've written a script which takes a JSON file and outputs it to an API endpoint using Netlify's Functions feature (https://functions.netlify.com/). For the most part, this works without a hitch; however, one of my endpoints has a lot of text, and for ease of editing I've split the large text blocks into markdown files which I then load into the endpoint.
Locally, this works perfectly, but when deployed I get a console error saying Failed to load resource: the server responded with a status of 502 (). I presume this is because I used a node fs method and Netlify doesn't allow that, however, I can't find any information about this.
The code I've used is here:
const marked = require('marked')
const clone = require('lodash').cloneDeep
const fs = require('fs')
const resolve = require('path').resolve
const data = require('../data/json/segments.json')

// Clone the object
const mutatedData = clone(data)

// Mutate the cloned object
mutatedData.map(item => {
  if (item.content) {
    const file = fs.readFileSync(resolve(`./src/data/markdown/${item.content}`), 'utf-8')
    item.content = marked(file)
  }
})

exports.handler = function (event, context, callback) {
  callback(null, {
    statusCode: 200,
    body: JSON.stringify({ data: mutatedData })
  });
}
I've also attempted to replace
const file = fs.readFileSync(resolve(`./src/data/markdown/${item.content}`), 'utf-8')
with
const file = require(`../data/markdown/${item.content}`)
but that complains about a loader, and I'd like to avoid adding webpack configs if possible as I'm using create-react-app. Besides, I doubt it would help, as I'd still be accessing the file system after build time.
Has anyone else come across this issue before?
At the time when this answer is written (September 2019), Netlify does not seem to upload auxiliary files to AWS Lambda, it appears that only the script where the handler is exported will be uploaded. Even if you have multiple scripts exporting multiple handlers, Netlify seems to upload them into isolated "containers" (different AWS instances), which means the scripts will not be able to see each other in relative paths. Disclaimer: I only tested with a free account and there could be settings that I'm not aware of.
Workaround:
For auxiliary scripts, make them into NPM packages, add them to package.json, and require them in your main script. They will be installed and made available to use.
For static files, you can host them on Netlify just as you did before you had AWS Lambda, and make HTTP requests to fetch the files in your main script (a sketch follows below).
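For illustration, a minimal sketch of the second workaround, assuming the markdown files are deployed as static assets on the Netlify site itself and that node-fetch is added as a dependency; the site URL and file name below are hypothetical:

const fetch = require('node-fetch')
const marked = require('marked')

exports.handler = async function (event, context) {
  // Fetch the markdown over HTTP from the static site instead of the local filesystem:
  const res = await fetch('https://your-site.netlify.app/markdown/segment-1.md')
  const markdown = await res.text()

  return {
    statusCode: 200,
    body: JSON.stringify({ data: marked(markdown) })
  }
}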

How to download files and store them in S3 with Node.js (through Heroku)?

I want to know how to download files and store them in AWS S3 through Heroku, without using a web page.
I'm using wget in Node.js to download audio files, and I want to re-encode them with ffmpeg. For that I have to store the file first, but Heroku doesn't provide space to store it.
Then I found that Heroku can connect with AWS S3, but the sample code shows a web page for the user to upload files, and I just want to do it in code.
This is the code, which isn't connected to S3 yet:
const wget = spawn('wget', ['-O', 'upload_audio/' + sender_psid + '.aac', audioUrl]);

// spit stdout to screen
wget.stdout.on('data', function (data) { process.stdout.write(data.toString()); });

// spit stderr to screen
wget.stderr.on('data', function (data) { process.stdout.write(data.toString()); });

wget.on('close', function (code) {
  console.log(process.env.PATH);
  const ffmpeg = spawn('ffmpeg', ['-i', 'upload_audio/' + sender_psid + '.aac', '-ar', '16000', 'upload_audio/' + sender_psid + '.wav', '-y']);
  ...
});
And I get this error:
upload_audio/1847432728691358.aac: No such file or directory
...
Error: ENOENT: no such file or directory, open 'upload_audio/1847432728691358.wav'
How can I solve this problem?
Could anyone give some suggestions, please?
Maybe it can be done without using S3?
I don't recommend using S3 until you've tried Express's static files middleware. On the server, that looks like this:
app.use(express.static(path.join(__dirname, 'my_audio_files_folder')));
More on serving static files in Express is available here.
AWS S3 seems to be adding unneeded complexity, but if you have your heart set on it, have a look at this thread.
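If you do end up using S3, here is a minimal sketch of uploading the re-encoded file with the AWS SDK (v2) once ffmpeg finishes; the bucket name and region are hypothetical, and credentials are assumed to come from Heroku config vars. (As an aside, the ENOENT errors in the question suggest the upload_audio directory doesn't exist on the dyno; creating it with fs.mkdirSync before spawning wget is a likely fix.)

const fs = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3({ region: 'us-east-1' }); // hypothetical region

// `ffmpeg` is the child process spawned in the question's code above.
ffmpeg.on('close', function (code) {
  s3.upload({
    Bucket: 'my-audio-bucket', // hypothetical bucket name
    Key: sender_psid + '.wav',
    Body: fs.createReadStream('upload_audio/' + sender_psid + '.wav'),
  }, function (err, data) {
    if (err) return console.error(err);
    console.log('Uploaded to', data.Location);
  });
});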

Uploading image from React-Native app to Firebase storage

In a React-Native app, how can I upload an image from the user's device to Firebase storage? I'm doing this:
var file = {
  uri: 'file:///Users/...../LONG_PATH/....../Documents/images/name.jpg'
}

var storageRef = this.firebase.storage().ref();
var uploadTask = storageRef.child('images/' + file.name).put(file);
But it throws an error:
Possible Unhandled Promise Rejection (id: 0):
Firebase Storage:
Invalid argument in put at index 0: Can't find variable: Blob
In this post, a software engineer said:
React Native does not support the File and Blob types, so Firebase Storage uploads will not work in this environment. File downloads do work however.
But I found this module which can transform a file to the Blob type. Maybe it can help with your solution (a sketch follows below).
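For illustration, a minimal sketch of the idea: convert the local file to a Blob first, then upload the Blob instead of the bare object. This assumes a React Native version whose fetch() can read local file:// URIs and return a Blob; on older versions you would need a helper module like the one linked above. `firebase` is assumed to be an initialized Firebase app instance.

// Convert the local file to a Blob, then upload the Blob:
async function uploadImage(uri, name) {
  const response = await fetch(uri); // assumes fetch() supports file:// URIs
  const blob = await response.blob();
  const storageRef = firebase.storage().ref();
  return storageRef.child('images/' + name).put(blob);
}

// Usage with the (truncated) path from the question:
// uploadImage(file.uri, 'name.jpg');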
I found this; hopefully it helps: Functional React-Native with Firebase storage Example
