In a React Native app, how can I upload an image from the user's device to Firebase Storage? I'm doing this:
var file = {
  uri: 'file:///Users/...../LONG_PATH/....../Documents/images/name.jpg'
};
var storageRef = this.firebase.storage().ref();
var uploadTask = storageRef.child('images/' + file.name).put(file);
But it throws an error:
Possible Unhandled Promise Rejection (id: 0):
Firebase Storage:
Invalid argument in put at index 0: Can't find variable: Blob
In this post, a software engineer said:
React Native does not support the File and Blob types, so Firebase Storage uploads will not work in this environment. File downloads do work however.
But I found this module which can transform a file into a Blob. Maybe it can help in your solution.
I found this, hopefully it helps: a functional React Native with Firebase Storage example.
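For what it's worth, here is a minimal sketch of one workaround: let fetch read the local file and hand you a Blob, then pass that Blob to put(). This assumes a React Native version whose fetch implementation supports response.blob() and an already-initialized firebase instance; the function and variable names are illustrative.

async function uploadImage(uri, name) {
  const response = await fetch(uri);   // read the local file
  const blob = await response.blob();  // convert it to a Blob
  const ref = firebase.storage().ref().child('images/' + name);
  await ref.put(blob);                 // upload the Blob
  return ref.getDownloadURL();         // resolve to the download URL
}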
So I am trying to make a music app where people can upload music. First, the client takes the file and converts it to an object URL like this:
const track_src = URL.createObjectURL(track);
data.track_src = track_src;
await Req.post("/api/u/music/upload", data)
After that, the server receives the data and the object URL and uploads it to Firebase Storage:
// track_src is the object URL
await st.bucket(sid).upload(track_src, {
  gzip: true,
  metadata: {
    cacheControl: 'public, max-age=31536000',
  },
})
But I get an error that says:
Error: ENOENT: no such file or directory, stat 'E:\Server\blob:http:\localhost:3000\91e53bb5-abf2-46b4-bd0c-268b242e93f3'
What you are trying to do is not possible: an object URL created in the browser only has meaning inside that browser session, so the server cannot read from it. There are basically two methods of uploading files to Storage; you either have to:

Use the bucket().upload() method, which accepts the path to a local file as the pathString parameter, so you need the actual file for this, not a URL. This is the ideal option if you have the file stored locally, and given the information you shared this might be the way to go for you. You can look at this answer for more information.

Use bucket().file() to create an empty file in Storage and then use the file.createWriteStream() method to get a stream that writes the file's content. This can be a valid solution if you have the file in memory.

I would suggest you take a look at this documentation for the methods offered by the bucket class and this documentation for the methods offered by the file class.
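To illustrate, here is a minimal sketch of both options, assuming the file bytes have already reached your server (for example via a multipart upload); the bucket name, function names, and variables are placeholders:

const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
const bucket = storage.bucket('my-bucket'); // placeholder bucket name

// Option 1: upload a file that exists on the server's disk.
async function uploadFromDisk(localPath) {
  await bucket.upload(localPath, {
    gzip: true,
    metadata: { cacheControl: 'public, max-age=31536000' },
  });
}

// Option 2: stream an in-memory Buffer into a new Storage object.
function uploadFromBuffer(fileBuffer, destName) {
  return new Promise((resolve, reject) => {
    bucket.file(destName)
      .createWriteStream({ resumable: false })
      .on('error', reject)
      .on('finish', resolve)
      .end(fileBuffer); // write the Buffer and close the stream
  });
}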
I'm trying to obtain a BLOB from a request. The request object is constructed using FormData in Angular.
const buffer = fs.readFileSync(fileFromRequest);
The code above returns an error:
Error: ENOENT: no such file or directory, open '[object File]'
I can't find any resource to read/parse [object File].
Hope you guys can help me on this. Many thanks!
When uploading a file from Angular to Node.js, you can't use this:

const buffer = fs.readFileSync(fileFromRequest);

The fs module can only read files on the server's local filesystem, and what the request carries is multipart form data, not a path. To handle the uploaded data you can use multer.
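A minimal sketch with multer, assuming the Angular client posts the file under the field name 'file'; the route path is a placeholder:

const express = require('express');
const multer = require('multer');

const app = express();
const upload = multer(); // no storage engine: uploaded files are kept in memory as Buffers

app.post('/api/upload', upload.single('file'), (req, res) => {
  // req.file.buffer holds the uploaded bytes; no fs.readFileSync needed
  console.log(req.file.originalname, req.file.size);
  res.sendStatus(200);
});

app.listen(3000);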
I want to know how to upload files to AWS S3 from Heroku in code, without going through a web page.

I'm using wget in Node.js to download audio files and want to re-encode them with ffmpeg. I have to store each file first, but Heroku doesn't provide persistent storage.

Then I found that Heroku can connect to AWS S3; however, the sample code is a page for users to upload through, and I just want to do it in code.

This is the code, which isn't connected to S3 yet:
const wget = spawn('wget', ['-O', 'upload_audio/' + sender_psid + '.aac', audioUrl]);

// spit stdout to screen
wget.stdout.on('data', function (data) { process.stdout.write(data.toString()); });

// spit stderr to screen
wget.stderr.on('data', function (data) { process.stdout.write(data.toString()); });

wget.on('close', function (code) {
  console.log(process.env.PATH);
  const ffmpeg = spawn('ffmpeg', ['-i', 'upload_audio/' + sender_psid + '.aac', '-ar', '16000', 'upload_audio/' + sender_psid + '.wav', '-y']);
  ...
});
And it comes the error:
upload_audio/1847432728691358.aac: No such file or directory
...
Error: ENOENT: no such file or directory, open 'upload_audio/1847432728691358.wav'
How can I solve this problem? Could anyone give some suggestions, please? Or maybe it can be done without using S3?
I don't recommend using S3 until you've tried Express's static file middleware. On the server, that looks like this:
app.use(express.static(path.join(__dirname, 'my_audio_files_folder')));
More on serving static files in Express is available here.
AWS S3 seems to be adding unneeded complexity, but if you have your heart set on it, have a look at this thread.
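If you do end up on S3, here is a rough sketch of uploading the re-encoded file from inside the ffmpeg close handler with the AWS SDK; the bucket name is a placeholder, and the credentials are assumed to be set as Heroku config vars:

const fs = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3(); // reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment

function uploadToS3(localPath, key) {
  return s3.upload({
    Bucket: 'my-audio-bucket', // placeholder bucket name
    Key: key,
    Body: fs.createReadStream(localPath),
    ContentType: 'audio/wav',
  }).promise();
}

// e.g. after ffmpeg exits:
// uploadToS3('upload_audio/' + sender_psid + '.wav', sender_psid + '.wav');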
I am currently working on creating a sort of virus scanning function.
What I am trying to do: When a file is uploaded to a specified blob storage container, an Azure Function is triggered. That function will scan the file for viruses, and if clean, will move the file to another blob storage container.
I have created an Azure Function that is triggered when a blob is created (i.e. when a file is uploaded), but I cannot figure out how to integrate virus scanning into the mix.
I tried using ClamAV.js, but I couldn't get it working. I am not sure how to install ClamAV (the daemon) so it is usable by an Azure Function, so that likely contributed to why it didn't work. Also, I am not sure how to install npm packages (in an Azure Function), so I had to upload the actual js file from the package to the function and then import it. Not sure if that's even valid...
I have tried using AttachmentScanner, but I couldn't get that working within an Azure Function (more specifically, I couldn't get the function to send a POST request).
One major problem I am facing that I don't think I can get around: how do I use npm packages in an Azure Function? Can I npm install them somewhere? Can I just download the package and manually upload the js file to the Azure Function and import it that way?
Here is my attempt at using AttachmentScanner:
module.exports = async function (context, myBlob) {
  var req = new XMLHttpRequest();
  req.open("POST", "https://beta.attachmentscanner.com/requests", false);
  req.headers({
    "authorization": "bearer [OMITTED]",
    "content-type": "application/json"
  });
  req.type("json");
  req.send({
    "url": context.bindingData.uri //"http://www.attachmentscanner.com/eicar.com"
  });
  req.end(function (res) {
    if (res.error) throw new Error(res.error);
    context.log(req.responseText);
  });
  context.log("JavaScript blob trigger function processed blob \n Name:", context.bindingData.name, "\n Blob Size:", myBlob.length, "Bytes");
  context.log("context");
  context.log(context);
  context.log("myBlob");
  context.log(myBlob);
};
That produces an error: Exception: ReferenceError: XMLHttpRequest is not defined
With the following function, I can detect the blob and print information about it:
module.exports = async function (context, myBlob) {
  context.log("JavaScript blob trigger function processed blob \n Name:", context.bindingData.name, "\n Blob Size:", myBlob.length, "Bytes");
  context.log("context");
  context.log(context);
  context.log();
  context.log("myBlob");
  context.log(myBlob);
};
Any help is appreciated!
First of all, I'm sure you cannot install ClamAV inside Azure Functions, so you need to create a Linux VM and install it there.
Next, you can follow the official quickstart tutorials (for Visual Studio Code, the Azure CLI, Python, or Linux) to install Azure Functions Core Tools in your local Windows or Linux environment, create a func project for Node.js, and publish it to Azure. A func project created this way is a regular Node.js project with a package.json, so you can npm install packages locally and they are deployed along with your function code.
Finally, here are my own thoughts on your needs. You can try to use an Azure Function with a Blob Trigger to generate a URL with a SAS token for each blob that needs to be scanned. There is a code sample, Node.js Azure Function for generating SAS tokens, which you can refer to for how to do this. Then pass the blob URL with the SAS token to ClamAV on the VM, via a Node.js server with ClamAV.js, to scan it as an HTTP stream.
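As a rough sketch of that first step, here is a blob-triggered function that builds a read-only SAS URL for the incoming blob with the @azure/storage-blob package; the account settings, container name, and environment variable names are assumptions:

const {
  StorageSharedKeyCredential,
  generateBlobSASQueryParameters,
  BlobSASPermissions,
} = require('@azure/storage-blob');

module.exports = async function (context, myBlob) {
  const accountName = process.env.STORAGE_ACCOUNT_NAME; // assumed app setting
  const accountKey = process.env.STORAGE_ACCOUNT_KEY;   // assumed app setting
  const containerName = 'uploads';                      // placeholder container
  const blobName = context.bindingData.name;

  const credential = new StorageSharedKeyCredential(accountName, accountKey);
  const sas = generateBlobSASQueryParameters({
    containerName,
    blobName,
    permissions: BlobSASPermissions.parse('r'),       // read-only
    expiresOn: new Date(Date.now() + 60 * 60 * 1000), // valid for one hour
  }, credential).toString();

  const sasUrl = `https://${accountName}.blob.core.windows.net/${containerName}/${blobName}?${sas}`;
  context.log('SAS URL for the scanner:', sasUrl);
  // POST sasUrl to the ClamAV VM here, e.g. with node-fetch or axios.
};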
Of course, you can integrate ClamAV.js with Azure Functions directly, but I think scanning a big file for a long time is not a good fit for a serverless architecture like Azure Functions. Hope it helps.
To upload files to Google Cloud Storage (GCS) from Node.js, there is a documented process like this:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
const bucket = storage.bucket("bt-name");
const res = await bucket.upload("./output_images/dir/file.png", {
  public: true
});
But I want to upload an entire directory to GCS in the said bucket bt-name. When I try to do so, it throws a promise rejection error: EISDIR: illegal operation on a directory, read.
Uploading a whole folder is not a built-in feature of the API or client library. If you would like to upload the whole folder, your code needs to iterate over the files in the folder in a loop and upload them one at a time, as sketched below, or you can use the recursive upload option of the command-line utility gsutil.
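As a sketch of the loop approach (directory and bucket names taken from your example; nested directories are skipped for brevity):

const fs = require('fs');
const path = require('path');
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
const bucket = storage.bucket('bt-name');

async function uploadDir(dirPath) {
  for (const entry of fs.readdirSync(dirPath)) {
    const fullPath = path.join(dirPath, entry);
    if (fs.statSync(fullPath).isFile()) { // skip subdirectories
      await bucket.upload(fullPath, { public: true, destination: entry });
    }
  }
}

uploadDir('./output_images/dir');

Alternatively, gsutil can do the recursion for you from the command line: gsutil -m cp -r ./output_images/dir gs://bt-name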