VueJS/AdonisJs image upload and usage - javascript

I'm building a webapp that needs to display some images; the frontend is built with VueJS and the backend with AdonisJS.
I'm currently running into a problem uploading images from my frontend to my backend: AdonisJS generates a storage path that is local. For example, I upload from my frontend using this form:
Input form
That uses this code on the VueJS side:
let formData = new FormData();
let imagefile = document.querySelector('#file');
formData.append("image", imagefile.files[0]);
axios.post('/users/' + this.user.id + "/image", formData, {
  headers: {
    'Content-Type': 'image/*'
  }
})
And on the AdonisJS side:
* updateProfilePicture(request, response) {
  const image = request.file('image', {
    maxSize: '20mb',
    allowedExtensions: ['jpg', 'png', 'jpeg']
  })
  const userId = request.param('id');
  const user = yield User.findOrFail(userId)
  const fileName = `${new Date().getTime()}.${image.extension()}`
  yield image.move(Helpers.storagePath(), fileName)
  if (!image.moved()) {
    response.badRequest(image.errors())
    return
  }
  user.profilepicture = image.uploadPath()
  yield user.save();
  response.ok(user);
}
This works at the moment, but it generates a path that is local to AdonisJS:
ProjectFolder/backend/storage/1500586654324.jpg
VueJS is located in:
ProjectFolder/frontend/*
How can I use my uploaded images in the frontend? Is there some way that these frameworks can be coupled?

There are multiple ways to make this image accessible via the browser.
You can create a route within Adonis to handle a "media" route (like ~/media/1500586654324.jpg). This route takes the image ID and sends back the corresponding image from your storage folder.
Alternatively, you don't upload your image to your storage folder and instead put it directly into the public folder of your application; that means you can access the image directly via its URL.
I prefer the first option, since my public directory is 100% generated via script.

Related

How to upload an image of File type to Firebase Storage from Node.js with the Admin SDK

I have Angular running on the frontend and the Firebase Admin SDK for Node.js on the backend.
What I want to achieve is to allow the user to select an image from his computer, using a simple <input> of type file. When I receive the user's image (of type File) on the Angular side, I want to send it to my Node.js server and have the server upload it to Firebase Storage.
Here's how I'm sending the image to Node.js:
method(imageInput): void {
  const image: File = imageInput.files[0];
  const reader = new FileReader();
  reader.addEventListener('load', (event: any) => {
    const imageData = {
      source: event.target.result,
      file: image
    }
    this.myService.uploadImage(imageData.file).subscribe(
      (res) => {
        // image sent successfully
      },
      (err) => {
        // error
      })
  });
  reader.readAsDataURL(image);
}
So on the Node.js side I don't see a way to upload this image.
I'm trying:
admin.storage().bucket().upload(imageFromAngular, { // --> Here's the problem
  destination: "someDestination/",
  contentType: "image/png",
  metadata: {
    contentType: "image/png"
  }
}).then(() => {
  // send successful response
}).catch(err => {
  // send error response
});
The issue is that the upload method only takes the path to the image and the options as parameters. However, in this case I can't pass the path to the image; I can only pass the image itself. I read this - https://googleapis.dev/nodejs/storage/latest/ but I couldn't find anything that would suit my needs.
What would be the correct way to do this ?
Update:
Here's a more detailed explanation to the approach I took:
I'm using the arrayBuffer method of the image File inside my method. This method returns a promise of type ArrayBuffer. I get the value and send it to my server.
The server uses Buffer.from(arrayBuffer) to convert the data (Buffer.from accepts an ArrayBuffer directly), and then I can safely use the save API (https://googleapis.dev/nodejs/storage/latest/File.html#save).
To get the image later on I use download (https://googleapis.dev/nodejs/storage/latest/File.html#download).
You can write a byte stream (or a Buffer) to Cloud Storage.
createWriteStream() API for streaming data to Cloud Storage: https://googleapis.dev/nodejs/storage/latest/File.html#createWriteStream
save() API for writing buffered data to Cloud Storage: https://googleapis.dev/nodejs/storage/latest/File.html#save
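A sketch of the save() approach described above, assuming the client's ArrayBuffer reaches the server intact. The `bucket` handle, destination path, and handler shape are assumptions; save() itself is the documented Cloud Storage API linked above:

```javascript
// Wrap the ArrayBuffer sent by the client in a Node Buffer.
// Buffer.from accepts an ArrayBuffer directly; no encoding argument is needed.
function toBuffer(arrayBuffer) {
  return Buffer.from(arrayBuffer);
}

// Hypothetical handler using a firebase-admin Storage bucket (not executed here):
async function saveImage(bucket, arrayBuffer) {
  const file = bucket.file('someDestination/photo.png'); // illustrative path
  await file.save(toBuffer(arrayBuffer), {
    metadata: { contentType: 'image/png' },
  });
}
```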

How can I upload files to google drive that are in a url?

I'm trying to upload a photo that I have at a URL on another server, but it doesn't work for me, or I don't know how to do it. In this case I'm uploading a photo, but I also want to upload other kinds of files from a URL.
const img = await fetch("http://example.com/api/photo")
await gapi.client.drive.files.create({
  resource: {
    name: "New Folder",
    body: img,
  }
})
The simple answer is you can't do it like that. The file being uploaded must be sent in the form of a stream.
Download the file to your own machine and then upload it from there, or figure out how to turn your URL into a stream.
var fileMetadata = {
  'name': 'photo.jpg'
};
var media = {
  mimeType: 'image/jpeg',
  body: fs.createReadStream('files/photo.jpg')
};
drive.files.create({
  resource: fileMetadata,
  media: media,
  fields: 'id'
}, function (err, file) {
  if (err) {
    // Handle error
    console.error(err);
  } else {
    console.log('File Id: ', file.id);
  }
});
I believe your goal is as follows:
You want to download image data from a URL, and you want to upload the downloaded image data to Google Drive.
In your script, the image data is downloaded by const img = await fetch("http://example.com/api/photo").
You want to achieve this using googleapis for JavaScript.
Modification points:
In this case, a Blob of image data is retrieved from fetch, and the blob is uploaded to Google Drive.
Unfortunately, at the current stage, it seems that although googleapis for JavaScript can create a new file with the metadata, the file content cannot be included. Because of this, in this answer I use the method from this thread: the downloaded image data is uploaded using fetch with multipart/form-data.
When the above points are reflected in your script, it becomes as follows.
Modified script:
const img = await fetch("http://example.com/api/photo").then((e) => e.blob());
const fileMetadata = {name: "sampleName"}; // Please set filename.
const form = new FormData();
form.append('metadata', new Blob([JSON.stringify(fileMetadata)], {type: 'application/json'}));
form.append('file', img);
fetch('https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart', {
  method: 'POST',
  headers: new Headers({'Authorization': 'Bearer ' + gapi.auth.getToken().access_token}),
  body: form
}).then(res => res.json()).then(res => console.log(res));
By this modification, the downloaded image data is uploaded to Google Drive with multipart/form-data.
Note:
This modification assumes the following:
Your URL http://example.com/api/photo is a direct link to the image data.
Your authorization script can be used for uploading a file to Google Drive.
In this answer, as a sample script, the file is uploaded with uploadType=multipart. In this case, the maximum file size is 5 MB; please be careful about this. When you want to upload a larger file, please check the resumable upload. Ref
References:
Google API Client Library for JavaScript
Using Fetch
Files: create
Upload file data
Related question
How I can upload file to google drive with google drive api?

Upload image to strapi with external link

What is the proper way to upload an external image via URL into strapi on the backend side?
I tried to load the image with node-fetch and process it with buffer()/blob()/blob().stream(), then passed it into strapi.plugins['upload'].services.upload.upload(). I also tried generating FormData in Node.js and passing it into the upload service, but that still didn't help.
How do I convert the image buffer from fetch into a type suitable for the upload service?
I used axios and it was on the client, but I think you can try it on the server too.
This worked for me:
Fetch an image and create File instance from it
async getImage(imageUrl, imageName) {
  const response = await axios.get(imageUrl, { responseType: 'blob' });
  const mimeType = response.headers['content-type'];
  const imageFile = new File([response.data], imageName, { type: mimeType });
  return imageFile;
}
GraphQL API query
{
  query: `
    mutation($files: [Upload!]!) {
      multipleUpload(files: $files) {
        id
      }
    }
  `,
  variables: {
    files: [
      // your files to upload
    ]
  }
}
Then I called this mutation and it worked perfectly.
Resources that I used to find this solution:
https://www.freecodecamp.org/news/how-to-manage-file-uploads-in-graphql-mutations-using-apollo-graphene-b48ed6a6498c/
Client side convert png file stream into file object
https://github.com/jaydenseric/graphql-multipart-request-spec
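If GraphQL is not a requirement, Strapi's upload plugin also exposes a REST endpoint (POST /upload with a multipart form field named files, per the Strapi documentation). A browser-side sketch of that alternative; the base URL and token are assumptions:

```javascript
// Build the multipart form the Strapi upload endpoint expects: every file
// goes under the same "files" field.
function buildUploadForm(files) {
  const form = new FormData();
  for (const file of files) {
    form.append('files', file);
  }
  return form;
}

// Hypothetical call (not executed here):
async function uploadToStrapi(baseUrl, token, files) {
  const res = await fetch(`${baseUrl}/upload`, {
    method: 'POST',
    headers: { Authorization: `Bearer ${token}` },
    body: buildUploadForm(files),
  });
  if (!res.ok) throw new Error(`upload failed: ${res.status}`);
  return res.json(); // array of created file entries
}
```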

AWS S3 File Download from the client-side

I am currently trying to download a file from an S3 bucket using a button on the front end. How is it possible to do this? I don't have any idea how to start. I have tried researching, but no luck: everything I found is about UPLOADING files to the S3 bucket, not DOWNLOADING them. Thanks in advance.
NOTE: I am applying it to ReactJS (Frontend) and NodeJS (Backend) and also, the file is uploaded using Webmerge
UPDATE: I am trying to generate a download link with this (Tried node even if I'm not a backend dev) (lol)
see images below
what I have tried so far
onClick function
If the file you are trying to download is not public, then you have to create a signed URL to get that file.
The solution is here: Javascript to download a file from amazon s3 bucket?
It revolves around creating a Lambda function that generates a signed URL for you, then using that URL to download the file on button click.
BUT if the file you are trying to download is public, then you don't need a signed URL; you just need to know the path to the file. The URLs are structured like: https://s3.amazonaws.com/[file path]/[filename]
There is also AWS Amplify, which is created and maintained by the AWS team.
Just follow Get Started, and downloading the file from your React app is as simple as:
Storage.get('hello.png', { expires: 60 })
  .then(result => console.log(result))
  .catch(err => console.log(err));
Here is my solution:
let downloadImage = url => {
  let urlArray = url.split("/")
  let bucket = urlArray[3]
  let key = `${urlArray[4]}/${urlArray[5]}`
  let s3 = new AWS.S3({ params: { Bucket: bucket }})
  let params = { Bucket: bucket, Key: key }
  s3.getObject(params, (err, data) => {
    if (err) return console.error(err)
    let blob = new Blob([data.Body], { type: data.ContentType })
    let link = document.createElement('a')
    link.href = window.URL.createObjectURL(blob)
    link.download = url
    link.click()
  })
}
The url in the argument refers to the url of the S3 file.
Just put this in the onClick method of your button. You will also need the AWS SDK
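The url.split("/") indexing above assumes a path-style URL with exactly one folder level. A slightly more defensive parser (plain Node, no SDK required) that handles both path-style and virtual-hosted-style URLs; this helper is a sketch, not part of the original answer:

```javascript
// Extract { bucket, key } from an S3 URL in either
// path-style (https://s3.amazonaws.com/bucket/key) or
// virtual-hosted style (https://bucket.s3.amazonaws.com/key).
function parseS3Url(rawUrl) {
  const url = new URL(rawUrl);
  const objectPath = decodeURIComponent(url.pathname).replace(/^\//, '');
  const pathStyle = url.hostname === 's3.amazonaws.com' ||
    /^s3[.-][a-z0-9-]+\.amazonaws\.com$/.test(url.hostname);
  if (pathStyle) {
    // Path-style: the first path segment is the bucket.
    const [bucket, ...rest] = objectPath.split('/');
    return { bucket, key: rest.join('/') };
  }
  // Virtual-hosted style: the bucket is the first host label.
  return { bucket: url.hostname.split('.')[0], key: objectPath };
}
```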

How to return files saved on a file system with Node js and Multer to angular front-end?

I'm new to programming with Angular and Node.js, and I need to return files that are saved in a file system (handled by the backend in Node.js) to the front end, to give the user the option to view or download them. To save them I used the multer middleware, but I haven't found an effective solution for bringing them back to the front end.
I tried using fs to create a buffer array, but it didn't work.
Does anyone know an effective solution?
Parameters will be passed in the request to identify which file to return, but for now I'm testing with a static file.
My request :
let headers: Headers = new Headers();
headers.append('Content-type', 'application/json');
headers.append('Authorization', token);
let link = `${URL_AuthAPI}/systemUsers/list`;
let body = JSON.stringify({ obj });
let option = new RequestOptions({ headers: headers });
return this.http.post(link, body, option).map((resposta: Response) => resposta);
Nodejs Server:
var filePath = path.join("/home", 'rg.png');
var stat = fileSystem.statSync(filePath);
res.writeHead(200, {
  'Content-Type': 'image/png',
  'Content-Length': stat.size,
  // 'Content-Disposition': 'attachment; filename=teste.png'
});
// readFileSync returns a Buffer, not a stream; use createReadStream
// so the 'data'/'end' events below actually fire.
var readStream = fileSystem.createReadStream(filePath);
readStream.on('data', function (data) {
  res.write(data);
});
readStream.on('end', function () {
  res.end();
});
Component Code:
this.systemUsersService.listUsers(this.token, null).subscribe((apiResponse) => {
  var data = apiResponse['_body'];
  console.log(data);
}, (error: any) => {
});
If the files you want to allow the user to download are public, then the best option is to send (from your backend) the array of file URLs to the Angular application (in the case of images, to create the proper <img> tags on the frontend).
If you want to download the image using Node, you can read the file (fs.createReadStream) and send the proper headers before performing the send. Take a look at Nodejs send file in response; it is a really good answer.
In the end, my personal recommendation is: don't send files using Node. You can use nginx to serve static content.
