How do you unzip a string in React (JavaScript)?

The response from an API is a zipped folder of three small files. I've got this response in a string. I want to display the contents of one of the files and store another in the browser's local storage for later use. I'm having trouble unzipping. How do I do this without accessing the file system?
fetch(URL, {
  method: 'GET',
  headers: {
    'API-KEY': API_Key,
  },
}).then(response => response.text()).then(zippedFolderAsString => {
  // Need to unzip
});

Here's how I solved it. I used JSZip, which accepts blobs as input, as opposed to the path to a file like most other libraries. Note that the response has to be read with response.blob() rather than response.text(), since decoding binary data as text corrupts it.
import JSZip from 'jszip';
// ...
const new_zip = new JSZip();
new_zip.loadAsync(zippedFolderAsBlob).then(async function (zipped) {
  const jsonFile = await zipped.file("theJsonFile.json").async("text");
});
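For anyone wondering why response.text() fails in the first place: text() decodes the body as UTF-8, which is lossy for arbitrary binary data, so the ZIP bytes are mangled before JSZip ever sees them. A minimal Node sketch (illustrative only; the byte values are made up) demonstrates the corruption:

```javascript
// text() decodes the body as UTF-8; for binary data this is lossy.
// Round-tripping ZIP-like bytes through a string mangles them.
function roundTripThroughString(bytes) {
  const asString = Buffer.from(bytes).toString('utf8');
  return Buffer.from(asString, 'utf8');
}

// ZIP files start with "PK\x03\x04" followed by arbitrary binary data.
const zipLikeBytes = Buffer.from([0x50, 0x4b, 0x03, 0x04, 0x9c, 0xff, 0x80]);

console.log(roundTripThroughString(zipLikeBytes).equals(zipLikeBytes)); // false
```

Reading the response with response.blob() (or response.arrayBuffer()) keeps the raw bytes intact, which is what JSZip's loadAsync needs.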

Related

Strapi Upload Plugin: How to upload a remote file in a cron job

I am using the Strapi upload plugin with S3 as a provider. It works great when hitting the upload API endpoint on my Strapi instance (/upload). However, I have a cron job in our repo that checks images in an S3 bucket and uploads them. Is there any way to call the upload plugin using the global strapi object, without having to make an HTTP request in the cron job? The latter seems a little strange, since the cron job runs on the same server as Strapi.
In my config/functions/cron.js file, I currently have this:
const imageBuffer = await fetch(imageURL).then((response) => response.buffer());

const formData = new FormData();
formData.append('files', imageBuffer, { filename: imageURL.split('/').pop() });

const uploadResult = await fetch('xxxxx/upload', {
  method: 'POST',
  body: formData,
}).then((response) => response.json());
I would prefer to do something simple like:
const imageBuffer = await fetch(imageURL).then((response) => response.buffer());
await strapi.plugins.upload(imageBuffer)
I have been trying to reverse engineer what the plugin does in its controller file, but that doesn't seem ideal either.
Any help from Strapi experts would be appreciated!
I found the solution and it's:
return strapi.plugins['upload'].services.upload.upload({
  data: { fileInfo: {} },
  files: {
    path: path.resolve("public/uploads/" + filename), // put your file path here
    name: "asd.png",
    type: 'image/png',
  },
});

Upload image to strapi with external link

What is the proper way to upload an external image via URL into Strapi on the backend side?
I tried loading the image with node-fetch, processing it with buffer()/blob()/blob().stream(), and then passing it into strapi.plugins['upload'].services.upload.upload(). I also tried generating FormData in Node.js and passing it into the upload service, but that didn't help either.
How do I convert an image buffer from fetch into a type suitable for the upload service?
I used axios and it was on the client, but I think you can try it on the server too.
This worked for me:
Fetch an image and create a File instance from it:
async getImage(imageUrl, imageName) {
  const response = await axios.get(imageUrl, { responseType: 'blob' });
  const mimeType = response.headers['content-type'];
  const imageFile = new File([response.data], imageName, { type: mimeType });
  return imageFile;
}
GraphQL API query
{
  query: `
    mutation($files: [Upload!]!) {
      multipleUpload(files: $files) {
        id
      }
    }
  `,
  variables: {
    files: [
      // your files to upload
    ]
  }
}
Then I called this mutation and it worked perfectly.
Resources that I used to find this solution:
https://www.freecodecamp.org/news/how-to-manage-file-uploads-in-graphql-mutations-using-apollo-graphene-b48ed6a6498c/
Client side convert png file stream into file object
https://github.com/jaydenseric/graphql-multipart-request-spec
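If you need to send that mutation without a GraphQL client, the multipart shape is defined by the graphql-multipart-request-spec linked above. Here is a sketch of building the operations and map fields (buildMultipartFields is a hypothetical helper; the /graphql endpoint is an assumption):

```javascript
// Per the GraphQL multipart request spec, the request carries an
// `operations` JSON (with nulls where files go), a `map` linking each
// multipart field to a variable path, and the files themselves.
function buildMultipartFields(query, files) {
  const operations = JSON.stringify({
    query,
    variables: { files: files.map(() => null) },
  });
  const map = JSON.stringify(
    Object.fromEntries(files.map((_, i) => [String(i), [`variables.files.${i}`]]))
  );
  return { operations, map };
}

// Browser-side usage sketch:
// const { operations, map } = buildMultipartFields(mutation, files);
// const form = new FormData();
// form.append('operations', operations);
// form.append('map', map);
// files.forEach((f, i) => form.append(String(i), f));
// await fetch('/graphql', { method: 'POST', body: form });
```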

How to return files saved on a file system with Node.js and Multer to an Angular front-end?

I'm new to programming with Angular and Node.js. I need to return files that are saved in a file system (handled by the Node.js backend) to the front end, to give the user the option to view or download them. To save them I used the Multer middleware, but I haven't found an effective solution for getting them back to the front end.
I tried using fs to create a buffer array, but it didn't work.
Does anyone know an effective solution?
The request will eventually pass parameters to identify which file to return, but for now I'm testing with a static file.
My request:
let headers: Headers = new Headers();
headers.append('Content-type', 'application/json');
headers.append('Authorization', token);
let link = `${URL_AuthAPI}/systemUsers/list`;
let body = JSON.stringify({ obj });
let option = new RequestOptions({ headers: headers });
return this.http.post(link, body, option).map((resposta: Response) => resposta);
Nodejs Server:
var filePath = path.join("/home", 'rg.png');
var stat = fileSystem.statSync(filePath);
res.writeHead(200, {
  'Content-Type': 'image/png',
  'Content-Length': stat.size,
  // 'Content-Disposition': 'attachment; filename=teste.png'
});
// readFileSync returns a Buffer, not a stream; use createReadStream
// so the 'data'/'end' events below actually fire:
var readStream = fileSystem.createReadStream(filePath);
readStream.on('data', function (data) {
  res.write(data);
});
readStream.on('end', function () {
  res.end();
});
Component Code:
this.systemUsersService.listUsers(this.token, null).subscribe((apiResponse) => {
  var data = apiResponse['_body'];
  console.log(data);
}, (error: any) => {
});
If the files you want the user to download are public, the best option is to send (from your backend) an array of file URLs to the Angular application; in the case of images, the front end can then create the proper <img> tags from those URLs.
If you want to download the image through Node, you can read the file (fs.createReadStream) and send the proper headers before performing the send. Take a look at Nodejs send file in response; it is a really good answer.
In the end, my personal recommendation is: don't send files using Node. You can use nginx to serve static content instead.

VueJS/AdonisJs image upload and usage

I'm building a web app that needs to display some images; the frontend is built with VueJS and the backend with AdonisJS.
I'm currently running into a problem: when I upload images from my frontend to my backend, AdonisJS generates a storage path that is local to the backend. As an example, I upload from my frontend with this form:
Input form
That uses this code on the VueJS side:
let formData = new FormData();
let imagefile = document.querySelector('#file');
formData.append("image", imagefile.files[0]);

axios.post('/users/' + this.user.id + "/image", formData, {
  headers: {
    'Content-Type': 'image/*'
  }
});
And on the AdonisJS side:
* updateProfilePicture(request, response) {
  const image = request.file('image', {
    maxSize: '20mb',
    allowedExtensions: ['jpg', 'png', 'jpeg']
  });
  const userId = request.param('id');
  const user = yield User.findOrFail(userId);
  const fileName = `${new Date().getTime()}.${image.extension()}`;

  yield image.move(Helpers.storagePath(), fileName);
  if (!image.moved()) {
    response.badRequest(image.errors());
    return;
  }

  user.profilepicture = image.uploadPath();
  yield user.save();
  response.ok(user);
}
This works at the moment, but it generates a path that is internal to AdonisJS:
ProjectFolder/backend/storage/1500586654324.jpg
VueJS is located in:
ProjectFolder/frontend/*
How can I use my uploaded images in the frontend? Is there some way that these frameworks can be coupled?
There are multiple ways to make this image accessible via the browser:
1. Create a route within Adonis to handle a "media" route (like ~/media/1500586654324.jpg). This route takes the image ID and sends the corresponding image from your storage folder.
2. Don't upload your images to your storage folder; instead put them directly into the public folder of your application, which means you can access each image directly via its URL.
I prefer the first option, since my public directory is 100% generated via script.

Piping zip file from SailsJS backend to React Redux Frontend

I have a SailsJS backend where I generate a zip file that was requested by my frontend, a React app with Redux. I'm using sagas for the async calls and fetch for the request. In the backend, I tried stuff like:
//zipFilename is the absolute path
res.attachment(zipFilename).send();
or
res.sendfile(zipFilename).send();
or
res.download(zipFilename).send();
or pipe the stream with:
const filestream = fs.createReadStream(zipFilename);
filestream.pipe(res);
On my frontend I try to parse it with:
const parseJSON = (response) => {
  return response.clone().json().catch(() => response.text());
};
Everything I tried ends up with an empty zip file. Any suggestions?
There are various issues with the options that you tried out:
res.attachment will just set the Content-Type and Content-Disposition headers, but it will not actually send anything.
You can use this to set the headers properly, but you need to pipe the ZIP file into the response as well.
res.sendfile: You should not call .send() after this. From the official docs' examples:
app.get('/file/:name', function (req, res, next) {
  var options = { ... };
  res.sendFile(req.params.name, options, function (err) {
    if (err) {
      next(err);
    } else {
      console.log('Sent:', req.params.name);
    }
  });
});
If the ZIP is properly built, this should work fine and set the proper Content-Type header as long as the file has the proper extension.
res.download: Same thing, you should not call .send() after this. From the official docs' examples:
res.download('/report-12345.pdf', 'report.pdf', function(err) { ... });
res.download will use res.sendfile to send the file as an attachment, thus setting both Content-Type and Content-Disposition headers.
However, you mention that the ZIP file is being sent but it is empty, so you should probably check if you are creating the ZIP file properly. As long as they are built properly and the extension is .zip, res.download should work fine.
If you are building them on the fly, check this out:
This middleware will create a ZIP file with multiple files on the fly and send it as an attachment. It uses lazystream and archiver:
const fs = require('fs');
const lazystream = require('lazystream');
const archiver = require('archiver');

function middleware(req, res) {
  // Set the response's headers:
  // You can also use res.attachment(...) here.
  res.writeHead(200, {
    'Content-Type': 'application/zip',
    'Content-Disposition': 'attachment; filename=DOWNLOAD_NAME.zip',
  });

  // Files to add to the ZIP:
  const filesToZip = [
    'assets/file1',
    'assets/file2',
  ];

  // Create a new ZIP file:
  const zip = archiver('zip');

  // Set up some callbacks:
  zip.on('error', (err) => {
    // Handle the error as appropriate for your app.
    console.error(err);
  });
  zip.on('finish', function () {
    res.end(); // Send the response once the ZIP is finished.
  });

  // Pipe the ZIP output to res:
  zip.pipe(res);

  // Add files to the ZIP, wrapped in lazystream so each read stream
  // is only opened when archiver actually consumes it:
  filesToZip.forEach((filename) => {
    zip.append(
      new lazystream.Readable(() => fs.createReadStream(filename)),
      { name: filename }
    );
  });

  // Finalize the ZIP. Compression will start and output will be piped
  // to res. Once the ZIP is finished, res.end() will be called.
  zip.finalize();
}
You can build on this to cache the built ZIPs instead of building them on the fly every time, which is time- and resource-consuming and inadvisable for most use cases.
