I am currently in the process of creating a REST API for my personal website. I'd like to include some downloads, and I would like to offer the possibility of selecting multiple ones and downloading those as a zip file.
My first approach was pretty easy: an array with URLs, a request for each of them, zip it, send it to the user, delete it. However, I think that this approach is too dirty considering there are things like streams around, which seem to be quite fitting for this.
Now, I have tried things out and am currently struggling with the basic concept of working with streams and events across different scopes.
The following worked:
const r = request(url, options);
r.on('response', function (res) {
  res.pipe(fs.createWriteStream('./file.jpg'));
});
From my understanding, r is an incoming stream in this scenario, and I listen for the response event on it; as soon as it occurs, I pipe it to a stream which I use to write to the file system.
My first step was to refactor this so it fits my case more but I already failed here:
async function downloadFile(url) {
  return request({ method: 'GET', uri: url });
}
Now I wanted to use a function which calls downloadFile() with different URLs and saves all those files to disk using createWriteStream() again:
const urls = ['https://download1', 'https://download2', 'https://download3'];
urls.forEach(element => {
  downloadFile(element).then(data => {
    data.pipe(fs.createWriteStream('file.jpg'));
  });
});
Using the debugger I found out that the "response" event is nonexistent on the data object -- maybe that's already the issue? Moreover, I figured out that data.body contains the bytes of my downloaded document (a PDF in this case), so I wonder if I could just stream this to some other place?
After reading some Stack Overflow threads I found the following module: archiver
Reading this thread: Dynamically create and stream zip to client
@dankohn suggested an approach like this:
archive
  .append(fs.createReadStream(file1), { name: 'file1.txt' })
  .append(fs.createReadStream(file2), { name: 'file2.txt' });
This makes me assume I need to be capable of extracting a stream from my data object to proceed.
Am I on the wrong track here or am I getting something fundamentally wrong?
Using archiver seems to be a valid approach; however, it is advisable to use streams when feeding large data from the web into the zip archive. Otherwise, the whole archive data would need to be held in memory.
archiver does not support adding files from streams, but zip-stream does. For reading a stream from the web, request comes in handy.
Example
// npm install -s express zip-stream request
const request = require('request');
const ZipStream = require('zip-stream');
const express = require('express');
const app = express();
app.get('/archive.zip', (req, res) => {
  const zip = new ZipStream();
  zip.pipe(res);

  const stream = request('https://loremflickr.com/640/480');
  zip.entry(stream, { name: 'picture.jpg' }, err => {
    if (err) throw err;
  });
  zip.finalize();
});

app.listen(3000);
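To try it out, assuming the server is running locally, requesting http://localhost:3000/archive.zip in a browser (or with curl -o archive.zip http://localhost:3000/archive.zip) should download the generated archive.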
Update: Example for using multiple files
Here is an example that recursively processes the next file in the callback function of zip.entry().
app.get('/archive.zip', (req, res) => {
  const zip = new ZipStream();
  zip.pipe(res);

  const queue = [
    { name: 'one.jpg', url: 'https://loremflickr.com/640/480' },
    { name: 'two.jpg', url: 'https://loremflickr.com/640/480' },
    { name: 'three.jpg', url: 'https://loremflickr.com/640/480' }
  ];

  function addNextFile() {
    const elem = queue.shift();
    const stream = request(elem.url);
    zip.entry(stream, { name: elem.name }, err => {
      if (err) throw err;
      if (queue.length > 0) addNextFile();
      else zip.finalize();
    });
  }

  addNextFile();
});
Using Async/Await
You can encapsulate it into a promise to use async/await like:
await new Promise((resolve, reject) => {
  zip.entry(stream, { name: elem.name }, err => {
    if (err) reject(err);
    else resolve();
  });
});
zip.finalize();
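Putting this together, a minimal sketch of the whole route using a for...of loop, with the same queue and request usage as the recursive example above (error handling kept deliberately simple):
app.get('/archive.zip', async (req, res) => {
  const zip = new ZipStream();
  zip.pipe(res);

  const queue = [
    { name: 'one.jpg', url: 'https://loremflickr.com/640/480' },
    { name: 'two.jpg', url: 'https://loremflickr.com/640/480' }
  ];

  try {
    for (const elem of queue) {
      // Each entry must be fully consumed before the next one is added
      await new Promise((resolve, reject) => {
        zip.entry(request(elem.url), { name: elem.name }, err => {
          if (err) reject(err);
          else resolve();
        });
      });
    }
    zip.finalize();
  } catch (err) {
    console.error(err);
    res.destroy();
  }
});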
I have an object with two embedded arrays of objects that seem to me to be almost identical, as seen here in my database:
But when I try to access one of the arrays in frontend JavaScript, it's apparently empty. The other is not, as seen here when I log it to the browser console:
The objects in the arrays are almost exactly the same. I am concerned that the problem is that, when I push a new object onto the 'stakeholders' array, the asynchronous function is not completing before the page loads again, but I am using async/await in that function before returning the response:
addStakeholder = async (req, res, next) => {
  ...
  project.stakeholders.push(stakeholder)
  await project.save()
  res.status(200).json({
    status: 'success',
    project: project
  })
}
Could anyone please tell me what I am likely doing wrong here?
EDIT: Sorry, I'll try to add some more detail. On the form submission there is this:
createStakeholderForm.addEventListener('submit', async (e) => {
  // getting properties etc., this all works
  await createStakeholder({ stakeholders, project })
  window.setTimeout(() => {
    location.reload()
  }, 1000)
})
which passes it to this axios function:
createStakeholder = async (data) => {
  try {
    const url = `http://127.0.0.1:3000/stakeholder`
    const res = await axios({
      method: 'POST',
      url: url,
      data: data
    })
    if (res.data.status === 'success') {
      showAlert('success', `Stakeholder created`)
    }
  } catch (err) {
    showAlert('error', err.response.data.message)
  }
}
and that route posts to this function:
addStakeholder = async (req, res, next) => {
  const query = { _id: req.body.project }
  const project = await Project.findById(query)
  const stakeholder = req.body.stakeholders
  project.stakeholders.push(stakeholder)
  await project.save()
  res.status(200).json({
    status: 'success',
    data: {
      data: project
    }
  })
}
While it's not obvious from your code what is wrong, the debugging path fortunately is.
Start tracing the wire. It sounds like things are saving to the database correctly but are not reaching the frontend. I would add a console.log on the backend at the call site that queries the database and confirm it's what you expect. Assuming that worked, add another console.log downstream, and keep doing that until the stakeholder data vanishes. This exercise will show you where in the code the stakeholders are getting dropped.
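For example, a minimal tracing sketch over the backend handler from the question (the log labels are illustrative):
addStakeholder = async (req, res, next) => {
  const project = await Project.findById({ _id: req.body.project })
  console.log('1. from DB:', project.stakeholders) // what did the query return?
  project.stakeholders.push(req.body.stakeholders)
  console.log('2. after push:', project.stakeholders) // is the new entry in the array?
  await project.save()
  console.log('3. after save:', project.stakeholders) // did it survive the save?
  res.status(200).json({ status: 'success', data: { data: project } })
}
On the frontend, a console.log(res.data) right after the axios call then shows what actually arrived.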
Here's my problem:
I want to create a Google Sheets extension in which I basically extract data from a sheet in Google Sheets and modify it using methods in Node.js.
Then, having the data that I modified in a string, I want to upload that string into the client's Drive, in a CSV or XML file. Therefore I don't have a local file that I can use for the upload, just a string variable.
How do I upload that string?
Thanks a lot, this is my first app and I'm struggling a bit.
Code
const { google } = require('googleapis');
const keys = require('./keys.json');

const client = new google.auth.JWT(
  keys.client_email,
  null,
  keys.private_key,
  ['https://www.googleapis.com/auth/drive'],
  'https://www.googleapis.com/…'
);
client.authorize(function (err, tokens) {
  if (err) {
    console.log(err);
    return;
  } else {
    console.log('Connected');
    gsrun(client);
  }
});
async function gsrun(cl) {
  const gsapi = google.sheets({ version: 'v4', auth: cl });
}
You have to set your file's metadata and the data it will contain (importantly, the MIME type for this case must be text/csv), and the file's body will be a simple string. This code will help you, assuming you have already done the OAuth process and have the string you want to insert:
module.exports.init = async function () {
  // Before calling the API, build your own Drive service instance
  // In the second argument, you must pass your own string message
  const pro = await uploadSimpleString(drive, null);
  console.log(pro);
}

uploadSimpleString = (drive, message) => {
  // Set file metadata and data
  message = message || 'This is a simple String nice to meet you';
  const fileMetadata = { 'name': 'uploadSimpleString.csv' };
  const media = {
    mimeType: 'text/csv',
    body: message
  };
  // Return the Promise result after completing its task
  return new Promise((resolve, reject) => {
    try {
      // Call the Files: create endpoint
      return drive.files.create({
        resource: fileMetadata,
        media: media,
        fields: 'id'
      }, (err, results) => {
        // Result from the call
        if (err) return reject(`Drive error: ${err.message}`);
        resolve(results);
      })
    } catch (error) {
      console.log(`There was a problem in the promise: ${error}`);
    }
  });
}
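The init() function above assumes a drive service instance is already in scope. A minimal sketch of building one, reusing the authorized JWT client from the question (Drive API v3):
const { google } = require('googleapis');

// Build the Drive service from the already-authorized JWT client
const drive = google.drive({ version: 'v3', auth: client });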
Notice
To test this code, run it in your CLI using this command:
node -e 'require("./index.js").init()'
Where index.js is your file's name and init() is your main function.
Docs
For more info, please check these links, and also consider using the [google-drive-api] tag; that way there are more chances to receive help, because more people will be able to find your question.
How to get Help
Files: create
G Suite documents and corresponding export MIME types
I am using Node.js 8.0.0 and want to update a file on a platform. For this they have an API with very clear ways to use it.
In this case I have to use a PUT method, with the right hostname and path, the right auth keys, and the file itself, which must be of Content-Type: multipart/form-data. So I really wanted to use the Node https module and try to not install anything else.
I tried using the request HTTP client (https://github.com/request/request) and it worked like a charm, but as I said before, I would like to use what we already have in Node without installing anything else. I see some working replies here using request, but none using the Node https module.
Using https.request I managed to go to the right URL and pass auth, but it always shows me a correlation error (specific to this platform, not something you can Google, I guess).
function update(method, path, params) {
  return new Promise((resolve, reject) => {
    https.request({
      method,
      host: HOST,
      path: path + (params ? '?' + qs.stringify(params) : ''),
      auth: `${USER}:${PASS}`,
      formData: document,
    }, res => {
      let body = '';
      res
        .on('data', message => body += message)
        .on('error', (e) => console.log(e))
        .on('end', () => resolve(body));
    })
    .end();
  });
}
Where: const document = fs.createReadStream(path.resolve(__dirname, '../../src/', 'myfile.xlf'));
And where I call the update function like this:
await operations.update('PUT', '/right/path/to/update', {
  id: `${rightId}`,
});
With this code I don't have any auth problems, and I can communicate with the platform; in fact, if I use other API methods (GET, POST) I can obtain statistics and things like that. But in this case, the response has a 400 Bad Request error, which I am sure is a problem with the way I am "trying to send the file".
Using the request http client, I get no errors and managed to update the document with this code:
function update(path, params) {
  const url = 'https://' + HOST + path + (params ? '?' + qs.stringify(params) : '');
  return new Promise((resolve, reject) => {
    try {
      resolve(requests.put({
        url,
        formData: document,
      }).auth(USER, PASS));
    } catch (err) {
      reject(err);
    }
  });
}
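For what it's worth, formData is a feature of the request library, not of the built-in https module, so https.request silently ignores it. With only built-in modules, the multipart/form-data body has to be assembled by hand. A rough sketch under that assumption (the boundary string and the field name "file" are illustrative; check what the platform expects):
const https = require('https');
const fs = require('fs');
const path = require('path');

const boundary = '----NodeFormBoundary' + Date.now();
const filePath = path.resolve(__dirname, '../../src/', 'myfile.xlf');

// Multipart body: header part, raw file bytes, closing boundary
const head = Buffer.from(
  `--${boundary}\r\n` +
  `Content-Disposition: form-data; name="file"; filename="myfile.xlf"\r\n` +
  `Content-Type: application/octet-stream\r\n\r\n`
);
const tail = Buffer.from(`\r\n--${boundary}--\r\n`);
const body = Buffer.concat([head, fs.readFileSync(filePath), tail]);

const req = https.request({
  method: 'PUT',
  host: HOST,
  path: '/right/path/to/update',
  auth: `${USER}:${PASS}`,
  headers: {
    'Content-Type': `multipart/form-data; boundary=${boundary}`,
    'Content-Length': body.length,
  },
}, res => {
  let data = '';
  res.on('data', chunk => data += chunk).on('end', () => console.log(data));
});
req.write(body);
req.end();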
In an Express.js API I'm creating a zip file that stores a collection of PDFs, intended to be passed on as a download.
I have created the zip file using the yazl package, following the README file, and it's pretty good. The problem comes when I pipe to the createWriteStream, because I don't know how to properly wait until it is finished.
Then in my Express route I want to send the file, but this code is executed before the write stream is finished...
This is a piece of code from a Promise function named renderReports inside my repository.js file. After I write the PDF files, I loop over them to add them to yazl's ZipFile; then I proceed to create the zip with fs.createWriteStream:
const renderFilePromises = renderResults.map((renderedResult, index) =>
  writeFile(`./temporal/validatedPdfs/${invoices[index].id}.pdf`, renderedResult.content)
);
await Promise.all(renderFilePromises);

const zipfile = new yazl.ZipFile();
invoices.map((invoice, index) => {
  zipfile.addFile(`./temporal/validatedPdfs/${invoice.id}.pdf`, `${invoice.id}.pdf`)
});

zipfile.outputStream.pipe(fs.createWriteStream("./temporal/output.zip").on('close', () => {
  console.log('...Done');
}));
zipfile.end();
resolve();
And the following code is how I use the promise:
app.post('/pdf-report', async (req, res, next) => {
  const { invoices } = req.body;
  repository.renderReports(reporter, invoices)
    .then(() => {
      res.sendFile('output.zip', {
        root: path.resolve(__dirname, './../../temporal/'),
        dotfiles: 'deny',
      }, (err) => {
        if (err) {
          console.log(err);
          res.status(err.status).end();
        } else {
          console.log('Sent:', 'output.zip');
        }
      });
    })
    .catch((renderErr) => {
      console.error(renderErr);
      res.header('Content-Type', 'application/json');
      return res.status(501).send(renderErr.message);
    });
});
I hope somebody can explain how to approach this.
You need to store the write stream in a variable so you can access it. Then, on this variable, wait for the stream's finish event. This is emitted by Node.js once the stream is done writing.
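Applied to the renderReports snippet from the question, a minimal sketch could look like this, resolving only once the zip has been fully written to disk:
const output = fs.createWriteStream('./temporal/output.zip');
zipfile.outputStream.pipe(output);
zipfile.end();

// Resolve the surrounding promise only after the file is completely written
output.on('finish', () => {
  console.log('...Done');
  resolve();
});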
The question is: how can I import JSON from a URL specifically, NOT from an internal file, in Express, and contain it such that I can use it across multiple views? For example, I have a controller. How can I get it in there (the controller)? I am using request.
I have a router with 2 routes, but I want to have a bunch more, and the bulk of the logic for the routes is being done in controllers.
Below is a controller with the route for showing all. I had hardcoded a small piece of JSON in it as data to use temporarily, but now I want to populate my view via an outside API. This is my controller:
module.exports = {
  // show all USERS
  showDogs: (req, res) => {
    const dogs = [
      {
        name: "Fluffy", breed: "ChowChow", slug: "fluffy", description: "4 year old Chow. Really, really fluffy."
      },
      {
        name: "Buddy", breed: "White Lab", slug: "buddy", description: "A friendly 6 year old white lab mix. Loves playing ball"
      },
      {
        name: "Derbis", breed: "Schmerbis", slug: "derbis", description: "A real Schmerbis Derbis"
      }
    ];
    res.render("pages/dogs", { dogs: dogs, title: "All Dogs" });
  }
};
How can I get this data to come from an outside API? I have used request before, but I don't know how to transfer the data between files. I don't want to put it inside showDogs or it won't be accessible to other functions here. Right?
I had something like this below, with require('request') at the top of the controller, but it just gave errors:
const options = {
  url: 'https://raw.githubusercontent.com/matteocrippa/dogbreedjsondatabase/master/dog-breed.json',
  method: 'GET',
  headers: {
    'Accept-Charset': "utf-8"
    // NO IDEA ABOUT THIS AREA FOR NOW EITHER
  }
};
I also tried wrapping the entire thing, all the functions, in a request:
request('https://raw.githubusercontent.com/matteocrippa/dogbreedjsondatabase/master/dog-breed.json', function(error, response, body)
But still I got an error.
And this is the route.js where the controller sends:
//dogs
router.get('/dogs', dogsController.showDogs)
I am a Node beginner, so the only thought I have is to write some middleware. The deeper problem here is that I don't know how to use/write middleware properly. Perhaps I can become informed.
Add a utility file that contains the code to talk to the external API. Include this file and use its function to get the dogs data. Later, you can add more functions for other APIs as well.
const getDogData = require('../externalApis').getDogData;

module.exports = {
  // show all USERS
  showDogs: (req, res) => {
    getDogData(function (err, dogs) {
      if (err) {
        // handle err
      } else {
        res.render("pages/dogs", {
          dogs: dogs,
          title: "All Dogs"
        });
      }
    });
  }
};
// externalApis.js
const request = require('request');

module.exports = {
  getDogData: function (done) {
    const options = {
      url: 'https://raw.githubusercontent.com/matteocrippa/dogbreedjsondatabase/master/dog-breed.json',
      method: 'GET',
      headers: {
        'Accept-Charset': "utf-8"
      }
    };
    request(options, function (error, response, body) {
      if (error) {
        return done(error);
      } else {
        var data = JSON.parse(body); // not sure how the data is returned or if it needs JSON.parse
        return done(null, data.dogs); // return dogs
      }
    });
  }
};
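As a follow-up, if you prefer async/await in the controller, the callback-style getDogData can be wrapped with Node's util.promisify; a sketch, assuming the same externalApis.js as above:
const { promisify } = require('util');
const getDogData = promisify(require('../externalApis').getDogData);

module.exports = {
  // show all USERS, async/await version
  showDogs: async (req, res) => {
    try {
      const dogs = await getDogData();
      res.render("pages/dogs", { dogs: dogs, title: "All Dogs" });
    } catch (err) {
      // handle err
    }
  }
};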