How to modify node.js stream - javascript

I am streaming an XML file from S3. I need to build a new XML file with a different structure for the Sphinx search engine. I am already streaming the file from S3 and piping it into my SAX parser, but now I need to figure out how I can modify the stream (after the SAX parser) and upload the result to S3.
parser.on('startElement', function(name, attrs) {
  // Do something
});
I found what seems to be a great S3 library that supports streaming, called knox, so I am currently using it. I'm not tied to this library; it's just the most decent one I found. The example code they provide for streaming data to S3 only covers piping an HTTP response. I am relatively new to streams, since I come from a PHP background.
Knox Example Stream:
http.get('http://google.com/doodle.png', function(res){
  var headers = {
      'Content-Length': res.headers['content-length']
    , 'Content-Type': res.headers['content-type']
  };
  client.putStream(res, '/doodle.png', headers, function(err, res){
    // Logic
  });
});
I am thinking I would need to do something along these lines:
parser.on('startElement', function(name, attrs) {
  var headers = {
      'Content-Length': res.headers['content-length']
    , 'Content-Type': res.headers['content-type']
  };
  client.putStream(res, '/doodle.png', headers, function(err, res){
    // Logic
  });
});
Any help is greatly appreciated. Thanks.

This discussion of Node's new streams speaks explicitly about transforming streams from S3.
NB: This is in reference to streams as implemented in Node 0.10.x
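For context, a minimal Transform-stream sketch in that Node 0.10 style might look like this (Rewriter and the regex replacement are placeholders for your real SAX-driven restructuring, not working Sphinx output):

var stream = require('stream');
var util = require('util');

// A Transform stream that rewrites each chunk of XML as it passes through.
function Rewriter(options) {
  stream.Transform.call(this, options);
}
util.inherits(Rewriter, stream.Transform);

Rewriter.prototype._transform = function (chunk, encoding, done) {
  // Placeholder transformation: swap this for your real restructuring logic.
  this.push(chunk.toString().replace(/<\/?oldTag>/g, ''));
  done();
};

// Usage: pipe the S3 download through the rewriter and on to its destination.
// s3Stream.pipe(new Rewriter()).pipe(destination);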

https://www.npmjs.org/package/through
This module will do what you need.
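A minimal sketch of a through-based transform (the toUpperCase call is only a stand-in for whatever rewriting you actually need):

var through = require('through');

// through(write, end) returns a duplex stream; call this.queue() to emit data.
var rewrite = through(function write(data) {
  this.queue(data.toString().toUpperCase()); // stand-in transformation
}, function end() {
  this.queue(null); // signal end of stream
});

// s3ReadStream.pipe(rewrite).pipe(destination);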

Related

Getting ERR_FS_FILE_TOO_LARGE while using unirest file send with put

I am using unirest to upload a file like so:
unirest.put(fullUri)
  .auth({
    user: self.userName,
    pass: self.password
  })
  .header('X-Checksum-Sha1', sha1Hash)
  .header('X-Checksum-Md5', md5Hash)
  .send(fs.readFileSync(filePath))
  .end(function (response) {
This works fine for smaller files, but for large files I get an ERR_FS_FILE_TOO_LARGE error. I have already tried increasing max_old_space_size without success. It looks like I could fix this by streaming the file, but I can't find an API for that in the unirest js library.
Looks like this is an issue with form-data. From the GitHub issues:
It turns out that unirest uses the NPM module form-data, and form-data requires that if it receives a stream that is not an fs.ReadStream, the file information must be supplied via additional options.
Example:
form.append(
  'my_file',
  stream,
  {
    filename: 'bar.jpg',
    contentType: 'image/jpeg',
    knownLength: 19806,
  },
)
See: https://github.com/form-data/form-data#void-append-string-field-mixed-value--mixed-options-
Streaming files with unirest is available via the .attach method:
unirest.put(fullUri)
  .auth({
    user: self.userName,
    pass: self.password
  })
  .header('X-Checksum-Sha1', sha1Hash)
  .header('X-Checksum-Md5', md5Hash)
  // .send(fs.readFileSync(filePath))
  .attach('filename', fs.createReadStream(filePath))
  .end(function (response) {
I can't find an API to do that in unirest js library.
That's because there is none: https://github.com/Kong/unirest-nodejs/issues/49:
You can use the underlying request library to do streaming if you want, I am open to a pull request either on this branch or the 1.0 version to add streaming.
Issue is still open.
But from this issue and from the source code you can find out that end() returns a request object (see https://github.com/request/request):
Unirest.request = require('request')
...
end: function (callback) {
  ...
  Request = Unirest.request($this.options, handleRequestResponse)
  Request.on('response', handleGZIPResponse)
  ...
  return Request
}
and from request's source code you can find out that no actual request has been sent yet (it's deferred), so you can hack into it and use its API instead:
const request = unirest.put(constants.server2)
  .auth({
    user: self.userName,
    pass: self.password
  })
  .header('X-Checksum-Sha1', sha1Hash)
  .header('X-Checksum-Md5', md5Hash)
  // .send(fs.readFileSync(filePath))
  .end(...)

fs.createReadStream(filePath).pipe(request) // just pipe it!
As a side note: unirest is built on top of request, and request is now deprecated. So... maybe you should steer away from unirest.

How to return files saved on a file system with Node js and Multer to angular front-end?

I'm new to programming with Angular and Node.js, and I need to return files that are saved on a file system (handled by the Node.js backend) to the front end, to give the user the option to view or download them. To save them I used the multer middleware, but I have not found an effective solution for getting them back to the front end.
I tried using fs to create a buffer array, but it didn't work.
Does anyone know an effective solution?
The request will eventually take parameters to identify which file to return, but for now I'm testing with a static file.
My request:
let headers: Headers = new Headers();
headers.append('Content-type', 'application/json');
headers.append('Authorization', token);
let link = `${URL_AuthAPI}/systemUsers/list`;
let body = JSON.stringify({ obj });
let option = new RequestOptions({ headers: headers });
return this.http.post(link, body, option).map((resposta: Response)=> resposta);
Nodejs Server:
var filePath = path.join("/home", 'rg.png');
var stat = fileSystem.statSync(filePath);
res.writeHead(200, {
  'Content-Type': 'image/png',
  'Content-Length': stat.size,
  // 'Content-Disposition': 'attachment ; filename=teste.png'
});
var readStream = fileSystem.readFileSync(filePath);
readStream.on('data', function(data) {
  response.write(data);
});
readStream.on('end', function() {
  response.end();
});
Component Code:
this.systemUsersService.listUsers(this.token, null).subscribe((apiResponse) => {
  var data = apiResponse['_body'];
  console.log(data);
}, (error: any) => {
});
If the files you want the user to be able to download are public, the best option is to send (from your backend) an array of file URLs to the Angular application (in the case of images, so the frontend can build the proper <img> elements).
If you want to download the image using Node, you can read the file (fs.createReadStream) and send the proper headers before performing the send. Take a look at Nodejs send file in response; it is a really good answer.
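A rough sketch of that approach, assuming an Express route handler (the route, file path and content type below are placeholders):

const fs = require('fs');
const path = require('path');

// Assumed Express route; adjust to however your backend is wired up.
app.get('/files/rg.png', (req, res) => {
  const filePath = path.join('/home', 'rg.png'); // placeholder path
  const stat = fs.statSync(filePath);
  res.writeHead(200, {
    'Content-Type': 'image/png',
    'Content-Length': stat.size
  });
  // Stream the file to the response instead of buffering it in memory.
  fs.createReadStream(filePath).pipe(res);
});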
In the end, my personal recommendation is: don't send files through Node; you can use nginx to serve static content.

createReadStream not working/extremely slow for large files

I'm using the Dropbox API to upload files. To upload the files to Dropbox I go through the following steps:
First, upload the file from the form to a local directory on the server.
Read the file from the local directory using fs.createReadStream.
Send the file to Dropbox via the Dropbox API.
The issue:
For some reason fs.createReadStream takes absolute ages when reading and uploading a large file. The file I'm trying to upload is only 12 MB, which is not big, yet it takes approximately 18 minutes to upload/process it.
I don't know whether the issue is in createReadStream or in the Dropbox API code.
It works fine with files whose size is in the KB range.
My Code:
let options = {
  method: 'POST',
  uri: 'https://content.dropboxapi.com/2/files/upload',
  headers: {
    'Authorization': 'Bearer TOKEN HERE',
    'Dropbox-API-Arg': "{\"path\": \"/test/" + req.file.originalname + "\",\"mode\": \"overwrite\",\"autorename\": true,\"mute\": false}",
    'Content-Type': 'application/octet-stream'
  },
  // I think the issue is here.
  body: fs.createReadStream(`uploads/${req.file.originalname}`)
};

rp(options)
  .then(() => {
    return _deleteLocalFile(req.file.originalname)
  })
  .then(() => {
    return _generateShareableLink(req.file.originalname)
  })
  .then((shareableLink) => {
    sendJsonResponse(res, 200, shareableLink)
  })
  .catch(function (err) {
    sendJsonResponse(res, 500, err)
  });
Update:
const rp = require('request-promise-native');
I had an experience similar to this issue before, and after a large amount of head scratching and digging around, I was able to resolve it, in my case anyway.
For me, the issue arose because the default chunk size for createReadStream() is quite small (64 KB), and for some reason this had a knock-on effect when uploading to Dropbox.
The solution was therefore to increase the chunk size.
// Try using chunks of 256kb
body: fs.createReadStream(`uploads/${req.file.originalname}`, {highWaterMark : 256 * 1024});
https://github.com/request/request#streaming
I believe you need to pipe the stream to the request.
See this answer:
Sending large image data over HTTP in Node.js
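A hedged sketch of what that piping could look like using the underlying request library directly (the headers are copied from the question; treat this as illustrative rather than tested):

const fs = require('fs');
const request = require('request');

const upload = request.post({
  uri: 'https://content.dropboxapi.com/2/files/upload',
  headers: {
    'Authorization': 'Bearer TOKEN HERE',
    'Dropbox-API-Arg': JSON.stringify({
      path: `/test/${req.file.originalname}`,
      mode: 'overwrite',
      autorename: true,
      mute: false
    }),
    'Content-Type': 'application/octet-stream'
  }
}, (err, response, body) => {
  // handle the Dropbox response / errors here
});

// Pipe the file into the request instead of buffering it in `body`.
fs.createReadStream(`uploads/${req.file.originalname}`).pipe(upload);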

Upload large files as a stream to s3 with Plain Javascript using AWS-SDK-JS

There is a pretty nice example available for uploading large files to S3 via the aws-sdk-js library, but unfortunately it uses Node.js fs.
Is there a way to achieve the same thing in plain JavaScript? Here is a nice Gist as well, which breaks the large file into smaller chunks, but it is still missing the .pipe functionality of Node's fs, which is required to pass a stream to the aws-sdk-js upload function. Here is a relevant code snippet in Node:
var fs = require('fs');
var zlib = require('zlib');

var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
var s3obj = new AWS.S3({params: {Bucket: 'myBucket', Key: 'myKey'}});
s3obj.upload({Body: body})
  .on('httpUploadProgress', function(evt) {
    console.log('Progress:', evt.loaded, '/', evt.total);
  })
  .send(function(err, data) { console.log(err, data); });
Is there something similar available in plain JS (non-Node.js), usable with Rails?
Specifically, I am after an alternative to the following line in plain JS:
var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
The same link you provided contains an implementation intended for the Browser, and it also uses the AWS client SDK.
// Get our File object
var file = $('#file-chooser')[0].files[0];
// Upload the File
var bucket = new AWS.S3({params: {Bucket: 'myBucket'}});
var params = {Key: file.name, ContentType: file.type, Body: file};
bucket.upload(params, function (err, data) {
  $('#results').html(err ? 'ERROR!' : 'UPLOADED.');
});
** EDITS **
Note the documentation for the Body field includes Blob, which means streaming will occur:
Body — (Buffer, Typed Array, Blob, String, ReadableStream)
You can also use the Event Emitter convention in the client offered by the AWS SDK's ManagedUpload interface if you care to monitor progress. Here is an example:
var managed = bucket.upload(params)
managed.on('httpUploadProgress', function (bytes) {
  console.log('progress', bytes.total)
})
managed.send(function (err, data) {
  $('#results').html(err ? 'ERROR!' : 'UPLOADED.');
})
If you want to read the file from your local system in chunks before you send to s3.uploadPart, you'll want to do something with Blob.slice, perhaps defining a Pipe Chain.
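For illustration, a minimal sketch of that Blob.slice chunking (the 5 MB part size reflects S3's multipart minimum; the s3.uploadPart call is left as a comment because it also needs an UploadId from createMultipartUpload):

// Split a File/Blob into fixed-size chunks with Blob.slice.
var file = document.getElementById('file-chooser').files[0];
var chunkSize = 5 * 1024 * 1024; // parts must be >= 5 MB, except the last one
var partNumber = 1;

for (var start = 0; start < file.size; start += chunkSize) {
  var chunk = file.slice(start, Math.min(start + chunkSize, file.size));
  // Placeholder: hand each chunk to s3.uploadPart along with Bucket, Key,
  // UploadId and PartNumber, then collect the returned ETags for
  // completeMultipartUpload.
  // s3.uploadPart({Body: chunk, PartNumber: partNumber, UploadId: uploadId,
  //                Bucket: 'myBucket', Key: file.name}, callback);
  partNumber++;
}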

save incoming file from s3 w/nodejs & knox?

This is likely very basic because the docs leave it out... from knox docs:
"Below is an example GET request on the file we just shoved at s3, and simply outputs the response status code, headers, and body."
client.get('/test/Readme.md').on('response', function(res){
  console.log(res.statusCode);
  console.log(res.headers);
  res.setEncoding('utf8');
  res.on('data', function(chunk){
    console.log(chunk);
  });
}).end();
Easy enough, but how do I save the incoming data as a local file? new BufferList() or something?
I'm trying to build an 'on-the-fly' image resizing service that loads images from s3 or cloudfront and returns them sized based on the request. The browser then caches the sized images instead of the full ones straight from s3. Of course, I need this basic bit working first! Any ideas?
Thanks guys!
It doesn't look like knox supports the stream API, so you can't use stream.pipe() and get proper backpressure. However, chances are your disk will be faster than S3, so this probably doesn't matter.
In the "response" callback, open up a writable stream from the filesystem module with var outstream = fs.createWriteStream(filename);. In the "data" callback, call outstream.write(chunk); Hopefully there is a "end" callback you can use the close the write stream as well.
As an alternative to the answer above, you can save the incoming file to a buffer like this:
var fs = require('fs'); // needed for fs.writeFile below
var buffer = '';

client.get('/test/Readme.md').on('response', function(res){
  res.setEncoding('utf8');
  res.on('data', function(chunk){
    buffer += chunk;
  });
  res.on('end', function(){
    // do something with the buffer, such as save it to a file
    // or resize the image directly here.
    // e.g. save to file:
    fs.writeFile('downloaded_readme.md', buffer, 'utf8', function (err) {
    });
  });
}).end();
