save incoming file from s3 w/nodejs & knox? - javascript

This is likely very basic because the docs leave it out... from knox docs:
"Below is an example GET request on the file we just shoved at s3, and simply outputs the response status code, headers, and body."
client.get('/test/Readme.md').on('response', function(res){
  console.log(res.statusCode);
  console.log(res.headers);
  res.setEncoding('utf8');
  res.on('data', function(chunk){
    console.log(chunk);
  });
}).end();
Easy enough, but how do I save the incoming data as a local file? new BufferList() or something?
I'm trying to build an 'on-the-fly' image resizing service that loads images from s3 or cloudfront and returns them sized based on the request. The browser then caches the sized images instead of the full ones straight from s3. Of course, I need this basic bit working first! Any ideas?
Thanks guys!

It doesn't look like knox supports the stream API, so you can't use stream.pipe() and get proper backpressure. However, chances are your disk will be faster than S3, so this probably doesn't matter.
In the "response" callback, open up a writable stream from the filesystem module with var outstream = fs.createWriteStream(filename);. In the "data" callback, call outstream.write(chunk); Hopefully there is a "end" callback you can use the close the write stream as well.

As an alternative to the answer above, you can save the incoming file to a buffer like this:
var fs = require('fs');
var buffer = '';
client.get('/test/Readme.md').on('response', function(res){
  res.setEncoding('utf8');
  res.on('data', function(chunk){
    buffer += chunk;
  });
  res.on('end', function(){
    // do something with the buffer such as save it to a file,
    // or directly resize the image here.
    // e.g. save to file:
    fs.writeFile('downloaded_readme.md', buffer, 'utf8', function (err) {
      // handle err
    });
  });
}).end();
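Note that concatenating strings with a utf8 encoding is fine for text, but for binary files such as images you would want to collect raw Buffer chunks instead. A small sketch of that variant (the key name and output filename are just examples):
var fs = require('fs');
var chunks = [];
client.get('/test/image.png').on('response', function(res){
  res.on('data', function(chunk){
    chunks.push(chunk); // keep the raw Buffer chunks, no string conversion
  });
  res.on('end', function(){
    var buffer = Buffer.concat(chunks); // binary-safe assembly
    fs.writeFile('downloaded_image.png', buffer, function (err) {
      // handle err
    });
  });
}).end();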

Related

socket.io, node.js forwarding image from server to client

I want to receive an image via socket.io on node.js and would like to forward it to a client (browser), but the image sent via the message to the browser is not recognised and therefore not shown.
However, when I save the message/image first on the node.js server and load the saved file again to forward the image it works fine. I can also open the jpeg file on the server from the file system without a problem. Sending a different jpeg directly from the server works also as expected.
socket.on('image', function(msg) {
  var fileName = 'clientImage.jpg';
  // First save the file
  fs.writeFile(fileName, msg.buffer, function() {});
  // reload the image and forward it to the client
  fs.readFile(__dirname + '/clientImage.jpg', function(err, buf){
    socket.emit('serverImage', {image: true, buffer: buf});
  });
});
If I simplify the function to forward the message (msg) received without the "fs" workaround, like:
socket.emit('serverImage', {image: true, buffer: msg.buffer});
or in the simplest expected way
socket.emit('serverImage', msg);
the message will not be recognised as an image on the browser and the client does not fire the "onload" event for the Image.
Client code (works with jpeg files fine):
socket.on('serverImage', function(msg) {
var blob = new Blob([msg.buffer], {type: 'image/jpeg'} );
var url = URL.createObjectURL(blob);
var limg = new Image();
limg.onload = function () {
console.log(' -- image on load!');
rcontext.drawImage(this, 0, 0);
URL.revokeObjectURL(url);
};
limg.src = url;
});
Is there a way that the message can be adapted/converted somehow, e.g. its encoding, so that it is recognised directly without the "fs" workaround? Or any other suggestions?
many thanks!
Many thanks for the responses,
I did further tests and found a workaround/solution by using an additional Buffer variable that specifies the type, in front of the socket.emit:
var nbuffer = new Buffer(msg.buffer,'image/jpeg');
socket.emit('serverImage', {image: true, buffer: nbuffer});
With this additional step, the browser now recognises the message as an image.
Many thanks for your help!
writeFile is asynchronous. It takes time to write a file to the disk. You passed it a callback function, but it's empty. Re-read the image inside that callback function.
// First save the file
fs.writeFile(fileName, msg.buffer, function() {
  // When writing is done (that's the important part),
  // reload the image and forward it to the client
  fs.readFile(__dirname + '/clientImage.jpg', function(err, buf){
    socket.emit('serverImage', {image: true, buffer: buf});
  });
});

"Echoing" an image in Node.js

I have a fully functioning PHP application that I am trying to make a Node.js version of. It deals with serving image tiles. When it's ready to display the image it does:
// Stream out the image
echo self::$tile;
How would I do something similar in Node.js? I understand this is a broad question, but I think my biggest issue is that I don't understand how PHP "echoes" an image.
Details:
I'm using AWS to get the image. The AWS call returns a Buffer. At this point of time, in the Javascript I have left the image as a Buffer.
The site populates a map with tiled images, so there are multiple calls with the image placed at a particular location on the page. I am using express to handle the requests.
app.get(/^\/omb\/1.0.0\/(.+)\/(.+)\/(.+)\/(.+)\.[a-zA-Z]*$/, function(req, res){
  var MosaicStreamer = require('./models/MosaicStreamer.js');
  var ms = new MosaicStreamer;
  var configs = {library: req.params[0], zoom: req.params[1], column: req.params[2], row: req.params[3]};
  ms.handleTile(configs);
});
handleTile grabs the image and ultimately brings me to where I am now. The image is grabbed using the following:
var aws = new AWS.S3();
var params = {
  Bucket: this.bucket,
  Key: this.tileDirectory + this.filepath,
  Range: 'bytes=' + (this.toffset + 4) + "-" + (this.tsize + this.toffset + 4)
};
var ts = this;
aws.getObject(params, function(err, data){
  if(ts.tile == null){
    ts.tile = data.Body; //S3 get object
  }
});
I think what you want to do is take a given URL which closely mirrors the naming convention of folders/files in your S3 bucket. So, assuming that you've established a client connection to your S3, you can use the readFile method. The 2nd argument of its callback is an imageStream, which you can pipe into the response. Once the stream from S3 has ended, it will automatically end the res to the client, outputting the image directly to the client (as you intend).
Some pseudo code:
app.get(/^\/omb\/1.0.0\/(.+)\/(.+)\/(.+)\/(.+)\.[a-zA-Z]*$/, function(req, res){
  var MosaicStreamer = require('./models/MosaicStreamer.js');
  var ms = new MosaicStreamer;
  var configs = {library: req.params[0], zoom: req.params[1], column: req.params[2], row: req.params[3]};
  // return the handleTile call, adding res as a 2nd argument so it can be passed through
  return ms.handleTile(configs, res);
});
Inside the handleTile function you can make the call to S3:
function handleTile(configs, res){
  client.readFile('filename', function(error, imageStream){
    imageStream.pipe(res);
  });
}
Now a request for an image like this:
<img src="/path/to/my/file/that/matches/regexp/expression"/>
will fetch that image from the S3 bucket and stream the resource back to the client directly.
To successfully render an image, you have to implement three steps:
Retrieve the image data, either as a Buffer (for instance read via fs.readFile) or as a stream (for instance via fs.createReadStream).
Set the appropriate headers in the web request handler with the arguments (req, res); something like
res.writeHead(200, {'Content-Type': 'image/png'});
Write the file. If you have the file in a Buffer, with
res.end(buf, 'binary');
If you have a stream via
read_stream.pipe(res)
The whole code may look like (assuming you want to serve the file image.jpg from the current directory):
'use strict';
var fs = require('fs');
var http = require('http');
http.createServer(function(req, res) {
  fs.readFile('image.jpg', function(err, buf) {
    if (err) {
      res.writeHead(500);
      res.end('Cannot access file.');
      return;
    }
    res.writeHead(200, {'Content-Type': 'image/jpeg'});
    res.end(buf, 'binary');
  });
}).listen(8002, '');
Using a stream, a very simple version looks like this (beware: no error handling; with error handling it can get a little more complex, depending on how you want to handle errors that occur while the file is being read):
'use strict';
var fs = require('fs');
var http = require('http');
http.createServer(function(req, res) {
  var stream = fs.createReadStream('image.jpg');
  // Error handling omitted here
  res.writeHead(200, {'Content-Type': 'image/jpeg'});
  stream.pipe(res);
}).listen(8003, '');
Code that uses a Buffer is easier to write, but means that your server must hold the whole file in memory - for instance, you will be unable to serve a 320 Gigapixel image file. You also only start sending data once you have the whole file.
Using a stream allows sending the file as soon as you get it, so it will be a little faster. If you're reading from a file or a fast local server, the speed difference is likely negligible. In addition, you'll only need a little bit of memory. On the other hand, error handling is more complex.
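One way the error handling could look in the streaming version is sketched below (the port is just an example): wait for the read stream's 'open' event before sending the success header, and answer with a 500 if the file cannot be opened.
'use strict';
var fs = require('fs');
var http = require('http');
http.createServer(function(req, res) {
  var stream = fs.createReadStream('image.jpg');
  stream.on('error', function(err) {
    // e.g. the file does not exist or is not readable
    res.writeHead(500);
    res.end('Cannot access file.');
  });
  stream.on('open', function() {
    // only send the success header once the file has actually been opened
    res.writeHead(200, {'Content-Type': 'image/jpeg'});
    stream.pipe(res);
  });
}).listen(8004, '');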

Download Csv file on front end after clicking on button using angularjs , nodejs ,expressjs

I want to download a .csv file on the frontend.
This is my code:
$http.get('/entity/consultations/_/registerationReport')
  .success(function (data) {
    myWindow = window.open('../entity/consultations/_/registerationReport', '_parent');
    myWindow.close();
  });
and I use the json2csv converter to write the CSV file.
json2csv({data: report, fields: fields}, function (err, csv) {
  if (err) throw err;
  res.setHeader('Content-Type', 'application/csv');
  res.setHeader("Content-Disposition", "attachment; filename=" + "Report.csv");
  res.end(csv, 'binary');
});
but it prints the CSV data in the browser instead of downloading the CSV file.
@Pawan, there's nothing wrong with your json2csv function. The issue is that you're trying to trigger the download with an XMLHttpRequest (XHR) request using Angular's $http service. An XHR call implies that your code will handle the response from the server, so the Content-Disposition header is ignored by the browser and does not trigger a download.
From what I can tell you have several options:
If you don't have any pre-processing to do on the client, why not just use a direct link to /entity/consultations/_/registerationReport (using an <a> tag)?
You may also call $window.open(...) from your Angular code (this will have the ugly side effect of a flashing popup tab or window).
There are probably a number of other solutions, but these are the only ones that immediately come to mind. The bottom line is that XHR is not the right tool for the task you're trying to accomplish.
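As an illustration of the second option, a minimal sketch (app, ReportCtrl and downloadReport are placeholder names; the URL is the one from the question):
// Inject $window instead of using the global window object so it stays testable.
app.controller('ReportCtrl', ['$scope', '$window', function ($scope, $window) {
  $scope.downloadReport = function () {
    // Navigating to the URL lets the browser honour the Content-Disposition
    // header and download the file instead of handing the response to XHR.
    $window.open('/entity/consultations/_/registerationReport', '_blank');
  };
}]);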

upload file to dropbox using its link

I would like to know if it is possible to upload a file to Dropbox using only its link, without downloading the file to the server, using Node.js on the server side or JavaScript on the client side. The method I am using now obliges me to download the file and then buffer it in order to send it afterwards.
var req = http.get('http://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.0/jquery.js', function(res) {
  var chunks = [];
  res.on('data', function(chunk) {
    console.log('telechargement'); // "downloading"
    chunks.push(chunk);
  });
  res.on('end', function() {
    var jsfile = Buffer.concat(chunks); // assemble the buffered file
    client.put('adaptation/jsfile.js', jsfile, function(status, reply) {
      console.log(reply);
    });
  });
});
There's no method in the Core API to allow saving a file to Dropbox just via a URL.
But there is an interactive way for users to do this via the Saver: https://www.dropbox.com/developers/dropins/saver. Perhaps that meets your needs.
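If the Saver fits, the client-side usage is roughly as follows (a sketch based on the drop-ins documentation; YOUR_APP_KEY, the file URL and the filename are placeholders):
// Assumes dropins.js has been included on the page with your app key:
// <script src="https://www.dropbox.com/static/api/2/dropins.js"
//         id="dropboxjs" data-app-key="YOUR_APP_KEY"></script>
// Lets the user save the remote file straight into their Dropbox,
// without it ever passing through your server.
Dropbox.save('http://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.0/jquery.js', 'jquery.js', {
  success: function () { console.log('Saved to Dropbox'); },
  error: function (errorMessage) { console.log(errorMessage); }
});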

How to modify node.js stream

I am streaming an XML file from S3. I need to build a new XML file with a different structure for the Sphinx search engine. I am already streaming the file from S3 and piping it into my SAX parser, but now I need to figure out how I can make modifications to the stream (after the SAX parser) and upload it to S3.
parser.on('startElement', function(name, attrs) {
  // Do something
});
I found what seems to be a great S3 library that streams, called knox, so I am currently using that library. I'm not stuck on this library; it's just what I found that seems decent. The example code they have for streaming data to S3 only uses an HTTP request as the source. I am relatively new to streams, since I have a PHP background.
Knox Example Stream:
http.get('http://google.com/doodle.png', function(res){
  var headers = {
    'Content-Length': res.headers['content-length']
    , 'Content-Type': res.headers['content-type']
  };
  client.putStream(res, '/doodle.png', headers, function(err, res){
    // Logic
  });
});
I am thinking I would need to do something along the lines of this:
parser.on('startElement', function(name, attrs) {
  var headers = {
    'Content-Length': res.headers['content-length']
    , 'Content-Type': res.headers['content-type']
  };
  client.putStream(res, '/doodle.png', headers, function(err, res){
    // Logic
  });
});
Any help is greatly appreciated. Thanks.
This discussion of Node's new streams speaks explicitly about transforming streams from S3.
NB: This is in reference to streams as implemented in Node 0.10.x
https://www.npmjs.org/package/through
This module will do what you need.
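A minimal sketch of what that can look like (the transformation shown is just a placeholder; in practice you would queue your rebuilt XML, and the source/destination would be your S3 download and upload streams rather than stdin/stdout):
var through = require('through');

// A transform stream: whatever you queue() is what the next pipe() target sees.
var transform = through(function write(chunk) {
  // placeholder transformation - upper-case each chunk
  this.queue(chunk.toString().toUpperCase());
}, function end() {
  this.queue(null); // signal the end of the transformed stream
});

// demo wiring with stdin/stdout; swap in your S3 read stream and upload here
process.stdin.pipe(transform).pipe(process.stdout);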
