"Echoing" an image in Node.js - javascript

I have a fully functioning PHP application that I am trying to make a Node.js version of. It deals with serving image tiles. When it's ready to display the image it does:
// Stream out the image
echo self::$tile;
How would I do something similar in Node.js? I understand this is a broad question, but I think my biggest issue is that I don't understand how PHP "echoes" an image.
Details:
I'm using AWS to get the image. The AWS call returns a Buffer. At this point, in the JavaScript, I have left the image as a Buffer.
The site populates a map with tiled images, so there are multiple calls, with each image placed at a particular location on the page. I am using Express to handle the requests.
app.get(/^\/omb\/1.0.0\/(.+)\/(.+)\/(.+)\/(.+)\.[a-zA-Z]*$/, function(req, res){
  var MosaicStreamer = require('./models/MosaicStreamer.js');
  var ms = new MosaicStreamer();
  var configs = {library: req.params[0], zoom: req.params[1], column: req.params[2], row: req.params[3]};
  ms.handleTile(configs);
});
handleTile grabs the image and ultimately brings me to where I am now. The image is grabbed using the following:
var aws = new AWS.S3();
var params = {
  Bucket: this.bucket,
  Key: this.tileDirectory + this.filepath,
  Range: 'bytes=' + (this.toffset + 4) + "-" + (this.tsize + this.toffset + 4)
};
var ts = this;
aws.getObject(params, function(err, data){
  if(ts.tile == null){
    ts.tile = data.Body; // S3 get object
  }
});

I think what you want to do is take the given URL, which closely mirrors the folder/file naming convention in your S3 bucket. Assuming you've established a client connection to S3, you can use a readFile-style method: the second argument of its callback is an image stream, which you can pipe into the response. Once the stream from S3 ends, it will automatically end the res to the client, outputting the image directly (as you intend).
Some pseudo code:
app.get(/^\/omb\/1.0.0\/(.+)\/(.+)\/(.+)\/(.+)\.[a-zA-Z]*$/, function(req, res){
  var MosaicStreamer = require('./models/MosaicStreamer.js');
  var ms = new MosaicStreamer();
  var configs = {library: req.params[0], zoom: req.params[1], column: req.params[2], row: req.params[3]};
  // return the handleTile call, adding res as a 2nd argument to pass it through
  return ms.handleTile(configs, res);
});
Inside the handleTile function you can make the call to S3:
function handleTile(configs, res){
  client.readFile('filename', function(error, imageStream){
    imageStream.pipe(res);
  });
}
Now a request for an image like this:
<img src="/path/to/my/file/that/matches/regexp/expression"/>
will fetch that image from the S3 bucket and stream the resource back to the client directly.
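Since the question is already using the aws-sdk, here is a minimal sketch of the same idea against the real S3 client (assuming aws-sdk v2, where getObject(...).createReadStream() returns a readable stream; the bucket name and key layout below are placeholders):
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

function handleTile(configs, res) {
  var params = {
    Bucket: 'my-tile-bucket', // placeholder bucket name
    Key: configs.library + '/' + configs.zoom + '/' + configs.column + '/' + configs.row + '.png' // hypothetical key layout
  };
  res.writeHead(200, {'Content-Type': 'image/png'});
  s3.getObject(params)
    .createReadStream() // stream the object instead of buffering it in memory
    .on('error', function(err) {
      res.end(); // headers are already sent, so just terminate the response
    })
    .pipe(res);
}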

To successfully render an image, you have to implement three steps:
1. Retrieve the image data, either as a Buffer (for instance read via fs.readFile) or as a stream (for instance via fs.createReadStream).
2. Set the appropriate headers in the web request handler with the arguments (req, res); something like
res.writeHead(200, {'Content-Type': 'image/png'});
3. Write the file. If you have the file in a Buffer, use
res.end(buf, 'binary');
If you have a stream, use
read_stream.pipe(res)
The whole code may look like (assuming you want to serve the file image.jpg from the current directory):
'use strict';
var fs = require('fs');
var http = require('http');

http.createServer(function(req, res) {
  fs.readFile('image.jpg', function(err, buf) {
    if (err) {
      res.writeHead(500);
      res.end('Cannot access file.');
      return;
    }
    res.writeHead(200, {'Content-Type': 'image/jpeg'});
    res.end(buf, 'binary');
  });
}).listen(8002);
Using a stream, a very simple version looks like this (beware: no error handling; with error handling it gets a little more complex, depending on how you want to handle errors that occur while the file is being read):
'use strict';
var fs = require('fs');
var http = require('http');

http.createServer(function(req, res) {
  var stream = fs.createReadStream('image.jpg');
  // Error handling omitted here
  res.writeHead(200, {'Content-Type': 'image/jpeg'});
  stream.pipe(res);
}).listen(8003);
Code that uses a Buffer is easier to write, but it means your server must hold the whole file in memory - for instance, you would be unable to serve a 320-gigapixel image file. You also only start sending data once you have read the whole file.
Using a stream allows you to send the file as soon as you get it, so it will be a little faster; if you're reading from a file or a fast local server, the speed difference is likely negligible. In addition, you'll only need a little bit of memory. On the other hand, error handling is more complex.
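For completeness, here is one possible shape for the stream version with error handling (a sketch; it assumes you only want to send a 500 when the failure happens before any body bytes have gone out):
'use strict';
var fs = require('fs');
var http = require('http');

http.createServer(function(req, res) {
  var stream = fs.createReadStream('image.jpg');
  stream.on('open', function() {
    // Only commit to a 200 once the file could actually be opened
    res.writeHead(200, {'Content-Type': 'image/jpeg'});
    stream.pipe(res);
  });
  stream.on('error', function(err) {
    if (!res.headersSent) {
      // Failure before anything was sent (e.g. the file does not exist)
      res.writeHead(500);
      res.end('Cannot access file.');
    } else {
      // Failure mid-stream: the 200 is already out, so abort the connection
      res.destroy();
    }
  });
}).listen(8003);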

Related

How to get number of directory items in Node.js

I need to get the number of items in a specific directory in Node.js.
If I get the items like
var dirItems = fs.readdirSync(__dirname + '/my_dir');
and then access a specific item like
dirItems[1]
everything is OK.
But if I try to get their number like
dirItems.length
or
Object.keys(dirItems).length
the page doesn't work in the browser.
How to get the number of directory items?
UPDATED
My full code:
var http = require('http'),
    fs = require('fs');

http.createServer(function(req, res) {
  var dirItems = fs.readdirSync(__dirname + '/my_dir');
  res.writeHead(200, {'Content-Type': 'text/html'});
  res.end(dirItems.length);
}).listen(80, 'localhost');
I was able to reproduce the error you get.
res.end() for the basic http server class is very picky about what you send it: you must give it a string or a Buffer, not a number (the error you got should have been a big clue here).
So, change this:
res.end(dirItems.length);
to this:
res.end(dirItems.length.toString());
And, it works for me. I was able to reproduce your original error and then make it work by making this simple change.
Logically, you can only send string or binary data in an HTTP response, and apparently res.end() isn't smart enough to attempt a string conversion on its own. You have to do it yourself.
FYI, if you use a higher level framework like Express, it is more tolerant of what you send it (it will attempt a string conversion in a situation like this).
Here is how I would do it:
const fs = require('fs');

const dir = './somedir';
fs.readdir(dir, (err, files) => {
  if (err) throw err; // surface read errors instead of crashing on files.length
  console.log(files.length);
});
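Putting the two fixes together (asynchronous readdir plus the string conversion), the server from the question could look like this sketch, keeping the original port and directory:
var http = require('http'),
    fs = require('fs');

http.createServer(function(req, res) {
  fs.readdir(__dirname + '/my_dir', function(err, dirItems) {
    if (err) {
      res.writeHead(500);
      res.end('Cannot read directory.');
      return;
    }
    res.writeHead(200, {'Content-Type': 'text/html'});
    res.end(dirItems.length.toString()); // res.end() needs a string or Buffer
  });
}).listen(80, 'localhost');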

Upload large files as a stream to s3 with Plain Javascript using AWS-SDK-JS

There is a pretty nice example available for uploading large files to S3 via the aws-sdk-js library, but unfortunately it uses the Node.js fs module.
Is there a way to achieve the same thing in plain JavaScript? Here is a nice Gist as well, which breaks a large file into smaller chunks; however, it is still missing the .pipe functionality of Node's fs streams, which is needed to pass the data to the aws-sdk-js upload function. Here is the relevant Node snippet:
var fs = require('fs');
var zlib = require('zlib');

var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
var s3obj = new AWS.S3({params: {Bucket: 'myBucket', Key: 'myKey'}});
s3obj.upload({Body: body})
  .on('httpUploadProgress', function(evt) {
    console.log('Progress:', evt.loaded, '/', evt.total);
  })
  .send(function(err, data) { console.log(err, data); });
Is there something similar available in plain JS (outside Node.js)? It would be used with Rails.
Specifically, I need an alternative to the following line in plain JS:
var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
The same link you provided contains an implementation intended for the Browser, and it also uses the AWS client SDK.
// Get our File object
var file = $('#file-chooser')[0].files[0];

// Upload the File
var bucket = new AWS.S3({params: {Bucket: 'myBucket'}});
var params = {Key: file.name, ContentType: file.type, Body: file};
bucket.upload(params, function (err, data) {
  $('#results').html(err ? 'ERROR!' : 'UPLOADED.');
});
** EDITS **
Note the documentation for the Body field includes Blob, which means streaming will occur:
Body — (Buffer, Typed Array, Blob, String, ReadableStream)
You can also use the event emitter convention offered by the AWS SDK's ManagedUpload interface if you want to monitor progress. Here is an example:
var managed = bucket.upload(params);
managed.on('httpUploadProgress', function (bytes) {
  console.log('progress', bytes.total);
});
managed.send(function (err, data) {
  $('#results').html(err ? 'ERROR!' : 'UPLOADED.');
});
If you want to read the file from your local system in chunks before you send to s3.uploadPart, you'll want to do something with Blob.slice, perhaps defining a Pipe Chain.
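To make that concrete, here is a rough browser-side sketch using the SDK's low-level multipart calls (createMultipartUpload, uploadPart, completeMultipartUpload). The bucket name and part size are placeholders, and error handling is minimal:
var s3 = new AWS.S3();
var file = $('#file-chooser')[0].files[0];
var partSize = 5 * 1024 * 1024; // S3's minimum part size (except for the last part)

s3.createMultipartUpload({Bucket: 'myBucket', Key: file.name, ContentType: file.type}, function (err, mp) {
  if (err) { return console.error(err); }
  var parts = [];

  function uploadNext(partNumber, start) {
    if (start >= file.size) {
      // All chunks sent; ask S3 to assemble them into a single object
      s3.completeMultipartUpload({
        Bucket: 'myBucket', Key: file.name, UploadId: mp.UploadId,
        MultipartUpload: {Parts: parts}
      }, function (err, data) { console.log(err, data); });
      return;
    }
    // Blob.slice hands us a chunk without reading the whole file into memory
    var blob = file.slice(start, start + partSize);
    s3.uploadPart({
      Bucket: 'myBucket', Key: file.name, UploadId: mp.UploadId,
      PartNumber: partNumber, Body: blob
    }, function (err, data) {
      if (err) { return console.error(err); }
      parts.push({ETag: data.ETag, PartNumber: partNumber});
      uploadNext(partNumber + 1, start + partSize);
    });
  }

  uploadNext(1, 0);
});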

Split video file to stream from browser

I split a video file into two using the split-file module.
There are no file-part extensions; the parts look like: gan-1, gan-2
I am hosting these two files on my own server.
http://bilketay.com/download/gan-1
http://bilketay.com/download/gan-2
I am trying to stream these two files to the browser as if they were a single video file, like this:
// Dependencies
var express = require('express');
var app = express();
var CombinedStream = require('combined-stream2');
var request = require('request');

// Some routes
app.get('/', function(req, res) {
  // Set header
  res.set({
    "Content-Type": 'video/mp4'
  });
  res.writeHead(200);
  var combinedStream = CombinedStream.create();
  // This function requests gan-1 first, then gan-2
  var recursive = function(param) {
    var req = request('http://bilketay.com/download/' + param);
    // First add gan-1, then gan-2
    combinedStream.append(req);
    req.on('end', function() {
      if (param != 'gan-2') {
        recursive('gan-2');
      }
    });
  };
  // Start recursive
  recursive('gan-1');
  // Start streaming to the browser
  // But it does not start until everything is loaded :(
  combinedStream.pipe(res);
});

// Listen port
app.listen(3000);
I wrote this code with limited Node.js knowledge. It looks fine to me, but I think Google Chrome disagrees. :)
The problem is that the two parts do not stream until they have been fully loaded; the stream only starts after both parts have downloaded. What I want is for the stream to start right away. A short note: the gan-1 and gan-2 files work when served locally, but not from the remote server. What am I doing wrong?
I used the combined-stream2 module to merge the parts.
This module makes it simple to stream two different files one after the other, but since I cannot get the result I want, I may have used it wrong.
In short, I want to stream the two files to the browser back to back.
I need the help of ninjas. Thank you.
Screen shot describing the problem: stream.gif
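For what it's worth, one untested sketch of an alternative approach: skip combined-stream2 and pipe the two requests into the response sequentially, keeping the response open after the first part with {end: false} so bytes flow to the browser as soon as they arrive (the URLs are the ones from the question):
var express = require('express');
var request = require('request');
var app = express();

app.get('/', function(req, res) {
  res.set('Content-Type', 'video/mp4');
  var first = request('http://bilketay.com/download/gan-1');
  // Do not end the response when the first part finishes
  first.pipe(res, {end: false});
  first.on('end', function() {
    // Append the second part and let it close the response
    request('http://bilketay.com/download/gan-2').pipe(res);
  });
});

app.listen(3000);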

Downloading Torrent with Node.JS

I was wondering if anyone had an example of how to download a torrent using NodeJS? Essentially, I have an RSS Feed of torrents that I iterate through and grab the torrent file url, then would like to initiate a download of that torrent on the server.
I've parsed and looped through the RSS just fine; however, the few npm packages I've tried have either crashed or been unstable. If anyone has any suggestions, examples, anything... I would greatly appreciate it. Thanks.
router.get('/', function(req, res) {
  var options = {};
  parser.parseURL('rss feed here', options, function(err, articles) {
    var i = 0;
    var torrent;
    for (var title in articles.items) {
      console.log(articles.items[i]['url']);
      // download torrent here
      i++;
    }
  });
});
You can use node-torrent for this. First install it (npm install node-torrent); then, to download a torrent:
var Client = require('node-torrent');
var fs = require('fs');

var client = new Client({logLevel: 'DEBUG'});
var torrent = client.addTorrent('a.torrent');

// when the torrent completes, move its files to another area
torrent.on('complete', function() {
  console.log('complete!');
  torrent.files.forEach(function(file) {
    var newPath = '/new/path/' + file.path;
    fs.rename(file.path, newPath);
    // while still seeding, make sure file.path points to the right place
    file.path = newPath;
  });
});
Alternatively, for more control, you can run the Transmission daemon and control it via its RPC protocol. There's a node module called transmission that does the job. Example:
var Transmission = require('transmission');

var transmission = new Transmission({
  port: 9091,
  host: '127.0.0.1'
});

transmission.addUrl('my.torrent', {
  "download-dir": "/home/torrents"
}, function(err, result) {
  if (err) {
    return console.log(err);
  }
  var id = result.id;
  console.log('Just added a new torrent.');
  console.log('Torrent ID: ' + id);
  getTorrent(id); // a helper of your own that queries the torrent's status
});
If you are working with video torrents, you may be interested in Torrent Stream Server. It is a server that downloads and streams video at the same time, so you can watch the video without fully downloading it. It's based on the torrent-stream library.
Another interesting project is webtorrent. It's a nice torrent library that works both in Node.js and in the browser and has streaming support. In my experience its browser support isn't great, but it should fully work in Node.js.
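For illustration, a minimal webtorrent sketch in Node.js (the magnet URI and download path are placeholders):
var WebTorrent = require('webtorrent');
var client = new WebTorrent();

// The torrent id may be a magnet URI, an info hash, or a path to a .torrent file
client.add('magnet:?xt=urn:btih:PLACEHOLDER', {path: '/home/torrents'}, function (torrent) {
  torrent.on('done', function () {
    console.log('Downloaded:', torrent.name);
  });
});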

File copying using NodeJS

I am trying to copy a file from one location to another. Here is my code below; I call this script like http://localhost:8000/prdcopy/acbd.pdf
var http = require('http');
var fs = require('fs');
var express = require('express');
var app = express();

var path_upload = "/234.567.890.123/";
var path_prodn = "//123.345.678.999/sample/temp/";

app.get('/prdcopy/:file', function(req, res){
  var rstream = fs.createReadStream(path_upload + req.params.file);
  var wstream = fs.createWriteStream(path_prodn + req.params.file);
  rstream.pipe(wstream);
  res.end();
  rstream.on('end', function () {
    console.log('SrcFile');
  });
  wstream.on('close', function () {
    console.log('Done!');
  });
});

var server = app.listen(8000, function(){
  console.log("listening on port 8000...");
});
It copies the file properly; however, after copying, Firefox opens a PDF reader with no file loaded in it. This is my first Node script, and I would like to know what I am doing wrong. In IE, no PDF reader window opens.
This is not necessarily an error.
With res.end() you are sending back an HTTP response with no Content-Type header. What Firefox does in this case is detect the .pdf at the end of the typed URL and assume that the response will contain something displayable by the PDF viewer. That is not the case, since you are not sending anything back (no body).
Try substituting res.end() with something like:
res.header("Content-Type", "text/html");
res.end();
You will see that no PDF viewer is displayed, even in Firefox. You could also use other res methods that automatically set the content type for you. For instance, to send a JSON response back, substitute res.end() with:
var status = {};
status.message = "Copied successfully";
res.json(status);
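As a further refinement beyond the original answer (a sketch): since res.end() currently fires before the copy finishes, you could respond from the write stream's close event so the client only hears back once the file has actually been copied:
app.get('/prdcopy/:file', function(req, res) {
  var rstream = fs.createReadStream(path_upload + req.params.file);
  var wstream = fs.createWriteStream(path_prodn + req.params.file);
  rstream.pipe(wstream);
  wstream.on('close', function() {
    // The copy is complete; reply with JSON so no PDF viewer is triggered
    res.json({message: "Copied successfully"});
  });
  rstream.on('error', function(err) {
    res.status(500).json({message: "Copy failed: " + err.message});
  });
});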
