Node http res.end - javascript

I am currently reading through "Node.js in Action" as well as working through a number of online learning resources. One of the first examples in the book shows how to pipe a stream through to a response, like so:
var http = require('http'),
    fs = require('fs');

http.createServer(function(req, res) {
    res.writeHead(200, {"Content-Type": "image/png"});
    fs.createReadStream("./image.png").pipe(res);
}).listen(xxxx)
My question is: how valid is this code? I was under the impression that whenever using the http module you should always end the response with:
res.end();
Is this not necessary, since piping implies an end? Whenever writing a response, should I always end it?

When your readable stream finishes reading the image.png file, by default it emits an end event, which in turn calls end() on the writable stream (the res stream). You don't need to worry about calling end() in this case.
It's worth pointing out that, in this scenario, res will no longer be writable after end() is called. So, if you want to keep it writable, just pass the end: false option to pipe(), like:
fs.createReadStream("./image.png").pipe(res, { end: false });
and then call end() yourself sometime in the future.
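A minimal sketch of that pattern (the port 8080 is an arbitrary stand-in for the xxxx in the question):

var http = require('http'),
    fs = require('fs');

http.createServer(function(req, res) {
    res.writeHead(200, {"Content-Type": "image/png"});
    var stream = fs.createReadStream("./image.png");
    stream.pipe(res, { end: false }); // res stays writable afterwards
    stream.on('end', function() {
        // the file has been fully piped; finish the response ourselves
        res.end();
    });
}).listen(8080);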

end() is not necessary for streams. There is a great set of tutorials here. One of the exercises (#11, HTTP file server with streams) is to create a static file server. Here is what the code looks like:
var fs = require('fs'),
    http = require('http'),
    port = parseInt(process.argv[2]),
    file = process.argv[3],
    opts = { encoding: 'utf8' },
    server;

server = http.createServer(function(req, res) {
    console.log('request url: ', req.url);
    res.writeHead(200, { 'Content-type': 'text/plain' });
    var stream = fs.createReadStream(file);
    stream.pipe(res);
    stream.on('end', function() {
        console.log('stream ended...');
    });
});

server.listen(port, function() {
    console.log('server listening on port: ', port);
});
Lots of other good tutorials and examples as well. Hope this helps.

Related

Client program for continuous talking to an echo server in nodejs

I'm an absolute beginner in Node.js. I've created an echo server in Node.js, and honestly, I followed a few YouTube tutorials for this. There is nothing wrong with the server code. I want to create a client program to talk to this server. I don't want to use a telnet client or any such thing. And by 'continuous' I mean the server and client should stay connected until I close the server manually using Ctrl+C. Here's my server.js:
const express = require('express');
const bodyParser = require('body-parser');

var server = express();
server.use(bodyParser.urlencoded({ extended: false }));
server.use(bodyParser.json());

server.post("/", function(req, res) {
    console.log("I got: " + req.body.message);
    res.send(req.body.message);
});

server.listen(3000, function() {
    console.log("Express echo server is listening on port 3000");
});
I'm not asking you to just write the code for me; in fact, I've already tried. Here's my client.js:
var request = require('request');

var arg = "";
process.argv.slice(2).forEach(function(val, index, array) {
    arg += val + " ";
});

request.post({
    url: "http://localhost:3000",
    json: true,
    body: { message: arg }
}, function(err, response, body) {
    if (!err && response.statusCode == 200) {
        console.log(body);
    }
});
But the client sends data only once, and only via a command-line argument:
node client.js hello
PS: I'm using these npm modules: express, body-parser and request.
What you made is an HTTP server using Express.
The server runs fine, but the client closes because you are only making a single request to the server, so what is happening is expected behaviour.
There are multiple approaches. The simplest would be to use readline (or something similar) to continuously read the lines you type and send each one to the server:
const request = require('request');
const readline = require("readline").createInterface({
    input: process.stdin,
    output: process.stdout
});

readline.setPrompt('msg: ');
readline.prompt();

readline.on('line', function(input) {
    if (input === 'close') return readline.close();
    request.post({
        url: "http://localhost:3000",
        json: true,
        body: { message: input }
    }, function(err, response, body) {
        readline.prompt();
    });
}).on('close', function() {
    console.log('Closed');
    process.exit(0);
});
But the proper way would be to use sockets, like socket.io, to make a persistent connection between the server and client. Read here for more information.
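As a rough sketch of that approach, here is a minimal echo pair over socket.io; the port and the 'message' event name are arbitrary choices for the sketch (assumes npm install socket.io socket.io-client):

// server.js - echo server over a persistent socket
var io = require('socket.io')(3000);
io.on('connection', function(socket) {
    socket.on('message', function(msg) {
        console.log('I got: ' + msg);
        socket.emit('message', msg); // echo it straight back
    });
});

// client.js - stays connected until you close it
var socket = require('socket.io-client')('http://localhost:3000');
var readline = require('readline').createInterface({
    input: process.stdin,
    output: process.stdout
});
readline.on('line', function(input) {
    socket.emit('message', input);
});
socket.on('message', function(msg) {
    console.log('echo: ' + msg);
});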

Node.js - piping a readable stream to http response

I am doing Node.js exercises from nodeschool.io (learnyounode). One of the exercises involves creating an HTTP server which serves a text file from a readable file stream. I'm very new to asynchronous programming. The solution I came up with is:
var http = require('http');
var fs = require('fs');

var readable = fs.createReadStream(process.argv[3]);
var server = http.createServer(function(request, response) {
    readable.on('data', function(chunk) {
        response.write(chunk);
    });
});
server.listen(process.argv[2]);
This works, however the official solution uses a pipe instead of on-data event:
var http = require('http')
var fs = require('fs')

var server = http.createServer(function (req, res) {
    res.writeHead(200, { 'content-type': 'text/plain' })
    fs.createReadStream(process.argv[3]).pipe(res)
})
server.listen(Number(process.argv[2]))
What are the (potential) differences and/or benefits of doing it either way?
Well, there's more code in your version, and that usually means more opportunities to make mistakes. Consider some edge cases, such as what happens when the stream emits an error.
I'm not exactly sure what the behavior would be (you can check yourself by, e.g., passing a non-existent filename), but chances are that in your version the error handling does not work very well, potentially ignoring errors entirely (because you're not listening for error events).
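For illustration, a sketch of the pipe version with an error listener attached (the 500 response is an arbitrary choice for the sketch):

var http = require('http')
var fs = require('fs')

var server = http.createServer(function (req, res) {
    var stream = fs.createReadStream(process.argv[3])
    stream.on('error', function (err) {
        // without this listener, an 'error' event (e.g. a missing
        // file) would crash the whole process
        if (!res.headersSent) {
            res.writeHead(500, { 'content-type': 'text/plain' })
        }
        res.end('Error reading file: ' + err.message)
    })
    stream.on('open', function () {
        // the file opened successfully; send it
        res.writeHead(200, { 'content-type': 'text/plain' })
        stream.pipe(res)
    })
})
server.listen(Number(process.argv[2]))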

How do I write the results of a Node function to html/console log?

I am new to Node and I am trying to print the results to the console, and eventually display them in HTML. I have tried invoking the function as a var that I would later use in the HTML, but this didn't work. Some similar example code:
var app = require('express')();
var x = require('x-ray')();

app.get('/', function(req, res) {
    res.send(x('http://google.com', 'title').write());
});
Thanks!
I don't know much about the x-ray library, but I presume the problem is with that, since it has to make a request asynchronously before it can return the response data. The documentation says that if you don't pass a path as an argument to the write function, it returns a readable stream, so try this:
app.get('/', function(req, res) {
    var stream = x('http://google.com', 'title').write(),
        responseString = '';

    stream.on('data', function(chunk) {
        responseString += chunk;
    });
    stream.on('end', function() {
        res.send(responseString);
    });
});
You also need to start the server listening on a particular port (3000 in the example below):
const PORT = 3000;
app.listen(PORT, function() {
    console.log("Server is listening on port " + PORT + ".");
}); // the callback function simply runs once the server starts
Now open your browser and navigate to 127.0.0.1:3000 or localhost:3000, and you'll see "Google" appear!
ALSO: If you want to use the response data in a full HTML page (rather than just sending the string on its own), you may want to explore further how to do this in Express with Jade (or similar) templates. And the code at the moment scrapes Google every time someone makes a request to that route of your server; if you only want to scrape Google once, and then reuse the same string in every response, you may want to think about how to implement that (it's easy!).
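One simple way to do the scrape-once version (a sketch; the cached variable is just an illustration):

var cached = null;

app.get('/', function(req, res) {
    if (cached !== null) {
        return res.send(cached); // reuse the result of the first scrape
    }
    var stream = x('http://google.com', 'title').write(),
        responseString = '';
    stream.on('data', function(chunk) {
        responseString += chunk;
    });
    stream.on('end', function() {
        cached = responseString;
        res.send(responseString);
    });
});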

Node.js + request - saving a remote file and serving it in a response

I understand how to load a remote file with Node.js + request, and I can then read it and return the png binary blob. Is there an elegant way to do it with one request (or even a one-liner)?
something like:
http.createServer(function(req, res) {
    res.writeHead(200, {
        'Content-Type': 'image/png'
    });
    var picWrite = fs.createWriteStream(local);
    var picFetch = fs.createReadStream(local);
    picWrite.on('close', function() {
        console.log("file loaded");
    });
    request(remote).pipe(picWrite).pipe(picFetch).pipe(res);
})
To be clear: my aim is to load a remote file from a CDN, cache it locally on the server, and then return the file in the original request. In future requests I use fs.exists() to check that it exists first.
This is my best effort so far:
http.createServer(function(req, res) {
    var file = fs.createWriteStream(local);
    request.get(remote).pipe(file).on('close', function() {
        res.end(fs.readFileSync(local), 'binary');
    });
})
Since request() returns a readable stream, we can listen for its data and end events and write to both the HTTP response and a writable file stream as the chunks arrive.
var http = require('http');
var fs = require('fs');
var request = require('request');

http.createServer(function(req, res) {
    res.writeHead(200, { 'Content-Type': 'image/png' });
    var file = fs.createWriteStream(local);

    // request the file from a remote server
    var rem = request(remote);
    rem.on('data', function(chunk) {
        // instead of loading the file into memory
        // after the download, we write each chunk to
        // disk and to the response as it's downloaded
        file.write(chunk);
        res.write(chunk);
    });
    rem.on('end', function() {
        file.end(); // close the cached file on disk
        res.end();
    });
});
The method that you showed first writes the data to disk, then reads it into memory again. That's rather pointless, since the data is already available while it's being written to disk.
If you use event handlers instead, you can write to both the HTTP response and the file stream without loading the file into memory a second time. It also avoids the problem with the pipe() chain in your first snippet: pipe() returns the destination stream, so piping through the write stream doesn't hand the downloaded data on to the response.
This also avoids problems with running out of memory: if you buffered a whole large file, it could exhaust your Node.js process's memory. With streams, only small chunks of the file are held in memory at any one time, so you don't have this problem.
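As an aside, a readable stream can be piped to more than one destination, so the same effect is also available with two separate pipe() calls instead of manual data handlers (a sketch, assuming remote and local are defined as in the question; the port is arbitrary):

var http = require('http');
var fs = require('fs');
var request = require('request');

http.createServer(function(req, res) {
    res.writeHead(200, { 'Content-Type': 'image/png' });
    var rem = request(remote);
    rem.pipe(fs.createWriteStream(local)); // cache a copy on disk
    rem.pipe(res);                         // stream to the client at the same time
}).listen(8080);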

Setting a document root to your Node.js http server?

I just set up a basic Node.js server with socket.io on my local machine. Is there a way to set a document root so that you can include other files? I.e., below I have a div with a background image. The path to the image is relative to the location of the server; however, this is not working. Any ideas? Thanks!
var http = require('http'),
    io = require('socket.io'), // for npm, otherwise use require('./path/to/socket.io')
    server = http.createServer(function(req, res) {
        // your normal server code
        res.writeHead(200, {'Content-Type': 'text/html'});
        res.end('<div style="background-image:url(img/carbon_fibre.gif);"><h1>Hello world</h1></div>');
    });
server.listen(8080);

// socket.io
var socket = io.listen(server);
Use Express or Connect. Examples: https://github.com/spadin/simple-express-static-server, http://senchalabs.github.com/connect/middleware-static.html
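For example, a minimal sketch with Express's static middleware (the ./public directory and the port are assumptions for the sketch; img/carbon_fibre.gif would live under ./public/img/):

var express = require('express');
var app = express();

// everything under ./public is served relative to the site root,
// so /img/carbon_fibre.gif resolves to ./public/img/carbon_fibre.gif
app.use(express.static(__dirname + '/public'));

app.get('/', function(req, res) {
    res.send('<div style="background-image:url(img/carbon_fibre.gif);"><h1>Hello world</h1></div>');
});

app.listen(8080);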
For the background-image style, the browser will make an entirely new HTTP request to your server with the path img/carbon_fibre.gif, and that request will certainly hit your anonymous function. But your response function only ever writes back a div with Content-Type: text/html, regardless of the request path, so the image cannot be displayed properly.
You could add some code to your function, like:
var http = require('http'),
    io = require('socket.io'),
    fs = require('fs'),
    path = require('path'),
    server = http.createServer(function(req, res) {
        // serve static .gif files from the server's directory
        if (/\.gif$/.test(req.url)) {
            fs.readFile(path.join(__dirname, req.url), function(err, data) {
                if (err) {
                    res.writeHead(404);
                    return res.end();
                }
                res.writeHead(200, { 'Content-Type': 'image/gif' });
                res.end(data);
            });
        }
        else {
            // write your div
        }
    });
server.listen(8080);
I'm not very familiar with Node.js, so the code above is meant to demonstrate the logic rather than be production-ready.
