I'm new to JS and more specifically Node. Even after reading the api docs, I'm confused about what 'requestListener' is in the following method.
http.createServer([requestListener]);
Searching google revealed that 'requestListener' is a(n) (anonymous) function with the following signature:
function (request, response) { };
I suppose I'm reading the docs incorrectly, hopefully someone can point me in the right direction.
The docs say that the method call takes a function which will be called when a new request is received by your application. This function, as you correctly stated in your question, takes two arguments: a request object and a response object.
You should inspect the contents of these objects to learn what information is available to you. Also, take a look at the API docs for request and response.
The function is optional; you could also attach the request handler in the following way:
var server = http.createServer()
server.on('request', function (req, res) {
// Process the request here
})
In practice, this function is called when someone opens up your website in their browser (i.e. issues a GET HTTP request). The purpose of that function is to provide an HTTP response body back to the client, i.e. render a web page or perform any business logic as necessary.
To directly answer your question: it's a function that gets called when a request is received by the server and is given those two parameters.
At the very least you can experiment with doing a console.log(request, response) inside the function and see what is spit out in your terminal.
But that's only the beginning of the rabbit hole. You should read up on "callback functions", as they are integral to how Node (and quite a bit of client-side javascript) operates (asynchronously).
The http.createServer method creates a server object.
The server object has a listen method. If you invoke the listen method, for example:
createServer(requestListener).listen({ port:80 });
the server object will be listening on port 80, and when an HTTP request is received on that port, the server object will invoke the function requestListener, passing it two objects, of type request and response. So you can write your requestListener like this, for example:
function requestListener(req, res) {
res.write("Hello world");
res.end();
}
and it will push out the string Hello world for every URL that hits this simple web server.
write is one of the many methods of the response object.
You can run the above few lines of code on your PC, and point your browser to http://localhost.
If you have other applications listening on port 80, then use a different port number in your listen method.
Using an anonymous function for requestListener is merely a different coding pattern, for better or for worse. My code above can be re-written as:
createServer((req, res) => {
res.write("Hello world");
res.end();
}).listen({ port:80 });
The above code is very rudimentary and will send the same response to every HTTP request that hits it. The code does not differentiate between localhost/page1 or localhost/page2, etc. So to do anything more, the requestListener has to be expanded significantly to parse the different paths in the URL and to decide what content to send for each. But to do anything useful as a real web server without writing too much code, you will need a package. Express is an excellent one that drives many real-life web servers.
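To illustrate that kind of path handling, here is a minimal sketch using only the http module; the paths /page1 and /page2 are just made-up examples:
const http = require('http');

http.createServer((req, res) => {
  // Branch on the requested path; a real server would also check req.method
  if (req.url === '/page1') {
    res.end('This is page 1');
  } else if (req.url === '/page2') {
    res.end('This is page 2');
  } else {
    res.statusCode = 404;
    res.end('Not found');
  }
}).listen(80);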
I have a front end application, which I would like to return results with from an Express backend. Let's just call those results country and city for this reference.
I have done a bunch of searching, but I cannot find any solid resources on the relationship between the front end and middleware. Yes, I know what these things are, and the order in which they should flow, but the confusion sits with:
Do I need to connect my front end and middleware? How so?
If I am already connected to my backend from the front end, would I also have to connect to middleware?
How would I return the country and city from the middleware and/or express backend?
Any other info you think would be helpful for the greater dev community who is still learning would be beneficial.
While you could return data from a middleware, it's probably not what you are trying to do. A middleware is a piece of code that is executed between the time the request is received by your backend and the time the resource is fetched. In a middleware you could do things such as check if a user has access to a certain resource or authenticate a user by some sort of credential passed with the request.
Either way, the way you would typically make requests from your front-end to your backend is via an XMLHttpRequest. Those requests are usually asynchronous, so their usage will not block the whole page while they execute. There are many ways you could create an XMLHttpRequest. The native JavaScript way is kinda ugly, so I would suggest using the fetch API instead. You could also go with a third-party library if you need to do more complex stuff. I personally like axios, but this is up to you.
To give you a better understanding of what Express is doing, it's basically an infinite loop that waits for HTTP requests. You need to define routes that execute functions that return data.
Here is a basic example. Note that this script is executed via NodeJS:
// myserver.js
const express = require('express')
const app = express()

app.get('/cities', (req, res) => {
  const cities = [] /** somehow get all the cities **/
  res.json(cities);
})

/** the rest of the server... **/
/** For example, the route for Countries **/

app.listen(3000)
In the previous example, we've built a basic server that listens at the URL localhost:3000/cities and executes a function when this URL is fetched. That function will fetch all the cities and return them as JSON.
In your frontend, you would need to make an XMLHttpRequest to this URL, to get the server to execute the function, which will return the data. Phew... I hope I did not lose you there.
A typical example would be a simple call using the fetch api.
Please note that this script is executed in the browser.
// myclient.js
async function fetchAllCities() {
  // fetch resolves to a Response object; parse its JSON body to get the cities
  const response = await fetch('http://localhost:3000/cities');
  const cities = await response.json();
  console.log(cities);
}

// just for fun, we add a click listener on a button and call the function defined above.
document.getElementById('myButton').addEventListener('click', async function() {
  // we fetch the cities when we click on the button!
  await fetchAllCities();
});
In the previous example, I am using the fetch function to call the url we declared in our Express server.
I'm also using async / await, which can be a little tricky, but it just means: wait for the data to be there before going forward.
I highly suggest reading on the subject. Here are some references.
How do I return the response from an asynchronous call?
Understanding async/await on NodeJS.
Await from MDN
I hope this brief overview of XMLHttpRequest and fetch helped you grasp the basics of how an API works.
Middleware is used to help the back-end do its job in processing incoming requests. It does not exist separate from the back-end. It's part of the back-end. For example, you might have middleware that checks to see if an incoming request is properly authorized/authenticated before the route can be handled by its regular route handler.
Do I need to connect my front end and middleware? How so?
No. Your front-end sends requests to the back-end. The back-end may or may not use middleware to service the request. That's entirely up to the implementation in the back-end and what it needs to do for any given request.
If I am already connected to my backend from the front end, would I also have to connect to middleware?
No. You don't separately connect to middleware. You connect to your back-end and the back-end may or may not use middleware to do its job (something the front-end will have no knowledge of).
How would I return the country and city from the middleware and/or express backend?
You would have to show more details about what you're actually trying to return from a request, but a common data format is JSON, so you could construct a JavaScript object with your desired response and then send it back to the client as the response to the incoming request using either res.json(someObj) or res.send(someObj) (both do the same thing if someObj is a JavaScript object).
For example:
app.get("/getsomething", (req res) => {
// do some processing here to get cityResult and countryResult
// construct object to send back to client
const obj = { city: cityResult, country: countryResult};
// send this object as JSON back to the client as the response to this
// incoming request
res.json(obj);
});
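And to make the earlier middleware point concrete, here is a minimal sketch that plugs into the same app; checkAuth and the x-api-key header are made up for illustration:
// Hypothetical middleware: runs before the route handlers for every request
function checkAuth(req, res, next) {
  // "x-api-key" and the expected value are placeholders for this sketch
  if (req.headers["x-api-key"] !== "my-secret-key") {
    return res.status(401).json({ error: "Not authorized" });
  }
  next(); // hand the request on to the next middleware / route handler
}

app.use(checkAuth);
The front-end never talks to checkAuth directly; it just sends the request, and the back-end decides whether the route handler gets to run.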
I've got a Node.js web server communicating with a locally running Python TCP socket server (communicating via their respective socket modules net.Socket, socket).
Clients make HTTP post requests from the browser which get handled by a Node http.createServer function, with some of them sent to the Python server for heavy computations, the results of which are then sent back to Node and back to the browser for rendering.
The Python server is necessary instead of a Node child process as there are some large (immutable) objects required for the Python computations that take a while to initialise and are then shared across threads. It would be infeasible to create and destroy these objects for every browser request.
So my question is, using the Node callback paradigm, how do I capture the response object for each POST request in the net.Socket data event handler/s?
Note 0: Each request has a unique id that is sent to the Python server and returned.
This currently works* inside my http.createServer callback:
http.createServer((request, response) => {
// route and parse incoming requests etc.
// send POST data to Python
python_socket.write(parsed_request_post_data);
// Python works away diligently then emits a data event handled below
python_socket.once('data', (data_from_python) => {
// error and exception handling
response.setHeader('Content-Type', 'application/json');
response.end(data_from_python);
});
}).listen(HTTPport);
*However, if I bomb the server with multiple requests, sometimes I get the same data returned in each response (even though Python handles each data payload separately). I worry that I am trying to assign multiple once('data') callbacks in the same Node event loop and only one of them is persisting, and that is the one repeatedly sending the Python data back to the browser? Though if this were the case, the response object would also be getting repeated and I would get an error for trying to end an already closed response, right? But each response seems to end fine.
Apologies for the rather long and vague question. I'm still learning and would really appreciate any advice or references I can study to help me understand what is going on. Also very open to trying different approaches (except changing web server - see note 2 below).
Note 1: I tried declaring a global data handler (note the on instead of once) for the net.Socket server as follows, but couldn't figure out how to forward the returned data to each http response?
python_socket.on('data', (data_from_python) => {
// error and exception handling
// how do I get data_from_python out to each http response
// then close it in a non-blocking way?
});
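One idea I've been sketching (untested, and it assumes each 'data' event carries one complete JSON payload from Python that echoes back the request id) is to keep a map of pending responses keyed by that id and use a single global handler:
// Map of request id -> pending HTTP response object
const pending = new Map();

python_socket.on('data', (data_from_python) => {
  // Assumption: Python sends back JSON like { id: "...", result: ... }
  const message = JSON.parse(data_from_python);
  const response = pending.get(message.id);
  if (response) {
    pending.delete(message.id);
    response.setHeader('Content-Type', 'application/json');
    response.end(JSON.stringify(message.result));
  }
});

// Then inside the http.createServer callback, instead of python_socket.once(...):
// pending.set(request_id, response);
// python_socket.write(parsed_request_post_data);
Is something like this the right direction, or is there a cleaner pattern for pairing socket data events with their originating HTTP responses?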
Note 2: I'm not allowed to use a Python web server as the business wants to reuse this design in future to plug and play other services (R, Julia, C++, ...) into Node web servers.
I am starting to learn more about how this "web world" works and that's why I am taking the free code camp course. I already took front-end development and I really enjoyed it. Now I am on the back end part.
The back end is much more foggy for me. There are many things that I don't get so I would hope that someone could help me out.
First of all, I learned about the get method, so I did:
var http = require('http');
and then made a get request:
http.get(url, function callBack(response){
response.setEncoding("utf8");
response.on("data", function(data){
console.log(data);
});
});
Question 1)
So apparently this code "gets" a response from a certain URL. But what response? I didn't even ask for anything in particular.
Moving on...
The second exercise asks us to listen to a TCP connection and create a server and then write the date and time of that connection. So here's the answer:
var net = require('net');
var server = net.createServer(function listener (socket){
  var date = new Date().toString(); // the date and time of the connection
  socket.end(date);
});
server.listen(port);
Question 2)
Okay, so I created a TCP server with net.createServer() and when the connection was successful I outputted the date. But where? What actually happened when I put date inside of socket.end()?
Last but not least...
in the last exercise I was told to create an HTTP server (what?) to serve a text file every time it receives a request, and here's what I did:
var http = require('http');
var fs = require('fs');

var server = http.createServer(function callback(request, response){
  var read = fs.createReadStream(location);
  read.pipe(response);
});
server.listen(port);
Question 3)
a) Why did I have to create an HTTP server instead of a regular TCP? what's the difference?
b) What does createReadStream do?
c) What does pipe() do?
If someone could help me, trying to make the explanation easier would help me a lot since I am, as you can see, pretty dumb on this subject.
Thank you a lot!
This is a little broad for Stack Overflow, which favors focused questions that address specific problems. But I feel your pain, so…
Question 1:
http.get is roughly equivalent to requesting a webpage. The URL in the function is the page you are requesting. The response will include several things like the HTTP response code, but also (most importantly) the content of the page, which is what you are probably after. On the back end this is normally used for hitting APIs that return data rather than actual web pages, but the transport mechanism is the same.
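As a rough illustration of what comes back (example.com is just a placeholder URL):
var http = require('http');

http.get('http://example.com/', function (response) {
  console.log(response.statusCode); // the HTTP response code, e.g. 200

  var body = '';
  response.setEncoding('utf8');
  response.on('data', function (chunk) {
    body += chunk; // the content arrives in chunks
  });
  response.on('end', function () {
    console.log(body); // the full page (or API payload) you requested
  });
});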
Question 2:
When you open a socket, you are waiting for someone else to request a connection (the way you do when you use http.get()). When you output data, you are sending them a response like the one you received in question 1.
Question 3:
HTTP is a higher level protocol than TCP. This basically means it is more specific and TCP is more general (pedants will take issue with that statement, but it's an easy way to understand it). HTTP defines the things like GET and POST that you use when you download a webpage. Lower down in the protocol stack, HTTP uses TCP. You could just use TCP, but you would have to do a lot more work to interpret the requests that come in. The HTTP library does that work for you. Other protocols like FTP also use TCP, but they are different protocols than HTTP.
For this answer, you need to understand two things. An IP address is the numeric address of a website; it's the address of the server hosting the site. A domain name is a conversion from IP to a named system, which gives humans an easier way to refer to websites, so instead of typing numbers like 192.168.1.1, we can just type names (www.hotdog.com). That's what your get request is doing: it's requesting the site.
socket.end is a method you're calling. socket.end "Half-closes the socket. i.e., it sends a FIN packet. It is possible the server will still send some data" from the nodejs.org docs, so basically it half-closes your socket after sending the parameter you pass in, which is today's current date.
HTTP is Hypertext Transfer Protocol; TCP (Transmission Control Protocol) is a link between two computers.
3a HTTP is for browsers, so that's why you did it, for a web page you were hosting locally or something.
3b createReadStream() returns a new ReadStream object (see Readable Stream).
Be aware that, unlike the default value set for highWaterMark on a readable stream (16 kb), the stream returned by this method has a default value of 64 kb for the same parameter.
3c pipe:
The 'pipe' event is emitted when the stream.pipe() method is called on a readable stream, adding this writable to its set of destinations.
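Putting 3b and 3c together, a small sketch (the file name and highWaterMark value are just examples):
var fs = require('fs');
var http = require('http');

http.createServer(function (request, response) {
  // Create a readable stream for the file; highWaterMark controls the
  // internal buffer size (fs streams default to 64 kb rather than 16 kb)
  var read = fs.createReadStream('hello.txt', { highWaterMark: 64 * 1024 });

  // pipe() feeds each chunk from the readable stream into the writable
  // response as it arrives, instead of loading the whole file into memory first
  read.pipe(response);
}).listen(8000);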
I want to design a live updating Node.js Express server, perhaps having a particular route, say /update, which loads a new configuration file. The only concern I have with this right now is that the server could potentially be in any state when the update happens. If I load a new configuration file while there is a JS message being processed for a user request, at the beginning of the user request there could be one configuration and before the request completes there could be a second configuration when the new config file is loaded. The only way I can think of to prevent this is to take down the server for at least one minute (keep the server live, but prevent any incoming requests altogether) and then update the server and put it back online, but then that's not really the best form of hot reloading or live updating is it?
How can I somehow trick the JS event loop so that the config file only gets loaded once all requests have completed and delay any new requests until after the config is loaded?
One algorithm would be (a rough middleware sketch follows the list):
set a flag "starting re-configuration"
above flag prevents any new requests from being processed (using Express middleware)
check that all current requests are completed (can't think of anything better than a polling loop here)
once above check is done, load new configuration
once configuration is loaded, switch the flag from (1)
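Here is a minimal sketch of that gate; loadConfig() is a hypothetical helper that loads and applies the new configuration:
const express = require('express');
const app = express();

let reconfiguring = false;
let inFlight = 0;

// Steps 1 and 2: count in-flight requests and reject new ones while reconfiguring
app.use((req, res, next) => {
  if (reconfiguring) {
    return res.status(503).send('Reloading configuration, please retry shortly');
  }
  inFlight++;
  res.on('finish', () => inFlight--);
  next();
});

// Steps 3 to 5: set the flag, wait for in-flight requests to drain, reload, clear the flag
app.get('/update', (req, res) => {
  reconfiguring = true;
  const poll = setInterval(() => {
    // the /update request itself is still in flight, hence <= 1
    if (inFlight <= 1) {
      clearInterval(poll);
      loadConfig(); // hypothetical: load and apply the new configuration
      reconfiguring = false;
      res.send('Configuration reloaded');
    }
  }, 100);
});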
Disclaimer: I have not tried this in production. In fact, I have not tried this at all. While I believe the idea is sane, there may be hidden pitfalls along the road which are not currently known to me.
There is one thing that many Node.js developers tend to forget/not fully realise:
There can always be only one JavaScript statement executed at a time.
It does not matter that you can do async I/O or execute a function later in time. No matter how hard you try, all the JS code that you write is executed in a single thread, no parallelism. Only the underlying implementation (which is completely out of our control) can do things in parallel.
This helps us, because as long as our update procedure is synchronous, no other JS code (i.e. client response) can be executed.
Configuration live-patching
The solution to prevent configuration change mid-request is rather simple:
Each request gets its own copy of the application's configuration.
If your application's configuration lives in a JavaScript object, you simply clone that object for each new request. This means that even if you suddenly change the configuration, it will only be applied to new incoming requests.
There are tons of modules for cloning (even deep cloning) objects, but since I believe mine is best I will use this opportunity for some small self-promotion - semantic-merge.
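A minimal sketch of the per-request copy, using a naive deep clone for simplicity (the config.json file and the /greeting route are made up for this example):
const express = require('express');
const app = express();

// The "live" configuration; your update procedure replaces this object
let currentConfig = require('./config.json');

// Give every request its own snapshot of the configuration
app.use((req, res, next) => {
  req.config = JSON.parse(JSON.stringify(currentConfig)); // naive deep clone
  next();
});

app.get('/greeting', (req, res) => {
  // Even if currentConfig is swapped mid-request, req.config stays the same
  res.send(req.config.greeting);
});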
Code live-patching
This is a bit trickier, but should be generally possible with enough effort.
The trick here is to first remove/unregister current Express routes, clear Node's require cache, require the updated files again and re-register route handlers. Now Express will finish all pending requests using the old code (this is because Node cannot remove these old functions from memory as long as the code contains active object references - which it does - req and res) and use the newly required modules/routes for new incoming requests. Old code should get released from memory as soon as there are no more requests that started with the old code.
You must not use require anywhere during request processing, otherwise you risk the same problem as with changing configuration mid-request. You can of course use require in a module-level scope because that will be executed when the module itself is required, thus being synchronous.
Example:
// app/routes/users.js (or something)
// This is okay, because it is executed only once - when users.js
// itself is required
var path = require('path')
// This function is something you would put into app.use()
module.exports = function usersRoute (req, res, next) {
// Do not use require() here! It will be executed per-request!
}
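One way to wire up the re-require step without touching Express internals (a sketch; the module path and the buildRouter helper are my own illustration) is to mount a single dispatching function and swap the router it delegates to:
const express = require('express');
const app = express();

const ROUTES_PATH = require.resolve('./app/routes/users');

function buildRouter() {
  const router = express.Router();
  router.use(require(ROUTES_PATH)); // picks up whatever is currently in the require cache
  return router;
}

let currentRouter = buildRouter();

// Express only ever sees this one middleware; it delegates to the current router
app.use((req, res, next) => currentRouter(req, res, next));

function reloadRoutes() {
  // Drop the cached module so the next require() reads the updated file...
  delete require.cache[ROUTES_PATH];
  // ...and build a fresh router. In-flight requests keep using the old one.
  currentRouter = buildRouter();
}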
I think that instead of repeatedly polling the server you can use a WebSocket.
That way, when there's a change in the config file that you mentioned, the server can 'emit' a message to the users, so they refresh their data.
If you are using nodeJS and Express, this will help you:
Socket.io with NodeJS
The server will wait for the signal of some user or anybody and emit the signal to all the users, so they get the new data
Node.js:
var express = require('express');
var app = express();
var server = require('http').createServer(app);
var io = require('socket.io')(server);
var port = process.env.PORT || 3000;
server.listen(port, function () {
console.log('Server listening at port %d', port);
});
app.use(express.static("/PATH_TO_PROJECT"));
io.on('connection', function (socket) {
  socket.on('someone update data', function (data) {
    // broadcast to every other connected client
    socket.broadcast.emit('data updated', { params: "x" });
  });
});
Meanwhile, the client will be listening if there's any change:
View.js:
var socket = io();
socket.on('data updated', function (data) {
liveUpdate(data);
});
I hope I understood correctly what you asked
This is a good problem to solve.
A possible solution could be:
That you derive the controllers on each path from a parent controller. The parent controller can flag a property ON (a flag / file) when a request arrives and turn it OFF when the response is sent back.
Now subclass this parent controller for each express end-point facing the front end. If you make a request now to '/update', the update controller would know if the server is busy or not through the FLAG and send back a reply if the update was successful or not.
For update failures the front end could possibly post back to the '/update' end point with some back-off scheme.
This scheme might work for you ...
I've decided to learn node, and so I'm following, to begin with, The Node Beginner Book. As in, I guess, a lot of other resources, there is the "simple HTTP server" first step, something like:
var http = require("http");
http.createServer(function(request, response) {
response.writeHead(200, {"Content-Type": "text/plain"});
response.write("Hello World");
response.end();
}).listen(8888);
As I understand it, when someone, in this case me through localhost:8888, makes a request, an event is triggered, and the anonymous function that got passed to http.createServer gets fired. I put here the documentation that I've managed to find about http.createServer for anyone that finds it useful:
http.createServer([requestListener])
Returns a new web server object.
The requestListener is a function which is automatically added to the 'request' event.
(from the node.js site)
I couldn't find or figure out how this triggered function gets its parameters passed, and how to find out about it. So... how do I know where these parameters come from, what methods they offer, etc.?
Thanks in advance!
In JavaScript, functions can be passed into methods as a parameter. Example:
function funcA(data) {
console.log(data);
}
function funcB(foo) {
foo("I'm function B"); // Call 'foo' and pass a parameter into that function
}
funcB(funcA); // Pass funcA as a parameter into funcB
What you're doing with http.createServer is the above: passing a function that can accept parameters. A new server expects you to pass in a function that it can call. The server will do its internal work, in which it creates a request and a response object, and then call the function you passed in with those objects.
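Conceptually (a toy sketch only, not how the http module is actually implemented), the server does something like this internally:
// Toy illustration: fakeCreateServer stands in for http.createServer
function fakeCreateServer(requestListener) {
  // ... network plumbing would happen here ...
  // When a request arrives, the server builds the two objects itself:
  var request = { method: 'GET', url: '/' }; // simplified stand-in
  var response = { end: function (body) { console.log('sending:', body); } };
  // ...and hands them to YOUR function:
  requestListener(request, response);
}

fakeCreateServer(function (req, res) {
  res.end('Hello World');
});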
Read about the Http Event: Request for details about these parameters.
This should be the creation call stack:
https://github.com/joyent/node/blob/master/lib/http.js#L62 > https://github.com/joyent/node/blob/master/lib/_http_server.js#L253
so if a request is fired, this should get triggered: https://github.com/joyent/node/blob/master/lib/_http_server.js#L502 - or maybe this: https://github.com/joyent/node/blob/master/lib/_http_server.js#L505
The node.js documentation explains pretty much everything you need to know about an http.IncomingMessage (the request) and an http.ServerResponse (the response), including methods and events.
If you need information about the HTTP protocol in general, you can find a lot of resources by googling it, such as the HTTP Wikipedia page.
If you want to see in details how HTTP is implemented in node, you'll have to jump into the node.js source code.
Also, you might be interested in express.js, which is the most used web framework for node, hence a lot of resources about it are available online.