Why does Chrome send so many HTTP requests? - javascript

I am running a barebones Node.js server using only the HTTP module. I've created an HTTP server and am listening for socket connections and for requests. I noticed that when I use Chrome and go to localhost, three sockets connect and two requests are made to "/". I know that, with some other web servers, I've seen Chrome request the same thing multiple times if it does not receive a quick response (within about 5 seconds), but I am sending a response right away and Chrome still connects/requests multiple times.
Is this expected, and if it is, should I be expected to handle duplicate requests?
My relevant code:
let server = http.createServer();
server.listen({
    host: host,
    port: port
});
server.on('connection', function(socket) {
    // gets printed 3 times
    console.log('connection');
});
server.on('request', function(request, response) {
    // gets printed two times
    console.log('hi');
    // yet Chrome only receives one response (seemingly)
    response.end('hi');
});
Edit: Half solved. Now I am printing request.url and I see
/
and
favicon.ico
So there are 2 requests, but still 3 socket connections. I guess every single request is on a new socket?

Every individual image, CSS and JavaScript file on a page triggers its own HTTP request; no doubt about it. Chrome also requests /favicon.ico on its own, which explains your second request, and it often opens a spare, speculative socket that never carries a request, which would explain the third connection.

Related

How to handle tcp/ip raw requests and http requests on the same server

I am working on a GPS tracking system and have built a server on Node.js.
This is what the file looks like, for reference:
const net = require('net');
const lora_packet = require('lora-packet');
const dataParser = require('./dataParser');
const clients = [];

net.createServer(function(socket) {
    socket.name = socket.remoteAddress + ":" + socket.remotePort;
    clients.push(socket);

    socket.on('data', function(data) {
        console.log("Buffer sent by terminal : ");
        console.log(data);
        const packet = lora_packet.fromWire(data, 'hex');
        const str = packet._packet.PHYPayload.toString('hex');
        dataParser.parse_data(str, socket);
    });

    socket.on('end', function() {
        clients.splice(clients.indexOf(socket), 1);
        //broadcast(socket.name + " has left the cartel.\n");
    });

    function broadcast(message, sender) {
        clients.forEach(function(client) {
            if (client === sender) {
                client.write(message + "\nsent\n");
                return;
            }
            client.write(message);
        });
        process.stdout.write(message);
    }
}).listen(8080);

console.log("cartel is running on the port 8080\n");
console.log("cartel is running on the port 8080\n");
This server file handles only requests from the hardware and processes raw TCP/IP data.
I want the server to handle HTTP requests as well, and to incorporate a routing feature for client-side browser applications.
1) Is there any way that http requests can also be handled by the same server or should I open another port and deploy an express node js app on that?
2) If I use the same 8080 port for http, how can the routing be achieved?
3) If I use different ports for HTTP and raw TCP/IP, what would be the best way for the two servers to communicate? The communication between the TCP/IP server and the HTTP server should happen via a socket (sending data dynamically).
From the HTTP server, using a socket, data has to be sent dynamically to the browser to update the live location.
So is this flow right?
Hardware <---> TCP/IP server <---> HTTP server <---> Browser
If more information is needed to solve this, I'll provide it.
Thank you
It's very complicated to try to speak multiple protocols on the same port. It requires some sort of scheme at the beginning of each connection to sample the incoming data and identify which protocol it is and then shunt that connection off to the right code to handle that protocol. I wouldn't suggest it.
It is way, way easier to just open a second server on a different port for an Express server to field your http requests. Very simple. You can do it right in the same app. Because both servers can be in the same app, you can just directly read from one connection and write to the other. There's no need for interprocess communication.
Is there any way that http requests can also be handled by the same server or should I open another port and deploy an express node js app on that?
Open another port. No need to write another app unless you have a specific reason to use two processes. You can put both the plain TCP server and the Express server in the same node.js app.
If I use the same 8080 port for http, how can the routing be achieved?
It's not easy, and I don't suggest using the same port for multiple protocols.
If I use different ports for HTTP and raw TCP/IP, what would be the best way for the two servers to communicate? The communication between the TCP/IP server and the HTTP server should happen via a socket (sending data dynamically).
You can put both servers in the same node.js app and then you can just read/write directly from one to the other with the same code. No need for interprocess communication.
From the HTTP server, using a socket, data has to be sent dynamically to the browser to update the live location.
Sending data dynamically to a browser usually means you want the browser to hold something like a webSocket or socket.io connection to your server so you can then send data to the browser at any time over the existing connection. Otherwise, you would have to "wait" for the browser to request data and then respond with the data when it asks.

Handling long response time for REST API

I have created a JavaScript-based REST API page (a private Chrome extension) which integrates with the Oracle tool and fetches a response. It works fine if the response is received within around 3-5 minutes; however, if it takes longer it gives an ERR_EMPTY_RESPONSE error.
I have tried xhr.timeout, but it still gives the same ERR_EMPTY_RESPONSE error. How can I ask the JavaScript to wait longer?
Thanks..
If you are making an Ajax call to the server and want to increase the response wait time,
then you need to set the timeout interval on the server side.
For Node.js, here is how you can increase the timeout period on the server side.
In your app.js file (Express framework),
add the following code:
app.use(function(req, res, next) {
    // Set the timeout for requests to 24 hours
    req.connection.setTimeout(24 * 60 * 60 * 1000);
    next();
});
You can also refer to these:
HTTP keep-alive timeout
Proper use of KeepAlive in Apache Htaccess
You need to do this on the server side.

How the request response cycle in node js works with external I/O

Complete node.js beginner here. I saw this "hello world" example somewhere
// Load the http module to create an http server.
var http = require('http');

// Configure our HTTP server to respond with Hello World to all requests.
var server = http.createServer(function (request, response) {
    response.writeHead(200, {"Content-Type": "text/plain"});
    response.end("Hello World\n");
});

// Listen on port 8000, IP defaults to 127.0.0.1
server.listen(8000);

// Put a friendly message on the terminal
console.log("Server running at http://127.0.0.1:8000/");
Really simple code in which the server responds to HTTP requests with a simple HTTP response with plain text "Hello World"
I also read about a library for making HTTP requests from JavaScript:
http.get(options, function(resp) {
    resp.on('data', function(chunk) {
        // do something with chunk
    });
}).on("error", function(e) {
    console.log("Got error: " + e.message);
});
Here you make an HTTP request with some options and then do something with the response in a callback.
If one was to make such an API request when an HTTP request comes to the node.js server, what happens? Since the flow has to be single threaded, how can one change the state of the response node.js sends to the client in the callback of the HTTP API request? Won't the response be sent to the event loop already by then? How can one simulate synchronous requests in this system so that you can use the response of the API request to send a response to the client?
Since the flow has to be single threaded, how can one change the state of the response node.js sends to the client in the callback of the HTTP API request?
Because the response isn't sent synchronously with the request having been received.
Won't the response be sent to the event loop already by then?
The response isn't sent until you call res.send or similar, which doesn't have to be in the same job from the job queue that triggered your request callback — and frequently isn't.
How can one simulate synchronous requests in this system so that you can use the response of the API request to send a response to the client?
There's no need to, and doing so would kill throughput.
On any given thread (and NodeJS uses only one), JavaScript works on the basis of a job queue: The single JavaScript thread works by picking up a job from the queue, running the code for it all the way through, and then picking up the next job from the queue (or idling until one is added). When an event comes in or similar, if you've set up a handler for that event, a call to your handler is added to the job queue. (There are actually at least two layers to the job queue; see this answer for more on that if you're interested.)
It's absolutely fine if you don't respond to the "We got an HTTP request" from within the code for the job that called your handler. That's perfectly normal. The job and the request are completely disassociated from each other. And so it's fine (and normal) if you start an asynchronous process (like a get, or a readFile). Later, when the result of that process is available, a new job gets added to the queue, the JavaScript thread picks it up, and you use res.send or similar to reply to the request that's been waiting.
This is how NodeJS manages high throughput despite having only a single thread: If you use asynchronous I/O throughout, your actual code doesn't have to occupy the thread for all that long, because it doesn't wait for the I/O to complete. It can do more work on other things while the I/O is pending, and then respond to it when the I/O completes.
You need to make the outbound request from inside your server's request handler, and reply to the client once the API's response has arrived. For example, inside the createServer callback from the first snippet (where response is the still-open server response):
http.get(options, function(apiResp) {
    var chunks = [];
    apiResp.on('data', function(chunk) {
        chunks.push(chunk);
    });
    apiResp.on('end', function() {
        // Reply to the waiting client now that the API has answered
        response.end(Buffer.concat(chunks));
    });
}).on("error", function(e) {
    console.log("Got error: " + e.message);
    response.writeHead(502);
    response.end();
});

Getting an error Socket is trying to reconnect to Sails

My Sails version is 0.11.2, running on port 1337.
In assets/js/dependencies/sails.io.js, I can see the version is 0.11.0.
Below is the client side script.
<script src="http://localhost/project/assets/js/dependencies/sails.io.js"></script>
<script type="text/javascript">
// `io` is available as a global.
// `io.socket` will connect automatically, but at this point in the DOM, it is not ready yet
// (think of $(document).ready() from jQuery)
//
// Fortunately, this library provides an abstraction to avoid this issue.
// Requests you make before `io` is ready will be queued and replayed automatically when the socket connects.
// To disable this behavior or configure other things, you can set properties on `io.sails`.
// You have one cycle of the event loop to set `io.sails.autoConnect` to false before the auto-connection
// behavior starts.
io.socket.get('/hello', function serverResponded (body, JWR) {
    // JWR ==> "JSON WebSocket Response"
    console.log('Sails responded with: ', body);
    console.log('with headers: ', JWR.headers);
    console.log('and with status code: ', JWR.statusCode);
    // first argument `body` === `JWR.body`
    // (just for convenience, and to maintain familiar usage, a la `jQuery.get()`)
});
</script>
I am getting an error like
Socket is trying to reconnect to Sails...
When I checked some other posts, they said something related to the Sails version.
I tried changing the sails.io.js version to 0.11.2, but I still get the same error.
Does this error have any connection with the port?
I ask because the response from the request below is a 404:
http://localhost/socket.io/?__sails_io_sdk_version=0.11.2&__sails_io_sdk_platform=browser&__sails_io_sdk_language=javascript&EIO=3&transport=polling&t=1444654910110-52
Response
<p>The requested URL /socket.io/ was not found on this server.</p>
<address>Apache/2.2.22 (Ubuntu) Server at localhost Port 80</address>
Any idea what is wrong?
You're running the Sails app on port 1337, but loading the sails.io.js file from port 80 (because you don't specify another port):
<script src="http://localhost/project/assets/js/dependencies/sails.io.js">
I guess you have an Apache server running on port 80, so it's finding the sails.io.js file and returning it, but then the socket client assumes that it should be connecting on port 80 as well.
Either update your script tag with a port:
<script src="http://localhost:1337/js/dependencies/sails.io.js">
or specify an alternate URL for the socket to connect to, using the following code before the io.socket.get:
io.sails.url = "http://localhost:1337";
The 404 is coming from the client-side socket attempting to connect to a server that is not accepting socket connections.
If you are not using sockets in your application, you can simply delete the sails.io.js script.
Both options above didn't work for me, but this did: I deleted the sails.io.js file.

event stream request not firing close event under passenger apache

So I have an event stream in my express js node application. Here's an overview:
app.get('/eventstream', function(req, res) {
    req.socket.setTimeout(Infinity);
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive'
    });
    res.write('\n');
    req.on('close', function() {
        console.log('connection closed');
    });
});
On my local dev box, running from the command line with
node app.js
it works fine, and prints out 'connection closed' when i close my tab in my browser.
However, when running on my server, under Apache with Passenger, it doesn't print out any message - the server seems to not fire the 'close' event. I'm using this event to remove from my count of active users. Any ideas?
Cheers,
Dan
Phusion Passenger author here. The short answer is: technically, the connection hasn't closed yet.
Here's the long answer. If the client connects directly to your Node.js process, then yes, the connection is closed. But with Phusion Passenger there's a proxy in between the client and Node.js. The thing about sockets is that there are two ways to find out whether a socket has been closed: either 1) by reading end-of-file from it, or 2) by writing to it and getting an error. Phusion Passenger stops reading from the client as soon as it has determined that the request body has ended, and in the case of a GET request, that is immediately after the headers. Thus, the only way Phusion Passenger can notice that the client has closed the connection is by sending data to it. But your application never sends any data after that newline, so Phusion Passenger doesn't either, and it never notices that the connection is closed.
This issue is not limited to Phusion Passenger. If you put your Node.js app behind a load balancer or any other kind of reverse proxy, then you could also run into the same issue.
The standard solution is to regularly send "ping" messages, with the purpose of checking whether the connection is alive.
A similar issue also applies to WebSockets. It is the reason why the WebSocket protocol supports ping frames.
UPDATE February 28, 2016:
We have found a solution: Passenger 5.0.26 and later support forwarding half-close events, which fixes the problem described by #coffeedougnuts. Just use 5.0.26 or later and it'll work as expected.
