I have simulated a time-consuming asynchronous operation on a node-express route using setTimeout. While the timeout was pending, node-express didn't respond to another client's requests.
What can be done to prevent the route from blocking?
This is the simulation I used:
I opened two clients and sent requests from both of them at the same time; the test was done on localhost.
I have tested it again and again, and the second client gets its responses from the server only after all ten of the first client's requests have been answered.
Client:
<button onclick='getManyData()'>Get Data</button>
<script>
function getManyData() {
    for (let i = 0; i < 10; i++) {
        getData()
    }
}
function getData() {
    fetch('/api/req1')
        .then(res => res.json())
        .then(data => {
            console.log(data)
        })
}
</script>
Server:
const express = require('express');
const app = express();

app.use(express.static('public'))

app.get('/api/req1', (req, res) => {
    setTimeout(() => {
        res.send({ ok: true })
    }, 500)
})

app.listen(3000, () => { console.log('listen on port 3000') })
This is how it looks in the browser's network tab:
Update:
I have found that this phenomenon happens only in Chrome, not in Firefox (in Firefox, all the requests are answered after approximately 500 ms). Any suggestions on why this happens?
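A possible explanation, offered as an assumption rather than a verified cause: Chrome may hold back concurrent GET requests to the same cacheable URL until it knows whether the first response can be served from cache, while Firefox fires them all at once. A quick way to test this from the client (the `bustCache` helper name is illustrative):

```javascript
// Assumption: Chrome serializes concurrent GETs to the same cacheable URL.
// Two ways to rule that out:

// 1. Ask fetch to bypass the HTTP cache entirely:
function getData() {
    return fetch('/api/req1', { cache: 'no-store' })
        .then(res => res.json())
        .then(data => console.log(data))
}

// 2. Or make each request's URL unique so the cache never applies:
function bustCache(url, i) {
    return `${url}?nocache=${i}` // throwaway query parameter
}
// e.g. fetch(bustCache('/api/req1', i)) inside the loop
```

On the server side, sending `res.set('Cache-Control', 'no-store')` before responding should have the same effect, since the responses then never become cache candidates.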
Related
I have a completed script that acts as a parser. The script is written in Node.js and works properly. It returns an array of data and also saves it to my computer.
I would like to run this script from the frontend, at the click of a button. As far as I understand, I have to send a request to the server? It was suggested to use Express for the server, but I still haven't figured out how to call a third-party script from it, much less return any data from it.
Right now all I want is for my script to run when I make a request for the root path "/" and send me JSON in response (or, for example, a JSON file).
const express = require('express')
const runParser = require("./parser");

const app = express()
const port = 3000

app.get('/', async (req, res, next) => {
    await runParser()
    next()
})

app.listen(port, () => {
    console.log(`Example app listening on port ${port}`)
})
All you need for Express is this:
const express = require('express');
const app = express();
const runParser = require("./parser");
const port = 3000;

app.get("/", (req, res) => {
    runParser().then(results => {
        res.json(results);
    }).catch(err => {
        console.log(err);
        res.status(500).send("error");
    });
});

app.listen(port, () => {
    console.log(`Server running on port ${port}`);
});
And then you can access that either by just going to:
http://localhost:3000
on your local machine, or
http://yourdomain.com:3000
in the browser, or by issuing an Ajax call to the desired URL from web-page JavaScript.
I wouldn't personally put this type of activity on a GET request to / because that can be hit by things like web crawlers, search engines, etc.
It probably belongs on a POST (so crawlers won't issue it), and I'd personally put it on some pathname such as:
app.post("/runparser", (req, res) => {
    // put your code here
});
And, then use a form submission or ajax call to that URL to trigger it.
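For completeness, here is a sketch of what that Ajax call could look like from the page; `triggerParser` is a made-up name, and the route is the `/runparser` path suggested above:

```javascript
// Sketch: trigger the hypothetical /runparser POST route from the browser.
// fetch is available in all current browsers.
function triggerParser() {
    return fetch('/runparser', { method: 'POST' })
        .then(res => {
            if (!res.ok) throw new Error(`HTTP ${res.status}`)
            return res.json()
        })
}

// Usage: triggerParser().then(results => console.log(results))
```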
I made a web server that acts as a client using socket.io-client and express (because I have to use this form in another project).
It emits the posted string, and when it receives a 'boom' emit from the io server it responds by sending the served string.
Posting 'heat_bomb' works well the first time, but when I try a second time, '[ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client' occurs at res.send(data) in socket.on().
Is there a way to refresh whenever a post request is generated, so that each request uses an independent response?
app.ts
import express from 'express'
import { io } from 'socket.io-client'
import bodyParser from 'body-parser'

const app = express()
const PORT = 8080

const socket = io(`http://localhost:2002`, {
    query: {
        hello: "merhaba"
    }
})

app.use(bodyParser.urlencoded({ extended: false }))

app.get('/', (req, res) => {
    res.sendFile(__dirname + `/index.html`)
})

app.post('/heat_bomb', (req, res) => {
    socket.emit('heat_bomb', req.body.elem)
    socket.on('boom', (data) => {
        res.send(data)
    })
})

app.listen(PORT, () => {
    console.log(`Server Running: ${PORT}`)
})
index.html
$('#heat_button').click(function() {
    console.log('heating bomb')
    $.post('/heat_bomb', { elem: $('#input_number').val() }, (data, status) => {
        console.log(data)
        console.log('heated')
    })
})
Your /heat_bomb middleware registers a new boom handler for every request on the same globally defined socket. Although your code snippet does not show how the heat_bomb and boom events are connected, I assume that the emit('boom') during the second request re-triggers the heat_bomb handler that was registered during the first request, leading to another res.send for the res from the first request, which is already completed. This leads to the observed error message.
socket.once('boom') alone will not solve this reliably: if one heat_bomb reply can overtake another, the data from the first can wrongly be paired with the res of the second. Each reply has to be tied to the specific request that triggered it (note that res itself cannot travel over the socket; it is not serializable). socket.io supports exactly this pairing through acknowledgement callbacks: pass a function as the last argument to emit, and have the io server call that function with the result instead of emitting a separate boom event:
app.post('/heat_bomb', (req, res) => {
    socket.emit('heat_bomb', req.body.elem, (data) => {
        res.send(data)
    })
})
I've been learning Node for the past week and have gotten some hold on Node and Express. But now I am facing a problem. I am trying to run multiple Express servers on different ports and want each to return a response after 10 seconds. After running the program, the servers start fine, but when I hit http://localhost:3000 or any of the servers' URLs, I observe the following:
- on the client side I get a proper response from all servers after 10 seconds
- the server gets into an infinite loop and continuously prints "returning data..." every 10 seconds
I tried using a function, and using a js file to export the server with another class importing it and calling it inside a for loop. But the server keeps printing "returning data..." every 10 seconds. Below is my code:
var express = require('express');

const data = '{"key":"value"}';

const server = function (port) {
    let app = express();
    app.get('/', (req, res) => {
        setInterval(function () {
            console.log('returning data...')
            res.end(data);
        }, 10000); // want a delay of 10 secs before server sends a response
    })
    app.listen(port, () => console.log("Server listening at http://%s:%s", "localhost", port))
}

console.log('\nStarting servers.......')
for (var i = 0; i < 5; i++) {
    server(3000 + i)
}
You need to create multiple app instances from express. Below is a code snippet that starts multiple servers on different ports from the same file.
var express = require('express');

let app1 = express();
let app2 = express();

app1.listen(3000, () => {
    console.log("Started server on 3000");
});

app2.listen(3002, () => {
    console.log("Started server on 3002");
});
As for the repeated logging: you are using setInterval instead of setTimeout, which is why the callback runs over and over. setTimeout fires once; setInterval keeps firing every 10 seconds until cleared, and every res.end() after the first targets a response that has already finished.
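The difference between the two timers can be seen in a few lines of plain Node, no Express needed:

```javascript
// setTimeout fires its callback once; setInterval keeps firing
// until you call clearInterval.
let timeoutCalls = 0
let intervalCalls = 0

setTimeout(() => { timeoutCalls++ }, 10)

const id = setInterval(() => {
    intervalCalls++
    if (intervalCalls === 3) clearInterval(id) // must be stopped by hand
}, 10)

setTimeout(() => {
    console.log({ timeoutCalls, intervalCalls }) // { timeoutCalls: 1, intervalCalls: 3 }
}, 100)
```

In the route above, replacing setInterval with setTimeout makes res.end() run exactly once per request.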
Already answered: https://stackoverflow.com/a/71831233/17576982
(3 ways to start multiple servers in one run in Node.js)
I have an Express.js (version 4.x) server, and I need to stop it correctly.
As some requests can take a long time (1-2 seconds), I must reject new connections and wait for all in-progress requests to finish. Killing the server in the middle of some tasks could leave it in an unstable state.
I've tried the following code:
// Start
var app = express();
var server = https.createServer(options, app);
server.listen(3000);

// Stop
process.on('SIGINT', function() {
    server.close();
});
However, this code doesn't close keep-alive connections, and some clients can keep their connection open for a long time.
So, how can I properly close all connections?
You could use some middleware to block new requests if the server is being shut down.
var app = express(),
    shuttingDown = false;

app.use(function(req, res, next) {
    if (shuttingDown) {
        // answer the request instead of just returning: a request left
        // unanswered keeps its socket open and prevents server.close()
        // from ever finishing
        res.set('Connection', 'close');
        return res.status(503).send('Server is shutting down');
    }
    next();
});

var server = https.createServer(options, app);
server.listen(3000);

process.on('SIGINT', function() {
    shuttingDown = true;
    server.close(function() {
        process.exit();
    });
});
I can't figure out why this code is blocking the server from fulfilling concurrent requests, since I am only using async code. Can someone shed some light?
I am using mongoose's streaming feature. The following is an Express route.
function getChart(req, res, next) {
    var stream = Model.find({}).stream({
        transform: JSON.stringify
    });
    stream.on('data', function(record) {
        res.write(record);
    });
    stream.on('end', function() {
        res.end();
    });
}
The problem can be verified when requesting around 10000 records from the database, which takes around 10 seconds. During this time I open another tab and make a request to any other route, say /home; the content of /home arrives only after the first request finishes!
EDIT
I have just run some basic tests (I even set the mongoose connection pool to a higher value) and now I know that the problem is not with mongoose, but with Node.js itself or perhaps with Express. I created the following test app:
var express = require('express'),
    http = require('http'),
    app = express();

app.get('/', function(req, res, next) {
    console.log('> Request received!');
    setTimeout(function() {
        console.log('> Request sent!');
        res.send(200);
    }, 5000);
});

app.listen(8000, function() {
    console.log(' > APP LISTENING');
    console.log(' > maxSockets: ' + http.globalAgent.maxSockets);
});
Within the 5 seconds of the first request I open 3 more tabs in Chrome and make more requests; the message > Request received! is shown only after the first ones have already been answered.
I get 5 as the number of maxSockets.
Can anyone help me?