I want to kill the worker threads after the 'done' event is fired in the threads module of Node.js. How do I achieve this?
const Threads = require('threads');
const Pool = Threads.Pool;
const workerPool = new Pool();

module.exports = class JobManager {
  static bufferedJob(pathToScript, params, callback) {
    workerPool
      .run(pathToScript)
      .send(params)
      .on('done', (result, input) => {
        console.log(`Worker Job done: ${pathToScript}`);
        callback(null, result);
      })
      .on('error', (job, error) => {
        console.log(`Error in executing Worker Job: ${pathToScript}`);
        callback(job || error);
      });
  }
};
From the Node.js worker_threads docs, port.close() (added in v10.5.0):
Disables further sending of messages on either side of the connection. This method can be called when no further communication will happen over this MessagePort. The 'close' event will be emitted on both MessagePort instances that are part of the channel.
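Note that the excerpt above is about port.close() on a worker_threads MessagePort, while the question's code uses the threads package's Pool. If you are on the older threads (v0.x) API, one way to shut the workers down once every queued job has completed is the pool-level 'finished' event together with killAll(); this is a rough sketch based on that API, not a verified recipe:

const Threads = require('threads');
const Pool = Threads.Pool;
const workerPool = new Pool();

// queue jobs as before (pathToScript and params are the arguments from the question)
workerPool.run(pathToScript).send(params);

// 'finished' fires once the pool has no queued or running jobs left
workerPool.on('finished', () => {
  console.log('All jobs done, shutting down the thread pool');
  workerPool.killAll(); // terminates every worker thread in the pool (v0.x API)
});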
We use the following code to stream the results of a query back to the client:
app.get('/events', (req, res, next) => {
  try {
    const stream = db('events')
      .select('*')
      .where({ id_user: 'foo' })
      .stream()
    stream.pipe(JSONStream.stringify()).pipe(res)
  } catch (err) {
    next(err)
  }
})
While the code seems to have an excellent memory usage profile (stable/low memory usage), it creates random DB connection acquisition timeouts:
Knex: Timeout acquiring a connection. The pool is probably full. Are
you missing a .transacting(trx) call?
This happens in production at seemingly random intervals. Any idea why?
This happens because aborted requests (i.e., the client closes the browser mid-request) don't release the connection back to the pool.
First, ensure you're on the latest knex, or at least v0.21.3+, which introduced fixes to stream/pool handling.
From there on you have a couple of options:
Either use stream.pipeline instead of stream.pipe, which handles aborted requests correctly, like so:
const { pipeline } = require('stream')

app.get('/events', (req, res, next) => {
  try {
    const stream = db('events')
      .select('*')
      .where({ id_session: req.query.id_session })
      .stream()
    return pipeline(stream, JSONStream.stringify(), res, err => {
      if (err) {
        return console.log(`Pipeline failed with err:`, err)
      }
      console.log(`Pipeline ended successfully`)
    })
  } catch (err) {
    next(err)
  }
})
or listen to the close event on req and destroy the DB stream yourself, like so:
app.get('/events', (req, res, next) => {
  try {
    const stream = db('events')
      .select('*')
      .where({ id_session: req.query.id_session })
      .stream()
    // Not listening to this event will crash the process if
    // stream.destroy(err) is called.
    stream.on('error', () => {
      console.log('Stream was destroyed')
    })
    req.on('close', () => {
      // stream.end() does not seem to work, only destroy()
      stream.destroy(new Error('Aborted request'))
    })
    stream.pipe(JSONStream.stringify()).pipe(res)
  } catch (err) {
    next(err)
  }
})
Useful reading:
knex Wiki: Manually close streams. Careful, the stream.end mentioned here doesn't seem to work.
knex Issue: stream.end() does not return connection to pool
I have a script to move data from one platform to another. The source DB allows only 100 records to be fetched in a single request, so I created a routine to fetch in batches of 100, which works fine as far as I can tell.
Now I process each batch of 100 records, do the necessary transformations (which involve axios calls to fetch certain data), and create a record in Firebase Firestore.
When I run this migration in a Firebase Functions (Express/Node) environment, I get socket hang up ECONNRESET.
I know this is caused by wrong handling of promises.
Here is what my code looks like:
import { scrollByBatches } from "../helpers/migrations/apiScroll";
import { createServiceLocation } from "../helpers/locations";

const mapServiceLocationData = async (serviceLocation: any, env: string) => {
  try {
    const migratedServiceLocation: any = {
      isMigrated: true,
      id: serviceLocation._id,
    };
    if (serviceLocation.list?.length) {
      await Promise.all(serviceLocation.ids.map(async (id: string) => {
        const { data } = await dbEndPoint.priceMultiplier({ id }); // error says socket hangup on this call
        let multiplierUnit;
        let serviceType;
        if (data.response._id) {
          multiplierUnit = data.response;
          const result = await dbEndPoint.serviceType({ id: multiplierUnit.service_custom_service_type }); // error says socket hangup on this call
          if (result.data.response._id) {
            serviceType = result.data.response.type_text;
            migratedServiceLocation.logs = [...multiplierUnit.history_list_text, ...migratedServiceLocation.logs];
          }
        }
      }));
    }
    await createServiceLocation(migratedServiceLocation); // create record in destination db
  } catch (error) {
    console.log("Error serviceLocation: ", serviceLocation._id, JSON.stringify(error));
  }
  return null; // is this even necessary?
};
export const up = async () => {
  try {
    // get 100 docs from source db => process it.. => fetch next 100 => so on...
    await scrollByBatches(dbEndPoint.serviceLocation, async (serviceLocations: any) => {
      await Promise.all(
        serviceLocations.map(async (serviceLocation: any) => {
          await mapServiceLocationData(serviceLocation);
        })
      );
    }, 100);
  } catch (error) {
    console.log("Error", JSON.stringify(error));
  }
  return null; // is this even necessary?
};
The error I get in the Firebase Functions console is the socket hang up ECONNRESET mentioned above.
For clarity, here is what the fetch-by-batches helper looks like:
const iterateInBatches = async (endPoint: any, limit: number, cursor: number, callback: any, resolve: any, reject: any) => {
  try {
    const result = await endPoint({ limit, cursor });
    const { results, remaining }: any = result.data.response;
    if (remaining >= 0) {
      await callback(results);
    }
    if (remaining) {
      setTimeout(() => {
        iterateInBatches(endPoint, limit, (cursor + limit), callback, resolve, reject);
      }, 1000); // wait a second
    } else {
      resolve();
    }
  } catch (err) {
    reject(err);
  }
};

export const scrollByBatches = async (endPoint: any, callback: any, limit: number, cursor: number = 0) => {
  return new Promise((resolve, reject) => {
    iterateInBatches(endPoint, limit, cursor, callback, resolve, reject);
  });
};
What am I doing wrong? I have added comments in the code sections for readability.
Thanks.
There are two cases when socket hang up gets thrown:
When you are a client
When you, as a client, send a request to a remote server and receive no timely response, your socket is ended, which throws this error. You should catch this error and decide how to handle it: whether to retry the request, queue it for later, etc.
When you are a server/proxy
When you, as a server (perhaps a proxy server), receive a request from a client, start acting upon it (or relay the request to the upstream server), and, before you have prepared the response, the client decides to cancel/abort the request, the socket on your side is reset with the same error.
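For the client case, a minimal sketch of catching and retrying a reset request could look like the following; fetchWithRetry is a hypothetical helper (not part of your code), and the retry count and backoff are arbitrary:

const axios = require('axios');

// Retry a GET request a few times when the connection is reset.
async function fetchWithRetry(url, retries = 3) {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      return await axios.get(url);
    } catch (err) {
      const retriable = err.code === 'ECONNRESET' || /socket hang up/.test(err.message || '');
      if (!retriable || attempt === retries) throw err;
      // simple linear backoff before the next attempt
      await new Promise((resolve) => setTimeout(resolve, attempt * 500));
    }
  }
}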
I would suggest a number of possibilities for you to try and test that might help you solve your ECONNRESET issue:
If you have access to the source database, you could try looking there for some logs or metrics. Perhaps you are overloading the service; throttling how many requests you fire in parallel can help here (see the sketch after this list).

Quick and dirty solution for development: use longjohn, which gives you long stack traces that include the async operations. Clean and correct solution: technically, in Node, whenever you emit an 'error' event and nobody listens to it, the error is thrown. To make it not throw, put a listener on it and handle it yourself. That way you can log the error with more information.

You can also set NODE_DEBUG=net or use strace. They both show you what Node is doing internally.

You could restart your server and run the connection again; maybe your server crashed or refused the connection, most likely blocked by the User Agent.

You could also try running this code locally instead of in Cloud Functions, to see if there is a different result. It's possible that the RSG/Google network is interfering somehow.

You can also have a look at this GitHub issue and Stack Overflow thread to see the common fixes for the ECONNRESET issue and check whether those help resolve it.
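As mentioned in the first suggestion above, overloading the source service is a common cause: the up handler fires all 100 mapServiceLocationData calls at once, and each of those makes further axios calls. A rough, hypothetical way to throttle this (processInChunks and the chunk size are my own names, not an established API) is to process each batch in smaller chunks:

// Process items in chunks so only `chunkSize` requests run in parallel.
async function processInChunks(items, worker, chunkSize = 10) {
  const results = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    const chunk = items.slice(i, i + chunkSize);
    results.push(...(await Promise.all(chunk.map(worker))));
  }
  return results;
}

// e.g. inside the scrollByBatches callback, instead of one big Promise.all:
// await processInChunks(serviceLocations, (loc) => mapServiceLocationData(loc), 10);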
I'm having a problem with Socket.IO the moment I try to send a second time from the server to the client.
Here is the server code, with Express and Socket.IO:
io.on('connection', async function (socket) {
  let socketId = socket.id;
  const mta = new Client("20.64.24.144", 22005, "*", "*");

  mta.resources.evokestats.getPlayerCount()
    .then((result) => {
      console.log("result", result);
      socket.emit("players-start", { players: result })
    })
    .catch((err) => {
      console.error(`Ooops! Something went wrong ${err}`);
    });

  app.post('/player_connect', async function (req, res) {
    let ip = req.body[0];
    let player = await players.findOne({ ip: ip })
    if (player) {
      await socket.emit("players", { players: req.body[1] })
    } else {
      try {
        player = await players.create({ ip: ip, name: req.body[2] })
        await socket.emit("players", { players: req.body[1] })
        await socket.emit("last_24_players", { players: 1 });
      } catch (error) {
        console.log("error", error)
      }
    }
    res.send("connected")
  });
});
And here is my client, with React and Socket.IO:
useEffect(() => {
  getStats();
}, [])

async function getStats(params) {
  socket.on("players-start", function (data) {
    setNowPlayers(data.players)
  });
  socket.on("players", function (data) {
    console.log("players", data)
    setNowPlayers(data.players)
  });
}
And in my client, using React, in useEffect I listen for the "players-start" and "players" events that were emitted.
players-start: this is emitted only once, the first time I open my client, to bring in all connected players.
players: every time someone connects to the game server, a POST request is made to my server (where I use Express together with Socket.IO) at the URL '/player_connect', which then emits immediately.
The problem: whenever the 'players-start' emit fires and I then immediately join the game server, which calls the URL '/player_connect', the 'players' emit is not triggered, or at least the client never receives it.
Tests I've done:
My first attempt was to attach everything to the "players" listener, but it still doesn't work.
I really appreciate everyone's help.
I have GraphQL Subscriptions on my Apollo server that I want to close after the user logs out. The initial question is whether we should close these (socket) connections on the client side or in the backend.
On the front-end, I am using Angular with Apollo Client and I handle GraphQL subscriptions by extending the Subscription class from apollo-angular. I am able to close the subscription channels with a typical takeUntil rxjs implementation:
this.userSubscription
  .subscribe()
  .pipe(takeUntil(this.subscriptionDestroyed$))
  .subscribe(
    ({ data }) => {
      // logic goes here
    },
    (error) => {
      // error handling
    }
  );
However, this does not close the websocket on the server, which, if I'm right, will result in a subscription memory leak.
The way the Apollo Server (and Express) is set up for subscriptions is as follows:
const server = new ApolloServer({
  typeDefs,
  resolvers,
  subscriptions: {
    onConnect: (connectionParams, webSocket, context) => {
      console.log('on connect');
      const payload = getAuthPayload(connectionParams.accessToken);
      if (payload instanceof Error) {
        webSocket.close();
      }
      return { user: payload };
    },
    onDisconnect: (webSocket, context) => {
      console.log('on Disconnect');
    }
  },
  context: ({ req, res, connection }) => {
    if (connection) {
      // set up context for subscriptions...
    } else {
      // set up context for Queries, Mutations...
    }
  }
});
When the client registers a new GraphQL subscription, I always get to see console.log('on connect'); on the server logs, but I never see console.log('on Disconnect'); unless I close the front-end application.
I haven't seen any example on how to close the websocket for subscriptions with Apollo. I mainly want to do this to complete a Logout implementation.
Am I missing something here? Thanks in advance!
I based my solution on this post.
Essentially, the way we created the subscription socket was using subscriptions-transport-ws:
export const webSocketClient: SubscriptionClient = new SubscriptionClient(
  `${environment.WS_BASE_URL}/graphql`,
  {
    reconnect: true,
    lazy: true,
    inactivityTimeout: 3000,
    connectionParams: () => ({
      params: getParams()
    })
  }
);
As specified in the question, I wanted to unsubscribe all channels and close the subscription socket connection before the user logs out. We do this by using the webSocketClient SubscriptionClient in the logout function and calling:
webSocketClient.unsubscribeAll();
webSocketClient.close();
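For completeness, a sketch of where these calls could live, assuming a logout handler that also completes the subscriptionDestroyed$ subject from the question (the handler name and its surroundings are hypothetical):

function onLogout() {
  // complete the client-side rxjs streams guarded by takeUntil
  subscriptionDestroyed$.next();
  subscriptionDestroyed$.complete();
  // then tear down every GraphQL subscription and close the websocket,
  // which is what lets onDisconnect fire on the Apollo server
  webSocketClient.unsubscribeAll();
  webSocketClient.close();
  // ...clear tokens / navigate away as usual
}

With the socket closed, the server-side onDisconnect callback from the question should be reached without having to close the whole front-end application.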
I'm working on Socket.IO with React for a chat app.
When emitting a message to my server, my client doesn't receive the server's response. The console.log controlling the mechanism is never displayed.
I can't figure out why, since I followed the Socket.IO blueprint exactly.
Here is my client.js:
send = (e) => {
  e.preventDefault();
  const socket = io.connect(this.state.endpoint);
  socket.emit("message", () => {
    message: "hey !"
  })
  console.log("send ended")
}

componentDidMount() {
  const socket = io.connect(this.state.endpoint);
  socket.on("new_message", (message) => {
    console.log("new message ", message)
  })
  socket.on("user_connected", (message) => {
    console.log(message)
  })
}
Here is my server.js:
client.on("message", (message) => {
  client.emit("new_message", message)
})
Any hint would be great,
Thanks
The reason for your problem is that you essentially have multiple socket connection instances created over the life span of your client component.
From the server's perspective, "new_message" is being emitted back to the socket that you created in your component's send arrow function. Because that socket instance does not listen for "new_message", you're not going to see the expected log messages in the console.
Perhaps you could consider refactoring your client component code like this, to connect a single socket and use it as the single means of sending and listening to messages from the server:
class YourComponent extends Component {
  // Note that send is not an arrow function here, so take care with how you
  // invoke send() if your current implementation relies on `this` binding
  send(e) {
    e.preventDefault();
    const socket = this.state.socket; // UPDATE: access the socket via state
    // Send messages to the server via the same socket instance of this class
    if (socket) {
      // Pass the payload as a plain object; passing a function here would be
      // treated as an acknowledgement callback, not as data
      socket.emit("message", { message: "hey !" });
      console.log("send ended");
    }
  }

  componentDidMount() {
    const socket = io.connect(this.state.endpoint);
    socket.on("new_message", (message) => {
      console.log("new message ", message);
    });
    socket.on("user_connected", (message) => {
      console.log(message);
    });
    // UPDATE: connect the socket once, and hold a reference for reuse by the
    // component instance via the component's state
    this.setState({ socket: socket });
  }
}