Error trying to terminate worker thread using setTimeout() - javascript

When trying to terminate a worker thread using setTimeout() I get the following error:
node:internal/worker:361
this[kHandle].stopThread();
^
TypeError: Cannot read properties of undefined (reading 'stopThread')
at Timeout.terminate [as _onTimeout] (node:internal/worker:361:19)
at listOnTimeout (node:internal/timers:564:17)
at process.processTimers (node:internal/timers:507:7)
Node.js v18.12.0
Here is my code:

const { Worker } = require("node:worker_threads")
const worker = new Worker("./test.js")

const setAppTimeout = (workerName, seconds) => {
  setTimeout(workerName.terminate, seconds * 1000)
}

setAppTimeout(worker, 1)
However, when I terminate the worker like this instead, it works:
const { Worker } = require("node:worker_threads")
const worker = new Worker("./test.js")

const setAppTimeout = (workerName, seconds) => {
  setTimeout(terminateWorker, seconds * 1000, workerName)
}

const terminateWorker = (workerName) => {
  workerName.terminate()
}

setAppTimeout(worker, 1)
Can someone tell me why this is the case?

It's because you are losing the this context of worker when passing worker.terminate to setTimeout.
What you can do is to .bind the .terminate method to the worker object and pass the bound function:
const setAppTimeout = (workerName, seconds) => {
  setTimeout(workerName.terminate.bind(workerName), seconds * 1000);
}
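An equivalent sketch (not from the original answer) uses an arrow function wrapper instead of .bind; since terminate() is then invoked as a method on workerName, the this context is preserved the same way:

const setAppTimeout = (workerName, seconds) => {
  // The arrow function defers the call until the timer fires;
  // workerName.terminate() runs as a method, so `this` is the worker.
  setTimeout(() => workerName.terminate(), seconds * 1000);
}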
Read more about this and about Function.prototype.bind on MDN.

Related

Cloudflare websocket returns only one value of Date.now()

OK, so I made a websocket in Cloudflare Workers which stores a connecttime variable from Date.now() when the client connects, and takes another timestamp when it disconnects. It then calculates the difference and logs "client disconnected after a connection of --ms", where -- is the difference between the two times.
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request))
})

async function handleRequest(request) {
  const connecttime = Date.now();
  const upgradeHeader = request.headers.get('Upgrade');
  if (!upgradeHeader || upgradeHeader !== 'websocket') {
    return new Response('Expected Upgrade: websocket', { status: 426 });
  }
  const webSocketPair = new WebSocketPair();
  const [client, server] = Object.values(webSocketPair);
  server.accept();
  // Do something
  server.addEventListener('close', event => {
    const endtime = Date.now();
    console.log(endtime);
    const contime = endtime - connecttime;
    console.log("client disconnected after a connection of " + contime + "ms");
  })
  // complete the upgrade by handing the client end of the socket back
  return new Response(null, { status: 101, webSocket: client });
}
The problem is that the logged difference is always 0:
client disconnected after a connection of 0ms
This is by design and part of the security model of Cloudflare Workers - from the documentation:
Workers is designed to make it impossible for code to measure its own
execution time locally. For example, the value returned by Date.now()
is locked in place while code is executing. No other timers are
provided.
For a full explanation, I recommend Cloudflare's documentation on the Workers security model.
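To illustrate the behaviour (a minimal sketch, not the asker's code): within a single synchronous run of Worker code, every Date.now() call returns the same frozen value, so any locally computed duration comes out as 0.

async function handleRequest(request) {
  const t1 = Date.now();
  for (let i = 0; i < 1e6; i++) { /* burn some CPU */ }
  const t2 = Date.now();
  // In a Worker, t2 - t1 is 0: the clock does not advance while code runs.
  return new Response("elapsed: " + (t2 - t1) + "ms");
}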

NodeJs: How to handle delayed errors in streams

I have the following situation.
const { PassThrough } = require("stream");

function emitErrorInStream() {
  let resultStream = new PassThrough();
  let testStream = new PassThrough();
  testStream.on("error", () => {
    throw new Error("AA");
  });
  // the setTimeout simulates what is actually happening in the code.
  /*
   * actual code
   * let testStream = s3.getObject(params).createReadStream();
   * if I pass in an incorrect parameter option to the getObject function
   * it will be a few milliseconds before an error is thrown and subsequently
   * caught by the stream's error handling method.
   */
  setTimeout(() => { testStream.emit("error", "arg"); }, 100);
  return testStream.pipe(resultStream);
}

try {
  let b = emitErrorInStream();
}
catch (err) {
  console.log(err) // error will not be caught
}
// ... continue
I have tried a slew of things to catch the error thrown inside the error handler. I have tried using promises, which never resolve. How can I catch the error thrown inside testStream's error handler?
I have found that emitting an end event inside the on("error") handler partially solves my issue, as it keeps the application from crashing. It is not a recommended solution: https://nodejs.org/api/stream.html#stream_event_end_1
Lastly, is catching this error possible if emitErrorInStream is a third-party function to which I do not have access?
Any insights would be greatly appreciated.
// actual typescript code
downloadStream(bucketName: string, filename: string): Stream {
  const emptyStream = new PassThrough();
  const params = { Bucket: bucketName, Key: filename };
  const s3Stream = this.s3.getObject(params).createReadStream();
  // listen to errors returned by the service, i.e. the specified key does not exist.
  s3Stream.on("error", (err: any) => {
    log.error(`Service Error Downloading File: ${err}`);
    // Have to emit an end event here.
    // Cannot throw an error as it is outside of the event loop
    // and can crash the server.
    // TODO: find better solution as it is not recommended https://nodejs.org/api/stream.html#stream_event_end_1
    s3Stream.emit("end");
  });
  return s3Stream.pipe(emptyStream);
}
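For illustration only (this is not part of the original post, and downloadToStream is a made-up name): because the error is emitted after the function has already returned, no try/catch around the call can see it; it has to be observed asynchronously, for example by forwarding it to the piped stream and attaching an "error" listener on the caller's side.

const { PassThrough } = require("stream");

// Stand-in for s3.getObject(params).createReadStream() from the question.
const s3Stream = new PassThrough();
setTimeout(() => s3Stream.emit("error", new Error("NoSuchKey")), 100);

function downloadToStream(source) {
  const resultStream = new PassThrough();
  // Forward the delayed error to the returned stream instead of
  // swallowing it with a fake "end" event.
  source.on("error", (err) => resultStream.destroy(err));
  return source.pipe(resultStream);
}

// Caller side: a try/catch cannot catch an error emitted later,
// but an "error" listener (or stream.finished / stream.pipeline) can.
const out = downloadToStream(s3Stream);
out.on("error", (err) => console.log("caught delayed error:", err.message));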

How to let a webworker do multiple tasks simultaneously?

I am trying to let a Web Worker manage its state while serving multiple async requests.
worker.ts file:

let a = 0; // this is my worker's state
let worker = self as unknown as Worker;
worker.onmessage = (e) => {
  console.log("Rec msg", e.data);
  if (e.data === "+1") {
    setTimeout(() => {
      a = a + 1;
      worker.postMessage(a);
    }, 3000);
  } else if (e.data === "+2") {
    setTimeout(() => {
      a = a + 2;
      worker.postMessage(a);
    }, 1000)
  }
}
And this is my main file, main.ts:

let w = new Worker("./worker.ts", { type: "module" });
let wf = async (op: string) => {
  w.postMessage(op);
  return new Promise<any>((res, rej) => {
    w.onmessage = res;
  });
};

(async () => {
  let f1 = await wf("+1");
  console.log("f1", f1.data);
})();

(async () => {
  let f2 = await wf("+2");
  console.log("f2", f2.data);
})()
Only f2 is returned, and f1 is lost.
I have used timeouts to simulate some async work done by the worker itself.
How do I receive both f1 and f2?
Your problem is that you are trying to take an event-based API and use it as a Promise-based one, but events may fire multiple times, while a Promise should resolve only once.
The communication between the Worker and the main thread works by sending and receiving messages, but by default there is no one-to-one relation between these messages. Both ends of the communication (ports) will simply stack incoming messages and handle them sequentially, when they get time.
In your code, the main thread's worker.onmessage handler for f1 has been overwritten by the second call for f2 synchronously (one microtask later, but that's still synchronous for our purposes).
You could attach your handler using the addEventListener method; at least that way it wouldn't be overwritten. But even then, when the first message event fires on worker, both handlers will think it's their own message that arrived, while in fact it was the one for f2. So that's not what you need...
What you need is to set up a communication protocol which allows both ends to identify each task. You could for instance wrap each task's data in an object containing a .UUID member, make sure both ends wrap their messages this way, and then from the main thread check that UUID to resolve the appropriate Promise.
But that can become a bit complicated to implement and to use.
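For illustration only (a rough sketch of that UUID-style protocol, not code from this answer; the pending map and field names are made up):

// main thread (sketch)
let nextId = 0;
const pending = new Map();

w.addEventListener("message", (e) => {
  const { id, result } = e.data;        // the worker must echo back the id
  const resolve = pending.get(id);
  if (resolve) {
    pending.delete(id);
    resolve(result);
  }
});

const wf = (op) => {
  const id = nextId++;
  return new Promise((resolve) => {
    pending.set(id, resolve);
    w.postMessage({ id, op });          // wrap the task data with its id
  });
};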
My personal favorite way is to create a new MessageChannel per task. If you don't know this API, I invite you to read this answer of mine explaining the basics.
Since we are sure the only message that will come through this MessageChannel is the response from the Worker to the one task we sent to it, we can await it just like a Promise.
All we have to do, is to make sure that in the Worker thread we respond through the transferred port instead of the global scope.
const url = getWorkerURL();
const worker = new Worker(url);

const workerFunc = (op) => {
  // we create a new MessageChannel
  const channel = new MessageChannel();
  // we transfer one of its ports to the Worker thread
  worker.postMessage(op, [channel.port1]);
  return new Promise((res, rej) => {
    // we listen for a message from the remaining port of our MessageChannel
    channel.port2.onmessage = (evt) => res(evt.data);
  });
};

(async () => {
  const f1 = await workerFunc("+1");
  console.log("f1", f1);
})();

(async () => {
  const f2 = await workerFunc("+2");
  console.log("f2", f2);
})();

// SO only
function getWorkerURL() {
  const elem = document.querySelector('[type="worker-script"]');
  const script = elem.textContent;
  const blob = new Blob([script], { type: "text/javascript" });
  return URL.createObjectURL(blob);
}
<script type="worker-script">
  let a = 0;
  const worker = self;
  worker.onmessage = (evt) => {
    const port = evt.ports[0]; // this is where we will respond
    if (evt.data === "+1") {
      setTimeout(() => {
        a = a + 1;
        // we respond through the 'port'
        port.postMessage(a);
      }, 3000);
    }
    else if (evt.data === "+2") {
      setTimeout(() => {
        a = a + 2;
        // we respond through the 'port'
        port.postMessage(a);
      }, 1000);
    }
  };
</script>

Parse Scheduled Background Jobs

I am trying to make an app with daily-quote logic that shows quotes. It should pick a random object from my Parse class and show it to the user. Once users have seen today's object, they should not be able to see a different random object on the same day.
I built this algorithm in Swift, but I think Cloud Code and a Background Job are the cleaner and more correct way to do it. I researched background job tutorials and guides, but I couldn't get it working because I don't have enough JavaScript knowledge. Anyway, I created a Background Job on my Parse Server like this:
Parse.Cloud.define('todaysMentor', async (request) => {
  var Mentor = Parse.Object.extend('Mentor');
  var countQuery = new Parse.Query(Mentor);
  const count = await countQuery.count();
  const query = new Parse.Query('Mentor');
  const randomInt = Math.floor(Math.random() * count);
  query.equalTo('position', randomInt);
  query.limit(1); // limit to at most 10 results
  const results = await query.find();
  const Today = Parse.Object.extend('Today');
  const today = new Today();
  today.set('mentor', results[0]);
  today.save()
    .then((today) => {
      // Execute any logic that should take place after the object is saved.
    }, (error) => {
    });
  return results;
});

Parse.Cloud.job('pickTodaysMentor', async function(request) {
  const { params, headers, log, message } = request;
  Parse.Cloud.run('todaysMentor', (request) => {
    if (!passesValidation(request.object)) {
      throw 'Ooops something went wrong';
    }
  });
});
I want to get a random Mentor object from my Mentor class and add it to the Today class. This way I can fetch the Today object in my mobile apps. The first function works well when I call it from Swift.
My server logs look like this:
May 13, 2019, 22:22:45 +03:00- ERROR
(node:41) UnhandledPromiseRejectionWarning: TypeError: Parse.Cloud.run is not a function
at Parse.Cloud.job (/opt/app-root/src/repo/cloud/functions.js:28:19)
at Object.agenda.define [as fn] (/opt/app-root/src/build/agenda/agenda.js:74:25)
at process._tickCallback (internal/process/next_tick.js:68:7)
May 13, 2019, 22:22:45 +03:00- ERROR
(node:41) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 4)
I googled this error and learned that Parse Server 3.0 changed some of the function syntax. How can I fix this? Or do you have any suggestions for implementing this algorithm?
Thank you!
I'd suggest going with something like this:
async function todaysMentor() {
  var Mentor = Parse.Object.extend('Mentor');
  var countQuery = new Parse.Query(Mentor);
  const count = await countQuery.count();
  const query = new Parse.Query('Mentor');
  const randomInt = Math.floor(Math.random() * count);
  query.equalTo('position', randomInt);
  query.limit(1); // limit to at most 10 results
  const results = await query.find();
  const Today = Parse.Object.extend('Today');
  const today = new Today();
  today.set('mentor', results[0]);
  await today.save();
  return results;
}

Parse.Cloud.define('todaysMentor', async (request) => {
  return await todaysMentor();
});

Parse.Cloud.job('pickTodaysMentor', async function(request) {
  return await todaysMentor();
});
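For completeness (not part of the original answer): a JavaScript client could trigger the same cloud function much like the Swift app does, assuming the Parse JS SDK is initialized.

// Client-side sketch using the Parse JS SDK
Parse.Cloud.run('todaysMentor')
  .then((results) => console.log('Picked mentor:', results[0]))
  .catch((error) => console.error('Cloud function failed:', error));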

Why does setTimeout behave this way in my stream implementation?

The final line of this code successfully calls the _read method of a custom Duplex stream in node.
const timeContext = new TimeContext(sampleRate);
const input = new InputStream(timeContext); // Stream.Readable
const throttle = new Throttle(sampleRate); // Stream.Transform
const stackSource = [];
const stack = new StackStream(stackSource); // Stream.Duplex
input.pipe(throttle).pipe(stack);
stack.read(); // This will call the _read method of StackStream
When I add a setTimeout to delay the stack.read() call, setTimeout's callback does NOT get called:
const timeContext = new TimeContext(sampleRate);
const input = new InputStream(timeContext); // Stream.Readable
const throttle = new Throttle(sampleRate); // Stream.Transform
const stackSource = [];
const stack = new StackStream(stackSource); // Stream.Duplex
input.pipe(throttle).pipe(stack);

setTimeout(() => {
  stack.read(); // This callback never gets called
}, 1000);
It definitely does get called, but something else is erroring:
setTimeout(() => {
  console.log('We got here');
  stack.read(); // This is what is crashing in your code
  console.log('We don\'t get here');
}, 1000);
It is just not behaving as you expect because some other error is occurring. Look in the console to see what errors are raised.
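A small diagnostic sketch (not from the original answer): wrapping the call in try/catch and listening for the stream's "error" event makes the underlying failure visible instead of letting it silently kill the callback.

setTimeout(() => {
  try {
    stack.read();
  } catch (err) {
    // A synchronous throw from inside _read surfaces here.
    console.error('read() threw:', err);
  }
}, 1000);

// Errors the stream emits asynchronously surface here instead.
stack.on('error', (err) => console.error('stream error:', err));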
It looks like the read() function is a local property of the stack object, and the setTimeout callback is not able to see this local property of the stack object. That's why it's behaving this way.
Refer to this answer for reference:
https://stackoverflow.com/a/4536268/10371717
