Using setInterval() to do simplistic continuous polling - javascript

For a simple web app that needs to refresh parts of the data presented to the user at set intervals, are there any downsides to just using setInterval() to fetch JSON from an endpoint instead of using a proper polling framework?
For the sake of an example, let's say I'm refreshing the status of a processing job every 5 seconds.

From my comment:
I would use setTimeout and always call it again once the previous response has been received. This way you avoid possible congestion or function stacking (or whatever you want to call it) in case a request/response takes longer than your interval.
So something like this:
function refresh() {
    // make the Ajax call here; inside its response callback, schedule the next call:
    setTimeout(refresh, 5000);
    // ...
}

// initial call, or just call refresh() directly
setTimeout(refresh, 5000);
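For illustration, here is a sketch of the same pattern with fetch; the /job-status endpoint and the updateUI() helper are placeholders, not part of the original answer:

function refresh() {
    fetch('/job-status')                            // hypothetical endpoint
        .then(res => res.json())
        .then(data => updateUI(data))               // hypothetical UI hook
        .catch(err => console.error(err))           // keep polling even after an error
        .finally(() => setTimeout(refresh, 5000));  // schedule only after this cycle completes
}
refresh();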

A simple non-blocking poll function can be implemented in recent browsers using Promises:
var sleep = duration => new Promise(resolve => setTimeout(resolve, duration))
var poll = (promiseFn, duration) => promiseFn().then(
    () => sleep(duration).then(() => poll(promiseFn, duration)))

// Greet the World every second (the promise must resolve, or polling stops here)
poll(() => new Promise(resolve => resolve(console.log('Hello World!'))), 1000)
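With async/await support, the same loop can be written without explicit recursion; a sketch reusing the sleep helper above:

async function pollLoop(promiseFn, duration) {
    while (true) {              // or check a cancellation flag here if you need to stop
        await promiseFn();
        await sleep(duration);
    }
}
pollLoop(() => Promise.resolve(console.log('Hello World!')), 1000);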

You can do it like this:

var i = 0, loop_length = 50, loop_speed = 100;
function loop() {
    i += 1;
    /* Your code here... */
    if (i === loop_length) clearInterval(handler);
}
var handler = setInterval(loop, loop_speed);

This just modifies #bschlueter's answer above, and yes, you can cancel this poll function by calling cancelCallback().
let cancelCallback = () => {};

var sleep = (period) => {
    return new Promise((resolve) => {
        cancelCallback = () => {
            console.log("Canceling...");
            // send cancel message...
            return resolve('Canceled');
        }
        setTimeout(() => {
            resolve("tick");
        }, period)
    })
}

var poll = (promiseFn, period, timeout) => promiseFn().then(() => {
    let asleep = async (period) => {
        let respond = await sleep(period);
        // if you need to do something as soon as sleep finishes
        console.log("sleep just finished, do something...");
        return respond;
    }

    // check whether cancelCallback is still the empty function;
    // if so, set a timeout to run cancelCallback()
    if (cancelCallback.toString() === "() => {}") {
        console.log("set timeout to run cancelCallback()")
        setTimeout(() => {
            cancelCallback()
        }, timeout);
    }

    asleep(period).then((respond) => {
        // check whether sleep was canceled; if not, continue to poll
        if (respond !== 'Canceled') {
            poll(promiseFn, period);
        } else {
            console.log(respond);
        }
    })

    // do something1...
    console.log("do something1...");
})

poll(() => new Promise((resolve) => {
    console.log('Hello World!');
    resolve(); // you need resolve() to jump into .then()
}), 3000, 10000);

// do something2...
console.log("do something2....")

I know this is an old question but I stumbled over it, and in the StackOverflow way of doing things I thought I might improve it. You might want to consider a solution similar to what's described here, known as long polling. Another solution is WebSockets; socket.io is one of the better implementations, with the primary objective of working in all browsers.
The first solution is basically summarized as: you send a single AJAX request and wait for a response before sending an additional one; once the response has been delivered, queue up the next query.
Meanwhile, on the backend you don't return a response until the status changes. So, in your scenario, you would use a loop that continues until the status changes, then return the changed status to the page. I really like this solution. As the answer linked above indicates, this is what Facebook does (or at least has done in the past).
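A rough sketch of that flow; the /job-status route, the getJobStatus() helper, and the 30-second hold are all assumptions for illustration:

// Server (Express): hold the request open until the status changes
app.get('/job-status', async (req, res) => {
    const initial = await getJobStatus();            // hypothetical status lookup
    const started = Date.now();
    while (Date.now() - started < 30000) {           // give up after 30s; the client just retries
        const current = await getJobStatus();
        if (current !== initial) return res.json({ status: current });
        await new Promise(r => setTimeout(r, 500));  // brief pause between checks
    }
    res.json({ status: initial });
});

// Client: send the next request only after the previous response arrives
function longPoll() {
    fetch('/job-status')
        .then(res => res.json())
        .then(({ status }) => {
            console.log('status:', status);
            longPoll();                              // queue up the next query immediately
        })
        .catch(() => setTimeout(longPoll, 5000));    // back off on errors
}
longPoll();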
socket.io is basically the jQuery of WebSockets: whichever browser your users are in, you can establish a socket connection that pushes data to the page (without polling at all). This is closest to BlackBerry-style instant notifications; if you're going for instant, it's the best solution.
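For comparison, a minimal socket.io push sketch; httpServer, currentStatus, and the 'status' event name are assumptions for illustration:

// Server: push new data to connected clients, no polling involved
const io = require('socket.io')(httpServer);
io.on('connection', socket => {
    socket.emit('status', currentStatus);  // send the latest state on connect
});
// wherever the job status changes:
io.emit('status', newStatus);              // broadcast to every connected page

// Client
const socket = io();
socket.on('status', status => console.log('status:', status));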

Related

How do I queue incoming websocket events in javascript for slow execution?

I have an open WebSocket connection and it's handing out events. All good, but once a new event arrives, I need to do a whole lot of things, and sometimes events arrive so quickly one after the other that there is no time to get the work done properly. I need some sort of queue inside this function that tells the events to take it easy, process at most one per second, and otherwise wait in line until the current second elapses before continuing.
edit: No external libraries allowed, unfortunately.
ws = new WebSocket(`wss://hallo.com/ws/`);
ws.onmessage = readMessage;

async function readMessage(event) {
    print(event)
    // do important things
    // but not too frequently!
}
How do I do that?
I found this but it goes over my simple head:
"You can have a queue-like promise that keeps on accumulating promises to make sure they run sequentially:
let cur = Promise.resolve();
function enqueue(f) {
    cur = cur.then(f);
}

function someAsyncWork() {
    return new Promise(resolve => {
        setTimeout(() => {
            resolve('async work done');
        }, 5);
    });
}

async function msg() {
    const msg = await someAsyncWork();
    console.log(msg);
}

const main = async () => {
    web3.eth.subscribe('pendingTransactions').on("data", function(tx) {
        enqueue(async function() {
            console.log('1st print: ', tx);
            await msg();
            console.log('2nd print: ', tx);
        });
    });
}
main();
"
I'd honestly use something like lodash's throttle to do this. The following snippet should solve your problem. (Note that throttle drops intermediate calls rather than queueing them; with default options it still fires on the trailing edge with the most recent event.)
ws = new WebSocket(`wss://hallo.com/ws/`);
ws.onmessage = _.throttle(readMessage, 1000);

async function readMessage(event) {
    print(event)
    // do important things
    // but not too frequently!
}
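Since the question rules out external libraries, here is a minimal stand-in for _.throttle; a leading-edge-only sketch, not the full lodash semantics:

function throttle(fn, wait) {
    let last = 0;
    return function (...args) {
        const now = Date.now();
        if (now - last >= wait) {
            last = now;
            fn.apply(this, args);  // invoke at most once per `wait` ms
        }
    };
}
ws.onmessage = throttle(readMessage, 1000);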
For achieving queuing, you can make use of setTimeout in plain/core JavaScript.
Whenever you receive a message from the websocket, put the message-processing function in a setTimeout. This ensures the message is processed not immediately on receipt but with a delay; in that way you can achieve a kind of queuing.
The problem is that this does not guarantee the messages are processed sequentially in the order they are received, if that is needed.
Also, setTimeout does not guarantee exactly when the callback will run: it only guarantees that it will not run before the given delay has elapsed.
And it may not reduce the load on your message processor in a high-volume situation, since messages are queued individually and two or more callbacks can become ready to run from setTimeout within the same time frame.
An ideal way to do this is to create a queue. At a high level the code flow can look as follows:
var queue = [];

function getFromQueue() {
    return queue.shift();
}

function insertQueue(msg) { // called whenever a new message arrives
    queue.push(msg);
    console.log("Queue state", queue);
}

// can be used if one does not want to wait for previous message processing to finish
// (function executorService(){
//     setTimeout(async () => {
//         const data = getFromQueue();
//         await processData(data);
//         executorService();
//     }, 1000)
// })()

(function executorService() {
    return new Promise((res, rej) => {
        setTimeout(async () => {
            const data = getFromQueue();
            console.log("Started processing", data)
            const resp = await processData(data); // waiting for async processing of the message to finish
            res(resp);
        }, 2000)
    }).then((data) => {
        console.log("Successfully processed event", data)
    }).catch((err) => {
        console.log(err)
    }).finally(() => {
        executorService();
    })
})()

// to simulate async processing of messages
function processData(data) {
    return new Promise((res, rej) => {
        setTimeout(async () => {
            console.log("Finished processing", data)
            res(data);
        }, 4000)
    })
}

// to simulate messages received by the web socket
var i = 0;
var insertRand = setInterval(function() {
    insertQueue(i); // this must be called when a web socket message is received
    i += 1;
}, 1000)

Chrome crashes after 1 hour of running due to out of memory. JavaScript memory management issue, probably

I'm polling my API every 5 seconds to get real-time data. The code below works, but after 1 hour of running the page crashes ("Aw, Snap!", with the dinosaur image and an out-of-memory error). The data I'm collecting is quite large, and I expected that JavaScript would free the memory (garbage collection) every time the function is called again. I can see in the Chrome Task Manager that the memory footprint grows over time. Is there a way to clear the memory or keep it from growing?
data() {
    return {
        newdata: [],
    };
},
methods: {
    loadData: async function () {
        try {
            let response = await axios.get('/monitoring_data');
            if (response.status != 200) {
                await new Promise(resolve => setTimeout(resolve, 1000));
                await this.loadData();
            } else {
                // Get the data
                this.newdata = response.data.Ppahvc;
                // Call loadData() again to get the next data
                await new Promise(resolve => setTimeout(resolve, 5000));
                await this.loadData();
            }
        } catch (e) {
            await this.loadData();
        }
    },
},
mounted: function () {
    this.loadData();
},
You can use setInterval instead of calling the loadData function recursively.
You are calling loadData recursively (the function calls itself), which is the likely cause of your memory issue: each call awaits the next one, so no call ever completes and the pending frames and promise chains keep accumulating.
Try an iterative approach instead:

setInterval(async () => {
    let response = await axios.get('/monitoring_data');
    this.newdata = response.data.Ppahvc;
}, 5000);

(An async arrow function is used so that await is legal and this still refers to the Vue component.)
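Alternatively, if you want to keep the strictly sequential behaviour (never overlapping requests), a flat loop avoids the unbounded await chain. A sketch, assuming a this.polling flag that you set to false in beforeDestroy:

methods: {
    loadData: async function () {
        while (this.polling) {
            try {
                const response = await axios.get('/monitoring_data');
                if (response.status === 200) {
                    this.newdata = response.data.Ppahvc;
                }
            } catch (e) {
                // ignore and retry on the next iteration
            }
            await new Promise(resolve => setTimeout(resolve, 5000));
        }
    },
},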

Stop running task after request timeout in Express.js

Let's assume that we have the code below, which has a timeout set to 5 seconds.
router.get('/timeout', async (req, res, next) => {
    req.setTimeout(5000, () => {
        res.status(503)
        res.send()
    })

    while (true) {
        console.log("I'm alive")
    }

    res.status(200)
    res.send({msg: 'success'})
})
I know that the last two lines will never be reached, but that's not the point. The problem I want to solve is that the while loop keeps working even though the response was sent.
Is there some way to kill such still working tasks?
There are two types of long running tasks and cancelling is different for both:
1) Asynchronous tasks:
They may take a while, but they do not occupy the JavaScript engine; the engine idles while waiting for some external data (database, files, timers, whatever). In some cases (timers, for example) you can easily discard the external action, and you can trigger the cancellation as an event since the engine is not blocked and can handle it. If the async action cannot be cancelled directly (a database read, for example), you can wait until it is done and discard its result then:
class Cancelable {
    constructor() {
        this.cancelled = false;
        this.handlers = [];
    }
    onCancel(handler) { this.handlers.push(handler); }
    cancel() {
        this.cancelled = true;
        this.handlers.forEach(handler => handler());
    }
}

// inside of the request handler:
const canceller = new Cancelable;
req.setTimeout(5000, () => {
    res.status(503);
    res.send();
    canceller.cancel(); // propagate cancellation
});

// Some long running, async cancellable task
const timer = setTimeout(function() {
    res.send("done");
}, 10000 * Math.random())

// on cancellation just remove the timer
canceller.onCancel(() => clearTimeout(timer));

unCancellableAction(function callback() {
    if (canceller.cancelled) return; // exit early if it was cancelled
    res.send("done");
});
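In newer environments, the built-in AbortController gives the same propagation without a custom class. A sketch assuming Node 18+ (global fetch) and a hypothetical upstream URL:

const controller = new AbortController();
req.setTimeout(5000, () => {
    res.status(503).send();
    controller.abort();  // propagate cancellation to the pending work
});

fetch('https://upstream.example.com/slow', { signal: controller.signal })
    .then(upstream => upstream.text())
    .then(text => { if (!res.headersSent) res.send(text); })
    .catch(err => {
        if (err.name !== 'AbortError') throw err;  // AbortError just means we cancelled
    });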
2) Synchronous tasks:
You cannot cancel synchronous tasks directly, as the engine is busy doing the task and can't handle the cancellation. To make them cancellable you have to use polling: the task has to pause its job, check whether it should cancel, and then either continue or abort. In JS that can be done with generator functions (as they can yield their execution):
function runMax(time, action) {
    const gen = action(), start = Date.now();
    let done, value;
    do {
        ({ done, value } = gen.next());
    } while (!done && Date.now() < start + time)
    return value;
}

// inside the request handler:
runMax(5000, function* () {
    while (true) {
        // ... some jobs
        // yield at a safe position to allow abortion:
        yield;
    }
});
I think you need to set a flag in the timeout callback and check it inside the while loop to break out, e.g.:

let timedOut = false;
req.setTimeout(5000, () => {
    timedOut = true;
    res.status(503);
    res.send();
});

while (!timedOut) {
    // ... some jobs
}

Note that a fully synchronous loop never yields to the event loop, so the timeout callback can only set the flag if the loop awaits or otherwise yields at some point.

Resend HTTP post before error timeout

I have an Angular 4 project that uses an HTTP POST to send commands across to our backend. The issue is that sometimes a command can be sent out before the backend is fully up and running. Normally an "ERR_CONNECTION_TIME_OUT" would occur, but our embedded browser for whatever reason holds onto the post for an extremely long time (5 minutes) before giving us the error. Since a 5-minute wait is unacceptable, I need to come up with a way to re-send our HTTP POST if there isn't a response within roughly 15-30 seconds.
Here is what the current post looks like
this._http.post(this.sockclientURL, body, { headers: headers })
    .subscribe((res) => {
        let text = res.text();
        if (text.startsWith("ERROR")) {
            console.log("Sockclient Error.");
            if (this.sockclientErrorRetryCount < this.sockclientErrorRetryLimit) {
                console.log("Retrying in 3 seconds.");
                this.sockclientErrorRetryCount++;
                setTimeout(() => {
                    this.SendCommand(command, success, fail);
                }, 3000);
            }
            return;
        }
        else {
            this.sockclientErrorRetryCount = 0;
        }
        if (text == "N" || text.startsWith("N ")) {
            this._modalService.alert(this._nackLookup.convert(text));
            if (typeof fail == 'function') {
                fail(text);
            }
        }
        else {
            let deserializedCommand = command.deserialize(text);
            success(deserializedCommand);
            let repeatMillis: number = deserializedCommand.getRepeatMillis();
            if (repeatMillis && repeatMillis > 0) {
                setTimeout(() => {
                    this.SendCommand(command, success, fail);
                }, repeatMillis);
            }
        }
    },
    (err) => {
        console.log(err);
        let repeatMillis = 1000;
        setTimeout(() => {
            this.SendCommand(command, success, fail);
        }, repeatMillis);
    });
So, to recap: I have some code in place to re-attempt the command if an error occurs, but our embedded browser holds onto its timeout error for several minutes. I need something that attempts a re-send after 15-30 seconds of no response.
By default, retries are executed immediately, without waiting for a delay. A better strategy consists of waiting a bit before retrying and aborting after a given amount of time. Observables let you combine the retryWhen, delay and timeout operators to achieve this, as described in the following snippet:

this._http.post(this.sockclientURL, body, { headers: headers })
    .retryWhen(errors => errors.delay(500))
    .timeout(2000, new Error('delay exceeded'))
    .map(res => res.json());
Not 100% sure it still works in Angular4, but you should be able to do:
this._http
    .post(this.sockclientURL, body, { headers: headers })
    .timeout(15000, new Error('timeout exceeded')) // or 30000
    .subscribe((res) => { /* ... */ })
A little more information would be needed as to the entire scope of the application; however, when I get myself into situations like this I normally look at the following avenues of approach:
- Will a try/catch solve my problem? In your catch, you could redirect elsewhere; if you don't catch anything, you will often get a bit of lag.
- Is there a way to avoid the error altogether through user constraints?
- Lastly, you may want to use setInterval() over setTimeout().
Can you provide more information as to the scope of the operation?

AngularJS $q.all - wait between http calls

So I have a situation where I need to perform a bunch of http calls, then once they are complete, continue on to the next step in the process.
Below is the code which does this and works fine.
However, I now need to wait a few seconds between each of the http calls. Is there a way to pass in a timeout with my current set up, or will it involve a good bit of refactoring?
Can post more code if need be. I have tried passing a timeout config variable into the http call; however, the calls still get fired at the same time.
Any advice would be great.
Code
var allThings = array.map(function(object) {
    var singleThingPromise = getFile(object.id);
    return singleThingPromise;
});

$q.all(allThings).then(function() {
    deferred.resolve('Finished');
}, function(error) {
    deferred.reject(error);
});
Instead of using $q.all, you might want to perform the calls sequentially, each one on success of the previous, probably with the use of $timeout. You could build a recursive function.
Something like this:
function performSequentialCalls(index) {
    if (angular.isUndefined(array[index])) {
        return;
    }
    getFile(array[index].id).then(function() {
        $timeout(function() {
            performSequentialCalls(index + 1)
        }, 1000) // waiting 1 sec after each call
    })
}
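Then kick off the chain with the first index:

performSequentialCalls(0);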
Inject the required dependencies properly. This assumes array contains objects with ids, which you use to perform the API calls. It also assumes you are using $http; if you are using $resource, add $promise accordingly.
Hope that helps a bit!
You can also make the calls sequential with plain setTimeout, starting with getItemsWithDelay(0):

function getItemsWithDelay(index) {
    getFile(object[index].id).then(() => {
        setTimeout(() => {
            if (index + 1 >= object.length) { return } // stop after the last item
            getItemsWithDelay(index + 1)
        }, 5000)
    })
}
This is an awesome trick question to be asked in an interview. Anyway, I had a similar requirement and did some research on the internet, and thanks to the reference https://codehandbook.org/understanding-settimeout-inside-for-loop-in-javascript I was able to delay all the promise calls in AngularJS; the same can be applied in plain JS syntax as well.
I needed to send tasks to a TTP API, and they requested that a delay be added between each call:
_sendTasks: function(taskMeta) {
    var defer = $q.defer();
    var promiseArray = [];
    const delayIncrement = 1000 * 5;
    let delay = 0;

    for (let i = 0; i < taskMeta.length; i++) {
        // using the 'let' keyword is VERY IMPORTANT, else 'var' will send the same task in all http calls
        let requestTask = {
            "action": "SOME_ACTION",
            "userId": '',
            "sessionId": '',
        };
        // new Promise can be replaced with $q - you can try that, I haven't tested it though.
        // note the resolve() around the $http call: without it the promise never settles and $q.all never fires
        promiseArray.push(new Promise((resolve) => setTimeout(() =>
            resolve($http.post(config.API_ROOT_URL + '/' + requestTask.action, requestTask)), delay)));
        delay += delayIncrement;
    }

    $q.all(promiseArray)
        .then(function(results) {
            // handle the results and resolve at the end
            defer.resolve(results);
        })
        .catch(error => {
            console.log(error);
            defer.reject("failed to execute");
        });

    return defer.promise;
}
Note: using the 'let' keyword in the FOR loop is VERY IMPORTANT, otherwise 'var' would send the same task in all the http calls, due to the closure/context getting switched.
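A quick illustration of the closure difference the note refers to:

for (var i = 0; i < 3; i++) setTimeout(() => console.log(i), 0);  // logs 3, 3, 3
for (let j = 0; j < 3; j++) setTimeout(() => console.log(j), 0);  // logs 0, 1, 2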
