Currently, the code attaching the onClick events looks like this:
$("#LieblingsButton").click(function(){
reactivateFavSeatCheck()
})
$("#LieblingsButton").click(function(){
checkForWeekReservationByFavSeatButton()
})
$("#LieblingsButton").click(function(){
fetchDataFromDatabase()
})
$("#LieblingsButton").click(function(){
executeReservation()
})
fetchDataFromDatabase() does some async work, but this is already taken care of by async/await and promises.
executeReservation() shall ONLY start once fetchDataFromDatabase() has finished executing.
Currently, everything is working. But I fear that this might only be the case because the circumstances allow for it. What if fetchDataFromDatabase() takes a few ms "too long"?
I have already learned that when you add multiple event handlers to an element via jQuery (what about with native JS?), they fire in the order in which you attached them in your code. But I don't know whether this "rule" also means that handler 2 will only fire once handler 1 has finished executing.
And besides, does the following code do the same as the code above (from a functional perspective)?
$("#LieblingsButton").click(function(){
reactivateFavSeatCheck()
checkForWeekReservationByFavSeatButton()
fetchDataFromDatabase()
executeReservation()
})
First things first: JS execution is single-threaded by nature. No matter how asynchronous your code appears to be, only one part of it is running at any given time. You can read more about this here: https://developer.mozilla.org/en-US/docs/Web/JavaScript/EventLoop
Secondly, event listeners are triggered in a loop, in order of when they were attached. You can think of it sort of like this:
handlers.forEach((handler) => {
try { handler(event); } catch (e) { /* put warning in console */ }
});
But I fear that this might only be the case because the circumstances allow for it. What if fetchDataFromDatabase() takes a few ms "too long"?
With the following test, you can observe how a while loop in the first event listener blocks the second one from firing until the loop is done, confirming your suspicion.
Note: I did not embed it as a snippet because the snippet's overriding of console somehow broke this example. Just paste it into your browser console.
// $(e).click(fn) roughly equals e.addEventListener('click', fn);
window.addEventListener('test', () => {
console.log('first handler', new Date());
const time = Date.now();
while (Date.now() - time < 2000) {}
});
window.addEventListener('test', () => {
console.log('second handler', new Date());
});
window.dispatchEvent(new Event('test'));
However...
If you are doing work asynchronously, things get much better.
window.addEventListener('test', () => {
setTimeout(() => {
console.log('first handler');
}, 1000);
});
window.addEventListener('test', () => {
console.log('second handler');
});
window.dispatchEvent(new Event('test'));
With this example you can see that although the first event handler schedules a timer, this does not block the next event listener from running. The same is true if you were to, say, make an XHR request.
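For illustration, here is the same test with a network request instead of a timer; the URL below is just a placeholder. The second handler still fires immediately while the response is pending:
window.addEventListener('test', () => {
  // fetch() returns right away with a promise; the response is handled later
  fetch('/some-endpoint') // placeholder URL
    .then(() => console.log('first handler: response arrived'))
    .catch(() => console.log('first handler: request failed'));
});
window.addEventListener('test', () => {
  console.log('second handler');
});
window.dispatchEvent(new Event('test'));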
So finally, armed with this information, we can say that it is actually better to use a single event listener, like in your second snippet.
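Applied to your case, a minimal sketch (assuming fetchDataFromDatabase returns its promise, as your async/await setup suggests) could look like this, so that executeReservation only starts once the database work has finished:
$("#LieblingsButton").click(async function () {
  reactivateFavSeatCheck();
  checkForWeekReservationByFavSeatButton();
  await fetchDataFromDatabase(); // wait for the async work to finish
  executeReservation();          // only runs once the data has arrived
});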
Related
I'm working with React, and I have a problem synchronizing successive clicks (onClick events):
My button triggers a function that calls two functions, resetViews and chargeViews (chargeViews takes time), so when I click twice rapidly:
resetViews executes fine (it sets viewsRef.current to null),
then chargeViews is called, but since it takes time it doesn't update viewsRef.current right away.
On the second click, resetViews is called again, but viewsRef.current is still null (because chargeViews hasn't updated it yet), so it does nothing,
then I get two delayed executions of chargeViews,
so after two clicks I end up with one execution of resetViews and two delayed executions of chargeViews.
What I want is: after a click, block everything (if the user clicks again we do nothing), execute resetViews and then chargeViews, and then unblock to receive another click and do the same work.
<MyButton onClick={() => {
resetViews();
chargeViews();
}}>
Click me
</MyButton>
The function that takes time:
const chargeViews = () => {
  if (!viewsRef.current) {
    ...
    reader.setUrl(url).then(() => {
      reader.loadData().then(() => {
        ...
        viewsRef.current = reader.getData();
      });
    });
  }
};
The function that gets ignored if I click too fast (it works fine if I click, wait a little bit, and then click again, but if I click twice quickly it is ignored):
const resetViews = () => {
  if (viewsRef.current) {
    ...
    viewsRef.current = null;
  }
};
I'm not totally sure I grasp the whole issue... this would be a comment if it didn't require such a long text.
Anyway, since you need to disable the button once it's clicked, you should deal with it at the beginning of its onClick handler (assuming the handler receives the event as e):
$(e.target).prop('disabled', true);
and reset it at the end:
$(e.target).prop('disabled', false);
In general, what you did was correct in terms of calling a number of functions to be executed in a chain inside the handler. But those two functions seem to involve promise invocations; in that case they won't be executed as a chain in which each waits for the previous one to finish.
So you should pass the second invocation as a callback to the first, so that it is called ONLY once the first one has finished its job.
I hope someone will go straight to the point and suggest how to "await" an async function so that, whatever it does, its invocation is waited for to complete before the next statement is evaluated. Usually it's just a matter of adding await before the call, but there are some caveats.
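Put together, a rough sketch of what this answer describes might look like the following (the done-callback passed to chargeViews is hypothetical; your function would have to accept one and invoke it once loading is complete):
<MyButton onClick={(e) => {
  $(e.target).prop('disabled', true);    // block further clicks while we work
  resetViews();
  chargeViews(() => {                    // hypothetical callback, invoked once loading has finished
    $(e.target).prop('disabled', false); // unblock the button again
  });
}}>
  Click me
</MyButton>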
First, you need to convert your Promise-utilizing functions into async functions and then await them when invoking them.
This will make it easier to control the order of execution:
const chargeViews = async () => {
if(!viewsRef.current){
...
await reader.setUrl(url);
await reader.loadData();
...
viewsRef.current = reader.getData();
}
}
Then, you need an isExecuting ref that will be true while an invocation is executing and false when none is currently executing:
const isExecuting = useRef(false);
const handleClick = async () => {
if (!isExecuting.current) {
// block other clicks from performing actions in parallel
isExecuting.current = true;
try {
resetViews();
await chargeViews();
} finally {
// unblock other clicks
isExecuting.current = false;
}
}
};
Lastly, use the newly-created handleClick function in your JSX:
<MyButton onClick={handleClick}>
Click me
</MyButton>
I am following https://www.youtube.com/watch?v=Bv_5Zv5c-Ts, where it is explained that the JS engine's event loop places events in the Event Queue, and their handlers are executed only once the Execution Stack is empty. The author even shows an example of this at 1:45:18. The code is:
function waitThreeSeconds() {
var ms = 3000 + new Date().getTime();
while(new Date() < ms) {}
console.log('finished function');
}
function clickHandler() {
console.log('click event!')
}
document.addEventListener('click', clickHandler);
waitThreeSeconds()
console.log('finished execution')
When I run it in the browser, the while loop runs for 3 seconds as expected, and then the two messages get printed.
If I click anywhere AFTER the while loop finishes, a message "click event!" gets printed. However, if I click DURING the while loop, the click event is never registered. In the video, in both situations, the click is registered.
Is it due to some updates in the JavaScript engines that happened since the video's premiere in 2015? I'm using the MS Edge browser.
The video's author suggests that even though JS is a single-threaded language, web browsers implement JS interpreters/engines in a concurrent way, where events get added to the Event Queue separately from the JS code's execution.
My experiment confused me since it shows different behavior. Could someone explain why that is?
//EDIT
After further experimentation, I found out that the behavior seen in the video is also to be found in Chromium (and I guess Chrome as well, although I do not have it installed).
Edge, however (the Chromium-based one), behaves differently and does not register the click at all during the while loop.
JavaScript in the browser runs asynchronously, not truly concurrently. What you've implemented here is what is traditionally called a "busy-wait" loop. This approach is blocking, meaning that the later code will not execute until the loop is done. This is a good thing, though: imagine how confusing it would be if your code just executed out of order.
If you want to utilize the browser's built-in asynchronous capabilities, you'll need to use one of the functions provided to interact with JavaScript's event loop.
In this case, you would likely want to use setTimeout() to actually make this properly asynchronous.
function waitThreeSeconds() {
setTimeout(function() {
console.log('finished function');
}, 3000);
}
function clickHandler() {
console.log('click event!')
}
document.addEventListener('click', clickHandler);
waitThreeSeconds()
console.log('finished execution')
Other similar functions are setImmediate() (non-standard; available in Node.js and old versions of IE), which executes a callback as soon as the stack is empty, and setInterval(), which executes a function repeatedly at a regular interval.
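As a small standalone illustration (not tied to the snippet above), setInterval() keeps the page responsive between ticks, and clearInterval() stops it:
let ticks = 0;
const timer = setInterval(function() {
  ticks += 1;
  console.log('tick', ticks);      // other events can still be handled between ticks
  if (ticks === 3) {
    clearInterval(timer);          // stop after three ticks
    console.log('finished function');
  }
}, 1000);
console.log('finished execution'); // logs immediately, before the first tick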
JavaScript doesn't run concurrently in a browser tab, so whenever you run a for/while loop the thread is blocked, and the loop must complete before other events can be handled, in this case your event listeners.
So while the while loop is running, no event will be processed until the loop finishes.
I tried running it in Chrome and it works the way it should: both listeners fire after the loop.
The while loop is blocking, and will prevent the event queue from progressing. There is a similar question here! A good approach is to replace the while loop with a setTimeout(function(){},3000) or setInterval(function(){},3000)
If you use a while loop, you will be blocked from doing anything for those 3 seconds. Instead, you need to use a timeout. This allows you to keep doing things while the timeout is running.
let activeTimer = null;
function waitThreeSeconds() {
if (activeTimer) {
console.log('stopped function execution');
clearTimeout(activeTimer);
}
console.log('began function execution');
activeTimer = setTimeout(function() {
console.log('finished function execution');
activeTimer = null;
}, 3000);
}
function nonBlockingClickHandler() {
console.log(`Click event <${activeTimer != null}>!`);
}
document.addEventListener('click', nonBlockingClickHandler);
waitThreeSeconds();
I have multiple jQuery click event handlers, each of which runs asynchronous code (e.g. an AJAX call) when clicked.
I have some code like this:
$(selector1).click((e) => {
asyncCode1();
});
$(selector2).click((e) => {
asyncCode2();
});
$(selector3).click((e) => {
asyncCode3();
});
$(selector1).click(); // runs asyncCode1()
$(selector2).click(); // runs asyncCode2()
$(selector3).click(); // runs asyncCode3()
I want $(selector2).click() to run only after $(selector1).click() has completed execution, since asyncCode1() generates elements in the DOM that will eventually be selected in $(selector2).
Should I be returning promises on each of the click handlers? I'm not sure about best practices and how to go about doing this.
Yes, you can do it with a Promise, like this:
let asyncCode1Promise;
$(selector1).click((e) => {
asyncCode1Promise = asyncCode1(); // no await here!
});
$(selector2).click(async (e) => {
await asyncCode1Promise;
asyncCode2();
});
async function asyncCode1() {
...
}
...
In that case, clicking selector2 will wait for asyncCode1 to complete (if it hasn't completed yet). And if it completed before selector2 was clicked, asyncCode2 will run without any delay.
The good thing about using promises is that you can click selector2 multiple times and it will work as expected (all clicks made before asyncCode1 completes will wait for it, and all clicks after asyncCode1 has finished will call asyncCode2 immediately).
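If you also need $(selector3).click() to wait for asyncCode2, the same idea extends; here is a sketch (assuming asyncCode2 returns a promise as well), where the selector2 handler stores its whole chain synchronously so that a later click on selector3 can await it:
let asyncCode2Promise;
$(selector2).click((e) => {
  // store the whole chain right away, so a click on selector3 always sees it
  asyncCode2Promise = (async () => {
    await asyncCode1Promise; // wait for asyncCode1 (if it's not complete yet)
    await asyncCode2();
  })();
});
$(selector3).click(async (e) => {
  await asyncCode2Promise; // waits for both asyncCode1 and asyncCode2 to finish
  asyncCode3();
});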
I have an event listener in Node JS, as shown below.
client.on('collect', async reaction => {
await external.run(reaction, ...);
});
The function I call, external.run, returns a promise and takes around 5 seconds to complete. If this event is triggered again while the previous trigger is still executing (i.e. before the 5 seconds are up), it messes with my program.
Is there a way to wait for the previous execution to finish before running the new one?
Thanks.
Yes, what you want is called a Lock in other languages ... JS doesn't provide that mechanism natively, but it's easy to write one yourself:
const createLock = () => {
let queue = Promise.resolve();
return task => queue = queue.then(() => task());
};
const externalLock = createLock();
client.on('collect', reaction => externalLock(async () => {
await external.run(reaction, ...);
}));
For sure this is only a contrived example; you might want to handle errors properly (as written, a rejected task leaves the queue rejected and blocks every later task) ... or you could just use one of the libraries out there that do this.
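For example, here is a sketch of one way to keep a rejected task from breaking the chain; the caller still sees the rejection, while the internal queue keeps going:
const createLock = () => {
  let queue = Promise.resolve();
  return task => {
    // run the task once everything queued before it has settled
    const run = queue.then(() => task());
    // keep the internal chain alive even if the task rejects;
    // callers still observe the rejection through `run`
    queue = run.catch(() => {});
    return run;
  };
};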
Is there some way to force an event handler in JS to always fire last? Say I know that some new event handlers will be added in the future, but I want to be sure that every new handler fires before my special one.
One way would be to wrap your main function in a setTimeout call with a 0 ms delay, so the JavaScript engine will queue it as a task and run it only after the currently attached handlers have finished.
document.getElementById("div1").addEventListener('click', function(e) {
setTimeout(function(){
console.log('MAIN Event');
},0);
})
document.getElementById("div1").addEventListener('click', function(e) {
console.log('dosmth2');
})
document.getElementById("div1").addEventListener('click', function(e) {
console.log('dosmth3asd');
})
P.S.: using many setTimeouts might decrease performance.