Event execution sequence and rendering - javascript

var con = document.getElementById('con');
con.onclick = function () {
Promise.resolve().then(function Promise1() {
con.textContent = 0;
// requestAnimationFrame(() => con.textContent = 0)
});
};
<div id="con">this is con</div>
Why doesn't this code trigger rendering after performing microtasks?
setTimeout(function setTimeout1() {
console.log('setTimeout1')
}, 0)
var channel = new MessageChannel();
channel.port1.onmessage = function onmessage1() {
console.log('postMessage');
Promise.resolve().then(function promise1() {
console.log('promise1');
})
};
channel.port2.postMessage(0);
setTimeout(function setTimeout2() {
console.log('setTimeout2')
}, 0);
console.log('sync');
Why is postMessage executed before the timer?

Why doesn't this code trigger rendering after performing microtasks?
It does, otherwise you wouldn't see the text being updated...
Maybe you are not able to tell it from your dev tools?
This is probably because mouse events are now generally throttled to the screen-refresh rate, meaning that by the time the task dispatching the mouse event runs, you are already in a painting frame. Or it may be for another reason entirely, because to my knowledge it is mousemove events that are throttled this way, not click...
So there, your Promise callback will get executed synchronously (with only the sixth step, "set currentTask to null", in between), before the update-the-rendering steps kick in, and all the dev tools will see is a normal painting frame, just like they were expecting.
So maybe the dev tools won't show anything particular here, but given the broadness of your claim, it's quite hard to pinpoint a particular reason, and this is just a theory of mine.
You can try to validate this theory by calling requestAnimationFrame from inside such an event and check if it did execute in the same event loop iteration:
onclick = (evt) => {
console.clear();
setTimeout( () => console.log( 'timeout' ), 0 );
requestAnimationFrame( () => console.log( 'rAF' ) );
};
Click anywhere<br>
If "rAF" gets logged before "timeout", the click event got handled in a painting frame.
For me it does quite often in Chrome, and only once in a while in Firefox, but then again I know Chrome's rAF is broken... so this theory is quite weak.
Why is postMessage executed before the timer?
Whether this statement holds true, and the reason why it does, will depend on the User-Agent (browser) and on when the code is executed.
In Chrome, they set a minimum of 1ms on the timeout value passed to setTimeout:
base::TimeDelta interval_milliseconds =
std::max(base::TimeDelta::FromMilliseconds(1), interval);
The message task has no timeout and will thus get queued immediately. So if no other task is waiting to be processed, it will be the next one executed, long before the 1ms timeout resolves.
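If you want to see that for yourself, here is a rough measurement sketch (the exact numbers will vary per browser, machine and load, so treat it as illustrative only):
const t0 = performance.now();
setTimeout(() => {
  console.log('setTimeout(0) fired after', (performance.now() - t0).toFixed(2), 'ms');
}, 0);
const timingChannel = new MessageChannel();
timingChannel.port1.onmessage = () => {
  console.log('message fired after', (performance.now() - t0).toFixed(2), 'ms');
};
timingChannel.port2.postMessage(0);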
In Firefox, they treat tasks scheduled by setTimeout as low priority when they are scheduled during page load. That means that in Firefox the message task would actually fire after the setTimeout ones if both are scheduled after the page load:
function test() {
setTimeout(function setTimeout1() {
console.log('setTimeout1')
}, 0)
var channel = new MessageChannel();
channel.port1.onmessage = function onmessage1() {
console.log('postMessage');
Promise.resolve().then(function promise1() {
console.log('promise1');
})
};
channel.port2.postMessage(0);
setTimeout(function setTimeout2() {
console.log('setTimeout2')
}, 0);
console.log('sync');
}
console.log( 'testing # page load' );
test();
setTimeout(() => {
console.log( 'testing after page load' );
test();
}, 1000 );
/* results in Firefox:
testing # page load
sync
postMessage
promise1
setTimeout1
setTimeout2
testing after page load
sync
setTimeout1
setTimeout2
postMessage
promise1
*/
So there, in this particular case of a page load, they will treat the message task as more important than the timeout one, and when the task executor has to choose which task to execute next (as part of the first step of the Event Loop processing model), it will pick the message over the timeout.
But these are implementation quirks, and nothing in the specs formalizes this behavior.

Related

Does self.close actually kill the web worker thread?

See this example:
// CLOSEWORKER.JS:
self.postMessage('foo');
self.close();
self.postMessage('bar');
setTimeout(() => self.postMessage('baz'), 0);
// MAIN.JS:
const worker = new Worker('./worker.js');
worker.onmessage = ({data}) => console.log(data);
Even though I have invoked self.close(), the next line self.postMessage('bar'); is still executed and logs 'bar'. I can't understand that, since the close method should just kill the worker thread right after being called.
Also, see another example:
// TERMINATEWORKER.JS:
self.onmessage = ({data}) => console.log(data);
// MAIN.JS:
const worker = new Worker('./worker.js');
setTimeout(() => {
worker.postMessage('foo');
worker.terminate();
worker.postMessage('bar');
setTimeout(() => worker.postMessage('baz'), 0);
}, 1000)
When using the terminate method instead of the close method, the code executed as I expected. It just logs 'foo' and nothing else, which suggests that the thread was killed right after calling the terminate method.
WorkerGlobalScope.close(), when called, discards any tasks that have been scheduled and sets the closing flag of the WorkerGlobalScope, effectively preventing any new task from getting queued.
However, it doesn't stop the execution of the current task.
So indeed, in your first example, it will run the current script until its end and still execute the self.postMessage('bar') operation, which your main thread will be able to handle.
Worker.terminate() on the other hand will also "[a]bort the script currently running in the worker."
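If you want a self-contained way to observe both behaviors without separate files, here is a sketch that inlines the worker source through a Blob URL (purely illustrative, assuming Blob workers are allowed by the page's CSP):
const workerSource = `
  self.postMessage('foo');
  self.close();                                  // sets the closing flag and discards queued tasks
  self.postMessage('bar');                       // the current script keeps running, so 'bar' is still posted
  setTimeout(() => self.postMessage('baz'), 0);  // never delivered: no new task can run after close()
`;
const blobURL = URL.createObjectURL(new Blob([workerSource], { type: 'text/javascript' }));
const worker = new Worker(blobURL);
worker.onmessage = ({ data }) => console.log(data); // logs "foo" then "bar", never "baz"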

setTimeout() Irregular Execution Behaviour [duplicate]

The following example is given in a Node.js book:
var open = false;
setTimeout(function() {
open = true
}, 1000)
while (!open) {
console.log('wait');
}
console.log('open sesame');
Explaining why the while loop blocks execution, the author says:
Node will never execute the timeout callback because the event loop is
stuck on this while loop started on line 7, never giving it a chance
to process the timeout event!
However, the author doesn't explain why this happens in the context of the event loop or what is really going on under the hood.
Can someone elaborate on this? Why does node get stuck? And how would one change the above code, whilst retaining the while control structure, so that the event loop is not blocked and the code behaves as one might reasonably expect: 'wait' is logged for only 1 second before the setTimeout fires, and the process then exits after logging 'open sesame'?
Generic explanations such as the answers to this question about IO and event loops and callbacks do not really help me rationalise this. I'm hoping an answer which directly references the above code will help.
It's fairly simple really. Internally, node.js consists of this type of loop:
Get something from the event queue
Run whatever task is indicated and run it until it returns
When the above task is done, get the next item from the event queue
Run whatever task is indicated and run it until it returns
Rinse, lather, repeat - over and over
If at some point, there is nothing in the event queue, then go to sleep until something is placed in the event queue or until it's time for a timer to fire.
So, if a piece of Javascript is sitting in a while() loop, then that task is not finishing and per the above sequence, nothing new will be picked out of the event queue until that prior task is completely done. So, a very long or forever running while() loop just gums up the works. Because Javascript only runs one task at a time (single threaded for JS execution), if that one task is spinning in a while loop, then nothing else can ever execute.
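Here is a toy model of that loop in plain JavaScript, purely for illustration (toySetTimeout and runToyEventLoop are made-up names; node's real loop is implemented in C/C++ and has several distinct phases):
// Toy model of the scheduler described above.
const taskQueue = [];                                 // callbacks waiting to run
const toySetTimeout = (cb, ms) =>                     // "schedule" = add a task due in ms milliseconds
  taskQueue.push({ dueAt: Date.now() + ms, cb });

function runToyEventLoop() {
  while (taskQueue.length > 0) {                      // rinse, lather, repeat
    const i = taskQueue.findIndex(t => t.dueAt <= Date.now());
    if (i === -1) continue;                           // a real loop would sleep here, not spin
    const { cb } = taskQueue.splice(i, 1)[0];         // get the next item from the queue
    cb();                                             // run it until it returns - nothing else
  }                                                   // can run while it is executing
}

toySetTimeout(() => console.log('task 1'), 0);
toySetTimeout(() => console.log('task 2'), 50);
runToyEventLoop();                                    // prints "task 1" then "task 2"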
Here's a simple example that might help explain it:
var done = false;
// set a timer for 1 second from now to set done to true
setTimeout(function() {
done = true;
}, 1000);
// spin wait for the done value to change
while (!done) { /* do nothing */}
console.log("finally, the done value changed!");
Some might logically think that the while loop will spin until the timer fires and then the timer will change the value of done to true and then the while loop will finish and the console.log() at the end will execute. That is NOT what will happen. This will actually be an infinite loop and the console.log() statement will never be executed.
The issue is that once you go into the spin wait in the while() loop, NO other Javascript can execute. So, the timer that wants to change the value of the done variable cannot execute. Thus, the while loop condition can never change and thus it is an infinite loop.
Here's what happens internally inside the JS engine:
done variable initialized to false
setTimeout() schedules a timer event for 1 second from now
The while loop starts spinning
1 second into the while loop spinning, the timer is ready to fire, but it won't be able to actually do anything until the interpreter gets back to the event loop
The while loop keeps spinning because the done variable never changes. Because it continues to spin, the JS engine never finishes this thread of execution and never gets to pull the next item from the event queue or run the pending timer.
node.js is an event-driven environment. To solve this problem in a real-world application, the done flag would get changed on some future event. So, rather than a spinning while loop, you would register an event handler for some relevant future event and do your work there. In the absolute worst case, you could set a recurring timer and "poll" to check the flag every so often, but in nearly every case you can register an event handler for the actual event that will cause the done flag to change and do your work in that. Properly designed code that knows other code wants to be notified when something changes may even offer its own notification events that one can register an interest in, or even just accept a simple callback.
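For example, a minimal event-driven version of the same idea could look like this (the 'done' event name is made up for illustration):
const EventEmitter = require('events');
const notifier = new EventEmitter();

// Whatever eventually flips the flag emits an event instead of
// expecting a spin-wait to notice the change.
setTimeout(() => notifier.emit('done'), 1000);

notifier.on('done', () => {
  console.log('open sesame');   // runs ~1 second later, without blocking the event loop
});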
This is a great question but I found a fix!
var sleep = require('system-sleep')
var done = false
setTimeout(function() {
done = true
}, 1000)
while (!done) {
sleep(100)
console.log('sleeping')
}
console.log('finally, the done value changed!')
I think it works because system-sleep is not a spin wait.
There is another solution: you can give the event loop a chance to run on almost every iteration of the loop.
let done = false;
setTimeout(() => {
done = true
}, 5);
const eventLoopQueue = () => {
return new Promise(resolve =>
setImmediate(() => {
console.log('event loop');
resolve();
})
);
}
const run = async () => {
while (!done) {
console.log('loop');
await eventLoopQueue();
}
}
run().then(() => console.log('Done'));
Node runs as a single serial task. There is no parallelism, and its concurrency is IO-bound. Think of it like this: everything runs on a single thread. When you make an IO call that is blocking/synchronous, your process halts until the data is returned. But instead of waiting on IO (reading disk, grabbing a URL, etc.), the single thread continues on to the next task, and once that task is complete it checks back on the IO. That is basically what Node does: it's an "event loop" that polls IO for completion (or progress) on a loop. So, to put it simply, when a task does not complete (your loop), the event loop does not progress.
Because the timer callback has to come back through the queue, it has to wait for the loop to finish. So although the timeout is handled separately and the timer itself may indeed have expired, the "task" that sets done = true is still waiting on that infinite loop to finish.
var open = false;
const EventEmitter = require("events");
const eventEmitter = new EventEmitter();
setTimeout(function () {
open = true;
eventEmitter.emit("open_var_changed");
}, 1000);
let wait_interval = setInterval(() => {
console.log("waiting");
}, 100);
eventEmitter.on("open_var_changed", () => {
clearInterval(wait_interval);
console.log("open var changed to ", open);
});
This example works. You could also use setInterval and check inside it whether the open value has changed; that would work too.

What are the limitations of using the beforeunload event?

What is and what is not possible to do inside a beforeunload callback?
Is it possible to open an XHR/fetch request and send data to the server?
If not, is it possible to just send data, without waiting on any success callback?
Is it possible to change the location of the page with window.location?
How long does the function continue to execute?
window.addEventListener("beforeunload", function (event) {
// code
});
You can put anything you want inside of a "beforeunload" callback, you're just not guaranteed it will execute unless you use synchronous/blocking code.
Here's a blocking sleep function I'll use for demonstration purposes:
function sleep(delay) {
var start = new Date().getTime();
while (new Date().getTime() < start + delay);
}
Any synchronous, blocking code is guaranteed to execute to completion:
window.addEventListener("beforeunload", function (event) {
console.log("Blocking for 1 second...");
sleep(1000);
console.log("Done!!");
});
Any async code (i.e. code with callbacks) like the code below will be queued on the event loop, but it may or may not finish executing depending on when your browser completes the "unload" action (e.g. closing the window, refreshing the page). The time here really depends on your computer's performance. On my computer, for example, a timeout of 10ms with a log statement still executes because my browser hasn't had time to refresh/close the page.
window.addEventListener("beforeunload", function (event) {
setTimeout(() => {
console.log("Timeout finished!"); // Odds are this will finish
}, 10);
console.log("Done!!");
});
A timeout of 100ms, however, is never executed:
window.addEventListener("beforeunload", function (event) {
setTimeout(() => {
console.log("Timeout finished!");
}, 100);
console.log("Done!!");
});
The only way to guarantee your function will run to completion is to make sure you're writing synchronous code that blocks the event loop as in the first example.
To help some of the people in the comments on the first answer, check out navigator.sendBeacon for reliably sending data to the server during the unload event: https://developer.mozilla.org/en-US/docs/Web/API/Navigator/sendBeacon
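A minimal sketch of that approach (the endpoint URL here is made up):
window.addEventListener("beforeunload", function (event) {
  // navigator.sendBeacon hands the data to the browser, which sends it
  // even while the page is unloading; no response handling is possible.
  const payload = JSON.stringify({ leftAt: Date.now() });
  navigator.sendBeacon("/log-unload", payload);   // "/log-unload" is a hypothetical endpoint
});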

Why does the following JS function wreck the browser process?

var wait = function (milliseconds) {
var returnCondition = false;
window.setTimeout(function () { returnCondition = true; }, milliseconds);
while (!returnCondition) {};
};
I know there have been many posts already about why not to try to implement a wait() or sleep() function in Javascript. So this is not about making it usable for implementation purposes, but rather making it work for proof of concept's sake.
Trying
console.log("Starting...");wait(3000);console.log("...Done!");
freezes my browser. Why does wait() seemingly never end?
Edit: Thanks for the answers so far, I wasn't aware of the while loop never allowing for any other code to execute.
So would this work, then?
var wait = function (milliseconds) {
var returnCondition = false;
var setMyTimeOut = true;
while (!returnCondition) {
if (setMyTimeOut) {
window.setTimeout(function() { returnCondition = true; }, milliseconds);
setMyTimeOut = false;
}
};
return;
};
JavaScript is executed in a single thread. Only when an execution path exits can another execution path begin. Thus, when you launch your wait(3000), the following happens:
returnCondition is set to false
a timeout is scheduled
an infinite loop is started.
Each <script> tag, each event being handled, and each timeout (and also UI refresh, in case of a browser) initiate a separate execution path. Thus, a timeout of 3000 is not guaranteed to run in 3000ms, but at any time after 3000ms when the engine is "free".
The wait function never exits, so your script's execution path never ends, and the scheduled timeout's turn never comes.
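You can see the same mechanism with a loop that does eventually terminate: the timeout only gets its turn once the blocking code returns, however long that takes (a rough sketch; timings will vary):
const start = Date.now();
setTimeout(() => {
  // Asked for 0ms, but this only runs once the script below has finished,
  // so the measured delay will be roughly 2000ms.
  console.log('timeout fired after ~' + (Date.now() - start) + 'ms');
}, 0);

while (Date.now() - start < 2000) { /* block the only thread for ~2s */ }
console.log('blocking loop done');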
EDIT:
That means, once a <script> tag has begun, or Node.js has started executing a JavaScript file, the execution has to reach the bottom before anything else can happen. If a function is started as a result of an event or a timeout, that function needs to exit before anything else can happen.
<script>
console.log("script top");
function theTimeout() {
console.log("timeout top");
// something long
console.log("timeout bottom");
}
setTimeout(theTimeout, 0);
setTimeout(theTimeout, 0);
console.log("script bottom");
</script>
There are three execution paths here. The first is the <script> tag's: it starts with printing "script top", schedules two timeouts (for "right now"), then prints "script bottom", and then the end of <script> is reached and the interpreter is idle. That means it has time to execute another execution path, and there are two timeouts waiting, so it selects one of them and starts executing it. While it is executing, again nothing else can execute (not even UI updates); the other timeout, even though it was also scheduled to run "immediately", is left to wait until the first timeout's execution path ends. When it does, the second timeout's turn comes, and it gets executed as well.
JavaScript is single threaded. When you call setTimeout, the function you passed in as an argument is placed in the task queue, not run immediately. That means the very next line of code in your block executes immediately after the setTimeout call, and the function you passed in as an argument will only execute after your wait method exits.
Your while loop is waiting for a condition which will never happen while the wait function is running because the function which will set your flag will not run until the wait function is done.
The correct way to implement wait is:
var wait = function (milliseconds, onEnd) {
window.setTimeout(function () { onEnd(); }, milliseconds);
};
wait(1000, function(){alert('hi')});
Here you pass in a callback function which will execute after the timeout.
If you have multiple async style calls you can use promises. Promises will make your code easy to read and it will be easy to chain multiple async calls together. There are very good promise libraries: jQuery has $.Deferred built into it, and you can use Q if you are writing node.js code.
A promise style implementation would look something like this:
var wait = function (milliseconds) {
var onEnd = null;
window.setTimeout(function () { onEnd(); }, milliseconds);
return {
then: function(action){
onEnd = action;
}
}
};
wait(1000).then(function(){alert('hi')});
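For completeness, with the built-in Promise constructor (available natively in modern engines) the same idea can be sketched more robustly as:
// wait() built on a real Promise: the timeout resolves it, and any
// number of .then() callbacks can be chained afterwards.
var wait = function (milliseconds) {
  return new Promise(function (resolve) {
    window.setTimeout(resolve, milliseconds);
  });
};
wait(1000).then(function () { alert('hi'); });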
https://api.jquery.com/jquery.deferred/
https://github.com/kriskowal/q
The following book helped me a lot to understand this subject:
Async JavaScript: Build More Responsive Apps with Less Code by Trevor Burnham
https://pragprog.com/book/tbajs/async-javascript

HTML5 Canvas: Get Event when drawing is finished

I'm drawing an image to a canvas element. I then have code that depends on this process to be finished. My code looks like this:
var myContext = myCanvasElement.getContext('2d'),
myImg = new Image();
myImg.onload = function() {
myContext.drawImage(containerImg, 0, 0, 300, 300);
};
myImg.src = "someImage.png";
So now, I would like to be notified when drawImage is done. I checked the spec but I couldn't find either an event or the possibility to pass a callback function. So far I just set a timeout, but this obviously is not very sustainable. How do you solve this problem?
Like almost all Javascript functions, drawImage is synchronous, i.e. it'll only return once it has actually done what it's supposed to do.
That said, what it's supposed to do, like most other DOM calls, is queue-up lists of things to be repainted next time the browser gets into the event loop.
There's no event you can specifically register to tell you when that is, since by the time any such event handler could be called, the repaint would have already happened.
Jef Claes explains it pretty well on his website:
Browsers load images asynchronously while scripts are already being
interpreted and executed. If the image isn't fully loaded the canvas
fails to render it.
Luckily this isn't hard to resolve. We just have to wait to start
drawing until we receive a callback from the image, notifying loading
has completed.
<script type="text/javascript">
window.addEventListener("load", draw, true);
function draw(){
var img = new Image();
img.src = "http://3.bp.blogspot.com/_0sKGHtXHSes/TPt5KD-xQDI/AAAAAAAAA0s/udx3iWAzUeo/s1600/aspnethomepageplusdevtools.PNG";
img.onload = function(){
var canvas = document.getElementById('canvas');
var context = canvas.getContext('2d');
context.drawImage(img, 0, 0);
};
}
</script>
You already have an event when the image loads, and you do one thing (draw). Why not do another and call the function that will do whatever it is you want done after drawImage? Literally just:
myImg.onload = function() {
myContext.drawImage(containerImg, 0, 0, 300, 300);
notify(); // guaranteed to be called after drawImage
};
drawImage(), like any drawing method of the 2D canvas, is in itself "mostly" synchronous.
You can assume that any code that needs a read-back of the pixels will have the updated pixels. Also, for drawImage in particular, you can even assume that the image will have been fully decoded "synchronously", which can take some time with big images.
Technically, in most modern configs the actual painting work will be deferred to the GPU, which implies some parallelization and some asynchronicity, but read-backs will wait until the GPU has done its work, locking the CPU for that time.
However the drawing on the canvas is only the first step of the full rendering of the canvas to the monitor.
The canvas then needs to go through the CSS compositor, where it will get painted along the rest of the page. This is what is deferred to the next rendering step.
alert() in Chrome does currently block the CSS compositor, and thus, even though the actual pixels of the canvas buffer have been updated, these changes haven't been reflected by the CSS compositor yet. (In Firefox alert() triggers a kind of "spin the event loop" which allows the CSS compositor to still kick in, even if the global tasks of the event loop are paused).
To hook into the CSS compositor, there is a requestPostAnimationFrame method that is being incubated, but it apparently got dropped from Chrome's experiments recently.
We can polyfill it using both requestAnimationFrame and a MessageEvent to hook to the next task as soon as possible (setTimeout is generally given less priority).
Now, even this requestPostAnimationFrame only tells you when the browser's compositor kicked in; there is still some time before that image gets to the OS compositor and to the monitor (about a full V-Sync frame).
Some configurations of Chrome on Windows have access to a shortcut that allows the browser to talk directly to the OS compositor and bypass the CSS compositor. To enable this option, you can create your 2D context with the desynchronized option set to true. However, this option is only supported in a few configurations.
Below is a demo of almost all this:
// requestPostAnimationFrame polyfill
if (typeof requestPostAnimationFrame !== "function") {
(() => {
const channel = new MessageChannel();
const callbacks = [];
let timestamp = 0;
let called = false;
let scheduled = false; // to make it work from rAF
let inRAF = false; // to make it work from rAF
channel.port2.onmessage = e => {
called = false;
const toCall = callbacks.slice();
callbacks.length = 0;
toCall.forEach(fn => {
try {
fn(timestamp);
} catch (e) {}
});
}
// We need to overwrite rAF to let us know we are inside an rAF callback
// as to avoid scheduling yet an other rAF, which would be one painting frame late
// We could have hooked an infinite loop on rAF, but this means
// forcing the document to be animated all the time
// which is bad for perfs
const rAF = globalThis.requestAnimationFrame;
globalThis.requestAnimationFrame = function(...args) {
if (!scheduled) {
scheduled = true;
rAF.call(globalThis, (time) => inRAF = time);
globalThis.requestPostAnimationFrame(() => {
scheduled = false;
inRAF = false;
});
}
rAF.apply(globalThis, args);
};
globalThis.requestPostAnimationFrame = function(callback) {
if (typeof callback !== "function") {
throw new TypeError("Argument 1 is not callable");
}
callbacks.push(callback);
if (!called) {
if (inRAF) {
timestamp = inRAF;
channel.port1.postMessage("");
} else {
requestAnimationFrame((time) => {
timestamp = time;
channel.port1.postMessage("");
});
}
called = true;
}
};
})();
}
// now the demo
// if the current browser can use desync 2D context
// let's try it there too
// (I couldn't test it myself, so let me know in comments)
const supportsDesyncContext = CanvasRenderingContext2D.prototype.getContextAttributes &&
document.createElement("canvas")
.getContext("2d", { desynchronized: true })
.getContextAttributes().desynchronized;
test(false);
if (supportsDesyncContext) {
setTimeout(() => test(true), 1000);
}
async function test(desync) {
const canvas = document.createElement("canvas");
document.body.append(canvas);
const ctx = canvas.getContext("2d", { desynchronized: desync });
const blob = await fetch("https://upload.wikimedia.org/wikipedia/commons/4/47/PNG_transparency_demonstration_1.png")
.then((resp) => resp.ok && resp.blob());
const bitmap = await createImageBitmap(blob);
ctx.drawImage(bitmap, 0, 0, 300, 150);
// schedule our callback after rendering
requestPostAnimationFrame(() => {
alert("Right after CSS compositing");
});
// prove that we actually already painted on the canvas
// even if the CSS compositor hasn't kicked in yet
const pixelOnCanvas = ctx.getImageData(120,120,1,1).data;
alert("Before CSS compositing." + (desync ? " (desynchronized)": "") + "\nPixel on canvas: " + pixelOnCanvas);
}
The answer by @MikeGledhill (which got deleted) was essentially the beginning of the answer, though it could have explained things better, and browsers may not all have had the requestAnimationFrame API available at that time:
Painting of pixels happens in the next animation frame. This means that if you call drawImage, the screen pixels won't actually be updated at that time, but in the next animation frame.
There's no event for this.
But! We can use requestAnimationFrame to schedule a callback for the next frame before paint (display update) happens:
myImg.onload = function() {
myContext.drawImage(containerImg, 0, 0, 300, 300);
requestAnimationFrame(() => {
// This function will run in the next animation frame, *right before*
// the browser will update the pixels on the display (paint).
// To ensure that we run logic *after* the display has been
// updated, an option is to queue yet one more callback
// using setTimeout.
setTimeout(() => {
// At this point, the page rendering has been updated with the
// `drawImage` result (or a later frame's result, see below).
}, 0)
})
};
What is happening here:
The requestAnimationFrame call schedules a function that will be called right before the browser updates the display pixels. After this callback has completed, the browser will continue to synchronously update the display pixels in a following tick that is very similar to a microtask.
The "microtask"-like in which the browser updates the display, happens after your requestAnimationFrame callback, and happens after all user-created microtasks that a user creates in the callback using Promise.resolve().then() or an await statement. This means one cannot make deferred code fire immediately (synchronously) after the paint task happens.
The only way to guarantee that logic will fire after the next paint task, is to use setTimeout (or a postMessage trick) to queue a macrotask (not microtask) from an animation frame callback. A macrotask queued from a requestAnimationFrame callback will fire after all microtasks and microtask-likes, including the task that updates the pixels. The setTimeout (or postMessage) macrotask will not fire synchronously after animation frame microtasks.
This approach is not perfect though. Most of the time, the macrotask queued from setTimeout (and more likely with postMessage) will fire before the next animation frame and paint cycle. But, due to the specification of setTimeout (and postMessage), there is no guarantee that the delay will be exactly what we specify (0 in this example), and the browser is free to use heuristics and/or hard-coded values like 2ms to determine when is the soonest time to run a setTimeout (macrotask) callback.
Due to this non-guaranteed non-synchronous nature of macrotask scheduling, it is possible, though in practice unlikely, that your setTimeout (or postMessage) callback can fire not just after the current animation frame (and the paint cycle that updates the display), but after the next animation frame (and its paint task), meaning that a macrotask callback has a small chance firing too late for the frame you were targeting. This chance is reduced when using postMessage instead of setTimeout.
That being said, this sort of thing is probably something you should not do unless you're trying to write tests that capture painted pixels and compare them to expected results or something similar.
In general, you should schedule any drawing logic (e.g. ctx.drawImage()) using requestAnimationFrame, never rely on the actual timing of the paint update, and assume that the user will see what the browser APIs guarantee you've specified for them to see (the browsers have their own tests in place for ensuring their APIs work).
Finally, we don't know what your actual goal is. Most likely this answer may be irrelevant to that goal.
Here's the same example using the postMessage trick:
let messageKey = 0
myImg.onload = function() {
myContext.drawImage(containerImg, 0, 0, 300, 300);
requestAnimationFrame(() => {
// This function will run in the next animation frame, *right before*
// the browser will update the pixels on the display (paint).
const key = "Unique message key for after paint callback: "+ messageKey++
// To ensure that we run logic *after* the display has been
// updated, an option is to queue yet one more callback
// using postMessage.
const afterPaint = (event) => {
// Ignore interference from any other messaging in the app.
if (event.data != key) return
removeEventListener('message', afterPaint)
// At this point, the page rendering has been updated with the
// `drawImage` result (or a later frame's result, but
// more unlikely than with setTimeout, as per above).
}
addEventListener('message', afterPaint)
// Hack: send a message which arrives back to us in a
// following macrotask, more likely sooner than with
// setTimeout.
postMessage(key, '*')
})
};
