I'm making modifications to this file https://github.com/davidguttman/react-pivot/blob/master/index.jsx#L84 to move the Dimensions component out to a parent component.
One strange thing I noticed is that I have to call setTimeout(this.updateRows, 0) instead of this.updateRows() for the views to update correctly.
Any idea why this is so? AFAIK, setTimeout(_,0) simply makes the function call asynchronous (i.e. allows concurrent execution for performance). Why would that help with rendering views correctly? I'm asking this question to avoid "Programming by Coincidence".
This is because setState is asynchronous.
Since you are reading from this.state in the updateRows function, it won't work until the state is actually updated.
Using setTimeout as you did is one way to let the state update first: the setState call completes, and updateRows then executes on a later tick of the event loop, after the state has been applied.
A better way is to use the callback parameter of setState, which runs after the state update has been applied:
this.setState({dimensions: updatedDimensions}, () => {
  this.updateRows();
});
Another option is to keep the new values in a local object and pass them into the function as arguments instead of reading directly from this.state, but this can lead to much more complexity.
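For example, a minimal sketch of that approach (handleDimensionsChange and computeRows are names I made up for illustration, not part of react-pivot):

handleDimensionsChange = (updatedDimensions) => {
  this.setState({ dimensions: updatedDimensions });
  // Pass the new values in directly instead of reading this.state later.
  this.updateRows(updatedDimensions);
};

updateRows = (dimensions) => {
  const rows = computeRows(dimensions); // hypothetical helper
  this.setState({ rows });
};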
In this instance it's probably less about "concurrent execution" and more about the event loop. The setTimeout call keeps the function off the current call stack and instead adds an entry to the message queue. The currently executing stack runs to completion before the next message in the queue begins execution.
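A quick generic demonstration of that ordering (not the react-pivot code):

console.log('start');
setTimeout(() => console.log('from the queue'), 0);
console.log('end of current stack');
// Prints: start, end of current stack, from the queue.
// The queued callback only runs once the current stack has finished.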
I don't know why this is required in this particular instance - some sort of state must be getting set in the current stack that's required for updateRows to produce the desired result.
Related
Let's say you have a computed property that filters and sorts an array of values based on a user's input.
If the user begins filtering values from the array, and the sort value changes while the filtering computation is running, will the computed property finish the filtering, or will it jump to the next queued recomputation that uses the new sort value?
JavaScript is (essentially) single-threaded. That means nothing can even process the fact that a user-triggered event occurred until the currently executing synchronous code finishes.
You can emulate this yourself, though: add async pauses that free up the event loop so events can be processed, then resume and check some cancellation condition. That way you may be able to achieve something like what you're asking. But that's a lot of code, and unless the filtering is really slow, it's probably a bad idea.
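To illustrate, here is a rough sketch of that idea (all names here are mine, not from the question): filter in chunks, yield to the event loop between chunks, and bail out if a newer run has superseded this one.

let latestRunId = 0; // bumped whenever the filter/sort input changes

async function filterInChunks(items, predicate, chunkSize = 1000) {
  const runId = ++latestRunId;
  const out = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    // Yield to the event loop so pending events (e.g. a sort change) can run.
    await new Promise((resolve) => setTimeout(resolve, 0));
    if (runId !== latestRunId) return null; // superseded: abandon this run
    for (const item of items.slice(i, i + chunkSize)) {
      if (predicate(item)) out.push(item);
    }
  }
  return out;
}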
Computed properties will completely finish execution.
If the old Vue.js v2 documentation still holds any relevance, then the following passage indicates how changes result in updates:
[...]Vue performs DOM updates asynchronously. Whenever a data change is observed, it will open a queue and buffer all the data changes that happen in the same event loop. If the same watcher is triggered multiple times, it will be pushed into the queue only once. This buffered de-duplication is important in avoiding unnecessary calculations and DOM manipulations. Then, in the next event loop “tick”, Vue flushes the queue and performs the actual (already de-duped) work.
Given JavaScript is a single-threaded language, this would mean that during any particular "tick" of the event loop, no other actions will be processed until the code execution during this "tick" has finished.
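As a rough illustration of that buffering (assuming Vue 2 is loaded; the data field and watcher here are made up):

const vm = new Vue({ data: { query: '' } });

vm.$watch('query', (val) => {
  console.log('watcher ran once with:', val);
});

// Three synchronous changes in the same event loop tick...
vm.query = 'a';
vm.query = 'ab';
vm.query = 'abc';

// ...the watcher is queued only once and runs on the next tick with 'abc'.
Vue.nextTick(() => console.log('queue flushed'));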
If, hypothetically, Vue's internal behavior handled these updates purely asynchronously, there would be the potential for race conditions between any two event loop ticks, especially if the first tick takes a long time to complete while the second finishes quickly. Halting the execution of a computed property partway, even if it were possible, could also mean that any side effects normally triggered by the computed property's execution (bad practice, but that's another subject) never fire.
These inconsistencies in behavior would cause all kinds of problems for maintaining consistency in the application state, which is a problem that would make any reactive framework effectively worthless.
So we have a redux (with thunk middleware) / react environment.
We have the following piece of code:
onMyClick = () => {
  this.props.mySynchronousActionWhichWillCreateNewReducerState();
  this.setState(...my state change which wants to see the new reducer state...);
}
It will not work in this form since the code is synchronous, which means the React lifecycle never gets a chance to update the component with the new props before setState runs.
However if we change it like this:
onMyClick = () => {
  Promise.resolve(this.props.myActionWhichWillCreateNewReducerState())
    .then(
      () => this.setState(...my state change which wants to see the new reducer state...)
    );
}
It now works as "expected" (The dispatch triggers the new reducer state, the component updates, then the setState runs). First I thought this solution is error prone and works because we 'win' just a bit of time with asynchronicity to allow the update to kick in before the setState. But it never fails, (You can make the code slow in the action, or in the reducer, in the middleware, in the component update, where ever you want, it still works).
Why?
Some explanation might be warranted as to why this is so hard to wrap my head around, more in the sense of "why does it work the way it does" than "how does it work".
So first and foremost, let's look at the two pieces of code as plain JavaScript.
In this case - for me at least - the first should work, and the second should not. Or at least the second should be fuzzy.
First:
I make a synchronous call (dispatch -> action creation -> store change), then I make another, and yet the second cannot see the changes made by the first. I have to know quite intimately how Redux and React operate to understand how and why. And by the way, you can even mutate the Redux store (big no-no) instead of returning a new object from the reducer, to retain the reference, and it still doesn't work. Which is mind-boggling: you synchronously mutate an object, then cannot access the change afterwards...
Second:
In this case (just as Jaromanda X commented), what I "seemingly" tell the code is 'hey, run these two pieces of code in parallel'. And now it works, and works all the time. Wut. Adding my (superficial, or so it seems) understanding of React lifecycles to the mix makes it even more paradoxical, since it means that even more logic - the React lifecycle update - has to 'outrun' the setState call for it to work.
If this weren't a Redux/React environment with all the support and intelligence behind it, I would say this code behavior smells to high heaven, like black magic and goto :).
When you wrap a piece of code in Promise.resolve(...), that code itself still runs synchronously; it is the .then callback that is deferred by at least one tick. For your code, that delay was sufficient for the reducer dispatch to complete its update. Hence, when the code inside then executed, it saw the updated value: the callback closes over this, so reading this.props or this.state inside it picks up whatever values are current at the moment it runs, not at the moment the closure was created.
That said, neither a reducer update in Redux nor setState in React returns a promise. Your code is equivalent to:
Promise.resolve(console.log("dummy")).then(() => console.log("second"));
console.log("first")
first will always be printed before second, because the .then callback is executed on the next tick of the event queue (note that dummy is logged synchronously, even before first).
Your code is not error-prone at the moment because one tick happens to be sufficient for React to apply the update. But don't rely on that: for some other piece of code, or in later versions of React, the timing of updates might change.
I'm trying to work out the best practice for this use case:
I have an internal state: results[] that will be used for displaying data in render.
Then, I need to call 3 API's, in parallel, and then merge and sort them into results[] before finally rendering it.
I'm using axios to call the APIs, but I don't want to use axios.all because I need to display results[] as each of the 3 APIs returns; that way, it looks faster.
I'm running into a problem because every time I update results[] with new data using this.setState(), the later operation always sees the old results[]; this.setState() doesn't get applied immediately.
What's the best way to do this?
There will never be a race condition, since JavaScript runs on a single thread. It is guaranteed that only one asynchronous callback will run at any one time, since a single thread cannot execute two at once; the next asynchronous callback will be scheduled to run later. Within one function call, you can be assured that you are the only one working on the data.
Data may come in at different times, but if you simply merge and sort the data every time you receive new data, that will be atomic.
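For instance, a sketch of that approach inside a React class component (the endpoint URLs and the sort comparator are placeholders, not from the question):

mergeIntoResults = (newItems) => {
  // Functional setState always merges into the latest state, so responses
  // arriving in any order can never clobber each other.
  this.setState((prevState) => ({
    results: [...prevState.results, ...newItems].sort((a, b) => a.rank - b.rank)
  }));
};

componentDidMount() {
  // Fire all three requests in parallel; each merges as soon as it returns.
  ['/api/one', '/api/two', '/api/three'].forEach((url) => {
    axios.get(url).then((res) => this.mergeIntoResults(res.data));
  });
}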
I recently read through a large portion of the React and ReactDOM codebase in order to get an understanding of what happens when a call to setState occurs. At this point, I would say that I understand the sequence of events at a high level. A call to setState results in a message being added to a queue of updates that need to be processed. The updates are processed and changes to the DOM are only made if necessary.
Here's where I am lost. There are hundreds of blog posts discussing the asynchronous behavior of setState. While I don't doubt that setState is asynchronous, I cannot find a line of code in the ReactDOM codebase that would introduce asynchronous behavior. Does anyone know exactly where this happens?
First of all, setState may be executed in an async way, but it is not always executed as such. Ben Nadel lists some of his findings in setState() State Mutation Operation May Be Synchronous In ReactJS.
To summarize, setState calls seem to get batched in situations where React can intercept the originating event, like onClick handlers. Since React creates the actual DOM from the virtual DOM (and since it is aware of the semantics of attributes), it can wrap any onClick handler you provide into something like this:
wrappedHandler = () => {
  turnBatchingOn()
  originalHandler()
  executeBatched()
}
In this case you get async behavior: all setState calls get enqueued, and they execute only after your original onClick handler has finished.
Note this is not actual React code; it is just my speculation about how the async effect is achieved. I understand it is not the actual line of code that does it, but I think it could help you find it.
The best article I found explaining the setState implementation is on React Kung Fu.
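As a side note, one way to observe the batched vs. non-batched behavior yourself (my own illustration, assuming a pre-React-18 class component with a made-up count state field):

handleClick = () => {
  // Inside a React event handler: batched, so this.state is not updated yet.
  this.setState({ count: this.state.count + 1 });
  console.log(this.state.count); // still the old value

  setTimeout(() => {
    // Outside React's event system (React <= 17): applied synchronously.
    this.setState({ count: this.state.count + 1 });
    console.log(this.state.count); // reflects the update
  }, 0);
};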
I think setState isn't inherently async, but multiple setState calls are optimized (batched), so in some cases synchronous behavior cannot be guaranteed.
In my app I am receiving data via an HTTP channel that's handled in a custom way. I'm building some [data] objects from the pipe, wrapping them in a scope.$new(true), and when I receive an update I call childScope.$apply() to set the new properties.
This works fine for light loads; all the watchers get notified and it has really been running without any issues or missed updates.
Now I'm trying to push a lot more updates and don't know if the pattern used above is the way to go. I think (though I have not checked) that each call to $apply runs the digest on the root scope, and I want to coalesce these on browser cycles or ~50ms intervals. Currently, whenever I receive ~100 updates on 5000 objects/scopes it kills the browser.
I saw that the Angular docs say each scope has an $applyAsync method, but I cannot find it anywhere; this would be essentially what I am after.
Is this a bad idea, and is the performance already good enough? Should I implement my own applyAsync by using $browser.defer() or some other method?
Edit: I just tested the code, and indeed $rootScope.$digest is called for each child scope's $apply(). Perhaps moving this part away from AngularJS and using a listener-based approach is better, so this is also a valid answer.
In the end I used $evalAsync, and this seems to work as intended.
I probably need to call $digest (or $apply) every so often to make sure there are no pending scope changes but I have not seen the need to do this yet.
So my idea would be to (sketched in code below):
- call $evalAsync for all the scope changes that need to happen very fast
- increment a counter before the $evalAsync call
- set a variable with the current time inside the $evalAsync callback and decrement the counter
- on a timer (50-100ms), check whether the counter is > 0 and the last evaluation was a while ago (> 50-100ms), and if so force a digest loop
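A rough sketch of that idea (the variable names, helper function, and 100ms interval are mine, not from the original answer):

// Assumes $rootScope is injected wherever this runs.
var pendingCount = 0;
var lastEvalTime = 0;

function queueScopeUpdate(scope, applyChanges) {
  pendingCount++; // increment before queueing
  scope.$evalAsync(function () {
    applyChanges();            // the actual scope mutation
    lastEvalTime = Date.now(); // remember when it last ran
    pendingCount--;
  });
}

// Safety-net timer: if updates were queued but nothing has been evaluated
// recently, force a digest loop.
setInterval(function () {
  if (pendingCount > 0 && Date.now() - lastEvalTime > 100) {
    $rootScope.$digest();
  }
}, 100);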
I will not mark this as the correct answer since it does not seem like the best idea, but it was the best I could come up with and it does the job as intended.