React trying to execute methods on undefined variables before they load? - javascript

As you can tell from my title, I'm not even sure what to call the problem.
I created an example here in code sandbox
https://codesandbox.io/s/interesting-poitras-5bcuv?file=/src/Demo.js
If it's a "wait for the value to be set" problem, I've read that I need to implement a loading state of some sort. I've attempted that here:
https://codesandbox.io/s/blissful-antonelli-tu624?file=/src/Demo.js
but can't seem to get it to work, and I don't know if I am even on the right track.
I've been working with React for a couple of months now, and one of the things I don't completely understand is this:
It seems like the code runs multiple times. The first time, it passes the default state of {} as the value to my demo component; then the state is set and the actual value is passed in.
My app is crashing before the actual value gets passed in, because of the split.
In the past, I've managed by throwing a bunch of if and && checks at the problem, like below, but is this the best way?
const split = props.values.home && props.values.home.split("-")

You need a safety check there. Your problem is a normal JavaScript problem: you cannot access a property of undefined.
Just change it to: props.values?.home?.split("-") || []; (the sandbox supports optional chaining)
and it will work as you expect.
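A minimal sketch of that guard, extracted into a plain function so the behavior is easy to see (the props shape is taken from the question; the function name is illustrative):

```javascript
// Returns the split parts of values.home, or [] while the data
// has not loaded yet (values may be {} or undefined on first render).
function getHomeParts(values) {
  // Optional chaining short-circuits to undefined instead of throwing
  // when values or values.home is missing; || [] supplies the fallback.
  return values?.home?.split("-") || [];
}

getHomeParts({});                  // [] — safe on the default {} state
getHomeParts({ home: "a-b-c" });   // ["a", "b", "c"] — once data arrives
```

Inside the component, the same expression can be used directly, and rendering a loading indicator when the array is empty gives the "loading state" the question mentions.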

Related

useEffect fires multiple times even after changing parameters

I was getting very high Firebase reads over the past few days, and when I checked I found it's because the useEffect is getting fired multiple times. If I change the dependency array to [] it doesn't fetch anything.
I am creating a related-petitions section where petitions of a particular category are shown. When I set the dependency to relatedPetition it does work, but the useEffect fires multiple times, driving my Firestore reads up; when I just keep it [] it shows nothing in the related section.
Go back to using [] as the second argument to useEffect. Then change your setRelatedPetition call to setRelatedPetition([...somePetition]).
Hey yashraj, this is a common problem with the useEffect hook. Can I see the code inside the useEffect, so I can help you more? To solve the problem, add to the dependency array [] a variable that is directly or indirectly affected by updates to the data fetched from the Firebase API.
Would love to see the code. Are you calling the API from the useEffect hook?
It seems to cause infinite rendering. Why not put the array in stringified form? See the link below. https://stackoverflow.com/a/59468261/19622195
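The stringify trick works because of how React compares dependencies: it uses reference equality, and an array rebuilt on every render never matches its previous self. A small framework-free sketch of the difference (variable names are illustrative):

```javascript
// Two arrays with identical contents, as produced on two renders.
const a = ["health", "environment"];
const b = ["health", "environment"];

// React compares dependency entries with Object.is, so a fresh array
// always looks "changed" and the effect re-fires every render.
console.log(Object.is(a, b));                        // false
// A derived primitive is stable across renders with the same contents.
console.log(JSON.stringify(a) === JSON.stringify(b)); // true

// In the component this becomes (sketch, not the question's exact code):
// useEffect(() => { fetchRelated(relatedPetition); },
//           [JSON.stringify(relatedPetition)]);
```

Depending on a stable primitive stops the effect from re-firing, though for large arrays it is usually cleaner to depend on the specific field (e.g. a category id) that actually drives the fetch.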

Sliders providing null values to complex shiny application at startup and periodically as inputs change

I am currently developing a complex (multiple pages of nested navs/tabs) web application using R's Shiny package.
I am currently experiencing an issue where sometimes accessing the values of these sliders from R via input$ returns NULL for no apparent reason. When this happens, the various outputs dependent on these sliders cannot be produced due to errors.
This problem goes away if the user forces the application to recalculate by changing an input.
The error occurs primarily at startup, though on occasion it will occur in the middle of a session as the user plays with the inputs.
I am currently working on a clean example to reproduce this, but if anyone has experienced this before or has any ideas I would certainly appreciate the help.
As I just answered in a different question, maybe try using observeEvent? observeEvent fires the given expression when a reactive value changes, BUT it ignores NULL values by default.
https://stackoverflow.com/a/30514860/3943160
On the shiny google group, I was informed that this is expected behavior. The inputs take time to load and you must define what inputs are required to be non-null to produce each given output.
The solution here was to use the following in the relevant outputs:
shiny::validate(need(input$myinput, message=FALSE))
It is important to call validate with the shiny:: prefix, as this name is also used by other packages. I spent a long time trying to use validate and getting errors because R was calling the wrong validate function from a different package.

Angular ui-router, save state by url/url variables?

I have an interesting problem I'm trying to see if I can solve with Angular's ui-router plugin. What I am trying to do is save a view state by means of a customized URL.
I have set up a very basic example to see if this is possible then I would like to abstract upon this if it works!
So I very simply have a
$scope.test = false;
and then on the page there is an ng-click that toggles this test variable between true and false. What I'm trying to see is whether I can save that state and pass it in: say I change it to true, I would like to be able to save a URL, and when someone goes to that URL, this variable would be set to true (instead of its default false in the controller).
So I was thinking I could try something like
url: "test/{form_id}"
(I'm assuming this is how this works), however I might want to pass more than just an id. BEST case scenario, I could pass an object entirely via the URL (or rather a reference to one; I know passing an object sounds silly), but otherwise I'm assuming on either end I would have to set up a way to build the string for the id and a way to interpret it. If I could fashion it to be polymorphic that would be fantastic, but my first step is getting it to work at a low level.
So my thinking is: every time you click, it sets a variable which can then be reflected in the URL. If the URL is given to someone who is logged into this app and they go to that view, the controller would know from the URL that it needed to set $scope.test = true on load. I'm wondering whether it's possible to do this with angular-ui-router, and if so, whether someone would be willing to provide some guidance; I would be most appreciative. Thank you for reading!
You can't really pass a reference, since you are talking about two different instances of the app. But you could JSON-encode your object, URL-encode that, and pass it on the URL. If that is too big, it will not work in all browsers, though. It would also be "better" in many cases to have a more semantic URL, i.e. instead of passing a big old blob, use meaningful parameters and pass those.
The documentation for $routeParams is at https://docs.angularjs.org/api/ngRoute/service/$routeParams (note that ui-router itself exposes the equivalent $stateParams service).

Coalesce / defer multiple child scope $apply calls

In my app I am receiving data via an HTTP channel that's handled in a custom way. I'm building some [data] objects from the pipe, wrap them in a scope.$new(true) and when I receive an update call childScope.$apply() to set the new properties.
This works fine for light loads, all the watchers get notified and has really been running without any issues or missed updates.
Now I'm trying to push a lot more updates and don't know if the pattern above is the way to go. I think (though have not checked) that each call to $apply triggers a digest on the root scope, and I want to coalesce these on browser cycles or ~50 ms intervals. Currently, whenever I receive ~100 updates across 5000 objects/scopes it kills the browser.
I saw that the Angular docs say each scope has an $applyAsync method, but I cannot find it anywhere (it was only added in AngularJS 1.3); this would be essentially what I am after.
Is this a bad idea and the performance is already good enough? Should I implement my own applyAsync method by using $browser.defer() or some other method?
Edit: just tested the code and indeed the $rootScope.$digest is called for each child scope $apply(). Perhaps moving this part away from Angular JS and using a listener-based approach is better, so this is also a valid answer.
In the end I used evalAsync and this seems to work as intended.
I probably need to call $digest (or $apply) every so often to make sure there are no pending scope changes but I have not seen the need to do this yet.
So my idea would be to:
call evalAsync for all the scope changes that need to happen very fast
increment a counter before the evalAsync call
set a variable with the current time inside the evalAsync function parameter and decrement the counter
on a timer (50-100 ms), check whether the counter is > 0 and whether the last evaluation was some time ago (> 50-100 ms); if so, force a digest loop.
I will not mark this as a correct answer since it does not seem like the best idea but it was the best I could come up with and it does the job as intended.
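The counter-plus-timer idea above can be sketched framework-free: queue cheap updates immediately, and run the expensive flush (the digest, in the Angular case) only once updates have settled. This is a sketch under the answer's own assumptions; the tick is split out as a method so the timing logic is visible (in practice it would be driven by setInterval, and update's body would wrap scope.$evalAsync):

```javascript
// Coalesces many cheap updates into occasional expensive flushes.
function makeCoalescer(flush, settleMs) {
  let pending = 0;    // step 2: how many updates since the last flush
  let lastUpdate = 0; // step 3: when the most recent update ran
  return {
    update(fn) {
      pending += 1;            // increment before applying the change
      fn();                    // step 1: the fast scope change itself
      lastUpdate = Date.now(); // record the evaluation time
    },
    // step 4: called from a 50-100 ms timer; flushes once settled.
    tick(now) {
      if (pending > 0 && now - lastUpdate > settleMs) {
        pending = 0;
        flush(); // stands in for $rootScope.$digest()
      }
    },
  };
}
```

With this shape, 100 bursts of updates cost 100 cheap function calls but only one flush after the burst quiets down, which is the behavior the answer was after.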

Is it acceptable to execute functions with undefined variables in javascript?

I have an AJAX form that populates select lists with values based on the previously selected select list item. This form is used in 3 different views, with each view adding an extra select list. I have written some basic validation code that keeps the form process in sync and doesn't confuse the user.
I have written one function that handles all 3 forms in an external script file.
My Question:
Is it acceptable or is there anything I need to worry about if some of my variables are undefined based on the form and view?
Here is some sample code that illustrates my question:
Note: These are not the actual names of my variables.
(function ($) {
  var objects = {
    sl1: $('#SelectList1'), sl2: $('#SelectList2'), sl3: $('#SelectList3'),
    lbl1: $('#Label1'), lbl2: $('#Label2'), lbl3: $('#Label3')
  };
  objects.sl1.change(function () {
    mapValues();
  });
  function mapValues() {
    objects.lbl1.text(objects.sl1.val());
    objects.lbl2.text(objects.sl2.val());
    objects.lbl3.text(objects.sl3.val()); // What if this select list is undefined for View1?
  }
})(jQuery);
To summarize, View #1 has SelectList1 & SelectList2. View #2 has all 3. Is there a performance issue or is it bad practice to call a function where some of the variables are undefined?
Thanks.
This is more of a jQuery issue than a JS one. jQuery simply does nothing (it does not even fail!) if you call a method such as .text() or .val() on an empty result from a selector. As for the performance, test it yourself; if the element is not found, I'd expect performance to be slightly better than when the element exists.
So, it's valid to use such code.
Note that you're mixing up "undefined variables" with "non-available elements", which are totally different matters. Using undefined variables is strongly discouraged and often leads to unexpected behavior.
I think it's more about readability and maintainability at this point. Would it be clear to another developer, just by looking at your JS, that View #1 has SelectList1 & SelectList2? Looking at the code you would think it has all three, since all the forms use the same JS. Maybe make it more flexible, so that individual forms can specify which select lists they contain; that way the shared script only uses the select lists the form declares, instead of assuming all are available.
Yes, it is bad practice, and a source of bugs.
As good practice, define a default value, and/or check for it in your function.
That's why you should use the || operator,
e.g.:
( $('#SelectList1').length || 0 )
The issue is that you will introduce a level of uncertainty, and hence hard-to-trace bugs, if you do so. Different JS engines respond differently: some are more forgiving and will do nothing, others will just throw. So right away you have potential cross-browser issues.
Further, as those variables get passed around inside your code, if you do not know their values, you'll have a difficult time predicting how the rest of your code will interact with them. So now you also have potential logic bugs.
So do yourself a favor and a) check that any required parameters are passed, and do some error handling if they are not, and b) make sure optional parameters are handled as soon as you receive them (e.g. assign them a default value, make sure they don't get passed on to other functions if they are not defined, whatever is most appropriate for your application logic).
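A sketch of that "check before use" advice applied to the question's mapValues, restructured so each view registers only the select/label pairs it actually renders (the pair structure and names are illustrative, not from the original code):

```javascript
// pairs: [{ select: $(...), label: $(...) }, ...] — one entry per
// select/label pair the current view actually contains.
function mapValues(pairs) {
  pairs.forEach(function (pair) {
    // Skip pairs whose elements are absent in this view, rather than
    // relying on jQuery silently no-opping on empty selections.
    if (pair.select && pair.select.length && pair.label && pair.label.length) {
      pair.label.text(pair.select.val());
    }
  });
}

// View #1 would pass two pairs, View #2 all three — the shared script
// no longer assumes every list exists:
// mapValues([{ select: $('#SelectList1'), label: $('#Label1') },
//            { select: $('#SelectList2'), label: $('#Label2') }]);
```

This keeps the single shared function while making each view's contents explicit, which addresses the readability concern raised above.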
