js / vue app doing CRUD, how can I track what needs updating?

I'm working on a vue app that uses vuex and gets objects from an api. The tables have paging and fetch batches of objects from the api, sometimes including related entities as nested objects. The UI allows some editing via inputs in a table, and adds via modals.
When the user wants to save all changes, I have a problem: how do I know what to patch via the api?
Idea 1: capture every change on every input and mark the object being edited as dirty
Idea 2: make a deep copy of the data after the fetch, and do a deep comparison to find out what's dirty
Idea 3: this is my question: please tell me that idea 3 exists and it's better than 1 or 2!
If the answer isn't idea 3, I'm really hoping it's not idea 1. There are so many inputs to attach change handlers to, and if the user edits something, then re-edits back to its original value, I'll have marked something dirty that really isn't.
The deep copy / deep compare at least isolates the problem to two places in code, but my sense is that there must be a better way. If this is the answer (also hoping not), do I build the deep copy / deep compare myself, or is there a package for it?
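For reference, a rough sketch of what idea 2 could look like, assuming each row object has an id field and using lodash's cloneDeep and isEqual for the deep copy and deep comparison (a sketch, not a recommendation):

import { cloneDeep, isEqual } from 'lodash';

let snapshot = {};

// Call right after the fetch: keep a deep copy of each row, keyed by id.
function takeSnapshot(rows) {
    snapshot = {};
    rows.forEach(row => { snapshot[row.id] = cloneDeep(row); });
}

// Call on save: anything that no longer deep-equals its snapshot is dirty.
function findDirty(rows) {
    return rows.filter(row => !isEqual(row, snapshot[row.id]));
}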

It looks like you have the final state in the UI and want to persist it on the server. Instead of sending over the delta, I would just send over the full final state and overwrite whatever is on the server side.
So if you have user settings, instead of sending which settings were toggled, just send over "this is the new set of settings".
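For illustration, a minimal sketch of that in a Vuex action (the /api/items endpoint and the items slice of state are assumptions):

const actions = {
    async saveAll({ state }) {
        // Overwrite the server-side collection with whatever the UI currently holds.
        await fetch('/api/items', {
            method: 'PUT',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(state.items),
        });
    },
};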

Heavy work should usually be done on the server rather than the client, so I'd follow the answer given by Asad. You're not supposed to compute huge object diffs on the client; it's 2022, so we need to think about performance.
Of course, it also depends on your app and what this is all about. Maybe your API guy is opposed to it for a specific reason (not only related to performance). Set up a meeting with your team/PO and check what is feasible.
You can always build something on your side too; looping over all the inputs should be feasible without wiring up each one by hand.
TL;DR: this needs to be a discussion in your company with your very specific constraints/limitations. All the "reasonable solutions" are already listed, and you will probably not be able to go further, because these kinds of opinion-based questions aren't allowed on SO anyway.

Related

Using MutationObservers to detect changes in the results of a fetch request

I'm working on a narrowcast display that shows a ticket count (an integer total of all the tickets added together) from a 3rd party API. I want to display a notification when this amount increases. I've read about MutationObservers, and that they are good for similar tasks, such as detecting when something gets added or deleted.
The app has a Vue frontend, and a Laravel backend which does the requesting/authenticating. The index blade loads in a Vue component which contains the other components (and distributes the data from the API to child components).
I'm not quite sure whether MutationObservers are good for this specific job, though. Googling really didn't give me great alternatives.
In conclusion, I want to know if mutationobservers are the right tools for this task and what property would work. Better suited alternatives are also welcome.
Using Vue, you can use a watcher function to watch for changes in a particular variable (amount). A MutationObserver only watches for DOM updates; it won't give you what you want.
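A minimal sketch, assuming the total lives in the component's data as amount (how it gets polled from the API is left out):

export default {
    data() {
        return { amount: 0, showNotification: false };
    },
    watch: {
        // Fires whenever `amount` changes; notify only when it increases.
        amount(newValue, oldValue) {
            if (newValue > oldValue) {
                this.showNotification = true;
            }
        },
    },
};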

Destroying Polymer-element and re-create new one with other attributes...or?

I'm wondering what the best approach is for the following: I have a Polymer element (PubNub, which handles realtime messaging) that is instantiated with some attribute values (the element's properties; more precisely, which channel to listen to/join). Since the user should be able to switch chat rooms (channels), I'm not sure it's such a great idea to instantiate 5 PubNub elements and in turn have 5 active chats going on in the background (receiving messages); if nothing else it would drain more battery power(?).
So, should I instantiate one PubNub-element and then remove and replace it when a user swaps channel? And how is this done best in Polymer?
Or is there some other approach one should take when dealing with this kind of problem?
You are right, the current pubnub-element is very basic and lacks many functionalities, as you have already noticed.
As Craig said, I don't generally recommend creating a bunch of instances either; however, the Polymer element doesn't let you modify all of its properties on the fly. You may be able to change the channel name when publishing, but the same doesn't work for subscribing...
Instead of building a workaround with Polymer elements, I suggest just using our vanilla JavaScript APIs. It may be much easier for your scenario.
And please do not hesitate to send us a pull request or two :-)
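Roughly, channel switching with the vanilla PubNub JavaScript SDK might look like this (the keys, uuid and channel handling are placeholders, not taken from the original answer):

const pubnub = new PubNub({
    publishKey: 'your-pub-key',
    subscribeKey: 'your-sub-key',
    uuid: 'chat-client',
});

pubnub.addListener({
    message: function (event) {
        console.log('received on', event.channel, event.message);
    },
});

let currentChannel = null;

// Leave the old room before joining the new one, so only one
// subscription is active at a time.
function switchChannel(channel) {
    if (currentChannel) {
        pubnub.unsubscribe({ channels: [currentChannel] });
    }
    currentChannel = channel;
    pubnub.subscribe({ channels: [channel] });
}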

Is there a standard way of change tracking with a Knockout bound page?

I have a rather complex web page with various tabs, forms, radio buttons, drop downs, etc. It's all bound using Knockout.js to a very complex JavaScript object that was loaded via an AJAX call. Of course the user can muck with stuff to their heart's content, at which point they hit a Save button to persist all their changes back to the server.
I'm in the process of coming up with a good design to track exactly what was changed on the page so I can implement saving. So, I've come up with a few possible implementations.
Option 1) Just send everything back and let the server sort it out: With this method, I'd let Knockout just update the data source. The Save button would call .toJS() and send that data back to the server. Pros: It's super easy, and takes very little work on the client. Cons: The server doesn't really know what changed and has to either load the data from the database to compare, or just save all the fields again. These fields come from multiple tables and have complex relations. It also treats the entire document as a single atomic unit. If someone else changed Field A and you changed field B, one user is going to lose their change.
Option 2) Use JavaScript to compare the original data and the current data: Using this technique, when the user clicks on the Save button, I would systematically compare the original data and current data and generate a graph of changes. Pros: This would ideally result in a compact graph of exactly what the user changed, and could even no-op if nothing was changed. Cons: The data I'm binding to is complex. It consists of strings, arrays, objects, arrays of objects, arrays of objects with other objects, etc. Looking for changes would be a rather complex nested loop.
Option 3) Track changes as they are being made in the UI: I would have to observe changes as they happen, and keep a delta as UI elements were changed. The Save button would simply send that change graph to the server if it had any pending changes. Pros: No need to compare two huge JavaScript objects looking for changes, but still has all the benefits of option 2. Cons: Knockout doesn't appear to have a standard way to listen to all changes using a single event handler. I believe I would have to resort to binding to all the UI elements, or creating custom bindingHandlers in Knockout to implement this real-time change tracking.
My Question:
My question is mostly for Knockout.js experts. Is there a standard approach, or recommended guidelines to solving this obviously common scenario? Is sending back all the data, even stuff that hasn't changed, a common design? Or are people implementing custom change trackers? Does Knockout provide any sort of framework that eases this requirement?
Update: Found this thing, not sure if it could be useful or if anyone has any feedback on it.
If it's a question of enabling/disabling the Save button, or allowing the user to navigate away from that page/state, then you can check out https://github.com/CodeSeven/kolite
check the knockout.dirtyFlag.js
Hope this helps.
Edit: remember that you should "never" trust the data coming from the "UI". The real comparison and validation ultimately belong in your "controlled" environment on the server.
What I would probably do is take option 2 - the comparison itself can be as simple as stringifying the JS object and comparing it with a cached version of itself.
A few other options are discussed here.
P.S. Maybe ko.mapping can help you manage this monster of a JS object?
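A minimal sketch of that stringify-and-compare idea, assuming Knockout's ko.toJSON to serialize the view model:

let originalSnapshot = null;

// Call once the AJAX data has been loaded and bound.
function cacheOriginal(viewModel) {
    originalSnapshot = ko.toJSON(viewModel);
}

// Call from the Save handler; no-op if nothing has changed.
function hasChanges(viewModel) {
    return ko.toJSON(viewModel) !== originalSnapshot;
}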
I wrote a change tracker extension for knockout that Pete Smith greatly expanded on...
Take a look here:
https://roysvork.wordpress.com/2014/01/12/tracking-changes-to-complex-viewmodels-with-knockout-js/
It works on the principle of extending the observable to track initial state vs. changes the user has made on the client. I think this works really great and can give users real-time feedback to know what they've modified. In practice, we actually implement a save panel that shows all pending changes and even lets them undo individual changes, all by using the change tracker's reusable capability.
ko.extenders.trackChange = function (target, track) {
    if (track) {
        target.isDirty = ko.observable(false);
        target.originalValue = target();
        target.subscribe(function (newValue) {
            // use != not !== so numbers will equate naturally
            target.isDirty(newValue != target.originalValue);
        });
    }
    return target;
};
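For completeness, applying the extender to an observable might look like this (a sketch, not from the original post):

// Any observable extended with trackChange gets isDirty and originalValue.
this.name = ko.observable('initial value').extend({ trackChange: true });

// Later, e.g. when building the payload for the Save button:
// this.name.isDirty()      -> true once the user has edited the field
// this.name.originalValue  -> the value captured when tracking started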

Save or destroy data/DOM elements? Which takes more resources?

I've been getting more and more into high-level application development with JavaScript/jQuery. I've been trying to learn more about the JavaScript language and dive into some of the more advanced features. I was just reading an article on memory leaks when i read this section of the article.
JavaScript is a garbage collected language, meaning that memory is allocated to objects upon their creation and reclaimed by the browser when there are no more references to them. While there is nothing wrong with JavaScript's garbage collection mechanism, it is at odds with the way some browsers handle the allocation and recovery of memory for DOM objects.
This got me thinking about some of my coding habits. For some time now I have been very focused on minimizing the number of requests I send to the server, which I feel is just good practice. But I'm wondering if sometimes I go too far, since I'm largely unaware of the efficiency issues/bottlenecks that come with the JavaScript language.
Example
I recently built an impound management application for a towing company. I used the jQuery UI dialog widget and populated a datagrid with specific ticket data. Now, this sounds very simple on the surface... but there is a LOT of data being passed around here.
(and now for the question... drumroll please...)
I'm wondering what the pros/cons are for each of the following options.
1) Make only one request for a given ticket and store it permanently in the DOM, simply showing/hiding the modal window. This means only one request is sent out per ticket.
2) Make a request every time a ticket is open and destroy it when it's closed.
My natural inclination was to store the tickets in the DOM - but I'm concerned that this will eventually start to hog a ton of memory if the application goes a long time without being reset (which it will be).
I'm really just looking for pros/cons for both of those two options (or something neat I haven't even heard of =P).
The solution here depends on the specifics of your problem, as the 'right' answer will vary based on length of time the page is left open, size of DOM elements, and request latency. Here are a few more things to consider:
Keep only the newest n items in the cache. This works well if you are only likely to redisplay items in a short period of time.
Store the data for each element instead of the DOM element, and reconstruct the DOM on each display.
Use HTML5 Storage to store the data instead of DOM or variable storage. This has the added advantage that data can be stored across page requests.
Any caching strategy will need to consider when to invalidate the cache and re-request updated data. Depending on your strategy, you will need to handle conflicts that result from multiple editors.
The best way is to get started using the simplest method, and add complexity to improve speed only where necessary.
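As a rough sketch of the first suggestion (keep only the newest n items), a Map works nicely in modern JavaScript because it preserves insertion order (the size limit and ticket shape are made up for the example):

const MAX_CACHED_TICKETS = 20;
const ticketCache = new Map();

function cacheTicket(id, data) {
    ticketCache.delete(id);           // re-inserting moves this ticket to "newest"
    ticketCache.set(id, data);
    if (ticketCache.size > MAX_CACHED_TICKETS) {
        const oldestId = ticketCache.keys().next().value;
        ticketCache.delete(oldestId); // evict the least recently cached ticket
    }
}

function getCachedTicket(id) {
    return ticketCache.get(id);       // undefined -> fetch it from the server
}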
The third path would be to store the data associated with a ticket in JS, and create and destroy DOM nodes as the modal window is summoned/dismissed (jQuery templates might be a natural solution here.)
That said, the primary reason you avoid network traffic seems to be user experience (the network is slower than RAM, always). But that experience might not actually be degraded by making a request every time, if it's something the user intuits involves loading data.
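Sketching that third path with jQuery UI (the ticket fields and the ticketData store are assumptions for the example):

const ticketData = {};   // id -> plain data object kept in JS, not in the DOM

function openTicket(id) {
    const ticket = ticketData[id];
    const $dialog = $('<div>')
        .append($('<h3>').text(ticket.title))
        .append($('<p>').text(ticket.description));

    $dialog.dialog({
        close: function () {
            // Tear down the widget and remove the nodes;
            // the data itself stays in ticketData.
            $(this).dialog('destroy').remove();
        },
    });
}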
I would say number 2 would be best. Because that way if the ticket changes after you open it, that change will appear the second time the ticket is opened.
One important factor is the number of redraws/reflows that are triggered by DOM manipulation. It's much more efficient to build up your content changes and insert them in one go than to do it incrementally, since each increment causes a redraw/reflow.
See: http://www.youtube.com/watch?v=AKZ2fj8155I to better understand this.
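A small sketch of that batching idea (the #ticket-table target and row contents are placeholders):

function renderRows(tickets) {
    const fragment = document.createDocumentFragment();
    tickets.forEach(function (ticket) {
        const row = document.createElement('tr');
        const cell = document.createElement('td');
        cell.textContent = ticket.title;
        row.appendChild(cell);
        fragment.appendChild(row);
    });
    // One insertion -> one reflow, rather than one per row.
    document.querySelector('#ticket-table tbody').appendChild(fragment);
}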

Connecting a form to a Javascript Object (and other best practice advice)

I've been using JavaScript to do lightweight functionality on sites for years - DOM manipulation etc - but just now I'm beginning to investigate using it to do a lot more of the heavy lifting (in combination with PHP). I've only just started getting into OO JS, and I'm still trying to get my head around the best practices and design patterns that work well with it.
To be more specific - my question is, can anyone here suggest techniques for connecting a form to a javascript object?
In my current implementation I have a JS object that can be edited by a (fairly large) form. When I instantiate the object I attach an onchange observer to the form, whose callback syncs the form fields to the object's properties. I'm handling the form submission through AJAX - there is also a periodic request that saves a temporary version of the form info to a MySQL DB. One thing I wonder is whether it is possible to easily handle syncing in the other direction: when the object changes, the form fields update (on form reset, for instance).
I am interested to know if this approach is a correct/sensible one, and more generally I would be very interested to hear advice with regard to OOJS form handling.
Cheers in advance :)
(I'm using Prototype btw)
You can use $("form").serialize(true);
http://www.prototypejs.org/api/form/serialize
You don't need the onchange event; you can just call the serialize() method every time you need to get the form data.
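For example, something along these lines (assuming the form has id="editor-form"; Prototype's $() looks elements up by id):

function syncObjectFromForm(obj) {
    // true -> serialize() returns an object hash instead of a query string
    var values = $('editor-form').serialize(true);
    Object.extend(obj, values);   // Prototype's shallow merge
}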
Why not create a method on your object that resyncs it with the form, and call that on every change of the object? You could create a special change function to ensure it gets called on every change.
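A rough sketch of that reverse direction with Prototype, assuming the form field names match the object's keys (the editor-form id is an assumption):

function syncFormFromObject(obj) {
    $('editor-form').getElements().each(function (field) {
        if (field.name in obj) {
            field.setValue(obj[field.name]);
        }
    });
}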
It's a perfectly reasonable approach. JS doesn't entirely encourage this sort of thing due to its curious object system and in particular the way bound methods are not first-class objects, but with a bit of suitable metaclass and callback glue it's eminently possible.
You might also want to look at a widget library if you'd like to get more of this kind of low-level form handling stuff for free. Haven't tried the ones built on top of Prototype; other possibilities include YUI's one.
Updating the model from the server can be pretty simple. Generally you'd poll with an AJAX request and have the server pass back diffs, if it knows them. Otherwise, just timestamp each object update, send the new object details to the client on each update, and have the client decide how to merge that with any changes the user has made in the meantime.
