What I'm Trying to Do...
I need to use some subproperties which are stored in the user's Meteor.user() object, such as Meteor.user().profile.preferences.preference_one, Meteor.user().profile.preferences.preference_two, et cetera. These subproperties are being used inside reactive autorun blocks because there are recalculations that must be done anytime they change.
My Problem Is...
I've discovered that when I refer to the value of these subproperties from within a reactive block, then the autorun is fired for any change to the Meteor.user() object, including changes which do not affect in any way the data that I am explicitly referencing. For example, if Meteor.user().profile.name is updated, then any autorun that includes Meteor.user().profile.preferences.preference_one or Meteor.user().profile.preferences.preference_two gets fired as well, because they all have a common parent.
I have seen a similar question dealing with limiting the scope of Meteor's reactivity, but it deals with a custom collection, not the Meteor.users collection. I cannot see how the solution there could be made applicable because they are specifying fields in subscriptions to limit what subproperties are published to the client, and in my case, I need all the subproperties of Meteor.user(). But I need to be able to choose which subproperties I am reacting to!
Storing subproperty values locally and then comparing on every change would of course work, but it is a brute-force solution: it requires extra logic, and the autoruns will all still be firing anyway.
I don't know if this is the best way, but have a look at this example:
Tracker.autorun(function() {
  var user = Meteor.user();
  // Copy just the one value you care about into a Session variable.
  if (user && user.profile)
    Session.set('p1', user.profile.preference1);
});

Tracker.autorun(function() {
  // This autorun depends only on the Session variable, so it reruns
  // only when that single value actually changes.
  var p1 = Session.get('p1');
  console.log("p1 is " + p1);
});
The first autorun will fire every time the user data changes; however, the second autorun will fire only when that particular property changes.
David's solution is great (as always).
Just to offer some variety, I'd suggest moving your preferences (or the whole darn profile) to their own collection. Then use a null publication (Meteor.publish(null, ...)) so the client always has access to that collection.
Either solution will work great, it is simply my preference to have nothing except login credentials attached to the critical users collection.
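For illustration, here is a minimal sketch of that idea, assuming a hypothetical Preferences collection keyed by userId (the collection and field names are placeholders, not from the original app):

Preferences = new Mongo.Collection('preferences');

if (Meteor.isServer) {
  // A null publication is always active for every connected client.
  Meteor.publish(null, function () {
    if (!this.userId) return this.ready();
    return Preferences.find({ userId: this.userId });
  });
}

if (Meteor.isClient) {
  Tracker.autorun(function () {
    // Changes to Meteor.user().profile no longer touch this document,
    // so this autorun only reruns when the preferences document changes.
    var prefs = Preferences.findOne({ userId: Meteor.userId() });
    if (prefs) {
      console.log('preference_one is', prefs.preference_one);
    }
  });
}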
When I'm offline, if I add an object at a path where a Cloud Function is listening and then delete it while still offline, then when I go back online the Firebase servers receive the object's creation immediately followed by its deletion.
The problem is that the creation triggers a Cloud Function. That function fetches some data at another path and adds it to the object that was just created. But because the object was deleted while offline, it ends up being deleted, and then the Cloud Function partially recreates it when it writes the data it went to grab somewhere else.
Because I don't want to have to track every single object I create/delete, I thought about checking whether the object still exists right before saving that data. The problem is that at the time of the check the object still exists, but by the time I save the data into it, it doesn't exist anymore.
What are my options? I thought about adding a 0.5s sleep but I don't think it's the best practice.
First of all, there's not much you can do on the client app to help this situation. Everything you do to compensate for this will be in Cloud Functions.
Second of all, you have to assume that events could be delivered out of order. Deletes could be processed by Cloud Functions before creates. If your code does not handle this case, you can expect inconsistency.
You said "I don't want to have to track every single object I create/delete", but the fact of the matter is that this is the best option if you want consistent handling of events that could happen out of order. There is no easy way out of this situation if you're using Cloud Functions. On top of that, your functions should be idempotent, so they can handle events that could be delivered more than once.
One alternative is to avoid making changes to documents, and instead push "command" objects to Cloud Functions that tell it the things that should change. This might help slightly, but you should also assume that these commands could arrive out of order.
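As a rough sketch of that command approach (the /commands and /items paths and the 'attachMetadata' shape are placeholders, not an existing schema), the function consumes each command exactly once and re-checks the target right before writing:

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.processCommand = functions.database
  .ref('/commands/{commandId}')
  .onCreate(async (snapshot, context) => {
    const command = snapshot.val();
    const targetRef = admin.database().ref('/items/' + command.targetId);

    // Re-check existence right before writing; this narrows (but cannot
    // fully close) the race described in the question.
    const target = await targetRef.once('value');
    if (!target.exists()) {
      return snapshot.ref.remove();   // target was deleted, drop the command
    }

    await targetRef.child('metadata').set({ attachedBy: 'processCommand' });
    return snapshot.ref.remove();     // consume the command so redelivery is a no-op
  });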
This is all part of the downside of serverless backends. The upside is that you don't have to set up, manage, and deallocate server instances. But your code has to be resilient to these issues.
I have a case where I need to pass a variable from a modal back to the main screen. I have decided the cleanest way to do so is to store that value in localStorage so that I can pass it back to the main file and then display it on the main page.
Is there a way to listen for the localStorage item being changed? I've been searching, but have only found information about getting and setting the variable.
My item looks like this.
var length = $('.table').find('tbody').find('tr').length;
localStorage.setItem('length', length);
There's a storage event on the window object for localStorage changes.
MDN localStorage API
So you can watch for changes. But note that the event only fires in other windows/frames of the same origin, not in the window that made the change. So if your modal and main page are separate same-origin frames, this is your choice.
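A minimal sketch of listening for it, using the 'length' key from the question:

// Fires in other same-origin windows/frames, not in the one that called setItem.
window.addEventListener('storage', function (event) {
  if (event.key === 'length') {
    console.log('length changed from', event.oldValue, 'to', event.newValue);
  }
});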
You can use a global variable: define it outside the function and it will live in the global scope.
var globalVariable = {
  x: 'globalval'
};
or use...
window.globalvar1 = 'test'; // this sets it on the global (window) object
That way it will be bound to the window object.
When you set the item, always do a get first and compare with the stored value;
if the value has changed, notify, otherwise set it.
var length = $('.table').find('tbody').find('tr').length;
var stored = localStorage.getItem('length');
if (stored !== null && Number(stored) !== length) {
  notifyChange();                        // stored value differs from the current one
} else {
  localStorage.setItem('length', length);
}
Note: This is an expensive and overly complicated approach but meets your needs for pure localStorage.
If you must use localStorage, the only way I can think of to watch/listen for a value in localStorage is something like this:
setInterval(function() {
  var modalValue = localStorage.getItem('length');
  if (modalValue !== null) {
    // use the value
    // then remove it so this code doesn't run again on the next tick
    localStorage.removeItem('length');
  }
}, 100);
This will check for the variable every 1/10 of a second and run the code when it's set.
You mentioned in a comment that it is complicated and that there are iframes involved. Perhaps you can leverage the postMessage API to better meet your needs?
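A rough sketch of that, assuming the modal lives in an iframe; the message shape ({ type: 'rowCount', ... }) is made up for the example:

// In the modal iframe: send the value straight to the parent page.
var length = $('.table').find('tbody').find('tr').length;
window.parent.postMessage({ type: 'rowCount', length: length }, '*'); // use a real target origin in production

// In the main page: listen for the message.
window.addEventListener('message', function (event) {
  // Check event.origin before trusting the data in real code.
  if (event.data && event.data.type === 'rowCount') {
    console.log('received length:', event.data.length);
  }
});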
Don't use localStorage for critical functionality; localStorage is blocked completely when using private browsing on certain browsers.
Only use localStorage for non-critical enhancements.
There is no callback for localStorage, so you won't be able to monitor changes as they happen. You could, however, pull the value on page load, store it in your execution context, and watch for that value to change (see the sketch below).
Otherwise, to do what you want will require using a database.
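A minimal sketch of that polling idea, again assuming the 'length' key from the question:

var lastSeen = localStorage.getItem('length');

setInterval(function () {
  var current = localStorage.getItem('length');
  if (current !== lastSeen) {      // value was changed by some other code
    lastSeen = current;
    console.log('length changed to', current);
  }
}, 200);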
Don't do this; find a dependency-injection mechanism that will let you pass the variable in a sane manner. Do that either with something akin to Angular.js' service/factory pattern, or a React store.
Leveraging a mechanism like that will also allow you to listen for changes to whatever you're storing. In fact, that's the whole point of things like RxJS Observables which are at the core of Angular2+'s framework.
Additionally, something like CouchDB would let you persist this data asynchronously and still not have the drawbacks of coordinating globals through raw localStorage.
Allowing for uncontrolled access to some global variable is just asking for a defect.
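As a tiny sketch of that idea, here is a shared, observable value built on an RxJS Subject instead of localStorage (names are illustrative; this assumes RxJS is available):

import { BehaviorSubject } from 'rxjs';

// One shared, observable value instead of a raw global or localStorage key.
const rowCount$ = new BehaviorSubject(0);

// Main screen: react whenever the value changes.
rowCount$.subscribe(function (length) {
  console.log('row count is now', length);
});

// Modal: push the new value.
rowCount$.next(42);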
Most of Meteor revolves around collections and cursors, and fetching new documents when they appear in a collection and match the criteria. Yet I am working with bigger documents that contain multiple fields and have a deep, unpredictable structure. At the top level there is a clear schema, but some subdocuments are unpredictable JSON data.
But let's look at a simpler example:
Reports = new Mongo.Collection('reports');

Meteor.publish('reports', function() {
  return Reports.find({});
});
and then on the client side I open a report, put it on screen using rather complicated not-only-HTML rendering functionality, and there is a free-text comment field embedded within the report. When it is changed, I want to save it automatically:
Meteor.call("autosaveReport",reportId,comment);
and then there is a Meteor method that writes the comment:
Meteor.methods({
  "autosaveReport": function(reportId, comment) {
    Reports.update({_id: reportId}, {$set: {comment: comment}});
  }
});
The problem is that every time the comment is autosaved, Meteor's Tracker reruns all the subscriptions and find()s related to this report. And since the report is big and has complicated rendering, that reload is visible to the user and defeats the purpose of seamless autosaving.
So, the question: is it possible to trigger reactivity only on parts of a Mongo document? Currently I have solved it by manually comparing the old and new document on rendering, and if there is no difference in the core fields, stopping the re-render. That feels odd and against the Meteor spirit.
In your helper or route that sets the data context for your template, use {reactive: false} in the find:
return Reports.find(query,{reactive: false});
That way the helper won't update when the underlying object changes.
That flag is all or nothing, however; it doesn't let you be selective about which changes to observe and which to ignore.
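One middle ground you could try is giving the helper a field projection that excludes the autosaved field, so a comment-only update leaves the projected document unchanged. This relies on minimongo only invalidating the computation when projected fields change, which I believe is the case but you should verify in your Meteor version; the template and field names below are illustrative:

Template.report.helpers({
  reportBody: function () {
    // Excluding `comment` means a $set on comment alone leaves the
    // projected result unchanged for this computation.
    return Reports.findOne(this.reportId, { fields: { comment: 0 } });
  }
});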
I am developing a firefox extension where I need to save the state of an arbitrary web page in order to be able to restore that webpage later. The quirk is that I need to restore the entire state of the page, including the state of all javascript variables. The "saving" can be done in memory, it doesn't need to be serializable.
So, is there a way to exactly clone a browser element, so that it starts running from the same point of execution that the original is currently at?
If not, how much effort would it require to add this to firefox (using C++), and which files and documentation would I start looking at?
No, there isn't a way to do exactly what you want. Even the built-in session restore will only restore form fields (and some other selected things), but not the full JS and native object state.
Implementing something like this yourself is not feasible (and would also be a massive task):
You could uneval() most JS objects, but that will lose type information and you'll only get the source, not any internal state (think "hidden" state via closures). Native objects like window or document need some special treatment, and getting their internal state isn't always possible here without some C++-level "reflection".
You could potentially get a lot of the actual state using the debugger API in new ways; however, I don't see any way to actually restore it later. And "a lot" is still not the same as "all".
About the closed-over "hidden" state:
There is no way I know of to reliably get the internal state of counter in the following example, let alone restore it later, without getting as low-level as a platform-dependent full memory dump.
var count = (function() {
  var counter = 0;                       // hidden state: lives only in this closure
  return function() { return ++counter; };
})();

count();
count();  // counter is now 2, but nothing outside the closure can read it
I guess that you could walk the properties of all objects and save them somewhere but preserving context of e.g. bound functions would be difficult. Maybe you could make some use of the session store?
See:
Session_store_API and nsISessionStore
I have a rather complex web page with various tabs, forms, radio buttons, drop downs, etc. It's all bound using Knockout.js to a very complex JavaScript object that was loaded via an AJAX call. Of course the user can muck with stuff to their heart's content, at which point they hit a Save button to persist all their changes back to the server.
I'm in the process of coming up with a good design to track exactly what was changed on the page so I can implement saving. So, I've come up with a few possible implementations.
Option 1) Just send everything back and let the server sort it out: With this method, I'd let Knockout just update the data source. The Save button would call .toJS() and send that data back to the server. Pros: It's super easy, and takes very little work on the client. Cons: The server doesn't really know what changed and has to either load the data from the database to compare, or just save all the fields again. These fields come from multiple tables and have complex relations. It also treats the entire document as a single atomic unit. If someone else changed Field A and you changed field B, one user is going to lose their change.
Option 2) Use JavaScript to compare the original data and the current data: Using this technique, when the user clicks on the Save button, I would systematically compare the original data and current data and generate a graph of changes. Pros: This would ideally result in a compact graph of exactly what the user changed, and could even no-op if nothing was changed. Cons: The data I'm binding to is complex. It consists of strings, arrays, objects, arrays of objects, arrays of objects with other objects, etc. Looking for changes would be a rather complex nested loop.
Option 3) Track changes as they are being made in the UI: I would have to observe changes as they happen, and keep a delta as UI elements were changed. The Save button would simply send that change graph to the server if it had any pending changes. Pros: No need to compare two huge JavaScript objects looking for changes, but still has all the benefits of option 2. Cons: Knockout doesn't appear to have a standard way to listen to all changes using a single event handler. I believe I would have to resort to binding to all the UI elements, or creating custom bindingHandlers in Knockout to implement this real-time change tracking.
My Question:
My question is mostly for Knockout.js experts. Is there a standard approach, or recommended guidelines to solving this obviously common scenario? Is sending back all the data, even stuff that hasn't changed, a common design? Or are people implementing custom change trackers? Does Knockout provide any sort of framework that eases this requirement?
Update: Found this thing, not sure if it could be useful or if anyone has any feedback on it.
If it's a question of enabling/disabling the Save button, or allowing the user to navigate away from that page/state, then you can check out https://github.com/CodeSeven/kolite
Check knockout.dirtyFlag.js in particular.
Hope this helps.
Edit: remember that you should "never" trust the data coming from the "UI". The real comparison and validation ultimately go in your "controlled" environment on the server.
What I would probably do is take option 2 - the comparison itself can be as simple as stringifying the JS object and comparing it with a cached version of itself.
A few other options are discussed here.
P.S. Maybe ko.mapping can help you manage this monster of a JS object?
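For example, here is a minimal sketch of the stringify-and-compare idea packaged as a reusable dirty flag (a common Knockout pattern; the ko.dirtyFlag name is just illustrative, not part of the library):

ko.dirtyFlag = function (root) {
  var initialState = ko.observable(ko.toJSON(root));
  return {
    // true whenever the current serialized view model differs from the snapshot
    isDirty: ko.computed(function () {
      return initialState() !== ko.toJSON(root);
    }),
    // call after a successful save to re-baseline
    reset: function () {
      initialState(ko.toJSON(root));
    }
  };
};

// Usage: viewModel.dirtyFlag = ko.dirtyFlag(viewModel);
// then enable the Save button off viewModel.dirtyFlag.isDirty().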
I wrote a change tracker extension for knockout that Pete Smith greatly expanded on...
Take a look here:
https://roysvork.wordpress.com/2014/01/12/tracking-changes-to-complex-viewmodels-with-knockout-js/
It works on the principle of extending the observable to track initial state vs. changes the user has made on the client. I think this works really great and can give users real-time feedback to know what they've modified. In practice, we actually implement a save panel that shows all pending changes and even lets them undo individual changes, all by using the change tracker's reusable capability.
ko.extenders.trackChange = function (target, track) {
  if (track) {
    target.isDirty = ko.observable(false);
    target.originalValue = target();
    target.subscribe(function (newValue) {
      // use != not !== so numbers will equate naturally
      target.isDirty(newValue != target.originalValue);
    });
  }
  return target;
};
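Example usage of the extender above (the observable name is just for illustration):

var firstName = ko.observable('Bob').extend({ trackChange: true });

firstName('Robert');
console.log(firstName.isDirty());        // true
console.log(firstName.originalValue);    // 'Bob'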