Creating a backup of 'this' - javascript

In JavaScript, I have an object (think of it as a shape) that can be put into an editable mode and edited, or into a non-editable mode. While in edit mode, I want a cancel button that discards all edits and returns the shape to its original form. I was hoping to use something like the following, but assigning to 'this' doesn't work. What would the best way to do this be? I would prefer not to use external objects to store backups, because there could be many shapes, and sorting out which backup corresponds to which shape adds code that is not as nicely packaged.
Shape.prototype.edit = function() {
    this.backup = this;
    ...
}
Shape.prototype.cancelEdit = function() {
    this = this.backup;
    ...
}

I think Shape should contain a properties object, for example this.properties. In that object you store all of the shape's data (something like the shape's model: only data, with no methods or other internal class state). Then the backup function backs up only those properties, not the whole Shape object.
(I'm not a native English speaker, feel free to correct my message if needed.)
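For example, a minimal sketch of that idea (the property names and the JSON-based clone are illustrative assumptions, not from the question):
function Shape(props) {
    // all editable state lives in one plain data object
    this.properties = props || { x: 0, y: 0, color: 'black' };
}

Shape.prototype.edit = function() {
    // snapshot only the data; a JSON clone is enough for plain, method-free data
    this._backup = JSON.parse(JSON.stringify(this.properties));
};

Shape.prototype.cancelEdit = function() {
    if (this._backup) {
        this.properties = this._backup;
        this._backup = null;
    }
};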

You could implement something like this: go through each key in the object and, if it's a plain data property and not a function, store it in a backup object kept on the instance itself (which also avoids any external backup storage).
Shape.prototype.backup = function() {
    this._backup = {};
    for (var key in this) {
        if (this.hasOwnProperty(key) && key !== '_backup' && typeof this[key] !== 'function') {
            this._backup[key] = this[key];
        }
    }
};
Shape.prototype.restore = function() {
    for (var key in this._backup) {
        this[key] = this._backup[key];
    }
};
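A brief usage sketch (assuming Shape is a constructor with plain data properties and the two methods above are on its prototype); note the copy is shallow, so nested objects are still shared:
var s = new Shape(/* ... */);
s.x = 1;

s.backup();        // snapshot the data properties
s.x = 99;          // edit
s.restore();       // cancel: back to the snapshot
console.log(s.x);  // 1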

Related

Node.js garbage collection and synchronized objects

This is a little bit tricky to explain, but I'll give it a try:
In a Node.js server application I would like to deal with data objects that can be used in more than one place at once. The main problem is that these objects are only referred to by an object ID and are loaded from the database.
However, as soon as an object is already loaded into one scope, it should not be loaded a second time when requested, but instead the same object should be returned.
This leads me to the question of garbage collection: As soon as an object is no longer needed in any scope, it should be released completely to prevent having the whole database in the server's memory all the time. But here starts the problem:
There are two ways I can think of to handle such a scenario: either use a global object reference (which prevents any object from being collected), or actually duplicate these objects but synchronize them so that each time a property changes in one scope, the other instances are informed of that change.
For the latter, each instance would have to register an event handler, which in turn points back to that instance, thus again preventing it from being collected.
Has anyone come up with a solution for such a scenario that I just haven't realized? Or is there a misconception in my understanding of the garbage collector?
What I want to avoid is manual reference counting for every object in memory. Every time an object is removed from any collection, I would have to adjust the reference count manually (there isn't even a destructor or "reference decreased" event in JS).
Using the weak module, I implemented a WeakMapObj that works like we originally wanted WeakMap to work. It allows you to use a primitive for the key and an object for the data and the data is retained with a weak reference. And, it automatically removes items from the map when their data is GCed. It turned out to be fairly simple.
const weak = require('weak');

class WeakMapObj {
    constructor(iterable) {
        this._map = new Map();
        if (iterable) {
            for (let array of iterable) {
                this.set(array[0], array[1]);
            }
        }
    }
    set(key, obj) {
        if (typeof obj === "object") {
            let ref = weak(obj, this.delete.bind(this, key));
            this._map.set(key, ref);
        } else {
            // not an object, can just use regular method
            this._map.set(key, obj);
        }
    }
    // get the actual object reference, not just the proxy
    get(key) {
        let obj = this._map.get(key);
        if (obj) {
            return weak.get(obj);
        } else {
            return obj;
        }
    }
    has(key) {
        return this._map.has(key);
    }
    clear() {
        return this._map.clear();
    }
    delete(key) {
        return this._map.delete(key);
    }
}
I was able to test it in a test app and confirm that it works as expected when the garbage collector runs. FYI, just making one or two objects eligible for garbage collection did not cause the garbage collector to run in my test app; I had to force a garbage collection to see the effect. I assume that would not be an issue in a real app, where the GC will run when it needs to (which may be only when there's a reasonable amount of work for it to do).
You can use this more generic implementation as the core of your object cache where an item will stay in the WeakMapObj only until it is no longer referenced elsewhere.
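For example, a usage sketch (loadFromDb is a hypothetical synchronous loader standing in for your real database access):
const cache = new WeakMapObj();

function getDataObject(id) {
    let obj = cache.get(id);
    if (!obj) {
        obj = loadFromDb(id);   // plain data object from the database (hypothetical helper)
        cache.set(id, obj);     // held weakly; the entry disappears after the object is GCed
    }
    return obj;                 // everyone shares the same instance while it stays alive
}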
Here's an implementation that keeps the map entirely private so it cannot be accessed from outside of the WeakMapObj methods.
const weak = require('weak');

function WeakMapObj(iterable) {
    // private instance data
    const map = new Map();

    this.set = function(key, obj) {
        if (typeof obj === "object") {
            // replace obj with a weak reference
            obj = weak(obj, this.delete.bind(this, key));
        }
        map.set(key, obj);
    };

    // add methods that have access to the "private" map
    this.get = function(key) {
        let obj = map.get(key);
        if (obj) {
            obj = weak.get(obj);
        }
        return obj;
    };

    this.has = function(key) {
        return map.has(key);
    };

    this.clear = function() {
        return map.clear();
    };

    this.delete = function(key) {
        return map.delete(key);
    };

    // constructor implementation
    if (iterable) {
        for (let array of iterable) {
            this.set(array[0], array[1]);
        }
    }
}
Sounds like a job for a Map object used as a cache storing the object as the value (along with a count) and the ID as the key. When you want an object, you first look up its ID in the Map. If it's found there, you use the returned object (which will be shared by all). If it's not found there, you fetch it from the database and insert it into the Map (for others to find).
Then, to keep the Map from growing forever, the code that fetches an object from the Map also needs to release it when done. When the refCnt goes to zero upon a release, you remove that object from the Map.
This can be made entirely transparent to the caller by creating some sort of cache object that contains the Map and has methods for getting an object or releasing an object and it would be entirely responsible for maintaining the refCnt on each object in the Map.
Note: you will likely have to write the code that fetches an object from the DB and inserts it into the Map carefully in order to not create a race condition, because fetching from the database is likely asynchronous, and multiple callers could all fail to find it in the Map and all be in the process of getting it from the database. How to avoid that race condition depends upon the exact database you have and how you're using it. One possibility is for the first caller to insert a placeholder in the Map so subsequent callers know to wait for some promise to resolve before the object is inserted in the Map and available to them to use.
Here's a general idea for how such an ObjCache could work. You call cache.get(id) when you want to retrieve an item. This always returns a promise that resolves to the object (or rejects if there's an error getting it from the DB). If the object is in the cache already, the promise it returns will be already resolved. If the object is not in the cache yet, the promise will resolve when it has been fetched from the DB. This works even when multiple parts of your code request an object that is "in the process" of being fetched from the DB. They all get the same promise that is resolved with the same object when the object has been retrieved from the DB. Every call to cache.get(id) increases the refCnt for that object in the cache.
You then call cache.release(id) when a given piece of code is done with an object. That will decrement the internal refCnt and remove the object from the cache if the refCnt hits zero.
class ObjCache {
    constructor() {
        this.cache = new Map();
    }
    get(id) {
        let cacheItem = this.cache.get(id);
        if (cacheItem) {
            ++cacheItem.refCnt;
            if (cacheItem.obj) {
                // already have the object
                return Promise.resolve(cacheItem.obj);
            } else {
                // object is pending, return the promise
                return cacheItem.promise;
            }
        } else {
            // not in the cache yet
            let newItem = {refCnt: 1, promise: null, obj: null};
            let p = myDB.get(id).then(function(obj) {
                // replace placeholder promise with actual object
                newItem.obj = obj;
                newItem.promise = null;
                return obj;
            });
            // set placeholder as promise for others to find
            newItem.promise = p;
            this.cache.set(id, newItem);
            return p;
        }
    }
    release(id) {
        let cacheItem = this.cache.get(id);
        if (cacheItem) {
            if (--cacheItem.refCnt === 0) {
                this.cache.delete(id);
            }
        }
    }
}
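A usage sketch (myDB is the same placeholder used in the code above):
const cache = new ObjCache();

cache.get(42).then(function(obj) {
    // ... work with the shared object ...
    cache.release(42);  // decrement refCnt; the entry is dropped when it reaches zero
});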
Ok, for anyone who faces similar problems, I found a solution. jfriend00 pushed me towards this solution by mentioning WeakMaps which were not exactly the solution themselves, but pointed my focus on weak references.
There is an npm module simply called weak that will do the trick. It holds a weak reference to an object and safely returns an empty object once the object was garbage collected (thus, there is a way to identify a collected object).
So I created a class called WeakCache using a DataObject:
class DataObject {
    constructor(objectID) {
        this.objectID = objectID;
        this.dataLoaded = new Promise(function(resolve, reject) {
            loadTheDataFromTheDatabase(function(data, error) { // some pseudo db call
                if (error) {
                    reject(error);
                    return;
                }
                resolve(data);
            });
        });
    }
    loadData() {
        return this.dataLoaded;
    }
}
const weak = require('weak');

class WeakCache {
    constructor() {
        this.cache = {};
    }
    getDataObjectAsync(objectID, onObjectReceived) {
        if (this.cache[objectID] === undefined || this.cache[objectID].loadData === undefined) { // object was not cached yet or was dereferenced, recreate it
            this.cache[objectID] = weak(new DataObject(objectID), function() {
                // remove the reference from the cache when the object gets collected anyway
                delete this.cache[this.objectID];
            }.bind({cache: this.cache, objectID: objectID}));
        }
        this.cache[objectID].loadData().then(onObjectReceived);
    }
}
This class is still in progress, but at least this is one way it could work. The only downside (though this is true for all database-backed data, pun intended, so not such a big deal) is that all data access has to be asynchronous.
What will happen here is that the cache may, at some point, hold an empty reference for every possible object ID.
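Usage would then look something like this (a sketch; the object ID is just an example):
const cache = new WeakCache();

// Every access is asynchronous: the data arrives through the DataObject's promise.
cache.getDataObjectAsync(42, function(data) {
    console.log(data);  // the same cached DataObject is reused while any scope still references it
});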

How to recursively trap all changes to an object?

I'm using a Proxy to detect when an object is modified (and then I save it to disk). This works great for simple properties of the proxied object, but fails on modification of object properties.
var obj = {
    p1: "Hello",
    a1: []
};

var dirtyHandler = {
    set: function(obj, prop, value) {
        markDirty(obj);
        obj[prop] = value;
        return true;
    }
};

var proxied = new Proxy(obj, dirtyHandler);

proxied.p1 = "World";         // <-- proxy detects modification
proxied.a1.push({'foo': 3});  // <-- proxy does not detect modification
Does anyone know how to recursively detect any modification in my object (a1.push(...), a1[0].foo = 4, etc.)?
Here's how I ended up solving this for my use case.
First I add a proxy for all the known objects I care about (not shown). Then, in the handler's set trap, whenever the assigned value is itself an object, I substitute a proxy for it:
var DirtyHandler = function(root) {
    this.root = root;
    this.set = (obj, prop, value) => {
        if (!dirtyIgnores[prop]) {
            debug('Dirty: ' + prop + ' of ' + obj.commitId);
            markDirty(this.root);
        }
        if (value && typeof value === 'object') {
            value = new Proxy(value, this);
        }
        obj[prop] = value;
        return true;
    };
}
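For the question's original failure case (mutating a nested object that already existed before the proxy was created, like a1 above), a complementary approach is to wrap nested values lazily in the get trap. A minimal sketch of that idea, not the code from the answer above:
function deepObserve(target, onChange) {
    return new Proxy(target, {
        get(obj, prop, receiver) {
            const value = Reflect.get(obj, prop, receiver);
            // wrap nested objects on read so writes deep in the tree are trapped too
            if (value && typeof value === 'object') {
                return deepObserve(value, onChange);
            }
            return value;
        },
        set(obj, prop, value) {
            obj[prop] = value;
            onChange(prop, value);  // an array push reports both the new index and the length update
            return true;
        }
    });
}

var watched = deepObserve({ p1: "Hello", a1: [] }, function(prop, value) {
    console.log('changed:', prop, value);
});
watched.a1.push({ foo: 3 });  // now reported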
I published a library on GitHub (Observable Slim) that recursively iterates through a target object and applies a Proxy on all objects. It enables you to monitor all changes that occur under a single target object no matter how deeply nested they are. It also has a few extra features:
Reports back to a specified callback whenever changes occur.
Will prevent user from trying to Proxy a Proxy.
Keeps a store of which objects have been proxied and will re-use existing proxies instead of creating new ones (very significant performance implications).
Allows user to traverse up from a child object and retrieve the parent.
Written in ES5 and plays nice with the Proxy Polyfill so it can be deployed in older browsers fairly easily.
Please feel free to take a look and hopefully contribute as well!

How to get proxy's handler from proxy object?

For example, if I have this handler/proxy (from the MDN example)...
var handler = {
    get: function(target, name) {
        return name in target ?
            target[name] :
            37;
    }
};

var p = new Proxy({}, handler);
p.a = 1;
p.b = undefined;

console.log(p.a, p.b); // 1, undefined
console.log('c' in p, p.c); // false, 37
Is it possible to probe the proxy, p, in some way that allows me to get the handler object back?
Something along the lines of:
p.__handler__ // returns handler object -> Object {get: handler.get(), set: handler.set(), ...}
p.__handler__.get // returns get prop/fn of handler -> function(target, name){ ...}
Obviously, the various traps set up in the handler are still "known" to the proxy, but is there a clear-cut way to return them/ the handler from the proxy itself? If so, how?
I have no specific use-case for this at the moment, but I could see this being useful if you wanted to dynamically change a handler/traps after you already have a proxy.
ECMAScript provides no way to access the internal [[ProxyHandler]] or [[ProxyTarget]] slots.
Some implementations may provide non-standard ways, but don't take that for granted.
For example, in Firefox privileged code, you can check whether an object is a proxy using
Components.utils.isProxy(object);
I proposed implementing similar methods to expose the [[ProxyHandler]] and [[ProxyTarget]]. They told me to implement them in Debugger.Object instead of Components.utils.
When the patch lands, it will be possible to use something like
Components.utils.import('resource://gre/modules/jsdebugger.jsm');
var Cc = Components.classes;
var Ci = Components.interfaces;

// Add a debugger to a new global
var global = new Components.utils.Sandbox(
    Cc["@mozilla.org/systemprincipal;1"].createInstance(Ci.nsIPrincipal),
    { freshZone: true }
);
addDebuggerToGlobal(global);
var dbg = new global.Debugger().addDebuggee(this);

// Create a debugger object for your object, and use the proxy getters
var dbgObj = dbg.makeDebuggeeValue(object);
if (dbgObj.isProxy) { // a boolean
    dbgObj.proxyHandler.unsafeDereference(); // the [[ProxyHandler]]
    dbgObj.proxyTarget.unsafeDereference(); // the [[ProxyTarget]]
}
Add a "special" self descriptor property to getOwnPropertyDescriptor
const target = {
    // fns ...
    // props ...
};

const handler = {
    getOwnPropertyDescriptor(target, prop) {
        if (prop == "[[handler]]") {
            return { configurable: true, enumerable: true, value: this };
        }
        // delegate everything else to the real descriptors on the target
        return Reflect.getOwnPropertyDescriptor(target, prop);
    },
    prop1: 'abcd'
};

const proxy = new Proxy(target, handler);
console.log(Object.getOwnPropertyDescriptor(proxy, '[[handler]]').value.prop1); // 'abcd'
I could see this being useful if you wanted to dynamically change a handler/traps after you already have a proxy
If you just want to add handlers over the (proxied) object you already have access to: you could achieve this by creating a new Proxy that handles the specific traps you want to change, eg:
let newProxyWithDifferentGet = new Proxy(originalProxy, {
    get: (target, key) => { /* ... */ }
});
If you wanted to access the original Proxy's target:
If you are the original Proxy's author, you can just do something like this when you construct it:
let openedProxy = new Proxy(Object.assign(target, {
    _originalHandler: handler,
    _originalTarget: target
}), handler);
If you're not the author, then whether or not that original target should be available to users is the decision of whoever wrote that original Proxy. If you disagree with that author about their encapsulation, that's a social problem, not a technical one, and this is not specific or unique to ES6's Proxies. If you're consuming open-source code, send a PR upstream explaining why you think the original target should be available to users, or just fork their code with your changes and use that, merging their updates to the original repo as you go.
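Along the same lines, if you control where the proxy is created, a WeakMap keyed by the proxy lets you look the handler up later without touching the target at all (a sketch of my own, not from the answers above):
const proxyHandlers = new WeakMap();

function createTrackedProxy(target, handler) {
    const proxy = new Proxy(target, handler);
    proxyHandlers.set(proxy, handler);  // remembered privately, without modifying the target
    return proxy;
}

const tracked = createTrackedProxy({}, {
    get(target, name) {
        return name in target ? target[name] : 37;
    }
});

console.log(proxyHandlers.get(tracked));  // the handler object, visible only to code that can see the WeakMap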

Overriding a function without removing static properties

If I have a function like this:
function a() {
    console.log('a');
}
and then assign a static property like this:
a.static = 'foo';
But say I want to override the function with another function like this:
var old = a;
a = function() {
    console.log('new');
    old.call(this);
};
a.static // undefined
Since I assigned a new function to a, its static properties are lost. Is there a neat way to keep the static properties without looping and manually copying them?
Update:
Here’s a real world scenario: In Bootstrap jQuery plugins, the author assigns defaults to the property function like this:
$.fn.modal = function() {
    // some code
};
$.fn.modal.defaults = { /* some object */ };
So if I want to "extend" the prototype I would normally do:
var old = $.fn.modal;
$.fn.modal = function() {
    // do my thing
    old.apply(this, arguments);
}
But that would make
$.fn.modal.defaults === undefined
This will break the functionality, because the defaults are lost. I was wondering if there is a sneaky way in JavaScript to change only the function without losing the static properties.
No, you cannot do this. Replacing the object (function) always takes any properties with it.
There are two solutions here, and both involve transferring the properties from the old object to the new one.
The first (recommended) approach is to copy the properties, which can be done conveniently with $.extend:
$.fn.plugin = $.extend(function() { ... }, $.fn.plugin);
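Applied to the Bootstrap example from the question, that might look like this (a sketch; it assumes jQuery and the original $.fn.modal are already loaded):
var old = $.fn.modal;

$.fn.modal = $.extend(function() {
    // do my thing
    return old.apply(this, arguments);
}, old);  // copies the static properties, such as defaults, onto the new function

console.log($.fn.modal.defaults);  // still defined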
The second option would be to dynamically set the prototype of the new function to be the old function. For example, in some browsers this would work:
var f = function() { ... };
f.__proto__ = $.fn.plugin;
$.fn.plugin = f;
However this is non-standard and might give rise to complications; don't do it.

Is there a way to catch an attempt to access a non-existent property or method?

For instance this code:
function stuff() {
    this.onlyMethod = function () {
        return something;
    }
}

// some error is thrown
new stuff().nonExistant();
Is there a way to do something like PHP's __call as a fallback from inside the object?
function stuff() {
    this.onlyMethod = function () {
        return something;
    }

    // "catcher" function
    this.__call__ = function (name, params) {
        alert(name + " can't be called.");
    }
}

// would then raise the alert "nonExistant can't be called".
new stuff().nonExistant();
Maybe I'll explain a bit more what I'm doing.
The object contains another object, which has methods that should be accessible directly through this object. But those methods are different for each object, so I can't just route them; I need to be able to call them dynamically.
I know I could just make the object inside it a property of the main object stuff.obj.existant(), but I'm just wondering if I could avoid it, since the main object is sort of a wrapper that just adds some functionality temporarily (and makes it easier to access the object at the same time).
Well, it seems that with Harmony (ES6) there will be a way, although it's more involved than the way other programming languages do it. Basically, it involves using the Proxy built-in object to create a wrapper around the object and modify how its default behavior is implemented:
obj = new Proxy({}, {
    get: function(target, prop) {
        if (target[prop] === undefined) {
            return function() {
                console.log('an otherwise undefined function!!');
            };
        } else {
            return target[prop];
        }
    }
});

obj.f();  // 'an otherwise undefined function!!'
obj.l = function() { console.log(45); };
obj.l();  // 45
The Proxy forwards all operations not handled by a trap to the target object, so it behaves as if it weren't there, and from the proxy you can still modify the target. There are also more traps, including ones for getting the prototype and for setting any property.
As you would imagine, this isn't supported in all browsers right now, but in Firefox you can play with the Proxy interface quite easily; just have a look at the MDN docs.
It would make me happier if they managed to add some syntactic sugar for this, but anyway, it's nice to have this kind of power in an already powerful language. Have a nice day! :)
PS: I didn't copy the Rosetta Code JS entry, I updated it.
There is a way to define a generic handler for calls to non-existent methods, but it is non-standard. Check out __noSuchMethod__ for Firefox. It lets you route calls to undefined methods dynamically. It seems like V8 is also getting support for it.
To use it, define this method on any object:
var a = {};
a.__noSuchMethod__ = function(name, args) {
    console.log("method %s does not exist", name);
};

a.doSomething(); // logs "method doSomething does not exist"
However, if you want a cross-browser approach, then simple try-catch blocks are the way to go:
try {
    a.doSomething();
}
catch (e) {
    // do something
}
If you don't want to write try-catch throughout the code, then you could add a wrapper to the main object through which all function calls are routed.
function main() {
    this.call = function(name, args) {
        if (this[name] && typeof this[name] == 'function') {
            // forward to the real method, keeping "this" and spreading the args array
            this[name].apply(this, args);
        }
        else {
            // handle non-existent method
        }
    };

    this.a = function() {
        alert("a");
    };
}
var object = new main();
object.call('a') // alerts "a"
object.call('garbage') // goes into error-handling code
It seems that you know your way around JS.
Unfortunately, I don't know of such a feature in the language, and I'm pretty sure it does not exist. Your best option, in my opinion, is either to use a uniform interface and extend it, to extend the prototypes your objects inherit from (then you can use instanceof before going ahead with the method call), or to use the somewhat cumbersome '&&' guard to avoid accessing nonexistent properties/methods:
obj.methodName && obj.methodName(arg1, arg2, ...);
You can also extend the Object prototype with Anurag's suggestion ('call').
You can also check if the method exists.
if (a['your_method_that_doesnt_exist'] === undefined) {
    // method doesn't exist
}
