Given that an object may contain an own property called "hasOwnProperty":
> a={abc: 123};
{ abc: 123 }
> a.hasOwnProperty("abc");
true
> a['hasOwnProperty'] = 1;
1
> a.hasOwnProperty("abc");
TypeError: a.hasOwnProperty is not a function
...
This works, but it's kind of an ugly interface when you think of Object.keys(), Object.assign(), etc. So, is there a better way?
> Object.hasOwnProperty.call(a, "abc");
true
> Object.hasOwnProperty.call(a, "hasOwnProperty");
true
And why shouldn't that solution be the only recommended way? Calling methods directly on an object seems like a recipe for failure, especially if it contains external data (not under one's control).
The appropriate/recommended way to use hasOwnProperty is as a filter, or a means to determine whether an object... well, has that property. Just the way you are using it in your second command, a.hasOwnProperty('abc').
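For illustration, the typical filter pattern looks like this (a minimal sketch, not taken from the question):
var base = { inherited: true };
var child = Object.create(base);
child.own = 1;
for (var key in child) {
  // skip anything coming from the prototype chain
  if (Object.prototype.hasOwnProperty.call(child, key)) {
    console.log(key); // logs "own" only, never "inherited"
  }
}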
Overwriting the object's hasOwnProperty property with a['hasOwnProperty'] = 1, while safe and valid, simply removes the ability to call the hasOwnProperty method on that object.
Am I missing your true question here? It seems like you already knew this from your example.
By
'using methods directly from an object seems like a recipe for failure',
are you referring to something like this:
> dog = {speak: function() {console.log('ruff! ruff!')}};
> dog.speak(); // ruff! ruff!
Because that is extremely useful in many ways as you can imagine.
If you can use ECMAScript 2015 you can try Reflect.getOwnPropertyDescriptor.
It returns a property descriptor of the given property if it exists on the object, undefined otherwise.
To simplify you can create this function:
var hasOwnProp = (obj, prop) => Reflect.getOwnPropertyDescriptor(obj, prop) !== undefined;
var obj = new Object();
obj.prop = 'exists';
console.log('Using hasOwnProperty')
console.log('prop: ' + obj.hasOwnProperty('prop'));
console.log('toString: ' + obj.hasOwnProperty('toString'));
console.log('hasOwnProperty: ' + obj.hasOwnProperty('hasOwnProperty'));
var hasOwnProp = (obj, prop) => Reflect.getOwnPropertyDescriptor(obj, prop) !== undefined;
console.log('Using getOwnPropertyDescriptor')
console.log('prop: ' + hasOwnProp(obj, 'prop'));
console.log('toString: ' + hasOwnProp(obj, 'toString'));
console.log('hasOwnProperty: ' + hasOwnProp(obj, 'hasOwnProperty'));
obj['hasOwnProperty'] = 1;
console.log('hasOwnProperty: ' + hasOwnProp(obj, 'hasOwnProperty'));
Any built-in can be overridden in JS - it's generally considered best practice to avoid overriding any native methods where possible. If the original functionality is preserved it's OK, as it will still behave as expected and could possibly even be extended further if overridden correctly again.
Since that's considered best practice, I recommend remapping the keys to avoid overriding them. If remapping the keys is not an option, you can make it feel a little less messy by locally referencing or wrapping Object.hasOwnProperty or Object.prototype.hasOwnProperty. In the case of hasOwnProperty you could also implement an iterator method (as iterating over enumerable non-inherited properties is a very common use of hasOwnProperty) to reduce the likelihood of it being called directly. There is still the risk of someone less familiar with your object attempting to iterate over it directly, so I really feel that key mapping is the safer bet, even if it does cause a slight difference between server-side keys and local ones.
A key mapping could be as simple as a suffix, using hasOwnProperty_data instead of hasOwnProperty. This would mean objects behave as expected, and your IDE's autocomplete will likely still be close enough to tell what the property represents.
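As a rough sketch (the helper name hasOwn is made up for illustration), locally wrapping the prototype method could look like this:
// Cache the prototype method once and always call it explicitly, so it keeps
// working even when an object carries its own "hasOwnProperty" data key
var hasOwn = Function.prototype.call.bind(Object.prototype.hasOwnProperty);
var data = { hasOwnProperty: 1, abc: 123 };
hasOwn(data, 'abc');            // true
hasOwn(data, 'hasOwnProperty'); // true (the own data key)
hasOwn(data, 'toString');       // false (inherited only)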
A mapping function might look like the following:
function remapKeys(myObj){
for(var key in myObj){
if(Object.prototype.hasOwnProperty.call(myObj, key)){
if((key in Object) && Object[key] !== myObj[key]){ // Check key is present on Object and that it's different ie an overridden property
myObj[key + "_data"] = myObj[key];
delete myObj[key]; // Remove the key
}
}
}
return myObj; // Alters the object directly so no need to return but safer
}
// Test
var a = {};
a.hasOwnProperty = function(){ return 'overridden'; };
a.otherProp = 'test';
remapKeys(a);
console.log(a); // a { hasOwnProperty_data : function(){ return 'overridden';}, otherProp: 'test' }
console.log(a.hasOwnProperty('otherProp')); // true
Related
Given the developments of JavaScript since the languages' inception, why is there not a built in method that checks if an object is a plain object?
Or does the method in fact exist?
You can check the type and the instance of an object this way:
var a = new Date();
console.log(typeof a);
console.log(a instanceof Date);
var b = "Hello";
console.log(typeof b);
console.log(b instanceof Date);
Updated according to the comments from the OP:
let arr = [1, 2, true, 4, {
"abc": 123
},
6, 7, {
"def": 456
},
9, [10], {}, "[object Object]"
];
arr.forEach(function(v) {
if (typeof v == "object" && !(v instanceof Array) && v != null)
console.log("Object Found");
else
; // console.log("Na");
});
The above code snippet outputs Object Found three times.
There is no explicit, direct way to check whether a value is an object, i.e. whether it belongs to the Object type, but there are some foolproof ways to do it. I wrote a list in another answer; the most succinct seems to be
function isObject(value) {
return Object(value) === value;
}
A feature like this has been requested multiple times on esdiscuss. For example,
What is an Object Type(O)?
Juriy Zaytsev "kangax" wonders about a proper way to check if a value is an object.
typeof null
Brendan Eich: "I think we should consider Object.isObject"
Jorge: "Why not .isPrimitive()?"
ES6 doesn't need opt-in
Brendan Eich: "We want sane isObject and isNull predicates"
Axel Rauschmayer: "predicates such as isObject() and isPrimitive()"
In fact, Object.isObject was proposed as a strawman, and it appeared in an early ES6 draft.
TC39 bashing: Discussion about Object.isObject in the ES6 draft.
How primitive are Symbols? Bignums? etc: discusses x === Object(x)
The Object.isObject strawman was eventually rejected and removed from the ES6 draft.
More recently,
ES8 Proposal: Optional Static Typing (Brandon Andrews): Includes Object.isObject
Now there is the is{Type} Methods stage 0 proposal which includes Object.isObject among lots of various other checks.
So there is still hope and eventually we may have something like this.
The above is for testing objects in general. If you don't want that you should define what "plain object" means for you.
For example, you can test the constructor property. But any object can customize it.
You can use Object.prototype.toString to get the legacy ES5 [[Class]]. But any object can customize that via Symbol.toStringTag.
You can check the value returned by [[GetPrototypeOf]]. But even exotic objects might allow their prototype to be changed to whatever arbitrary object or null. And Proxy objects even have full control over that internal method.
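To illustrate the first two points (a small sketch, not part of the original answer):
// The constructor check can be fooled by assigning the property directly
var fake = Object.create(null);
fake.constructor = Object;
console.log(fake.constructor === Object); // true, yet hardly a "normal" object

// The Object.prototype.toString check can be fooled via Symbol.toStringTag
var arr = [];
arr[Symbol.toStringTag] = 'Object';
console.log(Object.prototype.toString.call(arr)); // "[object Object]"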
So most probably you won't be able to rely on these tests. And adding something to the standard may be hard because different people may want different things.
What I would like is some way to check if an object is an ordinary one. That is, it has the default behaviour for the essential internal methods that must be supported by all objects.
Once you know that an object is ordinary, you can rely on things like [[GetPrototypeOf]] to customize the test to your tastes.
Relying on the [object Object] string representation is inaccurate. That behaviour can be changed for any object:
let o = { toString: () => '...' };
('' + o) !== '[object Object]'
var a = [];
a.toString = () => '[object Object]';
('' + a) === '[object Object]';
The most solid way to check if a value is a plain object is
let o = {}
Object.getPrototypeOf(o) === Object.prototype
And assuming that the constructor property hasn't been tampered with, the most straightforward way to check if a value is a plain object is
let o = {}
o.constructor === Object
This covers all POJOs constructed from Object and doesn't cover Object.create(null, { ... }) or any child classes (including built-ins like RegExp or Array):
Object.create(null).constructor !== Object
[].constructor !== Object
(new class {}).constructor !== Object
One possible reason why there is no dedicated method to check for object plainness is that a restriction to use only {} objects is not practical. It makes very little sense in the context of JS, since it would prevent the use of any class instances or relatively 'plain' objects (Object.create({}, ...)).
Such a restriction would require a hack in order for desired non-plain objects to pass the check:
Object.assign({}, (new class {})).constructor === Object
In most cases of object checking, the 'everything which is not forbidden is allowed' principle pays off (with extra caution regarding the infamous null inconsistency).
Applying the above to this case, a safe and concise condition to filter non-array objects is
o && typeof o === 'object' && !Array.isArray(o)
And a condition to filter objects that are not built-ins (functions, Array, RegExp, etc) is
o && (o.constructor === Object || !/\[native code\]/.test(o.constructor))
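Wrapped into helpers (a sketch based on the conditions above; the function names are made up):
function isNonArrayObject(o) {
  return Boolean(o) && typeof o === 'object' && !Array.isArray(o);
}
function isNonBuiltinObject(o) {
  return Boolean(o) && (o.constructor === Object || !/\[native code\]/.test(o.constructor));
}
isNonArrayObject({});           // true
isNonArrayObject([1, 2]);       // false
isNonBuiltinObject(new Date()); // false, Date is a built-in constructor
isNonBuiltinObject(new (class Foo {})()); // true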
Just for the sake of further documenting different ways:
One way I can think of:
JSON.stringify(testValue)[0] === '{';
Keep in mind that objects with circular references cannot be stringified. However, if you are certain that all testValues cannot have circular references, you have yourself a way to check against Null, Arrays, and any primitive value to ensure that you have an Object.
I suggest that if you plan on using this throughout your code though, that you define a helper function that actually implements it, in case you find that this does not work as you expect and end up having to change the way you check it.
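For instance, wrapped into a helper (a hedged sketch; the name is illustrative and it assumes values are JSON-serializable):
function looksLikePlainObject(value) {
  try {
    return JSON.stringify(value)[0] === '{';
  } catch (e) {
    return false; // circular references (or other non-serializable values) make this throw
  }
}
looksLikePlainObject({ a: 1 });  // true
looksLikePlainObject([1, 2, 3]); // false
looksLikePlainObject(null);      // false, "null"[0] is "n"
looksLikePlainObject(undefined); // false, stringify returns undefined and the index access throws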
Everything in JavaScript is an object, so there is no need to have an isObject API.
I am trying to understand the Object.freeze method of ECMAscript.
My understanding was that it essentially stops changes to all the properties of an object. MDN documentation says:
Prevents new properties from being added to it; prevents existing properties from being removed; and prevents existing properties, or their enumerability, configurability, or writability, from being changed.
This does not seem to be the case, but perhaps I have misinterpreted the docs.
Here is my object, with its enumerable property exampleArray
function myObject()
{
this.exampleArray = [];
}
var obj = new myObject();
obj.exampleArray[0] = "foo";
Now if I freeze the object, I would expect the exampleArray property to be frozen too, as in it can no longer be changed in any way.
Object.freeze(obj);
obj.exampleArray[1] = "bar";
console.log(obj.exampleArray.length); // logs 2
"bar" has been added to the array, thus the frozen object has been changed. My immediate solution is to just freeze the desired property:
Object.freeze(obj.exampleArray);
obj.exampleArray[2] = "boo";
Now changing the array throws an error, as desired.
However, I am developing my application and I don't yet know what will be assigned to my object. My use case is that I have some game objects which are initialized (from an XML file) when the game starts. After this, I do not want to be able to change any of their properties accidentally.
Perhaps I am misusing the freeze method? I would like to be able to freeze the whole object, a sort of recursive freeze. The best solution I can think of here is to loop through the properties and freeze each one.
I've already searched for this question and the only answer says it's an implementation bug. I am using the newest version of Chrome. Any help is appreciated.
Object.freeze is a shallow freeze.
If you look at the description in the docs, it says:
Values cannot be changed for data properties. Accessor properties (getters and setters) work the same (and still give the illusion that you are changing the value). Note that values that are objects can still be modified, unless they are also frozen.
If you want to deep-freeze an object, here's a good recursive example
function deepFreeze(o) {
Object.freeze(o);
Object.getOwnPropertyNames(o).forEach(function(prop) {
if (o.hasOwnProperty(prop)
&& o[prop] !== null
&& (typeof o[prop] === "object" || typeof o[prop] === "function")
&& !Object.isFrozen(o[prop])) {
deepFreeze(o[prop]);
}
});
return o;
}
function myObject() {
this.exampleArray = [];
}
var obj = deepFreeze(new myObject());
obj.exampleArray[0] = "foo";
console.log(obj); // exampleArray is unchanged
Set the property descriptors for the object to writable: false, configurable: false using Object.defineProperties; then call Object.preventExtensions on the object. See How to create static array in javascript.
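A minimal sketch of that approach (the property names here are only illustrative):
// Lock down the known properties, then forbid adding any new ones
var settings = { width: 100, height: 50 };
Object.defineProperties(settings, {
  width:  { writable: false, configurable: false },
  height: { writable: false, configurable: false }
});
Object.preventExtensions(settings);
settings.width = 999;  // ignored (throws in strict mode)
settings.depth = 10;   // cannot add new properties either
console.log(settings); // { width: 100, height: 50 }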
What's the fastest alternative to
JSON.parse(JSON.stringify(x))
There must be a nicer/built-in way to perform a deep clone on objects/arrays, but I haven't found it yet.
Any ideas?
No, there is no built-in way to deep clone objects.
And deep cloning is a difficult thing to deal with, full of edge cases.
Let's assume that a method deepClone(a) should return a "deep clone" of a.
Now a "deep clone" is an object with the same [[Prototype]] and having all the own properties cloned over.
For each property that is cloned over, if it has own properties of its own, those are cloned too, recursively.
Of course we're keeping the metadata attached to properties, like [[Writable]] and [[Enumerable]], intact. And we will just return the value if it's not an object.
var deepClone = function (obj) {
try {
var names = Object.getOwnPropertyNames(obj);
} catch (e) {
if (e.message.indexOf("not an object") > -1) {
// is not object
return obj;
}
}
var proto = Object.getPrototypeOf(obj);
var clone = Object.create(proto);
names.forEach(function (name) {
var pd = Object.getOwnPropertyDescriptor(obj, name);
if (pd.value) {
pd.value = deepClone(pd.value);
}
Object.defineProperty(clone, name, pd);
});
return clone;
};
This will fail for a lot of edge cases.
Live Example
As you can see, you can't deep clone objects generically without breaking their special properties (like .length in an array). To fix that you have to treat Array separately, and then treat every special object separately.
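To illustrate the kind of special-casing needed, building on the deepClone above (a rough sketch, still far from complete):
var deepCloneWithArrays = function (obj) {
  if (Array.isArray(obj)) {
    // map over the elements so .length stays consistent with the contents
    return obj.map(deepCloneWithArrays);
  }
  return deepClone(obj); // generic path from above; still ignores Date, RegExp, Map, DOM nodes, ...
};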
What do you expect to happen when you do deepClone(document.getElementById("foobar")) ?
As an aside, shallow clones are easy.
Object.getOwnPropertyDescriptors = function (obj) {
var ret = {};
Object.getOwnPropertyNames(obj).forEach(function (name) {
ret[name] = Object.getOwnPropertyDescriptor(obj, name);
});
return ret;
};
var shallowClone = function (obj) {
return Object.create(
Object.getPrototypeOf(obj),
Object.getOwnPropertyDescriptors(obj)
);
};
I was actually comparing it against angular.copy
You can run the JSperf test here:
https://jsperf.com/angular-copy-vs-json-parse-string
I'm comparing:
myCopy = angular.copy(MyObject);
vs
myCopy = JSON.parse(JSON.stringify(MyObject));
This is the fastest of all the tests I could run on all my computers.
The 2022 solution for this is to use structuredClone
See : https://developer.mozilla.org/en-US/docs/Web/API/structuredClone
structuredClone(x)
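A quick illustration: unlike the JSON round trip, structuredClone also handles nested objects, Dates, Maps, and cyclic references.
const original = { when: new Date(), nested: { list: [1, 2, 3] } };
original.self = original; // cyclic reference; JSON.stringify would throw here

const copy = structuredClone(original);
console.log(copy.nested !== original.nested); // true, deeply copied
console.log(copy.self === copy);              // true, the cycle is preserved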
Cyclic references are not really an issue. I mean they are but that's just a matter of proper record keeping. Anyway quick answer for this one. Check this:
https://github.com/greatfoundry/json-fu
In my mad scientist lab of crazy javascript hackery I've been putting the basic implementation to use in serializing the entirety of the javascript context, including the entire DOM from Chromium, sending it over a websocket to Node and reserializing it successfully. The only problematic cyclic issue is navigator.mimeTypes and navigator.plugins referencing one another to infinity, but that is easily solved.
(function(mimeTypes, plugins){
delete navigator.mimeTypes;
delete navigator.plugins;
var theENTIREwindowANDdom = jsonfu.serialize(window);
WebsocketForStealingEverything.send(theENTIREwindowANDdom);
navigator.mimeTypes = mimeTypes;
navigator.plugins = plugins;
})(navigator.mimeTypes, navigator.plugins);
JSONFu uses the tactic of creating Sigils which represent more complex data types. Like a MoreSigil which say that the item is abbreviated and there's X levels deeper which can be requested. It's important to understand that if you're serializing EVERYTHING then it's obviously more complicated to revive it back to its original state. I've been experimenting with various things to see what's possible, what's reasonable, and ultimately what's ideal. For me the goal is a bit more auspicious than most needs in that I'm trying to get as close to merging two disparate and simultaneous javascript contexts into a reasonable approximation of a single context. Or to determine what the best compromise is in terms of exposing the desired capabilities while not causing performance issues. When you start looking to have revivers for functions then you cross the land from data serialization into remote procedure calling.
A neat hacky function I cooked up along the way classifies all the properties on an object you pass to it into specific categories. The purpose for creating it was to be able to pass a window object in Chrome and have it spit out the properties organized by what's required to serialize and then revive them in a remote context. Also to accomplish this without any sort of preset cheatsheet lists, like a completely dumb checker that makes the determinations by prodding the passed value with a stick. This was only designed and ever checked in Chrome and is very much not production code, but it's a cool specimen.
// categorizeEverything takes any object and will sort its properties into high level categories
// based on its profile in terms of what it can do in JavaScript land. It accomplishes this task with a bafflingly
// small amount of actual code by being extraordinarily uncareful, forcing errors, and generally just
// throwing caution to the wind. But it does a really good job (in the one browser I made it for, Chrome,
// and mostly works in webkit, and could work in Firefox with a modicum of effort)
//
// This will work on any object but its primarily useful for sorting the shitstorm that
// is the webkit global context into something sane.
function categorizeEverything(container){
var types = {
// DOMPrototypes are functions that get angry when you dare call them because IDL is dumb.
// There's a few DOM protos that actually have useful constructors and there currently is no check.
// They all end up under Class which isn't a bad place for them depending on your goals.
// [Audio, Image, Option] are the only actual HTML DOM prototypes that sneak by.
DOMPrototypes: {},
// Plain object isn't callable, Object is its [[proto]]
PlainObjects: {},
// Classes have a constructor
Classes: {},
// Methods don't have a "prototype" property and their [[proto]] is named "Empty"
Methods: {},
// Natives also have "Empty" as their [[proto]]. This list has the big boys:
// the various Error constructors, Object, Array, Function, Date, Number, String, etc.
Natives: {},
// Primitives are instances of String, Number, and Boolean plus bonus friends null, undefined, NaN, Infinity
Primitives: {}
};
var str = ({}).toString;
function __class__(obj){ return str.call(obj).slice(8,-1); }
Object.getOwnPropertyNames(container).forEach(function(prop){
var XX = container[prop],
xClass = __class__(XX);
// dumping the various references to window up front and also undefineds for laziness
if(xClass == "Undefined" || xClass == "global") return;
// Easy way to rustle out primitives right off the bat,
// forcing errors for fun and profit.
try {
Object.keys(XX);
} catch(e) {
if(e.type == "obj_ctor_property_non_object")
return types.Primitives[prop] = XX;
}
// I'm making a LOT of flagrant assumptions here but process of elimination is key.
var isCtor = "prototype" in XX;
var proto = Object.getPrototypeOf(XX);
// All Natives also fit the Class category, but they have a special place in our heart.
if(isCtor && proto.name == "Empty" ||
XX.name == "ArrayBuffer" ||
XX.name == "DataView" ||
"BYTES_PER_ELEMENT" in XX) {
return types.Natives[prop] = XX;
}
if(xClass == "Function"){
try {
// Calling every single function in the global context without a care in the world?
// There's no way this can end badly.
// TODO: do this nonsense in an iframe or something
XX();
} catch(e){
// Magical functions which you can never call. That's useful.
if(e.message == "Illegal constructor"){
return types.DOMPrototypes[prop] = XX;
}
}
// By process of elimination only regular functions can still be hanging out
if(!isCtor) {
return types.Methods[prop] = XX;
}
}
// Only left with full fledged objects now. Invokability (constructor) splits this group in half
return (isCtor ? types.Classes : types.PlainObjects)[prop] = XX;
// JSON, Math, document, and other stuff gets classified as plain objects
// but they all seem correct going by their actual profiles and functionality
});
return types;
};
I have a Javascript object that I'm trying to use as a "hashmap". The keys are always strings, so I don't think I need anything as sophisticated as what's described in this SO question. (I also don't expect the number of keys to go above about 10 so I'm not particularly concerned with lookups being O(n) vs. O(log n) etc.)
The only functionality I want that built-in Javascript objects don't seem to have, is a quick way to figure out the number of key/value pairs in the object, like what Java's Map.size returns. Of course, you could just do something like:
function getObjectSize(myObject) {
var count=0
for (var key in myObject)
count++
return count
}
but that seems kind of hacky and roundabout. Is there a "right way" to get the number of fields in the object?
There is an easier way spec'd in ECMAScript 5.
Object.keys(..) returns an array of all keys defined on the object. Length can be called on that. Try in Chrome:
Object.keys({a: 1, b: 2}).length; // 2
Note that all objects are basically key/value pairs in JavaScript, and they are also very extensible. You could extend the Object.prototype with a size method and get the count there. However, a much better solution is to create a HashMap type interface or use one of the many existing implementations out there, and define size on it. Here's one tiny implementation:
function HashMap() {}
HashMap.prototype.put = function(key, value) {
this[key] = value;
};
HashMap.prototype.get = function(key) {
if(typeof this[key] == 'undefined') {
throw new ReferenceError("key is undefined");
}
return this[key];
};
HashMap.prototype.size = function() {
var count = 0;
for(var prop in this) {
// hasOwnProperty check is important because
// we don't want to count properties on the prototype chain
// such as "get", "put", "size", or others.
if(this.hasOwnProperty(prop)) {
count++;
}
}
return count;
};
Use as (example):
var map = new HashMap();
map.put(someKey, someValue);
map.size();
A correction: you need to check myObject.hasOwnProperty(key) in each iteration, because there can be inherited attributes. For example, if you run Object.prototype.test = 'test' before the loop, test will also be counted.
And talking about your question: you can just define a helper function, if speed doesn't matter. After all, we define helpers for the trim function and other simple things. A lot of JavaScript is "kind of hacky and roundabout" :)
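Such a helper could look like this (a sketch, with the hasOwnProperty guard from the correction above included):
function getOwnKeyCount(myObject) {
  var count = 0;
  for (var key in myObject) {
    // the hasOwnProperty guard skips anything inherited from the prototype chain
    if (Object.prototype.hasOwnProperty.call(myObject, key)) {
      count++;
    }
  }
  return count;
}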
update
Failure example, as requested.
Object.prototype.test = 'test';
var x = {};
x['a'] = 1;
x['b'] = 2;
The count returned will be 3.
you could also just do myObject.length (in arrays)
nevermind, see this: JavaScript object size
That's all you can do. Clearly, JavaScript objects are not designed for this. And this will only give you the number of Enumerable properties. Try getObjectSize(Math).
I'm trying to figure out what's gone wrong with my JSON serializing. I have the current version of my app and an old one, and I'm finding some surprising differences in the way JSON.stringify() works (using the JSON library from json.org).
In the old version of my app:
JSON.stringify({"a":[1,2]})
gives me this:
"{\"a\":[1,2]}"
in the new version,
JSON.stringify({"a":[1,2]})
gives me this:
"{\"a\":\"[1, 2]\"}"
any idea what could have changed to make the same library put quotes around the array brackets in the new version?
Since JSON.stringify has been shipping with some browsers lately, I would suggest using it instead of Prototype’s toJSON. You would then check for window.JSON && window.JSON.stringify and only include the json.org library otherwise (via document.createElement('script')…). To resolve the incompatibilities, use:
if(window.Prototype) {
delete Object.prototype.toJSON;
delete Array.prototype.toJSON;
delete Hash.prototype.toJSON;
delete String.prototype.toJSON;
}
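The conditional loading mentioned above could be sketched roughly like this (the json2.js path is only a placeholder):
// Only pull in the json.org implementation when a native one is missing
if (!(window.JSON && window.JSON.stringify)) {
  var script = document.createElement('script');
  script.src = '/js/json2.js'; // placeholder path; point it at wherever the library is hosted
  document.getElementsByTagName('head')[0].appendChild(script);
}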
The function JSON.stringify(), defined in ECMAScript 5 and above (page 201, the JSON Object; pseudo-code on page 205), uses the function toJSON() when it is available on objects.
Because Prototype.js (or another library that you are using) defines an Array.prototype.toJSON() function, arrays are first converted to strings using Array.prototype.toJSON(), and those strings are then quoted by JSON.stringify(), hence the incorrect extra quotes around the arrays.
The solution is therefore straightforward and trivial (this is a simplified version of Raphael Schweikert's answer):
delete Array.prototype.toJSON
This of course has side effects on libraries that rely on a toJSON() function property for arrays, but I find this a minor inconvenience compared to the incompatibility with ECMAScript 5.
It must be noted that the JSON Object defined in ECMAScript 5 is efficiently implemented in modern browsers and therefore the best solution is to conform to the standard and modify existing libraries.
A possible solution which will not affect other Prototype dependencies would be:
var _json_stringify = JSON.stringify;
JSON.stringify = function(value) {
var _array_tojson = Array.prototype.toJSON;
delete Array.prototype.toJSON;
var r=_json_stringify(value);
Array.prototype.toJSON = _array_tojson;
return r;
};
This takes care of the Array toJSON incompatibility with JSON.stringify and also retains toJSON functionality as other Prototype libraries may depend on it.
Edit to make a bit more accurate:
The key bit of problematic code is in the JSON library from json.org (and other implementations of ECMAScript 5's JSON object):
if (value && typeof value === 'object' &&
typeof value.toJSON === 'function') {
value = value.toJSON(key);
}
The problem is that the Prototype library extends Array to include a toJSON method, which the JSON object will call in the code above. When the JSON object hits the array value it calls toJSON on the array which is defined in Prototype, and that method returns a string version of the array. Hence, the quotes around the array brackets.
If you delete toJSON from the Array object the JSON library should work properly. Or, just use the JSON library.
I think a better solution would be to include this just after prototype has been loaded
JSON = JSON || {};
JSON.stringify = function(value) { return Object.toJSON(value); };
JSON.parse = JSON.parse || function(jsonsring) { return jsonsring.evalJSON(true); };
This makes the prototype function available as the standard JSON.stringify() and JSON.parse(), but keeps the native JSON.parse() if it is available, so this makes things more compatible with older browsers.
I'm not that fluent with Prototype, but I saw this in its docs:
Object.toJSON({"a":[1,2]})
I'm not sure if this would have the same problem the current encoding has, though.
There's also a longer tutorial about using JSON with Prototype.
This is the code I used for the same issue:
function stringify(object){
var Prototype = window.Prototype
if (Prototype && Prototype.Version < '1.7' &&
Array.prototype.toJSON && Object.toJSON){
return Object.toJSON(object)
}
return JSON.stringify(object)
}
You check whether Prototype exists, then you check the version. If it is an old version, use Object.toJSON (if it is defined); in all other cases, fall back to JSON.stringify().
Here's how I'm dealing with it.
var methodCallString = Object.toJSON? Object.toJSON(options.jsonMethodCall) : JSON.stringify(options.jsonMethodCall);
My tolerant solution checks whether Array.prototype.toJSON is harmful for JSON stringify and keeps it when possible to let the surrounding code work as expected:
var dummy = { data: [{hello: 'world'}] }, test = {};
if(Array.prototype.toJSON) {
try {
test = JSON.parse(JSON.stringify(dummy));
if(!test || !Array.isArray(test.data)) { // the harmful toJSON turns the array into a string
delete Array.prototype.toJSON;
}
} catch(e) {
// there only hope
}
}
As people have pointed out, this is due to Prototype.js - specifically versions prior to 1.7. I had a similar situation but had to have code that operated whether Prototype.js was there or not; this means I can't just delete the Array.prototype.toJSON as I'm not sure what relies on it. For that situation this is the best solution I came up with:
function safeToJSON(item){
if (JSON.stringify([1,2,3]) === '[1,2,3]'){ // native behaviour produces this exact string
return JSON.stringify(item); //sane behavior
} else {
return item.toJSON(); // Prototype.js nonsense
}
}
Hopefully it will help someone.
If you don't want to kill everything, and have a code that would be okay on most browsers, you could do it this way :
(function (undefined) { // This is just to limit _json_stringify to this scope and to redefine undefined in case it was redefined elsewhere
if (true ||typeof (Prototype) !== 'undefined') {
// First, ensure we can access the prototype of an object.
// See http://stackoverflow.com/questions/7662147/how-to-access-object-prototype-in-javascript
if(typeof (Object.getPrototypeOf) === 'undefined') {
if(({}).__proto__ === Object.prototype && ([]).__proto__ === Array.prototype) {
Object.getPrototypeOf = function getPrototypeOf (object) {
return object.__proto__;
};
} else {
Object.getPrototypeOf = function getPrototypeOf (object) {
// May break if the constructor has been changed or removed
return object.constructor ? object.constructor.prototype : undefined;
}
}
}
var _json_stringify = JSON.stringify; // We save the actual JSON.stringify
JSON.stringify = function stringify (obj) {
var obj_prototype = Object.getPrototypeOf(obj),
old_json = obj_prototype.toJSON, // We save the toJSON of the object
res = null;
if (old_json) { // If toJSON exists on the object
obj_prototype.toJSON = undefined;
}
res = _json_stringify.apply(this, arguments);
if (old_json)
obj_prototype.toJSON = old_json;
return res;
};
}
}.call(this));
This seems complex, but this is complex only to handle most use cases.
The main idea is overriding JSON.stringify to remove toJSON from the object passed as an argument, then call the old JSON.stringify, and finally restore it.