I want to pass the following payload to the API:
params[field1]: value1
params[field2]: value2
....
params[fieldN]: valueN
The field names and values come from an object:
var params = {};
jQuery.each($scope.linkParams, function(a, b) {
params.params[a] = b; // throws an error: params.params is undefined
// I also tried other options, but they all result in one error or another.
// Some don't raise an error but don't get merged. See the merge requirement below.
});
I also want to merge the created object into another object with:
jQuery.extend(extraParams, params);
How can I achieve the required object?
Update
$scope.linkParams = {
field1: 'value1',
field2: 'value2',
....
};
You have two questions, so I'll address them one at a time.
(For a TL;DR, I emboldened the solution text. Hopefully the rest is worth the read, though.)
Object Serialization is Pretty Magical, but Not Quite That Magical
If I had a JS object that I instantiated like the following:
var cat = {
'meow': 'loud',
'type': 'Persian',
'sex': 'male'
}
then it is certainly true that you get attribute references for free. That is, you can say something like cat.meow, and your runtime environment will make sense of that. However, JS will not automatically create properties that do not yet exist just because you reference them; a property only comes into existence when you assign to it directly.
cat.health = 'meek' will work, but cat.ears[0] = 'pointy' will not.
var cat = {
'meow': 'loud',
'type': 'Persian',
'sex': 'male'
}
cat.health = 'meek'
alert(cat.health)
cat.ears[0] = 'pointy'
alert(cat.ears[0])
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
You'll notice that the first alert happens and contains the expected value, but the second alert never comes. This is because the code fails on the line with cat.ears[0] = 'pointy', and it stops execution at that point.
This may seem to contradict what I just said, but look closely at what's happening. When we attempt to initialize the first element of cat.ears, we must reference the cat.ears property, which does not exist.
JS implementations won't assume that you want to create items up the chain eternally, which is likely by design -- if they didn't throw errors and instead just created any properties or objects that needed to exist in order for your program to be syntactically sound, many pieces of software would silently break when they failed to include required libraries. If you forgot to include JQuery, it'd just create a $, a JQuery variable, and all of the properties of those objects you reference in your code. It'd be a proper mess to debug.
In short, that's what you're -- probably accidentally -- assuming will work here. params.params is analogous to cat.ears in the above example. You may have some reason for doing this, but assuming you don't, your code should function if you simply change params.params[a] to params[a].
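A minimal sketch of the corrected loop (assuming $scope.linkParams from the question):
var params = {};
jQuery.each($scope.linkParams, function(a, b) {
  params[a] = b; // `params` itself exists, so assigning a property on it works
});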
JQuery.extend()
Assuming that extraParams is a valid array/object, the code you have written will work once params no longer breaks your code. Do note, however, that this will modify your extraParams object. If you want a new object to contain both params and extraParams, write something more like:
var args = $.extend({}, params, extraParams)
That will modify an empty object and add in the contents of params and extraParams. See the JQuery documentation for more information.
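For illustration, a small sketch of the difference (plain illustrative data):
var params = { a: 1 };
var extraParams = { b: 2 };
jQuery.extend(extraParams, params);                // mutates extraParams: { b: 2, a: 1 }
var args = jQuery.extend({}, params, extraParams); // new object; both inputs left untouched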
After some manipulation, I was able to achieve the required results.
I am posting the code for further reference:
var d = {};
jQuery.each($scope.linkParams, function(a,b) {
a = "params[" + a + "]";
d[a] = b;
});
jQuery.extend(extraParams, d);
I have a complex variable: an object that includes the same object twice.
If I change a value in the first part of the object, the value in the second part changes too. Is there an explanation? Why are the two keys still connected?!
here is a simple example of my code:
arr1={'a':[],'b':{'b1':'','b2':''}};
arr2={'p1':{...arr1},'p2':{...arr1}};
arr2['p1']['a']=[1,2,3];
console.log(arr2['p2']['a']); // works => []
arr2['p1']['b']['b1']='blabla';
console.log(arr2['p2']['b']['b1']); // doesn't work => 'blabla'
I don't want to write B={'b1':{'a':''},'b2':{'a':''}} because A is a very big object in a separate .js file:
A={'a':''};
B={'b1':A,'b2':A};
B['b1']['a']='blabla';
The main reason is that you try to change the object's element via its key a instead of changing the object (reference). Step by step:
you ask to find B['b1'], which is the A object
and then you ask to set A['a'] to blabla
B['b1'] and B['b2'] both point to A
you see "both" changed since it's the same instance of A
But if you do B['b1'] = { 'a': 'blabla' }, you will not get the same result, because now you are assigning a new instance with no connection to A.
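To illustrate that difference:
var A = { 'a': '' };
var B = { 'b1': A, 'b2': A };
B['b1'] = { 'a': 'blabla' }; // replaces the reference; A itself is untouched
console.log(B['b2']['a']);   // '' -- still the original A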
Using the spread operator will save you from a problem like this.
The fundamental idea of the spread operator is to create a new plain object using the own properties of an existing object. So {...obj} creates a new object with the same properties and values as obj.
You can do that easily, like so:
A={'a':''};
B={'b1':{...A},'b2':{...A}};
B['b1']['a']='blabla';
console.log(B['b2']['a']); // => '' not 'blabla'
I solved the problem with:
arr2={'p1':JSON.parse(JSON.stringify(arr1)),'p2':JSON.parse(JSON.stringify(arr1))};
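For reference, the JSON round-trip rebuilds every level of the object, so nothing is shared between p1 and p2:
arr1 = { 'a': [], 'b': { 'b1': '', 'b2': '' } };
arr2 = { 'p1': JSON.parse(JSON.stringify(arr1)), 'p2': JSON.parse(JSON.stringify(arr1)) };
arr2['p1']['b']['b1'] = 'blabla';
console.log(arr2['p2']['b']['b1']); // '' -- p2 is a fully independent copy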
I have 2 JS objects, each is rendered as Tree in a webpage.
My issue is how to propagate a change to one of them when the user applies a change to the other.
My basic idea is to "bind onChange" on each object, obviously paying attention not to generate infinite loops.
In jQuery it seems rather difficult; I read something about "proxy", but I don't understand whether it could help me here.
Lastly I thought of vue.js. I read that vue.js is very efficient at syncing JS and DOM objects, so a change between them is almost easy; maybe it is also possible to sync two JS objects?
To be clearer, here more details:
I have something like this:
let obj1={key1:1, key2:[1,2,3]}; // defines arbitrary data obj
let obj2={};
$.extend(obj2,obj1); // defines obj2 as clone of obj1
// do "something magic" here
I would like to get the following:
obj1.key1=2; // => should automatically set obj2.key1=2; under the hood
obj2.key2.push(4); // => should automatically set obj1.key2=[1,2,3,4] under the hood
Is there any trick to bind two (identical, cloned) data objects so that any change made on one of them is reflected to the other one, as if the involved object keys "pointed" to the same data? Since objects are assigned "by reference" in javascript, this is doable if we define a third object "obj_value" and we assign it as value to the above objects as follows:
obj1.key=obj_value; // both obj1.key and obj2.key point to the same object
obj2.key=obj_value;
But I'd like something more general, directly binding one obj key to the other, in pseudo-code:
obj1.on('change',function(key,value)
{
obj2.key=value;
})
watch: {
objChangedByUser: function (value) {
this.cloneOfobjChangedByUser = Object.assign({}, value);
}
}
Or:
computed: {
cloneOfObjChangedByUser: function () {
return Object.assign({}, this.objChangedByUser);
}
}
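For context, a minimal Vue 2 style sketch of the watcher variant (names are illustrative; note that reacting to nested changes such as key2.push(4) requires a deep watcher):
new Vue({
  el: '#app',
  data: {
    objChangedByUser: { key1: 1, key2: [1, 2, 3] },
    cloneOfobjChangedByUser: {}
  },
  watch: {
    objChangedByUser: {
      handler: function (value) {
        // re-clone whenever the source object changes
        this.cloneOfobjChangedByUser = Object.assign({}, value);
      },
      deep: true
    }
  }
});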
I have a ko.observable with an object that contains 3 arrays like this:
self.filter({ file: [], site: [], statut: [] })
When I try to empty them it doesn't work. I tried
array = []
to empty them. Is it a problem with the observable?
You don't need all of your observable object's arrays to be observable to be able to update the UI, although I'd certainly (like the other answerer) advise you to do so.
I would like to explain, however, why it doesn't work.
Say you have the following code:
var originalObject = {
myArray: [1, 2, 3]
};
var myObservable = ko.observable(originalObject);
// Resetting the array behind knockout's back:
originalObject.myArray = [1, 2, 3, 4];
The last line changes a property of the object that was used to set the observable. There's no way for knockout to know you've updated your object. If you want knockout to reconsider the observable's value, you have to tell it something's changed:
myObservable.valueHasMutated();
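For example, a minimal sketch of that pattern (using a subscription instead of a UI binding for brevity):
var originalObject = { myArray: [1, 2, 3] };
var myObservable = ko.observable(originalObject);
myObservable.subscribe(function (newValue) {
  console.log("Knockout noticed the change:", newValue.myArray.length);
});
originalObject.myArray = [];    // knockout cannot see this change...
myObservable.valueHasMutated(); // ...until you tell it the value has mutated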
Now, normally, you update an observable by passing a new or updated variable to it like so:
myObservable(newValue);
Strangely, setting the observable with the same object again also works:
myObservable(originalObject);
This is why:
Internally, knockout compares the newValue to the value it currently holds. If the values are the same, it doesn't do anything. If they're different, it sets the new value and performs the necessary UI updates.
Now, if you're working with just a boolean or number, you'll notice knockout has no problems figuring out if the new value is actually different:
var simpleObservable = ko.observable(true);
simpleObservable.subscribe(function(newValue) {
console.log("Observable changed to: " + newValue);
});
simpleObservable(true); // Doesn't log
simpleObservable(false); // Does log
<script src="https://cdnjs.cloudflare.com/ajax/libs/knockout/3.2.0/knockout-min.js"></script>
For objects however, it behaves differently:
var myObject = { a: 1 };
var simpleObservable = ko.observable(myObject);
simpleObservable.subscribe(function(newValue) {
console.log("Observable changed to: " + JSON.stringify(newValue, null, 2));
});
simpleObservable(myObject); // Does log, although nothing changed
simpleObservable({b: 2 }); // Does log
<script src="https://cdnjs.cloudflare.com/ajax/libs/knockout/3.2.0/knockout-min.js"></script>
The subscription is triggered, even though we've used the exact same object to reset our observable! If you dig through knockout's source code, you'll see why. It uses this method to check if the new value is different:
var primitiveTypes = { 'undefined':1, 'boolean':1, 'number':1, 'string':1 };
function valuesArePrimitiveAndEqual(a, b) {
var oldValueIsPrimitive = (a === null) || (typeof(a) in primitiveTypes);
return oldValueIsPrimitive ? (a === b) : false;
}
Simply put: if the old value isn't a primitive value, it will just assume things have changed. This means we can update our originalObject, as long as we reset the observable.
originalObject.myArray.length = 0;
myObservable(originalObject);
Or, just as easy:
myObservable(Object.assign(originalObject, { myArray: [] }));
A bit of a long answer, but I believe it's nice to know why stuff doesn't work, instead of only circumventing it. Even if simply using observableArrays and letting knockout optimize its work is the better solution!
You mention "emptying" an array. Note that that's different from "assigning a new, empty array to a variable". At any rate, if you want to "empty" an array:
For observableArray check the relevant docs, because they have a removeAll() utility method.
For emptying a plain javascript array, check this duplicate question that has various solutions, one of which is simply array.length = 0.
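A quick sketch of that difference:
var original = [1, 2, 3];
var other = original;   // a second reference to the same array
original = [];          // only rebinds the variable; the array itself is untouched
console.log(other);     // [1, 2, 3]
other.length = 0;       // actually empties the array in place
console.log(other);     // []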
As a final note, if you're inside the view model, you might need to do self.filter() first to get the object inside the observable. So, for example:
self.filter().file.length = 0; // plain array method
However, since file, site, and statut are plain arrays (and not observableArrays) there will be no automatic updates in your UI. If they were observable arrays, you'd do:
self.filter().file.removeAll(); // assuming `file` has been made observable
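For completeness, a minimal sketch of that setup (view model shape assumed from the question):
function ViewModel() {
  var self = this;
  // wrap each inner array in an observableArray so emptying it updates the UI
  self.filter = ko.observable({
    file: ko.observableArray([]),
    site: ko.observableArray([]),
    statut: ko.observableArray([])
  });
  self.clearFiles = function () {
    self.filter().file.removeAll();
  };
}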
I'm calling a JavaScript function that wants an array of things to display. It displays a count, and displays the items one by one. Everything works when I pass it a normal JavaScript array.
But I have too many items to hold in memory at once. What I'd like to do, is pass it an object with the same interface as an array, and have my method(s) be called when the function tries to access the data. And in fact, if I pass the following:
var featureArray = {length: count, 0: func(0)};
then the count is displayed, and the first item is correctly displayed. But I don't want to assign all the entries, or I'll run out of memory. And the function currently crashes when the user tries to display the second item. I want to know when item 1 is accessed, and return func(1) for item 1, and func(2) for item 2, etc. (i.e., delaying the creation of the item until it is requested).
Is this possible in JavaScript?
If I understand correctly, this would help:
var object = {length: count, data: function (whatever) {
// create your item
}};
Then, instead of doing array[1], array[2], et cetera, you'd do object.data(1), object.data(2), and so on.
Since there seems to be a constraint that the data must be accessed via normal array indexing (arr[index]) and that can't be changed, the answer is that NO, you can't override array indexing in Javascript to change how it works and make some sort of virtual array that only fetches data on demand. It was proposed for ECMAScript 4 and rejected as a feature.
See these two other posts for other discussion/confirmation:
How would you overload the [] operator in Javascript
In javascript, can I override the brackets to access characters in a string?
The usual way to solve this problem would be to switch to using a method such as .get(n) to request the data and then the implementor of .get() can virtualize however much they want.
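A hypothetical sketch of such a virtual collection (makeVirtualList is an illustrative name; count and func are taken from the question):
function makeVirtualList(count, func) {
  return {
    length: count,
    get: function (n) {
      return func(n); // compute or fetch the item only when requested
    }
  };
}
var features = makeVirtualList(count, func);
features.get(1); // produced on demand instead of being held in memory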
P.S. Others indicate that you could use a Proxy object for this in Firefox (not supported in other browsers as far as I know), but I'm not personally familiar with Proxy objects, as their use seems rather limited to code that only targets Firefox right now.
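For what it's worth, a hypothetical sketch of that idea using the later standardized Proxy API (not the Firefox-only API referred to above; count and func are taken from the question):
var featureArray = new Proxy({ length: count }, {
  get: function (target, prop) {
    if (typeof prop === 'string' && /^\d+$/.test(prop)) {
      return func(Number(prop)); // compute the item lazily on index access
    }
    return target[prop]; // length and anything else
  }
});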
Yes, generating items on the go is possible. You will want to have a look at Lazy.js, a library for producing lazily computed/loaded sequences.
However, you will need to change your function that accepts this sequence, it will need to be consumed differently than a plain array.
If you really need to fake an array interface, you'd use Proxies. Unfortunately, it is only a Harmony draft and currently only supported in Firefox's JavaScript 1.8.5.
Assuming that the array is only accessed in an iteration, i.e. starting with index 0, you might be able to do some crazy things with getters:
var featureArray = (function(func) {
var arr = {length: 0};
function makeGetter(i) {
arr.length = i+1;
Object.defineProperty(arr, i, {
get: function() {
var val = func(i);
Object.defineProperty(arr, i, {value:val});
makeGetter(i+1);
return val;
},
configurable: true,
enumerable: true
});
}
makeGetter(0);
return arr;
}(func));
However, I'd recommend avoiding that and rather changing the library that is expecting the array. This solution is very error-prone if anything else is done with the "array" other than accessing its indices in order.
Thank you to everyone who has commented and answered my original question - it seems that this is not (currently) supported by JavaScript.
I was able to get around this limitation and still do what I wanted. It uses an aspect of the program that I did not mention in my original question (I was trying to simplify the question), so it is understandable that others couldn't recommend this. That is, it doesn't technically answer my original question, but I'm sharing it in case others find it useful.
It turns out that one member of the object in each array element is a callback function. That is (using the terminology from my original question), func(n) returns an object, which contains a function in one member, which is called by the method being passed the data. Since this callback function knows the index it is associated with (at least, when being created by func(n)), it can add the next item to the array (or at least ensure that it is already there) when it is called. A more complicated solution might go a few items ahead, and/or behind, and/or clean up items not near the current index to free memory. This all assumes that the items will be accessed consecutively (which is the case in my program).
E.g.,
1) Create a variable that will stay in scope (e.g., a global variable).
2) Call the function with an object like I gave as an example in my original question:
var featureArray = {length: count, 0: func(0)};
3) func() can be something like:
function func(r) {
return {
f : function() {featureArray[r + 1] = func(r + 1); DoOtherStuff(r); }
}
}
Assuming that f() is the member with the function that will be called by the external function.
I am working on an app that heavily uses JavaScript. I am attempting to include some object-oriented practices. In this attempt, I have created a basic class like such:
function Item() { this.init(); }
Item.prototype = {
init: function () {
this.data = {
id: 0,
name: "",
description: ""
}
},
save: function() {
alert("Saving...");
$.ajax({
url: getUrl(),
type: "POST",
data: JSON.stringify(this.data),
contentType: "application/json"
});
}
}
I am creating Item instances in my app and then saving them to local storage like such:
var item = new Item();
window.localStorage.setItem("itemKey", JSON.stringify(item));
On another page, or at another time, I am retrieving that item from local storage like such:
var item = window.localStorage.getItem("itemKey");
item = JSON.parse(item);
item.save();
Unfortunately, the "save" function does not seem to get reached. In the console window, there is an error that says:
save_Click
(anonymous function)
onclick
I have a hunch that the "(anonymous function)" is the console window's way of saying "calling item.save(), but item is an anonymous type, so I am trying to access an anonymous function". My problem is, I'm not sure how to convert "var item" into an Item class instance again. Can someone please show me?
Short answer:
Functions cannot be serialized into JSON.
Explanation:
JSON is a cross-platform serialization scheme based on a subset of JS literal syntax. This being the case, it can only store certain things. Per http://www.json.org/ :
Objects: An object is an unordered set of name/value pairs. An object begins with { (left brace) and ends with } (right brace). Each name is followed by : (colon) and the name/value pairs are separated by , (comma).
Arrays: An array is an ordered collection of values. An array begins with [ (left bracket) and ends with ] (right bracket). Values are separated by , (comma).
values: A value can be a string in double quotes, or a number, or true or false or null, or an object or an array. These structures can be nested.
Functions cannot be serialized into JSON because another non-JS platform would not be able to unserialize and use it. Consider the example in reverse. Say I had a PHP object at my server which contained properties and methods. If I serialized that object with PHP's json_encode() and methods were included in the output, how would my JavaScript ever be able to parse and understand PHP code in the methods, let alone use those methods?
What you are seeing in your resulting JSON is the toString() value of the function on the platform you're using. The JSON serializer calls toString() on anything being serialized that isn't proper for JSON.
I believe your solution is to stop storing instances in JSON/local storage. Rather, save pertinent data for an instance which you set back to a new instance when you need.
I know this question is answered already, however I stumbled upon this by accident and wanted to share a solution to this problem, if anyone is interested.
instead of doing this:
var item = window.localStorage.getItem("itemKey");
item = JSON.parse(item);
item.save();
do something like this:
// get serialized JSON
var itemData = window.localStorage.getItem("itemKey");
//instantiate new Item object
var item = new Item();
// extend item with data
$.extend(item, JSON.parse(itemData));
// this should now work
item.save();
This will work so long as the function you want to call (i.e., save()) is defined on the prototype and not as an instance method (often the case, and indeed the case in the OP's original question).
The $.extend method is a jQuery utility method, but it is trivial to roll your own.
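A plain-JS version of the same idea (a minimal sketch, assuming the Item constructor from the question):
// get serialized JSON
var itemData = window.localStorage.getItem("itemKey");
// instantiate a real Item and copy the stored data onto it
var item = Object.assign(new Item(), JSON.parse(itemData));
// save() is found on Item.prototype, so this works again
item.save();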
You can't do that. How could JavaScript possibly know that item has a save function? JSON doesn't allow functions as data; just read the JSON spec, you can't save functions.
What you need to do is create serialize and deserialize methods on the object you want to store. These will specify what to export and how to "wake up" an object after parsing the corresponding JSON string.
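A minimal sketch of that suggestion (method names are illustrative, assuming the Item class from the question):
Item.prototype.serialize = function () {
  return JSON.stringify(this.data); // export only the plain data
};
Item.deserialize = function (json) {
  var item = new Item();            // "wake up" a real instance
  item.data = JSON.parse(json);
  return item;
};
// usage
window.localStorage.setItem("itemKey", item.serialize());
var restored = Item.deserialize(window.localStorage.getItem("itemKey"));
restored.save();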
You can only store plain Objects in DOM storage (cookies, URL params, ... everything that needs [de]serialisation through JSON.stringify/JSON.parse). So what you did when sending the ajax data
ajaxsend(this.data);
also applies to string serialisation. You can only store the data, not the instance attributes (like prototype, constructor etc.). So use
savestring(JSON.stringify(item.data));
which is possible because item.data is such a plain Object. And when restoring it, you will only get that plain data Object back. In your case it's easy to reconstruct an Item instance from plain data, because your Items hold their values (only) in a publicly available property:
var item = new Item;
item.data = JSON.parse(getjsonstring());
Disclaimer
Not a full-time J.S. developer; the answer may have some minor bugs:
Long Boring Explanation
As mentioned by #JAAulde, your object cannot be serialized into JSON, because has functions, the technique that you are using doesn't allow it.
Many people forget or ignore that the objects used in an application may not be exactly the same as those saved to / restored from storage.
Short & quick Answer
Since you already encapsulate the data members of your object into a single field,
you may want to try something like this:
// create J.S. object from prototype
var item = new Item();
// assign values as you app. logic requires
item.data.name = "John Doe";
item.data.description = "Cool developer, office ladies, love him";
// encoded item into a JSON style string, not stored yet
var encodedItem = JSON.stringify(item.data)
// store string as a JSON string
window.localStorage.setItem("itemKey", encodedItem);
// do several stuff
// recover item from storage as JSON encoded string
var encodedItem = window.localStorage.getItem("itemKey");
// transform into J.S. object
item.data = JSON.parse(encodedItem);
// do other stuff
Cheers.