In a Vue page, I make an Ajax call to get data when the mounted() event is fired. The code recreates the existing Pager as a new Pager object, passing all the parameters into the constructor to reconstruct it.
If I don't do this, vm.Pager is just a plain Object, lacks some methods I need, and fails the prop type check of the component it gets passed to.
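For context, here is a rough sketch of the kind of setup being described; the Pager shape is an assumption based on the constructor call below, and totalPages() / list-pager are placeholders for the "needed methods" and the component doing the prop check.
// Sketch only: assumed Pager shape and a hypothetical prop type check
class Pager {
    constructor (pageSize, currentRecord, totalCount) {
        this.PageSize = pageSize;
        this.CurrentRecord = currentRecord;
        this.TotalCount = totalCount;
    }
    totalPages () { // placeholder for the methods a plain Object lacks
        return Math.ceil(this.TotalCount / this.PageSize);
    }
}

Vue.component('list-pager', {
    props: {
        pager: { type: Pager, required: true } // a plain Object fails this instanceof check
    }
});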
axios.post("/Home/GetList", vm.Pager)
.then(function (result)
{
var p = result.data.Pager;
vm.Pager = new Pager(p.PageSize, p.CurrentRecord, p.TotalCount);
// etc. (Pager has additional fields...)
vm.ItemList = result.data.ListItems;
})
.catch(function (error)
{
alert(error);
});
In Knockout.js there was a mapping plugin, and you could tell it what types to map to without having to recreate the object. This was convenient, particularly for more complicated or nested Ajax data.
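For reference, with the Knockout mapping plugin the idea looked roughly like this (the Pager mapping below is purely illustrative, not code from my project):
// Mapping options tell ko.mapping which constructor to use for a property;
// everything else is mapped automatically.
var mapping = {
    'Pager': {
        create: function (options) {
            var p = options.data;
            return new Pager(p.PageSize, p.CurrentRecord, p.TotalCount);
        }
    }
};
var viewModel = ko.mapping.fromJS(serverData, mapping);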
Is there a better way to do this in Vue (or plain JavaScript) that maps the type from the Ajax response without having to recreate the object?
You could make your own mapper function.
methods: {
  mapTypesToData (responseData, map) {
    // iterate the map: key = property on the vm, value = response field name or a factory
    Object.keys(map).forEach((key) => {
      let mapperVal = map[key]
      if (typeof mapperVal === 'string') {
        // a string names the response field to copy onto this[key]
        this.$set(this, key, responseData[mapperVal])
      } else if (typeof mapperVal === 'function') {
        // a function builds the typed value from the same-named response field
        this.$set(this, key, mapperVal(responseData[key]))
      }
    })
  }
}
Then in your Ajax request:
axios.post("/Home/GetList", vm.Pager)
.then(function (result)
{
this.mapTypesToData(result.data, {
ItemList: 'ListItems',
Pager: (p) => new Pager(p.PageSize, p.CurrentRecord, p.TotalCount)
})
})
I found I can use either:
Object.assign(vm.Pager, result.data.Pager);
Or
// import _ from 'lodash'
_.merge(vm.Pager, result.data.Pager);
...and both seem to update the viewmodel correctly. Unfortunately, when I use either of these methods, the watch method in the nested ListPager control (which receives the Pager object) does not fire.
To get that to work, I found the solution below after reading the VueJS documentation on Reactivity in Depth - Change Detection Caveats.
vm.Pager = Object.assign(new Pager(), vm.Pager, result.data.Pager);
This actually creates a new Pager but seems to populate it correctly and retains the Vue reactivity. Hopefully this will scale out to more elaborate solutions, if needed.
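For context, the watcher in the nested ListPager control is along these lines (a sketch; recalculate() is a placeholder name). It only fires when the pager prop receives a new object reference, which is why the in-place Object.assign / _.merge variants above stay silent. A watcher declared with deep: true would be another way to catch in-place mutations, at the cost of traversing the object.
// Inside the ListPager component definition (sketch)
watch: {
    pager: function (newVal, oldVal) {
        this.recalculate(); // placeholder for whatever the control recomputes
    }
}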
I'm using the OData V4 model in UI5. I've created a binding with some expands in it, and now I'm trying to obtain the context of a child entity.
Here is the code showing how I bind the entity to an element. As a result I get a 'SomeEntity' object with an array of 'SomeOtherEntity' as a property.
oPage.bindElement({
path: "/SomeEntity(id)",
parameters: {
$expand: {
SomeOtherEntity: {
$select: ['ID', 'name', 'sequence'],
$orderby: 'sequence'
}
}
}
});
Now I can get the context of the binding with oPage.getBindingContext() and can execute methods like requestObject, setProperty, create and delete from this object.
What I want is to obtain the context of one of the 'SomeOtherEntity' entries, to (for example) delete one of them.
I have no idea how to accomplish this. Does anybody have an idea?
You can create your own ListBinding to SomeOtherEntity and filter for the desired set.
(I'm not quite sure, but it might be necessary to trigger a refresh on the ListBinding to force an initial load.)
After the data is loaded (dataReceived event), delete all of the contexts.
Each delete returns a Promise, and you can proceed with Promise.all.
// Requires sap/ui/model/Filter, sap/ui/model/FilterOperator and
// sap/ui/model/odata/OperationMode as module dependencies.
var oDataModel = this.getModel();
var aPromises = [];
var oListBinding = oDataModel.bindList("/SomeOtherEntity", undefined, undefined,
    new Filter("ID", FilterOperator.EQ, sIdToDelete), {
        $$operationMode: OperationMode.Server
    });
oListBinding.attachEventOnce("dataReceived", function (oEvent) {
var aContexts = oListBinding.getContexts();
aContexts.forEach(function (oContext) {
aPromises.push(oContext.delete("$auto"));
});
Promise.all(aPromises).then(function () {
/* Cleanup after deletion */
});
});
So, the problem is this: I receive a large collection of prices, and some prices are for specific partners. Thus the collection contains groups of a sort, denoted by partner_id.
I filter this collection (using collection.filter() in the initialize method) to obtain a different "format" of the data for subsequent views.
var BasePrices = new Collections.ProductPrices( // Creating the same collection type
this.model.get('prices').filter(function (m) { // But a bit narrowed
return ~~m.get('partner_id') === 0; // leaving prices without `partner_id`
})
);
Later I pass this newly created collection to the view that manages the list of base prices.
The problem itself is that I subscribe to events on this newly created collection, but the models that remained after .filter() fire their events to the old collection that lies under this.model.get('prices'), while newly added models fire their events correctly (to the BasePrices collection).
I can't understand why this is happening. I presume it is something related to the model's reference to its collection (the model.collection property), but why is it not updated when I create a brand new collection, and how do I solve the issue?
If you're creating the filtered collection only to use it in a view, it is better (and more correct) to use the original collection and let the view render only the items you want. For example (inside the view class):
render: function() {
    this.model.each(function(m) {
        if (~~m.get('partner_id') !== 0)
            return; // skip partner-specific prices; render only base prices
        /* render m here */
    });
}
The rationale is that the view represents the original collection.
(If you need several filtered views of the same collection, you can use a single view class for all of them and pass it a filter function:
initialize: function(filter) {
    this.filter = filter;
},
render: function() {
    this.model.each(function(m) {
        if (!this.filter(m))
            return;
        /* render m here */
    }, this); // pass the view as the iteration context so `this.filter` resolves
}
Then create a view like this: new FilteredView(function(m) {return ~~m.get('partner_id') === 0;})
And the problem was indeed in references and cloning (or rather, the lack of cloning). The point is that we need to clone everything into the new collection. Clone: not copy, not pass by reference, but clone.
var BasePrices = new Collections.ProductPrices(); // Creating the same collection type
_(this.model.get('prices').models)                                // Implicitly `_.chain`ing
    .filter(function (m) { return ~~m.get('partner_id') === 0; }) // leaving prices without `partner_id`
    .map(function (m) { return m.toJSON(); })                     // converting each model to a raw object
    .tap(function (a) { BasePrices.add(a); })                     // adding all models at once
    .value();                                                     // evaluating the chain
More elegant ways of solving this problem are highly appreciated.
UPD: Just to keep the chaining consistent, here is a one-liner for lodash.
var BasePrices = _(this.model.get('prices').models)
.filter(function (m) { return ~~m.get('partner_id') === 0; })
.map(function (m) { return m.toJSON(); })
// creating collection passing all models in constructor
.thru(function (a) { return new Collections.ProductPrices(a); })
.value();
I have a JSON object which I am returning from a servlet to Knockout.js. I want to initialize my view model with this data, so I am writing this code.
success: function (data)
{
var jsondata = data['jsonObj'];
self.PopulateStates = ko.computed(function(){
ko.utils.arrayForEach(jsondata, function(item){
self.States.push(new State(item));
});
});
},
error: function (exception)
{
alert( "fail" );
}
});
My JSON object as a string looks like this:
{data:[{"id":"5345345","name":"dsfsdf","ssc":"","bic":"dgffdgfdg"},{"id":"123456","name":"SBI","ssc":"654321","bic":"vxvxc"}]}
JSFiddle link: demo
What is my mistake? Or do I need to do it with the Knockout mapping plugin?
I use this Knockout extension, declared before use:
ko.observableArray.fn.map = function (data, Constructor) {
    // ko.utils.arrayMap builds a new array from each item in `data`
    var mappedData = ko.utils.arrayMap(data, function (item) {
        return new Constructor(item);
    });
    this(mappedData);
    return this;
};
Then in my $.ajax request I do this:
success: function (data)
{
    // the response's list lives in `data`, not `jsonObj` (see the note below)
    self.PopulateStates = ko.observableArray().map(data.data, State);
},
You had the results in a computed observable which isn't what you need.
Another thing I have noticed is that your jsondata is set using the data that gets returned from the GET. You are asking that data for the field jsonObj; however, looking at your JSON, it seems you don't have this field. I think I am correct in saying that data is the field containing the list of items being returned.
If you have already declared self.PopulateStates in your view model, which I'm guessing you have, you can do this:
var State = function (data) {
var self = this;
self.property = ko.observable().set(data, "property");
}
var viewModel = function () {
var self = this;
self.PopulateStates = ko.observableArray(); // observableArray, so the map extension above applies
function getStates() {
var request = $.ajax();
request.done(function (data, msg) {
if (data) self.PopulateStates.map(data, State);
});
}
}
If you notice, in the State model I have self.property using a custom observable function to set it. All this does is: if there is data to set the property to, set it; otherwise give it a default value. I also have a third parameter that I use when I want it to construct an object for me from the data. This is for when I have, say, a contact with a modifiedBy property, and this modifiedBy is a user object (or just a complex object).
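That setter isn't part of Knockout; a rough sketch of what such a helper might look like (the name, parameters and default-value behaviour here are assumptions):
// Hypothetical extension: set the observable from data[property] if present,
// optionally wrapping complex values (like modifiedBy) in a constructor.
ko.observable.fn.set = function (data, property, Constructor) {
    if (data && data[property] !== undefined) {
        var value = data[property];
        this(Constructor ? new Constructor(value) : value);
    }
    return this;
};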
EDIT
The main thing, which isn't an error but isn't necessary, is the jQuery inclusion. Knockout is built to work independently of jQuery, so wrapping everything in $(document).ready(function () {}) to make sure it runs on DOM ready isn't needed. This means you don't have to include jQuery if the page doesn't need it.
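If you still want to defer binding until the DOM is ready without jQuery, a native listener is enough (a sketch, reusing the viewModel constructor above):
document.addEventListener('DOMContentLoaded', function () {
    ko.applyBindings(new viewModel());
});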
Here is the updated fiddle; this will now work!
Please bear with me, I'm very new to JavaScript. I am pulling my hair out trying to figure out why this won't work. Keep in mind I come from a Java background. I have a function 'getCsvData' and I'm essentially trying to parse a CSV file, dynamically add object properties to the datasource object, and then return it. As you can see, outside the function 'getCsvData' I try to log the results after calling my function, but the result object is empty and no object properties have been added to it.
I have a very strong feeling it has to do with the closure/scope chain resolution that I'm still trying to learn and understand.
The questions are: Why aren't the properties added dynamically to the datasource object? I believe they actually are added in the scope of the anonymous function 'function(data)' passed as the second argument to '$.get', but they are immediately gone once the outer function 'getCsvData' returns. Why, and how can I fix this? Thanks!!
<script src="js/jquery-1.10.2.min.js"></script>
<script src="js/knockout-3.0.0.js"></script>
<script src="js/globalize.min.js"></script>
<script src="js/dx.chartjs.js"></script>
<script src="js/jquery.parse.js"></script>
$(function () {
function getCsvData(fileName, groupBy, year) {
var datasource = { }
$.get(fileName, function(data) {
var alldata = $.parse(data, { header: true });
for (var i = 0; i<alldata.results.rows.length;i++) {
var key = alldata.results.rows[i][groupBy]
if (key in datasource) {
datasource[key] = datasource[key] + 1
} else {
datasource[key] = 0
}
}
});
return datasource;
};
var results = getCsvData("data/data.csv", "Priority", 2012);
console.log(results)
for (key in results) {
console.log(key)
}
});
This is because $.get is asynchronous, so datasource is returned right after initiating the GET rather than after receiving the result (i.e. it is empty because the GET's completion callback has not been called yet). You should instead indicate completion with a callback, or use jQuery.ajax() with async: false to wait for the response before returning from getCsvData. See here.
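A minimal sketch of the callback approach, keeping the question's parsing and counting logic and just handing the result to a callback once the GET completes:
function getCsvData(fileName, groupBy, year, done) {
    var datasource = {};
    $.get(fileName, function (data) {
        var alldata = $.parse(data, { header: true });
        for (var i = 0; i < alldata.results.rows.length; i++) {
            var key = alldata.results.rows[i][groupBy];
            if (key in datasource) {
                datasource[key] = datasource[key] + 1;
            } else {
                datasource[key] = 0;
            }
        }
        done(datasource); // the result is only available here, after the response arrives
    });
}

getCsvData("data/data.csv", "Priority", 2012, function (results) {
    console.log(results);
    for (var key in results) {
        console.log(key);
    }
});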
I have a Backbone collection with a load of models.
Whenever a specific attribute is set on a model and it is saved, a load of calculations fire off and the UI rerenders.
But I want to be able to set attributes on several models at once and only do the saving and rerendering once they are all set. Of course I don't want to make several HTTP requests for one operation, and I definitely don't want to rerender the interface ten times.
I was hoping to find a save method on Backbone.Collection that would work out which models hasChanged(), whack them together as JSON and send them off to the back end. The rerendering could then be triggered by an event on the collection. No such luck.
This seems like a pretty common requirement, so I am wondering why Backbone doesn't implement it. Does this go against a RESTful architecture, to save several things to a single endpoint? If so, so what? There's no way it's practical to make 1000 requests to persist 1000 small items.
So, is the only solution to augment Backbone.Collection with my own save method that iterates over all its models, builds up the JSON for all the ones that have changed, and sends that off to the back end? Or does anyone have a neater solution (or am I just missing something)?
I have ended up augmenting Backbone.Collection with a couple of methods to handle this.
The saveChanged method creates a dummy model to be passed to Backbone.sync. All Backbone's sync method needs from a model is its url property and its toJSON method, so we can easily knock this up.
Internally, a model's toJSON method only returns a copy of its attributes (to be sent to the server), so we can happily use a toJSON method that just returns the array of models. Backbone.sync stringifies this, which gives us just the attribute data.
On success, saveChanged fires events on the collection to be handled once. I have chucked in a bit of code that makes it fire specific events once for each of the attributes that changed in any of the batch's models.
Backbone.Collection.prototype.saveChanged = function () {
var me = this,
changed = me.getChanged(),
dummy = {
url: this.url,
toJSON: function () {
return changed.models;
}
},
options = {
success: function (model, resp, xhr) {
for (var i = 0; i < changed.models.length; i++) {
changed.models[i].changeSilently();
}
for (var attr in changed.attributes) {
me.trigger("batchchange:" + attr);
}
me.trigger("batchsync", changed);
}
};
return Backbone.sync("update", dummy, options);
}
We then just need the getChanged() method on a collection. This returns an object with 2 properties, an array of the changed models and an object flagging which attributes have changed:
Backbone.Collection.prototype.getChanged = function () {
var models = [],
changedAttributes = {};
for (var i = 0; i < this.models.length; i++) {
if (this.models[i].hasChanged()) {
_.extend(changedAttributes, this.models[i].changedAttributes());
models.push(this.models[i]);
}
}
return models.length ? {models: models, attributes: changedAttributes} : null;
}
Although this is a slight abuse of the intended use of Backbone's 'changed model' paradigm, the whole point of batching is that we don't want anything to happen (i.e. any events to fire) when a model is changed.
We therefore have to pass {silent: true} to the model's set() method, so it makes sense to use Backbone's hasChanged() to flag models waiting to be saved. Of course this would be problematic if you were changing models silently for other purposes - collection.saveChanged() would save these too, so it is worth considering setting an alternative flag.
In any case, if we are doing it this way, then when saving we need to make sure Backbone now thinks the models haven't changed (without triggering their change events), so we need to manually manipulate each model as if it hadn't been changed. The saveChanged() method iterates over our changed models and calls this changeSilently() method on each model, which is basically just Backbone's model.change() method without the triggers:
Backbone.Model.prototype.changeSilently = function () {
var options = {},
changing = this._changing;
this._changing = true;
for (var attr in this._silent) this._pending[attr] = true;
this._silent = {};
if (changing) return this;
while (!_.isEmpty(this._pending)) {
this._pending = {};
for (var attr in this.changed) {
if (this._pending[attr] || this._silent[attr]) continue;
delete this.changed[attr];
}
this._previousAttributes = _.clone(this.attributes);
}
this._changing = false;
return this;
}
Usage:
model1.set({key: value}, {silent: true});
model2.set({key: value}, {silent: true});
model3.set({key: value}, {silent: true});
collection.saveChanged();
RE: RESTfulness. It's not quite right to do a PUT to the collection's endpoint to change 'some' of its records. Technically a PUT should replace the entire collection, but until my application actually needs to replace an entire collection, I am happy to take the pragmatic approach.
You can define a new resource to accomplish this kind of behavior; you could call it MyModelBatch.
You need to implement a new resource on your server side that is able to digest an array of models and execute the proper action: CREATE, UPDATE or DESTROY.
You also need to implement a Model on your Backbone client side with one attribute, which is the array of models, and a special url that doesn't make use of the id.
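A rough client-side sketch of that idea (MyModelBatch, its models attribute and the /mymodelbatch URL are assumptions, not an existing API):
// Dedicated batch resource: one attribute holding the array of model data,
// and a fixed URL that ignores any id.
var MyModelBatch = Backbone.Model.extend({
    url: '/mymodelbatch',
    defaults: function () { return { models: [] }; }
});

// Usage: gather the changed models' attributes and save them in one request.
var changed = collection.filter(function (m) { return m.hasChanged(); });
var batch = new MyModelBatch({
    models: changed.map(function (m) { return m.toJSON(); })
});
batch.save();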
As for the re-rendering, I suggest having one View per Model, so there will be as many renders as there are changed Models, but they will be small detail re-renders without duplication.
This is what I came up with:
Backbone.Collection.extend({
saveAll: function(models, key, val, options) {
var attrs, xhr, wait, that = this;
var transport = {
url: this.url,
models: [],
toJSON: function () {
return { models: this.models };
},
trigger: function(){
return that.trigger.apply(that, arguments);
}
};
if(models == null){
models = this.models;
}
// Handle both `"key", value` and `{key: value}` -style arguments.
if (key == null || typeof key === 'object') {
attrs = key;
options = val;
} else {
(attrs = {})[key] = val;
}
options = _.extend({validate: true}, options);
wait = options.wait;
// After a successful server-side save, the client is (optionally)
// updated with the server-side state.
if (options.parse === void 0) options.parse = true;
var triggers = [];
_.each(models, function(model){
var attributes = model.attributes;
// If we're not waiting and attributes exist, save acts as
// `set(attr).save(null, opts)` with validation. Otherwise, check if
// the model will be valid when the attributes, if any, are set.
if (attrs && !wait) {
if (!model.set(attrs, options)) return false;
} else {
if (!model._validate(attrs, options)) return false;
}
// Set temporary attributes if `{wait: true}`.
if (attrs && wait) {
model.attributes = _.extend({}, attributes, attrs);
}
transport.models.push(model.toJSON());
triggers.push(function(resp){
if(resp.errors){
model.trigger('error', model, resp, options);
} else {
// Ensure attributes are restored during synchronous saves.
model.attributes = attributes;
var serverAttrs = options.parse ? model.parse(resp, options) : resp;
if (wait) serverAttrs = _.extend(attrs || {}, serverAttrs);
if (_.isObject(serverAttrs) && !model.set(serverAttrs, options)) {
return false;
}
model.trigger('sync', model, resp, options);
}
});
// Restore attributes.
if (attrs && wait) model.attributes = attributes;
});
var success = options.success;
options.success = function(resp) {
_.each(triggers, function(trigger, i){
trigger.call(options.context, resp[i]);
});
if (success) success.call(options.context, models, resp, options);
};
return this.sync('create', transport, options);
}
});