I'm loading two sets of data separately but I'd like them to be related. Allow me to explain.
Firstly, I'm not using Ember-data but am instead using a simple $.ajax wrapper as outlined in this post from one of the Discourse team.
I have a concept of channels and programmes.
Channels JSON:
[{
"ID":94,
"Name":"BBC1"
},
{
"ID":105,
"Name":"BBC2"
}]
I have to gather the IDs from this JSON to be able to then request the programmes for those channels. So a response from the programmes endpoint will look a bit like this:
Programmes JSON:
{
"Channels": [
{
"Listings": [
{
"Id": "wcy2g",
"Genres": "Education",
"S": "2013-04-26T10:45",
"E": "2013-04-26T11:15",
"T": "Crime Scene Rescue"
}
]
},
{
"Listings": [
{
"Id": "wcwbs",
"Genres": "Current affairs,News",
"S": "2013-04-26T11:00",
"E": "2013-04-26T12:00",
"PID": "nyg",
"T": "Daily Politics"
}
]
}
]
}
Each Listings array can contain any number of programmes, and the objects in the Channels array appear in the order in which the channels were requested (by the IDs from channels.json), so in this case Channels[0] is BBC1 and Channels[1] is BBC2.
What I'd like is to request each of these two data sets in a single JSON request but then somehow relate them, so that a channel controller has some number of programme models. I also need to render the channels and their programmes in two different templates.
I was thinking I could iterate through channels.json and use each item's index to look up the relevant items in programmes.json, and create the relationship that way.
I'm not too sure how to use Ember to achieve this, though.
Thanks
I did something very similar to this and got it working in Ember. I'll sketch out what I did, using your objects. Note that I'm fairly new to Ember, so a grain of salt may be necessary.
First, you'll want to have model objects for "Channels", "Channel" and "Programme" (I'll refer to them as ChannelsData, ChannelData and ProgrammeData below). This will eventually let you have Controllers and Routes for each of those things, matching up nicely with Ember's naming conventions. The ChannelsData will have many ChannelData objects in it, and each ChannelData will have many ProgrammeData objects. How do you get these populated?
In your ChannelsRoute class you can have a model() function which returns the model data for that route. Your model function can call create() on ChannelsData to create an instance, and then call a loadAll function on ChannelsData. ChannelsData implements loadAll() using your preferred flavor of ajax. The dead-simple easiest thing to do is to have that function do both of your ajax calls and build the entire tree of data.
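Concretely, that might look something like the sketch below. The /channels.json and /programmes.json URLs, the ids query parameter, and the ChannelData properties are assumptions standing in for your real API; Ember.$ is just the jQuery that Ember ships with.
App.ChannelsRoute = Ember.Route.extend({
  model: function() {
    var channels = App.ChannelsData.create();
    channels.loadAll();   // kick off the ajax; bound templates update as data arrives
    return channels;
  }
});

App.ChannelsData = Ember.Object.extend({
  init: function() {
    this._super();
    this.set('channels', Ember.A());
  },

  // Fetch the channel list first, then the programmes for those channel IDs,
  // and build the whole tree in one go
  loadAll: function() {
    var self = this;
    Ember.$.getJSON('/channels.json').then(function(channelsJson) {
      var ids = channelsJson.map(function(c) { return c.ID; });
      Ember.$.getJSON('/programmes.json', { ids: ids.join(',') }).then(function(programmesJson) {
        channelsJson.forEach(function(raw, i) {
          // Channels[i] in the programmes payload lines up with ids[i]
          self.get('channels').pushObject(App.ChannelData.create({
            ID: raw.ID,
            Name: raw.Name,
            programmes: programmesJson.Channels[i].Listings
          }));
        });
      });
    });
  }
});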
You will then find that you'll run into trouble if your ChannelRoute class tries to call its model(), for instance if you enter a path like #/channels/105 directly into the browser. To work around that, make a simple object store of your own on your App object, something like App.ChannelsStore = {}, and when you create each Channel put a reference to it in your ChannelsStore (by id). Then your ChannelRoute.model function can look up its model from that store. But only if ChannelsRoute.model has completed first!
If the user entered that #/channels/105 route as the very first entry into your app, then your code will go through the ChannelsRoute.model() method, and immediately go through the ChannelRoute.model() method, probably before your ajax has completed. In that case you can have the ChannelRoute.model() method create a temporary Channel (with no programmes, for instance) and put that in the App.ChannelsStore. Your logic for building up the whole tree of data should then be willing to check the ChannelsStore to see if an object with a given id already exists, and if so to update it. You end up with something like:
App.ChannelRoute = Ember.Route.extend({
model: function(params) {
var channel = App.ChannelsStore[params.channel_id];
// create stub version if not found
if (!channel) {
channel = App.ChannelData.create({ID: params.channel_id});
App.ChannelsStore[params.channel_id] = channel;
}
return channel;
}
});
(You may end up building a ProgrammeStore similarly, but that's just more of the same.)
The updating of the temporary object actually demonstrates a very cool aspect of Ember: your UI may initially be presented with the values from the temporary object, but when your ajax call completes and the Channels and Programmes are all loaded, your UI will update properly. Just make sure you update your objects with the set() method, and that your UI templates are happy to work with partial data.
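For example, the tree-building code in loadAll can check the store for a stub and fill it in (a sketch, reusing the hypothetical names from above):
// Inside the ajax callback that builds the tree of data
channelsJson.forEach(function(raw, i) {
  var channel = App.ChannelsStore[raw.ID];
  if (!channel) {
    channel = App.ChannelData.create({ ID: raw.ID });
    App.ChannelsStore[raw.ID] = channel;
  }
  // set() lets any templates already bound to the stub re-render with the real data
  channel.set('Name', raw.Name);
  channel.set('programmes', programmesJson.Channels[i].Listings);
});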
Related
In the backend, my object relationship is that an Item has_many Options. I'd like to be able to access all the attributes on the item and its child options as a hash in the front end:
items = [
{
id: 1,
item_attribute_name: item_attribute_value,
options: [
{id: 1, option_attribute_name: option_attribute_value},
{id: 2, option_attribute_name: option_attribute_value},
]
},
{
id: 2,
item_attribute_name: item_attribute_value,
options: []
}
]
I'm sending data either via a JSON object in response to an AJAX request or using the handy gon gem. I noticed that if I were JUST interested in sending the parent items, the formatting happens automatically: I can just send back Item.all and, in the front end, get an array of items, with each item being a hash that represents its attributes exactly as I want.
But if I want to send the children, is there a standard way of doing it? I realize I can construct the child attributes myself as below, but I'm wondering if there's a more straightforward, direct way.
How I would make this work by constructing the child attributes:
items = Item.all.map do |i|
  # merge returns a new hash, so collect the results with map
  i.attributes.merge("options" => i.options.map(&:attributes))
end
A totally acceptable answer, by the way, is that there's no... automagical way of doing this without doing what I'm doing now, which is converting each parent object to attributes in the backend and then stitching together the child attributes.
I'm only asking this question, frankly, because it'd be nice to keep the object relationships in the backend for reuse elsewhere, if possible, rather than turning everything into a hash.
I think the only way to get the expected result without some monkey patching is to use a serializer, the :include option, or ActiveModel::Serializers::JSON, which is included by default in your models.
For example, with ActiveModel::Serializers::JSON you can do something like:
items = Item.all
items.map! do |i|
i.serializable_hash(include: { options: {} })
end
This works alongside Rails eager loading, which avoids having to load all the children of an association (has_many, for example) with a separate query per parent. Where you'd like to do the serialization is up to your use case.
Here's an example that uses Backbone with React.
He defines a Model: var _todos = new Backbone.Model();
And then adds two functions to it:
var TodoStore = _.extend(_todos, {
areAllComplete: function() {
return _.every(_todos.keys(), function(id){
return _todos.get(id).complete;
});
},
getAll: function() {
return _todos.toJSON();
}
});
What I don't understand is why areAllComplete is being applied to a Model instead of to a Collection.
Shouldn't this be a function in a Collection that would get all of its models and check their complete attribute?
Similarly, I would expect getAll to belong to a Collection - get all of its models.
This example seems to replace Collection with Model.
Maybe I don't totally understand how models are used.
That example is using Backbone.Model in a fairly weird way, in my opinion.
This is where it's adding new todos to the store:
var id = Date.now();
_todos.set(id, {
id: id,
complete: false,
text: text
});
What it's basically doing is setting every todo-item as an attribute of the Model, using the id as the attribute name. It ends up with _todos.attributes looking something like below
{
"1436600629317": {
"id": 1436600629317,
"complete": false,
"text": "foo"
},
"1436600629706": {
"id": 1436600629706,
"complete": false,
"text": "bar"
}
}
That's the same output you get from _todos.toJSON(). I've no idea why they decided to implement it like that; if they were to try using Backbone.sync they'd end up with a server API that's not exactly RESTful. It seems strange to use Backbone without leveraging any of the things Backbone provides. There's a reference to the change event here but I don't see it being used anywhere. You could easily reimplement that store using any regular JS object.
The only thing that example seems to actually be using from Backbone is Backbone.Events in the dispatcher. You're totally right that using a Collection would make way more sense, because then you could actually make it talk to a REST-based server API. That example seems to use Backbone only for the sake of using Backbone.
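For what it's worth, here's a rough sketch of what the same store could look like built on a Backbone.Collection instead (the /todos URL and the create helper are just assumptions for illustration):
var TodoModel = Backbone.Model.extend({
  defaults: { complete: false, text: '' }
});

var TodoCollection = Backbone.Collection.extend({
  model: TodoModel,
  url: '/todos'   // hypothetical endpoint; with a real REST API you'd get sync for free
});

var _todos = new TodoCollection();

var TodoStore = {
  areAllComplete: function() {
    // every() is one of the underscore methods proxied onto collections
    return _todos.every(function(todo) { return todo.get('complete'); });
  },
  getAll: function() {
    return _todos.toJSON();
  },
  create: function(text) {
    _todos.add({ id: Date.now(), text: text });
  }
};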
Here is what I am trying to understand.
Oftentimes I find myself writing Backbone like this:
var CallModel = Backbone.Model.extend({
});
var CallsCollection = Backbone.Collection.extend({
model: CallModel,
url: 'url/to/external/json'
});
It is a very basic example, but as you can see there is nothing really in the model; all the data is coming into the collection via an external URL call to a JSON file that is built from a database.
So what's the purpose of the model? I am sure I am probably not using Backbone.js to its fullest extent, which is why I am here asking you guys.
First of all, "there is nothing really in the model all the data is coming into the Collection via an external url call" - this is not true.
Let's assume you have the following:
//Model
var CallModel = Backbone.Model.extend({
defaults: {
cost:0,
duration:0
}
});
(without custom attributes or methods, there is no point in extending the original Backbone.Model)
//Collection
var CallsCollection = Backbone.Collection.extend({
model: CallModel,
url: 'url/to/external/json'
});
And the JSON data returned from the service is probably something like:
//Response
{
  callSummary: {
    missed: 2,
    received: 3,
    totalCalls: 5,
    totalDuration: 20
  },
  calls: [{
    id: 001,
    caller: "Mr.A",
    callee: "Mr.B",
    cost: 1,
    duration: 5
  },{
    id: 002,
    caller: "Mr.X",
    callee: "Mrs.Y",
    cost: 1,
    duration: 7
  },{
    id: 003,
    caller: "Mr.A",
    callee: "Mrs.B",
    cost: 1,
    duration: 8
  }],
  //and more additional information from your db
}
Now you populate your collection with data by calling its fetch method on an instance:
var callsCollection = new CallsCollection();
callsCollection.fetch();
Your collection should look something like:
{
models: [{
attributes: {
callSummary: {},
calls: [{},{},{}],
...
},
...
}],
length:1,
url: "url/to/external/json",
...
}
The data will be added to a model's attribute hash. If you don't specify a particular model then, as Bart mentioned in his answer, Backbone will populate the collection with a plain Backbone.Model instance, which is still not very useful: a collection containing a single model that holds the entire response data inside its attributes, as-is.
At this point, you're wondering why you even bothered creating a model, and then a collection.
The problem here is that collections expect an array of data, while models expect an object. In this case, our root data structure is an object (not an array), so our collection tried to parse the returned data directly into a single model.
What we really want is for our collection to populate its models from the "calls" property of the service response. To address this, we simply add a parse method onto our collection:
var CallsCollection = Backbone.Collection.extend({
model: CallModel,
url: 'url/to/external/json',
parse: function(response){
/*save the rest of data to corresponding attributes here*/
return response.calls; // this will be used to populate models array
}
});
Now your collection will be something like the following:
{
models: [{
...
attributes: {
...
id:001,
caller:"Mr.A",
callee:"Mr.B",
cost:1,
duration:5
}
},{
...
attributes: {
...
id:002,
caller:"Mr.X",
callee:"Mrs.Y",
cost:1,
duration:7
}
},{
...
attributes: {
...
id:003,
caller:"Mr.A",
callee:"Mrs.B",
cost:1,
duration:8
}
}],
length:3,
url: "url/to/external/json",
...
}
This is what we want! Now it is very easy to handle the data: you can make effective use of add, remove, find, reset, and a handful of other collection methods.
You can pass this models array into your templating library of choice, probably with two-way bindings: when the view for one of the call models changes, that model will be updated, events will propagate from your models up to the collection, and the particular model will be passed into the handler functions.
You can now call fetch, save, destroy, clear, and a lot of other methods with ease on single units of data (each model), rather than wrestle with the entire response stuffed into a single model. That approach is pretty much useless: you have to iterate through the response data manually, perform CRUD and similar operations on your own, and in most cases re-render the entire collection view, which is very, very bad and totally unmaintainable.
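As a small illustration of the above, here's what working with the parsed collection might look like (standard Backbone collection methods and events, using the CallsCollection from this answer):
var calls = new CallsCollection();

calls.fetch().done(function() {
  // Work with individual models instead of one blob of response data
  var firstCall = calls.at(0);
  var fromMrA = calls.where({ caller: 'Mr.A' });

  // Model events bubble up to the collection, so one handler covers them all
  calls.on('change:duration', function(model) {
    console.log(model.get('caller') + ' call is now ' + model.get('duration') + ' minutes');
  });

  // CRUD on a single unit of data, not the whole payload
  firstCall.set('duration', 6);
  firstCall.save();   // persists just this call back to the server
});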
To conclude: if your data source doesn't return an array of objects, and you don't parse the response into an array of objects from which the models can be populated, then defining a collection is pretty much useless.
Hopefully, now you get the idea.
Very helpful sources of info:
Backbone, The Primer: Models and Collections
Developing Backbone.js Applications
backbonejs.org
You don't need to specify a model. A Backbone collection will default to using Backbone.Model if you don't specify this option. The following would work equally well if you don't need the models of the collection to be of a particular type.
var CallsCollection = Backbone.Collection.extend({
url: 'url/to/external/json'
});
Reference
EDIT
In essence, specifying the model option within a collection is just a way to ensure that objects added to that collection will be instances of a particular model class. If the models being added to your collection don't have any custom behaviour beyond what Backbone.Model already provides, you don't need to create and specify a model, as Backbone collections default to using Backbone.Model, as I have already mentioned. If, however, you want to ensure that models added to a particular collection are of a particular type and share customized behaviour (e.g. validations, defaults, etc.), create your own model class by extending Backbone.Model and specify it in the collection, as in the sketch below. I hope this clears things up for you.
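For example, a minimal sketch of the kind of model class that is worth specifying (the validation rule is just an illustration):
var CallModel = Backbone.Model.extend({
  defaults: {
    cost: 0,
    duration: 0
  },
  validate: function(attrs) {
    if (attrs.duration < 0) {
      return 'duration cannot be negative';
    }
  }
});

var CallsCollection = Backbone.Collection.extend({
  model: CallModel,                 // everything added to this collection becomes a CallModel
  url: 'url/to/external/json'
});

var calls = new CallsCollection();
calls.add({ caller: 'Mr.A', duration: 5 });   // the plain object is converted to a CallModel
calls.at(0).get('cost');                      // => 0, from the shared defaults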
It sounds weird, but this is the way it works.
Every collection in Backbone must represent a model, so basically a collection is a list of models.
Even if your model has no data of its own, you need to indicate one when you create a collection.
This is how Backbone works for collections.
Using Ember, we have a list of shoes which is fetched from a database. These are listed at '/shoes'.
this.resource('shoes', function() {
this.route('new');
this.route('show', {path: ':shoe_id'});
this.route('edit', {path: ':shoe_id/edit'});
});
Only the first 10 shoes in the MongoDB collection are listed in the view, as specified in our web API. When creating a new shoe (using the nested route 'new') and transitioning back to '/shoes', the new shoe is added to the current 'shoes' model.
export default Ember.ObjectController.extend({
actions: {
save: function() {
this.get('model').save();
this.transitionToRoute('shoes');
}
}
});
This results in a list of 11 shoes. In other words, it does not use the route and make a new API call; instead, the shoe is added to the current list of shoes in the model. When refreshing the page, the result is rendered as intended, fetching the first 10 records of the DB collection.
We would like to make the 'transitionToRoute' execute the route and re-fetch the model instead of just adding to the current model. We have seen a few examples of how 'this.refresh()' and 'this.reload()' can be used inside the controller's 'model' scope, but these examples have not worked for us.
Is it possible to make a 'transitionToRoute' refresh the model with new database values using the 'shoes' route?
Based on what you wrote, I'm guessing you're trying to use pagination and only want the first 10 shoes to be listed on your /shoes route?
If so, the "Ember Way" is to always keep all your models in sync and never have to do special work just to get the view to update artificially. In this case, Ember has a local store of shoes where it initially has 10 items. Then you add one more, it gets saved both the database and to the Ember local store and so now Ember thinks (correctly) that you have 11 shoes. Just because Mongo returns 10 shoes doesn't mean your entire data set is 10 shoes.
So, the best way to handle this situation is to have your view display an accurate projection of your underlying model data. In other words, don't tell your view to display "all shoes". Tell it to display a "filtered list of all shoes"!
In practice, I've seen two types of filtering on ArrayController. One is just to return the first n values; for that, use good old JavaScript slice (see the MDN docs). The second is to use the Ember filter function (see the Ember docs).
Ultimately, your controller would look something like this:
Shoes Controller:
export default Ember.ArrayController.extend( PaginatorClientSideMixin, {
  shoesFilteredOption1: function() {
    // 'arrangedContent' is the sorted list of underlying content;
    // assumes your backing model is the DS.RecordArray of shoes.
    // We depend on 'arrangedContent' because every time it changes,
    // we need to recompute this value.
    // slice takes the array and returns the first 10 elements
    return this.get('arrangedContent').slice(0, 10);
  }.property('arrangedContent'),

  shoesFilteredOption2: function() {
    // here we're filtering the array to only return "active" shoes
    return this.get('arrangedContent').filter(function(item) {
      return item.get('isActive');
    });
  }.property('arrangedContent')
});
Then on your Handlebars template read from shoesFilteredOption1 instead of content or model.
The question: The documentation is scarce, and I'm something of a noob -- can anyone confirm the proper (assuming there is one) way to bind Backbone.Views to instances of Backbone.RelationalModel (from backbone-relational.js) for updating/rendering to the dom? I've tried a handful of different approaches, based on the normal Model/View binding in Backbone, with little success.
The backstory (/more info):
I'm learning the ropes with Backbone.js, and have had to pick up a lot over the past week. If I'm missing something obvious (which is highly likely -- including the "right" way to handle my problem below), please call me out.
I'm dealing with a mongodb-backed REST interface (that I don't have full control over -- or I would be re-architecting behavior on the server-side) that takes heavy advantage of nested dictionaries, so I've been reading up on how to best represent that in Backbone (while not breaking the great save() + server sync stuff that Backbone provides).
I've seen two options: backbone-relational and ligament.js.
I've started with backbone-relational.js, and have RelationalModels (backbone-relational's replacement for Backbone's standard Model) created for the various dictionaries in the tree that gets handed back by the REST interface. The relationships between them are defined, and console logging the JSON from each model (in their respective initialize functions) shows that they're all being called/loaded up correctly off the server on a fetch() command at the overall collection level.
So, that's all great.
Problem: I've got views "listening" for updates on each of those models (and bound functions that should render templates to the DOM), and they never "fire" at all (let alone render...). The main view fires on fetch(), no problem, loading the "top level" model and rendering it to the DOM -- but the views that represent the "foreign key" models within that "top level" model never do (even though the data is DEFINITELY getting loaded into each model, as evidenced by the console logging on each model mentioned above).
Any insights would be greatly, greatly appreciated.
In direct response to Raynos reply below (thanks Raynos!):
If I defined a base url for the UpperLevelCollection with the UpperLevelModels existing at (UpperLevelCollection url)/(UpperLevelModel id) on the server, how would I map those LowerLevelCollections to dictionary keys within the one JSON dump for each UpperLevelModel from the server-side? In other words, could using collections within models properly handle a data dump from the server like this (obviously very simplified, but gets at the issue) AND properly save/update/sync it back?
[{
  "some_key": "Some string",
  "labels": ["A","List","Of","Strings"],
  "content": [{
    "id": "12345",
    "another_key": "Some string",
    "list": ["A","list","of","strings"]
  },{
    "id": "67890",
    "another_key": "Some string",
    "list": ["A","list","of","strings"]
  }]
}]
Generally for nested dictionaries I take the following approach
var LowerLevelModel = Backbone.Model.extend({}),
    LowerLevelCollection = Backbone.Collection.extend({
      model: LowerLevelModel
    }),
    UpperLevelModel = Backbone.Model.extend({
      initialize: function() {
        this.nested = new LowerLevelCollection();
      }
    }),
    UpperLevelCollection = Backbone.Collection.extend({
      model: UpperLevelModel
    });
Just nest those collections inside models all the way down.
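To connect that to a payload shaped like the one in your example, one option is to override parse and toJSON on the parent model so the nested collection is fed from the "content" key and written back on save. This is only a sketch: the "content" key comes from your example data, and collection.set assumes Backbone 1.0 or later (use reset on older versions):
var UpperLevelModel = Backbone.Model.extend({
  initialize: function() {
    // parse() may already have created and filled the nested collection
    this.nested = this.nested || new LowerLevelCollection();
  },

  parse: function(response) {
    // Peel the nested dictionaries off into the child collection and keep
    // the rest ("some_key", "labels", ...) as ordinary attributes
    if (response.content) {
      this.nested = this.nested || new LowerLevelCollection();
      this.nested.set(response.content);   // updates in place rather than wiping
      delete response.content;
    }
    return response;
  },

  toJSON: function() {
    // Re-attach the children so save() sends the same shape back to the server
    var json = Backbone.Model.prototype.toJSON.call(this);
    json.content = this.nested.toJSON();
    return json;
  }
});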
The problem might be that as you load new data into your ParentModel, your child collection AFAIK is not actually re-fetched; it's wiped and replaced by a new collection (see Backbone.HasMany.OnChange on line 584 in backbone-relational.js). Thus your own bindings on the child collection are gone.
This is, in my opinion, a weakness of backbone-relational. The behavior should be configurable, with an option to use a slower find-and-update approach instead of wipe-and-replace.
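One plain-Backbone workaround (not a backbone-relational feature) is to bind the child view to the parent model rather than to the nested collection instance, and re-read the relation on every render, so it doesn't matter if the collection was swapped out underneath you. The 'content' key and the render body below are placeholders; if your parent model doesn't fire a change event when the relation is replaced, trigger one yourself from wherever you load the data:
var LowerLevelListView = Backbone.View.extend({
  initialize: function() {
    // Re-render whenever the parent model changes
    this.model.on('change', this.render, this);
  },

  render: function() {
    // Look the collection up from the parent each time, so a replaced
    // instance is picked up automatically
    var nested = this.model.get('content');
    this.$el.empty();
    nested.each(function(lower) {
      this.$el.append(JSON.stringify(lower.toJSON()));   // placeholder rendering
    }, this);
    return this;
  }
});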