Angular / Breeze - camelCasing in model binding - javascript

I am just starting to learn Angular, and have been working through John Papa's course on Pluralsight to try and get my head around it. I was having a problem binding some data to a page, and it turned out to be caused by incorrect casing.
I had:
<small>{{s.timeslot.name}}</small> at <small>{{s.room.name}}</small>
when I should have had:
<small>{{s.timeSlot.name}}</small> at <small>{{s.room.name}}</small>
That's fine, but I just can't understand why.
The data is returned from Web API with this method (note that there is no camelCase on timeslots):
[HttpGet]
public object Lookups()
{
    var rooms = _repository.Rooms;
    var tracks = _repository.Tracks;
    var timeslots = _repository.TimeSlots;
    return new { rooms, tracks, timeslots };
}
An Angular service called datacontext.js has the following object to map (at least to my understanding) between Web API names and Breeze entity names:
var entityNames = {
    attendee: 'Person',
    person: 'Person',
    speaker: 'Person',
    session: 'Session',
    room: 'Room',
    track: 'Track',
    timeslot: 'TimeSlot'
};
There are some further functions that involve caching data, and extending the model on the client, but nowhere is there a reference to timeSlot (with camelCase).
What is happening here? Is Angular enforcing a casing convention for the HTML?
EDIT: Thanks to Ward for the answer. For anyone following the same course who has a similar question, breeze.NamingConvention.camelCase.setAsDefault() is called in entityManagerFactory.js.
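For reference, a minimal sketch of where that call typically lives (the module and service names here are assumptions; only the setAsDefault() line comes from the course code):
// entityManagerFactory.js (sketch, not the course's exact file)
(function () {
    'use strict';

    angular.module('app').factory('entityManagerFactory', ['breeze', factory]);

    function factory(breeze) {
        // Translate the server's PascalCase property names to camelCase on the client.
        breeze.NamingConvention.camelCase.setAsDefault();

        var serviceName = 'breeze/Breeze'; // assumed endpoint

        return {
            newManager: function () {
                return new breeze.EntityManager(serviceName);
            }
        };
    }
})();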

The problem is a bit of "sloppiness" in the spelling of "TimeSlot" in various places.
Your binding to "s" is a binding to the Session type. The Session type itself dictates the spelling of any of its navigation properties. Everything else is irrelevant.
The metadata for Session is generated on the server based on the C# "Session" class. There you will find that the navigation property is spelled "TimeSlot".
Your client is using the Breeze NamingConvention.camelCase to translate between the preferred PascalCasing of C# and the preferred camelCasing of JavaScript apps. Therefore, on the Breeze client you should expect the "TimeSlot" navigation property to become "timeSlot".
Entity names are not translated by the NamingConvention (at this time). The C# type, "TimeSlot" is also spelled "TimeSlot" for the JavaScript type.
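A quick sketch of what that means at the binding site (the session variable here is hypothetical; only the property names follow from the generated metadata):
// With NamingConvention.camelCase as the default, the C# navigation
// property "TimeSlot" surfaces on the client entity as "timeSlot":
var session = someQueryResult[0]; // hypothetical Session entity
console.log(session.timeSlot.name); // defined: property names are camelCased
console.log(session.timeslot);      // undefined: no such property
// The entity *type* itself is still registered under its server spelling, 'TimeSlot'.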
Every other spelling everywhere else is irrelevant for this binding.
You may find the "Query result debugging" documentation topic helpful.

Best practice / only possibility: "JSON to JavaScript / TypeScript object by constructor"

I started developing with Angular / TypeScript and created a service for my .NET Core API, and I'd like to know the best way to get a clean and reliable object from my service.
I have a .NET Core REST API returning a JSON result represented by the class definitions below.
Service:
demoservice.getDemo().subscribe((val) => new Demo(val));
Demo-Class is the following code:
export class Demo {
    public id: number;
    public name: string;
    public subDemo: SubDemo;

    constructor(demo: Demo) {
        this.id = demo.id;
        this.name = demo.name;
        this.subDemo = new SubDemo(demo.subDemo);
    }
}

export class SubDemo {
    public demoList: ListDemo[];

    constructor(subDemo: SubDemo) {
        this.demoList = new Array<ListDemo>();
        subDemo.demoList.forEach(dl => {
            this.demoList.push(new ListDemo(dl));
        });
    }
}

export class ListDemo {
    public birthday: string;
    public smoker: boolean;

    constructor(listdemo: ListDemo) {
        this.birthday = listdemo.birthday;
        this.smoker = listdemo.smoker;
    }

    get birthDayFormatted(): Date {
        return new Date(this.birthday);
    }
}
Is this the best way (fully implementing all the constructors) to create an object? Please note that I'd like to keep the "getter" functionality of my ListDemo class.
Is there no better way? I just found Object.clone / Object.assign / Object.create.
But none of these solutions is comprehensive...
I am really interested in your experience.
Since you're using better & best I will answer with my, probably unwanted, opinion. Disclaimer: I'm no guru, this answer is based on opinion, feel free to disregard.
Don't do it. Your server has a set of objects in its domain, probably some kind of problem solving or data storage domain.
Your client has a set of objects in its domain, typically focused on presenting the data to the user and allowing the user to understand and manipulate it.
Both of these domains may have objects that have the same name or are based on the same real-world concept. It can be tempting to feel like they are the same domain with the same objects. They are not. If they were the same, you would not be writing a client and a server; you would be writing two of the same thing. The two should communicate with pure data objects. In TS this means you should only create an interface, not a class, for the objects you receive from the server.
Instead, start over. Create a new domain based on what you want to appear in the API. If you design your API from the top (UI) down to the bottom (access services), you'll likely find that you don't have a one-to-one mapping between objects. On the occasions when you do, you can probably get away with the occasional assign/merge (see below), but I've personally never found reason to do more than what you posted above.
Should you persist, you may come across the option of reassigning the prototype of the JSON literal from the base object to the class the data represents, but that is a contentious topic and a potential performance pitfall. Your best bet is probably to just do a recursive/deep assign like Lodash's merge.
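If you do go that route, a rough sketch of the two options using the question's ListDemo class (Object.create/Object.assign only handle one level; something like Lodash's merge is needed for deeper nesting):
// `raw` stands in for a literal coming back from the API.
var raw = { birthday: '1990-01-01', smoker: false };

// Option 1: copy the data onto a real instance so the getter works.
var viaAssign = Object.assign(Object.create(ListDemo.prototype), raw);

// Option 2: swap the prototype of the literal itself (often discouraged
// for performance reasons, as mentioned above).
var viaProto = Object.setPrototypeOf(Object.assign({}, raw), ListDemo.prototype);

console.log(viaAssign.birthDayFormatted); // Date parsed from the string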
Just use interfaces not classes.
export interface Demo {
    id: number;
    name: string;
    subDemo: SubDemo;
}

export interface SubDemo {
    demoList: ListDemo[];
}

export interface ListDemo {
    birthday: string;
    smoker: boolean;
}
and your API should return the same shape, so you should just be able to write
getDemo(): Observable<Demo> {
    return this.http.get<Demo>('url');
}
in your service and in your component assign it to a property
demo$ = this.service.getDemo();
and then use the observable with the async pipe in your template.
<ng-container *ngIf="demo$ | async as demo">
{{ demo | json }}
</ng-container>
The less you have to manipulate your data the better. I have a Visual Studio plugin that allows you to paste C# classes into TS files and converts them to TypeScript interfaces on the fly.

Private variables in Ember-data DS.Model

I want to store a private variable on each DS.Model. Its purpose is to store a pending callback (in case I want to cancel it).
I have tried this (and it works):
DS.Model.reopen({
    init() {
        let _pending; // my private var
        this._getPending = () => _pending; // get private var
        this._setPending = callback => _pending = callback; // set private var
        this._super(...arguments);
    }
});
I have placed this in an initializer, and it works as I expect it to.
My questions are: Is this good practice? Is it likely to mess anything up? And is there a better way?
Personally, I'm happy with the way it works, but I'm not sure if it's the "Ember" way. This is going to go into an Ember CLI addon, so I would like it to follow best practice as closely as possible. (The _getPending/_setPending methods are only to be used internally within the addon.)
Here are my two cents on this. I would say no, it is not good practice, but it should be okay since they are just Ember Objects. The question here is: what is an Ember Data model used for? The docs say:
"Models are objects that represent the underlying data that your application presents to the user."
By definition this is not what they are designed for, so just because you are able to does not mean that you should use them like this.
A pending callback so it can be canceled? The Ember Data model API has defined state flags that can be used for this purpose (see http://emberjs.com/api/data/classes/DS.Model.html). Flags like isDeleted, isValid and isNew give you all the possible states.
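For example, instead of tracking a pending callback yourself, you can often just read those flags off the record (a rough sketch; `model` is whatever record you are working with):
// State flags exposed by DS.Model (see the API link above):
if (model.get('isNew')) {
    // never been persisted; safe to throw away locally
}
if (!model.get('isValid')) {
    // the last save was rejected; surface errors instead of retrying
}
if (model.get('isDeleted')) {
    // marked for deletion but possibly not yet committed
}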
I would place them in router actions, where they are easily tested with integration tests.
You can check this screencast that explains them:
https://www.emberscreencasts.com/posts/102-ember-data-20-model-states-and-flags
Hope it helps.

Ember.js createRecord possibly not firing

I have an application that saves a user's search criteria in localStorage, where each saved search is represented as an instance of an Ember.js model:
Checklist.SavedSearch = DS.Model.extend({
    id: DS.attr('string'),
    filters: DS.attr('string')
});
When the "save" button is pressed, the controller creates a model instanced and creates a record for it:
Checklist.savedSearchController = Ember.ArrayController.create({
    [..]
    save: function(view) {
        var saved_search = Checklist.SavedSearch.createRecord({
            id: 'abcd',
            filters: '<json>'
        });
        Checklist.local_store.commit();
    }
});
Checklist.local_store is an adapter I created (this is unsurprisingly where the problem probably begins) that has a basic interface that maps createRecord, updateRecord, etc. to a bunch of get/set methods that work with localStorage (loosely based on a github fork of ember-data). The adapter appears to work fine for some basic tests, particularly as findAll has no issues and returns values added manually to localStorage.
Here is the relevant method within Checklist.local_store:
createRecord: function(store, type, model) {
    model.set('id', this.storage.generateId);
    var item = model.toJSON({associations: true});
    this.storage.setById(this.storage_method, type, id, item);
    store.didCreateRecord(model, item);
}
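For context, the storage helpers referenced above (generateId, setById) are thin wrappers around window.localStorage; their exact implementation isn't shown here, so treat this as an approximation:
// Approximate shape of Checklist.local_store.storage (assumed, not the actual code):
setById: function(storageMethod, type, id, item) {
    window[storageMethod].setItem(type + ':' + id, JSON.stringify(item));
},
getById: function(storageMethod, type, id) {
    var raw = window[storageMethod].getItem(type + ':' + id);
    return raw ? JSON.parse(raw) : null;
}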
The problem is that when createRecord is called by the controller, absolutely nothing occurs. Running it through the debugger, and logging to console, seems to show that the method isn't called at all. I imagine this is a misunderstanding on my part as to how Ember.js is supposed to work. I'd appreciate help on why this is happening.
I come from a Ruby and PHP background, and have perhaps foolishly dived straight into a JS framework, so any other comments on code style, structure and anything in general are welcome.
Ember Data doesn't change createRecord on the controller so it shouldn't behave any differently. It's possible that there was something related to this in the past, but it's certainly not the case anymore.

Why is Backbone model sending duplicate attributes to server on save?

I'm writing a practice Backbone app with a Rails backend API, and I'm confused about the behavior of save on Backbone models.
Let's say a Team has many Players, and I want to save a team with numerous players in a single POST.
So in Rails I have:
class Team < ActiveRecord::Base
  has_many :players
  accepts_nested_attributes_for :players
end

class Player < ActiveRecord::Base
  belongs_to :team
end
and for the Backbone client, I have a Player model and a Players collection defined (not shown), and then the containing Team model (note: no Teams collection):
Demo.Models.Team = Backbone.Model.extend({
    urlRoot: '/teams',
    defaults: {
        'team_size': 12
    },
    initialize: function() {
        this.players = new Demo.Collections.Players();
    },
    toJSON: function() {
        var json = _.clone(this.attributes);
        json.players_attributes = this.players.map(function(player) {
            return player.toJSON();
        });
        return json;
    }
});
When I examine my stringified JSON in the browser, everything looks good:
{"team_size":12, "players_attributes":[{"name":"Fred"},{"name":"Jim" },{"name":"Mark"}]}
Checking the server logs, the lone top-level attribute ('team_size') appears twice: once at the top level, and again under a root key.
Started POST "/teams" for 127.0.0.1 at 2012-06-07 13:39:40 -0400
Processing by TeamsController#create as JSON
  Parameters: {
    "team_size"=>12,
    "players_attributes"=>[{"name"=>"Fred"}, {"name"=>"Jim"}, {"name"=>"Mark"}],
    "team"=>{"team_size"=>12}
  }
I have a few questions:
What's the best way to ensure the players_attributes are nested inside the root key, so that I can do a nested save inside TeamsController in the standard Rails manner (i.e. Team.create(params[:team]))? I can accomplish this with some JavaScript hackery inside toJSON, but I'm guessing there's an easier, cleaner way.
Is this standard, desirable behaviour, to send duplicate attributes like this? I guess there's no harm, but it doesn't smell right.
Am I not defining the url / urlRoot correctly or some such?
thanks
1- You have to override the toJSON method in order to include the model name as the root of the JSON element sent to the server.
toJSON: function() {
    return { team: _.clone(this.attributes) };
},
Since you are already messing with and overriding this method, I don't see any reason not to go this way.
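For instance, combined with the nested players from the question, the override would look something like this (untested sketch reusing the question's names):
toJSON: function() {
    var json = _.clone(this.attributes);
    json.players_attributes = this.players.map(function(player) {
        return player.toJSON();
    });
    // Wrap everything under the root key the Rails controller expects:
    return { team: json };
}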
2- This is a very strange behavior you're describing. Try:
class Team < ActiveRecord::Base
  self.include_root_in_json = false
end
It will probably eliminate Rails duplicate params parsing. Another advantage you get from this is that Rails won't include the team as a root element of its generated JSON to the client.
3- Your definition of urlRoot is just fine.
I arrived here while looking for the same issue, so even though it's an old question I think it's worth giving the answer.
I actually found a Rails setting that explains these duplicate attributes: wrap_parameters
http://apidock.com/rails/v3.2.13/ActionController/ParamsWrapper/ClassMethods/wrap_parameters
Just set it to an empty array, and Rails won't try to wrap parameters coming from your JSON requests.
Although you can use the toJSON hack mentioned by others, this is actually not such a good idea. For one, it produces inconsistent results between sync and save with {patch: true} (the inconsistency arises because the sync method calls toJSON if you don't patch, but doesn't call toJSON if you have patch set to true).
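To make that inconsistency concrete (rough illustration, assuming a team model with a root-wrapping toJSON like the one above):
// Without patch, Backbone.sync serializes via toJSON(), so the
// { team: ... } wrapping is applied:
team.save();

// With patch, sync sends only the attributes you pass in and skips
// toJSON(), so the wrapping silently disappears:
team.save({ team_size: 10 }, { patch: true });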
Instead, a better solution is to use a patched version of Backbone that overloads the sync method itself. The backbone-rails gem does this automatically, or you can pull backbone_rails_sync.js into your own app. A more complete answer to this question can be found here: Backbone.js and Rails - How to handle params from Backbone models?

Backbone-relational.js + Backbone.View(s)

The question: The documentation is scarce, and I'm something of a noob -- can anyone confirm the proper (assuming there is one) way to bind Backbone.Views to instances of Backbone.RelationalModel (from backbone-relational.js) for updating/rendering to the DOM? I've tried a handful of different approaches, based on the normal Model/View binding in Backbone, with little success.
The backstory (/more info):
I'm learning the ropes with Backbone.js, and have had to pick up a lot over the past week. If I'm missing something obvious (which is highly likely -- including the "right" way to handle my problem below), please call me out.
I'm dealing with a MongoDB-backed REST interface (that I don't have full control over -- or I would be re-architecting behavior on the server side) that takes heavy advantage of nested dictionaries, so I've been reading up on how best to represent that in Backbone (while not breaking the great save() + server sync stuff that Backbone provides).
I've seen two options: backbone-relational and ligament.js.
I've started with backbone-relational.js, and have RelationalModels (backbone-relational's replacement for Backbone's standard Model) created for the various dictionaries in the tree that gets handed back by the REST interface. The relationships between them are defined, and console logging the JSON from each model (in their respective initialize functions) shows that they're all being called/loaded correctly off the server on a fetch() command at the overall collection level.
So, that's all great.
Problem: I've got views "listening" for updates on each of those models (and bound functions that should render templates on the DOM), and they never "fire" at all (let alone render...). The main view fires on fetch(), no problem, loading the "top level" model and rendering it on the DOM -- but the views that represent the "foreign key" models within that "top level" model never do (even though the data is DEFINITELY getting loaded into each model, as evidenced by the console logging on each model mentioned above).
Any insights would be greatly, greatly appreciated.
In direct response to Raynos' reply below (thanks Raynos!):
If I defined a base url for the UpperLevelCollection with the UpperLevelModels existing at (UpperLevelCollection url)/(UpperLevelModel id) on the server, how would I map those LowerLevelCollections to dictionary keys within the one JSON dump for each UpperLevelModel from the server-side? In other words, could using collections within models properly handle a data dump from the server like this (obviously very simplified, but gets at the issue) AND properly save/update/sync it back?
[{
    "some_key": "Some string",
    "labels": ["A", "List", "Of", "Strings"],
    "content": [{
        "id": "12345",
        "another_key": "Some string",
        "list": ["A", "list", "of", "strings"]
    }, {
        "id": "67890",
        "another_key": "Some string",
        "list": ["A", "list", "of", "strings"]
    }]
}]
Generally, for nested dictionaries I take the following approach:
var LowerLevelModel = Backbone.Model.extend({}),
    LowerLevelCollection = Backbone.Collection.extend({
        model: LowerLevelModel
    }),
    UpperLevelModel = Backbone.Model.extend({
        initialize: function() {
            this.nested = new LowerLevelCollection();
        }
    }),
    UpperLevelCollection = Backbone.Collection.extend({
        model: UpperLevelModel
    });
Just nest those collections inside models all the way down.
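To connect that to the payload in the question, one plain-Backbone way (not backbone-relational specific, and only a sketch) is to route the nested key through parse and re-attach it in toJSON:
var UpperLevelModel = Backbone.Model.extend({
    initialize: function() {
        this.nested = this.nested || new LowerLevelCollection();
    },
    parse: function(response) {
        // parse can run before initialize, hence the guard.
        this.nested = this.nested || new LowerLevelCollection();
        this.nested.reset(response.content, { parse: true });
        // Keep the remaining keys as plain attributes on this model.
        return _.omit(response, 'content');
    },
    toJSON: function() {
        // Re-attach the nested data when syncing back to the server.
        return _.extend(_.clone(this.attributes), {
            content: this.nested.toJSON()
        });
    }
});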
The problem might be that as you load new data into your ParentModel, your child collection AFAIK is not actually updated; it's wiped and replaced by a new collection (see Backbone.HasMany.OnChange on line 584 in backbone-relational.js). Thus your own bindings on the child collection are gone.
This is, in my opinion, a weakness of backbone-relational. This behavior should be configurable, with an option where a slower find-and-update approach is used instead of wipe-and-replace.
