Saving a model in local storage - javascript

I'm using Jerome's localStorage adapter with Backbone and it works great for collections.
But now I have a single model that I need to save, so in my model I set:
localStorage: new Store("msg")
I then do my saves and fetches. My problem is that every time I refresh and initialize my app, a new representation of my model is added to localStorage, as shown below.
What am I doing wrong?
window.localStorage.msg = {
    // Created after first run
    "1de5770c-1431-3b15-539b-695cedf3a415": {
        "title": "First run",
        "id": "1de5770c-1431-3b15-539b-695cedf3a415"
    },
    // Created after second run
    "26c1fdb7-5803-a61f-ca12-2701dba9a09e": {
        "0": {
            "title": "First run",
            "id": "1de5770c-1431-3b15-539b-695cedf3a415"
        },
        "title": "Second run",
        "id": "26c1fdb7-5803-a61f-ca12-2701dba9a09e"
    }
}

I ran into the same issue. Maybe you have something similar to this:
var Settings = Backbone.Model.extend({
    localStorage: new Store("Settings"),
    defaults: { a: 1 }
});
var s = new Settings;
s.fetch();
I changed it to:
var s = new Settings({ id: 1 });
The localStorage adapter checks for an id like this:
case "read": resp = model.id ? store.find(model) : store.findAll(); break;
so an id of 0 or "" won't work, and it will return all models in one.
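For example, a minimal sketch of the fixed-id approach (the store name and attribute are taken from the example above):
var Settings = Backbone.Model.extend({
    localStorage: new Store("Settings"),
    defaults: { a: 1 }
});

// A fixed id makes fetch() map to find() and save() map to update(),
// so only one record ever exists in the "Settings" store.
var s = new Settings({ id: 1 });
s.fetch();       // loads the previously saved record, if any
s.set({ a: 2 });
s.save();        // updates record 1 instead of creating a new one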

I'm new to backbone.js too, but it looks like the persistence model is analogous to database tables. That is to say, it's designed to create/delete/read records from a table. The localStorage adapter does the same, so what you are doing there is creating a Msg "table"
in localStorage and creating a new Msg "record" each time, and the adapter gives each new Msg a unique id.
If you just have one object, it's probably easier to use localStorage directly. The API is really straightforward:
localStorage.setItem("key", "value");
Keep in mind that localStorage only deals with key/value pairs as strings, so you'd need to convert to/from string format.
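For instance, a small sketch of storing a single object this way (the msg object here is just illustrative):
var msg = { title: "First run" };

// Serialize before writing, parse after reading: localStorage only stores strings.
localStorage.setItem("msg", JSON.stringify(msg));
var restored = JSON.parse(localStorage.getItem("msg") || "null");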
Take a look at this question for more on doing that:
Storing Objects in HTML5 localStorage

Querying a parse table and eagerly fetching Relations for matching

Currently, I have a table named Appointments; on Appointments, I have a Relation of Clients.
In searching the Parse documentation, I haven't found much help on how to eagerly fetch the child collection of Clients when retrieving the Appointments. I have attempted a standard query, which looked like this:
var Appointment = Parse.Object.extend("Appointment");
var query = new Parse.Query(Appointment);
query.equalTo("User", Parse.User.current());
query.include('Rate'); // a pointer object
query.find().then(function(appointments){
    let appointmentItems = [];
    for(var i = 0; i < appointments.length; i++){
        var appt = appointments[i];
        var clientRelation = appt.relation('Client');
        clientRelation.query().find().then(function(clients){
            appointmentItems.push(
                {
                    objectId: appt.id,
                    startDate: appt.get("Start"),
                    endDate: appt.get("End"),
                    clients: clients, // should be a Parse object collection
                    rate: appt.get("Rate"),
                    type: appt.get("Type"),
                    notes: appt.get("Notes"),
                    scheduledDate: appt.get("ScheduledDate"),
                    confirmed: appt.get("Confirmed"),
                    parseAppointment: appt
                }
            ); // add to appointmentItems
        }); // query.find
    }
});
This does not return a correct Clients collection.
I then switched over to attempt this in Cloud Code. Assuming the issue was on my side for whatever reason, I thought I'd create a function that did the same thing, only on their server, to reduce the number of network calls.
Here is how that function was defined:
Parse.Cloud.define("GetAllAppointmentsWithClients", function(request, response){
    var Appointment = Parse.Object.extend("Appointment");
    var query = new Parse.Query(Appointment);
    query.equalTo("User", request.user);
    query.include('Rate');
    query.find().then(function(appointments){
        // for each appointment, get all client items
        var apptItems = appointments.map(function(appointment){
            var ClientRelation = appointment.get("Clients");
            console.log(ClientRelation);
            return {
                objectId: appointment.id,
                startDate: appointment.get("Start"),
                endDate: appointment.get("End"),
                clients: ClientRelation.query().find(),
                rate: appointment.get("Rate"),
                type: appointment.get("Type"),
                notes: appointment.get("Notes"),
                scheduledDate: appointment.get("ScheduledDate"),
                confirmed: appointment.get("Confirmed"),
                parseAppointment: appointment
            };
        });
        console.log('apptItems Count is ' + apptItems.length);
        response.success(apptItems);
    })
});
and the resulting "Clients" returned looks nothing like the actual object class:
clients: {_rejected: false, _rejectedCallbacks: [], _resolved: false, _resolvedCallbacks: []}
When I browse the data, I see the related objects just fine. The fact that Parse cannot eagerly fetch relational queries within the same call seems a bit odd coming from other data providers, but at this point I'd take the overhead of additional calls if the data was retrieved properly.
Any help would be beneficial, thank you.
Well, in your Cloud Code example, ClientRelation.query().find() will return a Parse.Promise. So the output clients: {_rejected: false, _rejectedCallbacks: [], _resolved: false, _resolvedCallbacks: []} makes sense; that's what a promise looks like in the console. ClientRelation.query().find() is an async call, so your response.success(apptItems) is going to happen before it's done anyway.
Your first example looks good as far as I can see, though. What do you see as your clients response if you just output it like the following? Are you sure you're getting an array of Parse.Objects? Are you getting an empty []? (Meaning, do the objects with client relations you're querying actually have clients added?)
clientRelation.query().find().then(function(clients){
    console.log(clients); // Check what you're actually getting here.
});
Also, one more helpful thing: are you going to have more than 100 clients in any given appointment object? Parse.Relation is really meant for very large related collections of other objects. If you know that your appointments aren't going to have more than 100 (rule of thumb) related objects, a much easier way of doing this is to store your client objects in an Array within your Appointment objects.
With a Parse.Relation, you can't get around having to make that second query to get the related collection (client or cloud). But with an Array datatype you could do the following:
var query = new Parse.Query(Appointment);
query.equalTo("User", request.user);
query.include('Rate');
query.include('Clients'); // Assumes the Clients column is now an Array of Client Parse.Objects
query.find().then(function(appointments){
    // You'll find Client Parse.Objects already nested and provided for you in the appointments.
    console.log(appointments[0].get('Clients'));
});
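For reference, a sketch of how clients could be stored on such an Array column in the first place (client1 and client2 are hypothetical Client Parse.Objects):
var appointment = new Appointment();
appointment.set("Clients", [client1, client2]); // saved as an array of pointers, so include('Clients') can expand them
appointment.save();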
I ended up solving this using "Promises in Series"
The final code looked something like this:
var Appointment = Parse.Object.extend("Appointment");
var query = new Parse.Query(Appointment);
query.equalTo("User", Parse.User.current());
query.include('Rate');
var appointmentItems = [];
query.find().then(function(appointments){
    var promise = Parse.Promise.as();
    _.each(appointments, function(appointment){
        promise = promise.then(function(){
            var clientRelation = appointment.relation('Clients');
            return clientRelation.query().find().then(function(clients){
                appointmentItems.push(
                    {
                        //...object details
                    }
                );
            });
        });
    });
    return promise;
}).then(function(result){
    // return/use appointmentItems with the sub-collection of clients that were fetched within the subquery.
});
You can apparently do this in parallel, but that was really not needed for me, as the query I'm using seems to return instantaneously. I got rid of the Cloud Code, as it didn't seem to provide any performance boost. I will say that the fact that you cannot debug Cloud Code seems truly limiting, and I wasted a bit of time waiting for console.log statements to show up in the log of the Cloud Code panel. Overall, the Parse.Promise object was the key to getting this to work properly.
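For what it's worth, the parallel variant is a fairly small change. A sketch, assuming Parse.Promise.when is used to wait for all of the relation queries at once:
var appointmentItems = [];
query.find().then(function(appointments){
    var promises = appointments.map(function(appointment){
        return appointment.relation('Clients').query().find().then(function(clients){
            appointmentItems.push({
                clients: clients
                //...other object details as in the series version
            });
        });
    });
    // Resolves once every relation query has finished.
    return Parse.Promise.when(promises);
}).then(function(){
    // appointmentItems now holds one entry per appointment.
});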

Ember Data not creating an id for new instances

I created a prototype of an Ember app using the Local Storage adapter.
I am now trying to convert the app to use the Ember Data REST adapter with a back-end store.
In the local storage version of the app, Ember generates an id for a new record prior to saving it (and also even if the record is never saved).
For example, in my local storage app, I can log the id in both places:
var gecko = this.store.createRecord('gecko', {
    date: new Date(),
    type: "gecko"
});
console.log(gecko.id, "gecko.id before save");
gecko.save();
console.log(gecko.id, "gecko.id");
By contrast, in the version of the app I'm making with the REST adapter for the back-end store, the id is not logged.
When I check the data Ember is sending to the server, the id is not included (probably because an id was never generated).
Here is the JSON that Ember is sending to my server:
gecko: { type: "alloc", date: "2015-05-30T13:28:27.539Z" }
I am assuming that I am supposed to save the id that Ember generates on my server (which would of course allow it to retrieve the record by id, provided my server implements that).
Question: why is there no id being generated?
This is the code:
App = Ember.Application.create();

App.Router.map(function() {
    this.route("gecko", { path: "/" });
});

App.ApplicationAdapter = DS.RESTAdapter.extend({
    // haven't actually created any code for this part yet
});

App.ApplicationStore = DS.Store.extend({
    adapter: App.ApplicationAdapter.create()
});

App.Gecko = DS.Model.extend({
    type: DS.attr('string'),
    date: DS.attr('date')
});

App.GeckoRoute = Ember.Route.extend({
    model: function() {
        // currently does nothing. Originally I tried `return this.store.find('gecko')`,
        // but since there are no records yet on the backend it returns null, which leads
        // to an error because Array cannot map over it
    }
});

App.GeckoController = Ember.Controller.extend({
    actions: {
        createGeckoButtonClicked: function(){
            var gecko = this.store.createRecord('gecko', {
                date: new Date(),
                type: "gecko"
            });
            console.log(gecko.id, "gecko.id before save"); // null
            gecko.save();
            console.log(gecko.id, "gecko.id"); // null
        }
    }
});
Note: I'm not sure if it's relevant, but I feel like I'm in a chicken-and-egg situation with the route, because I can't return any entries before I have created them.
So for now I'm trying to set up the Ember app to be able to POST an entry to the server; then I will implement the route to retrieve it using return this.store.find('gecko').
When you use the RESTAdapter and save a model, ember-data expects a valid payload in the response that includes a unique id generated by your backend.
var gecko = this.store.createRecord('gecko', {
    date: new Date(),
    type: "gecko"
});

/*
Here ember-data expects a payload like this:
gecko: { id: 1, date: "", type: "gecko" }
The id is generated by your backend.
*/
gecko.save().then(function(gecko){
    console.log(gecko.get('id'));
});
Ember Data doesn't create ids; there would be nothing to stop it from generating non-unique ones. It isn't the source of truth when it comes to gecko records, your database is, so id generation belongs to the db. This is where POST vs PUT comes into play: I want to POST a new gecko record to /api/geckos, or I want to PUT gecko record 123 into its place at /api/geckos/123.
If there are no entries in the database you should still be returning a valid response:
{
    geckos: []
}
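With that empty-but-valid payload in place, the route's model hook from the question can simply return the find call. A sketch:
App.GeckoRoute = Ember.Route.extend({
    model: function() {
        // Resolves to an empty RecordArray when the backend returns { geckos: [] }.
        return this.store.find('gecko');
    }
});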
And two other quick things. First, you should be using getters/setters for property fetching and setting:
var gecko = this.store.createRecord('gecko', {
    date: new Date(),
    type: "gecko"
});
console.log(gecko.get('id'), "gecko.id before save");
var promise = gecko.save();
Second, save is an asynchronous process that returns a promise, which you can wait on to avoid race conditions:
promise.then(function(geckoRecord){
    // geckoRecord and gecko are the same here, but it's good to know
    // that the promise resolves with the record
    console.log(gecko.get('id'), "gecko.id after save");
    console.log(geckoRecord.get('id'), "gecko.id after save");
});

Backbone collection fetch imported incorrectly

I have a collection which is fetched from a REST endpoint, where it receives JSON.
So to be completely clear:
var Products = Backbone.Collection.extend({
    model: Product,
    url: 'restendpoint',
    customFilter: function(f){
        var results = this.where(f);
        return new TestCollection(results);
    }
});
var products = new Products();
products.fetch();
If I log this, then I have the data. However, the initial length of the object is 0, even though it has 6 models. I think this difference has something to do with what is wrong, without me knowing what is actually wrong.
Now, if I try to filter this:
products.customFilter({title: "MyTitle"});
That returns 0, even though I know there is one of that specific title.
Now the funky part. If I take the ENTIRE JSON and copy it, as in literally copy/paste it into the code like this:
var TestCollection = Backbone.Collection.extend({
    customFilter: function(f){
        var results = this.where(f);
        return new TestCollection(results);
    }
});
var testCollectionInstance = new TestCollection(COPY PASTED HUGE JSON DATA);
testCollectionInstance.customFilter({title: "MyTitle"});
Now that returns the 1 model which I was expecting. The difference when I log the two collections can be seen below. Is there some funky behaviour in the .fetch() I am unaware of?
Edit 2: It may also be worth noting that using .fetch() I have no problems actually using the models in a view. It's only the filtering part that is funky.
Edit 3: Added the view. It may very well be that I just don't get the flow yet. Basically I had it all working when I only had to fetch() the data and send it to the view; however, the fetch was hardcoded into the render function, i.e. this.fetch({success: send to template}), which may be wrong.
What I want to do is be able to filter the collection, send ANY collection to the render method, and then render the template with that collection.
var ProductList = Backbone.View.extend({
    el: '#page',
    render: function(){
        var that = this; /* save the reference to this for use in anonymous functions */
        var template = _.template($('#product-list-template').html());
        that.$el.html(template({ products: products.models }));
        // before, the fetch() call was here and then I rendered the template; however, I needed
        // to get it out so I can update my collection and re-render with a new one (so it's not
        // hard-coded to fetch, so to speak)
    },
    events: {
        'keyup #search': 'search'
    },
    search: function(ev){
        var letters = $("#search").val();
    }
});
Edit: New image added to clarify the problem.
It's a bit tricky, you need to understand how the console works.
Logging objects or arrays is not like logging primitive values like strings or numbers.
When you log an object to the console, you are logging the reference to that object in memory.
In the first log that object has no models but once the models are fetched the object gets updated (not what you have previously logged!) and now that same object has 6 models. It's the same object but the console prints the current value/properties.
To answer your question: IO is asynchronous. You need to wait for those objects to be fetched from the server. That's what events are for. fetch triggers a sync event; the model emits sync when the fetch is completed.
So:
var Products = Backbone.Collection.extend({
    model: Product,
    url: 'restendpoint',
    customFilter: function(f){
        var results = this.where(f);
        return new TestCollection(results);
    }
});
var products = new Products();
products.fetch();
console.log(products.length); // 0
products.on('sync', function(){
    console.log(products.length); // 6 or whatever
    products.customFilter({title: 'MyTitle'});
});
It seems like the response to your ajax request hasn't been received yet by the time you run customFilter. You should be able to use the following to ensure that the request has finished:
var that = this;
this.fetch({
    success: function () {
        newCollection = that.customFilter({ title: 'foo' });
    }
});
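To tie this back to Edit 3, one way to let render accept any collection is sketched below (it reuses the products collection and customFilter from above; the keyup-driven title filter is just for illustration):
var ProductList = Backbone.View.extend({
    el: '#page',
    events: { 'keyup #search': 'search' },
    // Accept whichever collection you want to draw (full or filtered).
    render: function(collection){
        var template = _.template($('#product-list-template').html());
        this.$el.html(template({ products: collection.models }));
        return this;
    },
    search: function(){
        var letters = $("#search").val();
        this.render(products.customFilter({ title: letters }));
    }
});

var list = new ProductList();
products.fetch().done(function(){
    list.render(products); // first render once the data has actually arrived
});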

Right way to fetch and retrieve data in Backbone.js

I'm trying to understand how and where to use data after a fetch with Backbone.js, but I'm a little confused.
I'll explain the situation.
I have an app that, on startup, gets some data from a server: three different kinds of data.
Let's suppose Airplanes, Bikes, Cars.
To do that, I've given the three collections (Airplanes, Cars, Bikes) the URLs where to get this data.
I've overridden the parse method so I can modify the string that I get, order it, put it in an object and store it in localStorage. I need it to be persistent because I need to use those 3 data structures.
So with a fetch I get all that data and put it inside localStorage. Is it correct doing it that way?
Now I need to make other calls to the server, like "get the nearest car".
In the view I need to see the color, name and model of the car; all that information is inside the object "Cars" in localStorage.
In my view "showcars.view" I just call a non-Backbone js file (not a collection, model or view) where I get all the information I need. In this js I do:
var carmodel = new Car(); // Car is the Backbone model of the cars
carmodel.url = '/get/nearest/car'; // that gives the id of the nearest car
carmodel.fetch({
    success: function () {
        // here I search the Cars object for a car with the same id
        // and get name, color, model and put them in sessionStorage
    }
});
So after that call, in the view I can get the data I need from sessionStorage.
Is that a bad way of doing things? If so, how should I fetch and analyze that information? Should I do all the calls and operations inside the models?
Thanks
This would be the way you might implement what you want:
var Car = Backbone.Model.extend();

var Cars = Backbone.Collection.extend({
    model: Car,
    url: '.../cars'
});

var NearestCar = Backbone.Model.extend({
    url: '...nearest/car'
});

var cars = new Cars();
var nearestCar = new NearestCar();

cars.fetch({
    success: function() {
        nearestCar.fetch({
            success: function(model) {
                var oneYouWant = cars.get(model.get('id'));
                // do something with your car
                // e.g.:
                // var carView = new CarView({model: oneYouWant});
                // $('body').append(carView.render().el);
            }
        });
    }
});
In general, Backbone keeps everything in memory (that is, the browser's memory), so there is no need to save everything to localStorage, as long as your Collection objects are somehow reachable from the scope you are sitting in (to keep things simple, let's say this is the global window scope).
So in your case I will have something like three collections:
window.Cars
window.Airplanes
window.Bikes
Now you want the nearest car. Assuming you are in a Backbone View and are responding to an event, in your place I would do something like this (only the meaningful code is shown):
var GeneralView = Backbone.View.extend({
    events: { "click .getNearestCar": "_getNearestCar" },
    _getNearestCar: function () {
        $.getJSON('/get/nearest/car', function (data) {
            // suppose data.id is the id of the nearest car
            var nearestCar = window.Cars.get(data.id);
            // do what you please with nearestCar...
        });
    }
});
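For completeness, a sketch of how those in-memory collections could be set up and filled once at startup (the '/cars', '/airplanes' and '/bikes' URLs are assumptions):
var Cars = Backbone.Collection.extend({ url: '/cars' });
var Airplanes = Backbone.Collection.extend({ url: '/airplanes' });
var Bikes = Backbone.Collection.extend({ url: '/bikes' });

window.Cars = new Cars();
window.Airplanes = new Airplanes();
window.Bikes = new Bikes();

// One fetch per collection at boot; views read from these collections afterwards.
window.Cars.fetch();
window.Airplanes.fetch();
window.Bikes.fetch();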

ExtJS 4 Set Reader

I have a JSON file that is used by other parts of the application several times.
To avoid unneeded calls, I want to fetch it once and then only use it
where it's needed.
The issue is that the JSON contains different sections for different parts,
which is why I need to use the root property.
What I need:
- A proxy that will fetch it once (one for all)
- A reader for each part, because they use different roots
- A store for each part
Proxy:
var myProxy = new Ext.data.proxy.Ajax({
    url: "static/data/myData.json"
});
var operation = new Ext.data.Operation({
    action: "read"
});
myProxy.read(operation);
Some Part:
// try to create a custom reader with the appropriate root
var reader = new Ext.data.reader.Json({
    root: "table1"
});
// set reader on proxy
myProxy.setReader(reader);
// create store
Ext.create("Ext.data.Store", {
    storeId: "MyStore",
    model: "MyModel",
    autoLoad: true
});
// set proxy on store
Ext.data.StoreManager.lookup("MyStore").setProxy(myProxy);
Of course, this doesn't work. How should I do this?
You'd be better off using an AJAX fetch to get the JSON, caching it in some variable and then using your store's loadData method to fill each store as needed. loadData lets you manually add records without going to the remote data source. That'll give you tighter control without having to deal with the proxies; just the readers and stores.
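A sketch along those lines (myData.json and table1 come from the question; the second store and the table2 section are assumptions):
var cachedData = null;

Ext.Ajax.request({
    url: "static/data/myData.json",
    success: function(response){
        cachedData = Ext.decode(response.responseText);

        // Fill each store from its own section of the cached JSON.
        Ext.data.StoreManager.lookup("MyStore").loadData(cachedData.table1);
        // Ext.data.StoreManager.lookup("OtherStore").loadData(cachedData.table2);
    }
});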
