Store and Retrieve WinJS.Binding.List in Application Data - javascript

I'm developing a Windows 8 Store app using HTML/JavaScript and I've run into an issue storing and retrieving a WinJS.Binding.List into Windows.Storage.ApplicationData.current.roamingSettings.
I DID get this to work by hand-rolling my own method of converting the binding list into an XML string and storing that, then parsing it back out into a list on retrieval. But this seems crazy inefficient, and I'm trying to find a better way. I've tried JSON.stringify() and JSON.parse(), which seem to store and retrieve the right data, but as soon as I bind the data to the winControl the application crashes with a 0 (no error message at all).
Here's a bit of my code to demonstrate what I'm attempting (list is a binding list):
function onSaveData() {
    if (list) {
        Windows.Storage.ApplicationData.current.roamingSettings.values["data"] = JSON.stringify(list);
    }
}

function onLoadData() {
    var data = Windows.Storage.ApplicationData.current.roamingSettings.values["data"];
    if (data) {
        list = JSON.parse(data);
        var listview = element.querySelector("#mylistview").winControl;
        listview.itemsSource = new WinJS.Binding.List(list);
    }
}
I know I can get this working the long way, so I'm not looking for any solution... I'm really just hoping there's an easy way to store/retrieve these data objects that I'm missing. If I can find an easier way to do this it will eliminate about 40 lines of code and I can stop using an entire library. Also, as I go forward I plan to have more binding lists that will need to be stored as well. Thanks!

You need to bind to the List's dataSource property, not to the List itself:
listview.itemsSource = new WinJS.Binding.List(list).dataSource;
The dataSource property is specifically the IListDataSource that the ListView requires for a data source. The ListView doesn't understand anything about the WinJS.Binding.List directly, only through that particular interface. (I discuss this in Chapter 7, section "The Structure of Data Sources", in my free ebook, Programming Windows Store Apps with HTML, CSS, and JavaScript, 2nd Edition.)
Your approach of saving and reloading the list with JSON is completely fine.
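For reference, here is a minimal sketch of the round trip, assuming the items in the list are plain objects that survive JSON serialization (list.slice() pulls them out of the List as a plain array):

function onSaveData() {
    if (list) {
        var values = Windows.Storage.ApplicationData.current.roamingSettings.values;
        // slice() returns the List's items as a plain array, which stringifies cleanly
        values["data"] = JSON.stringify(list.slice());
    }
}

function onLoadData() {
    var values = Windows.Storage.ApplicationData.current.roamingSettings.values;
    var data = values["data"];
    if (data) {
        list = new WinJS.Binding.List(JSON.parse(data));
        var listview = element.querySelector("#mylistview").winControl;
        // bind to the IListDataSource, not the List itself
        listview.itemsSource = list.dataSource;
    }
}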

Related

How do I save model data when creating a Backbone-driven theme in WordPress?

OK, so I'm messing around with Backbone for the first time. I think I've pretty much covered all the basics of front-end logic, but I have never really been any good at back-end logic and coding.
I'm working with WordPress and creating a theme using Backbone. My understanding is that as long as I set up a template page that has the correct containers for my Backbone code to render views in, the fact that it's a WordPress theme instead of its own app shouldn't really change anything on the front-end side.
I'm at the stage where I want to save a model so that I can fetch it in my routes to link to my view to render.
I'm unsure about the whole process of saving data. I know I need to give the model's 'urlRoot' attribute a string, but I don't know what that string should be, or what happens after that.
Can someone explain the whole process, especially in terms of how to do it with WordPress? (I did stumble upon the WP REST API plugin, which I think helps, although I don't exactly know how.)
EDIT
OK, so in the end I presume my problem was something to do with authentication when trying to access the database, as the response text was just returning the entire HTML of the current page I was on, probably because the request wasn't getting through to the database and was being redirected back to the page.
After googling around for a while I came across this. Rather than reinventing the wheel, I installed this plugin and followed the setup instructions, and lo and behold it worked pretty much out of the box. If you're trying to build a Backbone theme, I suggest using the WP-API Client JS plugin with the WP REST API plugin. It seems to cover everything.
How to expose a WordPress blog's content through an API?
WP REST API seems like a good way to start. There are a lot of options and it exposes everything you need.
Note that it is named WordPress REST API (Version 2) in the wordpress.org plugin directory.
You can test that the plugin works by navigating to:
http://www.example.com/wp-json/wp/v2/
It should output all the information on the blog as a big JSON dump.
You can also test that it works for other endpoints, like posts:
http://www.example.com/wp-json/wp/v2/posts
There's a Backbone plugin for the WP REST API that works out of the box.
How to communicate with the API?
This is a simple example using Backbone without any plugin. If you want to know how to use the plugin, see the documentation for it.
Since it offers a lot of arguments that can be passed in the URL, I made a small collection and an example of how it could be used.
var API_ROOT = '/wp-json/wp/v2/',
    DEFAULT_API_ARGS = ['context' /* etc. */ ];

var WordPressCollection = Backbone.Collection.extend({
    constructor: function(models, options) {
        options = options || {};
        this.apiArgs = _.union(DEFAULT_API_ARGS, this.apiArgs, options.apiArgs);
        this.args = _.extend({}, this.args, this.getApiArgs(options));
        WordPressCollection.__super__.constructor.apply(this, arguments);
    },
    getApiArgs: function(obj) {
        return _.pick(obj, this.apiArgs);
    },
    fetch: function(options) {
        options = options || {};
        options.data = _.extend({}, this.args, this.getApiArgs(options), options.data);
        return WordPressCollection.__super__.fetch.call(this, options);
    },
});
And to use it:
var CommentCollection = WordPressCollection.extend({
    url: API_ROOT + 'comments',
    // all the arguments to look for in the passed options
    apiArgs: ['page', 'per_page', 'post' /* etc. */ ],
});

var myPostComments = new CommentCollection(null, {
    post: 23 // id
});

console.log(myPostComments.url); // url is a plain string here, not a function

myPostComments.fetch({ page: 2 });
The fetch should make a GET request to:
/wp-json/wp/v2/comments?post=23&page=2
And from that point, the WP REST API plugin takes control. It returns a new JSON encoded array of comment objects in the body of the response.
It should look something like this:
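(A trimmed, illustrative sketch of the shape only; real WP REST API comment objects carry more fields such as date and link.)

[
    {
        "id": 256,
        "post": 23,
        "author_name": "John",
        "content": { "rendered": "<p>My first comment!</p>" }
    },
    {
        "id": 257,
        "post": 23,
        "author_name": "Jane",
        "content": { "rendered": "<p>Another comment.</p>" }
    }
]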
Backbone automatically parses the JSON received, so you don't need to worry about that and you just have to go on and use it:
myPostComments.each(function(comment) {
    console.log(comment.get('author_name'));
});
Then, saving a new comment is a matter of calling:
// check the doc for the comment object details
myPostComments.create({
    post: 23,
    content: "my new comment",
    /* etc. */
});
And this would make a POST request to /wp-json/wp/v2/comments.

Backbone.model.save(): POST(create) / PUT(update) logic doesn't match application logic - how to avoid PUT in certain situations?

I'm creating a web application (front end and back end, so both are under my control) using Backbone and Pyramid, connected via a RESTful API.
During development I have encountered a problem several times now, where Backbone PUTs (= updates) a new model when it actually should POST (= create) it.
Backbone decides whether to POST or PUT a model depending on the presence of an ID field (no ID present in the current model -> POST/create; ID present -> PUT/update).
However I encountered several situations by now, where this behaviour doesn't match my application logic.
Let's say our main model (and its objects being persistently saved in a relational database in the backend) is called Foo, having fields like id, field_1, field_2.
Example #1: Creating a template or preview of Foo: Before creating (=POSTing) an object of Foo, I can create and show a preview to the user and/or save it as a template.
While doing so, the backend (in case of the preview: temporarily) adds the object to the database and returns the full model - including an ID in its HTTP response - back to Backbone.
Template and preview objects of Foo are (temporarily) saved into the same table as final objects (a type column indicates the kind: 0 = final/live, 1 = preview, 2 = template).
When the user then - after previewing or saving as a template - tries to actually CREATE an object of Foo, the Backbone model already has its ID field set, so it PUTs and updates the template (or the no-longer-existing preview) instead of POSTing and thereby creating a new Foo in the database (as intended).
=> solution #1: have POST /json/preview not return the ID field, so Backbone doesn't get confused.
=> solution #2: override parse() of the Foo Backbone model to strip the ID field from the response.
=> kinda works
Example #2: I have a Periodic model, which refers to a Foo template. The intention of a Periodic is to offer the user the possibility of semi-automatically creating a new Foo object based on a Foo template every X months.
Now there is a call GET /json/periodics, which returns all Periodic objects with their nested Foo objects (Foo templates), including their IDs, e.g. [{'interval': 12, 'template_id': 42, 'template': {'id': 42, 'field_1': 'foo', 'field_2': 'bar', ...}}, {...}, ...].
On the frontend the user can now periodically confirm (or skip) creating a new Foo object by issuing periodics[X].template.save(), which however again PUTs and thereby updates the Foo model instead of POSTing and creating a new one (as intended).
Here again (as in example 1), I could strip out the ID field of Foo - either in the backend or the frontend.
However, there are situations where I need the id field of templates, e.g. when actually editing them, so I'd need two calls (GET /json/templates_WITHOUT_FOO-IDs and GET /json/templates_WITH_FOO-IDs), which also sounds far from right.
Question is: What's the right (and consistent) way of avoiding Backbone falsely assuming a model should be PUT instead of POSTed in certain situations / views?
Backbone's save and fetch methods just make calls to the Backbone.sync method, which in turn is just a wrapper for an AJAX call. You can pass in AJAX parameters using the save function without having to actually extend it. It basically ends up being something like this:
model.save({ /* attributes you want to save */ }, { type: 'POST', url: 'apiurl/model/:id/played' });
You would have to do this every time though, so it is probably better practice to extend Backbone.sync for your model.
The Backbone website has a bit of information about what I'm talking about as far as Backbone's sync and save taking AJAX options. There are also a few examples I've seen on extending sync, but I can't seem to track them down at the moment.
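As a minimal sketch of that idea applied to the template/preview case in the question (the urlRoot value and the saveAsNew name are hypothetical, not part of Backbone or of this answer):

var Foo = Backbone.Model.extend({
    urlRoot: '/json/foos', // hypothetical endpoint

    // Force a POST to the collection URL even when the model already
    // carries an id (e.g. a template or preview being turned into a
    // brand-new object), instead of Backbone's default PUT.
    saveAsNew: function(attrs, options) {
        options = _.extend({}, options, {
            type: 'POST',
            url: _.result(this, 'urlRoot')
        });
        return this.save(attrs, options);
    }
});

// usage: creates a new Foo on the server even though the model has an id;
// note that the server's response (including the new id) is set back onto this model
periodics[0].template.saveAsNew();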

How to save multiple objects to an array in a chrome extension?

I'm building my first Chrome extension. I want it to track the TV series I watch, and I'm currently trying to get it to save metadata on the series I am following.
I have a content script that returns the title, the newest episode (and the URL of this episode), as well as the URL of the cover image of the series. I am currently trying to save it with some code in my background script (I have made sure to include "storage" under the permissions section of the manifest file).
So far my script looks like this (This was developed with help from Trying to save and fetch a Javascript object using chrome.storage API?):
var bkg = chrome.extension.getBackgroundPage();
response.aID = new Series(response.aTitle, response.aNewEp, response.aNewEpURL, response.aImage);
chrome.storage.sync.set(response.aID, function() {
    chrome.storage.sync.get(function(val) {
        bkg.console.log("The saved title is: ", val.anTitle);
        bkg.console.log("The saved newEp is: ", val.anNewEp);
        bkg.console.log("The saved newEpURL is: ", val.anNewEpURL);
        bkg.console.log("The saved imageURL is: ", val.anImage);
    });
});
Problem is, the script only seems to store one response.aID at a time, so I can never store data for more than 1 TV series. Every time I try, the script seems to overwrite my previous entry. So I would like to ask whether there's any way to store more than 1 TV series at a time?
I have looked at storing an array and then pushing each new object into that array (Store an array with chrome.storage.local), but I don't quite understand the syntax involved so I'm not sure if this would work for me.
Unfortunately you didn't include the piece of code where you save your data, but I think you don't store your data with indices for the different TV series, so the stored one gets overwritten every time you store another one.
Anyway, I would prefer storing your data as JSON (basically every JavaScript object can be converted to it, but continue reading), because JS provides several functions for this format which make it quite easy to use.
When opening your extension, load the data and call
var data = JSON.parse(yourloadedstring);
so the string (which should look like {"TVShows": [{"title": "How I Met Your Mother", "url": ...}, {...}]} - look here for an explanation of how JSON works) gets "translated" into an object from which you can read simply by calling
data.TVShows[0].title
or
data.TVShows[1].imageURL
You can extend this data element when you add a new show, for example by pushing a new object onto the array:
data.TVShows.push({
    title: "The Big Bang Theory",
    URL: ...,
    imageURL: ...
});
and save this element back to Chrome's storage by converting it to a string first:
var dataToSave = JSON.stringify(data);
You then have a single string containing all the information you need, which you can put into storage and simply parse again later as explained above :)
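Put together with the chrome.storage calls from your snippet, the whole append-and-reload cycle could look roughly like this (the "myShows" key and the field names are just placeholders for illustration; chrome.storage can also store objects directly, but this mirrors the stringify approach above):

// append one show to the stored list and write the whole thing back
function addShow(newShow) {
    chrome.storage.sync.get("myShows", function(result) {
        var data = result.myShows ? JSON.parse(result.myShows) : { TVShows: [] };
        data.TVShows.push(newShow);
        chrome.storage.sync.set({ myShows: JSON.stringify(data) }, function() {
            console.log("Saved", data.TVShows.length, "shows");
        });
    });
}

// later, read everything back
chrome.storage.sync.get("myShows", function(result) {
    var data = result.myShows ? JSON.parse(result.myShows) : { TVShows: [] };
    data.TVShows.forEach(function(show) {
        console.log(show.title, show.url, show.imageURL);
    });
});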
I hope everything is clear; if not, please ask me!
Cheers

Parsing a large JSON array in Javascript

I'm supposed to parse a very large JSON array in JavaScript. It looks like:
mydata = [
{'a':5, 'b':7, ... },
{'a':2, 'b':3, ... },
.
.
.
]
Now the thing is, if I pass this entire object to my parsing function parseJSON(), then of course it works, but it blocks the tab's process for 30-40 seconds (in case of an array with 160000 objects).
During this entire process of requesting this JSON from a server and parsing it, I'm displaying a 'loading' gif to the user. Of course, after I call the parse function, the gif freezes too, leading to bad user experience. I guess there's no way to get around this time, is there a way to somehow (at least) keep the loading gif from freezing?
Something like calling parseJSON() on chunks of my JSON every few milliseconds? I'm unable to implement that though, being a noob in JavaScript.
Thanks a lot, I'd really appreciate if you could help me out here.
You might want to check this link. It's about multithreading.
Basically:
var url = 'http://bigcontentprovider.com/hugejsonfile';

// Build the worker body as a string: it loads the JSONP-style file and
// posts the parsed result back to the main thread.
var f = '(function() {' +
    'send = function(e) {' +
        'postMessage(e);' +
        'self.close();' +
    '};' +
    'importScripts("' + url + '?format=json&callback=send");' +
'})();';

var _blob = new Blob([f], { type: 'text/javascript' });
var _worker = new Worker(window.URL.createObjectURL(_blob));
_worker.onmessage = function(e) {
    // Do what you want with your JSON (it arrives as e.data)
};
Haven't tried it myself to be honest...
EDIT about portability: Sebastien D. posted a comment with a link to MDN. I just added a ref to the compatibility section id.
I have never encountered a complete page lockdown of 30-40 seconds; I'm almost impressed! Restructuring your data to be much smaller, or splitting it into many files on the server side, is the real answer. Do you actually need every little byte of the data?
Alternatively, if you can't change the file, #Cyrill_DD's answer of a worker thread will be able to parse the data for you and send it to your primary JS. This is not a perfect fix, as you would guess. Passing data between the two threads requires the information to be serialised and reinterpreted, so you could find a significant slowdown when the data is passed between the threads and be back to square one again if you try to pass all the data across at once. Building a query system into your worker thread for requesting chunks of the data when you need them, and using the message callback, will prevent slowdown from parsing on the main thread and allow you complete access to the data without loading it all into your main context. A sketch of that idea follows below.
I should add that worker threads are relatively new; main browser support is good but mobile is terrible... just a heads up!
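Here is a sketch of that chunk-request idea, using the same Blob technique as the first answer (the message protocol, URL, and chunk size are made up for illustration, not a definitive implementation):

// Worker source built as a string: the worker downloads and parses the
// file once, then serves slices of the array on demand.
var workerSource =
    'var items = null;' +
    'onmessage = function(e) {' +
    '  if (e.data.cmd === "load") {' +
    '    var xhr = new XMLHttpRequest();' +
    '    xhr.open("GET", e.data.url, false);' + // synchronous XHR is acceptable inside a worker
    '    xhr.send();' +
    '    items = JSON.parse(xhr.responseText);' +
    '    postMessage({ cmd: "loaded", total: items.length });' +
    '  } else if (e.data.cmd === "chunk") {' +
    '    postMessage({ cmd: "chunk", items: items.slice(e.data.start, e.data.end) });' +
    '  }' +
    '};';

var blob = new Blob([workerSource], { type: 'text/javascript' });
var worker = new Worker(window.URL.createObjectURL(blob));

worker.onmessage = function(e) {
    if (e.data.cmd === 'loaded') {
        // ask for the first 1000 items; request more as the UI needs them
        worker.postMessage({ cmd: 'chunk', start: 0, end: 1000 });
    } else if (e.data.cmd === 'chunk') {
        // render e.data.items here without blocking the UI on the full file
    }
};

worker.postMessage({ cmd: 'load', url: '/hugejsonfile.json' });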

Using AngularJS to process custom localStorage data

I wrote a bookmarklet that retrieves information from a page and stores it in JSON format in local storage (converting it to a string first, of course).
I would like a web app I am writing to be able to process this data, on the fly, preferably as it gets saved to the localStorage.
Right now I can change the item in LS via the console and refresh the page and the new data appears, but I would like it to be live and seamless.
Any advice on how to go about this? I found several localStorage modules for AngularJS and I tried them, but they don't seem to let me retrieve data from LS if it is already there.
In response to answer:
$scope.$watch(
    function() {
        return $window.localStorage.getItem('TestData');
    },
    function(newValueInStorage) {
        $scope.testingLS = newValueInStorage;
    }
);
I tried this and I still get the data displayed by just doing a {{ testingLS }} in the view template, but when I go and change the TestData key in local storage via the console it doesn't update instantly. (For now, I am just testing it without the bookmarklet, with just a simple string inside TestData.)
There are a few ways to do it.
One would be to populate the correct model on the scope when saving to localStorage.
The other that I can think of at this moment is to set up a watcher:
$scope.$watch(
    function() {
        // return the raw string currently in localStorage
        return $window.localStorage.getItem('TestData');
    },
    function(newValueInStorage) {
        $scope.modelFromLS = JSON.parse(newValueInStorage);
    }
);
---edit---
As per James' comment, you need something that will handle the fact that the data has changed in a different tab, and the $digest cycle needs to run for the watch to be recalculated:
http://plnkr.co/edit/zlS3wL65meBeA8KkV5KH?p=preview
window.addEventListener('focus', function() {
    console.log('focus');
    $scope.$digest();
});
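As a side note that goes beyond the original answer: the browser also fires a 'storage' event in other tabs/windows whenever localStorage changes, so a sketch of a more targeted trigger than focus could be:

window.addEventListener('storage', function(e) {
    // fires in this tab only when another tab/window changed localStorage
    if (e.key === 'TestData') {
        $scope.$apply(function() {
            $scope.testingLS = e.newValue;
        });
    }
});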
