WinJS: Loading data - javascript

I'm trying to develop my first Windows 8 Store app (HTML/JS). I'm using the Grid App template, which I think suits my needs best.
This is my model:
I have three entities: 1. GalleryCategory 2. Gallery 3. GalleryItem.
A Gallery is linked to exactly one Category. A GalleryItem is linked to exactly one Gallery...so nothing fancy here...
I'm using the out-of-the-box data.js file to load all categories and all galleries on startup of the app. But when I open galleryDetail.html (which is supposed to show all the images of a particular gallery), I only want to load the images of that gallery at that point (to avoid too much loading at the beginning).
And now I'm finally coming to the point that I do not understand: how can I manage this? Here is what I have:
WinJS.UI.Pages.define("/pages/galleryDetail/galleryDetail.html", {
    // This function is called whenever a user navigates to this page. It
    // populates the page elements with the app's data.
    ready: function (element, options) {
        var item = options && options.item ? Data.resolveItemReference(options.item) : Data.items.getAt(0);
        element.querySelector(".titlearea .pagetitle").textContent = item.group.title;
        element.querySelector("article .item-title").textContent = item.title;
        element.querySelector("article .item-subtitle").textContent = item.subtitle;
        element.querySelector("article .item-image").src = item.backgroundImage;
        element.querySelector("article .item-image").alt = item.subtitle;
        element.querySelector("article .item-content").innerHTML = item.content;
        element.querySelector(".content").focus();

        var galleryId = item.key;
        var galleryItemsList = [];
        var dataList;

        WinJS.xhr({ url: "http://someUrlToAnAspNetWebsite/Handlers/GalleryItemsHandler.ashx?galleryId=" + galleryId }).done(
            // Complete function
            function (response) {
                var items = JSON.parse(response.responseText);
                items.forEach(function (item) {
                    galleryItemsList.push(item);
                });
                dataList = new WinJS.Binding.List(galleryItemsList);
                var galleryItemsListView = document.getElementById('galleryItemsListView').winControl;
                galleryItemsListView.itemDataSource = dataList.dataSource;
            },
            // Error function
            function (response) {
                // handle error here...
            },
            // Progress function
            function (response) {
                // progress implementation goes here...
            }
        );
    },
    // ...
});
My problem is obvious: the ready function continues/ends before the data is retrieved, because the async call takes a while.
But I thought using the promise (.done()) would handle that for me (synchronizing the "threads")? Or do I need to use the join() function? If so, where and how? Sorry for my trouble with this...
Thanks for any help...

The ready function itself is an async function, so you only have to return a promise to tell its caller that it's not done until that promise is resolved. You can fix your issue with seven keystrokes: just add return before the WinJS.xhr call.
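In other words, the fix (a minimal sketch reusing the names from the question, not a verified implementation) is simply to return the promise chain so the page is not considered ready until the xhr has completed:

ready: function (element, options) {
    // ... populate the static parts of the page as before ...
    var galleryId = item.key;
    // Returning the promise tells WinJS that this page isn't done
    // until the xhr request (and its done handler) has finished.
    return WinJS.xhr({ url: "http://someUrlToAnAspNetWebsite/Handlers/GalleryItemsHandler.ashx?galleryId=" + galleryId }).done(
        function (response) {
            var items = JSON.parse(response.responseText);
            var dataList = new WinJS.Binding.List(items);
            var galleryItemsListView = document.getElementById('galleryItemsListView').winControl;
            galleryItemsListView.itemDataSource = dataList.dataSource;
        },
        function (response) {
            // handle error here...
        }
    );
},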

Related

Framework7 routes, load page from existing JSON string

I load the JSON data at the start, and also on pull-to-refresh. During these times a small delay is expected. When going between pages it should be snappy, so I am looking to reuse this already-requested JSON for my page content. One of the JSON objects is the entire (small enough) HTML page.
I cannot find a way to use it, and instead am following the examples and making a second JSON GET request before loading each page (article). I would rather load the JSON data once at the start and use it until it is refreshed with pull-to-refresh.
Currently working, but using a second JSON GET:
{
    path: '/article/:article_id/',
    // This works after much turmoil.
    // Sadly it has to do a second JSON call; could have got it with the initial one.
    async: function (routeTo, routeFrom, resolve, reject) {
        // Testing
        // import('window.TodayJsonDB['+ routeTo.params.article_id +'][\'html\']');
        // window.TodayJsonDB[routeTo.params.article_id]['html'];
        // [data['article']['article_html']
        // console.log(routeTo);

        // Get external data and return a Template7 template
        this.app.request.json('/__php/json1.php', { one: 1, article_id: routeTo.params.article_id }, function (data) {
            // console.log(data['article'][0]['article_html']);
            resolve(
                // DOM locked until resolve returned.
                // Object with resolved route content. Must contain one of:
                // url, content, template, templateUrl, component or componentUrl
                {
                    content: data['article'][0]['article_html'],
                }
            );
        });
    }
    // A day of testing but couldn't figure out how to use the existing JSON feed.
    //asyncComponent: () => import('window.TodayJsonDB['+ params.article_id +'][\'html\']'),
    //el: window.TodayJsonDB[params.article_id]['html'],
    //el: import('window.TodayJsonDB['+ params.article_id +'][\'html\']'),
    //template: import('window.TodayJsonDB['+ params.article_id +'][\'html\']'),
    //template: import('window.TodayJsonDB[' + params.article_id +'][html]'),
    //asyncComponent: () => import('window.TodayJsonDB[' + params.article_id +'][html]'),
    //asyncComponent: () => import('window.TodayJsonDB[' + $route.params.article_id +'][html]'),
    //asyncComponent: () => import('window.TodayJsonDB[' + {{article_id}} +'][html]'),
    //asyncComponent: () => import('window.TodayJsonDB[11][\'html\']'),
    //content: window.TodayJsonDB[':article_id']['html'],
},
I already have this JSON; it is loaded when the app opens and updated with the pull-down: window.TodayJsonDB
which contains:
window.TodayJsonDB[data['article'][i]['article_id']] = new Array();
window.TodayJsonDB[data['article'][i]['article_id']]['article_id'] = data['article'][i]['article_id'];
window.TodayJsonDB[data['article'][i]['article_id']]['title'] = [data['article'][i]['article_title']];
window.TodayJsonDB[data['article'][i]['article_id']]['content'] = [data['article'][i]['article_content']];
window.TodayJsonDB[data['article'][i]['article_id']]['html'] = [data['article'][i]['article_html']];
So my question is: how can I use the content of window.TodayJsonDB[article_id]['html'] as the page content, instead of having to make another JSON call when the user clicks a link?
My attempts are commented out in the code above. Any other suggestions on how to approach the whole thing differently are very much welcome.
Thanks as ever.
N.B. I tagged Vue as I believe it is closely related to Framework7; I am not using Vue.
Found a solution; my working code snippet is below. I used the same async section and converted the array output to a string using toString(). It only appears to work in the async section.
I can now load the JSON for everything at the start, with one JSON call.
Maybe it will help someone else with Framework7. Good luck!
{
    path: '/article/:article_id/',
    async: function (routeTo, routeFrom, resolve, reject) {
        // Do we already have the JSON for the page?
        if (typeof window.TodayJsonDB[routeTo.params.article_id]['html'] != "undefined") {
            resolve({
                content: window.TodayJsonDB[routeTo.params.article_id]['html'].toString(),
            });
        } else {
            // Try and get it
            this.app.request.json('/__php/json1.php', { one: 1, article_id: routeTo.params.article_id }, function (data) {
                resolve({
                    content: data['article'][0]['article_html'],
                });
            });
        }
    }
},
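Presumably the toString() works here because the 'html' entry was stored as a one-element array above (window.TodayJsonDB[...]['html'] = [data['article'][i]['article_html']];), so converting it back to a string gives Framework7 the plain HTML string that content expects.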

Meteor: Lazy load, load after rendering. Best practice

I have a Meteor application which is very "slow", as there are a lot of API calls.
What I am trying to do is break apart the loading/calls.
What I have done so far:
I have a loading template via iron-router
I waitOn until the first API call has finished
then I start the next API calls in the Template.myTemplate.rendered function
This was already a big benefit for the speed of my application, but I want to break it up even more, as the second call is in fact more like 5-25 API calls.
So what I am trying to do now, inside the rendered function, is a self-calling function which calls itself until there is nothing more to do, and saves the response inside a Session. (For now it just overwrites the Session, but I can't even get to that point.)
Template.detail.rendered = function () {
    // comma-separated list of numbers for the API call
    var cats = $(this.find(".extra")).attr('data-extra').split(',');
    var shop = $(this.find(".extra")).attr('data-shop');
    var counter = 0;

    var callExtras = function (_counter) {
        var obj = {
            categories: [cats[_counter]],
            shop: shop
        };
        if (_counter <= cats.length) {
            Meteor.subscribe('extra', obj, function (result) {
                // TODO big todo... think this over again and do more research
                // console.log(_counter);
                Session.set('extra', Extra.find('extra').fetch()[0].results);
                counter++;
                callExtras(counter);
            });
        }
    };
    callExtras(counter);

    Session.set('loading_msg', '');
};
Now I again have problems with the reactive parts of my app, described here: Meteor: iron-router => waitOn without subscribe. I can't find a proper way to update my client-side, per-user collection. Also, the docs describe that the publish method effectively creates a new collection (with the new document's ID): http://docs.meteor.com/#/full/publish_added
Here is the publish from the server:
Meteor.publish('extra', function (obj) {
    var that = this;
    Meteor.call('extra', obj, function (error, result) {
        if (result) {
            //console.log(result);
            that.added("extra", "extra", { results: result });
            //that.changed('extra', 'extra', { results: result });
            that.ready();
        } else {
            //that.ready();
        }
    });
});
So my question is: is there a better way to structure my code from scratch, i.e. a way to solve the problem differently? If not, how can I achieve this in the cleanest way? Because to my understanding, the current approach is just a strange way to do it.
EDIT:
For example:
Could I have a per-user collection (maybe client-side only, like now), push data into it from the server, and just subscribe to that collection? But then how would I check when the async API call has finished, so I can start the next round and the view gets its data piece by piece? I am just confused right now.
My mistake was simple, as I thought: you don't need to use subscribe at all.
I just added "error, result" in the callback of Meteor.call;
with only "result" there, the result is always undefined.
var cats = $(this.find(".extra")).attr('data-extra').split(',');
var shop = $(this.find(".extra")).attr('data-shop');
var counter = 0;

var callExtras = function (_counter) {
    var obj = {
        categories: [cats[_counter]],
        shop: shop
    };
    if (_counter < cats.length) { // note: < rather than <= avoids one extra call past the end of cats
        Meteor.call('extra', obj, function (error, result) {
            var actual_session = Session.get('extra');
            if (actual_session === false) {
                actual_session = [];
            }
            actual_session = actual_session.concat(result);
            Session.set('extra', actual_session);
            counter++;
            callExtras(counter);
        });
    }
};
callExtras(counter);
Then in the template helper:
"extra": function () {
    return Session.get('extra');
},

Assemble paginated ajax data in a Bacon FRP stream

I'm learning FRP using Bacon.js, and would like to assemble data from a paginated API in a stream.
The module that uses the data has a consumption API like this:
// UI module, displays unicorns as they arrive
beautifulUnicorns.property.onValue(function (allUnicorns) {
  console.log("Got " + allUnicorns.length + " Unicorns")
  // ... some real display work
})
The module that assembles the data requests sequential pages from an API and pushes onto the stream every time it gets a new data set:
// beautifulUnicorns module
var curPage = 1
var stream = new Bacon.Bus()
var property = stream.toProperty()
// You have to add an empty subscriber, otherwise future onValues will not receive
// the initial value. https://github.com/baconjs/bacon.js/wiki/FAQ#why-isnt-my-property-updated
property.onValue(function () {})
var allUnicorns = [] // !!! stateful list of all unicorns ever received. Is this idiomatic for FRP?
var getNextPage = function () {
  /* get data for subsequent pages.
     Skipping for clarity */
}
var gotNextPage = function (resp) {
  Array.prototype.push.apply(allUnicorns, resp) // just adds the responses to the existing array reference
  stream.push(allUnicorns)
  curPage++
  if (curPage <= pageLimit) { getNextPage() }
}
How do I subscribe to the stream in a way that provides me a full list of all unicorns ever received? Is this flatMap or similar? I don't think I need a new stream out of it, but I don't know. I'm sorry, I'm new to the FRP way of thinking. To be clear, assembling the array works, it just feels like I'm not doing the idiomatic thing.
I'm not using jQuery or another ajax library for this, so that's why I'm not using Bacon.fromPromise
You also may wonder why my consuming module wants the whole set instead of just the incremental update. If it were just appending rows that could be ok, but in my case it's an infinite scroll and it should draw data if both: 1. data is available and 2. area is on screen.
This can be done with the .scan() method. You will also need a stream that emits the items of one page; you can create it with .repeat().
Here is draft code (sorry, not tested):
var itemsPerPage = Bacon.repeat(function (index) {
  var pageNumber = index + 1;
  if (pageNumber < PAGE_LIMIT) {
    return Bacon.fromCallback(function (callback) {
      // your method that talks to the server
      getDataForAPage(pageNumber, callback);
    });
  } else {
    return false;
  }
});

var allItems = itemsPerPage.scan([], function (allItems, itemsFromAPage) {
  return allItems.concat(itemsFromAPage);
});

// Here you go
allItems.onValue(function (allUnicorns) {
  console.log("Got " + allUnicorns.length + " Unicorns");
  // ... some real display work
});
As you may have noticed, you also won't need the .onValue(function(){}) hack or the external curPage state.
Here is a solution using flatMap and fold. When dealing with the network you have to remember that the data can come back in a different order than you sent the requests in - that's why the combination of fold and map is used.
var pages = Bacon.fromArray([1, 2, 3, 4, 5])

var requests = pages.flatMap(function (page) {
  return doAjax(page)
    .map(function (value) {
      return {
        page: page,
        value: value
      }
    })
}).log("Data received")

var allData = requests.fold([], function (arr, data) {
  return arr.concat([data])
}).map(function (arr) {
  // I would normally write this as a one-liner
  var sorted = _.sortBy(arr, "page")
  var onlyValues = _.pluck(sorted, "value")
  var inOneArray = _.flatten(onlyValues)
  return inOneArray
})

allData.log("All data")

function doAjax(page) {
  // This would actually be Bacon.fromPromise($.ajax...)
  // Math.random to simulate the fact that requests can return out
  // of order
  return Bacon.later(Math.random() * 3000, [
    "Page" + page + "Item1",
    "Page" + page + "Item2"])
}
http://jsbin.com/damevu/4/edit

PhantomJS page fetching with nested loop to get new pages

I want to fetch a list online from a certain URL that is in JSON format, and then use the DATA_ID from each item in that list to call a new URL. I'm new to PhantomJS and I can't figure out why nested loops inside page.open() act all weird. Also, the way phantom.exit() has to be used seems really strange for what I want to achieve.
Here's my code:
console.log('Loading recipes');
console.log('===============================================================');

var page = require('webpage').create();
var url = 'http://www.hiddenurl.com/recipes/all';

page.open(url, function (status) {
    // Page is loaded!
    var js = page.evaluate(function () {
        return document.getElementsByTagName('pre')[0];
    });
    var recipes = JSON.parse(js.innerHTML).results;
    //console.log(recipes[0].name.replace('[s]', ''));

    for (i = 0; i < recipes.length; i++) {
        console.log(recipes[i].name.replace('[s]', ''));

        var craft_page = require('webpage').create();
        var craft_url = 'http://www.hiddenurl.com/recipe/' + recipes[i].data_id;

        craft_page.open(craft_url, function (craft_status) {
            // Page is loaded!
            var craft_js = craft_page.evaluate(function () {
                return document.getElementsByTagName('body')[0];
            });
            var craftp = craft_js.innerHTML;
            console.log('test');
        });

        if (i == 5) {
            console.log('===============================================================');
            phantom.exit();
            //break;
        }
    }
});
The thing that happens here is that this line:
console.log(recipes[i].name.replace('[s]', ''));
...prints the following:
===============================================================
Item from DATA_ID 1
Item from DATA_ID 2
Item from DATA_ID 3
Item from DATA_ID 4
Item from DATA_ID 5
...then it just prints this:
===============================================================
...followed by:
'test'
'test'
'test'
'test'
'test'
Why is this not happening serially? The data from the inner page.open() calls gets heaped up and dumped at the end, even after phantom.exit() should already have been called.
Also, when I loop freely over a normal data set I get this error:
QEventDispatcherUNIXPrivate(): Unable to create thread pipe: Too many open files
2013-01-31T15:35:18 [FATAL] QEventDispatcherUNIXPrivate(): Can not continue without a thread pipe
Abort trap: 6
Is there any way I can set GLOBAL_PARAMETERS or direct the process in some way so I can handle hundreds of page requests?
Thanks in advance!
I've made a workaround with Python by calling PhantomJS separately through the shell, like this:
import os
import json
cmd = "./phantomjs fetch.js"
fin,fout = os.popen4(cmd)
result = fout.read()
recipes = json.loads(result)
print recipes['count']
Not the actual solution for the PhantomJS issue, but it's a working solution and has less problems with memory and code-structure.
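For completeness, the usual pure-PhantomJS way around this (a rough, untested sketch along the lines of the question's code, not the solution actually used) is to drop the for loop and open the detail pages one at a time on a single page object, recursively, so the next request only starts after the previous one has finished:

var craft_page = require('webpage').create();

function fetchRecipe(recipes, index) {
    if (index >= recipes.length) {
        console.log('===============================================================');
        phantom.exit();
        return;
    }
    var craft_url = 'http://www.hiddenurl.com/recipe/' + recipes[index].data_id;
    craft_page.open(craft_url, function (craft_status) {
        // evaluate() can only return serializable data, so return the HTML string itself
        var craftp = craft_page.evaluate(function () {
            return document.getElementsByTagName('body')[0].innerHTML;
        });
        console.log(recipes[index].name.replace('[s]', ''));
        // ...process craftp here...
        fetchRecipe(recipes, index + 1); // only now start the next request
    });
}

// inside the first page.open callback, after parsing `recipes`:
// fetchRecipe(recipes, 0);

This keeps the requests serial, so phantom.exit() is only reached after the last page has been processed, and only one webpage instance is open at a time (which also avoids the "Too many open files" error).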

Problems making GET request from jQuery

I'm trying to make an HTTP GET request using the jQuery get() function, but I'm having some trouble.
Here's what my code looks like:
// get the links on the page
var pageLinks = $.find('#pageLinks');

// loop through each of the links
$(pageLinks).find('a').each(function () {
    if ($(this).attr('title') !== "Next Page") {
        // make a GET request to the URL of this link
        $.get($(this).attr("href"), function (data) {
            console.log("here");
            var temp = parse_page(data);
            // concatenate the return string with another
            bdy = bdy + String(temp);
            console.log("done");
        });
    }
});
There are multiple pages that I need to get data from. Since the get() function is asynchronous, I get the pages in a random order. Secondly, the concatenation does not work. Even though I get each of the pages, they're not put into bdy.
Can anyone suggest how I might deal with this?
Thanks a lot!!
Construct bdy after all pages are retrieved, i.e. store get results in a dictionary or array; wait for all gets to finish; then assemble them in the correct order.
I tried this one and it works:
// get the links on the page
var pageLinks = $('a');
var bdy = "";

// loop through each of the links
$(pageLinks).each(function () {
    console.log(this);
    // make a GET request to the URL of this link
    $.get($(this).attr("href"), function (data) {
        // concatenate the return string with another
        bdy = bdy + data.toString();
        console.log(bdy);
    });
});
As an example of what #muratgu has said:
var results = [];
var count = 0;

function allDone() {
    var bdy = results.join("");
    // do stuff with bdy
}

// get the links on the page
var pageLinks = $.find('#pageLinks');

// filter the links so we're left with the links we want
var wantedLinks = $(pageLinks).find('a').filter(function (idx) {
    return $(this).attr('title') !== "Next Page";
});

// remember how many links we're working on
count = wantedLinks.length;

// loop through each of the links
wantedLinks.each(function (idx) {
    // make a GET request to the URL of this link
    $.get($(this).attr("href"), function (data) {
        console.log("here");
        var temp = parse_page(data);
        results[idx] = temp;
        // Decrement the count.
        count--;
        if (count === 0) {
            // All done.
            allDone();
        }
    });
});
You could go further and abstract this into a data type that can perform N async downloads, and then notify you when all are complete.
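As a rough sketch of that idea (assuming jQuery 1.5+, where $.get returns a promise-like jqXHR, and reusing the wantedLinks and parse_page names from the answer above; untested), $.when can collect all the requests and fire once when every one has resolved, with the responses delivered in request order:

// one jqXHR per link, in document order
var requests = wantedLinks.map(function () {
    return $.get($(this).attr("href"));
}).get();

// $.when resolves once all requests are done; with several requests each
// argument to the callback is an array of [data, statusText, jqXHR]
$.when.apply($, requests).done(function () {
    var pages = Array.prototype.slice.call(arguments);
    var bdy = $.map(pages, function (args) {
        return parse_page(args[0]);
    }).join("");
    // do stuff with bdy
});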
I just found that there are modules that allow one to manage the control flow in JS. The ones I found are:
Async
Step
For help using the above modules, see my follow up question here.
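As a rough illustration of the first of those (a hedged sketch assuming the caolan/async library plus the wantedLinks and parse_page names from above; not from the linked follow-up), async.map runs the downloads in parallel but hands the results back in input order:

var urls = wantedLinks.map(function () {
    return $(this).attr("href");
}).get();

async.map(urls, function (url, callback) {
    // node-style callback: (error, result)
    $.get(url)
        .done(function (data) { callback(null, parse_page(data)); })
        .fail(function (xhr, status, err) { callback(err || status); });
}, function (err, pages) {
    if (err) { return console.error(err); }
    var bdy = pages.join(""); // pages are in the same order as urls
    // do stuff with bdy
});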
