How do I parse multiple pages? - javascript

I have been attempting to parse a site's table data into a JSON file, which I can do if I do each page one by one, but seeing as there are 415 pages that would take a while.
I have seen and read a lot of StackOverflow questions on this subject but I don't seem able to modify my script so that it:
Scrapes each page and extracts the 50 items (with item IDs) per page
Does so in a rate-limited way so I don't negatively affect the server
Waits until all requests are done, so I can write each item + item ID to a JSON file.
I believe you should be able to do this using request-promise and Promise.all, but I cannot figure it out.
The actual scraping of the data is fine; I just cannot make the code scrape a page, then go to the next URL with a delay or pause in between requests.
The code below is the closest I have got, but I get the same results multiple times and I cannot slow the request rate down.
Example of the page URLs:
http://test.com/itemlist/1
http://test.com/itemlist/2
http://test.com/itemlist/3 etc. (up to 415)
for (var i = 1; i <= noPages; i++) {
    urls.push({url: itemURL + i});
    console.log(itemURL + i);
}
Promise.map(urls, function(obj) {
    return rp(obj).then(function(body) {
        var $ = cheerio.load(body);
        //Some calculations again...
        rows = $('table tbody tr');
        $(rows).each(function(index, row) {
            var children = $(row).children();
            var itemName = children.eq(1).text().trim();
            var itemID = children.eq(2).text().trim();
            var itemObj = {
                "id" : itemID,
                "name" : itemName
            };
            itemArray.push(itemObj);
        });
        return itemArray;
    });
}, {concurrency : 1}).then(function(results) {
    console.log(results);
    for (var i = 0; i < results.length; i++) {
        // access the result's body via results[i]
        //console.log(results[i]);
    }
}, function(err) {
    // handle all your errors here
    console.log(err);
});
Apologies if I have misunderstood node.js and its modules; I don't really use the language, but I needed to scrape some data and I really don't like Python.

Since you need the requests to run only one by one, Promise.all() would not help.
A recursive promise (I'm not sure if that's the correct name) would.
function fetchAllPages(list) {
    if (!list || !list.length) return Promise.resolve(); // trivial exit
    var urlToFetch = list.pop();
    return fetchPage(urlToFetch).
        then(<wrapper that returns a Promise which resolves after a delay>).
        then(function() {
            return fetchAllPages(list); // recursion!
        });
}
This code still lacks error handling.
Also, I believe it can become much clearer with async/await:
for (let url of urls) {
    await fetchAndProcess(url);
    await <wrapper around setTimeout>;
}
but you need to find or write your own implementations of fetch() and setTimeout() that work with promises.
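A minimal sketch of such wrappers (assuming fetchAndProcess is any promise-returning fetch-and-parse function, as in the pseudocode above):
// delay() wraps setTimeout in a promise so it can be awaited
function delay(ms) {
    return new Promise(function(resolve) {
        setTimeout(resolve, ms);
    });
}
// sequential, rate-limited loop: each request finishes before the next starts
async function fetchAllPages(urls) {
    for (let url of urls) {
        await fetchAndProcess(url); // fetch and scrape one page
        await delay(1000);          // pause between requests
    }
}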

After input from @skyboyer suggesting recursive promises, I was led to a GitHub Gist called Sequential execution of Promises using reduce().
First I created my array of URLs:
for (var i = 1; i <= noPages; i++) {
    //example urls[0] = "http://test.com/1"
    //example urls[1] = "http://test.com/2"
    urls.push(itemURL + i);
    console.log(itemURL + i);
}
Then
var sequencePromise = urls.reduce(function(promise, url) {
    return promise.then(function(results) {
        //fetchIDsFromURL is async (it returns a promise in this case)
        //when the promise resolves I have my page data
        return fetchIDsFromURL(url)
            //wait out the delay, then pass the scraped items through
            .then(itemArr => promiseWithDelay(9000).then(() => itemArr))
            .then(itemArr => {
                results.push(itemArr);
                //calling return inside the .then method will make sure the data you want is passed onto the next
                return results;
            });
    });
}, Promise.resolve([]));
// async
function fetchIDsFromURL(url)
{
    return new Promise(function(resolve, reject){
        request(url, function(err, res, body){
            if (err) return reject(err); // propagate request failures
            //console.log(body);
            var $ = cheerio.load(body);
            rows = $('table tbody tr');
            $(rows).each(function(index, row) {
                var children = $(row).children();
                var itemName = children.eq(1).text().trim();
                var itemID = children.eq(2).text().trim();
                var itemObj = {
                    "id" : itemID,
                    "name" : itemName
                };
                //push the 50 per-page scraped items into an array and resolve with
                //the array to send the data back from the promise
                itemArray.push(itemObj);
            });
            resolve(itemArray);
        });
    });
}
//returns a promise that resolves after the timeout
function promiseWithDelay(ms)
{
    return new Promise(function(resolve){
        setTimeout(resolve, ms); // resolve once the timeout fires
    });
}
Then finally call .then on the sequence of promises. The only issue I had with this was that results contained multiple arrays with the same data in each (since every page pushes into the same shared itemArray), so I just take the first one, which has all my parsed items with IDs in it, and write it to a JSON file.
sequencePromise.then(function(results){
    console.log(results[0]);
    writeToFile(results[0]);
});
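A hedged variation (not part of the original answer) that avoids the duplicated arrays entirely is to build a fresh local array per page instead of pushing into the shared itemArray:
function fetchIDsFromURL(url)
{
    return new Promise(function(resolve, reject){
        request(url, function(err, res, body){
            if (err) return reject(err);
            var $ = cheerio.load(body);
            var pageItems = []; // local to this page, so pages don't share state
            $('table tbody tr').each(function(index, row) {
                var children = $(row).children();
                pageItems.push({
                    "id" : children.eq(2).text().trim(),
                    "name" : children.eq(1).text().trim()
                });
            });
            resolve(pageItems);
        });
    });
}
With that change, results holds one array per page, and they can be flattened with [].concat.apply([], results) before writing to the file.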

Related

Firebase not receiving data before view loaded - empty array returned before filled

In the following code I save each item's key and an email address in one table, then use that key to fetch the object from the original table. I can see that the items are being put into the rawList array when I console.log, but the function is returning this.cartList before it has anything in it, so the view doesn't receive any of the data. How can I make it so that this.cartList waits for rawList to be full before it is returned?
ionViewWillEnter() {
    var user = firebase.auth().currentUser;
    this.cartData.getCart().on('value', snapshot => {
        let rawList = [];
        snapshot.forEach(snap => {
            if (user.email == snap.val().email) {
                var desiredItem = this.goodsData.findGoodById(snap.val().key);
                desiredItem.once("value")
                    .then(function(snapshot2) {
                        rawList.push(snapshot2);
                    });
                return false;
            }
        });
        console.log(rawList);
        this.cartList = rawList;
    });
}
I have tried putting the this.cartList = rawList in a number of different locations (before return false, even inside the .then statement), but that did not solve the problem.
The following function call is asynchronous, and you're falling out of scope before rawList has a chance to update, because this database call takes a reasonably long time:
desiredItem.once("value").then(function(snapshot2) {
    rawList.push(snapshot2);
});
You're also pushing the snapshot directly to this list, when you should be pushing snapshot2.val() to get the raw value.
Here's how I would fix your code:
ionViewWillEnter() {
    var user = firebase.auth().currentUser;
    this.cartData.getCart().on('value', snapshot => {
        // clear the existing `this.cartList`
        this.cartList = [];
        snapshot.forEach(snap => {
            if (user.email == snap.val().email) {
                var desiredItem = this.goodsData.findGoodById(snap.val().key);
                desiredItem.once("value")
                    .then(snapshot2 => {
                        // push directly to the cartList; an arrow function
                        // keeps `this` bound to the component
                        this.cartList.push(snapshot2.val());
                    });
            }
            return false;
        });
    });
}
The problem is the Promise (the async .once() call to Firebase) inside the forEach loop (which is synchronous). The forEach loop is not going to wait for the then() callback, so on the next iteration the data from the previous iteration just isn't there yet...
let snapshots = [1, 2, 3];
let rawList = [];
snapshots.forEach((snap) => {
    console.log(rawList.length);
    let fbCall = new Promise((resolve, reject) => {
        setTimeout(function() {
            resolve("Success!");
        }, 2500);
    });
    fbCall.then((result) => {
        rawList.push(result);
    });
});
You need forEach to push the whole Promise to rawList, and then wait for them all to resolve and do something with the results.
var snapshots = [1, 2, 3];
var rawList = [];
var counter = 0;
snapshots.forEach((snap) => {
    console.log(rawList.length);
    var fbCall = new Promise((resolve, reject) => {
        setTimeout(function() {
            resolve("Success!" + counter++);
        }, 1500);
    });
    rawList.push(fbCall);
});
Promise.all(rawList).then((res) => {
    console.log(res[0]);
    console.log(res[1]);
    console.log(res[2]);
});
The thing is, it is still a bit awkward to assign this.cartList = Promise.all(rawList), as that makes it a Promise. So you might want to rethink your design and make something like a getCartList service? (I don't know what your app is like :p)
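Applied to the original ionViewWillEnter, that pattern might look like this (a hedged sketch; cartData and goodsData are the asker's own services):
ionViewWillEnter() {
    var user = firebase.auth().currentUser;
    this.cartData.getCart().on('value', snapshot => {
        let pending = [];
        snapshot.forEach(snap => {
            if (user.email == snap.val().email) {
                // collect the promise itself instead of its eventual value
                pending.push(this.goodsData.findGoodById(snap.val().key).once("value"));
            }
            return false;
        });
        // assign cartList only once every fetch has resolved
        Promise.all(pending).then(snaps => {
            this.cartList = snaps.map(s => s.val());
        });
    });
}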
Since you're using Angular you should also be using angularfire2, which makes use of Observables and will solve this issue for you. You will still use the plain SDK for many things, but for fetching and binding data it is not recommended to use Firebase alone without angularfire2, as that makes these things less manageable.
The nice thing about this approach is that you can leverage any method on Observable, such as filter, first, map etc.
After installing it simply do:
public items$: FirebaseListObservable<any[]>;
this.items$ = this.af.database.list('path/to/data');
And in the view:
{{items$ | async}}
In order to wait for the data to appear.
Use AngularFire2 and RxJS; this will save you a lot of time, and you will do it in a proper, maintainable way by using the RxJS operators. You can learn about those operators at learnrxjs.
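For example, operators compose directly on the list observable (a hedged sketch; the path, the email filter, and the filteredItems$ name are illustrative, and the patch-style map operator import is assumed):
// import 'rxjs/add/operator/map';
this.filteredItems$ = this.af.database.list('path/to/data')
    .map(items => items.filter(item => item.email === user.email));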

nodejs - Help "promisifying" a file read with nested promises

So I've recently delved into trying to understand promises and the purpose behind them, due to JavaScript's asynchronous behavior. While I "think" I understand, I still struggle with how to promisify something so it returns a future value, then execute a new block of code to do something else. The two main node modules I'm using:
pg-promise
exceljs
What I'd like to do is read a file, then once it is fully read, iterate over each worksheet executing DB commands. Then once all worksheets are processed, go back and delete the original file I read. Here is the code I have. It works to the point that everything writes into the database just fine, even when there are multiple worksheets. What I don't have working is identifying when all the worksheets have been fully processed, and then removing the file.
workbook.csv.readFile(fileName)
    .then(function () {
        // this array I was going to use to somehow populate a true/false array.
        // Then when done with each sheet, push a true into the array.
        // When all elements were true could signify all the processing is done...
        // but have no idea how to utilize this!
        // So left it in to take up space because wtf...
        var arrWorksheetComplete = [];
        workbook.eachSheet(function (worksheet) {
            console.log(worksheet.name);
            db.tx(function (t) {
                var insertStatements = [];
                for (var i = 2; i <= worksheet._rows.length; i++) {
                    // here we create a new array from the worksheet, as we need a 0 index based array.
                    // the worksheet values actually begin at element 1. We will splice to dump the undefined element at index 0.
                    // This will allow the batch promises to work correctly... otherwise everything will be offset by 1
                    var arrValues = Array.from(worksheet.getRow(i).values);
                    arrValues.splice(0, 1);
                    // these queries are upserts. Inserts will occur first; if they error on the constraint, an update will occur instead.
                    insertStatements.push(t.one('insert into rq_data' +
                        '(col1, col2, col3) ' +
                        'values($1, $2, $3) ' +
                        'ON CONFLICT ON CONSTRAINT key_constraint DO UPDATE SET ' +
                        '(prodname) = ' +
                        '($3) RETURNING autokey',
                        arrValues));
                }
                return t.batch(insertStatements);
            })
            .then(function (data) {
                console.log('Success:', 'Inserted/Updated ' + data.length + ' records');
            })
            .catch(function (error) {
                console.log('ERROR:', error.message || error);
            });
        });
    });
I would like to be able to say:
.then(function() {
    // everything processed!
    removeFile(fileName);
    // this probably also wouldn't work as by now fileName is out of context?
});
But I'm super confused about having a promise inside a promise: I have the db.tx call, which is essentially a promise, nested inside the .eachSheet function.
Please help a dumb programmer understand! Been beating my head against the wall for hours on this one. :)
If I understand correctly, you're trying to chain promises.
I suggest you read this great article on Promise anti-patterns (see 'The Collection Kerfuffle' section).
If you need to execute promises in series, that article suggests using reduce.
I'll rewrite your snippet to:
workbook.csv.readFile(fileName).then(function () {
    processWorksheets().then(function() {
        // all worksheets processed!
    });
});

function processWorksheets() {
    var worksheets = [];
    // first, build an array of worksheets
    workbook.eachSheet(function (worksheet) {
        worksheets.push(worksheet);
    });
    // then chain promises using Array.reduce
    return worksheets.reduce(function(promise, item) {
        // promise is the value returned in the previous invocation of the callback.
        // item is a worksheet.
        // when the previous promise resolves, call saveWorksheet on the next worksheet
        return promise.then(function(result) {
            return saveWorksheet(item, result);
        });
    }, Promise.resolve()); // start the chain with a 'fake' promise
}

// this method returns a promise
function saveWorksheet(worksheet, result) {
    return db.tx(function (t) {
        var insertStatements = [];
        for (var i = 2; i <= worksheet._rows.length; i++) {
            // here we create a new array from the worksheet, as we need a 0 index based array.
            // the worksheet values actually begin at element 1. We will splice to dump the undefined element at index 0.
            // This will allow the batch promises to work correctly... otherwise everything will be offset by 1
            var arrValues = Array.from(worksheet.getRow(i).values);
            arrValues.splice(0, 1);
            // these queries are upserts. Inserts will occur first; if they error on the constraint, an update will occur instead.
            insertStatements.push(t.one('insert into rq_data' +
                '(col1, col2, col3) ' +
                'values($1, $2, $3) ' +
                'ON CONFLICT ON CONSTRAINT key_constraint DO UPDATE SET ' +
                '(prodname) = ' +
                '($3) RETURNING autokey',
                arrValues));
        }
        return t.batch(insertStatements);
    })
    // the two handlers below are just logging and could be removed...
    .then(function (data) {
        console.log('Success:', 'Inserted/Updated ' + data.length + ' records');
    })
    .catch(function (error) {
        console.log('ERROR:', error.message || error);
        throw error; // keep the chain rejected so callers can see the failure
    });
}
Don't forget to include the promise module:
var Promise = require('promise');
I haven't tested my code; it could contain some typos.
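To also get the file removal the question asked about, a cleanup step could be chained after processWorksheets() (a hedged sketch using Node's built-in fs module; note fileName is still in scope here):
var fs = require('fs');

workbook.csv.readFile(fileName)
    .then(processWorksheets)
    .then(function () {
        // all worksheets processed, so it is safe to delete the source file
        fs.unlink(fileName, function (err) {
            if (err) console.log('ERROR removing file:', err.message);
        });
    });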

javascript/jquery: Iterative called function; wait till the previous call is finished

I have a problem with a library calling a function on each item. I have to check the state of each item via an ajax request, and I don't want to make one request per item, but rather fetch a range of item states.
Because these items are dates, I can build a range pretty easily - that's the good part :)
So, to give some code ...
var itemStates = {};
var libraryObj = {
    itemCallback: function(item) {
        return checkState(item);
    }
};

function checkState(item) {
    if (!itemStates.hasOwnProperty(item)) {
        $.get('...', function(result) {
            $.extend(true, itemStates, result);
        });
    }
    return itemStates[item];
}
The library is now calling libraryObj.itemCallback() on each item, but I want to wait for the request made in checkState() before calling checkState() again (because the chance is extremely high that the next item's state was already requested within the previous request).
I read about defer and wait(), then() and so on, but couldn't really get an idea of how to implement this.
Many thanks to everybody who could help me with this :)
You can achieve this by using jQuery.Deferred or a JavaScript Promise. In the following code, itemCallback() will wait for previous calls to finish before calling checkState().
var queue = [];
var itemStates = {};
var libraryObj = {
    itemCallback: function(item) {
        var def = $.Deferred();
        $.when.apply(null, queue)
            .then(function() {
                return checkState(item);
            })
            .then(function(result) {
                def.resolve(result);
            });
        queue.push(def.promise());
        return def.promise();
    }
};
function checkState(item) {
    var def = $.Deferred();
    if (!itemStates.hasOwnProperty(item)) {
        $.get('...', function(result) {
            $.extend(true, itemStates, result);
            def.resolve(itemStates[item]);
        });
    } else {
        def.resolve(itemStates[item]);
    }
    return def.promise();
}
//these will execute in order, waiting for the previous call
libraryObj.itemCallback(1).done(function(r) { console.log(r); });
libraryObj.itemCallback(2).done(function(r) { console.log(r); });
libraryObj.itemCallback(3).done(function(r) { console.log(r); });
libraryObj.itemCallback(4).done(function(r) { console.log(r); });
libraryObj.itemCallback(5).done(function(r) { console.log(r); });
The same example built with JavaScript Promises:
var queue = [];
var itemStates = {};
var libraryObj = {
    itemCallback: function(item) {
        var promise = new Promise(resolve => {
            Promise.all(queue)
                .then(() => checkState(item))
                .then((result) => resolve(result));
        });
        queue.push(promise);
        return promise;
    }
};
function checkState(item) {
    return new Promise(resolve => {
        if (item in itemStates)
            resolve(itemStates[item]);
        else {
            $.get('...', function(result) {
                $.extend(true, itemStates, result);
                resolve(itemStates[item]);
            });
        }
    });
}
//these will execute in order, waiting for the previous call
libraryObj.itemCallback(1).then(function(r) { console.log(r); });
libraryObj.itemCallback(2).then(function(r) { console.log(r); });
libraryObj.itemCallback(3).then(function(r) { console.log(r); });
libraryObj.itemCallback(4).then(function(r) { console.log(r); });
libraryObj.itemCallback(5).then(function(r) { console.log(r); });
The library is now calling libraryObj.itemCallback() on each item, but I want to wait for the request made in checkState() before calling checkState() again (because the chance is extremely high that the next item's state was already requested within the previous request).
One thing I can think of doing is making a caching function that, depending on the last time it was called, returns the previous value or makes a new request:
var cached = function(self, cachingTime, fn) {
    var paramMap = {};
    return function() {
        var arr = Array.prototype.slice.call(arguments);
        var parameters = JSON.stringify(arr);
        var returning;
        if (!paramMap[parameters]) {
            returning = fn.apply(self, arr);
            paramMap[parameters] = {timeCalled: new Date(), value: returning};
        } else {
            var diffMs = Math.abs(paramMap[parameters].timeCalled - new Date());
            var diffMins = (diffMs / 1000) / 60;
            if (diffMins > cachingTime) {
                returning = fn.apply(self, arr);
                paramMap[parameters] = {timeCalled: new Date(), value: returning};
            } else {
                returning = paramMap[parameters].value;
            }
        }
        return returning;
    };
};
Then you'd wrap the ajax call into the function you've made
var fn = cached(null, 1, function(item) {
    return $.get('...', function(result) {
        $.extend(true, itemStates, result);
    });
});
Executing the new function will either get you the promise from the last request made with those parameters within the last minute, or make a new request.
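For instance (hypothetical calls, just to show the behaviour):
// first call fires the ajax request and caches the returned promise
fn('2018-06-01').done(function(result) { console.log('fetched', result); });
// a second call with the same parameter within a minute reuses the cached promise
fn('2018-06-01').done(function(result) { console.log('from cache', result); });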
The simplest and dirtiest way of taking control over the library is to override its methods.
But I don't really know the core problem here, so other hints are below.
If you have control over checkState, then just collect your data and change your controller on the server side to work with arrays - that's it.
If you don't know when the next checkState will be called, count your collection and make the request; use setTimeout to check the collection after some time, or setInterval to check it continuously.
If you don't want to get the same item multiple times, then store your checked items in some variable like alreadyChecked, and before making a request search for the item in alreadyChecked.
To be notified when the library is using your item, use a getter,
and then collect your items.
When you have collected enough items you can make the request;
when you haven't, use setTimeout and wait for some time. If nothing changes, it means the library has finished its iteration for now, and you can make the request with the items that are left.
let collection = []; // collection for the request
let _items = {}; // real items, for when you don't want to perform actions while getting values
let itemStates = {}; // items for the library
let timeoutId;

//instead of itemStates[someState]=someValue; use
function setItem(someState, someValue) {
    Object.defineProperty(itemStates, someState, { get: function () {
        if (typeof timeoutId == "number") clearTimeout(timeoutId);
        //here you can add someState to the collection for the request
        collection.push(_items[someState]);
        if (collection.length >= 10) {
            makeRequest();
        } else {
            timeoutId = setTimeout(() => {...checkCollectionAndMakeRequest...}, someTime);
        }
        return someValue;
    } });
}

Return an object from a function that has nested ajax calls

I would like to write a JavaScript function that returns information about YouTube videos; to be more specific, I would like to get the ID and the length of the videos returned by a search, as a JSON object. So I took a look at the YouTube API and came up with this solution:
function getYoutubeDurationMap(query) {
    var youtubeSearchReq = "https://gdata.youtube.com/feeds/api/videos?q=" + query +
        "&max-results=20&duration=long&category=film&alt=json&v=2";
    var youtubeMap = [];
    $.getJSON(youtubeSearchReq, function(youtubeResult) {
        var youtubeVideoDetailReq = "https://gdata.youtube.com/feeds/api/videos/";
        for (var i = 0; i < youtubeResult.feed.entry.length; i++) {
            var youtubeVideoId = youtubeResult.feed.entry[i].id.$t.substring(27);
            $.getJSON(youtubeVideoDetailReq + youtubeVideoId + "?alt=json&v=2", function(videoDetails) {
                youtubeMap.push({id: videoDetails.entry.id.$t.substring(27), runtime: videoDetails.entry.media$group.media$content[0].duration});
            });
        }
    });
    return youtubeMap;
}
The logic is OK, but as many of you will already have understood, because of ajax, when I call this function I get an empty array. Is there any way to get the complete object? Should I use a Deferred object? Thanks for your answers.
Yes, you should use deferred objects.
The simplest approach here is to create an array into which you can store the jqXHR result of your inner $.getJSON() calls.
var def = [];
for (var i = 0; ...) {
    def[i] = $.getJSON(...).done(function(videoDetails) {
        ... // extract and store in youtubeMap
    });
}
and then at the end of the whole function, use $.when to create a new promise that will be resolved only when all of the inner calls have finished:
return $.when.apply($, def).then(function() {
    return youtubeMap;
});
and then use .done to handle the result from your function:
getYoutubeDurationMap(query).done(function(map) {
    // map contains your results
});
See http://jsfiddle.net/alnitak/8XQ4H/ for a demonstration, using this YouTube API, of how deferred objects allow you to completely separate the AJAX calls from the subsequent data processing for your "duration search".
The code is reproduced here too. Whilst it is longer than you might expect, note that the generic functions herein are now reusable for any calls you might want to make to the YouTube API.
// generic search - some of the fields could be parameterised
function youtubeSearch(query) {
    var url = 'https://gdata.youtube.com/feeds/api/videos';
    return $.getJSON(url, {
        q: query,
        'max-results': 20,
        duration: 'long', category: 'film', // parameters?
        alt: 'json', v: 2
    });
}

// get details for one YouTube vid
function youtubeDetails(id) {
    var url = 'https://gdata.youtube.com/feeds/api/videos/' + id;
    return $.getJSON(url, {
        alt: 'json', v: 2
    });
}

// get the details for *all* the vids returned by a search
function youtubeResultDetails(result) {
    var details = [];
    var def = result.feed.entry.map(function(entry, i) {
        var id = entry.id.$t.substring(27);
        return youtubeDetails(id).done(function(data) {
            details[i] = data;
        });
    });
    return $.when.apply($, def).then(function() {
        return details;
    });
}

// use deferred composition to do a search and then get all details
function youtubeSearchDetails(query) {
    return youtubeSearch(query).then(youtubeResultDetails);
}

// this code (and _only_ this code) is specific to your requirement to
// return an array of {id, duration}
function youtubeDetailsToDurationMap(details) {
    return details.map(function(detail) {
        return {
            id: detail.entry.id.$t.substring(27),
            duration: detail.entry.media$group.media$content[0].duration
        };
    });
}

// and calling it all together
youtubeSearchDetails("after earth").then(youtubeDetailsToDurationMap).done(function(map) {
    // use map[i].id and .duration
});
As you have discovered, you can't return youtubeMap directly, as it's not yet populated at the point of return. But you can return a promise of a fully populated youtubeMap, which can be acted on with e.g. .done(), .fail() or .then().
function getYoutubeDurationMap(query) {
    var youtubeSearchReq = "https://gdata.youtube.com/feeds/api/videos?q=" + query + "&max-results=20&duration=long&category=film&alt=json&v=2";
    var youtubeVideoDetailReq = "https://gdata.youtube.com/feeds/api/videos/";
    var youtubeMap = [];
    var dfrd = $.Deferred();
    var p = $.getJSON(youtubeSearchReq).done(function(youtubeResult) {
        $.each(youtubeResult.feed.entry, function(i, entry) {
            var youtubeVideoId = entry.id.$t.substring(27);
            //Build a .then() chain to perform sequential queries
            p = p.then(function() {
                return $.getJSON(youtubeVideoDetailReq + youtubeVideoId + "?alt=json&v=2").done(function(videoDetails) {
                    youtubeMap.push({
                        id: videoDetails.entry.id.$t.substring(27),
                        runtime: videoDetails.entry.media$group.media$content[0].duration
                    });
                });
            });
        });
        //Add a terminal .then() to resolve dfrd when all video queries are complete.
        p.then(function() {
            dfrd.resolve(query, youtubeMap);
        });
    });
    return dfrd.promise();
}
And the call to getYoutubeDurationMap() would be of the following form:
getYoutubeDurationMap("....").done(function(query, map) {
    alert("Query: " + query + "\nYouTube videos found: " + map.length);
});
Notes:
In practice, you would probably loop through map and display the .id and .runtime data.
Sequential queries are preferable to parallel queries, as sequential is kinder to both client and server, and more likely to succeed.
Another valid approach would be to return an array of separate promises (one per video) and to respond to completion with $.when.apply(...); however, the required data would be more awkward to extract, as the sketch below shows.
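Such a parallel version might look like this (a hedged sketch, not from the original answer; it relies on $.when passing each result through as a separate [data, statusText, jqXHR] triple):
function getYoutubeDurationMapParallel(query) {
    var youtubeSearchReq = "https://gdata.youtube.com/feeds/api/videos?q=" + query + "&max-results=20&duration=long&category=film&alt=json&v=2";
    var youtubeVideoDetailReq = "https://gdata.youtube.com/feeds/api/videos/";
    return $.getJSON(youtubeSearchReq).then(function(youtubeResult) {
        // one detail request per video, all in flight at once
        var requests = $.map(youtubeResult.feed.entry, function(entry) {
            var id = entry.id.$t.substring(27);
            return $.getJSON(youtubeVideoDetailReq + id + "?alt=json&v=2");
        });
        return $.when.apply($, requests).then(function() {
            // arguments holds one [data, statusText, jqXHR] triple per request
            return $.makeArray(arguments).map(function(args) {
                var entry = args[0].entry;
                return {
                    id: entry.id.$t.substring(27),
                    runtime: entry.media$group.media$content[0].duration
                };
            });
        });
    });
}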

node-mysql timing

I have a recursive query like this (note: this is just an example):
var user = function(data)
{
    this.minions = [];
    this.loadMinions = function()
    {
        _user = this;
        database.query('select * from users where owner=' + data.id, function(err, result, fields)
        {
            for (var m in result)
            {
                _user.minions[result[m].id] = new user(result[m]);
                _user.minions[result[m].id].loadMinions();
            }
        });
        console.log("loaded all minions");
    }
}
currentUser = new user(ID);
for (var m in currentUser.minions)
{
    console.log("minion found!");
}
This doesn't work because the timings are all wrong; the code doesn't wait for the query.
I've tried to do this:
var MyQuery = function(QueryString) {
    var Data;
    var Done = false;
    database.query(QueryString, function(err, result, fields) {
        Data = result;
        Done = true;
    });
    while (Done != true) {};
    return Data;
};

var user = function(data)
{
    this.minions = [];
    this.loadMinions = function()
    {
        _user = this;
        result = MyQuery('select * from users where owner=' + data.id);
        for (var m in result)
        {
            _user.minions[result[m].id] = new user(result[m]);
            _user.minions[result[m].id].loadMinions();
        }
        console.log("loaded all minions");
    }
}
currentUser = new user(ID);
for (var m in currentUser.minions)
{
    console.log("minion found!");
}
but it just freezes on the while loop. Am I missing something?
The first hurdle to solving your problem is understanding that I/O in Node.js is asynchronous. Once you know how this applies to your problem the recursive part will be much easier (especially if you use a flow control library like Async or Step).
Here is an example that does some of what you're trying to do (minus the recursion). Personally, I would avoid recursively loading a possibly unknown number/depth of records like that; instead, load them on demand, as in this example:
var User = function(data) {
    this.data = data;
    this.minions;
};

User.prototype.getMinions = function(primaryCallback) {
    var that = this; // scope handle
    if (this.minions) { // bypass the db query if results cached
        return primaryCallback(null, this.minions);
    }

    // Callback invoked by database.query when it has the records
    var aCallback = function(error, results, fields) {
        if (error) {
            return primaryCallback(error);
        }

        // This is where you would put your recursive minion initialization.
        // The problem you are going to have is callback counting; using a library
        // like async or step would make this part much much easier
        that.minions = results; // bypass the db query after this
        primaryCallback(null, results);
    };

    database.query('SELECT * FROM users WHERE owner = ' + data.id, aCallback);
};

var user = new User(someData);
user.getMinions(function(error, minions) {
    if (error) {
        throw error;
    }
    // Inside the function invoked by primaryCallback(...)
    minions.forEach(function(minion) {
        console.log('found this minion:', minion);
    });
});
The biggest thing to note in this example is the callbacks. The database.query(...) call is asynchronous, and you don't want to tie up the event loop waiting for it to finish. This is solved by providing a callback, aCallback, to the query, which is executed when the results are ready. Once that callback fires, and after you perform whatever processing you want on the records, you can fire the primaryCallback with the final results.
Each Node.js process is single-threaded, so the line
while(Done != true){};
takes over the thread, and the callback that would have set Done to true never gets run, because the thread is blocked in an infinite loop.
You need to refactor your program so that code that depends on the results of the query is included within the callback itself. For example, make MyQuery take a callback argument:
MyQuery = function(QueryString, callback){
Then call the callback at the end of your database.query callback -- or even supply it as the database.query callback.
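A minimal sketch of that refactor, keeping the names from the question:
var MyQuery = function(QueryString, callback) {
    // hand the rows straight to the caller's callback instead of busy-waiting
    database.query(QueryString, function(err, result, fields) {
        callback(err, result);
    });
};

// usage: anything that depends on the rows moves inside the callback
MyQuery('select * from users where owner=' + data.id, function(err, result) {
    if (err) throw err;
    for (var m in result) {
        console.log(result[m]);
    }
});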
The freezing is unfortunately correct behaviour, as Node is single-threaded.
You need a scheduler package to fix this. Personally, I have been using fibers-promise for this kind of issue. You might want to look at that or another promise library, or at async.
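With a promise library (or the Promise built into newer Node versions), the busy-wait version of MyQuery could be replaced by something like this (a hedged sketch, not tested against node-mysql):
function myQueryAsync(queryString) {
    return new Promise(function(resolve, reject) {
        database.query(queryString, function(err, result, fields) {
            if (err) return reject(err); // surface query errors
            resolve(result);
        });
    });
}

// the dependent code moves into .then() instead of spinning on a flag
myQueryAsync('select * from users where owner=' + data.id)
    .then(function(result) {
        for (var m in result) {
            console.log("minion found!");
        }
    })
    .catch(function(err) {
        console.log(err);
    });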
