This code doesn't seem to fire in order? - javascript

My problem is that the code does not seem to be running in order, as seen below.
This code is for my discord.js bot that I am creating.
var Discord = require("discord.js");
var bot = new Discord.Client();
var yt = require("C:/Users/username/Documents/Coding/Discord/youtubetest.js");
var youtubetest = new yt();
var fs = require('fs');
var youtubedl = require('youtube-dl');

var prefix = "!";
var vidid;

var commands = {
    play: {
        name: "!play ",
        fnc: "Gets a Youtube video matching given tags.",
        process: function(msg, query) {
            youtubetest.respond(query, msg);
            var vidid = youtubetest.vidid;
            console.log(typeof(vidid) + " + " + vidid);
            console.log("3");
        }
    }
};

bot.on('ready', () => {
    console.log('I am ready!');
});

bot.on("message", msg => {
    if (!msg.content.startsWith(prefix) || msg.author.bot || (msg.author.id === bot.user.id)) return;

    var cmdraw = msg.content.split(" ")[0].substring(1).toLowerCase();
    var query = msg.content.split("!")[1];
    var cmd = commands[cmdraw];

    if (cmd) {
        var res = cmd.process(msg, query, bot);
        if (res) {
            msg.channel.sendMessage(res);
        }
    } else {
        let msgs = [];
        msgs.push(msg.content + " is not a valid command.");
        msgs.push(" ");
        msgs.push("Available commands:");
        msgs.push(" ");
        msg.channel.sendMessage(msgs);
        msg.channel.sendMessage(commands.help.process(msg));
    }
});

bot.on('error', e => { console.error(e); });

bot.login("mytoken");
The youtubetest.js file:
var youtube_node = require('youtube-node');
var ConfigFile = require("C:/Users/username/Documents/Coding/Discord/json_config.json");
var mybot = require("C:/Users/username/Documents/Coding/Discord/mybot.js");

function myyt() {
    this.youtube = new youtube_node();
    this.youtube.setKey(ConfigFile.youtube_api_key);
    this.vidid = "";
}

myyt.prototype.respond = function(query, msg) {
    this.youtube.search(query, 1, function(error, result) {
        if (error) {
            msg.channel.sendMessage("There was an error finding requested video.");
        } else {
            vidid = 'http://www.youtube.com/watch?v=' + result.items[0].id.videoId;
            myyt.vidid = vidid;
            console.log("1");
        }
    });
    console.log("2");
};

module.exports = myyt;
As the code shows, I have an object for the commands that the bot will be able to process, and a function to run those commands when a message is received.
Throughout the code I have put three console.logs with 1, 2 and 3 showing the order in which I expect the parts of the code to run. When the code is run and a query is found, the output is this:
I am ready!
string +
2
3
1
This shows that the code is not running in the order I expect it to.
All help is very highly appreciated :)
*Update! Thank you all very much for helping me understand why it isn't working. I found that in the main file, at vidid = youtubetest.respond(query, msg), the variable is not assigned until the function is done, so the rest of my code runs on without the variable. To fix it I simply put an if statement that checks whether the variable is undefined and waits until it is defined.*
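For reference, a minimal sketch of that polling workaround (waitForVidid is an illustrative name; it assumes the youtubetest object from the question). A bare if-check in a loop would block the process, so a timer is used to re-check:

function waitForVidid(callback) {
    if (youtubetest.vidid) {
        callback(youtubetest.vidid); // value is ready, hand it off
    } else {
        setTimeout(function() { waitForVidid(callback); }, 50); // re-check shortly
    }
}

waitForVidid(function(vidid) {
    console.log(typeof vidid + " + " + vidid); // runs only once vidid is set
});

The answers below explain why a callback or Promise is the cleaner fix.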

As mentioned before, a lot of JavaScript runs asynchronously, hence the callback handlers. The reason it runs async is to avoid the rest of your code being "blocked" by remote calls. To avoid ending up in callback hell, most of us JavaScript developers are moving more and more over to Promises. So your code could then look more like this:
myyt.prototype.respond = function(query, msg) {
    var self = this; // plain callbacks get their own `this`, so keep a reference
    return new Promise(function(resolve, reject) {
        self.youtube.search(query, 1, function(error, result) {
            if (error) {
                reject("There was an error finding requested video."); // passed down to the ".catch" statement below
            } else {
                var vidid = 'http://www.youtube.com/watch?v=' + result.items[0].id.videoId;
                myyt.vidid = vidid;
                console.log("1");
                resolve(vidid); // Resolve marks the promise as successfully completed, and passes the value along to the ".then" method
            }
        });
    }).then(function(vidid) {
        // vidid is now the same as myyt.vidid above.
        console.log(vidid);
        return vidid; // pass it on to whoever consumes the returned promise
    }).catch(function(err) {
        // err contains the rejection reason from above
        msg.channel.sendMessage(err);
    });
};
This would naturally require a change in anything that uses this process, but creating your own prototypes seems a bit odd.
This promise returns the vidid, so you'd then set vidid = youtubetest.respond(query, msg);, and whenever that function gets called, you do:
vidid.then(function(id) {
    // id is now the vidid.
});
JavaScript runs async by design, and trying to hack your way around that leads you to dark places fast. As far as I can tell, you're also targeting Node.js, which means that once you start running something synchronously, you'll kill performance for other users, as everyone has to wait for that sync call to finish.
Some suggested reading:
http://callbackhell.com/
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise
https://stackoverflow.com/a/11233849/3646975
I'd also suggest looking up ES6 syntax, as it shortens your code and makes life a hell of a lot easier (native promises were only introduced in ES6, which Node.js 4 and above more or less supports).
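For illustration, here is a sketch of the same respond method in ES6 syntax; arrow functions keep this bound to the myyt instance (so no self variable is needed), and the youtube-node search signature is taken from the question:

myyt.prototype.respond = function(query, msg) {
    return new Promise((resolve, reject) => {
        // Arrow function: `this` is still the myyt instance here
        this.youtube.search(query, 1, (error, result) => {
            if (error) return reject("There was an error finding requested video.");
            resolve('http://www.youtube.com/watch?v=' + result.items[0].id.videoId);
        });
    });
};

// Usage inside an async function:
// const vidid = await youtubetest.respond(query, msg);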

In JavaScript, please remember that any callback function you pass to some other function is called asynchronously, i.e. the calls to the callback function may not happen "in order". "In order" in this case means the order they appear in the source file.
The callback function is simply called on a certain event:
when there is data to be processed,
on error,
in your case, for example, when the YouTube search results are ready,
when the 'ready' event is received or a 'message' is received,
etc.
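A tiny self-contained illustration of that, with setTimeout standing in for any asynchronous event source:

console.log("A");              // runs first
setTimeout(function() {
    console.log("C");          // callback is queued; runs last
}, 0);
console.log("B");              // runs before the callback

// Output: A, B, C - source order is not execution order.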

Related

Best way to turn JSON change into event

I'm creating a YouTube upload notification bot for a Discord server I am in, using the YouTube RSS feed, and am having problems with it: the bot sends the same video twice even though I've tried everything to fix it. The bot cycles through different users in a for loop and checks each user's latest video ID against one stored in a JSON file. If they do not match, it sends a message and updates the JSON. Here is my current code:
function update(videoId, n) {
    var u = JSON.parse(fs.readFileSync("./jsons/uploads.json"))
    u[n].id = videoId
    fs.writeFile("./jsons/uploads.json", JSON.stringify(u, null, 2), (err) => {
        if (err) throw err;
        // client.channels.cache.get("776895633033396284").send()
        console.log('Hey, Listen! ' + n + ' just released a new video! Go watch it: https://youtu.be/' + videoId + "\n\n")
    });
}

async function uploadHandler() {
    try {
        var u = require('./jsons/uploads.json');
        var users = require('./jsons/users.json');
        for (i = 0; i < Object.keys(users).length; i++) {
            // sleep(1000)
            setTimeout(function(i) {
                var username = Object.keys(users)[i]
                let xml = f("https://www.youtube.com/feeds/videos.xml?channel_id=" + users[username]).text()
                parseString(xml, function(err, result) {
                    if (err) {} else {
                        let videoId = result.feed.entry[0]["yt:videoId"][0]
                        let isMatch = u[username].id == videoId ? true : false
                        if (isMatch) {} else {
                            if (!isMatch) {
                                u[username] = videoId
                                update(videoId, username)
                            }
                        }
                    }
                });
            }, i * 1000, i)
        }
    } catch (e) {
        console.log(e)
    }
}
My code is rather simple, but I've had the same issue with other code that uses this method, so what would be the best way to accomplish this? Any advice is appreciated.
There are a few issues with your code that I would call out right off the bat:
Empty blocks. You use this especially with your if statements, e.g. if (condition) {} else { // Do the thing }. Instead, you should negate the condition, e.g. if (!condition) { // Do the thing }.
You declare the function uploadHandler as async, but you never await anything inside it. I'm suspecting that f is your asynchronous, Promise-returning call that you're trying to handle.
You've linked the duration of the timeout to your incrementing variable, so in the first run of your for block, the timeout will wait zero seconds (i is 0, times 1000), then one second, then two seconds, then three...
Here's a swag at a refactor with some notes that I hope are helpful in there:
// Only require these values once
// (fs was already required in your script; repeated here so the snippet is self-contained)
const fs = require('fs');
const u = require('./jsons/uploads.json');
const users = require('./jsons/users.json');

// This just makes the code a little more readable, I think
const URL_BASE = 'https://www.youtube.com/feeds/videos.xml?channel_id=';

function uploadHandler() {
    Object.keys(users).forEach(username => {
        // We will run this code once for each username that we find in users
        // I am assuming `f` is a Promise. When it resolves, we'll have xml available to us in the .then method
        // Note: users[username] holds the channel id, as in your original code
        f(`${URL_BASE}${users[username]}`).then(xml => {
            parseString(xml, (err, result) => {
                if (!err) {
                    const [videoId] = result.feed.entry[0]['yt:videoId']; // We can use destructuring to get element 0 from this nested value
                    if (videoId !== u[username].id) {
                        // Update the in-memory value for this user's most recent video
                        u[username].id = videoId;
                        // Console.log the update
                        console.log(`Hey listen! ${username} just released a new video! Go watch it: https://youtu.be/${videoId}\n\n`);
                        // Attempt to update the json file; this won't affect the u object in memory, but will keep your app up to date
                        // when you restart it in the future.
                        fs.writeFile('./jsons/uploads.json', JSON.stringify(u, null, 2), err => {
                            if (err) {
                                console.error(`There was a problem updating uploads.json with the new videoId ${videoId} for user ${username}`);
                            }
                        });
                    }
                }
            });
        })
        // This .catch method will run if the call made by `f` fails for any reason
        .catch(err => console.error(err));
    });
}

// I am assuming that what you want is to check for updates once every second.
setInterval(uploadHandler, 1000);

What are ways to run a script only after another script has finished?

Let's say this is my code (just a sample I wrote up to show the idea):
var extract = require("./postextract.js");
var rescore = require("./standardaddress.js");

RunFunc();

function RunFunc() {
    extract.Start();
    console.log("Extraction complete");
    rescore.Start();
    console.log("Scoring complete");
}
And I want rescore.Start() not to run until the entire extract.Start() has finished. Both scripts contain a spiderweb of functions inside them, so putting a callback directly into the Start() function doesn't appear viable, as the final function won't return it, and I am having a lot of trouble understanding how to use Promises. What are ways I can make this work?
These are the scripts that extract.Start() begins and ends with. OpenWriter() is reached through multiple other functions and streams, with the actual fileWrite.write() being in another script that's attached to this (although not needed to detect the end of the run). Currently, fileWrite.on('finish') is where I want the script to be determined as done.
module.exports = {
    Start: function CodeFileRead() {
        //this.country = countryIn;
        //Read stream of the address components
        fs.createReadStream("Reference\\" + postValid.country + " ADDRESS REF DATA.csv")
            //Change separator based on file
            .pipe(csv({escape: null, headers: false, separator: delim}))
            //Indicate start of reading
            .on('resume', (data) => console.log("Reading complete postal code file..."))
            //Processes lines of data into storage array for comparison
            .on('data', (data) => {
                postValid.addProper[data[1]] = JSON.stringify(Object.values(data)).replace(/"/g, '').split(',').join('*');
            })
            //End of reading file
            .on('end', () => {
                postValid.complete = true;
                console.log("Done reading");
                //Launch main script, delayed to here in order to not read ahead of this stream
                ThisFunc();
            });
    },
    extractDone
}
function OpenWriter() {
    //File stream for writing the processed chunks into a new file
    fileWrite = fs.createWriteStream("Processed\\" + fileName.split('.')[0] + "_processed." + fileName.split('.')[1]);
    fileWrite.on('open', () => console.log("File write is open"));
    fileWrite.on('finish', () => {
        console.log("File write is closed");
    });
}
EDIT: I do not want to simply add the next script onto the end of the previous one and forego the master file, as I don't know how long it will be, and it's supposed to be designed to take additional scripts past our development period. I cannot just use a package as it stands, because approval time in the company takes up to two weeks and I need this more immediately.
DOUBLE EDIT: This is all my code; every script and function is written by me, so I can make the scripts being called do what's needed.
You can just wrap your function in a Promise and return that.
module.exports = {
    Start: function CodeFileRead() {
        return new Promise((resolve, reject) => {
            fs.createReadStream(
                'Reference\\' + postValid.country + ' ADDRESS REF DATA.csv'
            )
            // .......some code...
            .on('end', () => {
                postValid.complete = true;
                console.log('Done reading');
                resolve('success');
            });
        });
    }
};
And run RunFunc like this:
async function RunFunc() {
    await extract.Start();
    console.log("Extraction complete");
    await rescore.Start();
    console.log("Scoring complete");
}

//or IIFE
RunFunc().then(() => {
    console.log("All Complete");
});
Note: you can/should also handle errors by calling reject("some error") when an error occurs.
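For instance, a sketch along the lines of the code above (the file path is illustrative): rejecting from the stream's 'error' event lets the caller handle failures with try/catch around the await:

const fs = require('fs');

function Start() {
    return new Promise((resolve, reject) => {
        fs.createReadStream('somefile.csv')
            .on('error', reject)                  // e.g. missing file -> promise rejects
            .on('end', () => resolve('success'))
            .resume();                            // consume the data so 'end' fires
    });
}

async function run() {
    try {
        await Start();
        console.log('Extraction complete');
    } catch (e) {
        console.log('Extraction failed:', e.message);
    }
}

run();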
EDIT, after learning about ThisFunc():
Making a new event emitter is probably the easiest solution:
eventEmitter.js

const EventEmitter = require('events').EventEmitter;
module.exports = new EventEmitter();

const eventEmitter = require('./eventEmitter');

module.exports = {
    Start: function CodeFileRead() {
        return new Promise((resolve, reject) => {
            //after all of your code
            eventEmitter.once('WORK_DONE', () => {
                resolve("Done");
            });
        });
    }
};
function OpenWriter() {
    ...
    fileWrite.on('finish', () => {
        console.log("File write is closed");
        eventEmitter.emit("WORK_DONE");
    });
}
And run RunFunc as before.
There's no generic way to determine when everything a function call does has finished.
It might accept a callback. It might return a promise. It might not provide any kind of method to determine when it is done. It might have side effects that you could monitor by polling.
You need to read the documentation and/or source code for that particular function.
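If the function does follow the Node-style (err, result) callback convention, you can adapt it to a promise yourself; a sketch using util.promisify (available since Node 8):

const util = require('util');
const fs = require('fs');

// fs.readFile takes an (err, data) callback, so it can be wrapped:
const readFileAsync = util.promisify(fs.readFile);

readFileAsync(__filename, 'utf8')
    .then((text) => console.log('read', text.length, 'characters'))
    .catch((err) => console.error(err));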
Use async/await (promises). Note that await only actually waits if the awaited call returns a Promise, so Start() must return one, as shown in the answer above. Example:
var extract = require("./postextract.js");
var rescore = require("./standardaddress.js");

RunFunc();

async function extract_start() {
    try {
        await extract.Start();
    }
    catch (e) {
        console.log(e);
    }
}

async function rescore_start() {
    try {
        await rescore.Start();
    }
    catch (e) {
        console.log(e);
    }
}

async function RunFunc() {
    await extract_start();
    console.log("Extraction complete");
    await rescore_start();
    console.log("Scoring complete");
}

In Node.js Design Patterns' "unleashing Zalgo" example, why is the asynchronous path consistent?

In the great book I'm reading now, Node.js Design Patterns, I see the following example:
var fs = require('fs');
var cache = {};

function inconsistentRead(filename, callback) {
    if (cache[filename]) {
        //invoked synchronously
        callback(cache[filename]);
    } else {
        //asynchronous function
        fs.readFile(filename, 'utf8', function(err, data) {
            cache[filename] = data;
            callback(data);
        });
    }
}
then:
function createFileReader(filename) {
    var listeners = [];
    inconsistentRead(filename, function(value) {
        listeners.forEach(function(listener) {
            listener(value);
        });
    });
    return {
        onDataReady: function(listener) {
            listeners.push(listener);
        }
    };
}
and usage of it:
var reader1 = createFileReader('data.txt');
reader1.onDataReady(function(data) {
    console.log('First call data: ' + data);
});
The author says that if the item is in the cache the behaviour is synchronous, and asynchronous if it's not in the cache. I'm OK with that. He then continues to say that we should be either sync or async. I'm OK with that too.
What I don't understand is this: if I take the asynchronous path, then when the line var reader1 = createFileReader('data.txt'); is executed, can't the asynchronous file read have finished already, so that the listener won't be registered by the following line which tries to register it?
JavaScript will never interrupt a function to run a different function.
The "file has been read" handler will be queued until the JavaScript event loop is free.
The async read operation won't call its callback or start emitting events until after the current tick of the event loop, so the sync code that registers the event listener will run first.
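A small demonstration of that guarantee; the readFile callback can never beat the synchronous line that follows the call:

const fs = require('fs');

fs.readFile(__filename, 'utf8', function() {
    console.log('2: readFile callback');      // queued until the current tick ends
});
console.log('1: sync code after the call');   // always printed first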
Yes, I felt the same when reading this part of the book.
At first glance, inconsistentRead looks fine.
But the next paragraphs explain the potential bug this kind of mixed sync/async function "could" produce when used (so it could also not show up).
As a summary, what happens in the usage sample is:
In event cycle 1:
reader1 is created; because "data.txt" isn't cached yet, it will respond asynchronously in another event cycle N.
Some callbacks are subscribed for reader1 readiness. They will be called in cycle N.
In event cycle N:
"data.txt" is read, its content is notified and cached, so reader1's subscribed callbacks are called.
In event cycle X (where X >= 1, and X could be before or after N; maybe a timeout or another async path schedules this):
reader2 is created for the same file "data.txt".
What happens if:
X === 1: The bug shows up in a way not mentioned in the book: "data.txt" will be read (and cached) twice, and the faster read wins. But reader2 registers its callbacks before the async response is ready, so they will be called.
X > 1 AND X < N: The same as X === 1.
X > N: The bug shows up exactly as explained in the book: you create reader2 (the response for it is already cached), so inconsistentRead invokes its callback synchronously, before you have subscribed any listener; after that you subscribe your callbacks with onDataReady, but they will never be called.
X === N: An edge case. If the reader2 portion runs first, the same happens as with X === 1; if it runs after the "data.txt" readiness portion of inconsistentRead, the same happens as with X > N.
This example was more helpful for me in understanding the concept:
const fs = require('fs');
const cache = {};

function inconsistentRead(filename, callback) {
    if (cache[filename]) {
        console.log("load from cache")
        callback(cache[filename]);
    } else {
        fs.readFile(filename, 'utf8', function (err, data) {
            cache[filename] = data;
            callback(data);
        });
    }
}

function createFileReader(filename) {
    const listeners = [];
    inconsistentRead(filename, function (value) {
        console.log("inconsistentRead CB")
        listeners.forEach(function (listener) {
            listener(value);
        });
    });
    return {
        onDataReady: function (listener) {
            console.log("onDataReady")
            listeners.push(listener);
        }
    };
}

const reader1 = createFileReader('./data.txt');
reader1.onDataReady(function (data) {
    console.log('First call data: ' + data);
})

setTimeout(function () {
    const reader2 = createFileReader('./data.txt');
    reader2.onDataReady(function (data) {
        console.log('Second call data: ' + data);
    })
}, 100)
output:
╰─ node zalgo.js
onDataReady
inconsistentRead CB
First call data: :-)
load from cache
inconsistentRead CB
onDataReady
When the call is async, the onDataReady handler is set before the file is read, so the listener gets called. When the value comes from the cache, the callback runs synchronously: the iteration over listeners finishes before onDataReady registers the listener, so 'Second call data' is never printed.
I think the problem can also be illustrated with a simpler example:
let gvar = 0;
let add = (x, y, callback) => { callback(x + y + gvar) }
add(3,3, console.log); gvar = 3
In this case, callback is invoked immediately inside add, so the change of gvar afterwards has no effect: console.log(3+3+0)
On the other hand, if we add asynchronously
let add2 = (x, y, callback) => { setImmediate(()=>{callback(x + y + gvar)})}
add2(3, 3, console.log); gvar = 300
Because of the order of execution, gvar = 300 runs before the setImmediate callback, so the result becomes console.log(3 + 3 + 300).
In Haskell, you have pure functions vs monads, which are similar to "async" functions that get executed "later". In JavaScript these are not explicitly declared, so this "delayed" code can be difficult to debug.

Async.js - ETIMEDOUT and Callback was already called

I keep getting an ETIMEDOUT or ECONNRESET error followed by a Callback was already called error when I run index.js.
At first I thought it was because I was not including return prior to calling the onEachLimitItem callback, so I included it per the async multiple-callbacks documentation, but that still didn't solve it. I've also tried removing the error event, and removing the callback to onEachLimit in the error event, but neither has worked. I've looked at the other SO questions around the issue of Callback already called, but because they aren't concerned with streams, I didn't find a solution.
My understanding is that if the stream encounters an error like ECONNRESET, it will return the callback in the error event and move on to the next stream, but this doesn't seem to be the case. It almost seems as if the error resolves itself, i.e. it re-connects and tries sending the errored stream up to Azure again, and when that works it triggers the 'finish' event, and we get the Callback already called.
Am I handling the callbacks within the stream events correctly?
var Q = require('q');
var async = require('async');
var webshot = require('webshot');
var Readable = require('stream').Readable;
var azure = require('azure-storage');

var blob = azure.createBlobService('123', '112244');
var container = 'awesome';

var countries = [
    'en-us', 'es-us', 'en-au', 'de-at', 'pt-br', 'en-ca', 'fr-ca', 'cs-cz', 'ar-ly', 'es-ve',
    'da-dk', 'fi-fi', 'de-de', 'hu-hu', 'ko-kr', 'es-xl', 'en-my', 'nl-nl', 'en-nz', 'nb-no',
    'nn-no', 'pl-pl', 'ro-ro', 'ru-ru', 'ca-es', 'es-es', 'eu-es', 'gl-es', 'en-gb', 'es-ar',
    'nl-be', 'bg-bg', 'es-cl', 'zh-cn', 'es-co', 'es-cr', 'es-ec', 'et-ee', 'fr-fr', 'el-gr',
    'zh-hk', 'en-in', 'id-id', 'en-ie', 'he-il', 'it-it', 'ja-jp', 'es-mx', 'es-pe', 'en-ph'
];

var uploadStreamToStorage = function (fileName, stream, onEachLimitItem) {
    var readable = new Readable().wrap(stream);
    var writeable = blob.createWriteStreamToBlockBlob(container, fileName);

    readable.pipe(writeable);

    writeable.on('error', function (error) {
        return onEachLimitItem.call(error);
    });

    writeable.on('finish', function () {
        onEachLimitItem.call(null);
    });
};

var takeIndividualScreenshot = function (ID, country, onEachLimitItem) {
    var fileName = ID + '-' + country + '.jpg';
    var url = 'https://example.com/' + country + '/' + ID;

    webshot(url, function (error, stream) {
        if (error) { throw 'Screenshot not taken'; }
        uploadStreamToStorage(fileName, stream, onEachLimitItem);
    });
};

var getAllCountriesOfId = function (ID) {
    var deferred = Q.defer();
    var limit = 5;

    function onEachCountry(country, onEachLimitItem) {
        takeIndividualScreenshot(ID, country, onEachLimitItem);
    }

    async.eachLimit(countries, limit, onEachCountry, function (error) {
        if (error) { deferred.reject(error); }
        deferred.resolve();
    });

    return deferred.promise;
};

var createContainer = function () {
    var df = Q.defer();
    var self = this;

    blob.createContainerIfNotExists(this.container, this.containerOptions, function (error) {
        if (error) { df.reject(error); }
        df.resolve(self.container);
    });

    return df.promise;
};

createContainer()
    .then(function () {
        return getAllCountriesOfId('211007');
    })
    .then(function () {
        return getAllCountriesOfId('123456');
    })
    .fail(function (error) {
        log.info(error);
    });
You are letting your callback get called twice, as you already know. The question is: do you want to stop on all errors as you iterate the streams, or do you want to accumulate all errors from the streams?
There are multiple ways to catch and handle the errors, which you are already doing, but because you aren't re-throwing the error object, the data stream keeps running and makes additional calls after it has fatally errored.
The actual problem in your code is the scope of your return. When you are handling the error and trying to return the callback and halt script execution, the scope of your return is local to the stream's error handler, not the global script, hence the script continues and moves on to the next valid stream.
writeable.on('error', function (error) {
    // This 'return' is in the local scope of 'writable.on('error')'
    return onEachLimitItem.call(error);
});
You could instead collect the errors into an array, then handle them outside of that function's local scope, i.e.:
// Set the array's scope as global to the writable.on() error
var errResults = [];

writeable.on('error', function (error) {
    // Push the local scoped 'error' into the global scoped 'errResults' array
    errResults.push(error);
});

writeable.on('finish', function () {
    // Are there any errors?
    return (errResults.length > 0) ?
        onEachLimitItem.call(errResults) : onEachLimitItem.call(null);
});
The above is just one way you could tackle the problem.
I am not sure if you have read the error-handling guidance from Joyent (the original Node.js backers), but it should give you a good idea of your options when handling errors:
https://www.joyent.com/developers/node/design/errors
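As an addition (not from the answer above), another common defence against "Callback was already called" is to wrap the callback so that only the first of the 'error'/'finish' events wins:

// Returns a wrapper that forwards only the first call and ignores the rest.
function once(fn) {
    var called = false;
    return function () {
        if (called) return;   // a second 'error' or 'finish' is swallowed
        called = true;
        fn.apply(this, arguments);
    };
}

// Usage sketch with the handlers from the question (using a plain call
// rather than the .call(error) form above):
// var done = once(onEachLimitItem);
// writeable.on('error', function (error) { done(error); });
// writeable.on('finish', function () { done(null); });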

node-mysql timing

I have a recursive query like this (note: this is just an example):
var user = function(data)
{
    this.minions = [];
    this.loadMinions = function()
    {
        _user = this;
        database.query('select * from users where owner=' + data.id, function(err, result, fields)
        {
            for (var m in result)
            {
                _user.minions[result[m].id] = new user(result[m]);
                _user.minions[result[m].id].loadMinions();
            }
        });
        console.log("loaded all minions");
    }
}

currentUser = new user(ID);
for (var m in currentUser.minions)
{
    console.log("minion found!");
}
This doesn't work because the timings are all wrong; the code doesn't wait for the query.
I've tried to do this:
var MyQuery = function(QueryString) {
    var Data;
    var Done = false;
    database.query(QueryString, function(err, result, fields) {
        Data = result;
        Done = true;
    });
    while (Done != true) {};
    return Data;
}
var user = function(data)
{
    this.minions = [];
    this.loadMinions = function()
    {
        _user = this;
        result = MyQuery('select * from users where owner=' + data.id);
        for (var m in result)
        {
            _user.minions[result[m].id] = new user(result[m]);
            _user.minions[result[m].id].loadMinions();
        }
        console.log("loaded all minions");
    }
}

currentUser = new user(ID);
for (var m in currentUser.minions)
{
    console.log("minion found!");
}
but it just freezes on the while loop. Am I missing something?
The first hurdle to solving your problem is understanding that I/O in Node.js is asynchronous. Once you know how this applies to your problem, the recursive part will be much easier (especially if you use a flow-control library like Async or Step).
Here is an example that does some of what you're trying to do (minus the recursion). Personally, I would avoid recursively loading a possibly unknown number/depth of records like that; instead, load them on demand, as in this example:
var User = function(data) {
    this.data = data
    this.minions;
};

User.prototype.getMinions = function(primaryCallback) {
    var that = this; // scope handle
    if (this.minions) { // bypass the db query if results cached
        return primaryCallback(null, this.minions);
    }

    // Callback invoked by database.query when it has the records
    var aCallback = function(error, results, fields) {
        if (error) {
            return primaryCallback(error);
        }

        // This is where you would put your recursive minion initialization
        // The problem you are going to have is callback counting, using a library
        // like async or step would make this party much much easier

        that.minions = results; // bypass the db query after this
        primaryCallback(null, results);
    }

    database.query('SELECT * FROM users WHERE owner = ' + data.id, aCallback);
};

var user = new User(someData);
user.getMinions(function(error, minions) {
    if (error) {
        throw error;
    }

    // Inside the function invoked by primaryCallback(...)
    minions.forEach(function(minion) {
        console.log('found this minion:', minion);
    });
});
The biggest thing to note in this example is the callbacks. database.query(...) is asynchronous, and you don't want to tie up the event loop waiting for it to finish. This is solved by providing a callback, aCallback, to the query, which is executed when the results are ready. Once that callback fires, and after you perform whatever processing you want on the records, you can fire the primaryCallback with the final results.
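Since the comments above mention callback counting for the recursive case, here is a minimal hand-rolled sketch of that idea (loadMinion is a hypothetical async loader; a library like async would track this for you):

function loadAll(ids, loadMinion, done) {
    var remaining = ids.length;
    if (remaining === 0) return done(null, []);
    var results = [];
    var failed = false;
    ids.forEach(function(id, i) {
        loadMinion(id, function(err, minion) {
            if (failed) return;                 // an earlier load already errored out
            if (err) { failed = true; return done(err); }
            results[i] = minion;                // keep results in input order
            if (--remaining === 0) done(null, results); // last callback fires done
        });
    });
}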
Each Node.js process is single-threaded, so the line
while(Done != true){};
takes over the thread, and the callback that would have set Done to true never gets run, because the thread is blocked in an infinite loop.
You need to refactor your program so that code that depends on the results of the query is included within the callback itself. For example, make MyQuery take a callback argument:
MyQuery = function(QueryString, callback){
Then call the callback at the end of your database.query callback -- or even supply it as the database.query callback.
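A sketch of that refactor, assuming the (err, result, fields) signature of database.query from the question:

var MyQuery = function(QueryString, callback) {
    database.query(QueryString, function(err, result, fields) {
        callback(err, result);   // hand the rows over once they exist
    });
};

// Or, since the signatures line up, simply:
// var MyQuery = function(QueryString, callback) { database.query(QueryString, callback); };

// Everything that needs the rows goes inside the callback:
MyQuery('select * from users where owner=1', function(err, result) {
    if (err) throw err;
    for (var m in result) {
        console.log("minion found!");
    }
});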
The freezing is unfortunately correct behaviour, as Node is single-threaded.
You need a scheduler package to fix this. Personally, I have been using fibers-promise for this kind of issue. You might want to look at that or another promise library, or at async.
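For illustration, the same query can also be wrapped in a native Promise with no extra dependency (a sketch, reusing the question's database object):

function myQuery(queryString) {
    return new Promise(function(resolve, reject) {
        database.query(queryString, function(err, result, fields) {
            if (err) return reject(err);
            resolve(result);
        });
    });
}

myQuery('select * from users where owner=1')
    .then(function(rows) { console.log(rows.length + " minions loaded"); })
    .catch(function(err) { console.error(err); });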
