In some modules I have seen this strange way of initializing a variable that is used in a callback.
This particular example is from the mssql module:
var sql = require('mssql');

var connection = new sql.Connection(config, function (err) {
    var request = new sql.Request(connection);
    request.query('select 1 as number', function (err, recordset) {
        // do something
    });
});
What appears strange to me is that connection is used inside the callback as if it were already initialized, and in fact it is.
However, I would have thought that the callback has to run before sql.Connection() returns; after all, there is no way to run anything after it returns.
So how does this work?
The callback is asynchronous, meaning it doesn't run immediately. Instead, it gets placed in a queue and run once the interpreter has finished the current synchronous code. For example, try this:
var connection = new sql.Connection(config, function(err) {
    console.log('I run second');
});
console.log('I run first');
Related
I'm new to all of this web development programming. I've been learning for 1 or 2 months now and am trying to create a local database system using Node and MongoDB. I'm having trouble understanding the sequencing of Mongoose's collection.find(). I want to run collection.find() to get some data and then pass it to another function, but it seems collection.find() always executes last no matter where I put the code.
My code is something like this:
function getInitial() {
    ticket.find(function (err, tickets) {
        totalTickets = tickets.length;
        console.log("Done verifying tickets");
    });
}

getInitial();

function lastFunction() {
    console.log("this should be the last to execute");
}

lastFunction();
The output of this code is:
"this should be the last to execute"
"Done verifying tickets"
Desired output is:
"Done verifying tickets"
"this should be the last to execute"
find() runs asynchronously, so the callback in getInitial() gets executed after everything else in the OP's code. Fixing the code to account for the fact that find runs asynchronously...
async function getInitial() {
    let tickets = await ticket.find();
    totalTickets = tickets.length;
    console.log("Done verifying tickets");
}

function lastFunction() {
    console.log("this should be the last to execute");
}

async function test() {
    await getInitial();
    lastFunction();
}

test();
As you have discovered, callbacks are not synchronous (ticket.find takes time to return data).
If you want to execute something after find has returned data, then execute it from inside the callback.
function getInitial() {
    ticket.find(function (err, tickets) {
        totalTickets = tickets.length;
        console.log("Done verifying tickets");
        // execute the rest of the code here
        lastFunction();
    });
}

function lastFunction() {
    console.log("this should be the last to execute");
}

getInitial();
Please note that this way of nesting callbacks inside callbacks leads to callback hell: http://callbackhell.com/
The only way to have clean code and avoid thinking in callbacks too much is by using promises and the await syntax.
Example:
// inside an async function, this now runs sequentially without nested callbacks
const data = await Character.find();
console.log(data);
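If you prefer promise chaining over async/await, the same ordering can be achieved with .then(). A minimal sketch, assuming ticket.find() returns a promise when no callback is passed:

function getInitial() {
    // no callback passed, so find() returns a promise
    return ticket.find().then(function (tickets) {
        totalTickets = tickets.length;
        console.log("Done verifying tickets");
    });
}

function lastFunction() {
    console.log("this should be the last to execute");
}

// lastFunction only runs after the find has completed
getInitial().then(lastFunction);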
This is actually straightforward. You're passing a callback, which runs asynchronously, while lastFunction() directly calls console.log().
A Mongoose call goes through a few steps before it can actually return the result of the query you run, and because Node.js runs your JavaScript on a single thread, it moves on to the next code (lastFunction in your case) while the asynchronous call is pending, then gives you the result once the data has been fetched.
You can use async/await to make Mongoose give you the result before moving on to the next code.
Your code should be like this:
const getInitial = async () => {
    try {
        const tickets = await ticket.find({});
        const totalTickets = tickets.length;
        console.log("Done verifying tickets");
    } catch (e) {
        console.log(e);
    }
};

function lastFunction() {
    console.log("this should be the last to execute");
}

// wait for getInitial() to finish before calling lastFunction()
getInitial().then(lastFunction);
It is an asynchronous function because it takes time to query the database and report the result back to you.
Also try learning about the event loop in Node.js.
There are plenty of sources online.
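As a quick illustration of that ordering (a generic sketch, not tied to Mongoose), even a zero-millisecond timer only fires after all of the currently running synchronous code has finished:

console.log("first");

setTimeout(function () {
    // timer callbacks wait in the queue until the call stack is empty
    console.log("third");
}, 0);

console.log("second");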
I have the following code:
function start() {
    var buildUp = this.buildUpModels;
    console.log('async is started');
    $q.all([Timeline.updateEvents(), Timeline.updateTimeslots(), Sponsors.updateSponsors(), Biography.updateBiography()]).then(function (rv) {
        console.log('async is done');
        buildUp();
    });
    window.localStorage.setItem('check', 'done');
    window.localStorage.setItem('planning', '[1,2]');
}

function buildUpModels() {
    console.log('start buildUpModels');
    Biography.buildObject();
    Timeline.buildObject();
}
The following is one of the async update functions (all are coded similarly):
function updateBiography() {
    return $http.get("pathToData")
        .then(function (resp) {
            console.log("update bio");
            window.localStorage.setItem('biography', resp.data);
            window.localStorage.setItem('biographyTimeStamp', Date.now());
        }, function (err) {
            console.log('ERR', err);
        });
}
As I understand it, both the console.log('async is done') and buildUp() statements should only be called after all promises within $q.all() have resolved. However, these functions are called before the async functions have resolved. What exactly am I doing wrong?
I think I have found the answer! This problem arose purely because I was testing with Ionic Lab.
-Device nr. 1 finds an empty localStorage and fills it. Then it starts building the objects (and apparently it waits for the results, as it should).
-Device nr. 2, however, finds a full localStorage (because the other phone filled it) and then executes this.buildUpModels directly (as it is not called via the callback of an asynchronous function when localStorage is not empty).
Tip: Watch out when testing code that uses localStorage with a testing tool like Ionic Lab.
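One way to avoid the two diverging code paths is to route both of them through a promise, so buildUpModels is always called after a then(). This is only a sketch under the assumption that the cached branch is chosen by the 'check' key from the question; $q.when wraps a plain value in an already-resolved promise:

// hypothetical wrapper: always returns a promise, whether the data
// comes from localStorage or from the remote update calls
function loadData() {
    if (window.localStorage.getItem('check') === 'done') {
        return $q.when('cached'); // cached path, resolves immediately
    }
    return $q.all([Timeline.updateEvents(), Timeline.updateTimeslots(),
                   Sponsors.updateSponsors(), Biography.updateBiography()]);
}

loadData().then(function () {
    buildUpModels(); // runs after either path has settled
});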
I'm using Meteor._wrapAsync to force only one call to the function writeMeLater to be
executing at any one time. If 10 calls to writeMeLater are made within 1 second, the other 9 calls should be queued up in order.
To check that writeMeLater is running synchronously, the timestamp fields in the Logs collection should be spaced 1 second apart.
Problem: With the following code, only the first call to writeMeLater is executed; the other 9 do not appear to run. Why is this so?
Server Code:
writeMeLater = function(data) {
    console.log('writeMeLater: ', data)
    // simulate taking 1 second to complete
    Meteor.setTimeout(function() {
        Logs.insert({data: data, timestamp: new Date().getTime()})
    }, 1 * 1000)
}

writeMeLaterSync = Meteor._wrapAsync(writeMeLater)

// simulate calling the function many times real quick
for(var i=0; i<10; i++) {
    console.log('Loop: ', i)
    writeMeLaterSync(i)
}
Output:
=> Meteor server running on: http://localhost:4000/
I20140119-11:04:17.300(8)? Loop: 0
I20140119-11:04:17.394(8)? writeMeLater: 0
Using an alternate version of writeMeLater, I get the same problem:
writeMeLater = function(data) {
    console.log('writeMeLater: ', data)
    setTimeout(Meteor.bindEnvironment(function() {
        Logs.insert({data: data, timestamp: new Date().getTime()})
    }), 1 * 1000)
}
TL;DR - your writeMeLater function needs to take a callback parameter.
Classic NodeJS asynchronous functions usually have this signature:
function async(params..., callback) {
    try {
        var result = compute(params);
        callback(null, result);
    }
    catch (err) {
        callback("something went wrong", null);
    }
}
They take any number of parameters, the last one being a callback to be run when the computation is ready. The callback itself is called with 2 parameters: an error, which is null if everything is OK, and of course the result.
Meteor._wrapAsync expects to be given a function with this signature to return a newly pseudo-synchronous function.
Meteor "synchronous" functions allows you to write code in a synchronous style, but they are not truly synchronous like NodeJS fs.readFileSync for example, which BLOCKS the event loop until it's done (usually this is bad unless you're writing a command-line app, which is not the case with Meteor).
Note: using NodeJS fs *Sync functions in Meteor is bad because you might be tricked into thinking they are "Meteor synchronous" but they aren't, they will block your entire node process until they're done ! You should be using fs async funcs wrapped with Meteor._wrapAsync.
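For example, rather than calling fs.readFileSync, you could wrap the asynchronous fs.readFile (a sketch; the file path is just a placeholder):

var fs = require('fs');

// fs.readFile has the classic (params..., callback) signature,
// so it can be wrapped directly
var readFile = Meteor._wrapAsync(fs.readFile);

// looks synchronous, but only yields the current fiber instead of
// blocking the whole node process
var contents = readFile('/path/to/file.txt', 'utf8');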
A simplified clone of Meteor._wrapAsync would look like this:
var wrapAsync = function(asyncFunc) {
    // return a function who appears to run synchronously thanks to fibers/future
    return function() {
        var future = new Future();
        // take the arguments...
        var args = arguments;
        // ...and append our callback at the end
        Array.prototype.push.call(args, function(error, result) {
            if (error) {
                throw error;
            }
            // our callback calls future.return which unblocks future.wait
            future.return(result);
        });
        // call the async func with computed args
        asyncFunc.apply(null, args);
        // wait until future.return is called
        return future.wait();
    };
};
There is a Future.wrap which does exactly this; Meteor._wrapAsync is a bit more complicated because it handles Meteor environment variables by using Meteor.bindEnvironment.
Fibers and Futures are a bit out of scope, so I won't dive into them here; be sure to check the eventedmind.com videos on the subject.
Introducing Fibers - https://www.eventedmind.com/feed/BmG9WmSsdzChk8Pye
Using Futures - https://www.eventedmind.com/feed/kXR6nWTKNctKariSY
Meteor._wrapAsync - https://www.eventedmind.com/feed/Ww3rQrHJo8FLgK7FF
Now that you understand how things need to be done to encapsulate async functions in Meteor, let's fix your code.
If your async function doesn't take a callback as its last argument, it won't call it (obviously), and the callback we pass to it in the wrapped function won't trigger either, which means future.return won't be called. This is why your program is blocked in the first place!
You simply have to rewrite writeMeLater to take a callback as its final argument:
var writeMeLater = function(data, callback) {
    console.log('writeMeLater: ', data);
    // simulate taking 1 second to complete
    Meteor.setTimeout(function() {
        Logs.insert({
            data: data,
            timestamp: new Date().getTime()
        });
        callback(null, "done processing " + data);
    }, 1 * 1000);
};
And you're good to go!
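With that change, the wrapped call from the question's loop should now wait for each callback before continuing, so the inserted timestamps end up roughly 1 second apart:

writeMeLaterSync = Meteor._wrapAsync(writeMeLater);

for (var i = 0; i < 10; i++) {
    console.log('Loop: ', i);
    // blocks this fiber until callback(null, ...) fires ~1 second later
    writeMeLaterSync(i);
}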
I currently have a database connection module containing the following:
var mongodb = require("mongodb");
var client = mongodb.MongoClient;

client.connect('mongodb://host:port/dbname', { auto_reconnect: true },
    function(err, db) {
        if (err) {
            console.log(err);
        } else {
            // export db as member of exports
            module.exports.db = db;
        }
    }
);
I can then successfully access it doing the following:
users.js
var dbConnection = require("./db.js");
var users = dbConnection.db.collection("users");

users.find({name: 'Aaron'}).toArray(function(err, result) {
    // do something
});
However, if I instead export module.exports = db, i.e., try to assign the exports object to the db object instead of making it a member of exports, and then try to access it in users.js via var db = require("./db.js");, the object is undefined. Why?
If it is because there is a delay in setting up the connection (shouldn't require() wait until the module finishes running its code before assigning the value of module.exports?), then why does neither of these examples work?
one.js
setTimeout(function() {
    module.exports.x = {test: 'value'};
}, 500);
two.js
var x = require("./one");
console.log(x.test);
OR
one.js
setTimeout(function() {
    module.exports.x = {test: 'value'};
}, 500);
two.js
setTimeout(function() {
    var x = require("./one");
    console.log(x.test);
}, 1000);
Running $ node two.js prints undefined in both cases instead of value.
There are 3 key points to understand here and then I will explain them in detail.
module.exports is an object and objects are passed by copy-of-reference in JavaScript.
require is a synchronous function.
client.connect is an asynchronous function.
As you suggested, it is a timing thing. Node.js cannot know that module.exports is going to change later. That's not its problem. How would it know that?
When require runs, it finds a file that meets its requirements based on the path you entered, reads it and executes it, and caches module.exports so that other modules can require the same module and not have to re-initialize it (which would mess up variable scoping, etc.)
client.connect is an asynchronous function call, so after you execute it, the module finishes execution and the require call stores a copy of the module.exports reference and returns it to users.js. Then you set module.exports = db, but it's too late. You are replacing the module.exports reference with a reference to db, but the module export in the node require cache is pointing to the old object.
It's better to define module.exports as a function which will get a connection and then pass it to a callback function like so:
var mongodb = require("mongodb");
var client = mongodb.MongoClient;

module.exports = function (callback) {
    client.connect('mongodb://host:port/dbname', { auto_reconnect: true },
        function(err, db) {
            if (err) {
                console.log(err);
                callback(err);
            } else {
                // pass db to the callback instead of exporting it
                callback(err, db);
            }
        }
    );
};
Warning: though it's outside the scope of this answer, be very careful with the above code to make sure you close/return the connections appropriately, otherwise you will leak connections.
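With that callback-style export, users.js would change along these lines (a sketch; the query is the one from the question):

// users.js (sketch)
var dbConnection = require("./db.js");

dbConnection(function (err, db) {
    if (err) {
        return console.log(err);
    }
    var users = db.collection("users");
    users.find({name: 'Aaron'}).toArray(function (err, result) {
        // do something
    });
});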
Yes, dbConnection.db is undefined because the connection is made asynchronously, which means by definition the Node code just continues to execute without waiting for the DB connection to be established.
shouldn't require() wait until the module finishes running its code before assigning the value of module.exports?
Nope, it just doesn't work that way. require is for code that is always there. Database connections aren't code and aren't always there. Best not to confuse these two types of resources and how to reference them from your program.
shouldn't require() wait until the module finishes running its code before assigning the value of module.exports?
module.exports.db is set in the callback. That operation is async, so in users.js you can't get db.collection.
It would be better to add the collections in the connect callback.
You can use this answer to change your code and use a shared connection in other modules.
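For example, a sketch of the "add collections in the connect callback" idea (the collection name is taken from the question; consumers still have to wait until the connection is up before touching the export):

// db.js (sketch)
var mongodb = require("mongodb");
var client = mongodb.MongoClient;

client.connect('mongodb://host:port/dbname', { auto_reconnect: true }, function (err, db) {
    if (err) {
        return console.log(err);
    }
    // expose a ready-to-use collection once the connection exists;
    // note this is still asynchronous, so it is undefined until then
    module.exports.users = db.collection("users");
});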
And what is the question? This is how require works: it loads the module synchronously and passes you the exports.
Your suggestion to 'wait until the code is run' could be answered in two ways:
It does wait until the code is run. The setTimeout call itself has successfully finished; learn to separate asynchronous callbacks scheduled for the future from the current thread of execution.
If you mean "until all of the asynchronous callbacks have run", that's nonsense. What if one of them never runs at all because it waits for, say, a mouse click, but the user has no mouse attached? (And how do you even define 'all code has run'? That every statement was run at least once? What about if (true) { thisruns(); } else { thiswontrunever(); }?)
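If the goal really is to let two.js wait for the delayed value, one option (a sketch, assuming a Node version with a native Promise) is for one.js to export a promise instead of mutating module.exports later:

// one.js (sketch)
module.exports = new Promise(function (resolve) {
    setTimeout(function () {
        resolve({test: 'value'});
    }, 500);
});

// two.js (sketch)
require("./one").then(function (x) {
    console.log(x.test); // prints 'value' once the timeout has fired
});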
I am working with a WebSocket and trying to be able to send socket data at any time from throughout my application. When I attempt to access the send command from within another function, I am receiving:
Uncaught InvalidStateError: An attempt was made to use an object that is not, or is no longer, usable.
This only occurs when I call the function. This is how I am setting up my WebSocket:
Main.socket = (function() {
    var socket = new WebSocket("ws://server:port");

    socket.onopen = function() {
        console.log("Socket has been opened!");
    }

    function send() {
        socket.send('test');
    }

    return {
        socket: socket,
        send: send
    }
})();
I am able to call the function globally, and when I console.log Main.socket from within a function it can see the socket. But when I call the send function, I get that error.
Here is an alternative solution for waiting for the WebSocket connection to come online. Replace your call to:
function send() {
    web_socket.send('test');
}
with this:
function send(msg) {
    wait_for_socket_connection(socket, function() {
        socket.send(msg);
    });
};

function wait_for_socket_connection(socket, callback) {
    setTimeout(
        function() {
            if (socket.readyState === 1) {
                if (callback !== undefined) {
                    callback();
                }
                return;
            } else {
                console.log("... waiting for web socket connection to come online");
                wait_for_socket_connection(socket, callback);
            }
        }, 5);
};
The problem is that the socket has not been opened yet. WebSocket.send cannot be used until the asynchronous onopen event occurs.
While using setTimeout (for a long enough duration) "should work", the correct way to deal with asynchronous JavaScript programming is to treat program flow as a sequence of dependent events.
In any case, here is a small example showing how to use a jQuery Deferred Object, which (as of jQuery 1.8) isn't broken and honors the Promises/A contract:
Main.socket = (function($) {
    var socket = new WebSocket("ws://server:port");

    // Deferred will be resolved with one argument, the "send" function for
    // this socket.
    var readyPromise = $.Deferred();

    socket.onopen = function() {
        console.log("Socket has been opened!");
        // bind send to the socket so it keeps its context when called later
        readyPromise.resolve(socket.send.bind(socket));
    }

    return readyPromise;
})(jQuery);
Then later, in the code that uses this little module:
Main.socket.then(function (send) {
    // This will only be called after `readyPromise.resolve` is called in the
    // module, which happens in the `WebSocket.onopen` callback.
    send("Hello world!");
});

// This code may or may not execute before the `then` function above,
// depending upon the state of the Promise/Deferred Object.
// However, we can get consistent program flow by using `then`-chaining
// of promises.
Of course you don't have to use Promises; callbacks will work just fine, although I prefer the unified contract/framework of Promises, and you can use whatever names or structure is most fitting.
Also, note that it might not be good to have a single WebSocket for the entire page lifecycle, as this won't correctly handle disconnect and recovery scenarios.
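A rough sketch of what such reconnect handling could look like (the URL and the retry delay are placeholders):

function connect(url) {
    var socket = new WebSocket(url);

    socket.onopen = function () {
        console.log("Socket has been opened!");
    };

    socket.onclose = function () {
        // try to re-establish the connection after a short delay
        console.log("Socket closed, reconnecting...");
        setTimeout(function () {
            Main.socket = connect(url);
        }, 1000);
    };

    return socket;
}

Main.socket = connect("ws://server:port");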