I have a function in unbound.js with the following code:
export default async function connect({ mongoose: mongoose }, URI) {
  console.log('in connect');
  mongoose.connect(URI);
  mongoose.Promise = global.Promise;
}
I then have another file, index.js, that deals with dependency injection and looks like this:
module.exports = async url => {
  return await require("./unbound").default.bind(
    null,
    { mongoose: require("mongoose") },
    url
  );
};
The only thing I am doing differently from plain vanilla dependency injection is passing the URL as an argument.
When I call the export from index.js I get no response. This is confirmed by the console.log not outputting anything.
Any guidance on how I could resolve this?
Since chat is restricted, I'm going to post the answer here instead.
In this snippet, you export a function that, when invoked, returns another function:
module.exports = async url => {
  return await require("./unbound").default.bind(
    null,
    { mongoose: require("mongoose") },
    url
  );
};
So if you want to actually run it, you have to invoke it twice, e.g. require('./')()().
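The "invoke it twice" point can be sketched with a plain function (the names here are illustrative, not the actual mongoose code):

```javascript
// bind returns a new function; it does NOT call the original.
function connect(deps, url) {
  return `connected to ${url} using ${deps.name}`;
}

// Analogous to what index.js exports: a bound, not-yet-called function.
const bound = connect.bind(null, { name: "mongoose" }, "mongodb://localhost/app");

console.log(typeof bound);  // "function" — nothing has run yet
const result = bound();     // only this call actually executes connect
console.log(result);        // "connected to mongodb://localhost/app using mongoose"
```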
As others have suggested, bind returns a bound function that you can call later; it does not actually call the function (that is what .apply or .call does). @ptdien's solution is on the right track, but it won't work as-is because you've forgotten to return the promise that mongoose.connect returns, so your connect function returns undefined and there is nothing for the caller to await. I.e. you need to do this:
export default function connect({ mongoose: mongoose }, url) {
  mongoose.Promise = global.Promise;
  return mongoose.connect(url);
}
(Also note that I've removed the async keyword, as it is not necessary when we are not using await; the code returns a promise already.)
Also, bind will automatically forward arguments passed after the bound ones (i.e. the url in this case), so you can simplify your code to this:
module.exports = require("./unbound").default.bind(
  null,
  { mongoose: require("mongoose") }
);
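That forwarding behaviour can be shown with a minimal, self-contained sketch (illustrative names, not the real module):

```javascript
// Arguments supplied at call time are appended after the bound ones.
function connect(deps, url) {
  return `${deps.lib} -> ${url}`;
}

// Only the first parameter is bound here:
const partial = connect.bind(null, { lib: "mongoose" });

// The url passed now arrives as the second parameter:
console.log(partial("mongodb://localhost/app")); // "mongoose -> mongodb://localhost/app"
```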
By the way, the reason you have to append .default is that you are mixing Node.js require and ES6 imports. Pick one and stick to it.
Related
While using the Google Cloud's Firestore Emulator, I'm calling the following method:
global.db.runTransaction(async () => 100)
This dummy call works when executed using node, but fails when executed using jest inside a test function. When running it from Jest, the runTransaction method throws:
Error: You must return a Promise in your transaction()-callback.
referencing node_modules/@google-cloud/firestore/build/src/transaction.js:362:27
That file has the following snippet of code, which is causing the failure:
async runTransaction(updateFunction, options) {
  // ...
  const promise = updateFunction(this);
  if (!(promise instanceof Promise)) {
    throw new Error('You must return a Promise in your transaction()-callback.');
  }
  // ...
}
In other words, when running in Jest, the library considers the async () => 100 function not to return a promise.
I changed the library code to add some debug messages:
async runTransaction(updateFunction, options) {
  // ...
  const promise = updateFunction(this);
  if (!(promise instanceof Promise)) {
    // *** Added these debug messages ***
    console.log(updateFunction);
    console.log(promise);
    console.log(promise instanceof Promise);
    throw new Error('You must return a Promise in your transaction()-callback.');
  }
  // ...
}
And I am getting the following:
[AsyncFunction (anonymous)]
Promise { 100 }
false
To complicate matters, this test passes as expected within jest:
test('promise', () => {
  const fn = async () => 100;
  const res = fn();
  expect(res instanceof Promise).toEqual(true);
});
I'm a bit at a loss as to why executing this library code with Jest fails compared to Node, while executing a similar statement with Jest in the snippet just above succeeds.
Any suggestions on how to approach this head-scratcher?
EDIT
Initializing a local db variable instead of global.db solves the issue.
Originally, I had the global.db set in jest.config.js like so:
module.exports = async () => {
  const Firestore = require("@google-cloud/firestore");
  global.db = new Firestore({ projectId: process.env.GOOGLE_CLOUD_PROJECT });
  return {};
};
The application code works by relying on global.db that is set through an initialization file, and I wanted to mimic this for the tests.
I guess the question is, then, how should one go about initializing a global db variable that can be used in all tests?
ANSWER
I ended up creating a setup.js file and loading it with setupFilesAfterEnv per the docs. Content:
const Firestore = require("@google-cloud/firestore/build/src");
global.db = new Firestore({ projectId: process.env.GOOGLE_CLOUD_PROJECT });
Now things are working properly. Other solutions, such as using the globals entry in the object exported from jest.config.js (not shown in the code snippets above), didn't work because, per the documentation, those values must be JSON-serializable, which is not the case for Firestore.
I'm really confused about what's going on here:
When I use the following code I get my expected product:
const newProd = await models.Product.findOne({_id: new Buffer(util.HexUUIDToBase64(newProductData._id), 'base64').toString('hex')})
  .populate({
    path: 'SellerID',
    populate: { path: 'UserID' }
  })
console.log(newProd)
However when I use this code then I get nothing for my product:
const newProd = await models.Product.findOne({_id: new Buffer(util.HexUUIDToBase64(newProductData._id), 'base64').toString('hex')})
  .populate({
    path: 'SellerID',
    populate: { path: 'UserID' }
  })
  .exec(function (err, product) {
    console.log(err)
    console.log(product)
  })
console.log(newProd)
I would expect my product to be passed to the callback but it's not. What exactly is going on here and how can I fix it?
find() and exec() serve two different purposes.
The find() method enables a database READ operation. It is present both in the native MongoDB driver for Node and in the Mongoose library, which internally uses the MongoDB driver and is especially useful for imposing a fixed schema.
Now, with the mongodb driver, find(query) executes the query automatically, whereas in Mongoose it doesn't. We need helper functions/callbacks to get the operation to execute; exec is one such helper. It goes something like:
myPlaylist.findOne({'singer':'Adam Levine'}).exec()
Mongoose queries are not promises. They have a .then() function for convenience.
If you need a fully-fledged promise, use the .exec() function.
So, you could do myPlaylist.findOne({'singer':'Adam Levine'}).then(), but that would return a Mongoose/Bluebird (yet another library) promise, not a typical JavaScript one.
Note: exec() takes an optional callback function. It's only when you do not use a callback that you get back a Promise.
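The thenable-but-not-a-Promise distinction can be sketched with a plain object that mimics the shape of a Mongoose Query (a hypothetical stand-in, not Mongoose's actual internals):

```javascript
// A "thenable" stand-in for a Mongoose Query: await-able, but not a Promise.
const queryLike = {
  then(onFulfilled, onRejected) {
    // the "query" only runs when someone consumes it
    return Promise.resolve('Adam Levine').then(onFulfilled, onRejected);
  },
  exec() {
    // exec() hands back a fully-fledged native Promise
    return Promise.resolve('Adam Levine');
  },
};

console.log(queryLike instanceof Promise);        // false — just a thenable
console.log(queryLike.exec() instanceof Promise); // true

(async () => {
  console.log(await queryLike); // await happily consumes any thenable
})();
```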
I'm building an API with Next.js and MongoDB. I do some basic setup at the top of the API file:
const { db } = await connectToDatabase();
const scheduled = db.collection('scheduled');
and I continue the code with my handler function:
export default async function handler(req, res) {
  otherFunctionCalls()
  // ...
}

const otherFunctionCalls = async () => {
  // ...
}
I know await will work only within an async function, but I would like to use the scheduled constant in the other functions the handler calls; that's why I need it at the top.
If I put the constant to every single function, then it's code duplication.
What's the best practice to access to scheduled constant? Should I add the otherFunctionCalls declaration into the handler function?
The complete error that I got:
Module parse failed: The top-level-await experiment is not enabled (set experiments.topLevelAwait: true to enabled it)
File was processed with these loaders:
* ./node_modules/next/dist/build/babel/loader/index.js
You may need an additional loader to handle the result of these loaders.
Error: The top-level-await experiment is not enabled (set experiments.topLevelAwait: true to enabled it)
In the comments you've said you want to find another solution rather than enabling the top-level await experiment in the tool you're using.
To do that, you'll have to adjust the module code to handle the fact that you don't have the scheduled collection yet; you just have a promise of it. If the only exported function is handler and all of the other functions are going to be called from handler, then it makes sense to handle that (no pun!) in handler, along these lines:
// A promise for the `scheduled` collection
const pScheduled = connectToDatabase().then(db => db.collection("scheduled"));

export default async function handler(req, res) {
  const scheduled = await pScheduled;
  await otherFunctionCalls(scheduled);
  // ...
}

const otherFunctionCalls = async (scheduled) => {
  // ...use `scheduled` here...
};
There are lots of ways you might tweak that, but fundamentally you'll want to get the promise (just once is fine) and await it to get its fulfillment value anywhere you need it. (For the avoidance of doubt: await doesn't re-run anything; if the promise is already fulfilled, it just gives you back the fulfillment value the promise already has.)
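The "await doesn't re-run anything" point is easy to verify with a counter (a standalone sketch, not the database code above):

```javascript
let runs = 0;

// The async function body starts synchronously and runs exactly once.
const pValue = (async () => {
  runs += 1;
  return 'scheduled';
})();

(async () => {
  const a = await pValue; // first await: waits for fulfillment
  const b = await pValue; // second await: just reads the settled value
  console.log(a, b, runs); // scheduled scheduled 1
})();
```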
The /api/user/ request that makes $auth.loggedIn available happens after the client has already loaded/started loading. This makes some of my modules render and then disappear once $auth.loggedIn turns true/false, making everything look messy on page load.
It would be ideal if the call to /api/user/ happened the same way an asyncData request works, so that $auth.loggedIn is available before anything renders. Is there any setup/config that makes this possible? I'm using universal mode.
You can use the middleware feature defined in the Nuxt docs here.
In your case you could do your request to /api/user beforehand in a middleware called userLoggedIn, for example:
export default function (context) {
  // Here we return a Promise in the middleware to make it asynchronous
  // Cf doc https://nuxtjs.org/guide/routing#middleware
  // So we return a promise that is resolved only when resolve() is called
  return new Promise(resolve => context.app.$axios.$get('/api/user/').then((user) => {
    if (!user) {
      console.log('NOT LOGGED');
      // Do something if not logged, like redirect to /login
      return context.redirect('/login');
    }
    resolve(user);
  }));
}
I suggested $axios here, but of course you can change it. You just need to keep the middleware returning a promise so that it stays asynchronous.
Then, on the page waiting for the query, you can add:
export default {
  middleware: [
    'userLoggedIn',
  ],
  data() { ... },
};
I currently have a database connection module containing the following:
var mongodb = require("mongodb");
var client = mongodb.MongoClient;
client.connect('mongodb://host:port/dbname', { auto_reconnect: true },
  function(err, db) {
    if (err) {
      console.log(err);
    } else {
      // export db as member of exports
      module.exports.db = db;
    }
  }
);
I can then successfully access it doing the following:
users.js
var dbConnection = require("./db.js");
var users = dbConnection.db.collection("users");

users.find({name: 'Aaron'}).toArray(function(err, result) {
  // do something
});
However, if I instead do module.exports = db (i.e. assign the db object to the exports object rather than making it a member of exports) and try to access it in users.js via var db = require("./db.js");, the object is undefined. Why?
If it is because there is a delay in setting up the connection (shouldn't require() wait until the module finishes running its code before assigning the value of module.exports?), then why do neither of these examples work?
one.js
setTimeout(function() {
  module.exports.x = {test: 'value'};
}, 500);
two.js
var x = require("./one");
console.log(x.test);
OR
one.js
setTimeout(function() {
  module.exports.x = {test: 'value'};
}, 500);
two.js
setTimeout(function() {
  var x = require("./one");
  console.log(x.test);
}, 1000);
Running $ node two.js prints undefined in both cases instead of value.
There are 3 key points to understand here; I'll explain each in detail below.
module.exports is an object and objects are passed by copy-of-reference in JavaScript.
require is a synchronous function.
client.connect is an asynchronous function.
As you suggested, it is a timing thing. Node.js cannot know that module.exports is going to change later. That's not its problem. How would it know that?
When require runs, it finds a file that meets its requirements based on the path you entered, reads and executes it, and caches module.exports so that other modules can require the same module without re-initializing it (which would mess up variable scoping, etc.).
client.connect is an asynchronous function call, so after you execute it, the module finishes execution and the require call stores a copy of the module.exports reference and returns it to users.js. Only then do you set module.exports = db, but it's too late: you are replacing the module.exports reference with a reference to db, while the module export in Node's require cache still points to the old object.
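The reference-copy point can be simulated without files (a toy model of require, not Node's actual loader):

```javascript
// Toy model: what require() hands back vs. a later module.exports reassignment.
function loadModule() {
  const fakeModule = { exports: {} };
  const snapshot = fakeModule.exports;       // require() returns this reference
  fakeModule.exports = { db: 'connected' };  // later reassignment (as in the connect callback)
  return { snapshot, current: fakeModule.exports };
}

const { snapshot, current } = loadModule();
console.log(snapshot.db); // undefined — the caller still holds the old object
console.log(current.db);  // connected
```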
It's better to define module.exports as a function which will get a connection and then pass it to a callback function like so:
var mongodb = require("mongodb");
var client = mongodb.MongoClient;

module.exports = function (callback) {
  client.connect('mongodb://host:port/dbname', { auto_reconnect: true },
    function(err, db) {
      if (err) {
        console.log(err);
        callback(err);
      } else {
        // pass db to the caller instead of exporting it
        callback(err, db);
      }
    }
  );
};
Warning: though it's outside the scope of this answer, be very careful with the above code to make sure you close/return the connections appropriately, otherwise you will leak connections.
Yes, dbConnection.db is undefined because the connection is made asynchronously, which by definition means the Node code continues to execute without waiting for the DB connection to be established.
shouldn't require() wait until the module finishes running its code before assigning the value of module.exports?
Nope, it just doesn't work that way. require is for code that is always there. Database connections aren't code and aren't always there. Best not to confuse these two types of resources and how to reference them from your program.
shouldn't require() wait until the module finishes running its code before assigning the value of module.exports?
module.exports.db is set in a callback; this operation is asynchronous, so in users.js you can't get db.collection.
It would be better to add the collections in the connect callback.
You can use this answer to change your code and share the connection with other modules.
And what is the question? This is how require works: it loads the module synchronously and passes you the exports.
Your suggestion to "wait until the code is run" could be answered two ways:
It does wait until the code is run. The setTimeout call has successfully finished. Learn to separate asynchronous callbacks scheduled for the future from the actual thread of execution.
If you mean "until all of the asynchronous callbacks are run", that's nonsense: what if some of them never run at all, because they wait for, say, a mouse click, but the user has no mouse attached? (And how do you even define "all code has run"? That every statement was run at least once? What about if (true) { thisruns(); } else { thiswontrunever(); }?)
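The first point can be made visible without two files (a standalone sketch of the one.js/two.js timing):

```javascript
// Toy stand-in for one.js's module object.
const fakeModule = { exports: {} };

// This only *schedules* the assignment; the module body is already finished.
setTimeout(function () {
  fakeModule.exports.x = { test: 'value' };
}, 500);

// require() would return right here, long before the 500 ms timer fires:
console.log(fakeModule.exports.x); // undefined
```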