Module.exports in JavaScript - javascript

What is the difference between module.exports = testMethod; and module.exports = { testMethod };? When I use module.exports = testMethod; it throws the error below:
Error: Route.get() requires a callback function but got a [object Undefined]
But everything works fine with module.exports = { testMethod };
The whole code is:
const testMethod = asyncErrorWrapper(async (req, res, next) => {
  const information = req.body;
  const question = await Question.create({
    title: information.title,
    content: information.content,
    user: req.user.id,
  });
  res.status(200).json({
    success: true,
    data: question,
  });
});
module.exports = { testMethod };

In VS Code, switching between the ES5 and ES6 module styles in a JS file can lead you down a bad path.
So be careful: I ran into the same problem recently, and after refactoring to use module.exports = router at the end of some JS files (a Node project using Express) it was fixed.
Strangely, on Cloud9 on AWS I had no problems.

Both forms export something to the outside world; the difference is what require() hands back to the consumer. With
module.exports = somectrl
the module itself is the function, so it must be imported as const somectrl = require('./somectrl'). But with
module.exports = { somectrl }
the module is an object with a somectrl property, so it must be destructured: const { somectrl } = require('./somectrl'). If the import style does not match the export style, the value you pass on as a callback ends up undefined, which is exactly the Route.get() error above.
If you want to export a ready-made result or instance instead, you can do something like
module.exports = somectrl()
or
module.exports = new somectrl()
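As a concrete sketch of the matching pairs (the file and route names here are made up for illustration, they are not from the original question):
// controller.js, exporting the function itself
module.exports = testMethod;
// matching import: the module IS the function
const testMethod = require('./controller');
router.get('/questions', testMethod);

// controller.js, exporting an object that wraps the function
module.exports = { testMethod };
// matching import: destructure the function off the exported object
const { testMethod } = require('./controller');
router.get('/questions', testMethod);

// mismatch: exporting the function itself but destructuring on import
// leaves testMethod undefined, which is exactly what produces
// "Route.get() requires a callback function but got a [object Undefined]"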

Related

Unable to export db properties from nodejs module

I am trying to export database properties stored in a properties file from a JavaScript module. By the time the database properties file has been read, the module has already been exported, and the properties appear undefined wherever I use them in other modules.
const Pool = require('pg').Pool;
const fs = require('fs')
const path = require('path');
class DbConfig {
  constructor(dbData) {
    this.pool = new Pool({
      user: dbData['user'],
      host: dbData['host'],
      database: dbData['database'],
      password: dbData['password'],
      max: 20,
      port: 5432
    });
  }
}
function getdbconf() {
  const dbData = {};
  fs.readFile("../../db_properties.txt", 'utf8', (err, data) => {
    if (err) {
      console.error(err)
      return
    }
    // dbData = {"user":"postgres", "password": "1234"...};
    return dbData;
  });
}
let db = new DbConfig(getdbconf());
let dbPool = db.pool;
console.log("dbpool : -> : ",dbPool); // username and password appear undefined
module.exports = { dbPool };
Is there a way to read the data before exporting it from a JavaScript module?
Usually database config or any other sensitive info is read from a .env file using dotenv.
Or
you could also provide environment variables from the command line itself, like
DB_HOST=127.0.0.1 node index.js
inside your index.js
console.log(process.env.DB_HOST)
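For example, a minimal sketch using dotenv with the pg Pool from the question (the .env keys and the db.js file name are made up for illustration):
// .env (kept out of version control)
// DB_USER=postgres
// DB_PASSWORD=1234
// DB_HOST=127.0.0.1
// DB_NAME=mydb

// db.js
require('dotenv').config(); // loads the key=value pairs from .env into process.env
const { Pool } = require('pg');

const dbPool = new Pool({
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  host: process.env.DB_HOST,
  database: process.env.DB_NAME,
  max: 20,
  port: 5432,
});

module.exports = { dbPool };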
Please create a new file (connection-pool.js) and paste this code:
const { Pool } = require('pg');
const poolConnection = new Pool({
  user: 'postgresUserName',
  host: 'yourHost',
  database: 'someNameDataBase',
  password: 'postgresUserPassword',
  port: 5432,
});
console.log('connectionOptions', poolConnection.options);
module.exports = poolConnection;
To use it, create a new file (demo-connection.js) and paste this code:
const pool = require('./connection-pool');
pool.query('SELECT NOW();', (err, res) => {
  if (err) {
    // throw err;
    console.log('connection error');
    return;
  }
  if (res) {
    console.log(res.rows);
    pool.end();
  }
});
This is an alternative option 🙂
Exporting the result of async calls
To export values which have been obtained asynchronously, export a Promise.
const fs = require('fs/promises'); // `fs/promises` means no callbacks, a Promise is returned
const dbDataPromise = fs.readFile('fileToRead'); // `readFile` returns a Promise now
module.exports = dbDataPromise;
Importing
When you need to use the value,
const dbDataPromise = require('./dbdata');
async function init() {
  const dbData = await dbDataPromise;
}
// or without async, using Promise callbacks
function init() {
  dbDataPromise
    .then(dbData => { /* the rest of your code that depends on dbData goes here */ });
}
Current code broken
Please note that your current code, as pasted above, is broken:
function getdbconf() {
  const dbData = {};
  fs.readFile("../../db_properties.txt", 'utf8', (err, data) => {
    //[...] snipped for brevity
    return dbData;
  });
}
fs.readFile "returns" dbData, but there is nothing to return to, since you are in a callback which you did not call yourself. Function getdbconf returns nothing.
The line that says let db = new DbConfig(getdbconf()); will NOT work. It needs to be inside the callback.
The only way to avoid putting all of your code inside the callback (and "flatten" it) is to use await, or to use readFileSync.
Avoiding the issue
Using environment variables
Suhas Nama's suggestion is a good one, and is common practice. Try putting the values you need in environment variables.
Using synchronous readFile
While using synchronous calls does block the event loop, it's ok to do during initialization, before your app is up and running.
This avoids the problem of having everything in a callback or having to export Promises, and is often the best solution.
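For example, a rough sketch of the synchronous variant (this assumes db_properties.txt contains JSON, which the question does not actually specify):
const fs = require('fs');
const { Pool } = require('pg');

// A blocking read is fine here: it runs once, at module load time,
// before the server starts accepting requests.
const dbData = JSON.parse(fs.readFileSync('../../db_properties.txt', 'utf8'));

const dbPool = new Pool({
  user: dbData.user,
  host: dbData.host,
  database: dbData.database,
  password: dbData.password,
  max: 20,
  port: 5432,
});

module.exports = { dbPool };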

Refactoring probot event functions into a separate file causes error: TypeError: handler is not a function

I have the vanilla probot event function from the docs that comments on new issues:
const probotApp = app => {
  app.on("issues.opened", async context => {
    const params = context.issue({ body: "Hello World!" });
    return context.github.issues.createComment(params);
  });
}
This works fine.
I refactor the code into a separate file:
index.js
const { createComment } = require("./src/event/probot.event");
const probotApp = app => {
  app.on("issues.opened", createComment);
}
probot.event.js
module.exports.createComment = async context => {
  const params = context.issue({ body: "Hello World!" });
  return context.github.issues.createComment(params);
};
But I receive this error:
ERROR (event): handler is not a function
TypeError: handler is not a function
at C:\Users\User\probot\node_modules\@octokit\webhooks\dist-node\index.js:103:14
at processTicksAndRejections (internal/process/task_queues.js:97:5)
at async Promise.all (index 0)
When I create a test as recommended in the docs, with a fixture, and mock the event webhook call with nock, it works fine. But when I create a real issue on GitHub, this error is thrown.
How can I refactor the code into a separate file without causing the error?
This was my mistake.
This is the whole probot.event.js file:
module.exports.createComment = async context => {
  const params = context.issue({ body: "Hello World!" });
  return context.github.issues.createComment(params);
};
module.exports = app => {
  // some other event definitions
}
By assigning module.exports = app => { ... } I overwrote the previous module.exports, so the createComment function was never exported.
Removing the module.exports = app => { ... } assignment fixed it!
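If you ever do need both in one file, one option (just a sketch, not necessarily how probot expects its entry point to be wired) is to keep a single exports object so neither assignment clobbers the other:
// probot.event.js
const createComment = async context => {
  const params = context.issue({ body: "Hello World!" });
  return context.github.issues.createComment(params);
};

const probotApp = app => {
  // some other event definitions
};

// keep probotApp as the main export and hang createComment off it,
// instead of assigning module.exports twice
module.exports = probotApp;
module.exports.createComment = createComment;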

Nodejs pass log instance between modules

I have a logger which I initialize using a constructor in the index.js file. Now I need to pass the logger instance to other files, and I do it like this:
index.js
const books = require("./books");
logger = initLogger({
  level: levels.error,
  label: "app",
  version: "0.0.1",
});
books(app, logger);
app.listen(port, () => logger.info(`listening on port ${port}`));
Inside the books.js file I use it as follows: I get the logger from the index.js file, use it inside books.js, and also pass it on to another file with the call isbn.get(books, logger).
Is it recommended to do it like this? Is there a cleaner way in Node?
books.js
const isbn = require("./isbn");
module.exports = async function (app, logger) {
  …
  try {
    let books = await getBooks();
    logger.info("get books process has started");
  } catch (err) {
    logger.error("Failed to fetch books", err);
    return;
  }
  …
  // this function is from the file "isbn" and I should pass the logger to it also
  try {
    let url = await isbn.get(books, logger);
  } catch (e) {
    res.send(e.message);
  }
}
Try creating a module specifically for your logger configuration, then you can import that into your modules instead of using a side-effect of your business module to create a logger.
This will help if you ever need/want to change your logger configuration - instead of following a chain of business methods, you can just update the log configuration.
Example
logger.js
'use strict';
// Any setup you need can be done here.
// e.g. load log libraries, templates etc.
const log = function(level, message) {
  return console.log(level + ": " + message);
};
module.exports = log;
business-logic.js
'use strict';
var log = require('./logger');
var stuff = require('./stuff');
const do_stuff = function (thing) {
  // do stuff here
  log("INFO", "Did stuff");
}
This is a pretty clean way of doing it; however, it can get awkward when you need to share more variables or add more requires. So you could put all the variables in an object and destructure only the ones you need in books.js:
index.js:
const state = {app, logger, some, other, variables};
require("./books")(state);
require("./another_file")(state);
books.js:
module.exports = async function ({app, logger}) {
};

What am I doing wrong in my module export from main.js?

While learning how to develop modules, I'm trying to learn how to export one from main.js. From my renderer I can send an object to main correctly with:
renderer.js:
let _object = {
  foo: foo1,
  bar: bar1,
}
ipcRenderer.send('channel', _object)
in main.js I can get this correctly:
ipcMain.on('channel', (e, res) => {
  console.log(JSON.stringify(res))
  console.log(typeof res)
})
However, when I export the result from main.js and try to bring it into another file, I get undefined:
main.js:
const foobar = require('./foobar')
ipcMain.on('channel', (e, res) => {
  console.log(JSON.stringify(res))
  console.log(typeof res)
  module.exports.res = res
  foobar.testing()
})
foobar.js:
const res = require('./main')
module.exports = {
  testing: function(res) {
    console.log(`Attempting console.log test: ${res}`)
    console.log(res)
    console.log(JSON.stringify(res))
  }
}
terminal result:
Attempting console.log test: undefined
undefined
undefined
I've also tried to redefine the object in main.js:
ipcMain.on('channel', (e, res) => {
  module.exports = {
    foo: foo,
    bar: bar,
  }
  console.log(`Testing object ${res.foo}`)
  foobar.testing()
})
My research, for reference:
Node js object exports
Node module.exports returns undefined
What am I doing wrong in my export of the result in main.js so that I can use it in a different file?
Edit:
My end goal is to learn how to be able to call res.foo in foobar.js.
First of all, you have an argument res in your testing function, which shadows the import. Second, the imported res object is all exports from main, which includes the res you need - so you should print res.res instead of the whole object.
foobar.js:
const res = require('./main')
module.exports = {
  testing: function() {
    console.log(`Attempting console.log test: ${res.res}`)
    console.log(res.res)
    console.log(JSON.stringify(res.res))
  }
}
The last version (where you reassign module.exports) would not work, because foobar will still have the original exports, which was an empty object.
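A small sketch of the difference (hypothetical code, trimmed down to just the caching behaviour of require):
// main.js
const foobar = require('./foobar')

setTimeout(() => {
  // Mutating the existing exports object: foobar's cached reference sees it.
  module.exports.res = { foo: 'foo1', bar: 'bar1' }
  foobar.testing() // logs the object

  // Reassigning module.exports: foobar still holds the OLD object,
  // so it never sees this new one.
  module.exports = { foo: 'foo1' }
  foobar.testing() // still logs the value set by the mutation above
}, 100)

// foobar.js
const res = require('./main') // cached reference to main's original exports object
module.exports = {
  testing: function () { // no `res` parameter here, so the import is not shadowed
    console.log(res.res)
  }
}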

Summon dependency injection and calling a method/function inside a module

Thank you in advance for your help.
I am unsure how to do the following. I have a module to send emails, and the Config is injected into the module using summon.js dependency injection, but I need to use the sendMail method and pass it the parameter mailOptions. Here is the code example:
'use strict';
const nodemailer = require('nodemailer');
const ejs = require('ejs');
const fs = require('fs');
module.exports = function(Configs) {
  // create reusable transporter object using the default SMTP transport
  let transporter = nodemailer.createTransport({
    host: Configs.email.host,
    port: Configs.email.port,
    auth: {
      user: Configs.email.user,
      pass: Configs.email.pass
    }
  });
  this.sendMail = function(mailOptions) {
    mailOptions.to = Configs.mockEmail || mailOptions.to
    mailOptions.from = Configs.email.user
    return new Promise((resolve, reject) => {
      if (mailOptions.template) {
        ejs.renderFile('/../templates/' + mailOptions.template +
          '.ejs', mailOptions.data, null, (err, html) => {
            if (err) {
              return reject(err)
            }
            resolve(html)
          })
        return
      }
      resolve()
    }).then(html => {
      mailOptions.html = html || mailOptions.html
      return new Promise((resolve, reject) => {
        // send mail with defined transport object
        transporter.sendMail(mailOptions, (error, info) => {
          if (error) {
            return reject(error)
          }
          resolve(info)
        })
      })
    })
  }
  return this
}
Then, I want to make use of this module:
const EmailUtil = require('email')
async function foo() {
  // Do something async with await.
  const mailOptions = {...}
  EmailUtils.sendMail(mailOption);
}
However, it gives me the error:
TypeError: EmailUtils.sendMail is not a function
Note: I could remove the module.exports = function(Configs) wrapper, but that would not be good, since I would need to hard-code the path of my config file and I have multiple configuration files for each environment. I want to keep the Summon.js dependency injection while calling sendMail from another module. Thanks
Any ideas??
Thank you!
Since you're exporting a function you need to actually call it after requiring the module:
const EmailUtil = require('email')
async function foo() {
  // Do something async with await.
  const mailOptions = {...}
  EmailUtil().sendMail(mailOption);
}
The use of this suggests that the function is meant to be used as a constructor. this will refer to the desired object only if the function is called with new or is bound to some context.
There is a convention in JavaScript to use Pascal-cased names for constructors, so they can be identified unambiguously in the code.
For the given EmailUtil, it should be:
const EmailUtil = require('email');
const emailUtil = new EmailUtil(config);
...
emailUtil.sendMail(mailOption);
I answered this by including EmailUtils in the depend.json file, which takes care of defining the dependencies for summonjs. That way I was able to pass the Configs to EmailUtils and call sendMail like so:
EmailUtils.sendMail(mailOptions);
There was no need to use the keyword new, which is a great answer; I was not aware a module could be instantiated in this way.
