extending a module when module is using "module.exports" - javascript

I've read a few pages on extending a module. They revolve around using a functional form of a module, and I get how to do that (from https://toddmotto.com/mastering-the-module-pattern/):
var Module = (function () {
  return {
    publicMethod: function () {
      // code
    }
  };
})();
but what I have is two modules like this:
util.js
module.exports = {
  thing1: function() { /* do thing1 stuff */ }
}
extend.js (a package from npm that I can't change)
module.exports = {
  thing2: function() { /* do thing2 stuff */ }
}
now pretending I am going to use my util.js module
const _ = require('./util.js');
let catin = _.thing1; // that's easy
let thehat = _.thing2; // this only works if util.js has been extended
I could just do this in util.js:
const ex = require('extend.js');
module.exports = {
  thing1: function() { /* do thing1 stuff */ },
  thing2: ex.thing2
}
and that's OK since extend.js only has one function/method to extend, but I would like to extend my util library with this package: https://github.com/dodekeract/bitwise/blob/master/index.js and it has 22 items to extend!
There must be a better, slicker way, yes?
I'm open to refactoring my util.js file (but not hand-coding each extension like I showed) so it extends automatically, but I obviously can't refactor a package I'm not maintaining, short of a fork... ugh. I'm also not interested in adding a sub-object like
ex: ex
_.ex.thing2
Ideas?

So, given Molda's hint, I'll share what I put together to make this question more useful for others: a simple way of building a (utility) module from a folder of (utility) modules plus other one-off packages (e.g. bitwise).
Make a utils.js module in, say, lib/ with this (you'll need require-all or some such package):
let utils = require('require-all')(__dirname + '/util');
let bw = require('bitwise');

let self = module.exports = (function () {
  let util = {};
  for (var key in utils) {
    util = utils.object.merge_keys(util, utils[key]);
  }
  util = utils.object.merge_keys(util, bw);
  return util;
}());
Now make a subdirectory lib/util/ and fill it with your utility modules. Have one of those modules contain this key/function:
merge_keys: function (obj1, obj2) {
  var obj3 = {};
  for (var attrname in obj1) { obj3[attrname] = obj1[attrname]; }
  for (var attrname in obj2) { obj3[attrname] = obj2[attrname]; }
  return obj3;
}
and be sure that the module's file name matches the key used in the line util = utils.object.merge_keys(util, utils[key]). In my case I have a module object.js in lib/util/ containing merge_keys.
Then just require the utils.js module and everything will be merged as one, including access to the merge_keys function for other merging :-).
let _ = require('./lib/utils');
// see if it worked
console.log(_);
Beware: there is no checking for duplicate key names between modules.
Notes:
The let self = module.exports = ... assignment lets one key within the merged object refer to any other, like self.myfunctionkeyname().
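For what it's worth, on newer Node versions the custom merge_keys helper isn't strictly required; Object.assign can fold the exports together. Here is a minimal sketch of the same lib/utils.js, assuming the same lib/util/ folder, the require-all package, and a Node version that has Object.assign:
// lib/utils.js (sketch)
let utils = require('require-all')(__dirname + '/util');
let bw = require('bitwise');

let merged = {};
for (let key in utils) {
  Object.assign(merged, utils[key]); // copy each util module's exported keys
}
Object.assign(merged, bw); // fold in the bitwise package as well

module.exports = merged;
As before, later keys silently overwrite earlier ones with the same name.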

Related

Pass data to custom npm package module

I'm new to building custom npm packages and I'm getting lost configuring one with data coming from the application that is using it.
EDIT: This is just an example, but the app will have more methods, and those fake a and b will be used by many of them.
Basically, in the app I'm requiring my package this way:
var a = 'a';
var b = 'b';
var module = require('module')(a, b);
module.test();
My module, in its index file, has:
var a;
var b;

function test() {
  return {
    a: a,
    b: b
  };
}

module.exports = function(_a, _b) {
  a = _a;
  b = _b;
  return {
    test: test
  };
};
As you can guess, it is not working as I was expecting... How can I pass my custom data to my npm package and be able to use that data across my methods?
Shouldn't you use it something like this?
var a = 'a';
var b = 'b';
var module = require('module');
module.init(a,b);
// do some other code....
module.test();
and in your module like this:
var _a = null;
var _b = null;

var test = function() {
  return {
    a: _a,
    b: _b
  };
};

var init = function(a, b) {
  _a = a;
  _b = b;
};

module.exports = {
  init,
  test
};
Your definition of a module is absolutely fine but it depends on how you plan on using it!
If you would like to use it as though it were published on the npm-registry then you need to use the process described here: Installing a local module using npm?
I know that this is just an example, but for other readers: you shouldn't use the name module, as it is already in use by Node.
If you are simply using a modular approach to producing a larger app and you want to use the exports of your file, then you require the file by pointing to it. For example, if this file was called module.js and is in the same directory as the script which is requiring it then use:
var myModule = require('./module.js')(a, b);
If you have it in another directory then use the normal relative path navigation syntax, like ../module.js if it is up one directory or ./lib/module.js if it is in a sub-directory called lib.
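Putting the two pieces together, a minimal sketch might look like this (the file and folder names are only examples):
// my-package/index.js -- the factory export from the question, unchanged in spirit
module.exports = function(a, b) {
  function test() {
    return { a: a, b: b };
  }
  return { test: test };
};

// app.js
var myModule = require('./my-package')('a', 'b');
console.log(myModule.test()); // { a: 'a', b: 'b' }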

How to export and import two different function objects in JavaScript?

I use Jasmine-Node to test Javascript code.
How can one export two different function objects like Confusions1 and Confusions2 so that both are available in the Jasmine-Node test file?
My attempt looks like this:
// confusions.js
var Confusions1 = function() {};
Confusions1.prototype.foo = function(num) {
  // keep track of how many times `foo` is called
  this.count++;
};
module.exports = Confusions1;

// second function object
var Confusions2 = function() {};
Confusions2.prototype.foo = function() {
  var a = 2;
  this.bar();
};
Confusions2.prototype.bar = function() {
  return this.a;
};
module.exports = Confusions2;
And my Jasmine Test file:
// confusions.spec.js
var Confusion = require('./confusions.js');

describe("chapter 1, common misconception ", function() {
  describe("to assume `this` refers to the function itself: ", function() {
    var confusion = new Confusion();
    // some test code omitted
  });
  describe("to assume `this` refers to the function's scope", function() {
    var confusion = new Confusion();
    // test code omitted
  });
});
I want Confusions1 and Confusions2 from confusions.js to both be usable in the two nested describe blocks within confusions.spec.js.
I assume that I have to somehow initialize different objects after requiring the file var Confusion = require('./confusions.js');
Something like var confusion1 = new Confusions1(); var confusion2 = new Confusions2(); But how exactly (without splitting the two objects into separate files)?
So you want to have a module that behaves like a container of exported values; in other words, you want to export two functions:
// foo.js
var Confusion1 = function() {}
var Confusion2 = function() {}
...
exports.Confusion1 = Confusion1
exports.Confusion2 = Confusion2
Wherever you need this module, you could require this stuff like:
// bar.test.js
var confusions = require('path-to-the-foo-file')
console.log(confusions.Confusion1)
console.log(confusions.Confusion2)
Also, it seems you should check how the module system you are using works in general; check this answer: module.exports vs exports in Node.js
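Applied to the original spec file, and assuming confusions.js exports both constructors the same way (exports.Confusions1 = Confusions1; exports.Confusions2 = Confusions2;), a sketch of the test file would be:
// confusions.spec.js
var confusions = require('./confusions.js');

describe("chapter 1, common misconception ", function() {
  describe("to assume `this` refers to the function itself: ", function() {
    var confusion = new confusions.Confusions1();
    // some test code omitted
  });
  describe("to assume `this` refers to the function's scope", function() {
    var confusion = new confusions.Confusions2();
    // test code omitted
  });
});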

How can a module in node.js maintain state?

I have two different js files that use the same module.
file1.js:
var mod1 = require('./commonmodule.js');
mod1.init('one');
file2.js:
var mod2 = require('./commonmodule.js');
mod2.init('two');
(both of these files, file1.js and file2.js, are loaded inside my server.js file; they are themselves modules)
now in commonmodule.js:
var savedName;

exports.init = function(name) {
  savedName = name;
};

exports.getName = function() {
  return savedName;
};
I noticed that this savedName is always overridden depending on who set it last, so it doesn't seem to work. How would I get a module to maintain state?
Note: I also tried to set savedName as exports.savedName in commonmodule.js, but that doesn't work either.
You can just create a new instance every time the module is required:
commonmodule.js
function CommonModule() {
  var savedName;
  return {
    init: function(name) {
      savedName = name;
    },
    getName: function() {
      return savedName;
    }
  };
}

module.exports = CommonModule;
file1.js
var mod1 = require('./commonmodule')();
mod1.init('one');
console.log(mod1.getName()); // one
file2.js
var mod2 = require('./commonmodule')();
mod2.init('two');
console.log(mod2.getName()); // two
Modules in and of themselves are simple object instances. A single instance will be shared by all other modules (with the caveat that it is loaded via the same path). If you want state, use a class and export a constructor function.
example:
//Person.js
function Person(name) {
  this.name = name;
}

module.exports = Person;
To use it:
var Person = require("./Person");
var bob = new Person("Bob");
Modules are not like classes/class functions; by default, mod1 and mod2 will refer to the same module due to caching. To keep track of per-instance state, you'll need a constructor function or something similar inside your module, e.g.
var mod = require('./commonmodule.js');
var state = new mod.init('one');
Where init defines the stateful object. You could also have it return an object literal, in which case you wouldn't have to use new (e.g. var state = require('./commonmodule.js').init('one');)
(This is assuming you want the module to have other, shared state in addition to the per-instance state; if that is not the case, Peter Lyons' method would be simpler.)
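For the case with both shared and per-instance state, a rough sketch of what commonmodule.js could look like (the names here are made up for illustration):
// commonmodule.js (sketch)
var instancesCreated = 0; // shared, module-level state

exports.init = function(name) { // used as a constructor: new mod.init(...)
  instancesCreated++;
  this.savedName = name; // per-instance state lives on `this`
};

exports.init.prototype.getName = function() {
  return this.savedName;
};

exports.countInstances = function() {
  return instancesCreated;
};

// usage
var mod = require('./commonmodule.js');
var one = new mod.init('one');
var two = new mod.init('two');
console.log(one.getName(), two.getName()); // one two
console.log(mod.countInstances()); // 2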
You could perhaps remove your module from the cache, like this:
file1.js:
var mod1 = require('./commonmodule.js');
mod1.init('one');
file2.js:
delete require.cache[require.resolve('./commonmodule.js')];
var mod2 = require('./commonmodule.js');
mod2.init('two');
But I don't find that very convenient or clean.
But you could also clone it or make a small proxy.
Also you could create classes:
exports.obj = function() {};

exports.obj.prototype.init = function(name) {
  this.savedName = name;
};

exports.obj.prototype.getName = function() {
  return this.savedName;
};
Then:
var mod2 = new (require('./commonmodule.js').obj)();
mod2.init('two');

Global object with functions to be overridden by each project JS?

I'm wondering how I should design my javascript files.
I will have a global.js file which will be used for all projects. Then each project will have its own project.js file, containing specific functions/overrides/settings just for that project.
So I'll want to write all my "global" functions in the global.js file:
Global = function() {
  var config = {'alpha': 1};

  function getConfig() {
    return this.config;
  }

  function printConfig() {
    console.log(this.getConfig());
  }
};

Global.prototype.echoConfig = function() {
  console.log(this.getConfig());
};
and I guess my project.js file would look like:
var project = new Global();
Global.prototype.projFunc = function() { return 2; };
However, I haven't figured out how to get the config from global.js?
I'm using jQuery and have noted that the $.extend function looks nice; however, I'd like to first set up the structure for my global.js and project.js. In general I'd probably want to move most functions from project.js into global.js, but there might be one or two projects that need only one specific function for that application.
You need to have getConfig in a public scope, and since you have config declared in a "private" way, you cannot use this.config to get the config; just use config:
Global = function() {
  var config = {'alpha': 1};

  this.getConfig = function() {
    return config;
  };

  function printConfig() {
    console.log(this.getConfig());
  }
};
Scope Tutorial
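For the project.js side, a sketch of how the config could then be read (assuming global.js is loaded first so Global is in scope, as in the question):
// project.js (sketch)
Global.prototype.projFunc = function() { return 2; };

var project = new Global();
console.log(project.getConfig()); // { alpha: 1 }
console.log(project.projFunc()); // 2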

Node.js double call to require()

//lib.js
var opt = 0

exports.set = function(arg) {
  opt = arg
}

exports.prn = function() {
  console.log(opt)
}
///prog.js
var lib = require('./lib')
var lib2 = require('./lib')
lib.set(222)
lib2.set(333)
lib.prn()
lib2.prn()
prog.js will output:
333
333
but I need it to output:
222
333
In other words, opt must be unique to variable lib and to variable lib2. How do I achieve that?
That's because Node.js normally caches modules that are loaded via require. You may use the following helper:
// RequireUncached.js
module.exports = function(modulePath) {
  delete require.cache[require.resolve(modulePath)];
  return require(modulePath);
};
and the usage of the helper:
var requireUncached = require('./RequireUncached.js');
requireUncached("./lib");
Keep in mind that this approach is considered bad practice and should not be used. I'd suggest wrapping your logic in a function, requiring the module, and calling the function, so that every time you get a new instance.
require will not load scripts multiple times; it always yields the same instance.
If you need different environments, make your module a constructor function that allows to be instantiated multiple times. Store opt on each object for that instead of in the (global) module scope.
// lib.js
module.exports = function constr() {
  var opt = 0
  this.set = function(arg) {
    opt = arg
  };
  this.print = function() {
    console.log(opt)
  };
};
// prog.js
var lib = require('./lib'),
    inst1 = new lib(),
    inst2 = new lib();
/* or short:
var inst1 = new require('./lib')(),
    inst2 = new require('./lib')(); */

inst1.set(222)
inst2.set(333)
inst1.print()
inst2.print()
The way the NodeJS module system works, the output is correct and your expectations contradict the design principle here.
Each module is loaded once and only once, and subsequent calls to require simply return the reference to the pre-existing module.
Maybe what you need to do is create a class you can create one or more instances of instead of using module-level globals.
Adding to Bergi's answer, you may also try it like this:
// prog.js
var lib = require('./lib')(),
    lib2 = require('./lib')();

lib.set(222)
lib2.set(333)
lib.print()
lib2.print()

// lib.js
module.exports = function constr() {
  var opt = 0
  return {
    set: function(arg) {
      opt = arg
    },
    print: function() {
      console.log(opt)
    }
  }
};
Add this line as the first line of your lib.js:
delete require.cache[__filename]
Now your module gets a fresh namespace each time you require it.
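In context, lib.js would then look like the sketch below; each require('./lib') re-runs the file and gets its own opt, so prog.js from the question should print 222 and then 333:
// lib.js (sketch)
delete require.cache[__filename] // evict this module so the next require() re-executes the file

var opt = 0

exports.set = function(arg) {
  opt = arg
}

exports.prn = function() {
  console.log(opt)
}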
