Allow modules access to functions/variables defined in the server that requires them? - javascript

Here's a basic example of what I'm trying to do:
ModuleA.js
module.exports = {
  doX () {
    console.log(data['a']);
  }
}
ModuleB.js
module.exports = {
  doX () {
    console.log(data['b']);
  }
}
server.js
let data = { a:'foo', b:'bar' };
let doX = {};
doX['a'] = require('./ModuleA.js').doX;
doX['b'] = require('./ModuleB.js').doX;
doX['a'](); // Should print 'foo'
doX['b'](); // Should print 'bar'
In the actual implementation there would be many more variables to pass in than just data, so passing that to the functions isn't a viable solution.
This almost works, but the functions in the modules need access to functions and variables at the top level of the server file. I know I could global.variable all of my variables and functions, but I'd rather not, as I've only seen people recommend against that. Of course I could pass every single variable and function in each function call, but that would look ridiculous and bring up way too many potential problems. I was hoping I could pass a reference to the server's namespace, by passing this or something, but that didn't work.

I could register every function and variable on some object and pass that around, but that's inconvenient and I'm trying to refactor for convenience and organization. I think I could read in the module files and eval them, as seen here, but I would much rather use the standard module.exports system if possible.

I'll summarize my comments into an answer.
Your data variable is local to server.js and is not accessible to your other two modules. I'd suggest you pass it to them when you load those modules as a means of sharing it with them. That design pattern is typically called a "module constructor" if you want to read more about it.
Passing data from one module to another is how you achieve shared data between separate modules without using globals. That's how it's done. Since you've rejected the usual design pattern, there's not much else we can suggest without understanding a lot more about the real problem, so we can go further outside your box and propose a better design than the path you're on.
Abstracting hardware to have a common set of methods sounds like a perfect fit for subclasses where each piece of hardware has its own subclass, all with the same interface. Shared data could be in the base class.
You can pass a lot of variables at once if you make them properties of an object and pass just the object. Then both places reference the same properties on the same object, and you can share any number of values by passing one object. There is no way to pass a module's namespace; you have to create your own object with properties on it and pass that. You can create such an object, set it on the base class, and then all your derived classes have access to it.
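For illustration, here is a minimal sketch of that "module constructor" pattern applied to the original example (the shape of the shared object is just an assumption):

// ModuleA.js
module.exports = function (shared) {
  return {
    doX () {
      console.log(shared.data['a']);
    }
  };
};

// server.js
let shared = { data: { a: 'foo', b: 'bar' } };
let doX = {};
doX['a'] = require('./ModuleA.js')(shared).doX;
doX['a'](); // prints 'foo'

Because both files hold a reference to the same shared object, anything you later add to shared is visible inside the module without any further plumbing.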

In short:
module.exports = {
  doX () {
    console.log(data['a']); // `data` is not in scope here;
                            // pass it in as an argument to make it available
  }
}
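A sketch of the argument-passing version, for completeness (assuming the callers are updated accordingly):

// ModuleA.js
module.exports = {
  doX (data) {
    console.log(data['a']);
  }
};

// server.js
let data = { a: 'foo', b: 'bar' };
require('./ModuleA.js').doX(data); // prints 'foo'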

Related

A way to use a custom method everywhere without requiring the module everywhere

I am using Node.js. I defined a custom method on the String object like this:
if (!String.prototype.myMethod) {
  String.prototype.myMethod = function () {
    // do something
    return this;
  };
}
I found that myMethod may be used in many different files, so I have to require the file containing this piece of code in every one of them. Is there any way to avoid all the 'require's?
Don't do that.
Node is intentionally designed around a module pattern, where each module gets its own scope to run in instead of polluting the global scope. This is very intentional and very important.
https://nodejs.org/api/modules.html#modules_the_module_wrapper
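In other words, put the helper in its own module and require it where needed. A minimal sketch (the file and function names are illustrative):

// stringUtils.js
module.exports.myMethod = function (str) {
  // do something with str
  return str;
};

// anywhere else
var stringUtils = require('./stringUtils.js');
stringUtils.myMethod('hello');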

Is it possible to spy on or mock the value of a "behind-the-scenes" variable?

Let's say I have code that modifies a variable which is not exposed to the user like this:
var model;
module.exports = {
  doSomething: function() {
    // ...
    // at some point in the code, modify model
    if (/* something happened */) {
      model = '123';
    }
  },
  doSomethingElse: function() {
    // use model in some way
  }
};
If I later want to write a unit test to make sure that model was updated, but I do not have a getter for it, how can I test this? Is this possible to do with Karma/Jasmine/Sinon.js?
It is impossible to check the model value directly because it's hidden in a closure.
You can still write tests for it, though: your "model" will make doSomethingElse behave differently, so you can verify that its behavior matches your expectations after calling doSomething. This way, you are also free to refactor the internals of your module without changing the test cases.
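For example, a Jasmine-style sketch of testing through the public API (it assumes doSomethingElse returns something derived from the private model, which is an assumption about your module):

var mod = require('./myModule.js');

describe('doSomething', function () {
  it('changes the behavior of doSomethingElse', function () {
    var before = mod.doSomethingElse();
    mod.doSomething();
    var after = mod.doSomethingElse();
    expect(after).not.toEqual(before); // the private model was updated
  });
});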
In general, testing private methods or properties is an antipattern. By making aspects of your implementation private, you're explicitly creating the freedom to change how those implementation details work in the future without changing your public API.
Therefore, in an ideal world, you should not (and in this case, you cannot) test the model value.
That said, we don't always live in an ideal world. So, there are some workarounds that you might consider if you really, really must test private properties and methods.
First, you could look for an environment variable, and export additional properties (by attaching them to the exports).
var model;
module.exports = {
  ...
}
if (process.env.ENV === 'TEST') {
  module.exports.model = model;
}
Second, you can use conventions instead of making things completely private. For example, a common convention is to prefix private entities with _ to signify that they are not part of the public API. This can have varied levels of effectiveness depending on the audience that will consume your API.
Third, you could create accessors for your private variables, and check for the presence of a testing environment in some way (perhaps a global variable, an environment variable, or a variable injected into your module upon instantiation) that only allows access when the test environment is detected.
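A sketch of that third option (the environment check mirrors the first snippet and is an assumption):

var model;
module.exports = {
  ...
};
if (process.env.ENV === 'TEST') {
  // an accessor reads the current value, avoiding a stale snapshot
  module.exports.getModel = function () { return model; };
}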
Fourth, you could allow an "inspector" object to be injected into your module (this object would only be present during testing). I have used this pattern with some success.
module.exports = function(spies) {
  ...
  spies = spies || {};
  var model = spies.model;
  // (whenever the module updates `model`, it also writes the new value
  // back onto `spies.model` so the test can observe it)
  ...
}
...
// instantiate the module in the test
var spies = {};
var mock = new Module(spies);
// do a thing
expect(spies.model).to.eql("foo");
But really, you should reconsider your testing strategy and design.

js - How to decorate/proxy/spy on all functions? For creating a runtime profiler

So I have this decorate function that takes an object and a method-name and wraps it with external logic.
function decorate(object, methodName) {
  var originalMethod = object[methodName];
  object[methodName] = function () {
    // pre-logic
    var retVal = originalMethod.apply(this, arguments);
    // post-logic
    return retVal;
  };
}
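For example, usage might look like this (the object and method name are hypothetical):

var api = {
  fetchUser: function (id) {
    return { id: id };
  }
};
decorate(api, 'fetchUser');
api.fetchUser(42); // pre-logic runs, then the original method, then post-logic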
Now I want to wrap ALL of the functions in my application, i.e.:
All the public functions of the object, recursively
All private-scope functions
All anonymous functions
Anything else I might have forgotten.
My purpose in doing this is to implement a "JS Profiler" that will run alongside my application during automated testing, and output performance data to logs.
I need this for testing purposes, so the solution must have minimal changes to the actual code of my application.
Possible solutions I've considered:
Public methods can easily be traversed and replaced using a recursive object-traversal function (see the sketch below).
Some hack using eval() to get access to private methods.
Ideally, to handle all cases, I could use a proxy HTTP server (Node.js, for example) that transforms each JavaScript file before sending it to the browser. This way my codebase remains clean, while my tests get the necessary instrumentation.
The first two are only partial solutions, and the last one seems like overkill and a potential "bug factory"...
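For reference, a rough sketch of the first approach, built on the decorate function above (it only sees enumerable own properties, and the cycle guard is an assumption about the object graph):

function decorateAll(object, seen) {
  seen = seen || new Set(); // guard against cycles
  if (object === null || typeof object !== 'object' || seen.has(object)) return;
  seen.add(object);
  Object.keys(object).forEach(function (key) {
    if (typeof object[key] === 'function') {
      decorate(object, key);        // wrap public methods
    } else if (typeof object[key] === 'object') {
      decorateAll(object[key], seen); // recurse into nested objects
    }
  });
}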
Does anyone have any other ideas on how to achieve what I need?

Encapsulation in JavaScript with prototypes

Probably many of you have tried to achieve encapsulation in JavaScript. The two methods known to me are:
the first, which I guess is a bit more common:
function myClass(){
  var prv; // ...and all private stuff here
  // we don't use the prototype; everything is created inside the scope
  return {publicFunc: sth};
}
and the second one:
function myClass2(){
  var prv = {/* private stuff here */};
  Object.defineProperty(this, 'prv', {value: prv});
  return {publicFunc: this.someFunc.bind(this)};
}
myClass2.prototype = {
  get prv(){ throw 'class must be created using new keyword'; },
  someFunc: function(){
    console.log(this.prv);
  }
};
Object.freeze(myClass2);
Object.freeze(myClass2.prototype);
So, as the second option is WAY more convenient to me (specifically in my case, as it visually separates construction from workflow), the question is: are there any serious disadvantages / leaks in this case? I know it allows external code to access the arguments of someFunc via
myClass.prototype.someFunc.arguments
but only in the case of sloppily executed callbacks (run synchronously inside the caller chain). Calling them with setTimeout(cb, 0) breaks the chain and prevents access to the arguments, as does just returning a value synchronously. At least as far as I know.
Did I miss anything? It's kind of important as code will be used by external, untrusted user provided code.
I like to wrap my prototypes in a module which returns the object; this way you can use the module's scope for any private variables, protecting consumers of your object from accidentally messing with your private properties.
var MyObject = (function (dependency) {
  // private (static) variables
  var priv1, priv2;

  // constructor
  var module = function () {
    // ...
  };

  // public interfaces
  module.prototype.publicInterface1 = function () {
  };

  module.prototype.publicInterface2 = function () {
  };

  // return the object definition
  return module;
})(dependency);
Then in some other file you can use it like normal:
obj = new MyObject();
Any more 'protecting' of your object is a little overkill for JavaScript imo. If someone wants to extend your object then they probably know what they're doing and you should let them!
As redbmk points out, if you need private instance variables you could use a map with some unique identifier of the object as the key (see the sketch below).
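A sketch of that idea using a WeakMap keyed by the instance (assumes an ES2015+ environment; the Counter example is made up):

var privates = new WeakMap();

function Counter() {
  privates.set(this, { count: 0 }); // per-instance private state
}

Counter.prototype.increment = function () {
  var state = privates.get(this);
  state.count += 1;
  return state.count;
};

var c = new Counter();
c.increment(); // 1, and `count` is unreachable from outside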
So, as second option is WAY more convenient to me (specifically in my case as it visually separates construction from workflow) the question is - are there any serious disadvantages / leaks in this case?
Hm, it doesn't really use the prototype. There's no reason to "encapsulate" anything here, as the prototype methods will only be able to use public properties - just like your untrusted code can access them. A simple
function myClass2(){
  var prv = {/* private stuff here */};
  Object.defineProperty(this, 'prv', {value: prv});
  // optionally bind the publicFunc if you need to
}
myClass2.prototype.publicFunc = function(){
  console.log(this.prv);
};
should suffice. Or use the factory pattern, without any prototypes:
function myClass2(){
  var prv = {/* private stuff here */};
  return {
    prv: prv,
    publicFunc: function(){
      console.log(this.prv); // or even just `prv`?
    }
  };
}
I know it allows external code to access arguments of someFunc by
myClass.prototype.someFunc.arguments
Simply use strict mode; this "feature" is disallowed there.
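A minimal illustration (the function name is made up):

'use strict';
function someFunc() { /* ... */ }
someFunc();
// In strict mode, reading someFunc.arguments throws a TypeError
// instead of exposing the arguments of the last call.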
It's kind of important as code will be used by external, untrusted user provided code.
They will always get your secrets if the code is running in the same environment. Always. You might want to try WebWorkers instead, but notice that they're still CORS-privileged.
As for enforcing encapsulation in a language that doesn't properly support private, protected, and public class members, I say "Meh."
I like the cleanliness of the Foo.prototype = { ... }; syntax. Making methods public also allows you to unit test all the methods in your "class". On top of that, I just simply don't trust JavaScript from a security standpoint. Always have security measures on the server protecting your system.
Go for "ease of programming and testing" and "cleanliness of code." Make it easy to write and maintain, so whichever you feel is easier to write and maintain is the answer.

'variable' : function(req, res){} means?

I am currently starting with Node.js, so this is my first time using JS beyond DOM manipulation.
I came across a piece of code like the one below and I can't understand it. What is happening? Is it a key-value object? Is an anonymous function being passed to 'new'?
module.exports = {
  'new': function (req, res) {
    res.view();
  },

  /**
   * Overrides for the settings in `config/controllers.js`
   * (specific to UserController)
   */
  _config: {}
};
As others have said, this is ultimately just assigning an object to module.exports with two properties. One is another object called _config and the other is a function called new that expects two arguments.
That's the plain JavaScript explanation.
In node.js, you're also seeing a few conventions in play, which I'll describe below.
One convention is module.exports.
This is the object that will be made available when some other code loads this file using require(). It would work something like this:
var m = require('yourmodule.js');
m.new(req, res);
Another convention is the pair of arguments: req, res.
These are usually parameters that represent a request (like an http.IncomingMessage) and a response (like a http.ServerResponse).
Putting it all together, this module is probably defining a Controller that will receive http requests and render them as responses. It currently does this for new, and there are probably routes configured elsewhere that call this method when a user requests something like 'http://server.com/user/new'.
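For instance, with Express-style routing the wiring might look roughly like this (this is an assumption for illustration; the code in the question looks like a Sails.js controller, where routes are configured by convention):

var express = require('express');
var app = express();
var UserController = require('./UserController.js');

// map GET /user/new to the controller's `new` action
app.get('/user/new', UserController['new']);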
Looks like basic JavaScript.
An object named module has a property named exports that is an object.
This object has a property named new whose value is an anonymous function.
In theory you could invoke the method like this:
module.exports.new(someRequest, someResponse);
