I'm building a larger web app, and have now begun to realise the value of splitting my routes into their own files. But when doing this I notice I have to repeat a lot of requirements... Before I started to move the routes out into their own files I had around 20 required modules in my main app, handling everything from DB to emails...
Many of these modules are used in most of the routes... which means I have to repeat maybe 15-20 requirements in each route module file.
Question: This seems like a lot of repeated code, but maybe this is the right way to do it?
At least official NPM modules seem to work this way.
You may write a module (let's say, common.js) which requires all of your dependencies and returns a single object:
module.exports = {
http: require('http'),
request: require('request'),
async: require('async'),
someModule: require('./someModule')
};
Then all you have to do is require a single common module:
var common = require('./common');
common.request.get(...);
common.async.parallel(...);
The only inconvenience is that you now have to write the common. prefix when you want to access these modules.
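If the prefix bothers you, and you're on a Node version with ES6 support, you can destructure just the modules a given file needs (the same pattern, slightly less typing):

const { request, async } = require('./common');
request.get(...);
async.parallel(...);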
You could also use global variables, but globals are a bad practice, and it's strongly recommended that you don't use them:
Why are global variables considered bad practice? (node.js)
Why are globals bad?
I have a question about sharing dependencies between Node modules. I had almost 100 different Stack Overflow questions open, read about DI in Node and singletons, and nothing really seemed quite right.
As I'm new to Node, I created an Express app (an API) to play around with. With more features the file grew fairly quickly in size, so I decided to split the routes up with Express's use method:
const express = require('express');
const app = express();
...
app.use('/', require('./routes/index'));
...
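For reference, such a route module looks roughly like this (a minimal sketch; the file and route names are just for illustration):

// routes/index.js
const express = require('express');
const router = express.Router();

router.get('/', (req, res) => {
  res.send('index');
});

module.exports = router;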
At first that looked like a nice approach. But the moment my logging stopped working, it became what seems to be an "unsolvable" problem.
Before I split up the routes, I instantiated my logger (winston) inside my app file and used it inside my routes:
const winston = require('winston');
...
const logger = winston.createLogger({...});
...
app.get('/some-route', (req, res) => {
logger.info('Accessed some-route');
...
});
Since my routes are now separated, this approach does not work anymore.
After googling (a lot), I came across some solutions:
Depend on nodes module caching:
I created a module that instantiates my logger and exports it. Now every time I require the logger file, I should theoretically get the same instance, since Node caches the module. This works quite fine, but seems absolutely wrong. Should you really depend on a cache? Since the module cache is case-sensitive too, I can't expect everyone to use it correctly.
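For clarity, that logger module looks roughly like this (a minimal sketch; the transport is just an example):

// logger.js
const winston = require('winston');

module.exports = winston.createLogger({
  transports: [new winston.transports.Console()]
});

// routes/index.js
const logger = require('../logger'); // the same cached instance everywhere
logger.info('Accessed some-route');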
Globals
In the past I learned PHP as my first programming language. Even back then, using global variables was discouraged.
When I first used JavaScript in the browser, encapsulating "modules" in anonymous functions was also preferred, to avoid polluting the global namespace and to create self-contained features.
I don't think using globals is the way to go.
Using handlers
Using special handlers to pass dependencies in. But this is tedious and produces a lot of overhead.
index-handler.js
module.exports.getIndex = function (logger) {
  return function (req, res, next) {
    logger.info('Do some logging');
    ...
  };
};
app.js
...
// instantiate the logger as above
const logger = ...
...
const indexHandler = require('./index-handler');
app.use('/', indexHandler.getIndex(logger));
Dependency injection (DI) container libraries
From everything I read, every module in Node should only depend on itself. Many posts said that using singletons or DI isn't necessary; in fact, it would go against the way Node works.
There were a few more approaches, but those are the most common when searching.
I want to give another example to clarify my problem:
My app uses a configuration file (settings.json). I normally required it like this:
let config;
try {
config = require("./settings.json");
} catch (err) {
console.error(
`Application could not start. Ensure you created a settings file`
);
process.exit(1);
}
Since the config variable was available, every route could use it. After splitting up the routes, this no longer works. Even if I could require the config file in every module, I don't think it should be done this way. Or am I wrong?
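If requiring it everywhere were acceptable, I would at least wrap the loading in its own module so the try/catch isn't repeated (again leaning on the module cache):

// config.js
let config;
try {
  config = require('./settings.json');
} catch (err) {
  console.error('Application could not start. Ensure you created a settings file');
  process.exit(1);
}
module.exports = config;

// any route module
const config = require('../config'); // the file is only read and parsed once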
This question really bugs me. Finding no clear solution is frustrating. I hope someone can give me a push in the right direction, or at least provide me with some more information that I can use.
Thanks in advance
When disabling globals, the docs suggest using the following alternatives:
_ = require('lodash')
myService = sails.services.myservice
myModel = sails.models.mymodel
sails = req._sails
Would there be any issue requiring "sails", "services", and "models" much like any other module?
Having tried it, it does appear to work; however, I feel I might be missing something.
Using require for services is always valid; the globalizing is merely for convenience.
On the other hand, doing require('api/models/User.js') will almost certainly not give you what you want, since those files are used by Sails to build model classes. So the only way to reliably use models in Sails without globals turned on is via sails.models.
Finally, while require('sails') will usually give you a reference to the running Sails app, it's not recommended that you use it that way. If you were running multiple Sails apps in the same process (which you might do in automated tests) then it would not reliably return the correct app. You're much better off using req._sails in controllers, and this.sails in models and services.
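For example, inside a controller action the non-global lookups look like this (a sketch; the user model is hypothetical):

// api/controllers/UserController.js
module.exports = {
  find: function (req, res) {
    var sails = req._sails;        // reference to the running app
    var User = sails.models.user;  // model identities are lowercased
    User.find().exec(function (err, users) {
      if (err) { return res.serverError(err); }
      return res.json(users);
    });
  }
};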
I'd like to use something like requireJS to modularise my javascript app, however, there's one particular requirement I'd like to meet that I haven't figured out:
I'm happy enough to have require and define in the global space, however, I don't want modules that I import with require to be accessible globally (i.e. they should be accessible to my app, but not other apps running on the same page).
It seems to me that if I call define('moduleA', function() { ... }); I can then access that module - globally - via the require function. It may not be occupying a variable in the global space itself, or be attached to window, but it still feels bad, because other apps really shouldn't be able to see my internal modules (not to mention potential naming conflicts, etc.; can I use contexts to circumvent this?).
This seems to be a step back from just namespacing my modules and including them all inside of one big privatising function at build time.
I could have my own private version of require but my modules (being in different files) wouldn't be able to access define.
Am I missing something or do I just have to live with this? (or alternatively, run an optimizer to bake everything into one file - feels like I could just namespace my modules and not bother with requireJS at all if I do that though).
In your r.js build config add wrap: true and you will no longer pollute the global scope with the require or define functions.
Edit:
You can also specify a namespace for your require and define with the namespace setting in the r.js build config. You can read more about namespacing in RequireJS's FAQ. From the example.build.js comments:
// Allows namespacing requirejs, require and define calls to a new name.
// This allows stronger assurances of getting a module space that will
// not interfere with others using a define/require AMD-based module
// system. The example below will rename define() calls to foo.define().
// See http://requirejs.org/docs/faq-advanced.html#rename for a more
// complete example.
namespace: 'foo',
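Putting the two together, a minimal r.js build config might look like this (file names are just examples):

// build.js
({
  baseUrl: '.',
  name: 'main',
  out: 'dist/main.js',
  wrap: true,       // wraps the output in an IIFE
  namespace: 'foo'  // define() calls become foo.define(), etc.
})

Run it with node r.js -o build.js.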
I am currently building a web app with Node and I am curious as to how Node loads its required files or modules.
I am using express for view and server config, however I am finding that all the Node examples (I know express offers an MVC example) don't really conform to the general MVC pattern. I am also aware that Node is not necessarily suited for MVC, but bear with me, because I like MVC.
If you consider the following route declaration, it would be effective to use this as a controller, since here you can control the request and response logic:
module.exports = function (app) {
  app.get('/', function (req, res) {
    res.render('index', { layout: false });
  });
};
To try and follow an MVC architecture I have effectively divided the routes up into their relevant paths, in effect creating controllers. However, every route file must contain its own set of required modules. For example:
var mongo = require('mongoskin');
I would then declare the required route files in the app.js or server.js file that holds the server config settings.
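That wiring looks something like this (the route file names are hypothetical):

// app.js
var express = require('express');
var app = express();

require('./routes/home')(app);   // each route module exports function (app) { ... }
require('./routes/users')(app);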
I am wondering whether splitting up the routes like this would slow down the application, since I am unaware of how Node loads its modules. If modules are loaded each time they are needed, then surely this implementation must slow it down?
Required modules are only loaded once and then cached, so feel free to break up your app into as many modules as needed to cleanly organize it. If you have 20 files that call require('mongoskin'), that module is still only loaded once.
Quoting from the node.js documentation:
Modules are cached after the first time they are loaded. This means
(among other things) that every call to require('foo') will get
exactly the same object returned, if it would resolve to the same
file.
Multiple calls to require('foo') may not cause the module code to be
executed multiple times. This is an important feature. With it,
"partially done" objects can be returned, thus allowing transitive
dependencies to be loaded even when they would cause cycles.
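You can see the caching in action with a tiny experiment (file names are just for illustration):

// counter.js
module.exports = { count: 0 };

// a.js
require('./counter').count++;

// main.js
require('./a');
console.log(require('./counter').count); // 1: both requires got the same cached object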
I'm working on a project with Node.js, and the server-side code has become large enough that I would like to split it into multiple files. It appears this has been done client-side for ages: development is done by inserting a script tag for each file, and only for distribution is something like "Make" used to put everything together. I realize there's no point in concatenating all the server-side code, so I'm not asking how to do that. The closest thing I can find is require(), however it doesn't behave quite like script tags do in the browser, in that require'd files do not share a common namespace.
Looking at some older Node.js projects, like Shooter, it appears this was once not the case, or I'm missing something really simple in my code. My require'd files cannot access the global calling namespace at compile time or run time. Is there any simple way around this, or are we forced to make all our require'd JS files completely autonomous from the calling scope?
You do not want a common namespace, because globals are evil. In Node we define modules:
// someThings.js
(function() {
var someThings = ...;
...
module.exports.getSomeThings = function() {
return someThings();
}
}());
// main.js
var things = require("someThings");
...
doSomething(things.getSomeThings());
You define a module and then expose a public API for your module by writing to exports.
The best way to handle this is dependency injection. Your module exposes an init function and you pass an object hash of dependencies into your module.
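A minimal sketch of that init pattern (the logger and db dependencies here are hypothetical):

// someThings.js: exposes init() and uses whatever dependencies were injected
var deps;

module.exports.init = function (dependencies) {
  deps = dependencies; // e.g. { logger: ..., db: ... }
};

module.exports.getSomeThings = function () {
  deps.logger.log('fetching things');
  return deps.db.getThings(); // db.getThings() is a hypothetical API
};

// main.js
var someThings = require('./someThings');
someThings.init({ logger: console, db: require('./db') }); // './db' is hypothetical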
If you really insist on accessing the global scope then you can do so through global. Every file can read from and write to the global object. Again, you do not want to use globals.
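For completeness, this is what that looks like (again, not recommended):

// a.js
global.config = { port: 3000 };

// b.js, any other file in the same process
console.log(global.config.port); // 3000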
Re @Raynos's answer: if the module file is next to the file that includes it, it should be
var things = require("./someThings");
If the module is published on, and installed through, npm, or explicitly put into the ./node_modules/ folder, then the
var things = require("someThings");
is correct.