Node.js module loading - javascript

I am currently building a web app with Node and I am curious as to how Node loads its required files or modules.
I am using Express for views and server config; however, I am finding that the Node examples (I know Express offers an MVC example) don't really conform to the general MVC pattern. I am also aware that Node is not necessarily suited to MVC, but bear with me, because I like MVC.
If you consider the following route declaration, it would be effective to use this as a controller, since here you can control the request and response logic:
module.exports = function (app) {
  app.get('/', function (req, res) {
    res.render('index', { layout: false });
  });
};
To try and follow an MVC architecture I have divided the routes up into their relevant paths, in effect creating controllers. However, whenever I add a different route file it must contain its own set of required modules. For example:
var mongo = require('mongoskin');
I would then declare the required route files in the app.js or server.js file that holds the server config settings.
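To make the wiring concrete, here is a rough sketch of the setup described above. A minimal stand-in object replaces the Express app so the sketch runs without Express installed, and the file name routes/home.js is an assumption:

```javascript
// Minimal stand-in for the Express app (so this runs without express);
// it just records which paths get registered.
const registered = [];
const app = {
  get: function (path, handler) {
    registered.push(path);
  }
};

// In the real app this function would live in e.g. routes/home.js and be
// pulled in from app.js with: require('./routes/home')(app);
const homeRoutes = function (app) {
  app.get('/', function (req, res) {
    // res.render('index', { layout: false });
  });
};

homeRoutes(app);
console.log(registered); // [ '/' ]
```

Each route file exporting a `function (app) {...}` keeps the controller logic out of app.js while still registering against the one shared app instance.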
I am wondering whether splitting up the routes like this will slow down the application, since I am unaware of how Node loads its modules. If it loads them on demand, then surely this implementation must slow it down?

Required modules are only loaded once and then cached, so feel free to break your app up into as many modules as needed to organize it cleanly. If you have 20 files that call require('mongoskin'), that module is still only loaded once.

Quoting from the node.js documentation:
Modules are cached after the first time they are loaded. This means
(among other things) that every call to require('foo') will get
exactly the same object returned, if it would resolve to the same
file.
Multiple calls to require('foo') may not cause the module code to be
executed multiple times. This is an important feature. With it,
"partially done" objects can be returned, thus allowing transitive
dependencies to be loaded even when they would cause cycles.

Related

How to correctly share dependencies in node

I have a question about sharing dependencies between Node modules. I had almost 100 different Stack Overflow questions open, read about DI in Node and about singletons, and nothing really seemed quite right.
As I'm new to Node, I created an Express app (an API) to play around with. As features were added, the file grew fairly quickly in size, so I decided to split the routes up with the Express use method:
const express = require('express');
const app = express();
...
app.use('/', require('./routes/index'));
...
At first that looked like a nice approach, but the moment my logging stopped working it became what seems to be an "unsolvable" problem.
Before I split up the routes, I instantiated my logger (winston) inside my app file and used it inside my routes:
const winston = require('winston');
...
const logger = winston.createLogger({...});
...
app.get('/some-route', (req, res) => {
  logger.info('Accessed some-route');
  ...
});
Since my routes are now separated, this approach does not work anymore.
After googling (a lot), I came across some solutions:
Depend on Node's module caching:
I created a module that instantiates my logger and exports it. Now every time I require the logger file it should, in theory, use the same instance, since Node caches the module. This works quite well, but it seems wrong. Should you really depend on a cache? Since the caching is case-sensitive, too, I can't expect everyone to use it correctly.
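For reference, the singleton-via-cache pattern described here looks roughly like the following. An array-backed logger stands in for winston, and everything is inlined so the sketch runs standalone; in the real app the first part would live in its own logger.js:

```javascript
// logger.js in the real app: create the instance once, export it.
// (array-backed stand-in for winston.createLogger, so this runs anywhere)
const logger = {
  entries: [],
  info(msg) {
    this.entries.push(msg);
  }
};
// module.exports = logger;  // every require('./logger') returns this object

// Two separate route files requiring './logger' would both receive the
// same cached instance, so all log entries end up in one place:
const loggerInRouteA = logger; // stands in for require('./logger')
const loggerInRouteB = logger;

loggerInRouteA.info('Accessed some-route');
console.log(loggerInRouteB.entries); // [ 'Accessed some-route' ]
```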
Globals
In the past I learned PHP as my first programming language. Even back then, using global variables wasn't advised.
When I first used JavaScript in the browser, encapsulating "modules" with anonymous functions was also preferred, so as not to pollute the global namespace and to create encapsulated features.
I don't think using globals is the way to go.
Using handlers
Using special handlers to pass dependencies in. But this is tedious and adds a lot of overhead.
index-handler.js
module.exports.getIndex = function (logger) {
  return function (req, res, next) {
    logger.info('Do some logging');
    ...
  };
};
app.js
...
// instantiate the logger as above
const logger = ...
...
const indexHandler = require('handlers/index');
app.use('/', indexHandler.getIndex(logger));
Dependency injection container libraries
After all I have read, every module in Node should only depend on itself. Many posts said that using singletons or DI isn't necessary; in fact, it would go against the way Node works.
There were a few more approaches, but those are the most common when searching.
Let me give another example to clarify my problem:
My app uses a configuration file (settings.json). I normally required it like this:
let config;
try {
  config = require("./settings.json");
} catch (err) {
  console.error(
    `Application could not start. Ensure you created a settings file`
  );
  process.exit(1);
}
Since the config variable was available, every route could use it. After splitting up the routes, this does not work anymore. Even though I could require the config file in every module, I don't think it should be done this way. Or am I wrong?
This question really bugs me, and finding no clear solution is frustrating. I hope someone can give me a push in the right direction, or at least provide me with some more information that I can use.
Thanks in advance.

dynamically load modules in Meteor node.js

I'm trying to load multiple modules on the fly via chokidar (watchdog) using Meteor 1.6 beta; however, after doing extensive research on the matter, I just can't seem to get it to work.
From what I gather, require by design will not take in anything other than static strings, i.e.
require("test/string/here")
Since if I try:
var path = "test/string/here";
require(path);
I just get Error: Cannot find module, even though the strings are identical.
Now the thing is, I'm uncertain how to go about this: am I really forced to use either import or static strings when using Meteor, or is there some workaround?
watchdog(cmddir, (dir) => {
  match = "." + regex_cmd.exec(dir);
  match = dir;
  loader.emit("loadcommand", match);
});

loader.on('loadcommand', (file) => {
  require(file);
});
There is something intrinsically weird in what you describe.
chokidar is used to watch actual files and folders.
But Meteor compiles and bundles your code, resulting in an app folder after build that is totally different from your project structure.
Although Meteor now supports dynamic imports, the mechanism is internal to Meteor and does not rely on your actual project files, but on Meteor built ones.
If you want to dynamically require files as in Node, including with dynamically generated module paths, you should avoid import and require statements, which are automatically replaced by Meteor's built-in import mechanism. Instead you would have to write your own loading function, taking into account that your app's built folder is different from your project folder.
That may work for example if your server is watching files and/or folders in a static location, different from where your app will be running.
In the end, I feel this is a sort of XY problem: you have not described your actual objective, and the issue above is an attempt at a solution that does not seem to fit how Meteor works, hence may not be the most appropriate one for your implicit objective.
@Sashko does a great job of explaining Meteor's dynamic imports here. There are also docs.
A dynamic import is a function that returns a promise instead of just importing statically at build time. Example:
import('./component').then((MyComponent) => {
  render(MyComponent);
});
The promise resolves once the module has been loaded. If you try to load the module repeatedly, it only gets loaded once and is immediately available on subsequent requests.
afaict you can use a variable for the string to import.

Should I really repeat all these requirements in every route module file?

I'm building a larger web app, and have now begun to realise the value of modularising my routes into their own files. But when doing this I notice I have to repeat a lot of requirements... Before I started to move the routes out into their own files, I had around 20 required modules in my main app, handling everything from the DB to emails...
Many of these modules are used in most of the routes... which means I have to repeat maybe 15-20 requirements in each route module file.
Question: This seems like a lot of repeated code, but maybe this is the right way to do it?
At least official NPM modules seem to work this way.
You may write a module (let's say common.js) that requires all of your dependencies and exports them as a single object:
module.exports = {
  http: require('http'),
  request: require('request'),
  async: require('async'),
  someModule: require('./someModule')
};
Then all you have to do is require a single common module:
var common = require('./common');
common.request.get(...);
common.async.parallel(...);
The only inconvenience is that you now have to write common. when you want to access these modules.
You could also use global variables, but using globals is bad practice, and it's strongly recommended that you don't:
Why are global variables considered bad practice? (node.js)
Why are globals bad?

Exposing application scripts to certain scripts only

Uhh, it's hard to come up with the right title for this problem, excuse me.
In a Backbone.js application I am building, the models, views, and templates are all in separate JavaScript and HTML files. I want to export the models, views, and templates to the application bootstrapper file (app.js) without polluting the global namespace, i.e. without doing window.App.Model = myModel;. By export I mean make the code inside the files available to app.js for initialization and running.
How do I go about doing this?
Are there any patterns that will solve the problem? Could you provide an example?
Description
In cases where models, views, and templates are split into many disparate files, the application bootstrapper file app.js needs some means of accessing these M, V, C components. Hence the common approach is to do the following inside the model.js file:
window.App.Model.PersonModel = Backbone.Model.extend({});
App.js
var instance = new window.App.Model.PersonModel();
var personView = new window.App.Views.PersonView({model:instance});
Finally, you see that everything derives from the global object App, which I think is an unsafe, improper, and weak way to build application dependencies.
Suggestions
In addition to the above question, could someone suggest a template loading library (JavaScript templates, regardless of the engine used) that can be used to load the templates?
Take a look at RequireJS, which supports asynchronous module definition/loading. You would have to rewrite your modules and app.js to satisfy the AMD API, but it would take only a few lines of code.
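To make the suggestion concrete, here is a rough sketch of the PersonModel rewritten in AMD style. The module names are assumptions, and a tiny define stub plus a Backbone stand-in are included only so the sketch runs outside a browser; in a real app RequireJS provides define and resolves 'backbone' to the actual library:

```javascript
// Minimal stand-ins so the sketch runs without RequireJS or a browser:
const registry = {};
function define(name, deps, factory) {
  registry[name] = factory.apply(null, deps.map(function (d) {
    return registry[d];
  }));
}
registry['backbone'] = {
  Model: {
    extend: function (proto) {
      return function PersonModel() {};
    }
  }
};

// models/person.js: return the model from the factory instead of
// assigning to window.App.Model.PersonModel:
define('models/person', ['backbone'], function (Backbone) {
  return Backbone.Model.extend({});
});

// app.js: declare the dependency instead of reading a global:
define('app', ['models/person'], function (PersonModel) {
  return new PersonModel();
});

console.log(typeof registry['models/person']); // 'function'
```

Nothing is ever attached to window; each file's exports travel only through the dependency declarations.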

RequireJS loads resources I don't want to

I'm using RequireJS in my app, but don't quite understand all aspects of how it works.
I have a main.js file where the dependencies are described. I have a Backbone.Router component up and running, which triggers different application classes (classes responsible for creating the main view). Some code you can see here.
What I see with RequireJS: even if some view has not yet been 'required' (meaning an explicit call to require('./subviews/view')), it is still loaded, and it in turn loads all its templates (I use the RequireJS text plugin). If I add a new application whose subviews are not ready, then even though I never use the application, the nonexistent subviews are still loaded and I get 404 errors.
Not sure I explained everything clearly, but I hope you get the point.
Looks like you are using the CommonJS sugared form of define(). Note that this is just a convenience for wrapping Node/CommonJS code, but AMD modules do not operate like Node modules.
An AMD loader will scan for require('') calls in the factory function of the module and make sure to load all of those modules. Otherwise, synchronous access to such a module, via require('./apps/DashboardApp');, would fail in the browser, since file IO there is by default async network IO.
If you want to delay loading of some scripts, then you need to use the callback-form of require:
require(['./apps/DashboardApp'], function (DashboardApp) {
});
but this call is an async call, so you would have to adjust your module's public API accordingly.
So basically, if you want to do on-demand loading of a dependency, the callback form of require is needed, given the async nature of file IO in the browser.
Because RequireJS loads all required dependencies. Taking a quick look at your code, I see that you load the routing module, and routing has:
var ViewManager = require('ViewManager');
That means it will load ViewManager, the dependencies specified by ViewManager, and any other dependencies those modules need. Essentially, when you include require(...), it's the same as specifying a dependency. RequireJS will transform this into:
define(['ViewManager'], ...)
