How to correctly share dependencies in node - javascript

I have a question about sharing dependencies between Node modules. I have had almost 100 different Stack Overflow questions open, read about DI in Node and about singletons, and nothing really seemed quite right.
As I'm new to Node, I created an Express app (an API) to play around with. As I added more features, the file grew fairly quickly, so I decided to split the routes up with Express's use method:
const express = require('express');
const app = express();
...
app.use('/', require('./routes/index'));
...
At first that looked like a nice approach. But the moment my logging stopped working, it turned into what seems to be an "unsolvable" problem.
Before I split up the routes, I instantiated my logger (winston) inside my app file and used it inside my routes:
const winston = require('winston');
...
const logger = winston.createLogger({...});
...
app.get('/some-route', (req, res) => {
  logger.info('Accessed some-route');
  ...
});
Since my routes are now separated, this approach does not work anymore.
After googling (a lot), I came across some solutions:
Depend on Node's module caching:
I created a module that instantiates my logger and exports it. Now every time I require that logger file, it should theoretically use the same instance, since Node caches the module. This works quite well, but seems absolutely wrong. Should you really depend on a cache? And since the cache keys are case-sensitive, I can't expect everyone to require the file consistently.
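Something like this is what I mean (a minimal sketch; the file name and transport are just examples):
logger.js
// instantiated once, then served from Node's module cache on every subsequent require
const winston = require('winston');

const logger = winston.createLogger({
  transports: [new winston.transports.Console()],
});

module.exports = logger;
routes/index.js
// every file that requires it gets the same instance
const logger = require('../logger');
logger.info('Accessed some-route');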
Globals
In the past I learned PHP as my first programming language. Even back then, using global variables was not advised.
When I first used JavaScript in the browser, wrapping "modules" in anonymous functions was also preferred, to avoid polluting the global namespace and to create self-contained features.
I don't think using globals is the way to go.
Using handlers
Using special handler factories to pass the dependencies in. But this is tedious and produces a lot of overhead.
index-handler.js
module.exports.getIndex = function(logger) {
  return function(req, res, next) {
    logger.info('Do some logging');
    ...
  }
}
app.js
...
// instantiate the logger as above
const logger = ...
...
const indexHandler = require('./handlers/index');
app.use('/', indexHandler.getIndex(logger));
Dependency injection container libraries
From everything I read, every module in Node should only take care of its own dependencies. Many posts said that using singletons or DI isn't necessary; in fact, it would go against the way Node works.
There were a few more approaches, but those are the most common when searching.
Let me give another example to clarify my problem:
My app uses a configuration file (settings.json). I normally required it like this:
let config;
try {
  config = require("./settings.json");
} catch (err) {
  console.error(
    `Application could not start. Ensure you created a settings file`
  );
  process.exit(1);
}
Since the config variable was available in the app file, every route could use it. After splitting up the routes, this does not work anymore. Even if I could require the config file in every module, I don't think it should be done that way. Or am I wrong?
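For illustration, wrapping it in its own module (the same caching approach as the logger; the file name is made up) would look something like this:
config.js
// loads settings.json once and exports the parsed object
let config;
try {
  config = require('./settings.json');
} catch (err) {
  console.error(`Application could not start. Ensure you created a settings file`);
  process.exit(1);
}
module.exports = config;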
This question really bugs me, and finding no clear solution is frustrating. I hope someone can give me a push in the right direction, or at least provide me with some more information that I can use.
Thanks in advance

Related

Dependency injection using Tsyringe for multiple implementations of an interface in TypeScript

Context
I am currently working on a TypeScript Lambda project where we are planning to refactor our code to use dependency injection via the Tsyringe library. We have a typical MVC structure, except that instead of a repo/database layer we have a proxy layer which calls a third-party service over a REST API to fetch the required data.
The catch is that the proxy layer has a single interface with multiple implementations, one of which needs to be injected depending on a business decision. For example, AuthProxy is an interface containing a login method, and it has two different implementation classes, KeycloakAuthProxyImpl and AuthZeroAuthProxyImpl. These two implementations live in two separate folders, say AuthZero and KeyCloak, and while building we pass an argument like --folderName so that only one implementation is available at runtime for dependency injection.
The problem
The problem we are facing with Tsyringe (I have evaluated some other libraries too) is that interface-based dependency injection needs explicit token-based registration with the IoC container in the main.ts file (in my case, the handler function file). So, in theory, I should be registering it as follows:
[screenshot of the token registration in main.ts omitted]
But in our case this is not possible. Say we build with --keycloak as the argument; then AuthZeroAuthProxyImpl is excluded at compile time, and hence the registration line that references it (line 14 in the screenshot) breaks at runtime.
We tried to move that dependency registration logic into the corresponding implementation class, so that each implementation class would be self-contained and isolated and there would not be any runtime issues. But then the implementations are not registered for dependency injection at all, and we get an error saying Attempted to resolve unregistered dependency token: "AuthProxy". This is expected, given how JavaScript loads files.
The KeycloakAuthProxyImpl class:
[screenshot omitted]
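Roughly, the self-registering idea looks like this (a simplified plain-JavaScript sketch, not our actual code; the file name is made up):
keycloakAuthProxyImpl.js
require('reflect-metadata'); // tsyringe needs the reflect polyfill loaded once
const { container } = require('tsyringe');

class KeycloakAuthProxyImpl {
  login(credentials) {
    // call Keycloak here
  }
}

// this registration only runs if some other file actually loads this module;
// if nothing imports it, the "AuthProxy" token is never registered
container.register('AuthProxy', { useClass: KeycloakAuthProxyImpl });

module.exports = { KeycloakAuthProxyImpl };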
We even tried the @registry decorator (which can be seen commented out in the screenshots), but it didn't make any difference.
Even though I haven't tried other TypeScript dependency injection libraries, from my limited research most of them follow more or less the same pattern for interface-based dependency injection, and I anticipate the same issue with those as well. Is there any other workaround through which I can resolve this issue, or is it even possible with TypeScript?
PS: I don't have much expertise in JS and TypeScript; the terminology above comes from my experience with Spring and Java. Please excuse any misused JS-specific terms.
Code and project structure: [screenshots omitted]
I had similar problems with tsyringe, and I found a much better approach.
Here is how I would solve this with a different DI lib, iti, which also removes many lines of code:
import { createContainer } from "iti"
import express from "express"
// import other deps

const container = createContainer()
  .add({
    // stand-in for the build-time choice between implementations
    authProxy: () =>
      Math.random() > 0.5
        ? new KeycloakAuthProxyImpl()
        : new AuthZeroAuthProxyImpl(),
  })
  .add((ctx) => ({
    authService: () => new AuthServiceV3(ctx.authProxy),
  }))
  .add((ctx) => ({
    server: () => {
      const app = express()
      app.all("/login", async (req, res) => handler(req, ctx.authService))
      return app
    },
  }))

// start your server / lambda
const server = container.get("server")
server.listen(3000)
I've also refactored other parts of the app, got rid of the singletons, and made the code, IMO, a bit simpler to read.
I've created an interactive playground with a mock of your app:
https://stackblitz.com/edit/json-server-ntssfj?file=index.ts
Here are some links to the lib:
https://github.com/molszanski/iti
https://itijs.org/

How to completely remove a Node.js module with all its functionality, even including setInterval?

I am developing an IDE that enables developers to develop Express.js applications. So developers can create new Express applications. Since resource consumption is vital, I do not want to run each Express app in a separate process; they all run in the main process as Node.js modules. Now the problem is that I have no idea what code developers will write, for example whether they use functions like setInterval or not. Let me explain a little bit more:
Suppose this is the main program:
'use strict'
const mainApp = require('express')();
require('./userModule.js')();
mainApp.delete('/', function(req, res){
  /*
  What should I do here to delete the user code?
  */
  res.status(202).send('deleted');
});
mainApp.listen(8000, () => console.log('Example app listening on port 8000!'))
And this is the code written by the user in userModule.js:
'use strict'
module.exports = function(){
  // code written by user will come below
  setInterval(function(){
    console.log('hello!!!');
  }, 3000);
  // code written by user will come above
}
I already know the following:
1- Deleting the module cache, as explained here. That does not help.
delete require.cache[require.resolve('./userModule.js')];
2- clearInterval when the ID is unknown, as explained here. I do not want to clear all intervals! Moreover, that only covers setInterval; what about other functions?
I am reaching the conclusion that there is no option as long as the module runs in the main process. Am I right?
And don't forget that they may well set global variables by assigning to global (global.foo = 42 or similar). And once a global is created, you have no idea who created it and whether it should be removed.
Yes, you're correct that you can't fully clean up a Node module.
Absolutely, positively, do not run them all in the same process.
Even if you're spawning them as separate processes, unless you absolutely trust each and every author, you can't just run the code they give you. You'll need to do a lot more to protect yourself (chroot jails, that kind of thing). The full list of things you'll need to do to protect yourself is far too broad for an SO question/answer.
But again the basic answer is: Yes, you're correct, you cannot fully clean up a Node module once loaded. Don't load their code into your process.
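A minimal sketch of the separate-process approach, using Node's built-in child_process.fork (and assuming userModule.js runs its code at the top level instead of exporting a function):
main.js
'use strict'
const { fork } = require('child_process');
const mainApp = require('express')();

// the user's code runs in its own process, with its own timers, globals and module cache
let userProcess = fork('./userModule.js');

mainApp.delete('/', function(req, res){
  // killing the process tears down everything the user code created
  userProcess.kill();
  res.status(202).send('deleted');
});

mainApp.listen(8000, () => console.log('Example app listening on port 8000!'));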

What is the reason for adding all modules to the Express index file app.js?

I often see that all the modules used by an application (Express.js) are added at the very beginning of the index file "app.js", like this:
var passport = require('passport');
var LocalStrategy = require('passport-local').Strategy;
var mongo = require('mongodb');
var mongoose = require('mongoose');
And that's it, nothing in "app.js" actually uses them. They are used somewhere else; for example, these modules may also be required in the route file "/routes/login.js", which looks like duplication.
What's the point of adding all modules in "app.js" instead of adding them only where they are really needed? Is it part of a convention, or is there a real need for it?
Most likely people begin writing their project in a single file. They include everything there; then, when they split the code into multiple files, they forget to remove the requires.
The only semi-real benefit I can think of is preloading the modules at the beginning, because technically, when you require a module twice, it resolves to the same object and all of its initialization executes only once. But that's stretching it, really. I think people just forget.

Should I really repeat all these requires in every route module file?

I'm building a larger web app, and have now begun to see the value of splitting my routes into their own files. But when doing this, I notice I have to repeat a lot of requires... Before I started moving the routes out into their own files, I had around 20 required modules in my main app, handling everything from the DB to emails...
Many of these modules are used in most of the routes... which means I have to repeat maybe 15-20 requires in each route module file.
Question: This seems like a lot of repeated code, but maybe this is the right way to do it?
At least official npm modules seem to work this way.
You may write a module (let's say common.js) which requires all of your dependencies and returns them as a single object:
module.exports = {
  http: require('http'),
  request: require('request'),
  async: require('async'),
  someModule: require('./someModule')
};
Then all you have to do is require a single common module:
var common = require('./common');
common.request.get(...);
common.async.parallel(...);
The only inconvenience is that you now have to write common. when you want to access these modules.
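If that prefix bothers you, destructuring removes it; this is just a small variation on the same idea (assuming a reasonably recent Node version):
const { request, async } = require('./common');
request.get('http://example.com', (err, res, body) => { /* ... */ });
async.parallel([/* tasks */], (err, results) => { /* ... */ });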
You could also use global variables, but using globals is bad practice, and it is strongly recommended that you don't:
Why are global variables considered bad practice? (node.js)
Why are globals bad?

Node.js module loading

I am currently building a web app with Node and I am curious as to how Node loads its required files or modules.
I am using Express for the views and server config; however, I am finding that the Node examples (I know Express offers an MVC example) don't really conform to the general MVC pattern. I am also aware that Node is not necessarily suited for MVC, but bear with me, because I like MVC.
If you consider the following route declaration, it would be effective to use this as a controller, since here you can control the request and response logic:
module.exports = function (app) {
  app.get('/', function (req, res) {
    res.render('index', { layout: false });
  });
};
To try to follow an MVC architecture, I have divided the routes up by their relevant paths, in effect creating controllers. However, each separate route file must then contain its own set of required modules. For example:
var mongo = require('mongoskin');
I would then declare the required route files in the app.js or server.js file that holds the server config settings.
I am wondering whether splitting up the routes like this would slow down the application, since I am unaware of how Node loads its modules. If modules were loaded each time they are needed, then surely this implementation would slow it down?
Required modules are only loaded once and then cached, so feel free to break up your app into as many modules as needed to cleanly organize it. If you have 20 files that call require('mongoskin'), that module is still only loaded once.
Quoting from the node.js documentation:
Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.
Multiple calls to require('foo') may not cause the module code to be executed multiple times. This is an important feature. With it, "partially done" objects can be returned, thus allowing transitive dependencies to be loaded even when they would cause cycles.
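A tiny illustration of that behaviour (file names are made up):
answer.js
// this top-level code runs only the first time the module is loaded
console.log('loading answer.js');
module.exports = { value: 42 };
main.js
const a = require('./answer'); // logs 'loading answer.js'
const b = require('./answer'); // served from the cache, nothing is logged
console.log(a === b); // true: both names point to the same object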
