In the case where I'd like to share some information from my index.js file with multiple endpoints in different routes, is simply sticking a custom key onto the Express instance (app) fine? For example, let's say I'm using socket.io and would like multiple endpoints to emit from the instance I created: would just setting app.socketIo = io be fine? If not, is there a better way of doing this?
app.set() and app.get() are designed for this purpose, though some setting names are reserved for special use.
There is also app.locals, which is a shared object that is entirely your own namespace; no conflicts at all.
While you could attach your own properties to the app object directly, that runs the risk of conflicting with future development of Express and is generally not recommended for that reason. I'd suggest using app.locals, which is conflict-free and entirely for your own use.
Values in app.locals are also available to templates through the Express template interface, so that's where you can set application-level variables that should be available to all templates (such as an application name).
Are there any advantages (or disadvantages) to using @nestjs/config instead of dotenv to retrieve env vars? In both cases I could create a class that is responsible for all env vars, but should I?
I know @nestjs/config uses dotenv behind the curtain, but is there any reason to choose one over the other?
The first big advantage is the ability to use Joi, class-validator, or whatever else you want as a schema validator to ensure your env values are correct to begin with, before you try to access them at runtime and get an error. An earlier feedback loop means fewer failures later on.

The other big advantage is the use of DI: it's (usually) easier to mock an env variable's value in your test cases than to assign to process.env itself. There's also a slight speed improvement, since Nest caches the value, so reading it again doesn't hit process.env; but other than that there's not too much to mention. If you don't want to use it, don't feel like you have to. There is also the disadvantage of not being able to use the ConfigService inside a decorator.
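The early-validation and caching ideas can be sketched in plain Node. This is hand-rolled so it runs standalone; with Nest you would let @nestjs/config plus a Joi validationSchema do the equivalent:

```javascript
// Validate env vars once at startup, fail fast with every problem listed,
// then serve reads from a cache instead of process.env.
function loadConfig(env) {
  const errors = [];
  const port = Number(env.PORT);
  if (!Number.isInteger(port) || port <= 0) errors.push('PORT must be a positive integer');
  if (!env.DATABASE_URL) errors.push('DATABASE_URL is required');
  if (errors.length) throw new Error('Invalid config: ' + errors.join('; '));

  const cache = { port, databaseUrl: env.DATABASE_URL };
  return { get: (key) => cache[key] }; // later reads hit the cache, not process.env
}

const config = loadConfig({ PORT: '3000', DATABASE_URL: 'postgres://localhost/app' });
console.log(config.get('port')); // 3000
```

Failing at startup with a full list of problems is the "earlier feedback loop" the answer describes, and the `get` accessor is trivial to swap for a stub in tests.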
My understanding is that @nestjs/config makes it easy to manage your config/env vars as a module in your project, so it can easily be swapped in different places:

e.g. if you need a different set of config for tests, you don't have to actually modify process.env.xxx or use a different .env file.

However, doing that requires all or most of your other services to follow this pattern as well. It isn't so helpful if your other services are pure function exports.
I haven't looked at the source in depth yet, but I was curious whether anyone has needed or made use of this in the past.
It appears that Redux creates a singleton instance of the store and persists it for the lifetime of the caller, but this isn't really tenable for what we're thinking of implementing... what is the simplest approach, if one exists, for extending their API and creating stores with scoped lifetimes?
This is in the context of Redux being used in a purely server-side application (or at least as far as it can tell).
Edit:
Similarly, while you can reference your store instance by importing it
directly, this is not a recommended pattern in Redux. If you create a
store instance and export it from a module, it will become a
singleton. This means it will be harder to isolate a Redux app as
a component of a larger app, if this is ever necessary, or to
enable server rendering, because on the server you want to create
separate store instances for every request.
I found this excerpt in their documentation here.
How does this module-export consideration work in practice? If I only access the store, via externally exposed methods, within the scope where I declared it for a given lifetime, should I expect the instance to be deallocated when that scope closes?
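The pattern the quoted docs recommend is to export a store *factory* rather than a store instance. A minimal hand-rolled `createStore` is used below so the example is self-contained; with real Redux you would call its `createStore`/`configureStore` inside the factory instead:

```javascript
// Tiny stand-in for Redux's createStore, just enough for the demonstration.
function createStore(reducer, initialState) {
  let state = initialState;
  return {
    getState: () => state,
    dispatch: (action) => { state = reducer(state, action); },
  };
}

const counter = (state, action) =>
  action.type === 'INCREMENT' ? state + 1 : state;

// Export a factory, not an instance: each caller (e.g. each server request
// handler) gets its own store, and that store becomes garbage-collectible
// once nothing references it after the request's scope closes.
function makeStore() {
  return createStore(counter, 0);
}

const a = makeStore();
const b = makeStore();
a.dispatch({ type: 'INCREMENT' });
console.log(a.getState()); // 1
console.log(b.getState()); // 0 -- a completely separate instance
```

Because the store is only reachable through the scope that called the factory, ordinary garbage collection reclaims it once that scope ends; nothing Redux-specific keeps it alive.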
I've seen some people use app.locals to store properties that are available across all views.
Other people use global.AnyNameVariable to store anything, even the result of requiring config.js files and the like.
ex:

app.locals.objOne = {
  name: 'John'
};

global.objTwo = {
  name: 'Doe'
};
What is the difference between them? What is the purpose of each, and what is the right way to use both?
As the documentation states:
The app.locals object has properties that are local variables within the application.
This is an application-level container, provided by the framework, that serves to store application settings and the like. There can be more than one Express application, while global is global to the whole process.
app.locals is also available as req.app.locals inside middleware, so your code can be decoupled from a specific app variable.
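The scoping difference can be shown with plain objects standing in for Express apps (no real Express needed for the point; `makeApp` is purely illustrative):

```javascript
// Each "app" carries its own locals container, mimicking Express.
function makeApp(name) {
  return { name, locals: {} };
}

const appOne = makeApp('admin');
const appTwo = makeApp('public');

appOne.locals.objOne = { name: 'John' }; // scoped to appOne only
global.objTwo = { name: 'Doe' };         // visible everywhere in the process

console.log('objOne' in appTwo.locals); // false -- other apps don't see it
console.log(global.objTwo.name);        // "Doe", reachable from any module
```

So app.locals keeps data scoped to one application instance, while anything on global leaks across every app and module in the process.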
I'm studying how the Node.js module system works.
I've found this literature so far:
https://nodejs.org/api/modules.html
http://fredkschott.com/post/2014/06/require-and-the-module-system/
http://www.bennadel.com/blog/2169-where-does-node-js-and-require-look-for-modules.htm
It has helped me understand a few points, but I still have these questions:
If I have an expensive resource (let's say a database connection pool) inside a module, how can I make sure that requiring the module again does not re-evaluate the resource?
How can I dynamically unload a disposable resource once I have already called require on it?
This is important because some scenarios demand that I have a single instance of the database pool. Because of this, I'm exporting modules that receive parameters instead of just requiring the expensive resource directly.
Any guidance is much appreciated.
Alon Salont has written an EXCELLENT guide to understanding exports in NodeJS (which is what you're accessing when you call require()) here:
http://bites.goodeggs.com/posts/export-this/#singleton
If you study the list of options for what a module can export, you'll see that the answer to your question depends on how the module is written. When you call require, Node.js will look for the module in its cache and return it if it was already loaded elsewhere.
That means if you choose to export a singleton, are monkey-patching, or are creating global objects (I recommend only the first in your case), only ONE shared object will be created and will exist. Singleton patterns are good for things like database connections that you want shared by many modules. Although some argue that "injecting" these dependencies via a parent/caller is better, this is a philosophical view not shared by all, and singletons are widely used by software developers for shared-service tasks like this.
If you export a function, typically a constructor, require() will STILL return just one shared reference to that. However, in this case, the reference is to a function, not something the function returns. require() doesn't actually CALL the function for you, it just gives you a reference to it. To do any real work here, you must now call the function, and that means each module that requires this thing will have its own instance of whatever class the module provides. This pattern is the more traditional one where a class instance is desired / is the goal. Most NPM modules fit this category, although that doesn't mean a singleton is a bad idea in your case.
Take the following for example:
// Load Required Modules
var Express = require('express'),          // Express Framework
    Session = require('express-session'),  // Express Sessions
    App     = Express(),                   // Express Application
    HTTP    = require('http').Server(App), // HTTP Server
    Parser  = require('body-parser'),      // Request Body Parser
    Moment  = require('moment');           // Date Manipulation And Formatting

var Application = {};
Application.ExpressFramework = Express;
Application.Express = App;
Application.Parser = Parser;
Application.Moment = Moment;

// Load Internal Modules
Application.Services = require('./modules/services')(Application);
Application.Models = require('./models')(Application);
// etc.
// etc.

Application.Services.Start();
As you can see, everything is loaded into one variable, which is then passed around to all the modules so they can access everything.
Is this bad practice, and if so, why? (Not looking for personal opinions, as per Stack Overflow rules; just wondering whether this would cause negative performance.)
In general, if you want to pass a group of variables around to a bunch of different modules, it is natural to group them into an object so you can pass just that single object rather than a long list of individual variables. It is also much more extensible.
The node.js module pattern also supports the exports mechanism as one way of doing just this, but it is more of a "pull" model, where another module does a require on you and fetches your exports.
The code you show is more of a "push" model, where you load another module and then pass your variables to it; that is another reasonable way to do things and sometimes works out better logically.
But ... some of the things you are sharing and the way you are sharing them does not seem ideal to me.
In the particular example you show, it is a little odd to be sharing some of the things you are sharing. For example, there's no need to share the Moment or Parser variable. If another module needs one of those modules, it should just do its own require('moment'); and get the module that way.
That makes your other module more stand-alone (it requires the things that it needs). The require subsystem caches modules so it is not physically reloading the module when you do a 2nd require() on the same module.
There are occasionally reasons to share the App or HTTP variables from your code because those are instances of something that you don't want multiple instances of, not modules. I would probably not put these in a single large object, but rather I'd pass them only to the modules that actually need them. When you put lots of things in an object and pass the whole object to several other modules (each of which only use some of the things in the object), it makes it very difficult to tell what really depends upon what. If, on the other hand, you pass another module only what it needs, then the code is very clear about what is really being used by the other module.