From my understanding, webpack (and other similar bundlers) ensures that if the same module is required across multiple parts of the app:
That code will only be loaded once
Each time the same module is required, a new instance is created, rather than all sharing the same scope
Firstly, are my above assumptions correct? If so, is it a bad optimisation to have multiple instances of the same module created?
In my example I am creating an app that will be using ThreeJS. This is a rather large library. Many of the modules in my app will want to require this library.
Is it bad practice to keep on requiring a library like this? Or should I be passing a single instance from module to module instead of requiring multiple times?
I'd be interested to know if there are any common patterns for dealing with this, if it is indeed an issue.
This assumption:
Each time the same module is required, a new instance is created,
rather than all sharing the same scope
is wrong for two reasons:
Whether an instance is created or not will depend on what is returned by the required module (it could return nothing).
Even if the module initialises an instance, only one will be created, as the code in the module is only run once.
With that said, requiring the same module multiple times should have no impact on performance.
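To illustrate, here is a minimal sketch (file names are hypothetical) showing that a CommonJS module body runs only once and that every require() call receives the same exports object:

// shared.js -- this body is evaluated only the first time it is required
console.log('initialising shared module');
module.exports = { createdAt: Date.now() };

// a.js
const shared = require('./shared'); // logs 'initialising shared module'

// b.js
const shared = require('./shared'); // served from the cache: no log, same object

The same holds conceptually for webpack: each bundled module's factory function runs once and its exports are reused by every importer.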
Related
I'm wondering what the scope of a module is in both browsers and Node. I'm specifically trying to understand whether a module-level variable is created once per app, or many times. Is a new instance of the module created on each import, or is it the exact same module shared across all imports? Many thanks!
There are several JS module flavours - ESM, CommonJS, AMD. Their common trait is that they are evaluated once on first import, at least under normal circumstances. Doing the opposite would make them inefficient for sharing data.
Exporting a class instance is a common way to share one instance across the application without making the class a singleton.
The ways a module can end up being evaluated multiple times (intentionally or not) include having several copies of the module installed, importing it with different filename casing on a case-insensitive filesystem, and modifying the Node module cache.
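For example, a minimal sketch of that instance-exporting pattern (names are illustrative):

// counter-service.js -- export the instance, not the class
class CounterService {
  constructor() { this.count = 0; }
  increment() { return ++this.count; }
}
module.exports = new CounterService();

// a.js
const counter = require('./counter-service');
counter.increment(); // 1

// b.js
const counter = require('./counter-service');
counter.increment(); // 2 -- the same object a.js already incremented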
I'm looking to set up a file structure such that I have a 'modules' file which exports a bunch of... modules.
I know that in general, any required module is cached and so not read/executed again after its first import, but I'm wondering if this holds true for a chain of imports/exports, where the root import has been exported as a defined constant.
e.g.
// modules.js
const fs = require('fs');
module.exports = {
fs
};
------------------
// file1.js
const { fs } = require('./modules.js');
------------------
// file2.js
const { fs } = require('./modules.js');
Do file1 and file2 receive the same cached copy of fs? This is important because some of my modules require initialization and I don't want this code being executed multiple times.
P.S. The whole point of this is so that I can have a single location to require specific modules from. If there's a better way to do this, please school me!
Thanks in advance.
Do file1 and file2 receive the same cached copy of fs? This is important because some of my modules require initialization and I don't want this code being executed multiple times.
Yes, modules are cached. So, unless you manually reach into the cache and remove a module, an already-initialized module will just be fetched from the cache on the 2nd, 3rd, 4th, etc. time it is loaded. It will only be initialized once. This is a fundamental (and very useful) part of the node.js module design philosophy.
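A small sketch of both halves of that statement (the file name is hypothetical; require.cache and require.resolve are standard Node.js APIs):

// expensive.js
console.log('initialising...');
module.exports = { pool: 'pretend this is a connection pool' };

// main.js
const a = require('./expensive'); // logs 'initialising...'
const b = require('./expensive'); // silent -- returned from require.cache
console.log(a === b);             // true

// only by manually reaching into the cache would the module run again:
delete require.cache[require.resolve('./expensive')];
const c = require('./expensive'); // logs 'initialising...' a second time
console.log(a === c);             // false -- a brand-new exports object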
The whole point of this is so that I can have a single location to require specific modules from. If there's a better way to do this, please school me!
I would strongly discourage you from doing something like what you show in your question. If you need the fs module, then just include the fs module directly. There is no saving in doing it the way you propose, and it only obscures the real dependencies and ties modules together in ways that make them harder to reuse individually. The whole point of modularity is that you can build small-to-medium-sized chunks of separate, reusable and testable code that aren't interwoven with the rest of your code and clearly spell out their own dependencies. Linking things through some intermediate 'modules' module just obscures all that and creates ties between things when that is not necessary. It can also complicate individual module testing.
Look at what typically happens. You start creating the 'modules' module. FileA needs four things in it, so you put those four things in it. FileB needs 3 common ones, but two other ones. So now you have 6 things in the 'modules' module. But you've now made FileA depend on 2 modules in the 'modules' module that it doesn't really depend on, and FileB now depends on 1 module in the 'modules' module that it doesn't really depend on. Extend this a few more ways and you quickly have a big mess of false dependencies. To actually reuse a given module in another project, you've now got to pull in a lot more than is really necessary. In any sort of larger project this can get messy really quickly.
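To make that concrete, here is a sketch of how the aggregator tends to grow (file names and module choices are purely illustrative):

// modules.js -- now holds the union of everything any file needs
const fs = require('fs');
const path = require('path');
const http = require('http');
const os = require('os');
const crypto = require('crypto');
const zlib = require('zlib');
module.exports = { fs, path, http, os, crypto, zlib };

// fileA.js -- only uses four of them, but is now tied to all six
const { fs, path, http, os } = require('./modules');

// fileA.js, the direct alternative -- states exactly what it uses, nothing more
const fs = require('fs');
const path = require('path');
const http = require('http');
const os = require('os');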
Even if you only ever use the 'modules' module in cases where another module needs everything in it, what have you actually gained over just specifying the modules you actually need in the module that needs them? You've attempted to save a small amount of typing, but complicated the cleanliness of your dependencies and the simplicity of reuse. IMO, not the correct tradeoff.
For some reason (the full reasoning isn't entirely clear to me), many people who are just starting out with node.js find they want to avoid manually typing the module dependencies at the start of each module. Perhaps it's the notion that we shouldn't ever be repeating lines of code, so the 2nd and 3rd time we go to type similar module dependencies into the start of a new module, we have an urge to encapsulate that in some common code. That's generally a good notion to follow, but not in this specific case, because of the compromise to modularity, independence and testability. I think it's just something to get used to when programming with node.js modules, and it is the better way to code your modules. Don't create intermediate aggregation modules that add no real value and just obscure the actual dependencies.
I'm studying how the Node.js module system works.
So far I've found this literature:
https://nodejs.org/api/modules.html
http://fredkschott.com/post/2014/06/require-and-the-module-system/
http://www.bennadel.com/blog/2169-where-does-node-js-and-require-look-for-modules.htm
It helps me understand a few points; however, I still have these questions:
If I have an expensive resource (let's say a database connection pool) inside a module, how can I make sure that requiring the module again does not re-evaluate the resource?
How can I dynamically unload a disposable resource once I have already called require() on it?
This is important because I have some scenarios that demand a single instance of the database pool. Because of this, I'm exporting modules that can receive parameters instead of just requiring the expensive resource.
Any guidance is much appreciated.
Alon Salant has written an EXCELLENT guide to understanding exports in NodeJS (which is what you're accessing when you call require()) here:
http://bites.goodeggs.com/posts/export-this/#singleton
If you study the list of options for what a module can export, you'll see that the answer to your question depends on how the module is written. When you call require(), NodeJS will look for the module in its cache and return the cached copy if it has already been loaded elsewhere.
That means that if you export a singleton, monkey-patch, or create global objects (I recommend only the first one in your case), only ONE shared object will be created / will exist. Singleton patterns are good for things like database connections that you want to be shared by many modules. Although some argue that "injecting" these dependencies via a parent/caller is better, this is a philosophical view not shared by all, and singletons are widely used by software developers for shared-service tasks like this.
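For instance, a rough sketch of that singleton style for a database pool (this assumes the mysql package purely as an example; any pooling client works the same way):

// db.js -- the pool is created once, when this module is first required
const mysql = require('mysql');
const pool = mysql.createPool({
  host: 'localhost',
  user: 'app',
  database: 'app'
});
module.exports = pool;

// anywhere else in the app
const pool = require('./db'); // always the same pool object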
If you export a function, typically a constructor, require() will STILL return just one shared reference to that. However, in this case, the reference is to a function, not something the function returns. require() doesn't actually CALL the function for you, it just gives you a reference to it. To do any real work here, you must now call the function, and that means each module that requires this thing will have its own instance of whatever class the module provides. This pattern is the more traditional one where a class instance is desired / is the goal. Most NPM modules fit this category, although that doesn't mean a singleton is a bad idea in your case.
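And a sketch of the constructor-export pattern the paragraph above describes (class and file names are made up):

// cart.js -- require() returns the class itself, not an instance
class Cart {
  constructor() { this.items = []; }
  add(item) { this.items.push(item); }
}
module.exports = Cart;

// consumer.js -- every consumer that wants an instance must construct one
const Cart = require('./cart');
const myCart = new Cart(); // independent of any other module's Cart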
I have an Angular application which is going to become quite large. For the sake of clarity, I opted for splitting it into different modules, one for each application area. I'm adding components strictly related to an area to the corresponding module, and shared components to an additional common module. All these modules are in turn loaded by a single main module that depends on all of them and basically represents the application as a whole.
Now, it turns out that many of the areas need internationalization, so each area's module depends in turn on a translation module to have its strings localized. But when I put it all together by means of the main "aggregator" module, the translate module ends up being loaded multiple times, with only one instance actually used.
For instance, let's say my application has the following areas:
Main dashboard
Settings
Documentation
Support
Corresponding modules are declared as follows:
angular.module('app.mainDashboard', ['translate']);
angular.module('app.settings', ['translate']);
angular.module('app.documentation', ['translate']);
angular.module('app.support', ['translate']);
and the main module looks like this:
angular.module('app.main', ['app.mainDashboard', 'app.settings', 'app.documentation', 'app.support']);
This approach definitely works, but I bet there is a better way to achieve the same result without repeatedly loading the same dependency. Or perhaps my initial module-splitting approach is awkward in itself and in turn leads to this inconvenience.
I have 3000+ lines of javascript that I need to get into a sensible/maintainable structure. I have chosen to use requireJS as it has been recommend to me by a few people. I have a bunch of variables that are used throughout the application and need to be available everywhere. I also have a bunch of functions that need to be available everywhere. Apart from these two dependencies most of the code can be divided off into their own modules.
I am having trouble understanding how to manage my main variables so that if one module of code makes changes to the variables, the rest of the JS modules will see that change. I think I need to see a few examples that demonstrate how requireJS is intended to work on a larger scale than the examples in the documentation.
If anyone is an experienced requireJS user, I would love to hear your tips!
The whole point of RequireJS is to avoid the need for these global variables and global functions.
Can't you wrap those globals into a module, then depend on it in your other modules?
For example, a RequireJS modularized Dojo may be something like:
dojo/cache module
dojo/string module (requires dojo/cache)
dojo/date module (requires dojo/string)
dojo/cookie module (requires dojo/string)
:
:
dojo module (requires everything above and makes them all into sub-objects, e.g. dojo.cache, dojo.string, dojo.date etc.)
user module #1 (requires dojo)
user module #2 (maybe only requiring dojo/string)
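A minimal sketch of that layout in RequireJS terms (module contents are invented; only the structure matters):

// dojo/string.js
define(["./cache"], function (cache) {
    return {
        trim: function (s) { return s.replace(/^\s+|\s+$/g, ""); }
    };
});

// dojo.js -- the aggregate module exposes the pieces as sub-objects
define(["dojo/cache", "dojo/string", "dojo/date", "dojo/cookie"],
    function (cache, string, date, cookie) {
        return { cache: cache, string: string, date: date, cookie: cookie };
    });

// user module #2 -- depends only on the piece it actually needs
define(["dojo/string"], function (string) {
    return {
        shout: function (s) { return string.trim(s).toUpperCase(); }
    };
});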
RequireJS gives you better options for encapsulating modules, but it doesn't change JavaScript itself at all. As a transition strategy, you can still define your globals inside the function block. Depending on the module that contains these definitions will ensure that it has run before the dependent modules.
The next step would be to assign those methods to an object other than window, and then use that object through the variable received from the RequireJS module dependency.
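A rough sketch of that step (module and property names are hypothetical):

// app-globals.js -- former window globals wrapped in one module
define([], function () {
    return {
        settings: { debug: false },
        formatDate: function (d) { return d.toISOString(); }
    };
});

// feature.js -- receives the shared object via the dependency list, not via window
define(["app-globals"], function (globals) {
    if (globals.settings.debug) {
        console.log("feature loaded");
    }
    return {};
});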
Hopefully by the time you've done this, you might have some insight into a cleaner solution. I refactored (and still am) a single-file project into several files, including optional plug-ins, although I did most of it before adding RequireJS to the mix.
See the RequireJS documentation:
Defining a module
Definition Functions with Dependencies
If the module has dependencies, the first argument should be an array of dependency names, and the second argument should be a definition function. ... The dependencies will be passed to the definition function as function arguments
define(["./cart", "./inventory"], function(cart, inventory) {
// ...
return { ... };
}
);
So I think you can define() your main module like all other modules and make the submodules depend on that main module. Then the module object is passed as an argument to the definition function of a submodule. You don't have to use global variables.
If you want to share information between modules, attach the information to the module object, have the other modules depend on that module, and read its properties.
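For example, a sketch of that shared-module idea (names are invented): because RequireJS evaluates the module once and caches its return value, every dependent receives the same object and therefore sees the same property changes.

// shared-state.js -- one module owns the mutable data
define([], function () {
    return { currentUser: null };
});

// login.js -- writes to the shared object
define(["shared-state"], function (state) {
    return {
        login: function (name) { state.currentUser = name; }
    };
});

// header.js -- reads the same object, so it sees the change made by login.js
define(["shared-state"], function (state) {
    return {
        greeting: function () { return "Hello, " + (state.currentUser || "guest"); }
    };
});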
If you have existing code, you can assign to window.x to provide a global x while you are cleaning it up.