I'd like to use something like RequireJS to modularise my JavaScript app. However, there's one particular requirement I'd like to meet that I haven't figured out:
I'm happy enough to have require and define in the global space, however, I don't want modules that I import with require to be accessible globally (i.e. they should be accessible to my app, but not other apps running on the same page).
It seems to me that if I call define('moduleA', function() { ... }); I can then access that module globally via the require function. It may not occupy a variable in the global space itself, or be attached to window, but it still feels wrong: other apps really shouldn't be able to see my internal modules, not to mention the potential naming conflicts. Can I use contexts to circumvent this?
This seems to be a step back from just namespacing my modules and including them all inside of one big privatising function at build time.
I could have my own private version of require but my modules (being in different files) wouldn't be able to access define.
Am I missing something or do I just have to live with this? (or alternatively, run an optimizer to bake everything into one file - feels like I could just namespace my modules and not bother with requireJS at all if I do that though).
In your r.js build config add wrap: true and you will no longer pollute the global scope with the require or define functions.
Edit:
You can also specify a namespace for your require and define with the namespace setting in the r.js build config. You can read more about namespacing in RequireJS's FAQ. From the example.build.js comments:
// Allows namespacing requirejs, require and define calls to a new name.
// This allows stronger assurances of getting a module space that will
// not interfere with others using a define/require AMD-based module
// system. The example below will rename define() calls to foo.define().
// See http://requirejs.org/docs/faq-advanced.html#rename for a more
// complete example.
namespace: 'foo',
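Combined, a minimal r.js build config using both options might look like this (baseUrl, name, and out are placeholder values, not from the question):

```javascript
// build.js (sketch): pass to the optimizer as `r.js -o build.js`
({
    baseUrl: "src",
    name: "main",
    out: "dist/app.js",
    // Wrap the whole bundle in an IIFE so nothing leaks globally.
    wrap: true,
    // Rewrite requirejs/require/define calls to foo.require()/foo.define().
    namespace: "foo"
})
```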
Related
I'm working with a legacy project that I'm trying to convert to a modern webpack project, with minimal changes to the original legacy files.
The problem is that many of these legacy files rely on each other's globals, e.g:
legacy1.js
console.log("Legacy1");
function globalMess() {
console.log("I am doing global mess!")
}
legacy2.js
console.log("Legacy2");
globalMess();
The ideal solution would be one that:
Allows all the legacy files to use each other's globals, without me having to hunt down every global.
Keeps those globals visible only to the legacy files, so they don't leak out into the real global scope.
Allows me to use webpack's cache-busting hash on the resulting legacy file, so that if I fix a bug in one of the legacy files, the resulting file gets a new hash (e.g. legacy1.abc.js).
Automatically adds the resulting file to index.html using something like HtmlWebpackPlugin.
What I've considered so far
These are solutions I've considered:
Solution 1
Simply inject them manually into index.html, in the same order they appear in the original project. This solution would have no cache busting and would leak globals.
Solution 2
Using ExposeLoader/ProvidePlugin. Those require me to specify each and every global that each library exposes. This doesn't make sense to me, as I have 20 legacy files, each exposing some random function, and I don't really know which exposes what.
Solution 3
Using webpack-raw-bundler. It concatenates the files, pretty much as I wanted. However, it doesn't support cache busting via a hash or automatically adding itself to index.html.
Ideal solution
So what I imagine as an ideal solution is a webpack plugin that globs all the legacy files, bundles them into a single file wrapped in its own function scope, and produces a single output file with a hash. That file would also get injected into index.html using the logic of HtmlWebpackPlugin.
Its usage would look something like this:
new LegacyBundlePlugin({
uglify: false,
sourceMap: false,
name: "legacy",
fileName: "[name].[hash:8].js",
filesToConcat: ["./legacy1.js", "./legacy2.js"],
}),
Which would produce the following:
(function () {
console.log("Legacy1");
function globalMess() {
console.log("I am doing global mess!")
}
console.log("Legacy2");
globalMess();
})();
Currently the best solution I found is using webpack-concat-plugin.
It bundles everything to a single file, allows me to use cache busting and adds itself via HtmlWebpackPlugin to the resulting html.
The only thing it doesn't do is keep things out of the global scope: it simply concatenates the provided files without putting them into any enclosing scope.
I am using webpack to transpile/bundle/etc. my JS files. I'm using the import/export syntax (as opposed to the CommonJS way of require and module.exports). This means I need to import each class, and all subclasses of it, however many times I need to use them in the context of a specific script.
The question:
Even though classes aren't natively supported in JS, why do we need to import them all the time? Wouldn't it be easier if (and I'm only speaking of classes) they were available to all scopes?
EDIT: To avoid polluting the global scope, one could do something like global.myLibs and be done with that issue. I personally prefix my classes with something unique, but this method would serve even those who don't, I suppose.
For example:
window.myClasses could serve as a container for all my classes. I come from an iOS background, where all the classes in the main "bundle" (in Java I think that would be a "package") are available to everyone. Re-importing the class itself doesn't seem to serve any purpose.
See here:
Why do i need to import modules everytime in webpack bundle?
and here: Bundling js files with webpack class undefined
Adding things to the global scope can lead to naming conflicts. What if you created a class named Node and added it to the global scope with window.Node = Node? You'd lose the reference to the browser's global Node object.
You can argue that you will never use names that are already taken by global objects. But what if, in a year or so, a new object with the same name as one of yours is added to the spec, and you want to use it alongside your own object? You'd have to make a choice, or rename your object. That's not future proof.
Importing the same module in every module that uses it is a best practice, because by doing so you never pollute the global scope. Don't worry that this duplicates the module's code n times in your final bundle: webpack includes it only once, then uses a reference to that one module everywhere you import it.
See these resources :
JavaScript Modules: A Beginner’s Guide by Preethi Kasireddy on Medium
Eloquent Javascript Modules chapter
I'm quite new to using RequireJS and I'm having an issue with setting global configuration for a module.
I am using accountingjs and want to modify its settings globally; in this case I want to change the symbol from $ to £.
Without RequireJS you would simply do something like this, as accounting would be in the global namespace:
accounting.settings = $.extend(accounting.settings, {
currency: { symbol: '\u00A3 '}
});
accountingjs is AMD compliant and works perfectly with require, but I can't seem to figure out a way of passing the config into it globally rather than at each point of use.
I have seen the config setting in the require docs here, and I can set the config there, but accountingjs doesn't pick it up (it isn't coded to!).
My question is: how can I set configuration like this for an AMD-compliant module globally within the page?
I can see a few options:
Edit accountingjs to look at module.config() and load any config it sees. I have tried this and it works as expected, but I don't really want a custom version of the library.
Use the shim config and its init callback. I haven't got this to work (maybe because the library is already AMD compliant).
Create a new module that wraps accountingjs in another define, apply the config there, and use this module in each page. I haven't tried this, but I guess it would work...
What I really want is a way of globally applying config to an existing module from the require config. Is it possible?
If the AMD module is not designed to use module.config, then you can't force it to use it. The solution you mention last is the most robust: create a wrapper module that configures the actual module as you want. This wrapper can use module.config to grab values. This solution is likely to work with RequireJS now and for quite a long time since you're using API features that are well documented and central to RequireJS' functionality.
As for a shim, I don't recall the RequireJS docs ever providing a solution that consists of using a shim for a module that is already designed to work with AMD loaders. So if using a shim worked, it would be by happenstance rather than by design.
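Here is a runnable sketch of that wrapper pattern. The define()/registry stand-ins and the fake "accounting" and "module" objects below are assumptions, only there so the shape can be shown outside a browser; with RequireJS you would drop them and let the loader supply define and module.config():

```javascript
// Minimal AMD-ish stand-ins (in a real app, RequireJS provides these).
const registry = {};
function define(name, deps, factory) {
  registry[name] = factory.apply(null, deps.map((d) => registry[d]));
}

// Stand-in for accountingjs with its default settings.
registry["accounting"] = { settings: { currency: { symbol: "$" } } };
// Stand-in for the special "module" dependency exposing module.config(),
// which RequireJS fills from require.config({ config: { ... } }).
registry["module"] = { config: () => ({ currency: { symbol: "\u00A3 " } }) };

// The wrapper module: apply page-level config once, then re-export the
// library. Pages depend on "accounting-config" instead of "accounting".
define("accounting-config", ["accounting", "module"], function (accounting, module) {
  // A real wrapper would deep-merge; overwriting is enough for this sketch.
  Object.assign(accounting.settings, module.config());
  return accounting;
});

console.log(registry["accounting-config"].settings.currency.symbol); // "£ "
```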
TL;DR at the bottom of the question, for those who don't want to read all my junk.
My current method of JavaScript modularization is to simply create a global anchor, like "csc" (my company acronym) and then start tacking modules onto that.
So I'll end up with a global structure like:
csc
  Utils
  Map
    Geolocation
  Storage
    Cookie
    DB
And each of these files are stored in a directory structure:
csc.js
csc.Utils.js
csc.Map.js
csc.Storage.js
This eliminates the need to load my entire codebase all the time.
I'm trying to transition toward using RequireJS, but the methodology employed by that library seems to be a bit different.
In order to maintain my namespaced structure, I could wrap all of my modules in define() calls but still tack them onto the global "csc" reference. However, this seems to go against the core principles of Require.
Without keeping them global, however, I lose my nice namespacing, such as "csc.Map.Geolocation" because I now have a bunch of separate variables, like so:
require(['csc', 'csc.Utils', 'csc.Map'], function (csc, utils, map) {
});
I'll strip my question down to its essence, in case my horrible description above didn't suffice:
Is there a way, perhaps within the module definitions, to combine these three variables back into the structure defined at the top of this question? Or am I going about this all wrong, and should I instead be adhering to the Require way of doing things?
I'd love to follow the Require methodology, but I also love the ability to have all of my modules chainable and namespaced.
Don't be discouraged by the documentation's example path hierarchy; notice that Require does not strictly enforce any particular convention. You are free to design and follow your own.
Utils, Map, and Storage all become directories. The base actions that they perform should be named module.js in their respective directories, like so:
core.js
Utils/
module.js
Map/
module.js
geolocation.module.js
Storage/
module.js
cookie.module.js
db.module.js
The module.js files include and return their children. Here is an example of Storage/module.js:
define(["Storage/cookie", "Storage/db"], function (cookie, db) {
    var Storage = {};
    // do whatever you need with Storage
    Storage.cookie = cookie;
    Storage.db = db;
    return Storage;
});
Notice also the core.js file in root. This file would work just the same,
define(["Utils/module", "Storage/module", "Map/module"], function (utils, storage, map) {
    var Core = {};
    // do whatever you need with Core
    Core.Utils = utils;
    Core.Storage = storage;
    Core.Map = map;
    return Core;
});
Now, require core.js wherever you will need access to these modules (every file basically). Yes, this will load all of the involved files all of the time in development, but when you compile your app, all of your variables will be inside the scope of an anonymous function, and not directly accessible via the window object.
Again, tweak this as you see fit, it's your own convention.
I'm working on a project with Node.js, and the server-side code is becoming large enough that I would like to split it into multiple files. It appears this has been done client-side for ages: development is done by inserting a script tag for each file, and only for distribution is something like Make used to put everything together. I realize there's no point in concatenating all the server-side code, so I'm not asking how to do that. The closest thing I can find is require(); however, it doesn't behave quite like script tags do in the browser, in that require'd files do not share a common namespace.
Looking at some older Node.js projects, like Shooter, it appears this was once not the case; either that, or I'm missing something really simple in my code. My require'd files cannot access the global calling namespace at compile time or run time. Is there any simple way around this, or are we forced to make all our require'd JS files completely autonomous from the calling scope?
You do not want a common namespace, because globals are evil. In Node we define modules:
// someThings.js
(function() {
var someThings = ...;
...
module.exports.getSomeThings = function() {
return someThings();
}
}());
// main.js
var things = require("someThings");
...
doSomething(things.getSomeThings());
You define a module and then expose a public API for your module by writing to exports.
The best way to handle this is dependency injection. Your module exposes an init function and you pass an object hash of dependencies into your module.
If you really insist on accessing global scope then you can access that through global. Every file can write and read to the global object. Again you do not want to use globals.
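The dependency-injection option can be sketched like this (makeLogger and output are illustrative names, not part of any library):

```javascript
// A module factory that receives its dependencies as arguments
// instead of reading them from a shared or global namespace.
function makeLogger({ output }) {
  return {
    log(msg) { output.write(msg + "\n"); }
  };
}

// Composition root: dependencies are wired explicitly, so a test can
// pass a fake just as easily as main.js would pass process.stdout.
const lines = [];
const fakeOutput = { write: (s) => lines.push(s) };
const logger = makeLogger({ output: fakeOutput });
logger.log("hello");
console.log(JSON.stringify(lines)); // ["hello\n"]
```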
Re @Raynos' answer: if the module file is next to the file that includes it, it should be
var things = require("./someThings");
If the module is published on, and installed through, npm, or explicitly put into the ./node_modules/ folder, then the
var things = require("someThings");
is correct.