RequireJs extend module initialize begin and end - javascript

I have created a JavaScript library which can be used for logging purposes.
I also want to support the logging of requirejs.
Which functions/events of requirejs can I prototype/wrap so that I can log when a module begins initializing and when it has finished initializing and returned the initialized object?
For instance, if I call require(["obj1","obj2","obj3"], function(obj1, obj2, obj3){}), I would like to know when requirejs begins initializing each of the objects, and when each object is completely initialized.
I looked into the documentation/code, but could not find any useful functions I can access on the requirejs object or the require object.
Note: I do not want to change the existing code of requirejs; I wish to add functionality from the outside by either prototyping or wrapping.
What I have tried (problem is that this only accesses the begin and end of the entire batch of modules):
var oldrequire = require;
require = function (deps, callback, errback, optional) {
    console.log("start"); // logged once for the whole batch of modules
    // wrap the original callback so we can log when the whole batch is done
    var callbackWrapper = function () {
        console.log("end");
        var args = [];
        for (var i = 0; i < arguments.length; i++) {
            args.push(arguments[i]);
        }
        callback.apply(this, args);
    };
    oldrequire.call(this, deps, callbackWrapper, errback, optional);
};

This is a "better than nothing answer", not a definitive answer, but it might help you look in another direction. Not sure if that's good or bad, certainly it's brainstorming.
I've looked into this recently for a single particular module I had to wrap. I ended up writing a second module ("module-wrapper") for which I added a path entry with the name of the original module ("module"). I then added a second entry ("module-actual") that references the actual module which I require() as a dependency in the wrapper.
I can then add code before and after initialization, and finally return the actual module. This is transparent to user modules as well as the actual module, and very clean and straightforward from a design standpoint.
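For illustration, here is a rough sketch of how that paths indirection might be wired up; the names "module", "module-wrapper" and "module-actual" mirror the description above and are purely illustrative:
// in the requirejs configuration
require.config({
    paths: {
        "module": "module-wrapper",  // consumers asking for "module" get the wrapper
        "module-actual": "module"    // the wrapper reaches the real file via this alias
    }
});
// module-wrapper.js
define(["module-actual"], function (actual) {
    // code here runs once, after the real module has been loaded/initialized
    // and before any consumer receives it
    console.log("module ready");
    return actual; // user modules transparently receive the real module
});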
However, it is obviously not practical to create a wrapper per module manually in your case, but you might be able to generate them dynamically with some trickery. Or somehow figure out what name was used to import the (unique) wrapper module from within it so that it can in turn dynamically import the associated actual module (with an async require, which wouldn't be transparent to user code).
Of course, it would be best if requirejs provided official hooks. I've never seen such hooks in the docs, but you might want to go through them again if you're not already sure.

Related

How to migrate legacy JS app to modules

I have a large (~15k LoC) JS app (namely a NetSuite app) written in the old-style, all-globals way. The app consists of 26 files and the dependencies between them are totally unclear.
The goal is to gracefully refactor the app into smaller modules. By gracefully I mean not breaking/locking the app for a long time, but doing the refactoring in smaller chunks, so that after completing each chunk the app remains usable.
An idea I have here is to concat all the JS files we have now into a single-file bundle. After that, some code could be extracted into modules, and the legacy code could start importing them. The modules & imports would be transpiled with webpack/whatever, while the legacy code remains all-globals style. Finally all this is packed into a single JS file and deployed.
My questions are
is there a better approach maybe? This sounds like a typical problem
are there any tools available to support my approach?
I gave webpack a try and haven't managed to get what I want out of it. The export-loader and resolve-loader are not options because of the number of methods/vars that need to be imported/exported.
Examples
Now the code looks like this:
function someGlobalFunction() {
...
}
var myVar = 'something';
// and other 15k lines in 26 files like this
What I would ideally like to achieve is
function define(...) { /* function to define a module */ }
function require(moduleName) { /* function to import a module */ }
// block with my refactored out module definitions
define('module1', function () {
    // extracted modularised code goes here
});
define('module2', function () {
    // extracted modularised code goes here
});
// further down goes legacy code, which can import new modules
var myModule = require('myNewModule');
function myGlobalLegacyFunction() {
    // use myModule
}
I'm following an approach similar to that outlined here: https://zirho.github.io/2016/08/13/webpack-to-legacy/
In brief:
Assuming that you can configure webpack to turn something like
export function myFunction(){...}
into a file bundle.js that a browser understands. In webpack's entry point, you can import everything from your module, and assign it to the window object:
// using namespace import to get all exported things from the file
import * as Utils from './utils'
// injecting every function exported from utils.js into global scope(window)
Object.assign(window, Utils);
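On the webpack side, a minimal configuration producing such a bundle.js might look roughly like this (the entry path and output directory are assumptions, not part of the original answer):
// webpack.config.js - a sketch only; the entry file is the one doing the
// Object.assign(window, ...) shown above
const path = require('path');

module.exports = {
    entry: './src/index.js',
    output: {
        filename: 'bundle.js',
        path: path.resolve(__dirname, 'dist')
    }
};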
Then, in your html, make sure to include the webpack output before the existing code:
<script type="text/javascript" src="bundle.js"></script>
<script type="text/javascript" src="legacy.js"></script>
Your IDE should be able to help identify clients of a method as you bring them into a module. As you move a function from legacy.js to myNiceModule.js, check to see if it still has clients that are aware of it globally - if it doesn't, then it doesn't need to be globally available.
No good answer here so far, and it would be great if the person asking the question would come back. I will pose a challenging answer saying that it cannot be done.
All module techniques end up breaking the sequential nature of execution of scripts in the document header.
All dynamically added scripts are loaded in parallel and do not wait for one another. Since you can be certain that almost all such horrible legacy javascript code depends on sequential execution, where a later script depends on an earlier one, things can break as soon as you load those scripts dynamically.
If you use some module approach (ES modules, require.js, or your own), you need to execute the code that depends on the loading having occurred in a callback or Promise/then function block. This destroys the implicit global context, so all those spaghetti coils of global functions and vars we find in legacy javascript code files will not be defined in the global scope any more.
I have determined that only two tricks could allow a smooth transition:
Either some way to pause continuation of a script block until the import Promise is resolved.
const promise = require("dep1.js", "dep2.js", "dep3.js");
await promise;
// legacy stuff follows
or some way to revert the scope of a block inside a function explicitly into the global scope.
with(window) {
function foo() { return 123; }
var bar = 543;
}
But neither wish was granted by the javascript fairy.
In fact, I read that even the await keyword essentially just packs the rest of the statements into a function to call when the promise is resolved:
async function() {
... aaa makes promise ...
await promise;
... bbb ...
}
is just, I suppose, no different from
async function() {
... aaa makes promise ...
promise.then(r => {
... bbb ...
});
}
So this means the only way to fix this is to put the legacy javascript statically in head/script elements and slowly move things into modules, while continuing to load them statically.
I am tinkering with my own module style:
(function (scope = {}) {
    var v1 = ...;
    function fn1() { ... }
    var v2 = ...;
    function fn2() { ... }
    // list the names to export; eval(n) looks each one up in this function's
    // scope and copies it onto the scope object (here: window)
    return ['v1', 'fn1', 'v2', 'fn2']
        .reduce((r, n) => {
            r[n] = eval(n);
            return r;
        }, scope);
})(window);
By calling this "module" function with the window object, the exported items are put on it just as legacy code would do.
I gleaned a lot of this from knockout.js, working with its readable source file, which has everything together but wrapped in such module function calls; ultimately all features end up on the "ko" object.
I hate using frameworks and "compilation", so while I could quickly write something that generates the sequence of HTML tags to load the files in the order given by a topologically sorted dependency tree, I won't do it, because I do not want to have any "compilation" step, not even my own.
UPDATE: https://stackoverflow.com/a/33670019/7666635 gives the idea that we can just Object.assign(window, module) which is somewhat similar to my trick passing the window object into the "module" function.
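A variant of the same idea without eval, assuming the module simply builds a plain exports object and merges it onto window (the names here are made up), would be:
// a sketch only
var myLegacyModule = (function () {
    var v1 = 'something';
    function fn1() { /* ... */ }
    // only what is listed here becomes global
    return { v1: v1, fn1: fn1 };
})();

Object.assign(window, myLegacyModule);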

How do I keep functions out of the global namespace when using multiple files?

Last time I built a large-scale application with JS I used require.js - which is great, but can be quite an overhead, especially if you don't want to load files asynchronously, so this time I'm going without it.
This time I'm writing in pure JS and concatenating and minifying everything with Grunt (I'm a Grunt n00b here). Because I'm keeping all the functions in separate files, I can't wrap everything in a closure like I could if I was using a single file. What's the best solution to keeping all functions out of the global namespace?
I'm not sure I need a full dependency management solution, but I'd consider one if it's lightweight and simple.
If you want to do it without any dependency management tool, you can, for example, use the Revealing Module Pattern and namespaces. A simplified example:
Top/Application file
window.SomeApplication = (function () {
    var app = {};
    // Creates the namespace path if it does not already exist, otherwise
    // returns a reference to the lowest-level object in the path
    app.require = function (path) {
        var current = window,
            i;
        path = path.split('.');
        for (i = 0; i < path.length; ++i) {
            if (!current[path[i]]) {
                current[path[i]] = {};
            }
            current = current[path[i]];
        }
        return current;
    };
    // Add any other functions you want to expose to app here
    return app;
})();
Some other file
SomeApplication.require('SomeApplication.SomeSubNamespace').SomeModule = (function () {
    // Module code: build and return the module's public API here
    var api = {};
    return api;
})();
Then use your grunt concat and specify the top file first. This way you will only expose one item on the window object and your module will be accessible through window.SomeApplication.SomeSubNamespace.SomeModule.
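A minimal grunt-contrib-concat configuration along those lines might look like this (the file paths are illustrative, not prescriptive):
// Gruntfile.js - a sketch only; adjust src paths to your layout
module.exports = function (grunt) {
    grunt.initConfig({
        concat: {
            dist: {
                // the top/application file must come first
                src: ['src/someApplication.js', 'src/modules/**/*.js'],
                dest: 'dist/app.js'
            }
        }
    });

    grunt.loadNpmTasks('grunt-contrib-concat');
    grunt.registerTask('default', ['concat']);
};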
There are a number of common and easy-to-use JavaScript tools for managing application-wide dependencies by implementing the CommonJS, AMD (the specification used by require.js), or ES2015 module specifications, including (as Benjamin suggested) Webpack, Browserify, and Rollup.

Can I export/require a module in Node.js to be used without a var holding its objects?

I am creating my own error library to have a custom catalog of specific and well-documented errors to return from my API. I am doing something like this:
module.exports = CError;
function CError () {
}
// CUSTOM ERROR TYPES
CError.EmptyParamError = createErrorType(...);
CError.InvalidFormatError = createErrorType(...);
A sample of how I use my custom error types right now:
var CError = require('cerror');
if(!passwd)
callback(new CError.EmptyParamError(passwd, ...));
I will use these errors throughout my entire project and I wish to have cleaner code like this (without the CError reference):
if(!passwd)
callback(new EmptyParamError(passwd, ...));
Is there a way to export the module or to require it that allows me to do this?
I googled without finding any answer; I also checked all these interface design patterns for Node.js modules, but none of them apply.
You can set it as a global, though as always when using globals, beware of the side-effects.
EmptyParamError = createErrorType(...);
That's it. Just leave off the var keyword, and don't set it as a property.
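A sketch of how that could look inside the cerror module, written with the explicit global object (equivalent to omitting var in sloppy mode, but more obvious to readers):
// inside cerror.js - a sketch of the globals approach
global.EmptyParamError = createErrorType(...);
global.InvalidFormatError = createErrorType(...);

// anywhere else, once cerror has been required at least once:
if (!passwd)
    callback(new EmptyParamError(passwd, ...));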
If it's only one or two types, you can skip the CError variable like this:
var EmptyParamError = require('cerror').EmptyParamError;
if(!passwd)
callback(new EmptyParamError(passwd, ...));
If you have multiple types in a single file, there will be multiple require('cerror') statements, but I believe there's no significant performance hit there because (if I understand correctly) Node will cache it the first time.
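If your Node version supports ES2015 destructuring, a single require can also pull several types out at once; a small sketch:
// assumes ES2015+ destructuring support
const { EmptyParamError, InvalidFormatError } = require('cerror');

if (!passwd)
    callback(new EmptyParamError(passwd, ...));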

Requiring Singleton Instances in Node

I spied in some code the following:
var Log = require('log');
// This creates a singleton instance
module.exports = new Log('info');
My first reaction was "no way that is a singleton", but I remembered an article outlining how Node.js handles require() statements; it said something about caching required modules and reusing them in subsequent calls.
So basically my question, is exporting a new Object() actually creating a singleton behind the scenes?
I know there are other ways of creating singletons in JavaScript, but this seems like a pretty handy way of doing it inside Node (if it actually works).
Yes, that's the closest thing to a Singleton in Node.js. require() caches each module upon the first invocation, so every subsequent require() always returns the same instance.
However, you must be aware that this is not always enforced. The cache can be modified or cleared, and the module is cached using its full path, which in short means that if you define a singleton in a package and your package is installed as a nested dependency, you may have several instances of your "singleton" within the context of the same application.
Quick example:
Application
  node_modules
    PackageA
      node_modules
        YourPackageWithSingleton
    PackageB
      node_modules
        YourPackageWithSingleton
Usually you don't have to worry about this, but if you really need to share one object across the entire application, the only solution is using globals (discouraged) or Dependency Injection.
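To illustrate the first caveat: the cache is exposed as require.cache, keyed by resolved file path, and deleting an entry forces the module to be re-evaluated on the next require. A minimal sketch, assuming a local ./logger.js that contains the module.exports = new Log('info') line from the question:
var a = require('./logger');

// evict the cached entry (the cache key is the resolved absolute path)
delete require.cache[require.resolve('./logger')];

var b = require('./logger');
console.log(a === b); // false - the module was evaluated a second time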
It's not really a singleton, because you could create as many instances of Log as you want. But you can use it like this. Otherwise you can just create a plain object with var mySingleton = {} and attach attributes and methods to it.
Then you can assign mySingleton to module.exports and require() it in other modules.
foo.js:
var mySingleton = {
    someMethod: function () {}
};
module.exports = mySingleton;
bar.js:
var singletonClass = require('./foo.js');
singletonClass.someMethod();
Why not test it out for yourself?
$ cat test.js
module.exports = Math.random();
$ node
> var a = require("./test");
undefined
> var b = require("./test");
undefined
> a === b
true
Yes, it seems that node.js does indeed cache exported values. Hence if you require the same module again it will return the cached value instead of creating a new value.
It should be noted however that the programmer could just require("log") and create a new instance manually. So in that sense it is not a singleton.
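For example, nothing stops other code from bypassing the exported instance entirely (assuming the log package exports the Log constructor):
var Log = require('log');     // the underlying library
var second = new Log('info'); // a brand new instance, unrelated to the exported one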

Using reflection, dynamically load require module

So I have a small requirejs application that needs to create instances of a dynamic list of classes at runtime. Basically, reflection. I've done quite a bit of reading, but I keep finding a lot of references to TypeScript, which I'm not using.
The principal idea is that before requirejs is ready, an array is loaded with a list of classes that will be required. This array is given to requirejs after its main entry point and I hope to create an instance for each entry.
I have done some reading of Ben Nadel's blog here http://www.bennadel.com/blog/2320-extending-classes-in-a-modular-javascript-application-architecture-using-requirejs.htm and I like his pattern; I think it would work well for some of the modules I plan to dynamically create.
I had a thought that I could do something like this:
_.each(loader, function (dep) {
    require([dep.name]);
});
With loader being the global loaded with the array list. This doesn't create an instance of the dependency though, which is what I want in this case, like so:
new Carousel('Delboy');
new Carousel('Rodney');
new Carousel('Grandad');
This, in this example, would create an instance of 3 new carousels, each with a name as passed in via the constructor. I think I am missing something in my understanding, help is appreciated.
Each resolved AMD dependency is an AMD module, which means that it's either a singleton object or a function. In the post by Ben Nadel which you referred to, a distinction is made between "definitions" and "instances". A definition is a singleton, and from a definition (function) you can create multiple instances. In Ben's terminology, RequireJS will only give you the definitions, and it is up to you to create the instances.
So, the following should work for what you're trying to do:
// the Carousel module's value is a constructor (a "definition");
// instances are created by the callers below
define('Carousel', [], function () {
    function Carousel(name) {
        this.name = name;
    }
    return Carousel;
});

var loader = {};
var carousels = ['Delboy', 'Rodney', 'Grandad'];
carousels.forEach(function (carouselName) {
    require(['Carousel'], function (Carousel) {
        loader[carouselName] = new Carousel(carouselName);
    });
});
