Strange behavior with RequireJS using CommonJS syntax - javascript

I'm seeing strange behavior with RequireJS when using the CommonJS syntax. I'll try to explain the context I'm working in as well as possible.
I have a JS file, called Controller.js, that registers for input events (a click) and uses a series of if statements to perform the correct action. A typical if statement block looks like the following.
if(something) {
// RequireJS syntax here
} else if(other) { // ...
To load the module I tried two different patterns. The first one, below, is the standard way to load modules.
if(something) {
require(['CompositeView'], function(CompositeView) {
// using CompositeView here...
});
} else if(other) { // ...
The second, instead, uses the CommonJS syntax like
if(something) {
var CompositeView = require('CompositeView');
// using CompositeView here...
} else if(other) { // ...
Both patterns work as expected, but I've noticed strange behavior through Firebug (the same happens with the Chrome dev tools). In particular, with the second one the CompositeView file is downloaded immediately, even if I never follow the branch that handles the something condition. With the first solution, by contrast, the file is only downloaded when requested.
Am I missing something? Is it due to variable hoisting?

This is a limitation of the support for CommonJS-style require. The documentation explains that something like this:
define(function (require) {
var dependency1 = require('dependency1'),
dependency2 = require('dependency2');
return function () {};
});
is translated by RequireJS to:
define(['require', 'dependency1', 'dependency2'], function (require) {
var dependency1 = require('dependency1'),
dependency2 = require('dependency2');
return function () {};
});
Note how the arguments to the two require calls become part of the array passed to define.
What you observed is consistent with RequireJS reaching inside the if and pulling the required module up into the define, so that it is always loaded even if the branch is not taken. The only way to prevent RequireJS from always loading your module is what you've already discovered: use require with a callback.
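For illustration, here is a sketch (module name taken from the question) of what that means for a require call inside a branch: the loader's static scan still finds the string literal and hoists it into the dependency array, so the file is fetched before any branch runs.
define(function (require) {
    if (something) {
        var CompositeView = require('CompositeView');
        // use CompositeView...
    }
});
// ...is effectively treated as:
define(['require', 'CompositeView'], function (require) {
    if (something) {
        var CompositeView = require('CompositeView');
        // 'CompositeView' was already downloaded before this line runs.
    }
});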

Related

How to migrate legacy JS app to modules

I have a large (~15k LoC) JS app (namely a NetSuite app) written in the old-style, all-global way. The app consists of 26 files, and the dependencies between them are totally unclear.
The goal is to gracefully refactor the app into smaller modules. By gracefully I mean not breaking/locking the app for a long time, but doing the refactoring in smaller chunks, so that after completing each chunk the app remains usable.
An idea I have here is to concat all the JS files we have now into a single-file bundle. After that, some code could be extracted into modules, and the legacy code could start importing them. The modules and imports would be transpiled with webpack/whatever, while the legacy code remains all-globals style. Finally, all of this is packed into a single JS file and deployed.
My questions are:
is there a better approach, maybe? This sounds like a typical problem
are there any tools available to support my approach?
I gave webpack a try and haven't managed to get what I want out of it. The export-loader and resolve-loader are not options because of the number of methods/vars that would need to be imported/exported.
Examples
Now the code looks like this:
function someGlobalFunction() {
...
}
var myVar = 'something';
// and other 15k lines in 26 files like this
What I would ideally like to achieve is
function define(...) { /* function to define a module */ }
function require(moduleName) { /* function to import a module */ }
// block with my refactored out module definitions
define('module1', function () {
// extracted modularised code goes here
});
define('module2', function () {
// extracted modularised code goes here
});
// further down goes legacy code, which can import new modules
var myModule = require('myNewModule');
function myGlobalLegacyFunction() {
// use myModule
}
I'm following an approach similar to that outlined here: https://zirho.github.io/2016/08/13/webpack-to-legacy/
In brief:
Assuming that you can configure webpack to turn something like
export function myFunction(){...}
into a file bundle.js that a browser understands. In webpack's entry point, you can import everything from your module, and assign it to the window object:
// using namespace import to get all exported things from the file
import * as Utils from './utils'
// injecting every function exported from utils.js into global scope(window)
Object.assign(window, Utils);
Then, in your html, make sure to include the webpack output before the existing code:
<script type="text/javascript" src="bundle.js"></script>
<script type="text/javascript" src="legacy.js"></script>
Your IDE should be able to help identify clients of a method as you bring it into a module. As you move a function from legacy.js to myNiceModule.js, check whether it still has clients that expect it globally; if it doesn't, it no longer needs to be globally available.
No good answer here so far, and it would be great if the person asking the question came back. I will pose a challenging answer: it cannot be done.
All module techniques end up breaking the sequential nature of execution of scripts in the document header.
All dynamically added scripts are loaded in parallel, and they do not wait for one another. Since almost all such horrible legacy JavaScript code depends on sequential execution, where the second script can depend on the previous one, loading those scripts dynamically can break it.
If you use some module approach (native ES modules, require.js, or your own), you need to execute the code that depends on the loading having occurred in a callback or Promise/then function block. This destroys the implicit global context, so all those spaghetti coils of global functions and vars that we find in legacy JavaScript files will no longer be defined in the global scope.
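A minimal illustration of that point (the module id "dep1" is hypothetical): anything declared inside the callback stays local to it, not on window.
require(["dep1"], function (dep1) {
    var bar = 543;      // local to this callback
    function foo() {}   // also local, not a global
});
// window.bar and window.foo are both undefined here.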
I have determined that only two tricks could allow a smooth transition:
Either some way to pause continuation of a script block until the import Promise is resolved.
const promise = require("dep1.js", "dep2.js", "dep3.js");
await promise;
// legacy stuff follows
or some way to revert the scope of a block inside a function explicitly into the global scope.
with(window) {
function foo() { return 123; }
var bar = 543;
}
But neither wish was granted by the javascript fairy.
In fact, I read that even the await keyword essentially just packs the rest of the statements into a function to call when the promise is resolved:
async function f() {
    // ...aaa creates the promise...
    await promise;
    // ...bbb...
}
is just, I suppose, no different from
function f() {
    // ...aaa creates the promise...
    promise.then(r => {
        // ...bbb...
    });
}
So this means the only way to fix this is to put the legacy JavaScript statically in head/script elements, and slowly move things into modules while continuing to load them statically.
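In the meantime the page itself acts as the "module system": something like the following (file names are hypothetical), with each extracted module still a plain static script included in dependency order.
<script src="modules/module1.js"></script>
<script src="modules/module2.js"></script>
<script src="legacy.js"></script>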
I am tinkering with my own module style:
(function(scope = {}) {
    var v1 = ...;
    function fn1() { ... }
    var v2 = ...;
    function fn2() { ... }
    return ['v1', 'fn1', 'v2', 'fn2']
        .reduce((r, n) => {
            r[n] = eval(n); // direct eval reads each name from this closure
            return r;
        }, scope);
})(window)
By calling this "module" function with the window object, the exported items are put on it just as legacy code would do.
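As a concrete (made-up) instance of the pattern: after the IIFE runs against window, the exported names behave exactly like old-style globals.
(function(scope = {}) {
    var greeting = 'hello';
    function greet() { console.log(greeting); }
    return ['greeting', 'greet'].reduce((r, n) => {
        r[n] = eval(n); // direct eval resolves the name from this closure
        return r;
    }, scope);
})(window);
greet();                      // "hello" -- callable like a legacy global
console.log(window.greeting); // "hello"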
I gleaned a lot of this from knockout.js, by working with the readable source file that has everything together in exactly such module function calls; ultimately all features end up on the "ko" object.
I hate using frameworks and "compilation". As for generating the sequence of HTML tags that loads the files in the correct order from a topologically sorted dependency tree: while I could write such a thing quickly, I won't, because I do not want any "compilation" step, not even my own.
UPDATE: https://stackoverflow.com/a/33670019/7666635 gives the idea that we can just Object.assign(window, module), which is somewhat similar to my trick of passing the window object into the "module" function.

Replacing requirejs with systemjs -- variables not visible in local scope

I'm trying to convert our requirejs calls to use SystemJS, but I'm not exactly sure what I'm doing wrong.
Our original calls look like this:
return function(callback) {
requirejs(["/app/shared.js"], function(result){
callbackFunction = callback;
callback(dashboard);
main();
})
}
And what I'm trying instead is:
return function(callback) {
console.log(callback.toString())
SystemJS.import('app/shared.js').then(function(result){
callbackFunction = callback;
callback(dashboard);
main();
});
}
I've had to remove some leading slashes to get things to load properly, which is fine, but I've now run into an issue where variables that were defined at the top of shared.js aren't visible in my local main.js file. In my browser console I get:
Potentially unhandled rejection [1] ReferenceError: dashboard is not defined
shared.js defines dashboard:
var dashboard = { rows: [], }
// Other definitions...
define(["/app/custom-config.js", /* etc */]);
I guess I have two questions:
is this the correct way to replace requirejs calls?
if so, why aren't my variables from shared.js accessible?
For a fuller picture, main() just sets up the dashboard object, and then calls callbackFunction(dashboard) on it.
Your problem can be reduced to the following case: you have two AMD modules, one that leaks into the global space, and a second one that tries to use what the first one leaked, like the two following modules.
src/a.js requires the module that leaks and depends on what that module leaks:
define(["./b"], function () {
console.log("a loaded");
callback();
});
src/b.js leaks into the global space:
// This leaks `callback` into the global space.
var callback = function () {
console.log("callback called");
}
define(["./b"], function () {
console.log("b loaded");
});
With RequireJS, the code above will work. Oh, it is badly designed because b.js should not leak into the global space, but it will work. You'll see callback called on the console.
With SystemJS, the code above won't work. Why? RequireJS loads modules by adding a script element to the header and letting that script element execute the module's code, so callback ends up in the global space in exactly the same way it would if you had written your own script element with an src attribute pointing to your script. (You'd get a "Mismatched anonymous define" error, but that's a separate issue that need not detain us here.) SystemJS, by default, uses eval rather than creating script elements, and this changes how the code is evaluated. Usually it does not matter, but sometimes it does. In the case at hand, callback does not end up in the global space, and module a fails.
Ultimately, your AMD modules should be written so that they don't use the global space to pass information from one another.
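A sketch of that cleaner design, reworking the two modules above: b.js returns the callback as its module value instead of leaking it, and a.js receives it as a dependency.
// src/b.js
define(function () {
    console.log("b loaded");
    return function () {
        console.log("callback called");
    };
});
// src/a.js
define(["./b"], function (callback) {
    console.log("a loaded");
    callback();
});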
However, there is another solution which may be useful as a stepping stone towards a final solution. You can use scriptLoad: true to tell SystemJS to use script elements like RequireJS does. (See the documentation on meta for details and caveats.) Here is a configuration that does that:
System.config({
    baseURL: "src",
    meta: {
        "*": {
            scriptLoad: true, // This is what fixes the issue.
        }
    },
    packages: {
        // Yes, this empty package does something. It makes `.js` the
        // default extension for modules.
        "": {}
    },
});
// We have to put `define` in the global space
// so that our modules can find it.
window.define = System.amdDefine;
If I run the example code I've given here without scriptLoad: true, then module a cannot call the callback. With scriptLoad: true, it can call the callback and I get this on the console:
b loaded
a loaded
callback called

Dojo module loading in general

Coming from Java/C#, I struggle to understand what module loading actually means for me as a developer (I'm thinking too much in object-oriented terms).
Suppose there are two HTML files that use the same Dojo module via require() in a script tag, like so:
<script type="text/javascript">
require(["some/module"],
function(someModule) {
...
}
);
</script>
I understand that require() loads all the given modules before calling the callback function. But is a module loaded anew for each and every require() in which it appears? In the sample above, is some/module loaded once and then shared between the two HTML files, or is it loaded twice (i.e., once for every require where it is listed in the requirements)?
If the module is loaded only once, can I share information between the two callbacks? And if it is loaded twice, how can I share information between those two callbacks?
The official documentation says "The global function define allows you to register a module with the loader. ". Does that mean that defines are something like static classes/methods?
If you load the module twice in the same window, it will only be loaded once, and the same object will be returned when you request it a second time.
So, if you have two separate pages, you will have two windows, which means the module will be loaded twice. If you want to share information between them, you will have to store it somewhere (the web is stateless): you could use a back-end service plus a database, or the HTML5 localStorage API, or IndexedDB (for example).
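For instance, a minimal localStorage sketch (the key name "sharedState" is made up for illustration):
// On the first page:
localStorage.setItem("sharedState", JSON.stringify({ count: 1 }));
// On the second page:
var state = JSON.parse(localStorage.getItem("sharedState") || "{}");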
If you don't want that, you can always use single page applications. This means that you will load multiple pages in one window using JavaScript (asynchronous pages).
About your last question... with define() you define modules. A module can be a simple object (which would be similar to a static class, since you don't have to instantiate it), but a module can also be a prototype, which means you will be able to create instances. For example:
define([], function() {
return {
"foo": function() {
console.log("bar");
}
};
});
This will return the same single object every time you need it. You can see it as a static class or a singleton. If you require it twice, then it will return the same object.
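You can check that yourself with something like this (the module id "my/module" is hypothetical):
require(["my/module"], function (first) {
    require(["my/module"], function (second) {
        console.log(first === second); // true: cached, not reloaded
    });
});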
However, you could also write something like this:
define([], function() {
return function() {
this.foo = function() {
console.log("bar");
};
};
});
Which means that you're returning a prototype. Using it requires you to instantiate it, for example:
require(["my/Module"], function(Module) {
new Module().foo();
});
Prototyping is a basic feature of JavaScript, but in Dojo there's a module that does that for you, called dojo/_base/declare. You will often see things like this:
define(["dojo/_base/declare"], function(declare) {
return declare([], {
foo: function() {
console.log("bar");
}
});
});
In this case, you will have to load it similarly to a prototype (using the new keyword).
You can find a demo of all this on Plunker.
You might ask, how can you tell the difference between a singleton/static class module and a prototypal module? Well, there's a common naming convention: when the module name starts with a capital letter (for example dijit/layout/ContentPane, dojo/dnd/Moveable, ...), it usually means the module requires instances. When the module name starts with a lowercase letter, it's a static class/singleton (for example dojo/_base/declare, dijit/registry).
1) Dojo's require() loads the module once; if you call require() again, it simply returns because the package is already loaded. So the request is made once, and any dependencies are loaded once as well.
But all of that holds only while you stay on the same HTML page. If you leave the page and request the same module from a different page, it will be fetched from the server again. You can also control caching in your config settings so things are cached in the browser: set cacheBust to true if you want a fresh copy of each file, or false if you want things to be cached.
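For example, a minimal dojoConfig sketch for that option (set before loading dojo.js):
var dojoConfig = {
    // true fetches a fresh copy of each module; false lets the browser cache
    cacheBust: true
};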
2) If you are on the same HTML page and domain and the module hasn't changed, the module will be the same, so you can share values: any change you make can be read from anywhere, unless you create a new instance. But that is not possible between different HTML pages.
3) No, it is not like static classes or methods. From what I understand, a static class can be used as a convenient container for sets of methods that just operate on input parameters and do not have to get or set any internal instance fields.
Dojo works differently: a module is a reference to an object. If you write it this way:
define(function(){
    var privateValue = 0;
    return {
        increment: function(){
            privateValue++;
        },
        decrement: function(){
            privateValue--;
        },
        getValue: function(){
            return privateValue;
        }
    };
});
This means that every bit of code that loads that module references the same object in memory, so the value stays the same throughout the use of that module.
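A quick hypothetical check, assuming the module above is registered as "my/counter": two separate require calls observe the same private state.
require(["my/counter"], function (counter) {
    counter.increment();
});
require(["my/counter"], function (counter) {
    console.log(counter.getValue()); // 1: the same instance as before
});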
Of course, that is my understanding; please feel free to tell me where I am wrong.

Convert a JavaScript library to AMD

I'm trying to use a library -- Google's libphonenumber -- that is not AMD in my RequireJS application. What is the best way to consume it? I know I can create a module like this:
define(['module'], function (module) {
// insert and return library code here.
});
But that doesn't seem great: it seems like I would have to refactor some of their code to get that working (e.g., turn it all into an object and return that object). I see a lot of libraries using a different pattern, where an immediately invoked function defines the module on the window object and returns it.
(function() {
    var phoneformat = {};
    window.phoneformat = phoneformat;
    if (typeof window.define === "function" && window.define.amd) {
        window.define("phoneformat", [], function() {
            return window.phoneformat;
        });
    }
})();
UPDATE:
Is there any reason not to just do this?
define(['lib/phoneformatter'], function(phoneformatter) {
});
I get access to all of my methods but now it seems they are global because I did not wrap the library in a define...
Use RequireJS's shim config. It'll look something like this:
requirejs.config({
    shim: {
        'libphonenumber': {
            exports: 'libphonenumber' // Might not apply for this library
        }
    }
});
This will load libphonenumber and put its variables in the global scope.
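Once the shim is configured, requiring the library hands its exported global to the callback, so client code doesn't have to touch window directly:
require(['libphonenumber'], function (libphonenumber) {
    // use libphonenumber here rather than the global it created
});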
This ended up working for me:
define(['module'], function (module) {
// insert and return library code here.
});
I am not entirely sure why 'module' was necessary, but it doesn't work without it. Also, I just returned an object and attached functions to it, like so:
return {
countryForE164Number: countryForE164Number,
nextFunction: nextFunction,
// more functions as needed.
}
There is not much in the way of documentation for using 'module', but from what I can ascertain, module is a special dependency that is processed by the RequireJS core. It gives you information about the ID and location of the current module. So it is entirely possible that I messed up the paths in my config.
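For what it's worth, the RequireJS docs describe 'module' as exposing the current module's id and uri, which you can log to check whether your config paths resolve the way you expect:
define(['module'], function (module) {
    console.log(module.id);  // this module's id
    console.log(module.uri); // the URL this module resolved to
    return {};
});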

RequireJS dependency override for configurable dependency injection

I'm working on something that seems a perfect fit for DI, but it's being added to an existing framework that did not have DI in mind when it was written. The config that defines the dependencies comes from a back-end model. Really, it's not a full config at this point; it basically contains a key that can be used to determine whether a particular view should be available or not.
I'm using RequireJS, so the dependency looks something like this:
// Dependency
define(['./otherdependencies'], function(Others) {
return {
dep: "I'm a dependency"
};
});
And right now the injector looks something like this:
// view/injector
define([
    './abackendmodel',
    './dependency'
], function(Model, Dependency) {
    return {
        show: function() {
            if (Model.showDependency) {
                var dep = new Dependency();
                this.$el.append(dep);
            }
        }
    };
});
This is a far stretch from the actual code, but the important part is how require works. Notice that in the injector code the dependency is required and used in the show method, but only if the model says it should be shown. The problem is that there may be additional things required by the dependency that aren't available when it shouldn't be shown. So what I'd really like is to not specify that dependency at all unless model.showDependency is true. I've come up with a couple of ideas, but nothing that I like.
Idea one
Have another async require call based on that model attribute. So the injector would look like this.
// Idea 1: view/injector
define([
    './abackendmodel'
], function(Model) {
    var Dep1 = null;
    if (Model.showDependency) {
        require([
            './dependency'
        ], function(Dependency) {
            Dep1 = Dependency;
        });
    }
    return {
        show: function() {
            if (Dep1) {
                var dep = new Dep1();
                this.$el.append(dep);
            }
        }
    };
});
Obviously this has issues. If show is called before the async require call has finished, Dep1 will still be null, so we don't actually show the dependency, which is a goal, and obviously JS errors will be thrown in this case. Also, we're still using an if check in show, which I don't like; but the use case is that a dependency may or may not be present, and we just don't want to require it if it's not needed, so I might not be able to get around that. Also keep in mind that model.showDependency is not actually a boolean value; it can have multiple values, which would call for different dependencies to be required. I'm just stripping it down here for simplicity of understanding the basic issue.
Idea two
This is less solidified, i.e. I don't think it will even work, but I've considered playing with the require config.paths stuff. My idea was basically to have two configs, so that './dependency' pointed to different places. The problem is that, whatever the model.showDependency value is, the require config stays the same, so it can't be changed at run-time. Maybe there's some magic that could be done here, like having separate view directory paths defined and using a factory-type object to return the one we care about, but since that would ultimately result in the same async behaviour as Idea one, I don't think it buys me anything (it's basically the same).
Idea three
Have the dependency return null based on the model.showDependency attribute.
This might be the best solution right now. I'm still stuck with some ifs, but I don't think that's going away. Also, this prevents initialization code from being called.
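A minimal sketch of Idea three, assuming the dependency module itself can read the model: the module's factory returns null when the view is disabled, and consumers null-check before instantiating.
// dependency.js
define(['./abackendmodel'], function (Model) {
    if (!Model.showDependency) {
        return null; // consumers must check for null before using this
    }
    return function Dependency() { /* real initialization here */ };
});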
Any better ideas?
Why not try using a promise for loading the dependency?
You have two choices, depending on how your code needs to work.
Option 1)
Return a promise for the result of the module 'view/injector'; the result of this promise would be the current object shown above.
Option 2)
Use a promise to load the dependency, and then execute the logic once the promise has been resolved.
Below is an example of Option 2, using a jQuery-style deferred. I typically prefer when.js or Q. This example might fall apart if the order of appending is important.
// Option 2
define([
    './abackendmodel'
], function(Model) {
    var dep1Promise = null;
    if (Model.showDependency) {
        var dep1Deferred = $.Deferred();
        dep1Promise = dep1Deferred.promise();
        require([
            './dependency'
        ], function(Dependency) {
            dep1Deferred.resolve(Dependency);
        }, dep1Deferred.reject); // Optionally reject if the dependency fails to load.
    }
    return {
        show: function() {
            var self = this; // capture `this` for the promise callback below
            if (dep1Promise) {
                dep1Promise.then(function(Dep1) {
                    var dep = new Dep1();
                    self.$el.append(dep);
                });
            }
        }
    };
});
