Using a RequireJS module more than once does not run

Running require(['pages/home']) will work once but if I use require(['pages/home']) again then it won't run.
The module "pages/home" is a file named "home.js" in a directory named "pages".
main.js
require(['pages/home']);
pages/home.js
define('pages/home', function() {
    console.log('running pages/home module');
});

RequireJS modules are singletons: a module is loaded once and only once. If a module has already been loaded, requiring it again gives you a reference to the very same module that was originally loaded; the factory function you pass to define won't be run a second time.
So what you are seeing is exactly what is expected.
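To see the caching in action, here is a minimal sketch (the module name and log text are just for illustration):
define('pages/home', function() {
    console.log('factory runs'); // printed only on the first load
    return { name: 'home' };
});

require(['pages/home'], function(first) {
    require(['pages/home'], function(second) {
        console.log(first === second); // true -- both requires get the same cached module
    });
});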

Static code in modules isn't supposed to be evaluated more than once, just like a script loaded through a normal <script> tag won't be run more than once during the page load.
Imagine if a module contained code like:
define('my-module', function () {
    var foo = foo || 0;
    var bar = ++foo;
});
You would expect bar and foo to both === 1, but if the module were run repeatedly and a global foo existed, that might not be the case. Admittedly, this is a very contrived example, but evaluating a module repeatedly could cause serious problems.

Make it return a function/object that can be executed after you require it.
define('pages/home', function() {
    return function() {
        console.log('running pages/home module');
    };
});
require(['pages/home'], function(resultFunc) {
    window.YourFunc = resultFunc;
});
Now you can execute your function whenever you want
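For example (a small usage sketch, reusing the YourFunc name assigned above):
window.YourFunc(); // logs: running pages/home module
window.YourFunc(); // logs it again -- it is an ordinary function now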

Related

How to migrate legacy JS app to modules

I have a large (~15k LoC) JS app (namely a NetSuite app) written in the old-style, all-global way. The app consists of 26 files, and the dependencies between them are totally unclear.
The goal is to gracefully refactor the app into smaller modules. By gracefully I mean not breaking/locking the app for a long time, but doing the refactoring in smaller chunks, so that after completing each chunk the app remains usable.
An idea I have here is to concat all the JS files we have now into a single-file bundle. After that, some code could be extracted into modules, and the legacy code could start importing them. The modules & imports would be transpiled with webpack/whatever, while the legacy code remains all-globals style. Finally, all of this is packed into a single JS file and deployed.
My questions are:
is there a better approach, maybe? This sounds like a typical problem
are there any tools available to support my approach?
I gave webpack a try and I haven't managed to get what I want out of it. The export-loader and resolve-loader are not options because of the number of methods/vars that would need to be imported/exported.
Examples
Now the code looks like
function someGlobalFunction() {
    ...
}
var myVar = 'something';
// and other 15k lines in 26 files like this
What I would ideally like to achieve is
function define(...) { /* function to define a module */ }
function require(moduleName) { /* function to import a module */ }
// block with my refactored out module definitions
define('module1', function () {
    // extracted modularised code goes here
});
define('module2', function () {
    // extracted modularised code goes here
});
// further down goes legacy code, which can import new modules
var myModule = require('myNewModule');
function myGlobalLegacyFunction() {
    // use myModule
}
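For what it's worth, the define/require pair sketched above could be a tiny synchronous registry along these lines (a hypothetical sketch, not a real loader; it ignores dependency lists and async loading):
var moduleRegistry = {}; // name -> factory function
var moduleCache = {};    // name -> evaluated module

function define(name, factory) {
    moduleRegistry[name] = factory;
}

function require(moduleName) {
    // run each factory once and cache its result, singleton-style
    if (!(moduleName in moduleCache)) {
        moduleCache[moduleName] = moduleRegistry[moduleName]();
    }
    return moduleCache[moduleName];
}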
I'm following an approach similar to that outlined here: https://zirho.github.io/2016/08/13/webpack-to-legacy/
In brief:
Assuming that you can configure webpack to turn something like
export function myFunction(){...}
into a file bundle.js that a browser understands. In webpack's entry point, you can import everything from your module, and assign it to the window object:
// using namespace import to get all exported things from the file
import * as Utils from './utils'
// injecting every function exported from utils.js into global scope(window)
Object.assign(window, Utils);
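For reference, a minimal webpack configuration producing such a bundle.js might look like this (a sketch; the entry path is an assumption):
// webpack.config.js
const path = require('path');
module.exports = {
    entry: './src/index.js', // the entry point that does the imports above
    output: {
        path: path.resolve(__dirname, 'dist'),
        filename: 'bundle.js'
    }
};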
Then, in your html, make sure to include the webpack output before the existing code:
<script type="text/javascript" src="bundle.js"></script>
<script type="text/javascript" src="legacy.js"></script>
Your IDE should be able to help identify clients of a method as you bring it into a module. As you move a function from legacy.js to myNiceModule.js, check whether it still has clients that reference it globally; if it doesn't, then it doesn't need to be globally available.
No good answer here so far, and it would be great if the person asking the question would come back. I will pose a challenging answer: it cannot be done.
All module techniques end up breaking the sequential execution of scripts in the document header.
All dynamically added scripts are loaded in parallel and do not wait for one another. Since almost all such horrible legacy JavaScript code depends on sequential execution, where a later script can depend on an earlier one, loading those scripts dynamically can break things.
If you use some module approach (native ES modules, require.js, or your own), you need to execute the code that depends on the loading having occurred in a callback or Promise/then block. This destroys the implicit global context, so all those spaghetti coils of global functions and vars we find in legacy JavaScript files will no longer be defined in the global scope.
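Concretely, code that used to rely on a previous script tag having already run has to move into the callback (a sketch; doLegacyThing is a hypothetical legacy function):
require(['dep1'], function (dep1) {
    // everything that used to run sequentially after dep1 loaded
    // must now live inside this callback
    doLegacyThing(dep1);
});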
I have determined that only two tricks could allow a smooth transition:
Either some way to pause continuation of a script block until the import Promise is resolved.
const promise = require("dep1.js", "dep2.js", "dep3.js");
await promise;
// legacy stuff follows
or some way to revert the scope of a block inside a function explicitly into the global scope.
with(window) {
    function foo() { return 123; }
    var bar = 543;
}
But neither wish was granted by the javascript fairy.
In fact, I read that even the await keyword essentially just packs the rest of the statements into a function to be called when the promise is resolved:
async function() {
    // ... aaa makes promise ...
    await promise;
    // ... bbb ...
}
is just, I suppose, no different from
async function() {
    // ... aaa makes promise ...
    promise.then(r => {
        // ... bbb ...
    });
}
So this means the only way to fix this is to keep the legacy JavaScript statically in head/script elements and slowly move things into modules, while continuing to load them statically.
I am tinkering with my own module style:
(function(scope = {}) {
    var v1 = ...;
    function fn1() { ... }
    var v2 = ...;
    function fn2() { ... }
    return ['v1', 'fn1', 'v2', 'fn2']
        .reduce((r, n) => {
            r[n] = eval(n);
            return r;
        }, scope);
})(window)
By calling this "module" function with the window object, the exported items are put on window just as legacy code would do.
I gleaned a lot of this from knockout.js, by working with the readable source file that has everything together but wrapped in such module function calls; ultimately all features end up on the "ko" object.
I hate using frameworks and "compilation". Generating the sequence of HTML script tags in the correct order from a topologically sorted dependency tree is something I could write quickly myself, but I won't, because I do not want any "compilation" step, not even my own.
UPDATE: https://stackoverflow.com/a/33670019/7666635 gives the idea that we can just Object.assign(window, module) which is somewhat similar to my trick passing the window object into the "module" function.
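That variant might look like the following sketch (names are illustrative):
var myModule = (function () {
    function fn1() { return 123; }
    var v1 = 543;
    return { fn1: fn1, v1: v1 };
})();
// fn1 and v1 become globals, just as the legacy code expects
Object.assign(window, myModule);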

Replacing requirejs with systemjs -- variables not visible in local scope

I'm trying to convert our requirejs calls to use SystemJS, but I'm not exactly sure what I'm doing wrong.
Our original calls look like this:
return function(callback) {
    requirejs(["/app/shared.js"], function(result) {
        callbackFunction = callback;
        callback(dashboard);
        main();
    });
};
And what I'm trying instead is:
return function(callback) {
    console.log(callback.toString());
    SystemJS.import('app/shared.js').then(function(result) {
        callbackFunction = callback;
        callback(dashboard);
        main();
    });
};
I've had to remove some leading / to get things to load properly, which is fine, but I've now run into an issue where variables that were defined at the top of shared.js aren't visible in my local main.js file. In my browser console I get:
Potentially unhandled rejection [1] ReferenceError: dashboard is not defined
shared.js defines dashboard:
var dashboard = { rows: [] };
// Other definitions...
define(["/app/custom-config.js", /* etc */]);
I guess I have two questions:
is this the correct way to replace requirejs calls?
if so, why aren't my variables from shared.js accessible?
For a fuller picture, main() just sets up the dashboard object, and then calls callbackFunction(dashboard) on it.
Your problem can be reduced to the following case, where you have two AMD modules: one that leaks into the global space, and a second one that tries to use what the first one leaked. Like the two following modules.
src/a.js requires the module that leaks and depends on what that module leaks:
define(["./b"], function () {
console.log("a loaded");
callback();
});
src/b.js leaks into the global space:
// This leaks `callback` into the global space.
var callback = function () {
    console.log("callback called");
};
define(function () {
    console.log("b loaded");
});
With RequireJS, the code above will work. Oh, it is badly designed because b.js should not leak into the global space, but it will work. You'll see callback called on the console.
With SystemJS, the code above won't work. Why? RequireJS loads modules by adding a script element to the header and letting the script execute the module's code, so callback ends up in the global space exactly as it would if you had written your own script element with an src attribute pointing to your script. (You'd get a "Mismatched anonymous define" error, but that's a separate issue that need not detain us here.) SystemJS, by default, uses eval rather than creating script elements, and this changes how the code is evaluated. Usually it does not matter, but sometimes it does. In the case at hand, callback does not end up in the global space, and module a fails.
Ultimately, your AMD modules should be written so that they don't use the global space to pass information to one another.
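For instance, b.js could export the callback instead of leaking it, and a.js could receive it as a dependency (a sketch of the cleaned-up design):
// src/b.js -- export the callback instead of leaking it
define(function () {
    console.log("b loaded");
    return function callback() {
        console.log("callback called");
    };
});

// src/a.js -- receive the callback explicitly
define(["./b"], function (callback) {
    console.log("a loaded");
    callback();
});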
However, there is another solution which may be useful as a stepping stone towards a final solution. You can use scriptLoad: true to tell SystemJS to use script elements like RequireJS does. (See the documentation on meta for details and caveats.) Here is a configuration that does that:
System.config({
    baseURL: "src",
    meta: {
        "*": {
            scriptLoad: true, // This is what fixes the issue.
        }
    },
    packages: {
        // Yes, this empty package does something. It makes `.js` the
        // default extension for modules.
        "": {}
    },
});
// We have to put `define` in the global space
// so that our modules can find it.
window.define = System.amdDefine;
If I run the example code I've given here without scriptLoad: true, then module a cannot call the callback. With scriptLoad: true, it can call the callback and I get on the console:
b loaded
a loaded
callback called

Does Node run all the code inside required modules?

Are node modules run when they are required?
For example: You have a file foo.js that contains some code and some exports.
When I import the file by running the following code
var foo = require('./foo.js');
is all the code inside the file foo.js run and only exported after that?
Much like in a browser's <script>, as soon as you require a module the code is parsed and executed.
However, depending on how the module's code is structured, there may be no function calls.
For example:
// my-module-1.js
// This one only defines a function.
// Nothing happens until you call it.
function doSomething () {
    // body
}
module.exports = doSomething;

// my-module-2.js
// This one will actually call the anonymous
// function as soon as you `require` it.
(function () {
    // body
})();
Some examples:
'use strict';
var a = 2 * 4; // this is executed when require is called
console.log('required'); // so is this..
function doSomething() {} // this is just parsed
module.exports = doSomething; // this is placed on the exports, but still not executed..
Only in the sense that any other JS code is run when loaded.
e.g. a function definition in the main body of the module will be run and create a function, but that function won't be called until some other code actually calls it.
Any code in the module body runs when the module is required; the content that the module exports, such as a class, only executes when the code that imports it actually uses it.
For example, if I have this code
console.log("foo.js")
module.exports = {
Person: function(){}
}
the console.log will be executed when you require it.
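On the requiring side (a hypothetical main.js), the timing looks like this:
var foo = require('./foo.js'); // "foo.js" is printed now, during the require
var p = new foo.Person();      // Person's body only runs when you construct it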

AMD, RequireJS - want to ensure something always executes last

I'm trying to avoid global scope for the following problem, but cannot see a way to avoid it.
I have a singleton Javascript object called "Application". This is an AMD module.
I then have many "Modules" (not to be confused with AMD modules) which are just javascript objects that I'd like to register with the "Application" instance.
For example:
require(['Application'], function(app) {
    var module = {
        name: "theme-switcher",
        start: function() { console.log("started"); }
    };
    app.registerModule(module);
});
The architecture I am going for is that I'd like each "Module" on the page to register itself with the "Application" instance.
Here is the tricky part: only once ALL modules have been registered with the application do I then want the "Application" instance to loop through those registered modules and call their start() methods.
The way I thought to do this was to just add another require block at the bottom of the page like this:
<script type="text/javascript">
require(['Application'], function (app) {
// start all the registered modules running.
app.start(() => {
// this could be a call back function once all modules started..
});
});
Naively thinking that just because this require call was last, it would always be executed last. But actually, sometimes this gets fired BEFORE the require calls above, so the Application attempts to start() all registered modules BEFORE the modules themselves have registered with the Application.
No matter how I think about this problem, I am led back to the fact that I need to keep some state in global scope. Something like this:
Module 1:
var app = Application.Instance;
var moduleStart = function() {
    require(['jquery'], function(jquery) {
        // do module goodness here.
    });
};
app.registerModule({ name: "theme-switcher", start: moduleStart });
// later on in page - some other widget
// Module 2
var app = Application.Instance;
var moduleStart = function(){
require(['jquery'], function(jquery) {
// do second module goodness here.
}};
app.registerModule({name: "foo", start: moduleStart })
And then at the bottom of the page,
var app = Application.Instance;
app.Start(); // loops through registered modules calling start() method.
Surely there must be a way to do this avoiding global scope?
The reason for me wanting to do this is that I want the "Application" to manage the lifecycle of registered modules on the page, including starting / pausing / stopping them etc. I'd also like the Application to publish an event once ALL modules have been started, as this is when I would typically stop displaying my "loading" animation and actually display the DOM, since modules will often manipulate the DOM in their start() methods and I don't want the page to be visible before everything has started.
This will do it. If you make every object you want registered an AMD module, which I think you should be doing if you have RequireJS in place anyway, then you'll just need an array of strings naming those AMD modules to pass as an argument to the app's init.
Application.js:
define(function (require, exports, module) {
    "use strict";
    var modules = {};
    var moduleNames = [];
    var numberOfModules = 0;
    var loadedModules = 0;
    exports.init = function (dependencies) {
        numberOfModules = dependencies.length;
        for (var i = 0; i < numberOfModules; i++) {
            // `let` gives each loop iteration its own binding, so every
            // async callback below stores its module under the right name.
            let name = dependencies[i];
            moduleNames.push(name);
            require([name], function (moduleRef) {
                loadedModules++;
                modules[name] = moduleRef;
                if (numberOfModules === loadedModules) {
                    exports.start();
                }
            });
        }
    };
    exports.start = function () {
        // all modules available
        // use modules.myModuleName to access the module.
        modules.myModuleName.functionName();
        // or if they all have a start() function that needs calling
        for (var i = 0; i < moduleNames.length; i++) {
            modules[moduleNames[i]].start();
        }
    };
});
USAGE: Depending on how you are loading your app, and assuming you have an Application reference somewhere, just call:
// names of modules RequireJS uses to require, can be changed for each page.
var dependencies = ['moduleOne', 'moduleTwo', 'myModuleName'];
app.init(dependencies);
CodePen of this code, slightly altered to work on one page... http://codepen.io/owenayres/pen/MyMJYa

Strange behavior with RequireJS using CommonJS syntax

I'm seeing strange behavior with RequireJS using the CommonJS syntax. I'll try to explain the context I'm working in as well as possible.
I have a JS file, called Controller.js, that registers for input events (a click) and uses a series of if statements to perform the correct action. A typical if statement block can be the following.
if (something) {
    // RequireJS syntax here
} else if (other) { // ...
To implement the RequireJS syntax I tried two different patterns. The first one is the following. This is the standard way to load modules.
if (something) {
    require(['CompositeView'], function(CompositeView) {
        // using CompositeView here...
    });
} else if (other) { // ...
The second, instead, uses the CommonJS syntax like
if (something) {
    var CompositeView = require('CompositeView');
    // using CompositeView here...
} else if (other) { // ...
Both patterns work as expected, but I've noticed a strange behavior through Firebug (the same happens with the Chrome tool). In particular, using the second one, the CompositeView file is already downloaded even if I haven't followed the branch that handles the specific action in response to the something condition. On the contrary, with the first solution the file is downloaded when requested.
Am I missing something? Is it due to variable hoisting?
This is a limitation of the support for CommonJS-style require. The documentation explains that something like this:
define(function (require) {
    var dependency1 = require('dependency1'),
        dependency2 = require('dependency2');
    return function () {};
});
is translated by RequireJS to:
define(['require', 'dependency1', 'dependency2'], function (require) {
    var dependency1 = require('dependency1'),
        dependency2 = require('dependency2');
    return function () {};
});
Note how the arguments to the 2 require calls become part of the array passed to define.
What you observed is consistent with RequireJS reaching inside the if and pulling the required module up into the define, so that it is always loaded even if the branch is not taken. The only way to prevent RequireJS from always loading your module is what you've already discovered: you have to use require with a callback.
