Importing a self-invoking function - javascript

I have created a particular effect and wrapped it in a self-invoking function in a myEffect.js file:
(function () {
  // yada yada...
}());
Is it possible to then use the ES6 way of importing to pull this into my main file so that it runs as-is? The reason I'm doing this is that my main JS file has other miscellaneous stuff, this effect is pretty long in itself, and I'm hoping to split things up.

The effect will run when the module is evaluated, which happens when it is imported at least once in some other module.
You don't need an IIFE at all; ES6 modules already provide their own scope.
You don't need to export anything, as all your module is supposed to do is execute the side effect. It does not have a result value. (Which could be considered a design flaw, but let's not discuss that).
All you need to do is
// myEffect.js
… // yada yada
// main.js
import './myEffect.js';
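Note that a module body is evaluated at most once, no matter how many modules import it, so the effect runs a single time even if several files pull it in:
// a.js
import './myEffect.js'; // evaluates myEffect.js; the effect runs
// b.js
import './myEffect.js'; // already evaluated; the effect does not run again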

Yes, you can import the file as such and execute the function wherever you want. For example:
module.exports = (function () {
  // yada yada...
}());
var effect = require('./myEffect.js');
Now, through effect, you can access whatever the IIFE returns. (Note that this uses CommonJS module.exports/require rather than ES6 import/export.)
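A minimal sketch of how that can look end to end, assuming the IIFE returns an object (the run name is illustrative):
// myEffect.js
module.exports = (function () {
  function run() {
    // yada yada...
  }
  run(); // the effect runs once, when the file is first required
  return { run: run }; // expose it so callers can re-run it
}());

// main.js
var effect = require('./myEffect.js');
effect.run(); // re-run the effect on demand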

Related

Using 'export' keyword solely for importing into Unit Tests

I'm using Meteor and am writing unit tests for a Collection. I've got Helper methods for the collection in addition to just regular JS functions.
I.e.
Collection.helpers({
  helperFn: function () {
    return 'foo';
  }
});
// And in the same file
function bar() {
  return "bar";
}
Then in my tests file I have something like
import { Collection } from '../collections'
//Use Factory or Stub to create test Document
//This then works just fine and I can assert, etc..
testDoc.helperFn
My question is with wanting to test just the regular 'bar' JS function. This isn't a big deal using ES6 classes because then I can just export the whole class and call any function with an instance of it. But with Meteor I'm finding the only way I can access the function is by using the 'export' keyword.
So in my Collection file
export function bar() { return "bar"; }
And now in my test file I'd do something like
import { bar } from '../collection'
I'd rather not add an export statement for every time I test a new function. Is there any way around this or is it not a big deal?
I do think that export/import is the way to go, but to answer the first part of your question: yes, you can fall back to the original scoping of Meteor and put these functions in Meteor's global scope as follows:
do not put your files in the imports/ folder, but into another folder in your project, e.g., server/.
define the functions as:
bar = function() { /* function body */ }
Such variables are interpreted by Meteor as being global to the project, and hence do not need to be imported before use.
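A minimal sketch of this globals approach, with file names assumed:
// server/collection-helpers.js
// No `var`, so Meteor treats `bar` as global to the project.
bar = function () {
  return "bar";
};

// in a test file -- no import needed, `bar` is already in scope
if (bar() !== "bar") throw new Error("bar() should return 'bar'");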
That said, there was a reason Meteor introduced the imports/ folder and the corresponding export/import paradigm in version 1.3. It avoids polluting the global scope and makes it much easier to see where things are defined.

Webpack extracts function, from module, that depends on outer scope

I notice this bug sometimes when there is very little code inside a function of a module:
// module A
import data from "./data.json";
export function getSomeData() {
  return data;
}
// module B
import { getSomeData } from "./moduleA";
alert(getSomeData());
Then the error is something along the lines of
TypeError: data_json__WEBPACK_IMPORTED_MODULE_1__ is undefined
I notice at the top of module B there are some binding exports (whatever that is)
/* harmony export (binding) */ __webpack_require__.d(__webpack_exports__, "getSomeData", function() { return getSomeData; });
And this makes me think that, in order to optimize things whenever possible, webpack bundles it so that only this function gets executed, but it misses that the function depends on data from the module itself. How can I prevent this from happening (other than writing better code, duh)?
I'm posting this for any lost souls in the future.
I found that the reason was that I was importing module B in module A as well, and this appears to be a very smart way webpack handles recursive (circular) imports (where Node would return an empty object). Sadly, for people without the proper knowledge, like myself, this can be pretty frustrating and time-consuming to debug.
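A minimal sketch of the circular shape that triggers this (module and export names are assumed):
// moduleA.js
import { somethingFromB } from "./moduleB"; // this import closes the cycle
import data from "./data.json";
export function getSomeData() {
  return data;
}

// moduleB.js
import { getSomeData } from "./moduleA";
export const somethingFromB = 42;
// Depending on which module the bundle evaluates first, moduleA's own
// imports may still be uninitialized when this runs, hence the TypeError.
alert(getSomeData());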
Hope it helps someone.

How to migrate legacy JS app to modules

I have a large (~15k LoC) JS app (namely a NetSuite app) written in old-style all-global way. App consists of 26 files and dependencies between them are totally unclear.
The goal is to gracefully refactor the app into smaller modules. By gracefully I mean not breaking/locking the app for a long time, but doing the refactoring in smaller chunks, with the app remaining usable after each chunk is completed.
An idea I have here is to concat all the JS files we have now into a single-file bundle. After that, some code could be extracted into modules, and the legacy code could start importing them. The modules & imports should be transpiled with webpack/whatever, while the legacy code remains all-globals style. Finally all this is packed into a single JS file and deployed.
My questions are
is there a better approach maybe? This sounds like a typical problem
are there any tools available to support my approach?
I gave webpack a try and I haven't managed to get what I want out of it. The export-loader and resolve-loader are not options because of the number of methods/vars that would need to be imported/exported.
Examples
Now code looks like
function someGlobalFunction() {
  ...
}
var myVar = 'something';
// and other 15k lines in 26 files like this
What I would ideally like to achieve is
function define(...) { /* function to define a module */ }
function require(moduleName) { /* function to import a module */ }
// block with my refactored out module definitions
define('module1', function () {
  // extracted modularised code goes here
});
define('module2', function () {
  // extracted modularised code goes here
});
// further down goes legacy code, which can import new modules
var myModule = require('myNewModule');
function myGlobalLegacyFunction() {
  // use myModule
}
I'm following an approach similar to that outlined here: https://zirho.github.io/2016/08/13/webpack-to-legacy/
In brief:
Assuming that you can configure webpack to turn something like
export function myFunction(){...}
into a file bundle.js that a browser understands. In webpack's entry point, you can import everything from your module, and assign it to the window object:
// using namespace import to get all exported things from the file
import * as Utils from './utils'
// injecting every function exported from utils.js into global scope(window)
Object.assign(window, Utils);
Then, in your html, make sure to include the webpack output before the existing code:
<script type="text/javascript" src="bundle.js"></script>
<script type="text/javascript" src="legacy.js"></script>
Your IDE should be able to help identify clients of a method as you bring them into a module. As you move a function from legacy.js to myNiceModule.js, check to see if it still has clients that are aware of it globally - if it doesn't, then it doesn't need to be globally available.
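For reference, a minimal webpack configuration for this setup might look as follows (the entry and output names are assumptions):
// webpack.config.js
const path = require('path');
module.exports = {
  // the entry file is the one doing `Object.assign(window, Utils)`
  entry: './src/global-entry.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.js', // include this before legacy.js in the HTML
  },
};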
No good answer here so far, and it would be great if the person asking the question would come back. I will pose a challenging answer saying that it cannot be done.
All module techniques end up breaking the sequential nature of execution of scripts in the document header.
All dynamically added scripts are loaded in parallel and do not wait for one another. Since you can be certain that almost all such horrible legacy JavaScript code depends on sequential execution, where the second script can depend on the previous one, it can break as soon as you load those scripts dynamically.
If you use some module approach (ES2015 modules, require.js, or your own), you need to execute the code that depends on the loading having occurred in a callback or Promise/then function block. This destroys the implicit global context, so all these spaghetti coils of global functions and vars we find in legacy JavaScript code files will not be defined in the global scope any more.
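A sketch to illustrate the point: the same var that used to land on window becomes local once the code moves into a loader callback:
// legacy file evaluated as a classic top-level script:
var spaghetti = 1; // becomes window.spaghetti
// the same statement moved inside a module-loading callback:
require(['dep'], function () {
  var spaghetti = 1; // local to the callback; window.spaghetti stays undefined
});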
I have determined that only two tricks could allow a smooth transition:
Either some way to pause continuation of a script block until the import Promise is resolved.
const promise = require("dep1.js", "dep2.js", "dep3.js");
await promise;
// legacy stuff follows
or some way to revert the scope of a block inside a function explicitly into the global scope.
with(window) {
  function foo() { return 123; }
  var bar = 543;
}
But neither wish was granted by the javascript fairy.
In fact, I read that even the await keyword essentially just packs the rest of the statements into a function to call when the promise is resolved:
async function() {
  ... aaa makes promise ...
  await promise;
  ... bbb ...
}
is just, I suppose, no different from
async function() {
  ... aaa makes promise ...
  promise.then(r => {
    ... bbb ...
  });
}
So this means, the only way to fix this is by putting legacy javascript statically in the head/script elements, and slowly moving things into modules, but continue to load them statically.
I am tinkering with my own module style:
(function(scope = {}) {
  var v1 = ...;
  function fn1() { ... }
  var v2 = ...;
  function fn2() { ... }
  // Export by name: direct eval sees the enclosing function scope,
  // so eval(n) resolves each local from its name string.
  return ['v1', 'fn1', 'v2', 'fn2']
    .reduce((r, n) => {
      r[n] = eval(n);
      return r;
    }, scope);
})(window)
By calling this "module" function with the window object, the exported items are put on there just as legacy code would do.
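Usage then looks exactly like legacy globals (v1 and fn1 per the sketch above):
// after the module function has been called with window:
console.log(window.v1); // the "exported" variable
fn1(); // callable as a bare global, just as legacy code expects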
I gleaned a lot of this from using knockout.js and working with its readable source file, which has everything together but wrapped in such module function calls; ultimately all features end up on the "ko" object.
I hate using frameworks and "compilation", so while I could quickly write something that generates the sequence of HTML script tags in the correct order from a topologically sorted dependency tree, I won't, because I do not want any "compilation" step, not even my own.
UPDATE: https://stackoverflow.com/a/33670019/7666635 gives the idea that we can just Object.assign(window, module) which is somewhat similar to my trick passing the window object into the "module" function.

Replacing requirejs with systemjs -- variables not visible in local scope

I'm trying to convert our requirejs calls to use SystemJS, but I'm not exactly sure what I'm doing wrong.
Our original calls look like this:
return function(callback) {
  requirejs(["/app/shared.js"], function(result){
    callbackFunction = callback;
    callback(dashboard);
    main();
  })
}
And what I'm trying instead is:
return function(callback) {
  console.log(callback.toString())
  SystemJS.import('app/shared.js').then(function(result){
    callbackFunction = callback;
    callback(dashboard);
    main();
  });
}
I've had to remove some leading / to get things to load properly, which is fine, but I've now run into an issue where variables that were defined at the top of shared.js aren't visible in my local main.js file. In my browser console I get:
Potentially unhandled rejection [1] ReferenceError: dashboard is not defined
shared.js defines dashboard:
var dashboard = { rows: [], }
// Other definitions...
define(["/app/custom-config.js", /* etc */]);
I guess I have two questions:
is this the correct way to replace requirejs calls?
if so, why aren't my variables from shared.js accessible?
For a fuller picture, main() just sets up the dashboard object, and then calls callbackFunction(dashboard) on it.
Your problem can be reduced to the following case, where you have two AMD modules: one that leaks into the global space, and a second one that tries to use what the first one leaked, like the two following modules.
src/a.js requires the module that leaks and depends on what that module leaks:
define(["./b"], function () {
console.log("a loaded");
callback();
});
src/b.js leaks into the global space:
// This leaks `callback` into the global space.
var callback = function () {
  console.log("callback called");
};
define(function () {
  console.log("b loaded");
});
With RequireJS, the code above will work. Oh, it is badly designed because b.js should not leak into the global space, but it will work. You'll see callback called on the console.
With SystemJS, the code above won't work. Why? RequireJS loads modules by adding a script element to the header and lets the script execute the module's code, so callback does end up in the global space in exactly the same way it would if you had written your own script element with an src attribute that points to your script. (You'd get a "Mismatched anonymous define" error, but that's a separate issue that need not detain us here.) SystemJS, by default, uses eval rather than creating script elements, and this changes how the code is evaluated. Usually it does not matter, but sometimes it does. In the case at hand, callback does not end up in the global space, and module a fails.
Ultimately, your AMD modules should be written so that they don't use the global space to pass information to one another.
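For completeness, here is a sketch of the two modules rewritten so that the callback travels through the module system instead of the global space:
// src/b.js: export the callback instead of leaking it.
define(function () {
  console.log("b loaded");
  return function callback() {
    console.log("callback called");
  };
});

// src/a.js: receive it as a dependency.
define(["./b"], function (callback) {
  console.log("a loaded");
  callback();
});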
However, there is another solution which may be useful as a stepping-stone towards a final solution. You can use scriptLoad: true to tell SystemJS to use script elements like RequireJS does. (See the documentation on meta for details and caveats.) Here is a configuration that does that:
System.config({
  baseURL: "src",
  meta: {
    "*": {
      scriptLoad: true, // This is what fixes the issue.
    }
  },
  packages: {
    // Yes, this empty package does something. It makes `.js` the
    // default extension for modules.
    "": {}
  },
});
// We have to put `define` in the global space
// so that our modules can find it.
window.define = System.amdDefine;
If I run the example code I've given here without scriptLoad: true, then module a cannot call the callback. With scriptLoad: true, it can call the callback and I get on the console:
b loaded
a loaded
callback called

javascript linking

I'm going to build a rather complicated application in HTML5 with some heavy JavaScript.
In this I need to make some objects which can be passed around, but there is nothing like #import foobar.js.
Then I assume that if my HTML page loads the scripts, then all the scripts can access each other?
I read (here) that Ajax is somehow able to load a .js file from within another .js file, but I don't think this is what I need?
I can go into more detail if needed; thanks in advance.
In our projects, we do the following: Suppose you're going to call your project Foo, and have a module called Bar in it,
Then what we do is declare a file called Foo.js that just defines an equivalent to a Foo namespace:
Foo = (function(){
  return {
  };
})();
Then we create a file called Foo.Bar.js that contains the code for the Bar module:
Foo.Bar = (function(){
  // var declarations here that should be invisible outside Foo.Bar
  var p, q;
  return {
    fun1 : function(a, b){
      // Code for fun1 here
    },
    fun2 : function(c) {
      // Code for fun2 here
    }
  }; // return
})();
Note how it is a function that executes immediately and returns an object that gets assigned to Foo.Bar. Any local variables, like p and q, are available to fun1 and fun2 because they're in a closure, but they are invisible outside of Foo.Bar.
The functions in Foo.Bar can be constructors for objects and so on.
Now in your HTML you simple include both files like so:
<script type="text/javascript" src="Foo.js"></script>
<script type="text/javascript" src="Foo.Bar.js"></script>
The result will be that you can call Foo.Bar's functions in the JavaScript of your main HTML file without any problems.
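For example, in a script in your main HTML file (arguments are illustrative):
// both files above have loaded, so Foo.Bar is fully defined:
Foo.Bar.fun1(1, 2); // calls into the Bar module
Foo.Bar.fun2(3);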
You should check out the module pattern:
http://www.adequatelygood.com/2010/3/JavaScript-Module-Pattern-In-Depth/
This describes alternatives for creating modular code in JavaScript, and how you can protect your code and share APIs and data among modules.
You should also consider using an AMD loader. Require.js is quite popular, but I tend to prefer head.js for this. Keep in mind that these put some requirements on how you structure your code in files, and personally, I don't think it's worth it, compared to a concatenated and minified file included at the bottom of the page.
Then I assume that if my HTML page loads the scripts, then all the scripts can access each other?
Yes.
