Trouble with HarmonyImportSpecifierDependency - javascript

I am writing a plugin that inlines some code, and I am running into trouble with an internal webpack plugin.
Intro
My aim is to transpile this code
// a.js
export const getFoo = x => x.foo;
// b.js
import {getFoo} from './a.js';
export const func = x => getFoo(x);
to this one
// b.js
export const func = x => x.foo;
Following this comment, I implemented the following logic:
1. Hook into compiler.hooks.beforeCompile and find all imports of transpiled modules with parser.hooks.importSpecifier.
2. Using the same compiler hook, find all CallExpressions and remember which functions will be transpiled, along with their positions.
3. In compiler.hooks.compilation, inside compilation.hooks.buildModule, add a template dependency that replaces each CallExpression with a MemberExpression.
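A minimal sketch of step 1, assuming webpack 4's tapable plugin API — the hook wiring below (normalModuleFactory → parser.for('javascript/auto') → importSpecifier) and the plugin name are my reading of the steps, not the author's actual plugin:

```javascript
// Sketch: tap the parser's importSpecifier hook to record imports of
// modules we intend to inline (plugin and hook names are illustrative).
class InlineCallsPlugin {
  apply(compiler) {
    compiler.hooks.normalModuleFactory.tap('InlineCallsPlugin', (factory) => {
      factory.hooks.parser.for('javascript/auto').tap('InlineCallsPlugin', (parser) => {
        parser.hooks.importSpecifier.tap(
          'InlineCallsPlugin',
          (statement, source, exportName, localName) => {
            // step 1: remember local names imported from transpilable modules
          }
        );
      });
    });
  }
}
```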
The problem
Webpack has an internal plugin, HarmonyImportSpecifierDependency. Apparently it applies its own replacement logic, replacing
import {getFoo} from './a.js';
export const func = x => getFoo(x);
with
"use strict";
/* harmony export (binding) */ __webpack_require__.d(__webpack_exports__, "a", function() { return func; });
/* harmony import */ var _getters_uno__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(0);
const func = x => {
// NEXT LINE IS THE RESULT OF `HarmonyImportSpecifierDependency` WORK
return Object(_getters_uno__WEBPACK_IMPORTED_MODULE_0__[/* getFoo */ "e"])(x);
};
My problem is that both plugins (mine and HarmonyImportSpecifierDependency) remember the original source and the original positions of the CallExpressions, so the combined result of the two is invalid code:
return Object(_getters_uno__WEBPACK_IMPORTED_MODULE_0__[/* getFoo */ "e"])x.foo);
The question
Is there a way to give HarmonyImportSpecifierDependency the result of my plugin? I've tried to reparse the module in the compilation phase, but with no success.

I solved my problem by patching the replacements array of the source in the succeedBuild hook:
source.replacements = source.replacements
.filter(r => !r.content.includes(`/* ${callExpression.callee.name} */`));
I still wonder if there is a better solution.

Related

Loading dependencies with or without react

I have an old IIFE that is injected into legacy pages via <script src=...>.
However, I want to use all these old libraries in a React app; I just need to use the global function they expose.
I figure I need a way of loading dependencies that works both via a script tag and via React's import or Node.js require.
Here is an example IIFE,
example.js:
var $ = $;
var geocomplete = $.fn.geocomplete;
var OtherExternalLib = OtherExternalLib;
var Example = (function() {
return {
init: function () {
// stuff
}
}
})();
The legacy code calls Example.init(), and likewise the React code will call the same function.
$ (jQuery), $.fn.geocomplete, and OtherExternalLib are all dependencies that must be loaded; either they should be loaded on demand or they should throw a big, loud error message.
I suspect that if the solution loads dependencies dynamically, example.js would look something like
var $ = load("/libs/jquery.js");
var geocomplete = load("/libs/$.fn.geocomplete.js");
var OtherExternalLib = load("/libs/OtherExternalLib.js");
var Example = (function() {
return {
init: function () {
// stuff
}
}
})();
And the legacy application can still use <script src="example.js"> while React can use
import { Example } from './example';
Understandably this is a somewhat roundabout way of using legacy code in new applications, so I am open to other ideas on how best to expose an IIFE (with or without dependencies) and use it in React.
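The dual consumption described above is what a UMD-style wrapper provides — a minimal sketch, with real dependency loading elided and init's return value added purely for illustration:

```javascript
// example.js — minimal UMD-style wrapper (sketch; dependency loading elided)
(function (root, factory) {
  if (typeof module === 'object' && module.exports) {
    // CommonJS / webpack / Node: picked up by `require` and `import`
    module.exports = factory();
  } else {
    // Plain <script src="example.js">: attach to the global object
    root.Example = factory();
  }
}(typeof globalThis !== 'undefined' ? globalThis : this, function () {
  return {
    init: function () {
      // stuff
      return 'initialized'; // illustrative return value, not in the original
    }
  };
}));
```

Legacy pages keep calling Example.init() off the global; bundled code can require('./example') (or the equivalent import) and call the same function.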
I am using React + TypeScript in my project with some limitations (my project runs inside a shell project as an AMD module, doesn't have its own startup, and can't change the way project files get bundled), which is why I had to dynamically import my package.
Since I could only use the dependent modules on the fly at runtime, I had to assume they were valid while building and bundling. Most of them were IIFEs.
So I used a lazy dynamic import,
something like this:
import("somePolyfill");
This is translated by TSC to
new Promise(function (resolve_3, reject_3) { require(["somePolyfill"], resolve_3, reject_3); });
This calls the IIFE, executing the polyfills or initializing any window or global variables, so the rest of the code is aware of them.
If it returns a module or throws an error, that can be handled like a normal promise with then and catch.
So I created a wrapper:
export class DepWrap {
public static Module: any = {};
public constructor() {
this.getPI();
this.getSomeModule();
}
public async getPI() {
DepWrap.Module["PI"] = 3.14;
}
public async getSomeModule() {
await import('somepath/somemodule').then(($package) => {
DepWrap.Module["somemodule"] = $package;
}).catch(() => {
window.console.log("Some Module error");
});
}
}
This got compiled to:
define(["require", "exports", "tslib"], function (require, exports, tslib_1) {
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
var DepWrap = /** @class */ (function () {
function DepWrap() {
this.getPI();
this.getSomeModule();
}
DepWrap.prototype.getPI = function () {
return tslib_1.__awaiter(this, void 0, void 0, function () {
return tslib_1.__generator(this, function (_a) {
DepWrap.Module["PI"] = 3.14;
return [2 /*return*/];
});
});
};
DepWrap.prototype.getSomeModule = function () {
return tslib_1.__awaiter(this, void 0, void 0, function () {
return tslib_1.__generator(this, function (_a) {
switch (_a.label) {
case 0: return [4 /*yield*/, new Promise(function (resolve_1, reject_1) { require(['somepath/somemodule'], resolve_1, reject_1); }).then(function ($package) {
DepWrap.Module["somemodule"] = $package;
}).catch(function () {
window.console.log("Some Module error");
})];
case 1:
_a.sent();
return [2 /*return*/];
}
});
});
};
DepWrap.Module = {};
return DepWrap;
}());
exports.DepWrap = DepWrap;
});
With this I could use all the dependency modules from my wrapper, and every time I need to import a new one I create another function to add it to my wrapper module.
import { DepWrap } from "wrapper/path";
const obj = new DepWrap(); // initialize once at the beginning of the project so it imports all the dependencies one by one
Afterwards, in any file, I can import my modules from the wrapper:
import { DepWrap } from "wrapper/path";
const { PI, somemodule } = DepWrap.Module;
I am not sure the code will work for your scenario as-is, but I guess tweaking it a bit might come in handy for your use case.
Plus: if you are writing unit tests, it helps Jest ignore these modules, and you can create mocks for them so that you can test your actual code.

How to correctly export a javascript module in Webpack project

I am currently experiencing issues exporting a module in a webpack project. I have been able to export simple modules containing functions, like the following:
let getEle = function(item) {
return document.getElementById(item);
};
module.exports = {
getEle: getEle
};
And in my main.js I will import it like so:
import { getEle } from './helper.js';
This works without any issues. However, I was trying to export a custom datePicker that I found (namely FooPicker: https://github.com/yogasaikrishna/foopicker):
var FooPicker = (function () {
// code
function FooPicker() {
// code
}
return FooPicker;
})();
// my attempt at exporting the entire FooPicker module
module.exports = {
FooPicker: FooPicker
}
And I try to import it like so in my main.js:
import FooPicker from './fooPicker.js'
My attempt at using the module (this works as expected if I simply call the function in a demo HTML file):
let foopicker2 = new FooPicker({
id: 'datepicker2'
});
However, this does not work and I am seeing the following error:
Uncaught TypeError: FooPicker is not a constructor
I have limited experience working with webpack, and I have done a fair bit of searching, but I am still not able to find anything relevant to my issue. What am I doing incorrectly here, and what can I do to correct it?
Your issue is with the export; the constructor itself works fine:
var FooPicker = (function () {
// code
function FooPicker() {
// code
}
return FooPicker;
})();
var fooPicker = new FooPicker()
console.log(fooPicker)
Try exporting the constructor directly:
module.exports = FooPicker
and requiring it like this:
const FooPicker = require('./fooPicker.js')
var fooPicker = new FooPicker()
This will work, because the default import now receives the constructor itself rather than an object wrapping it.

Dynamic init to variable in a npm package

I have imported a library from npm, and some parts of it need to be initialized before use. A simplified version of the code in the library:
export let locale = () => { throw new Error("Must init locale"); };
export function initLocale(userLocaleFunction) {
locale = userLocaleFunction;
}
export function checkLocale() {
console.log(locale());
}
But when calling the library in the following way:
const lib = require("lib");
lib.initLocale(() => { return "en" });
lib.checkLocale(); // works as expected: "en"
lib.locale(); // Throws "Must init locale";
lib.locale acts as if it has not been initialized. I can't have initLocale() return the locale; I need it to be available as lib.locale.
Is it possible to initialize a variable in this way?
It seems that when a variable is reassigned inside a library, the change is only visible in the library's scope.
In my first solution I simply returned the value:
export function initLocale(userLocaleFunction) {
locale = userLocaleFunction;
return locale;
}
But then I realized that this creates a new problem: what if locale gets modified inside the library, or worse, outside of it?
In the spirit of avoiding two sources of truth, I ended up going with this:
let locale = undefined;
export function initLocale(userLocaleFunction) {
locale = userLocaleFunction;
}
export function getLocale() {
if (locale === undefined) {
throw new Error("Uninitialized locale");
}
return locale;
}
This code performs the "is it initialized" check I needed at first and exposes the value with one source of truth.

mocha test (and babel) using global variables

I am writing a library using ES6, transpiling it with Babel via webpack and npm.
The problem is that my lib depends on some code that I cannot change but need to use. I don't know how to load var stuff (from the following example) in my tests so that it is visible to the module.
See the example:
external-stuff.js - this one cannot be changed and is loaded before my lib on the prod env.
var stuff = {
get some() { return "some"; },
get stuff() { return "stuff"; }
};
some-module.js - this is one of the modules in the library
export class foo {
static get whatever() { return stuff.some; }
static get whichever() { return stuff.stuff; }
}
test
import {foo} from "../src/foo.js";
let assert = require('assert');
describe('foo', function() {
describe('#whatever', function() {
it("should do some", function () {
assert.equal(foo.whatever, "some");
});
});
});
If I run this I get "ReferenceError: stuff is not defined".
I already tried defining "stuff" in a before() hook, but with no success.
In the end I found a solution that's 'good enough'. I am not sure it would be sufficient for a more advanced library, though.
I created a file called globals.js:
var g = typeof window === 'undefined' ? global : window;
// Dependencies - add as many global things as needed
g.stuff = typeof stuff === 'undefined' ? {} : stuff;
And I import this 'ES6 module' at the beginning of the test:
import * as globals from "../lib/global/globals"
import {foo} from "../src/foo.js";
Then I use the node-import npm module, with which I load the globals into the tests in a beforeEach hook:
beforeEach(function () {
global.stuff = imports.module("lib/global/stuff.js").stuff;
});
This is perfect for unit testing because I can also mock stuff. And it's even more awesome because this way I have one place where I 'define' global dependencies. It would be nice to improve on it and make the dependencies per-ES6-module... and build something fancy on it... ya know... dependency injection.
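The mocking mentioned above can look like this — a self-contained sketch using the same global-object detection as globals.js (the mock values are illustrative):

```javascript
// sketch: swapping the global `stuff` for a mock around a single test,
// then restoring the original so other tests see the real dependency
const g = typeof window === 'undefined' ? global : window;
const original = g.stuff;

g.stuff = { get some() { return 'mocked'; } }; // the mock
// ...run assertions against code that reads the bare global `stuff`...
const seen = g.stuff.some;

g.stuff = original; // restore
```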
The complete test:
require('node-import'); // +
import * as globals from "../lib/global/globals"; // +
import {foo} from "../src/foo.js";
let assert = require('assert');
describe('foo', function() {
beforeEach(function () { // +
global.stuff = imports.module("lib/global/stuff.js").stuff; // +
}); // +
describe('#whatever', function() {
it("should do some", function () {
assert.equal(foo.whatever, "some");
});
});
});

Google Closure Compiler advanced: remove code blocks at compile time

If I take this code and compile it (advanced optimizations)
/** @constructor */
function MyObject() {
this.test = 4
this.toString = function () {return 'test object'}
}
window['MyObject'] = MyObject
I get this code
window.MyObject=function(){this.test=4;this.toString=function(){return"test object"}};
Is there any way I can remove the toString function using the Closure Compiler?
toString is implicitly callable, so unless the Closure Compiler can prove that the result of MyObject is never coerced to a string, it has to preserve it.
You can always mark it as explicit debug code:
this.test = 4;
if (goog.DEBUG) {
this.toString = function () { return "test object"; };
}
then in your non-debug build, compile with
--define goog.DEBUG=false
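With the standalone compiler jar, the full invocation could look like this (the jar and file names are placeholders; the flags are from the docs quoted below):

```shell
# strip goog.DEBUG-gated blocks from the production build
java -jar closure-compiler.jar \
  --compilation_level ADVANCED_OPTIMIZATIONS \
  --define goog.DEBUG=false \
  --js myobject.js \
  --js_output_file myobject.min.js
```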
See http://closure-library.googlecode.com/svn/docs/closure_goog_base.js.source.html which does
/**
* @define {boolean} DEBUG is provided as a convenience so that debugging code
* that should not be included in a production js_binary can be easily stripped
* by specifying --define goog.DEBUG=false to the JSCompiler. For example, most
* toString() methods should be declared inside an "if (goog.DEBUG)" conditional
* because they are generally used for debugging purposes and it is difficult
* for the JSCompiler to statically determine whether they are used.
*/
goog.DEBUG = true;
The answer is surprisingly simple. I was researching this and didn't find the correct answer, so I am adding it here. The solution is to use a JSDoc annotation (see https://github.com/google/closure-compiler/wiki/Annotating-JavaScript-for-the-Closure-Compiler#const-const-type):
/** @const */
const debug = false;
Now anywhere in your code (also inside of nested functions) you can do the following:
if (debug) console.log("hello world");
Or, in your case, eliminate a complete block:
if (debug) {
/* your code to remove */
}
If you set debug to false, the Closure Compiler can remove the code because it knows you declared debug as a constant; hence it won't change, and the code can never execute when gated behind your debug variable.
Because @define doesn't work in modules, I wrote a patch that can be run before compilation. It goes:
import { c } from '@artdeco/erte'
import { readFileSync, writeFileSync } from 'fs'
import { join } from 'path'
const [,,version] = process.argv
const PATH = join(__dirname, 'index.js')
let f = readFileSync(PATH, 'utf-8')
const isFree = version == '--free'
if (isFree) {
f = f.replace("\nimport isFree from './paid'", "\n// import isFree from './paid'")
f = f.replace("\n// import isFree from './free'", "\nimport isFree from './free'")
console.log('Set version to %s', c('free', 'red'))
} else {
f = f.replace("\n// import isFree from './paid'", "\nimport isFree from './paid'")
f = f.replace("\nimport isFree from './free'", "\n// import isFree from './free'")
console.log('Set version to %s', c('paid', 'magenta'))
}
writeFileSync(PATH, f)
Usage:
node ./src/version/patch --free
node ./src/version/patch --paid
And the actual ./src/version/index.js that's being patched:
// import isFree from './free'
import isFree from './paid'
With './free':
export default true
With './paid':
export default false
And based on that, you can export a variable from index.js:
export const free = isFree
So this was done to allow compiling paid and free packages, but you could extend this code to switch between debug and production versions.
Still, this should be done with -D (@define), but apparently that's very difficult for the trillion-dollar company that Google is.
