I have a file with a few time calculations.
In order to "DRY", at its top there's a
const HOUR_MS = 60 * 60 * 1000
In the transpiled output, each HOUR_MS occurrence is inlined as 36e5.
So far, so good ^_^
BUT! If I extract this const to its own file, because I want to reuse it in more places, this no longer works.
Instead, the transpiled output now has references to that const (e.g. r.HOUR_MS), which means it's not as minified as it could be.
Is this a deliberate behavior (which maybe I can suppress with some flag) or is it an optimization oversight? (was about to report it, but their GitHub bug template suggested I'd ask here first)
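For illustration, the extracted setup looks roughly like this (file names here are made up):
// constants.ts
export const HOUR_MS = 60 * 60 * 1000;
// time.ts
import { HOUR_MS } from './constants';
export const hoursToMs = (hours) => hours * HOUR_MS;
// after transpiling + minifying, the call site keeps a reference like r.HOUR_MS instead of 36e5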
This behavior is indeed problematic, but because two layers work together to generate the output, this optimization is not feasible.
This is because import and export behave differently from CommonJS.
No matter what, import and export bindings are live references (even for primitive variables), whereas CommonJS does nothing special with the exported values (therefore, primitive values are copied, not referenced).
Proof that import and export are by reference:
1.ts
export let number = 10;
export function increase() {
number++;
}
2.ts
import {number, increase} from './1';
console.log(number);
increase();
console.log(number);
The log will print 10, then 11.
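For comparison, a rough CommonJS sketch of the same two files (rewritten by hand with require/module.exports) prints 10 twice, because the exported primitive is copied into the exports object when the module is loaded:
// 1.js
let number = 10;
function increase() {
  number++;
}
module.exports = { number, increase }; // copies the current value of `number`
// 2.js
const { number, increase } = require('./1');
console.log(number); // 10
increase();          // increments the module-local variable, not the exported copy
console.log(number); // still 10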
Layer 1: TypeScript Transpiler
const number = 10;
console.log(number);
Will transpile to:
const number = 10;
console.log(number);
But
export const number = 10;
console.log(number);
Will transpile to: (This is where the problem begins)
Object.defineProperty(exports, "__esModule", { value: true });
exports.number = 10;
console.log(exports.number);
Layer 2: uglify-es
const number = 10;
console.log(number);
Will compress to
console.log(10);
But
Object.defineProperty(exports, "__esModule", { value: true });
exports.number = 10;
console.log(exports.number);
Will compress to: (This is where the problem ends)
Object.defineProperty(exports, "__esModule", { value: !0 });
exports.number = 10;
console.log(exports.number);
And this is why this optimization will never work without a common standard between the layers.
Related
I am aware of import and export, but this doesn't work here, because each time you call an imported function it starts a new instance of the original file from which it was exported.
I have JavaScript Code A running in Node.js continuously, without restarting. As it runs, it creates its own data and stores it in arrays.
How do I call a function within Code A from another file, so that it runs in the same instance Code A is currently running in and respects the data Code A has stored in its arrays?
If I use import/export, it calls a function from within Code A, but that function doesn't see any of the data stored in the arrays that I gathered while running Code A.
Thank you very much.
EDIT: Adding sample code
MAIN FILE
let calculationArray = []
function add(x, y) {
if ( calculationArray.includes("abcde") ) {
console.log("Calculation is on timeout", calculationArray)
}
else {
calculationArray.push("abcde")
console.log(x + y)
return x + y;
}
}
module.exports = { add }
setInterval(add, 3000, 5, 5)
SECOND FILE WHICH CALLS A FUNCTION FROM THE MAIN FILE and doesn't respect the fact that calculationArray in the original file already has "abcde". It simply starts its own fresh instance with an empty array.
const f = require('./calculation.js');
f.add(10, 5);
I'm not sure exactly how you load and call your scripts... so I tried to build something I could reason about out of your code.
Using ES6 modules (which use export and import statements), say you have:
calculation.js, which adds 5+5 to a file-internal array every second
other.js, which imports calculation.js and exports the result of add(100, 100), i.e. 200
a main index.js which imports and calls:
calculation.js for the first time after 3500ms
other.js after 6500ms
Here are the files, and then some explanation of what happens using ES6 dynamic import():
calculation.js
// calculation.js
const calculationArray = [];
const add = (x, y) => {
const result = x + y;
calculationArray.push(result);
console.log(calculationArray.reduce((a, v) => a + v, 0));
return result;
};
// add 5+5 every 1sec
setInterval(add, 1000, 5, 5);
export { add }
other.js
// other.js
import { add } from "./calculation.js";
// export the result of 100 + 100
export default add(100, 100);
and finally:
index.js
// index.js
const delayedImport_calculation = async () => {
const { add } = await import("./calculation.js");
add(10, 5);
};
const delayedImport_other = async () => {
const { default: otherResult } = await import("./other.js");
console.log(otherResult);
};
setTimeout(delayedImport_calculation, 3500);
setTimeout(delayedImport_other, 6500);
If you call the main script like: $ node ./index.js from the terminal, you should expect the following console logs:
(Nothing for 3500ms)
15 after 3500ms since that's when calculation.js was first imported into index.js and called the function add(10, 5);
25 (after 1sec)
35 (also after 1sec since the interval inside calculation.js)
235, since other.js was dynamically imported into index.js and added 200 to the array
200, since other.js exports just the result of add(100, 100)
245, 255, 265, etc... on intervals of 1sec — all proving the point that the array values are updated as expected.
As you can see above, the .reduce() on calculationArray returns the expected accumulated values in the console.log calls.
Another simpler example (which might be closer to what you have?) with only two files:
calculation.js that:
every 100ms adds 5+5 to the array
exports a default add() function and a getAddResult() method
index.js which imports both the default and the helper methods
logs the sum of the array values every second using getAddResult()
calls add(1000, 1000) after 5sec
calculation.js
const calculationArray = [];
const getAddResult = () => calculationArray.reduce((a, v) => a + v, 0);
const add = (x, y) => {
const result = x + y;
calculationArray.push(result);
return result;
};
// add 5+5 every 100ms
setInterval(add, 100, 5, 5);
export default add
export { getAddResult }
index.js
import add, { getAddResult } from "./calculation.js";
setInterval(() => {
console.log(getAddResult())
}, 1000);
setTimeout(() => {
add(1000, 1000);
}, 5000);
the log this time would be approximately like:
90, 180, 280, 370 (in intervals of 1sec)
2470, (after ~5000ms) proving the point that the values in array are updated
2560, 2660, etc... (also in intervals of 1sec)
PS:
For the above to work, you either need to set "type": "module" in package.json, or give all your files the .mjs extension.
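For example, a minimal package.json (the name is just a placeholder) could be:
{
  "name": "calculation-demo",
  "type": "module"
}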
I am writing a plugin which inlines some code, and I am having trouble with an internal webpack plugin.
Intro
My aim is to transpile this code
// a.js
export const getFoo = x => x.foo;
// b.js
import {getFoo} from './a.js';
export const func = x => getFoo(x);
to this one
// b.js
export const func = x => x.foo;
Using this comment, I implemented the following logic (see the skeleton after this list):
1. Hook on compiler.hooks.beforeCompile and find all imports of the transpiled modules with parser.hooks.importSpecifier.
2. Then, using the same compiler hook, find all CallExpressions and remember which functions will be transpiled and their positions.
3. In compiler.hooks.compilation, in compilation.hooks.buildModule, add a template dependency which will replace the CallExpressions with MemberExpressions.
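A rough skeleton of that hook registration, as I understand the webpack 4 plugin API (the plugin name, the bookkeeping map and the custom dependency class are made up for illustration), might look like:
class InlineCallsPlugin {
  apply(compiler) {
    const inlinableImports = new Map(); // local identifier -> { source, exportName }
    compiler.hooks.beforeCompile.tap('InlineCallsPlugin', params => {
      params.normalModuleFactory.hooks.parser
        .for('javascript/auto')
        .tap('InlineCallsPlugin', parser => {
          // step 1: remember imports coming from the modules we want to inline
          parser.hooks.importSpecifier.tap('InlineCallsPlugin',
            (statement, source, exportName, identifierName) => {
              inlinableImports.set(identifierName, { source, exportName });
            });
          // step 2: record positions of CallExpressions whose callee is one of
          // those identifiers (details of the AST walk omitted here)
        });
    });
    compiler.hooks.compilation.tap('InlineCallsPlugin', compilation => {
      // step 3: while the module is being built, attach a custom dependency
      // whose template replaces the recorded CallExpressions with MemberExpressions
      compilation.hooks.buildModule.tap('InlineCallsPlugin', module => {
        // module.addDependency(new InlineCallDependency(...)); // hypothetical dependency class
      });
    });
  }
}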
The problem
Webpack has an internal plugin, HarmonyImportSpecifierDependency. Apparently, it applies the same kind of logic, replacing
import {getFoo} from './a.js';
export const func = x => x.foo;
with
"use strict";
/* harmony export (binding) */ __webpack_require__.d(__webpack_exports__, "a", function() { return func; });
/* harmony import */ var _getters_uno__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(0);
const func = x => {
// NEXT LINE IS THE RESULT OF `HarmonyImportSpecifierDependency` WORK
return Object(_getters_uno__WEBPACK_IMPORTED_MODULE_0__[/* getFoo */ "e"])(x);
};
My problem is that both plugins (mine and HarmonyImportSpecifierDependency) remember the original source and the original positions of the CallExpressions. The combined result of both plugins is invalid code:
return Object(_getters_uno__WEBPACK_IMPORTED_MODULE_0__[/* getFoo */ "e"])x.foo);
The question
Is there a way to give HarmonyImportSpecifierDependency the result of my plugin? I've tried to re-parse the module in the compilation phase, but with no success.
I solved my problem by patching the replacements object on the source in the succeedBuild hook:
source.replacements = source.replacements
.filter(r => !r.content.includes(`/* ${callExpression.callee.name} */`));
I still wonder if there is a better solution.
I jumped into the deep end recently and have been slowly learning to swim. I'm working on a CLI for building out a simple text game world. That code is becoming a convoluted mess and so I have tried to recreate the error I am getting in a simpler form below.
Try as I might, I can't seem to understand the best way to structure all of my functions. In my project I have a parser function that breaks input up and searches for a 'verb' to invoke via a try/catch block. When a verb, e.g. 'look', runs, it accesses my database module and sends a query based on several parameters to return the description of a room or thing. Because this is all asynchronous, virtually everything is wrapped in a promise, but I am leaving that out of this example. The following is not the actual project, just a simple recreation of the way I have my objects set up.
APP:
// ***********************
const player = require('./scope_test_player');
player.look();
player.water();
Module1:
// ***********************
const apple_tree = require('./scope_test_apple_tree');
module.exports = {
look: function(){
console.log(
'The apple tree is '+apple_tree.height+'ft tall and has '
+apple_tree.apples+' apples growing on it'
);
},
water: function() {
apple_tree.grow();
}
};
Module2:
// ***********************
const player = require('./scope_test_player');
module.exports = {
height: 10,
nutrition: 0.3,
apples: [],
fertilize: function(number) {
this.nutrition+=number;
},
grow: function() {
this.height+=this.nutrition;
}
};
In the above code I get 'TypeError: apple_tree.grow is not a function' from water, or undefined from look. This is the bane of my existence, and I have been getting it seemingly at random in my main project, which leads me to believe I don't understand scope. I know I can require the module within the function and it will work, but that is hideous and would add hundreds of lines of code by the end. How do I cleanly access the functions of objects from within other objects?
Your problem is that you have a cyclic dependency in your project and that you overwrite the exports property of the module. Because of that, and the way Node caches required modules, you will get the original module.exports object in the scope_test_player file and not the one you have overwritten. To solve that, you need to write it this way:
// ***********************
const apple_tree = require('./scope_test_apple_tree');
module.exports.look = function() {
console.log(
'The apple tree is ' + apple_tree.height + 'ft tall and has ' + apple_tree.apples + ' apples growing on it'
);
};
module.exports.water = function() {
apple_tree.grow();
};
And
// ***********************
const player = require('./scope_test_player');
module.exports.height = 10;
module.exports.nutrition = 0.3;
module.exports.apples = [];
module.exports.fertilize = function(number) {
this.nutrition += number;
};
module.exports.grow = function() {
this.height += this.nutrition;
};
But this is a really bad design in general and you should find another way to solve it. You should always avoid loops/circles in your dependency tree.
UPDATE
In Node, each file is wrapped into a loader function like this:
function moduleLoaderFunction( module, exports /* some other parameters that are not relevant here */ )
{
// the original code of your file
}
node.js internally does something like this for a require:
var loadedModules = {}
function require(moduleOrFile) {
var resolvedPath = getResolvedPath(moduleOrFile)
if( !loadedModules[resolvedPath] ) {
// if the file was not loaded already, create an entry in the loaded modules object
loadedModules[resolvedPath] = {
exports : {}
}
// call the loader function with the initial values
moduleLoaderFunction(loadedModules[resolvedPath], loadedModules[resolvedPath].exports)
}
return loadedModules[resolvedPath].exports
}
Because of the cyclic require, the require function will return the original loadedModules[resolvedPath].exports, the one that was initially set before you assigned your own object to it.
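A minimal two-file demonstration of that caching behavior (file names made up):
// a.js
const b = require('./b');            // starts loading b before a has assigned its exports
console.log('in a, b exports:', b);  // { name: 'b' }
module.exports = { name: 'a' };
// b.js
const a = require('./a');            // a is already in the cache, but its exports are still {}
console.log('in b, a exports:', a);  // {}
module.exports = { name: 'b' };
// index.js
require('./a');
// prints:
// in b, a exports: {}
// in a, b exports: { name: 'b' }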
Is Module1 = scope_test_player and Module2 = scope_test_apple_tree?
Maybe you have a cyclic reference here?
APP requires scope_test_player
// loop
scope_test_player requires scope_test_apple_tree
scope_test_apple_tree requires scope_test_player
// loop
As far as I can see, scope_test_apple_tree doesn't use player.
Can you try to remove:
const player = require('./scope_test_player');
from Module2 ?
There are a few issues to address.
Remove the player require in Module 2(scope_test_apple_tree.js):
const player = require('./scope_test_player')
It doesn't do any damage keeping it there but it's just unnecessary.
Also, replace =+ with += in fertilize and grow, which is what I think you are going for.
I was able to run the code naturally with those fixes.
If you want to refactor, I'd probably flatten out the required files, handle it in the main file controlling the player actions, and explicitly name the functions after what is needed to run them (in this case, the tree).
Keeping mostly your coding conventions, my slight refactor would look something like:
index.js
const player = require('./scope_test_player');
const apple_tree = require('./scope_test_apple_tree');
player.lookAtTree(apple_tree);
player.waterTree(apple_tree);
scope_test_player.js
module.exports = {
lookAtTree: function(tree){
console.log(
'The apple tree is '+tree.height+'ft tall and has '
+tree.apples.length+' apples growing on it'
);
},
waterTree: function(tree) {
tree.grow();
console.log('The apple tree grew to', tree.height, 'in height')
}
};
scope_test_apple_tree.js
module.exports = {
height: 10,
nutrition: 0.3,
apples: [],
fertilize: function(number) {
this.nutrition += number;
},
grow: function() {
this.height += this.nutrition;
}
};
Yes, I had circular dependencies in my code because I was unaware of the danger they posed. When I removed them from the main project, sure enough, it started working again. It now seems I'm going to be forced to redesign the project, as having two modules randomly referencing each other is going to cause more problems.
I would like to test prototype functions that I make.
My setup consists of 3 files:
Base.js - base file that all my other files have in common
function prop(to, name, func) {
Object.defineProperty(to.prototype, name, {
value: func,
writable: true,
configurable: true
});
return func;
}
Array.js - file that modifies the prototype of the given object.
prop(Array, 'hasPresent', function(what) {
return !!~this.indexOf(what)
});
/tests/Array.js - Test itself
describe('hasPresent()',function(){
it('number', function(done){
expect([0,1,2].hasPresent(0)).toBe(true)
done()
})
})
All this is run from Node.js, which watches the files for changes. My concern is that it returns an error from the second file (prop is not defined ... at Array.js:1). This tells me that those files are not evaluated in the same context. Is there any way to make this work? Or how do I get the __dirname variable in the test file when it is started from Node?
My setup in Node.js:
jasmine.loadConfig({
spec_files: ['Base.js','Array.js'],
helpers: ['Base.js','Array.js']
})
jasmine.execute(['tests/Array.js']);
PS: I tried putting eval in tests/Array.js, but the working directory is forgotten when it is loaded, so I would have to use an absolute path. If there is any way to work around this, it would be great.
You just need to use the normal module mechanism for Node.js: https://nodejs.org/api/modules.html
Here is the basic example which you can adapt to your testing:
The contents of foo.js:
const circle = require('./circle.js');
console.log(`The area of a circle of radius 4 is ${circle.area(4)}`);
The contents of circle.js:
const PI = Math.PI;
exports.area = (r) => PI * r * r;
exports.circumference = (r) => 2 * PI * r;
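Applied to your three files, a minimal sketch using plain CommonJS requires (relative paths assumed, adjust them to your layout) could be:
// Base.js
function prop(to, name, func) {
  Object.defineProperty(to.prototype, name, {
    value: func,
    writable: true,
    configurable: true
  });
  return func;
}
module.exports = prop;
// Array.js
const prop = require('./Base.js');
prop(Array, 'hasPresent', function(what) {
  return !!~this.indexOf(what);
});
// tests/Array.js
require('../Array.js'); // extends Array.prototype as a side effect
describe('hasPresent()', function() {
  it('number', function(done) {
    expect([0, 1, 2].hasPresent(0)).toBe(true);
    done();
  });
});
With the modules required explicitly like this, the helpers/spec_files juggling in the Jasmine config is no longer needed to share the prop function between files.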
If I take this code and compile it (advanced optimizations)
/** @constructor */
function MyObject() {
this.test = 4
this.toString = function () {return 'test object'}
}
window['MyObject'] = MyObject
I get this code
window.MyObject=function(){this.test=4;this.toString=function(){return"test object"}};
Is there any way I can remove the toString function using the Closure Compiler?
toString is implicitly callable, so unless the Closure compiler can prove that the result of MyObject is never coerced to a string it has to preserve it.
You can always mark it as explicit debug code:
this.test = 4;
if (goog.DEBUG) {
this.toString = function () { return "test object"; };
}
then in your non-debug build, compile with
goog.DEBUG = false;
See http://closure-library.googlecode.com/svn/docs/closure_goog_base.js.source.html which does
/**
* @define {boolean} DEBUG is provided as a convenience so that debugging code
* that should not be included in a production js_binary can be easily stripped
* by specifying --define goog.DEBUG=false to the JSCompiler. For example, most
* toString() methods should be declared inside an "if (goog.DEBUG)" conditional
* because they are generally used for debugging purposes and it is difficult
* for the JSCompiler to statically determine whether they are used.
*/
goog.DEBUG = true;
The answer is surprisingly simple. I was researching this and didn't find the correct answer, so I am adding it here. The solution is to use a JSDoc annotation (see https://github.com/google/closure-compiler/wiki/Annotating-JavaScript-for-the-Closure-Compiler#const-const-type):
/** @const */
const debug = false;
Now anywhere in your code (also inside of nested functions) you can do the following:
if (debug) console.log("hello world");
Or, in your case, eliminate a complete block:
if (debug) {
/* your code to remove */
}
If you set debug to false, the Closure Compiler can remove it, because it knows you declared debug as a constant, hence it won't change, and the code gated behind your debug variable will never be executed.
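Applied to the constructor from the question, a sketch would be (the exact minified output may differ between compiler versions):
/** @const */
const DEBUG = false;
/** @constructor */
function MyObject() {
  this.test = 4;
  if (DEBUG) {
    this.toString = function () { return 'test object'; };
  }
}
window['MyObject'] = MyObject;
// with DEBUG = false and advanced optimizations, the toString assignment is
// expected to be removed, leaving roughly: window.MyObject=function(){this.test=4};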
Because @define doesn't work in modules, I wrote a patch script that can be run before compilation. It goes:
import { c } from '@artdeco/erte'
import { readFileSync, writeFileSync } from 'fs'
import { join } from 'path'
const [,,version] = process.argv
const PATH = join(__dirname, 'index.js')
let f = readFileSync(PATH, 'utf-8')
const isFree = version == '--free'
if (isFree) {
f = f.replace("\nimport isFree from './paid'", "\n// import isFree from './paid'")
f = f.replace("\n// import isFree from './free'", "\nimport isFree from './free'")
console.log('Set version to %s', c('free', 'red'))
} else {
f = f.replace("\n// import isFree from './paid'", "\nimport isFree from './paid'")
f = f.replace("\nimport isFree from './free'", "\n// import isFree from './free'")
console.log('Set version to %s', c('paid', 'magenta'))
}
writeFileSync(PATH, f)
Usage:
node ./src/version/patch --free
node ./src/version/patch --paid
And the actual ./src/version/index.js that's being patched:
// import isFree from './free'
import isFree from './paid'
With './free':
export default true
With './paid':
export default false
And based on that, you can export a variable from index.js:
export const free = isFree
So this was to allow compiling paid and free packages, but you could extend this code to adjust for debug/production version.
Still, this should be done with -D (@define), but apparently that is very difficult for the trillion-dollar company that Google is.