How to bundle isomorphic CommonJS code with webpack - javascript

I have a project that uses the Node.js module format (CommonJS) and should also (in parts) run in the browser.
I do have non-isomorphic code paths where I conditionally include modules:
var impl;
try {
  // in node, use node-impl
  impl = require('../node-impl');
} catch (err) {
  // running in browser, use browser-impl
  impl = require('../browser-impl');
}
Now I want to use webpack to create a bundle that runs in the browser. I therefore need to declare the Node.js-specific modules as external in webpack.config.js so that they don't get included in the bundle:
externals: {
  '../node-impl': true
}
I verified that the '../node-impl' code is actually not included in the bundle, but the emitted code looks like this:
/***/ },
/* 33 */
/***/ function(module, exports) {
module.exports = ../node-impl;
/***/ },
This is syntactically wrong JS and the browser will throw a syntax error there.
How is this scenario properly handled with webpack? Note that I do not wish to use webpack for the Node.js side; only the browser bundles should be created with webpack.

// Your actual situation:
var impl;
try {
  impl = require('../node-impl');
} catch (e) {
  impl = require('../browser-impl');
}
You need to refactor this snippet to:
var impl = require('../node-impl');
After this rework, your code only works in a Node.js environment; that's fine, because we are going to mock this request when bundling for browsers...
// webpack.browser.config.js
module.exports = {
  resolve: {
    alias: {
      '../node-impl': '../browser-impl'
    }
  }
};
Webpack - Resolve.Alias
Or use the package.json#browser field, or resolve.packageAlias: https://webpack.github.io/docs/configuration.html#resolve-packagealias
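For reference, the package.json#browser variant maps the server-only file to its browser replacement; something like this (the file paths here are illustrative, not taken from the question):
{
  "name": "my-isomorphic-lib",
  "main": "./lib/node-impl.js",
  "browser": {
    "./lib/node-impl.js": "./lib/browser-impl.js"
  }
}
Bundlers that honour the browser field (webpack, Browserify) will then substitute browser-impl wherever node-impl is required.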

I don't think this is the intended purpose of the externals config. Per the docs,
Specify dependencies that shouldn’t be resolved by webpack, but should become dependencies of the resulting bundle. The kind of the dependency depends on output.libraryTarget.
So you're telling webpack that your build requires that module, but not to bundle it in. It's leaving it out, but attempting to require it.
There are probably a couple of ways to do what you want. It's worth mentioning that you can easily have webpack produce multiple builds, with different/shared config, from a single config file, which opens up a lot of possibilities.
But the way I'd suggest is to use the DefinePlugin to define a boolean 'constant' that represents the execution context (e.g. IN_BROWSER = true), and check that constant in your conditional around the require(). Webpack's parser isn't that smart, but it can evaluate boolean variables, so it will resolve the conditional correctly and only require the appropriate module. (Using a non-boolean, like CONTEXT = 'browser', is too confusing for webpack, and it will parse each require statement.) You can then use the Uglify plugin to remove the 'dead code' in the conditional so it doesn't bloat your production build.
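A minimal sketch of that approach, using the module paths from the question (the constant name IN_BROWSER and the surrounding config are just examples, not a drop-in file):
// webpack.browser.config.js - sketch of the DefinePlugin approach
var webpack = require('webpack');

module.exports = {
  entry: './main.js',
  output: { filename: 'bundle.js' },
  plugins: [
    // Replaces every occurrence of IN_BROWSER in the source with the literal `true`
    new webpack.DefinePlugin({ IN_BROWSER: true }),
    // Strips the now-unreachable branch from the production bundle
    new webpack.optimize.UglifyJsPlugin()
  ]
};

// in the isomorphic module
var impl;
if (IN_BROWSER) {
  impl = require('../browser-impl');
} else {
  // webpack only follows the reachable branch; note that if this same file must
  // also run unbundled in node, IN_BROWSER does not exist there, so you would
  // need a `typeof IN_BROWSER !== 'undefined'` guard instead
  impl = require('../node-impl');
}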

With the help of @Hitmands I could come up with a solution that is still not perfect, but fits my needs. I introduce a fictitious nonexistingmodule and declare it as external; I then declare the node-impl specific modules to be available as nonexistingmodule:
externals: {
  'nonexistingmodule': true,
  '../node-impl': 'nonexistingmodule'
}
This way I can keep the try/catch pattern to load specific implementations and it will still run on node. In the browser the loading of nonexistingmodule fails, and the browser-impl module is loaded - just as I intended.
I am a big fan of "don't refactor your code to match a tool", so I am going with this solution.

Related

How to split code into several bundles with Vue CLI3

I have a Vue project using TypeScript and built by Vue-CLI3.
What I'm trying to achieve is to get Webpack to build separate bundles for my workers. I've read about Webpack Code Splitting and about configureWebpack in vue.config.js, but so far I've had no luck putting them together.
The project setup is the standard vue create type. I have ./src/main.ts as the main entry point and a bunch of TypeScript modules that I want as separate bundles with their own dependency trees (I'm fine with code duplication if it can't be avoided).
I'd like to get
./dist/js/all main stuff
./dist/js/workers/worker1.6e3ebec8.js
./dist/js/workers/worker2.712f2df5.js
./dist/js/workers/worker3.83041b4b.js
So that I could do new Worker('worker1.6e3ebec8.js') in the main code.
I could launch workers from the main package by generating JavaScript code, putting it into a blob and instantiating from that, but it looks rather awkward. Besides, my worker code imports other modules, so it doesn't seem to be an option anyway.
I'm quite new to all of this, so maybe I'm not even heading in the right direction.
What is the usual way of doing that on this stack?
You can use import(); it returns a Promise that resolves to your module.
As you are using Vue-CLI 3, webpack is ready and it should split your bundle automatically.
const moduleName = 'coolModuleName'

import(
  /* webpackChunkName: "[moduleName]" */
  `@/my/module/path/${moduleName}.js`
).then(moduleCode => {
  // use your module
})

// load them in parallel
const getModuleDynamically = (path, moduleName) => import(
  /* webpackChunkName: "[moduleName]" */
  `@/${path}/${moduleName}.js`
)

Promise.all([
  getModuleDynamically(path, moduleName1),
  getModuleDynamically(path, moduleName2),
  getModuleDynamically(path, moduleName3)
])
Got there! @aquilesb's answer did help, although I failed to get getModuleDynamically() from the answer working after plenty of experimenting.
Update: Looks like with this solution I'm not able to use imports of npm modules. I've tried experimenting with worker-loader but haven't got anywhere so far.
Here are a few takeaways:
Create a separate webpack config for packing workers. The target: 'webworker' must be there. Call it with webpack --config ./webpack.config.workers.js, as Vue wouldn't know about that.
Create a separate tsconfig.json for workers TypeScript. The lib setting for workers must be there: "lib": ["esnext","webworker","scripthost"] as well as the proper include:[...]/exclude:[...] settings.
You may need to tell Vue to use the main tsconfig.json, which has its own "lib":["esnext","dom","dom.iterable","scripthost"] and include/exclude. This is done in vue.config.js; you will probably need to create it. I use the chainWebpack configuration option of the Vue config.
Let Webpack know you have dynamic loading by making calls to import() with static (i.e. not variable) names. I haven't found a way to do this in a config file, but it doesn't matter: you can't avoid hardcoding the names somewhere; how else would Webpack know it has to bundle and split the code?
Somehow get the name(s) of the generated files, as you must have at least one of them at runtime to do new Worker(filename). I used the --json option of the Webpack CLI for that.
There are many ways all of this can be achieved. This is what this ended up looking like in my project:
Folder structure:
webpack.config.workers.js
vue.config.js
tsconfig.base.json
src/main/
src/main/tsconfig.json -- extends tsconfig.base.json
src/shared/ -- this code may be duplicated by the Vue app bundles and by workers bundle
src/workers/
src/workers/tsconfig.json -- extends tsconfig.base.json
webpack.config.workers.js: contains a single entry, the main worker file, which loads the other stuff.
entry: {
  worker: './src/workers/worker.ts'
}
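For completeness, a fuller sketch of what that workers config might look like; the output path, hashing and ts-loader options below are assumptions based on the folder layout above, not my exact file:
// webpack.config.workers.js - rough sketch
const path = require('path');

module.exports = {
  target: 'webworker',
  entry: {
    worker: './src/workers/worker.ts'
  },
  output: {
    path: path.resolve(__dirname, 'dist/js/workers'),
    filename: '[name].[hash].js'
  },
  resolve: {
    extensions: ['.ts', '.js']
  },
  module: {
    rules: [
      {
        test: /\.ts$/,
        loader: 'ts-loader',
        // point ts-loader at the workers-specific tsconfig
        options: { configFile: 'src/workers/tsconfig.json' }
      }
    ]
  }
};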
build.workers.sh: the script calls the Webpack CLI and produces a JSON file with the resulting worker names (trivial actions on folders are omitted). The only one I need is called "worker". The rest is to be dynamically loaded by it.
#!/bin/bash
# Map entry name -> bundle file name
# "assetsByChunkName":{"entryN":"entryN.[hash].js", ...}
json=$(webpack --config ./webpack.config.workers.js --json "$@"|tr -d "\n\r\t "|grep -Eo '"assetsByChunkName":.+?}')
# Remove "assetsByChunkName"
json=$(echo "${json:20}")
echo $json
echo $json > "$target/$folder/workers.json"
Load workers.json at runtime. The other option would be to use it at compile time by providing Vue config with const VUE_APP_MAIN_WORKER = require("path to/workers.json").worker and using this env constant.
Now that we have the name of the main worker file, we can do new Worker("main worker file path we've got from webpack").
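A rough sketch of that runtime lookup (the URL prefix and the message shape are assumptions, not the exact code from my project):
// main app code - assumes workers.json was copied next to the worker bundles
fetch("/js/workers/workers.json")
  .then((res) => res.json())
  .then((assets) => {
    // per the build script above, assets looks like {"worker": "worker.<hash>.js", ...}
    const worker = new Worker("/js/workers/" + assets.worker);
    // app-specific message telling the main worker which module to load
    worker.postMessage({ name: "sodium" });
  });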
The main worker file contains the function that statically references other modules and dynamically loads them. This way Webpack knows what to bundle and how to split the code.
enum WorkerName {
  sodium = "sodium",
  socket = "socket"
}

function importModule(name: WorkerName): Promise<any> {
  switch (name) {
    case WorkerName.sodium:
      return import(
        /* webpackChunkName: "sodium" */
        "workers/sodium"
      );
    case WorkerName.socket:
      return import(
        /* webpackChunkName: "socket" */
        "workers/socket"
      );
  }
}
Use the postMessage/message event API to tell your main worker code what to load.
const messageHandler = (e: MessageEvent) => {
  // here goes app-specific implementation of events
  // that gets you the moduleName in the end
  importModule(moduleName).then((work) => {
    // do smth
  });
};
Now, to the correct answer.
To achieve the following:
Using webworkers
Using both dynamic imports and normal imports in webworker code
Sharing code between webworkers and main app
I had to add a separate rule for worker-loader in vue.config.js and also to add babel-loader (a rough sketch of the rule is below). It took me some time to find the correct solution, but I dropped the previous one (in my other answer) in the end. I still use a separate tsconfig.json for main and for the webworkers.
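The rule sketch, added via chainWebpack; the test pattern and chaining details here are from memory, so treat them as an approximation rather than my exact config:
// vue.config.js - sketch of a worker-loader rule added via chainWebpack
module.exports = {
  chainWebpack: (config) => {
    config.module
      .rule('worker')
      .test(/\.worker\.ts$/)   // worker entry files named *.worker.ts
      .use('worker-loader')
      .loader('worker-loader')
      .end();
    // babel-loader was also needed in my setup (omitted here for brevity)
  }
};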
What I'm still not happy with is that vue-cli (or rather the fork-ts-checker plugin) doesn't seem to know the webworker-specific types in my worker classes (so I can't use DedicatedWorkerGlobalScope, for instance).

Is there a way to make Webpack 2 ignore require (CommonJS) calls? To not transpile them?

I have code that has
require(...)
Can we make Webpack ignore that and not touch it, so that I can use the runtime RequireJS from my Webpack bundle?
The reason is that I'd like to have the new code in my app compiled with webpack, and allow it to use RequireJS to pull in my old code, which already uses RequireJS calls with a bunch of hacks that are way too difficult to port to modern modules. I just want to import the old code the old way while writing new code with Webpack.
How can we do this?
Put your new code inside a folder and point webpack at it, or do the opposite: put your legacy code in a folder, or rename each legacy file to file.legacy.js, and then use the IgnorePlugin:
var ignore = new webpack.IgnorePlugin(/legacy\.js$/);

module.exports = {
  // other options go here
  plugins: [ignore]
};

Can I achieve this with Gulp + Browserify? Do I need Webpack instead?

I've been using Gulp for a while now. Then I was introduced to Browserify. Now Webpack is on my horizon given I'm moving towards React. I digress...
I'd like to achieve these steps:
require front end dependencies, such as jQuery and Backbone, in a node application so as to have one source of truth - npm.
Concatenate those dependencies (in whatever order I choose) into a dependencies.js file suitable for the browser (Browserify, right?).
Concatenate the above file with my own JavaScript files (in whatever order I choose) into an all.min.js file, minified, wrapped in a self-executing anonymous function - (function(){ /*all code here*/})() - so as to avoid any global variables / variables on the window object / global pollution. (<-- this one is the key).
I'd love to be able to handle this all in just Gulp as I'm used to it, but the whole global pollution thing is killing me. Take a look at the gulp file:
var gulp = require('gulp');
var concat = require('gulp-concat-util'); // Makes concat.header, concat.footer available.
var uglify = require('gulp-uglify');
var browserify = require('browserify');
var source = require('vinyl-source-stream');
var buffer = require('vinyl-buffer');

gulp.task('browserify', function() {
  return browserify('libraries.js') // File containing a list of require calls.
    .bundle()
    .pipe(source('dependencies.js')) // Concatenated file.
    .pipe(buffer())
    .pipe(gulp.dest('./dev/dependencies')); // Destination directory.
});

gulp.task('scripts', function() {
  return gulp.src([
      'dev/dependencies/dependencies.js',
      'dev/js/models/**/*.js', // Models before views & collections.
      'dev/js/**/!(app|templates)*.js',
      'dev/js/templates.js',
      'dev/js/app.js'
    ])
    .pipe(concat('all.min.js')) // Final file.
    .pipe(concat.header('(function(){')) // Attempts to wrap in a SEAF.
    .pipe(concat.footer('\n})();'))
    .pipe(uglify())
    .pipe(gulp.dest('public'));
});
All of that code will indeed produce a single file, minified, wrapped in a SEAF, but the libraries I required (say Backbone & jQuery) are still accessible in the global scope. My life! Is this just the way it works? Those libraries are attaching themselves to window (I looked in the code) but I thought maybe some Browserify magic could deal with that. Any thoughts are appreciated!
A lot of modules will use UMD (Universal Module Definition) patterns. One of the patterns is to always declare the module as a global resource so that other libraries that do not use a module loader will have the library available globally. Having single libraries globally registered is not terribly bad as long as they do not collide. Since you are wrapping your app code in an IIFE and hopefully not adding to the window object directly the globals should be limited to just the 3rd party libraries.
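A simplified version of that pattern looks something like this (illustrative only, not taken from any particular library):
// Simplified UMD-style wrapper that always registers a global
(function (root, factory) {
  var lib = factory();
  if (typeof define === 'function' && define.amd) {
    define(function () { return lib; });   // register with an AMD loader if present
  } else if (typeof module === 'object' && module.exports) {
    module.exports = lib;                  // register as a CommonJS module (Browserify)
  }
  root.MyLib = lib;                        // ...and always attach a global as well
}(typeof window !== 'undefined' ? window : this, function () {
  return { /* library code */ };
}));
This is why libraries like jQuery and Backbone can still show up on window even though Browserify wraps each module in its own function scope.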
I'm using webpack for processing js and css and I'm pretty happy with it.
All library dependencies are installed with npm install and saved in package.json. As a result I have a bunch of minified files that contain my whole app. JS modules are stored in function scopes, so the global scope is not polluted. Easy to maintain, easy to understand and track all dependencies used in each module.
I'll give you some basic example.
File: webpack.config.js - configuration for webpack
module.exports = {
  // entrypoint for webpack to start processing
  entry: {
    bundle: './js/main.js'
  },
  output: {
    path: __dirname + '/' + 'assets',
    filename: '[name].js',
    publicPath: '/assets/'
  }
};
File: ./js/main.js - entrypoint for webpack
// This is a require from node_modules
var $ = require('jquery');
// this is a require for my local module
var MyModule1 = require('./modules/module1.js');
File: ./modules/module1.js - local module
var OtherLib = require('other-lib');
...
// exporting some data for others
module.exports = {
  // export object
};
Pros:
One tool for everything
Easy dependency resolving and installing
Bunch of plugins/loaders from community https://webpack.github.io/docs/list-of-loaders.html
Easy to use out of the box
Cons that I faced:
webpack is a JS preprocessor, so if you need the processed CSS as a separate file (not inside the minified JS), you have to extract the CSS from the JS result. So I'm using ExtractTextPlugin; a rough sketch of that setup follows this list.
It is quite slow while processing lots of SASS files in --watch mode.
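The extraction setup looks roughly like this (webpack 1 style, loader names from memory, so treat the details as assumptions rather than a verified config):
// webpack.config.js - sketch of extracting CSS with ExtractTextPlugin (webpack 1)
var ExtractTextPlugin = require('extract-text-webpack-plugin');

module.exports = {
  entry: { bundle: './js/main.js' },
  output: { path: __dirname + '/assets', filename: '[name].js' },
  module: {
    loaders: [
      {
        test: /\.scss$/,
        // first argument is the fallback loader, second is the loader chain to extract
        loader: ExtractTextPlugin.extract('style-loader', 'css-loader!sass-loader')
      }
    ]
  },
  plugins: [
    // emits assets/bundle.css alongside the JS bundle
    new ExtractTextPlugin('[name].css')
  ]
};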

Migrating node code base to TypeScript: global scope?

I'm trying to migrate a large node codebase to TypeScript. To make things easier, I just want to start by renaming the .js files to .ts files, fixing any semantic issues without much refactoring, and then gradually improving the code base.
Let's consider the following example:
logger.js:
exports = function(string) {
  console.log(string);
}
moduleA.js:
var logger = require('./logger')
exports.someAction = function() {
  logger.log("i'm in module a")
}
moduleB.js:
//var logger = require('./logger')
exports.someOtherAction = function() {
  logger.log("i'm in module B")
}
moduleC.js:
var moduleA = require('./moduleA');
var moduleB = require('./moduleB');
exports.run = function(string) {
  moduleA.someAction();
  moduleB.someOtherAction(); // exception here - logger not defined
}
So if I execute moduleB.someOtherAction() I'll get an exception, because logger is not defined in scope in moduleB.js.
But TypeScript compiles just fine, because logger is declared in moduleA, and this is because (if I understand correctly) TypeScript treats all of those files as a single compilation unit.
So is there any way to avoid this without much refactoring?
Update
I've created a sample project which can be found here
If I run the TypeScript compiler, I get no errors, even though logger is commented out in moduleB.ts:
g#w (master) ~/projects/ts-demo: gulp generate
[10:39:46] Using gulpfile ~/projects/ts-demo/gulpfile.js
[10:39:46] Starting 'generate'...
[10:39:46] Starting 'clean'...
[10:39:46] Finished 'clean' after 11 ms
[10:39:46] Starting '<anonymous>'...
[10:39:47] Finished '<anonymous>' after 1.4 s
[10:39:47] Finished 'generate' after 1.41 s
g#w (master) ~/projects/ts-demo:
Update 2
OK, so this is expected behaviour, as stated in the TypeScript Deep Dive book:
If you now create a new file bar.ts in the same project, you will be allowed by the TypeScript type system to use the variable foo from foo.ts as if it was available globally
Louy is right. Every file (when compiling for node with CommonJS) is compiled as its own module, just like the normal case in node.
So to get this to work you could do something like this:
logger.ts
export default (str: string) => console.log(str);
moduleA.ts
import logger from './logger';
export var someAction = () => {
  logger("i'm in module a");
}
moduleB.ts
import logger from './logger';
export var someOtherAction = () => {
  logger("i'm in module B");
}
moduleC.ts
import { someAction } from './moduleA';
import { someOtherAction } from './moduleB';
someAction();
someOtherAction();
I also used a tsconfig.json:
{
  "compilerOptions": {
    "module": "commonjs",
    "noImplicitAny": true
  },
  "files": [
    "logger.ts",
    "moduleA.ts",
    "moduleB.ts",
    "moduleC.ts"
  ]
}
Compiling this (just type tsc in the folder containing the tsconfig.json) results in one .js file per .ts file. It should compile and run just fine. The main thing you need to take from this is that each file is its own self contained module when compiling for node/CommonJS.
EDIT Due to edits in question:
OK, after looking at your code again, there are a couple of reasons why this compiles but fails to run properly:
If you send a group of files to the compiler, all references in all of those files, and all of the variables in the global scope, will be available in all .ts files sent to the compiler. This is confusing and sometimes regrettable, but at the moment it is how things work. In your case this means that the require function defined in the global scope in node.d.ts is available in all .ts files, not only in moduleC.ts; if the reference is removed from moduleC.ts you will get an error in logger.ts, for example, because the require function is no longer defined. It also means that var logger = .., defined in moduleA.ts, will be seen as available in the other .ts files; TypeScript assumes that it will be available at run time. Not ideal when compiling for node, but not a problem when you are writing actual TypeScript rather than trying to compile pure JavaScript. In fact, because it is pure JavaScript, the compiler does not recognise it as a node module and treats it as a generic JavaScript file.
When compiling with commonjs, each file is compiled on its own by the TypeScript compiler. If any references or imports are found, those will be followed and used to type and verify the code, of course, but the compilation is still done on a per-file basis (var something = require('./whatever'); is not an import; it is interpreted by the compiler as a variable being assigned the result of a function call that takes a string as an argument.)
So that's why it compiles; let's go on to why it's not working.
Each file outputted by the compiler in commonjs mode is its own "node module". If you want to have access to one module from within another in node (this has nothing to do with TypeScript really, it is just how node works), you will need to require that file. In moduleB.ts (and in moduleB.js for that matter) there is no indication to node to say what logger is. So you simply need to require logger.
In the example you are referring to in your second update of the question, commonjs is not being used, and furthermore it is in connection with TypeScript's internal modules, not node modules, which are a completely different thing.
So the simple answer is that if you want to use one file from within another in node, you have to require it. Simply uncommenting the commented-out require in moduleB.ts should get everything up and running.
In the current state of your code the TypeScript compiler is not helping you much, because it can't really. The code is close to pure JavaScript; to get decent help from the compiler you will need to start converting it to TypeScript. The first step might be to replace var asdf = require("") with import asdf = require("");. This will make it possible for the compiler to check the references and help you a lot in return.
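For example, a sketch of that first step for the logger module (illustrative, not your exact code):
// moduleB.ts - before: plain javascript, require is just a function call and nothing is checked
// var logger = require('./logger');

// moduleB.ts - after: an external-module import that the compiler resolves and type-checks
import logger = require('./logger');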
I realise this is a wordy answer, but you managed to create yourself a confusing and difficult to explain behaviour.
There's something called internal/external modules in TypeScript.
Internal modules can declare stuff that can be referenced everywhere in the project. Think about typings/lib files: you should be able to reference things like Error and String everywhere in your project.
External modules on the other hand have to be imported to be referenced. This is what you want to do.
The difference between internal and external modules is that external ones use import and export.
Internal ones use declare module/namespace/var.
Just add "compilerOptions": {"module": "commonjs"} to your tsconfig.json file to enable external modules.

Require JS files dynamically on runtime using webpack

I am trying to port a library from grunt/requirejs to webpack and stumbled upon a problem that might be a game-breaker for this endeavor.
The library I'm trying to port has a function that loads and evaluates multiple modules (based on filenames that we get from a config file) into our app. The code looks like this (coffee):
loadModules = (arrayOfFilePaths) ->
  new Promise (resolve) ->
    require arrayOfFilePaths, (ms...) ->
      for module in ms
        module ModuleAPI
      resolve()
The require here needs to be called at runtime and behave like it did with RequireJS. Webpack seems to only care about what happens at build time.
Is this something that webpack fundamentally doesn't care about? If so, can I still use requireJS with it? What is a good solution to load assets dynamically during runtime?
edit: loadModules can load modules that are not present at build time of this library. They will be provided by the app that implements my library.
So I found that my requirement to have some files loaded at runtime, which are only available at "app compile time" and not at "library compile time", is not easily met with webpack.
I will change the mechanism so that my library doesn't require the files anymore, but needs to be passed the required modules. Something tells me this is going to be the better API anyway.
edit to clarify:
Basically, instead of:
# in my library
load = (path_to_file) ->
  (require path_to_file).do_something()

# in my app (using the 'compiled' library)
cool_library.load("file_that_exists_in_my_app")
I do this:
# in my library
load = (module) ->
  module.do_something()

# in my app (using the 'compiled' library)
module = require("file_that_exists_in_my_app")
cool_library.load(module)
The first version worked in RequireJS but not in webpack.
In hindsight I feel it's pretty wrong to have a third-party library load files at runtime anyway.
There is a concept named context (http://webpack.github.io/docs/context.html); it allows you to make dynamic requires.
There is also the possibility to define code-split points: http://webpack.github.io/docs/code-splitting.html
function loadInContext(filename) {
  return new Promise(function(resolve) {
    require(['./' + filename], resolve);
  });
}

function loadModules(namesInContext) {
  return Promise.all(namesInContext.map(loadInContext));
}
And use it like following:
loadModules(arrayOfFiles).then(function(modules) {
  modules.forEach(function(module) {
    module(moduleAPI);
  });
});
But it is likely not what you need: you will have a lot of chunks instead of one bundle with all the required modules, and it would likely not be optimal.
It is better to define the module requires in your config file and include it in your build:
// modulesConfig.js
module.exports = [
  require(...),
  ....
]

// run.js
require('./modulesConfig').forEach(function(module) {
  module(moduleAPI);
});
You can also try using a library such as this: https://github.com/Venryx/webpack-runtime-require
Disclaimer: I'm its developer. I wrote it because I was also frustrated with the inability to freely access module contents at runtime. (in my case, for testing from the console)
