Can I minify without uglifying with webpack - javascript

What I want to do is simple: I want to minify my code without uglifying it. I want to do this because I am building a node module which I need to use in different environments.
My configuration is simple and standard. I just don't know how to minify without uglifying.
This is what I got:
Files:
src
- index.js
- Dog.js
dist
- main.js
webpack.config.js
module.exports = {
target: 'node',
mode: 'production',
};
index.js
const Dog = require("./Dog");
module.exports = {
Dog
}
Dog.js
class Dog{
//Typical Dog stuff
}
module.exports = Dog;
According to the following link, minifying does increase performance:
Does it make sense to minify code used in NodeJS?

Per request of the OP, rounding up the comments in an answer.
For the reader, I think it is important to clarify that even though JavaScript is widely (and correctly) known as an interpreted language by nature, browsers and certain other platforms compile it to native code for performance reasons. Read more about it here. Node.js is also built on V8: What is the relationship between Node.js and V8?
The reason it is common practice to minify client-side code is that those files are transferred over the wire, which is where we have significant overhead. For server-side code, the file size will only affect the compilation time.
There used to be a rule in V8 that hard-stopped inlining a function if the function body was longer than (I believe) 600 characters, but this has been removed post Node 8.3+. See kibubi's answer in this question for the commit that removed this limit:
Does removing comments improve code performance? JavaScript
You can read more about the new V8 optimizations here
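For completeness, since the answer above only discusses whether server-side minification is worth it: one common way to minify without mangling (uglifying) names is to keep webpack's minimizer but switch mangling off. A rough sketch, assuming webpack 4+ with terser-webpack-plugin installed (this is not the OP's confirmed setup, just an illustration):
// webpack.config.js
const TerserPlugin = require('terser-webpack-plugin');
module.exports = {
  target: 'node',
  mode: 'production',
  optimization: {
    minimize: true,
    minimizer: [
      new TerserPlugin({
        terserOptions: {
          compress: true, // remove dead code and shorten expressions
          mangle: false   // keep original variable and function names
        }
      })
    ]
  }
};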

Related

How do export and import work in TypeScript?

I was going through the Angular 2 quickstart tutorial in both JavaScript and TypeScript. In the JavaScript version I observed that components and modules are first assigned to a variable (window.app, which I understood to be some global variable that can be accessed across JS files or script blocks), and that is fine. Coming to the TypeScript version, just export and import were used; I tried to analyze the generated JavaScript code but understood nothing. Can someone explain to me how export and import work in TypeScript?
Import and export in TypeScript are explained well by the documentation here: https://www.typescriptlang.org/docs/handbook/modules.html.
Like toskv said in his comment, how those statements in your TypeScript files get transpiled into statements in your JavaScript files depends largely on the module system you set up in your tsconfig.json file.
For example, setting "module": "commonjs" will cause the TypeScript compiler (tsc) to transform your import/export statements into essentially node.js-style require() statements. This documentation has a few simple, but helpful, examples of how node.js modules work: https://nodejs.org/api/modules.html.
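To make that concrete, here is a rough before/after sketch for a tiny hypothetical module compiled with "module": "commonjs" (the exact output varies by TypeScript version):
// TypeScript source (hypothetical):
//   import { greet } from "./greeter";
//   greet("world");
//
// Roughly what tsc emits as JavaScript:
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
var greeter_1 = require("./greeter");
greeter_1.greet("world");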
Using a setting of "system" instead of "commonjs" will make TypeScript translate your import/export statements into a format that SystemJS understands, though I am no expert on SystemJS.
This process is further complicated by the fact that Angular 2 projects also require build steps that take the transpiled JavaScript files and turn them into packaged "bundles." These bundled files are (depending on your configuration settings) concatenated, minified, and perhaps even uglified. So looking at the final javascript code that is run is really not helpful, as it was not written by humans.
For example, the Webpack build system (google webpack.js) takes require() statements it finds in JavaScript code and does some magic to wrap each module in its own __webpack_require__ function, which allows the build system to take your whole project file structure and bundle it in to one or several JavaScript files which still maintain their dependencies on each other.
In other words, by the time you look at the production JavaScript code, it's not meant to be intelligible by human readers. The flow can be simply represented by TS Source Code > TS Transpilation into JS Code > Module/Dependency Build Steps into Production JS Code.
TL;DR TypeScript doesn't actually handle the module importing/exporting. During transpilation, it converts those statements into statements other module systems (node.js or SystemJS) can understand, which are in turn converted into production code for serving an Angular 2 application.

How to combine all js files?

I have more than a few external JavaScript files in my HTML documents which I'd like to combine so things don't get crowded. Is there any way to combine my JS files? For example:
my files:
a.js
b.js
c.js
d.js
and I want:
all.js
Take a look at requirejs.org and especially look at r.js (http://requirejs.org/docs/optimization.html)
a.js
var i=0;
function fun1()
{
...
}
b.js
var k=0;
function fun2()
{
...
}
all.js (just copy and paste, like you would with CSS):
var i=0;
function fun1()
{
...
}
var k=0;
function fun2()
{
...
}
Take care of semicolons and closing braces, particularly when you write the whole script inside an event listener, especially 'DOMContentLoaded':
document.addEventListener('DOMContentLoaded', function()
{
//whole big script
}
);
instead define the handler first and pass a reference to it:
var some_function = function(){ /* bla bla bla */ };
document.addEventListener('DOMContentLoaded', some_function);
A simple bash cat operation will do what you want (cat a.js b.js c.js d.js > all.js), but at some stage you're probably going to want more, right?
Grunt and grunt-contrib-concat is a good starter, but you'll quickly realise grunt is not particularly good. To summarise usage: you create a gruntfile, install a few dependencies (i.e. install grunt and its command line interface; this is easy) and run grunt from your project root. It then parses the gruntfile to find out what you want it to do, and it does it. Pretty simple, and simple is good.
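For reference, a minimal gruntfile for this question might look something like the sketch below (assuming grunt and grunt-contrib-concat are installed locally; the file names are the ones from the question):
// Gruntfile.js
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      dist: {
        src: ['a.js', 'b.js', 'c.js', 'd.js'],
        dest: 'all.js'
      }
    }
  });
  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.registerTask('default', ['concat']); // run with: grunt
};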
Next up is Gulp, which is a nice build system using streams, so slightly more complex (well, easier and more powerful, but streams can be kind of confusing at first). Gulp works in the same way, only it parses a gulpfile for instructions. For a concat operation the actual gulp pipeline is trivial (using the gulp-concat plugin; note that gulp.dest takes a directory, not a file name):
var concat = require( 'gulp-concat' );
gulp.src( '*.js' )
    .pipe( concat( 'all.js' ) )
    .pipe( gulp.dest( 'dist' ) );
Between the .src and the .dest you can pipe the files through multiple transforms, such as minifying, transpiling, notifying—the list of plugins and modules is dizzying (as it is for grunt).
However, if you’re a fan of node and npm (you probably should be) then you can use npm scripts to create a build system. npm is the node package manager and requires a package.json to give some clues as to how to work. Part of that json specification is a scripts block
"scripts": {
"build" : "cat *.js > all.js"
}
You can then use npm run build from the command line, whereby npm will parse the package.json and execute the script using bash (sh actually).
Note that these are build systems, and there are many others.
There are also other packagers (which you would probably use as part of your build system, although for some projects they are your entire build system), but they are more complex than your needs. For your own research, browserify, webpack and jspm are all excellent (bear in mind AMD modules lost, so require.js is probably not worth your time), although this area is becoming congested. Each of these is a very powerful modularisation tool, but they will require some changes to how you structure your code. If you are serious about modularisation then they are worth your time learning.
On a slightly different tangent, there is some discussion about whether one large file is actually more beneficial than a number of smaller scripts. In many cases simply serving a few small files is actually quicker, and may be easier, although there can be other benefits of smashing code together. Currently it is probably still best to concat at least into fewer HTTP requests, but this requirement for performance is going away.
This might be helpful: https://github.com/mrclay/minify
OR
create a file all.js and paste in the code from a.js, b.js, c.js and d.js.
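If you would rather not install anything at all, a tiny Node script can do the copy-paste for you (a sketch; the file list and output name are taken from the question):
// concat.js - run with: node concat.js
var fs = require('fs');
var files = ['a.js', 'b.js', 'c.js', 'd.js'];
var combined = files.map(function (file) {
  return fs.readFileSync(file, 'utf8');
}).join(';\n'); // the semicolon guards against files that end without one
fs.writeFileSync('all.js', combined);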

Multiple flavors / targets with Browserify in single JavaScript codebase

I'm working on multiple browser extension / add-ons that usually need to work at least in Chrome and Firefox (sometimes in Safari as well).
The biggest issue is staying DRY while, on the other hand, keeping the source clean.
Conceptually the project usually has following parts:
background script for Chrome
background script for Firefox
common background code
content script for Chrome
content script for Firefox
common content script code
other scripts (for example: option page).
To reduce code duplication I have a single content script for both browsers and preprocess
it (removing other-browser-specific parts) during the build process.
Unfortunately this makes the content scripts really long and ugly (and hard to lint).
I would like to use Browserify basically for the whole JS code in my projects.
Still to do it I need a solution to handle this kind of flow:
Browser specific entry script -> cross-browser code -> browser specific low-level code.
I would imagine this kind of hierarchy:
- Entry scripts
- Browser A
- Browser B
- ...
- Common code
- Low-level code
- Browser A
- Browser B
- ...
So, for example, during the build process I would like Browserify to take an entry script for browser A, then bundle it together with the common code and with low-level code for browser A only. This is to be done without this kind of switching in the common code:
if(isBrowserA()) {
var lowLevelModule = require("../lowLevel/browserA/module");
} else {
var lowLevelModule = require("../lowLevel/browserB/module");
}
I would like the build process with Browserify to do exactly that for me -- replace the "root path" of low level code depending on the target.
Hacking it around with package.json wouldn't work because I need a flexible number of targets (and possibly an even deeper dependency tree).
Try using the factor-bundle or partition-bundle Browserify plugins. They both help split code into different entry files and a common modules file. partition-bundle also includes scripts that enable asynchronous loading of your different bundles.
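A minimal factor-bundle sketch, adapted from its README (the entry paths below are placeholders for your per-browser entry scripts):
var browserify = require('browserify');
var fs = require('fs');
// one entry per browser; modules shared by both end up in common.js
var entries = ['./entry/browserA/main.js', './entry/browserB/main.js'];
var b = browserify(entries);
b.plugin('factor-bundle', { outputs: ['./build/browserA.js', './build/browserB.js'] });
b.bundle().pipe(fs.createWriteStream('./build/common.js'));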
One possible way to do this would be to drop the if/else require() calls from your code, using a fixed path instead.
var lowLevelModule = require("../lowLevel/module")
Then run a separate build for each browser, using Browserify's expose option to change what the path resolves to for each build.
So in a gulpfile.js (just for example, the browserify API is the important bit - you could do the same from a shell using -r and -x flags to browserify, using : to separate the require/expose values), run the build once for each browser, passing in a different --browser= arg each time.
var browserify = require('browserify')
var gulp = require('gulp')
var gutil = require('gulp-util')
var source = require('vinyl-source-stream')
var browser = gutil.env.browser // browserA, or browserB, or...
// You might want to configure paths up-front separately, just
// hardcoding below for brevity.
gulp.task('bundle-app', function() {
var b = browserify('./entry/' + browser + '/module', {detectGlobals: false})
b.require('./path/to/lowLevel/' + browser + '/module', {expose: '../lowLevel/module'})
return b.bundle()
.pipe(source('app.js'))
.pipe(gulp.dest('./build'))
})
For common dependencies, you could bundle them into an external file and add external calls to your browser-specific bundles:
var commonModules = ['module1', 'module2']
gulp.task('bundle-common', function() {
var b = browserify({detectGlobals: false})
commonModules.forEach(function(module) {
b.require(module)
})
return b.bundle()
.pipe(source('common.js'))
.pipe(gulp.dest('./build'))
})
gulp.task('bundle-app', function() {
var b = browserify('./entry/' + browser + '/module', {detectGlobals: false})
commonModules.forEach(function(module) {
b.external(module)
})
b.require('./path/to/lowLevel/' + browser + '/module', {expose: '../lowLevel/module'})
return b.bundle()
.pipe(source('app.js'))
.pipe(gulp.dest('./build'))
})
I usually put chains of builds in a package.json script for convenience:
"scripts": {
"build": "gulp bundle-common && gulp bundle-app --browser=browserA && gulp bundle-app --browser=browserB"
}
Finally:
npm run build
This is not something Browserify supports out of the box, but you could in theory achieve what you want if you write a custom source code transform.
As an alternative, the RaptorJS Optimizer provides the exact same feature that you are looking for out-of-the-box. (disclaimer: I am the author of this tool and it is the tool we use at eBay for all our Node.js applications) The RaptorJS Optimizer allows you to remap one module to another based on a set of arbitrary flags that are enabled during optimization. We use this feature at eBay to conditionally send down different code for different web browsers, devices, experimentation groups, etc. For more details on that feature, please see:
https://github.com/raptorjs/optimizer-require#conditional-remap
https://github.com/raptorjs/optimizer#conditional-dependencies
FYI, the RaptorJS Optimizer supports all of the features of Browserify, plus it provides support for non-JS dependencies, async loading, conditional dependencies, dynamic requires, etc. It is still very modular like Browserify and can be extended via plugins to teach it how to handle new dependency types. Unlike Webpack, the RaptorJS Optimizer does not overload the CommonJS module loading system, so code will still run under Node.js and in the web browser. We have had a lot of success with the RaptorJS Optimizer at eBay (and other companies), so I encourage you to check it out.

Generating source maps for multiple concatenated javascript files compiled from Coffeescript

Has any one had any success with this?
I think it's more or less an unsolved problem:
https://github.com/jashkenas/coffee-script/issues/2779. The last meaningful comment was from jwalton, a month ago.
Still, it doesn't seem rocket science to add support for it, so it will probably come soon.
Michael Ficarra (creator of CoffeeScript Redux) suggested using https://github.com/michaelficarra/commonjs-everywhere .
Two caveats:
It only works for bundling CommonJS modules.
It uses CoffeeScript Redux, which is still in beta (although working quite well it seems), and not 100% compatible with original CoffeeScript compiler.
So this does not work for what you ask for specifically, "concatenation".
Added April 14
You might have luck with these: combine-source-map and/or generate-sourcemap, both by the same author.
Added April 26
This looks really simple: https://npmjs.org/package/mapcat . You just have to feed it the individual source map files generated by the coffee compiler.
Added May 16
Mariusz Nowak has just released webmake-coffee. Like CommonJS Everywhere, it requires code to be organized as CommonJS modules. Unlike CommonJS everywhere, it uses regular CoffeeScript.
It also seems the Grunt Coffee-Script plugin has had source-map support for concatenated files for quite a while (two months), effectively proving my original answer to be incorrect.
The upcoming version 2.0 of Snockets will have support for it too.
I ended up going with browserify using coffeeify as the transform option, and enabling browserify's debug option. I bundle up the app on each request for my main.js file, and any runtime errors show up in my original source with pretty decent accuracy.
Sure beats mapping runtime errors in the concatenated/compiled js back to the coffee source with my eyeballs!
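For anyone wanting to reproduce that setup, a rough sketch (assuming browserify and coffeeify are installed; the paths are only illustrative):
var browserify = require('browserify');
var fs = require('fs');
browserify('./src/main.coffee', {
    extensions: ['.coffee'],
    debug: true // emit inline source maps so runtime errors point at the .coffee source
  })
  .transform('coffeeify')
  .bundle()
  .pipe(fs.createWriteStream('./public/main.js'));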
I needed to annotate AngularJS code before minification, but grunt-ng-annotate didn't accept input source maps, so I would not have been able to use the maps generated by the CoffeeScript compiler.
Apparently, with gulp-sourcemaps this is not an issue:
var gulp = require('gulp');
var $ = require('gulp-load-plugins')(); // loading gulp plugins lazily
// remember to include them in the package.json
gulp.task('appJS', function() {
// concatenate compiled .coffee files and js files into build/app.js
gulp.src(['./app/**/*.js','./app/**/*.coffee'])
.pipe($.sourcemaps.init())
.pipe($['if'](/[.]coffee$/, $.coffee({bare: true}).on('error', $.util.log)))
.pipe($.concat('app.js'))
.pipe($.ngAnnotate())
.pipe($.uglify())
.pipe($.sourcemaps.write())
.pipe(gulp.dest('./build'))
});
The same approach works in other situations, too. In my case, this is the only approach that worked.
I have written a grunt task that does this flawlessly. Check it out.

Node-style require for in-browser javascript? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Are there any libraries for in-browser javascript that provide the same flexibility/modularity/ease of use as Node's require?
To provide more detail: the reason require is so good is that it:
Allows code to be dynamically loaded from other locations (which is stylistically better, in my opinion, than linking all your code in the HTML)
It provides a consistent interface for building modules
It is easy for modules to depend on other modules (so I could write, for instance, an API that requires jQuery so I can use jQuery.ajax())
Loaded javascript is scoped, meaning I could load with var dsp = require("dsp.js"); and I would be able to access dsp.FFT, which wouldn't interfere with my local var FFT
I have yet to find a library that does this effectively. The workarounds I tend to use are:
coffeescript-concat -- it's easy enough to require other js, but you have to compile it, which means it is less great for fast development (e.g. building APIs in-test)
RequireJS -- It's popular, straightforward, and solves 1-3, but lack of scoping is a real deal-breaker (I believe head.js is similar in that it lacks scoping, though I've never had any occasion to use it. Similarly, LABjs can load and .wait() does mollify dependency issues, but it still doesn't do scoping)
As far as I can tell, there appear to be many solutions for dynamic and/or async loading of javascript, but they tend to have the same scoping issues as just loading the js from HTML. More than anything else, I would like a way to load javascript that does not pollute the global namespace at all, but still allows me to load and use libraries (just as node's require does).
2020 UPDATE: Modules are now standard in ES6, and as of mid-2020 are natively supported by most browsers. Modules support both synchronous and asynchronous (using Promise) loading. My current recommendation is that most new projects should use ES6 modules, and use a transpiler to fall back to a single JS file for legacy browsers.
As a general principle, bandwidth today is also typically much wider than when I originally asked this question. So in practice, you might reasonably choose to always use a transpiler with ES6 modules, and focus your effort on code efficiency rather than the network.
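As a quick illustration of the native route (dsp.js here is just the hypothetical library from the question, assumed to have been written with ES module exports):
// main.js, loaded with <script type="module" src="main.js"></script>
// Imports are scoped to this module and never touch the global namespace.
import { FFT } from "./dsp.js";
const fft = new FFT(1024, 44100); // hypothetical usage

// Modules can also be loaded asynchronously with dynamic import():
import("./dsp.js").then(function (dsp) {
  // use dsp.FFT here
});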
PREVIOUS EDIT (or if you don't like ES6 modules): Since writing this, I have extensively used RequireJS (which now has much clearer documentation). RequireJS really was the right choice in my opinion. I'd like to clarify how the system works for people who are as confused as I was:
You can use require in everyday development. A module can be anything returned by a function (typically an object or a function) and is scoped as a parameter. You can also compile your project into a single file for deployment using r.js (in practice this is almost always faster, even though require can load scripts in parallel).
The primary difference between RequireJS and the Node-style require that browserify (a cool project suggested by tjameson) uses is the way modules are defined and required:
RequireJS uses AMD (Async Module Definition). In AMD, require takes a list of modules (javascript files) to load and a callback function. When it has loaded each of the modules, it calls the callback with each module as a parameter to the callback. Thus it's truly asynchronous and therefore well-suited to the web.
Node uses CommonJS. In CommonJS, require is a blocking call that loads a module and returns it as an object. This works fine for Node because files are read off the filesystem, which is fast enough, but works poorly on the web because loading files synchronously can take much longer.
In practice, many developers have used Node (and therefore CommonJS) before they ever see AMD. In addition, many libraries/modules are written for CommonJS (by adding things to an exports object) rather than for AMD (by returning the module from the define function). Therefore, lots of Node-turned-web developers want to use CommonJS libraries on the web. This is possible, since loading from a <script> tag is blocking. Solutions like browserify take CommonJS (Node) modules and wrap them up so you can include them with script tags.
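To make the difference concrete, here is the same tiny module written both ways (a sketch; the jQuery helper is made up):
// AMD (RequireJS): dependencies are listed up front and loaded asynchronously,
// then the factory callback runs with each loaded module as an argument.
define(["jquery"], function ($) {
  return {
    getJSON: function (url) { return $.ajax({ url: url, dataType: "json" }); }
  };
});

// CommonJS (Node): require() is a synchronous call that returns the module object.
var $ = require("jquery");
module.exports = {
  getJSON: function (url) { return $.ajax({ url: url, dataType: "json" }); }
};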
Therefore, if you are developing your own multi-file project for the web, I strongly recommend RequireJS, since it is truly a module system for the web (though in fair disclosure, I find AMD much more natural than CommonJS). Recently, the distinction has become less important, since RequireJS now allows you to essentially use CommonJS syntax. Additionally, RequireJS can be used to load AMD modules in Node (though I prefer node-amd-loader).
I realize there may be beginners looking to organize their code. This is 2022, and if you're considering a modular JS app, you should get started with npm and Webpack right now.
Here are a few simple steps to get started:
In your project root, run npm init -y to initialize an npm project
Download the Webpack module bundler: npm install webpack webpack-cli
Create an index.html file:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta http-equiv="X-UA-Compatible" content="ie=edge">
<title>App</title>
</head>
<body>
<script src="_bundle.js"></script>
</body>
</html>
Pay special attention to the _bundle.js file - this will be the final JS file generated by webpack; you will not modify it directly (keep reading).
Create a <project-root>/app.js in which you will import other modules:
const printHello = require('./print-hello');
printHello();
Create a sample print-hello.js module:
module.exports = function() {
console.log('Hello World!');
}
Create a <project-root>/webpack.config.js and copy-paste the following:
var path = require('path');
module.exports = {
entry: './app.js',
output: {
path: path.resolve(__dirname),
filename: '_bundle.js'
}
};
In the code above, there are 2 points:
entry app.js is where you will write your JS code. It will import other modules as shown above.
output _bundle.js is your final bundle generated by webpack. This is what your html will see at the end.
Open your package.json, and replace scripts with the following command:
"scripts": {
"start": "webpack --mode production -w"
},
And finally, run the script to watch app.js and generate the _bundle.js file by running: npm start.
Enjoy coding!
Check out ender. It does a lot of this.
Also, browserify is pretty good. I've used require-kiss¹ and it works. There are probably others.
I'm not sure about RequireJS. It's just not the same as node's. You may run into problems with loading from other locations, but it might work. As long as there's a provide method or something that can be called.
TL;DR- I'd recommend browserify or require-kiss.
Update:
1: require-kiss is now dead, and the author has removed it. I've since been using RequireJS without problems. The author of require-kiss wrote pakmanager and pakman. Full disclosure, I work with the developer.
Personally I like RequireJS better. It is much easier to debug (you can have separate files in development, and a single deployed file in production) and is built on a solid "standard".
I wrote a small script which allows asynchronous and synchronous loading of JavaScript files, which might be of some use here. It has no dependencies and is compatible with Node.js & CommonJS. The installation is pretty easy:
$ npm install --save @tarp/require
Then just add the following lines to your HTML to load the main-module:
<script src="/node_modules/@tarp/require/require.min.js"></script>
<script>Tarp.require({main: "./scripts/main"});</script>
Inside your main-module (and any sub-module, of course) you can use require() as you know it from CommonJS/NodeJS. The complete docs and the code can be found on GitHub.
A variation of Ilya Kharlamov's great answer, with some code to make it play nice with the Chrome developer tools.
//
///- REQUIRE FN
// equivalent to require from node.js
function require(url){
if (url.toLowerCase().substr(-3)!=='.js') url+='.js'; // to allow loading without js suffix;
if (!require.cache) require.cache=[]; //init cache
var exports=require.cache[url]; //get from cache
if (!exports) { //not cached
try {
exports={};
var X=new XMLHttpRequest();
X.open("GET", url, 0); // sync
X.send();
if (X.status && X.status !== 200) throw new Error(X.statusText);
var source = X.responseText;
// fix (if saved form for Chrome Dev Tools)
if (source.substr(0,10)==="(function("){
var moduleStart = source.indexOf('{');
var moduleEnd = source.lastIndexOf('})');
var CDTcomment = source.indexOf('//# ');
if (CDTcomment>-1 && CDTcomment<moduleStart+6) moduleStart = source.indexOf('\n',CDTcomment);
source = source.slice(moduleStart+1,moduleEnd-1);
}
// fix, add comment to show source on Chrome Dev Tools
source="//# sourceURL="+window.location.origin+url+"\n" + source;
//------
var module = { id: url, uri: url, exports:exports }; //according to node.js modules
var anonFn = new Function("require", "exports", "module", source); //create a Fn with module code, and 3 params: require, exports & module
anonFn(require, exports, module); // call the Fn, Execute the module
require.cache[url] = exports = module.exports; //cache obj exported by module
} catch (err) {
throw new Error("Error loading module "+url+": "+err);
}
}
return exports; //require returns object exported by module
}
///- END REQUIRE FN
(function () {
// c is cache, the rest are the constants
var c = {},s="status",t="Text",e="exports",E="Error",r="require",m="module",S=" ",w=window;
w[r]=function R(url) {
url+=/.js$/i.test(url) ? "" : ".js";// to allow loading without js suffix;
var X=new XMLHttpRequest(),module = { id: url, uri: url }; //according to the modules 1.1 standard
if (!c[url])
try {
X.open("GET", url, 0); // sync
X.send();
if (X[s] && X[s] != 200)
throw X[s+t];
Function(r, e, m, X['response'+t])(R, c[url]={}, module); // Execute the module
module[e] && (c[url]=module[e]);
} catch (x) {
throw w[E](E+" in "+r+": Can't load "+m+S+url+":"+S+x);
}
return c[url];
}
})();
Better not to be used in production because of the blocking. (In Node.js, require() is a blocking call as well.)
Require-stub provides Node-compliant require in the browser, resolving both modules and relative paths. It uses a technique similar to TKRequire (XMLHttpRequest).
The resulting code is fully browserifyable, in that require-stub can serve as a replacement for watchify.
Webmake bundles Node-style modules to Browser, give it a try.
