Stop grunt-contrib-uglify from removing unused JavaScript

There is a function in my JavaScript source file that I need to keep in my distribution for posterity (specifically, for a sniffer to sniff out some information). It is never called, but it needs to stay.
The Grunt task grunt-contrib-uglify is removing this function because it is uncalled in my application.
How can I use the compression provided by grunt-contrib-uglify without it removing any code that it deems unused?
Thanks.

Set the compress option unused to false:
grunt.initConfig({
  uglify: {
    options: {
      compress: {
        unused: false
      }
    },
    my_target: {
      files: {
        'dest/output.min.js': ['src/input.js']
      }
    }
  }
});
Source: UglifyJS global definitions documentation
unused : true, // drop unused variables/functions
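If disabling unused for the whole build feels too broad, another approach (a sketch of my own, with sniffTarget as a hypothetical stand-in for the real function name) is to give the function a live reference so it is never dead code in the first place:

```javascript
// Alternative sketch: rather than turning off dead-code removal globally,
// keep one specific function alive by referencing it from a retained object.
// 'sniffTarget' is a hypothetical name for the function to preserve.
function sniffTarget() {
  return 'metadata for the sniffer';
}

// A live reference counts as a "use", so the minifier will not drop the
// function even with compress.unused left at its default (true).
var keepAlive = { sniffTarget: sniffTarget };

if (typeof window !== 'undefined') {
  // Expose it in the browser so the sniffer can find it.
  window.__keepAlive = keepAlive;
}
```

The trade-off is a tiny amount of extra code in exchange for keeping unused: true for the rest of the bundle.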

Related

Grunt watch less files in dynamic path

Good day!
How can I configure Grunt to watch a dynamic path? Let me explain: I have a path (/module/[namemodule]/assets/*.less) where the [namemodule] part changes; it can be any module name. How can I watch the assets folder of every module? The destination also changes: the output will not go to assets but to public (/module/[namemodule]/public/*.css). The truth is this is the first time I have used Grunt and I am lost.
Thanks for replying; I have this in my Gruntfile.js:
less: {
  development: {
    options: {
      compress: true,
      yuicompress: true,
      optimization: 2
    },
    files: {
      "/modules/**/public/*.css": "/modules/**/assets/*.less"
    }
  }
}
Even then the .less files do not compile; grunt returns the following message.
Please see Globbing Patterns.
Double splat ** should do what you need.
UPDATE: that is not a valid key/value pair, methinks:
files: {
  "/modules/**/public/*.css": "/modules/**/assets/*.less"
}
Try using an absolute path for the first argument and see what happens.
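For the dynamic source-to-destination mapping itself, one sketch (untested against this project; the modules root directory is an assumption) uses Grunt's files-array format with expand and a rename callback instead of a plain key/value pair:

```javascript
// Hypothetical mapping helper: rewrite '.../assets/x.css' to '.../public/x.css'
// so each module's output lands in its own public folder.
function assetsToPublic(dest, src) {
  return dest + '/' + src.replace('/assets/', '/public/');
}

// In the Gruntfile, the less target could then look like:
// less: {
//   development: {
//     files: [{
//       expand: true,
//       cwd: 'modules',             // walk every module
//       src: ['**/assets/*.less'],  // any module's assets folder
//       dest: 'modules',
//       ext: '.css',                // applied before rename is called
//       rename: assetsToPublic      // redirect output into public/
//     }]
//   }
// }
```

With expand: true, Grunt builds one src/dest pair per matched file, which is what makes the per-module destination possible.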

RequireJS optimizer and loading dynamic paths after optimization

I have a grunt task with requirejs and am running the optimizer.
I load a lot of external files which are not always needed at run time, usually I only need a handful of core files. Then based on user decisions I load certain files during run time.
Ex:
define(["backbone", 'text!data/filePaths.json'],
  function(Backbone, filePaths) {
    'use strict';
    return Backbone.Model.extend({
      initialize: function() {
        // parse the file paths, there could be a hundred here
        this.filePaths = JSON.parse(filePaths);
      },
      // dynamically add a file via this function call
      addFile: function(id) {
        var self = this;
        return new Promise(function(resolve, reject) {
          // load files dynamically based on the id passed in
          require([self.filePaths[id].path], function(View) {
            resolve(new View());
          });
        });
      }
    });
  }
);
The filePaths JSON might look like this:
"ONE": {
  "name": "BlueBox",
  "path": "views/box/Blue"
},
"TWO": {
  "name": "RedBox",
  "path": "views/box/Red"
},
etc...
The issue is that this does not work with the optimizer for me.
When I run my app with the optimized file I get:
Uncaught Error: undefined missing views/box/Red
Update:
grunt.initConfig({
  pkg: grunt.file.readJSON('package.json'),
  requirejs: {
    desktopJS: {
      options: {
        baseUrl: "public/js/app",
        wrap: true,
        // Cannot use almond since it does not currently appear to support requireJS's config-map
        name: "../libs/almond",
        preserveLicenseComments: false,
        optimize: "uglify2",
        uglify2: {
          // mangle: false,
          // no_mangle: true,
          // stats: true,
          // "mangle-props": false,
          "max-line-len": 1000,
          max_line_length: 1000
        },
        uglify: {
          mangle: false,
          no_mangle: true,
          stats: true,
          "mangle-props": false,
          max_line_length: 1000,
          beautify: true
        },
        mainConfigFile: "public/js/app/config/config.js",
        include: ["init/DesktopInit"],
        out: "public/js/app/init/DesktopInit.min.js"
      }
    },
    desktopCSS: {
      options: {
        optimizeCss: "standard",
        cssIn: "./public/css/desktop.css",
        out: "./public/css/desktop.min.css"
      }
    }
  },
Note: if I use the unoptimized version, everything works just fine.
The optimizer is unable to trace dependencies for require calls that do not have dependencies as an array of string literals. Your call is an example of a require call that the optimizer cannot process because the list of dependencies is computed at run-time:
require([self.filePaths[id].path], function(View){
The reason for this is simple: the optimizer does not evaluate your modules as it optimizes them. Besides, the number of possible values for self.filePaths[id].path is potentially infinite, so there is no way the optimizer could handle all cases. So when the optimizer processes your code, the modules that should be loaded by this require call are not included in the bundle. One solution, which you've touched upon in your own answer, is to use include to list all of the modules that could be loaded by that require call.
As you point out, if you can have hundreds of modules, this means including them all in the bundle produced by the optimizer. Is there an alternative? Yes, there is.
You can produce a bundle that includes only the other modules of your application, and leave the modules loaded by that require call to be fetched individually rather than as part of the bundle. But there is a problem with the specific configuration shown in the question. You have a comment that says you cannot use Almond, yet in fact you do use it, right there on the next line (and again in your answer). The problem is that one of Almond's restrictions is that it does not do dynamic loading; that is the very first restriction in its list of restrictions. You would have to use a full-featured AMD loader for this, such as RequireJS itself.
This is not really the answer, just a way "around" my application not working.
I believe the only way I can circumvent this is with the include configuration
Ex:
requirejs: {
  desktopJS: {
    options: {
      baseUrl: "public/js/app",
      wrap: true,
      name: "../libs/almond",
      optimize: "uglify2",
      uglify2: {
        "max-line-len": 1000,
        max_line_length: 1000
      },
      mainConfigFile: "public/js/app/config/config.js",
      include: ["init/DesktopInit", "views/box/Blue"], // include the file here, and all of a sudden the error goes away for that file
Though this becomes cumbersome when I have a hundred files. I don't want to build the whole thing with every file included; I would like a set of optimized files plus a bunch of others that can be required dynamically.
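One way to keep the include list from becoming cumbersome (a sketch I have not run against this project; the filePaths.json location is an assumption based on the question) is to generate it at build time from the same JSON the app reads at runtime:

```javascript
// Build the RequireJS 'include' array from the parsed filePaths.json, so
// every dynamically loadable view ends up in the optimized bundle without
// being listed by hand.
function buildIncludeList(baseModules, filePaths) {
  return baseModules.concat(
    Object.keys(filePaths).map(function (id) {
      return filePaths[id].path;
    })
  );
}

// In the Gruntfile (the JSON path here is an assumption):
// var filePaths = JSON.parse(
//   require('fs').readFileSync('public/js/app/data/filePaths.json', 'utf8'));
// ...
// include: buildIncludeList(['init/DesktopInit'], filePaths),
```

This still bundles everything, but at least the list maintains itself as filePaths.json grows.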

Deep, one-way synchronization of two directories using grunt-contrib-watch. Code works, but grunt-contrib-watch re-init time is too slow

I have two directories src and compiled. I would like to ensure one-way data synchronization from src to compiled with Grunt Watch. As an intermediary step, I would like to compile *.less files as well as a subset of *.js files which are written with ES6 syntax.
I've successfully written the tasks which do what I need:
// NOTE: Spawn must be disabled to keep watch running under the same context in order to dynamically modify the config.
watch: {
  // Compile LESS files to the 'compiled' directory.
  less: {
    options: {
      interrupt: true,
      spawn: false,
      cwd: 'src/less'
    },
    files: ['**/*.less'],
    tasks: ['less']
  },
  // Copy all non-ES6/LESS files to the 'compiled' directory. Include main files because they're not ES6. Exclude LESS because they're compiled.
  copyUncompiled: {
    options: {
      event: ['added', 'changed'],
      spawn: false,
      cwd: 'src'
    },
    files: ['**/*', '!**/background/**', '!**/common/**', '!contentScript/youTubePlayer/**/*', '!**/foreground/**', '!**/test/**', '!**/less/**', '**/main.js'],
    tasks: ['copy:compileSingle']
  },
  // Compile and copy ES6 files to the 'compiled' directory. Exclude main files because they're not ES6.
  copyCompiled: {
    options: {
      event: ['added', 'changed'],
      spawn: false,
      cwd: 'src/js'
    },
    files: ['background/**/*', 'common/**/*', 'contentScript/youTubePlayer/**/*', 'foreground/**/*', 'test/**/*', '!**/main.js'],
    tasks: ['babel:compileSingle']
  },
  // Whenever a file is deleted from 'src', ensure it is also deleted from 'compiled'.
  remove: {
    options: {
      event: ['deleted'],
      spawn: false,
      cwd: 'src'
    },
    files: ['**/*'],
    tasks: ['clean:compiledFile']
  }
}
grunt.event.on('watch', function(action, filepath, target) {
  // Determine which task config to modify based on the event action.
  var taskTarget = '';
  if (action === 'deleted') {
    taskTarget = 'clean.compiledFile';
  } else if (action === 'changed' || action === 'added') {
    if (target === 'copyCompiled') {
      taskTarget = 'babel.compileSingle';
    } else if (target === 'copyUncompiled') {
      taskTarget = 'copy.compileSingle';
    }
  }
  if (taskTarget === '') {
    console.error('Unable to determine taskTarget for: ', action, filepath, target);
  } else {
    // Drop src off of filepath to properly rely on 'cwd' task configuration.
    grunt.config(taskTarget + '.src', filepath.replace('src\\', ''));
  }
});
These tasks watch the appropriate files. The event handler dynamically modifies the clean, copy, and babel task configs so that they operate on the files being added/changed/removed.
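One caveat in the handler above: filepath.replace('src\\', '') assumes Windows path separators. A cross-platform variant (a small sketch) might normalize the separators first:

```javascript
// Strip the leading 'src' segment regardless of platform separator, so the
// remaining path can be resolved against each task's 'cwd' configuration.
function stripSrcPrefix(filepath) {
  return filepath.replace(/\\/g, '/').replace(/^src\//, '');
}
```

The handler would then call grunt.config(taskTarget + '.src', stripSrcPrefix(filepath)).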
However, I am watching several thousand files and the watch task takes a non-trivial amount of time to initialize. On my high-end development PC initialization takes 6+ seconds. This issue is exacerbated by the fact that the watch task reinitializes after every task.
This means that if I have two files, fileA and fileB, and I modify and save fileA, there is a 6+ second window in which watch fails to detect modifications to fileB. This results in de-synchronization between my two directories.
I found this GitHub issue regarding my problem, but it is still open and unanswered: https://github.com/gruntjs/grunt-contrib-watch/issues/443
The discussion on GitHub highlights that the issue may only occur when spawn: false has been set, but, according to the Grunt Watch documentation:
If you need to dynamically modify your config, the spawn option must be disabled to keep the watch running under the same context.
As such, I believe I need to continue using spawn: false.
I have to assume this is a pretty standard procedure for Grunt tasks. Am I missing something obvious here? Is the Watch task inappropriate for this purpose? Other options?
Alright, so I have a working solution, but it's not pretty.
I did end up using grunt-newer to assist with the solution. Unfortunately, it doesn't play well with grunt-contrib-copy, because copying a file does not update its last-modified time, so grunt-newer will execute 100% of the time.
So, I forked grunt-contrib-copy and added an option to allow updating the last modified time: https://github.com/MeoMix/grunt-contrib-copy
With that, I'm now able to write:
// NOTE: Spawn must be disabled to keep watch running under the same context in order to dynamically modify the config.
watch: {
  // Compile LESS files to the 'compiled' directory.
  less: {
    options: {
      interrupt: true,
      cwd: 'src/less'
    },
    files: ['**/*.less'],
    tasks: ['less']
  },
  // Copy all non-ES6/LESS files to the 'compiled' directory. Include main files because they're not ES6. Exclude LESS because they're compiled.
  copyUncompiled: {
    options: {
      event: ['added', 'changed'],
      cwd: 'src'
    },
    files: ['**/*', '!**/background/**', '!**/common/**', '!contentScript/youTubePlayer/**/*', '!**/foreground/**', '!**/test/**', '!**/less/**', '**/main.js'],
    tasks: ['newer:copy:compiled']
  },
  // Compile and copy ES6 files to the 'compiled' directory. Exclude main files because they're not ES6.
  copyCompiled: {
    options: {
      event: ['added', 'changed'],
      cwd: 'src/js'
    },
    files: ['background/**/*', 'common/**/*', 'contentScript/youTubePlayer/**/*', 'foreground/**/*', 'test/**/*', '!**/main.js'],
    tasks: ['newer:babel:compiled']
  },
  // Whenever a file is deleted from 'src', ensure it is also deleted from 'compiled'.
  remove: {
    options: {
      event: ['deleted'],
      spawn: false,
      cwd: 'src'
    },
    files: ['**/*'],
    tasks: ['clean:compiledFile']
  }
}
grunt.event.on('watch', function(action, filepath) {
  if (action === 'deleted') {
    // Drop src off of filepath to properly rely on 'cwd' task configuration.
    grunt.config('clean.compiledFile.src', filepath.replace('src\\', ''));
  }
});
Now copying of ES6 files, as well as non-LESS/non-ES6 files, will only occur if the file under 'src' is newer than its counterpart under 'dest'.
Unfortunately, grunt-newer doesn't really have support for syncing a delete operation when deleted from 'src'. So, I continue to use my previous code for 'delete' operations. This still has the same flaw where after a delete occurs the watch task will be defunct for a moment.

Browserify: External Hash ID Keeps Changing

I'm using Grunt-Browserify to load a library (jQuery) in one bundle and reference that library as external in other bundles.
Browserify assigns a unique hash id to the external library, and everything works fine for a single developer.
However, when a second developer runs the same Grunt task, the unique id for jQuery changes -- breaking any bundles that are still looking for it at the old "address".
Does anyone know how to control the id assigned to an external library in Browserify -- or how to prevent Browserify from using a hash id for external dependencies?
Here's my current configuration:
browserify: {
  main: {
    files: {
      './dist/main.js': ['./dev/js/main.js']
    },
    options: {
      require: ['jquery'],
      fullPaths: true,
      watch: true
    }
  },
  bundles: {
    files: {
      './dist/bundle-1.js': ['./dev/bundle-1.js'],
      // ...
    },
    options: {
      external: ['jquery'],
      fullPaths: true,
      watch: true
    }
  }
}

Grunt lint error with $

I'm trying to use lint with Grunt. I can run Grunt from the command line, but it gives me a lot of errors, mostly "'$' is not defined". Even alert throws an error: "'alert' is not defined".
How can I get around those?
You need to tell JSHint (which is the linter that Grunt uses by default) about the global variables available to the files being linted. I'm assuming that you're including jQuery on your pages, hence the $ identifier (could be various other libraries of course).
You can either specify global variables in each file, or in the Grunt script. To specify them in a file, you can use a global directive. Place this at the top of the file, or at the top of the function in which you use the global:
/*global $:false */
Note that the false means you'll get errors if you override $. If you need the ability to do that, change it to true.
If you'd prefer to specify globals in the Grunt script, you can add a globals property to any of the tasks in your jshint section. For example:
grunt.initConfig({
  jshint: {
    someTask: {
      globals: {
        $: false
      }
    }
  }
});
As for the alert message you're getting, you need to tell JSHint that you're allowing the use of development functions, such as alert and console.log. To do that, you can use a jshint directive in the files (just like the global directive):
/*jshint devel:true */
Or you can add an options property to the task in the Grunt script:
someTask: {
  globals: {
    $: false
  },
  options: {
    devel: true
  }
}
See the JSHint docs for all of the options available to you.
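Putting the two directives together, a file might start like this (a minimal sketch; it assumes jQuery is loaded on the page, so it is meant for the browser rather than as a runnable standalone script):

```javascript
/*global $:false */
/*jshint devel:true */

// With the directives above, JSHint accepts jQuery's $ and development
// calls such as alert without reporting "not defined" errors.
$(function () {
  alert('DOM ready');
});
```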
Note that globals must be inside options:
someTask: {
  options: {
    devel: true,
    globals: {
      $: false
    }
  }
}
