I'm really new to Grunt.js, and have had some luck being able to run some of the tasks I've installed (e.g. watch, uglify, jslint). As I try to run more, I often run into issues and try to google/research as much as I can in order to learn from the ground up how Grunt works.
However, I get confused by different configurations like these two for uglify:
From the GitHub Repo for grunt-contrib-uglify
uglify: {
  my_target: {
    files: {
      'dest/output.min.js': ['src/input1.js', 'src/input2.js']
    }
  }
}
and this one (which works for me in my Gruntfile.js):
uglify: {
  build: {
    src: 'js/custom-script.js',
    dest: 'js/custom-script.min.js'
  }
},
It's not really these in particular, but I notice that each uses its own words (my_target versus build, src, dest), structure, syntax, etc. I think that, since Grunt is all JavaScript, these would all be in JSON format, though I wasn't able to verify whether they are.
After doing lots of research through the Grunt documentation, going through the GitHub repositories containing the plugins, and random various tutorials, I guess I have some main questions:
Is there a standardized way to write a Gruntfile.js?
Are there any reserved words for Gruntfile.js's? I tried changing the word dest in my uglify task to gibberish, and it did fail, so my gut says yes.
If yes to either of the above two questions, where are these resources/links? I tried to google "grunt glossary" but came up empty. The only standard seems to be the one that Grunt itself supplies, but I'm having a hard time getting things to work by referencing only it.
There are multiple things at work here, and not all config keys are created equal. The reference doc is http://gruntjs.com/configuring-tasks, but here's a summary:
- most grunt tasks are what are called "multi tasks", meaning that in your build process you might call the task multiple times with different parameters. In the config, the first level under the task name is the name of the target, and it is completely free (except for options, see below). In your examples, these are the build and my_target names.
- besides these targets, you may have an options field (a reserved keyword) whose values are passed to all targets
- in the targets themselves, grunt provides some reserved keywords, for options (options) and for defining files (src, dest, files, ... see http://gruntjs.com/configuring-tasks#files)
- the task author is free to define their own keys, so the doc of each task is very important
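Putting that together with your two examples, here is a sketch of one task that uses both file formats plus the reserved options key (mangle is just an illustrative grunt-contrib-uglify setting):
grunt.initConfig({
  uglify: {
    options: { mangle: true },        // reserved: defaults for every target
    build: {                          // free target name
      options: { mangle: false },     // reserved: per-target override
      src: 'js/custom-script.js',     // reserved file keys (compact format)
      dest: 'js/custom-script.min.js'
    },
    my_target: {                      // another free target name
      files: {                        // reserved: the "files object" format
        'dest/output.min.js': ['src/input1.js', 'src/input2.js']
      }
    }
  }
});
Running grunt uglify executes both targets; grunt uglify:build runs just the one.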
I'm looking for a webpack plugin or other solution to move files from one folder to another in the afterEmit phase.
There are several plugins that don't fit my case.
https://github.com/kevlened/copy-webpack-plugin
It can copy but not actually move; the source files should be removed.
https://github.com/gregnb/filemanager-webpack-plugin
It can't move files via wildcard. It uses the node-mv package under the hood, so I can't use the following config:
...
move: [
  { source: './dest/assets/*.map', destination: './dest' },
]
...
Another downside of filemanager-webpack-plugin is that it doesn't update webpack's internal assets state, which means moved files can't be served by webpack.
Are there any other ready-to-go solutions?
You can try building your own piece of code that does exactly what you need, using webpack compiler hooks. I know this is not a "ready to go solution" but sometimes searching for plugins takes more time than actually writing your own code, especially if it's a simple task.
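For instance, a rough sketch of such a plugin using webpack 4's afterEmit hook (the plugin name and the .map-file filter are made up for illustration, and note it has the same limitation of not updating webpack's internal assets state):
// MoveMapsPlugin.js: a sketch, not a published plugin
const fs = require('fs');
const path = require('path');

class MoveMapsPlugin {
  constructor({ from, to }) {
    this.from = from; // e.g. './dest/assets'
    this.to = to;     // e.g. './dest'
  }

  apply(compiler) {
    compiler.hooks.afterEmit.tapAsync('MoveMapsPlugin', (compilation, callback) => {
      fs.readdir(this.from, (err, files) => {
        if (err) return callback(err);
        files.filter(f => f.endsWith('.map')).forEach(f => {
          // rename moves the file, so the source copy is removed
          fs.renameSync(path.join(this.from, f), path.join(this.to, f));
        });
        callback();
      });
    });
  }
}

module.exports = MoveMapsPlugin;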
I have a fairly simple webpack setup with a bit of a twist. I have a few different ways I can think of to create my intended behavior, but I'm wondering if there are better approaches (I'm still fairly new to webpack).
I have a basic TypeScript/Scss application and all of the src files exist in a src directory. I also have a component library bundle that's dynamically generated through a separate build process (triggered via Node). This bundle also ends up in the src directory (it contains some sass variables and a few other assets that belong in src). This is src/custom-library-bundle. My current webpack setup moves the appropriate bundle files to a public (dist) directory via the CopyWebpackPlugin. My webpack dev server watches for changes and reloads as expected. This all works beautifully.
Now for the twist. I've set up a custom watcher that exists elsewhere to watch for the custom component library bundle changes. When a change occurs there, it triggers that custom build process mentioned above. This (as expected) modifies the src/custom-library-bundle directory and sets off several watches/reloads as the bundle is populated. Technically it works? And the behavior is expected, but ideally, I could tell it to wait until the custom installation work is "done" and only then have it trigger the compilation/reload.
Phew. This isn't an easy one to explain. Here's a webpack config that will (hopefully) help clarify:
devServer: {
  contentBase: path.join(__dirname, 'public'),
  port: 9000,
  host: '127.0.0.1',
  after: (app, server) => {
    new MyCustomWatcherForOtherThings().watch(() => {
      // invoked after the src/custom-library-bundle is done doing its thing (each time)
      // now that I know it's done, I want to trigger the normal compilation/reload
    })
  },
  watchOptions: {
    ignored: [
      /node_modules/
      // I've been experimenting with the ignored feature a bit
      // path.resolve(__dirname, 'src/custom-library-bundle/')
    ]
  }
}
Ideal Approach: In my most ideal scenario, I want to just manually trigger webpack dev server to do its thing in my custom watch callback; have it ignore the src/custom-library-bundle until I tell it to pay attention. I can't seem to find a way to do this, however. Is this possible?
Alternate Approach #1: I could ignore the src/custom-library-bundle directory, move the updated files to public (not utilizing the webpack approach), and then trigger just a reload when I know that's complete. I don't love this because I want to utilize the same process whether I'm watching or just doing a one-off build (I want everything to end up in the public directory because webpack did that work, not because I wrote a script to put it there under specific scenarios). But, let's say I get over that, how can I trigger a browser reload for the dev server? Is this possible?
Alternate Approach #2: This is the one I'm leaning towards, but it feels like extra, unnecessary work. I could have my custom build process output to a different directory (one that my webpack setup doesn't care about at all). Once the build process is complete, I could move all the files to src/custom-library-bundle, where the watch would pick up one change and do a single compilation/reload. This gets me so close, but feels like I'm adding a step I don't want to.
Alternate Approach #3? Can you think of a better way to solve this?
Update (including versions):
webpack 4.26.1
webpack-dev-server 3.1.14
Add the following server.sockWrite call to your after method:
devServer: {
  after: (app, server) => {
    new MyCustomWatcherForOtherThings().watch(() => {
      // invoked after the src/custom-library-bundle is done doing its thing (each time)
      // now that I know it's done, I want to trigger the normal compilation/reload
      // calling this on `server` triggers a full page refresh
      server.sockWrite(server.sockets, "content-changed");
    });
  }
}
I've never found this in the documentation, but one of webpack's core devs mentioned it in a comment on GitHub, so it's sort of sanctioned.
Useful things that webpack provides that come to mind are multi-compiler builds, creating a child compiler from a Compiler or Compilation instance, the DllPlugin, and programmatically managing the compiler by calling compiler.watch() or compiler.run().
- the multi-compiler setup is useful when you want to build several compilations in a single run whose outputs are independent of each other
- child compilers allow you to set up dependencies between compilers, and using hooks they allow you to block the parent until, say, the child compilation has finished emitting the latest bundle into the assets
- the DllPlugin allows you to create a compilation and its manifest that can produce chunks containing modules which can be used as dependencies for yet-unbuilt compilations (the manifest needs to be produced and passed to them beforehand)
- programmatically managing your compiler lets you write a straightforward Node.js script that can accomplish most of this manually
If I understood correctly, your webpack compiler doesn't really have any dependencies on your bundle aside from expecting it to have something added to the output assets, so you might consider doing a multi-compiler build. If that doesn't work for you, you can write a simple plugin that creates a child compiler that watches all the component bundle dependencies and emits the built bundle into the main compilation assets on changes. Ultimately, as you mentioned yourself, you could write a simple script and programmatically weave everything together. All of these approaches offload tracking build dependencies to webpack, so if you put the compilers into watch mode webpack will keep track when something needs updating.
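That last option is only a few lines; a sketch, where appConfig and bundleConfig stand in for your two webpack configs:
const webpack = require('webpack');

// passing an array of configs creates a MultiCompiler
const compiler = webpack([bundleConfig, appConfig]);
compiler.watch({ ignored: /node_modules/ }, (err, stats) => {
  if (err) return console.error(err);
  console.log(stats.toString({ colors: true }));
});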
If you're interested in taking a closer look at how child compilers are created and used, I would heartily recommend reading through the sources of html-webpack-plugin. It doesn't seem like the plugin handles the same kind of build setup as you have, but it's worth noting that the HTML plugin works with files that aren't part of the build dependencies (nothing in the sources references or depends on the files/templates that are used for creating the HTML files that are added to the output).
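For a flavour of that approach, here is a stripped-down sketch of a plugin that spawns a child compiler, loosely modelled on what html-webpack-plugin does (the names and entry path are made up):
const SingleEntryPlugin = require('webpack/lib/SingleEntryPlugin');

class BundleChildCompilerPlugin {
  apply(compiler) {
    compiler.hooks.make.tapAsync('BundleChildCompilerPlugin', (compilation, callback) => {
      // child compilations emit their output into the parent's assets
      const child = compilation.createChildCompiler('bundle-child', {
        filename: 'custom-library-bundle.js'
      });
      // give the child its own entry point
      new SingleEntryPlugin(compiler.context, './src/custom-library-bundle/index.js', 'bundle')
        .apply(child);
      child.runAsChild(err => callback(err));
    });
  }
}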
I'm trying to use Rollup with Babel's external-helpers. It works, but it's dropping a bunch of Babel helpers that I don't even need into my resulting bundle, for example asyncGenerator.
The docs show a whitelist option, but I can't get it to work:
rollup.rollup({
  entry: 'src/buttonDropdown.es6',
  plugins: [
    babel({
      presets: ['react', ['es2015', { modules: false }], 'stage-2'],
      plugins: [['external-helpers', { whitelist: ['asyncGenerator'] }]]
    })
  ]
})
The above has no effect: all Babel helpers are still dropped into my resulting bundle.
What is the correct way of using this feature, and is there a full list of the helper names that the whitelist array accepts?
Or is there some other plugin I should be using with Rollup to automatically "tree shake" the Babel external helpers?
Problem
The babel-plugin-external-helpers plugin is not responsible for injecting those dependencies in the final bundle.
The only thing it controls is how the generated code will access those functions. For example:
classCallCheck(this, Foo);
// or
babelHelpers.classCallCheck(this, Foo);
It is needed so that all rollup-plugin-babel has to do is inject babelHelpers into every module.
The documentation is misleading: the whitelist option is not on the external-helpers plugin. It belongs to a completely separate module and command-line tool called babel-external-helpers, which is what actually generates babelHelpers.
It's rollup-plugin-babel that injects babelHelpers, and it does so using a trick to modularize the final code: it calls babel-external-helpers to generate the helpers, but ignores the whitelist parameter. See my issue requesting to expose an option.
This approach is correct in principle, because rollup will tree-shake the unused helper functions. However, some of the helpers (like asyncGenerator) are written in a way that makes it hard to detect whether their initialization has side effects, which prevents their removal during tree-shaking.
Workaround
I forked rollup-plugin-babel and created a PR which exposes the whitelist option for building babelHelpers among the plugin's options. It can be used this way:
require("rollup").rollup({
entry: "./src/main.js",
plugins: [
require("rollup-plugin-babel")({
"presets": [["es2015", { "modules": false }]],
"plugins": ["external-helpers"],
"externalHelpersWhitelist": ['classCallCheck', 'inherits', 'possibleConstructorReturn']
})
]
}).then(bundle => {
var result = bundle.generate({
format: 'iife'
});
require("fs").writeFileSync("./dist/bundle.js", result.code);
}).then(null, err => console.error(err));
Note that I didn't publish a distribution version on npm; you will have to clone the git repo and build it using rollup -c.
Solution
In my opinion the right solution would be to somehow detect, or tell rollup, that those exports are pure, so they can be removed by tree-shaking. I will start a discussion about it on GitHub after doing some research.
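For reference, this is roughly what such a hint looks like; a sketch only, since whether the annotation is emitted and honored depends on your rollup/minifier versions:
// the comment tells rollup (and minifiers) that the call is side-effect
// free, so an unused `asyncGenerator` can be tree-shaken away
var asyncGenerator = /*#__PURE__*/ (function () {
  // ...helper construction...
  return {};
})();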
As I found in this particular issue on the GitHub page:
The Babel member Hzoo suggests that
Right now the intention of the preset is to allow people to use it without customization - if you want to modify it then you'll have to just define plugins yourself or make your own preset.
Still, if you want to exclude a specific plugin from a default preset, here are some approaches.
The first is the forking technique: as suggested by Krucher, you can create a fork that disables the undesirable plugin in the following way:
"babel": {
"presets": [
"es2015"
],
"disablePlugins": [
"babel-plugin-transform-es2015-modules-commonjs"
]
}
But if two or more people want to include es2015-with-commonjs, that would be a problem. For that you have to define your own preset or extend the preset of that module, as sketched below.
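A preset is just a module exporting config, so extending one is short. A sketch for the Babel 6 era (my-preset.js is a made-up name):
// my-preset.js: reuse es2015 but turn off the commonjs module
// transform via the preset's own `modules` option
module.exports = {
  presets: [
    [require('babel-preset-es2015'), { modules: false }]
  ]
};
Consumers would then list "./my-preset.js" (or a published package name) in their own presets array.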
The second method involves tree-shaking, as shown in this article by Dr. Axel Rauschmayer.
In the article, webpack 2 is used with Babel 6.
This helps remove unwanted imports that might exist anywhere in the project, in two ways:
First, all ES6 module files are combined into a single bundle file. In that file, exports that were not imported anywhere are not exported, anymore.
Second, the bundle is minified, while eliminating dead code. Therefore, entities that are neither exported nor used inside their modules do not appear in the minified bundle. Without the first step, dead code elimination would never remove exports (registering an export keeps it alive).
Other details can be found in the article.
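A tiny made-up illustration of those two steps:
// util.js: two exports, only one is ever imported
export function used() { return 1; }
export function unused() { return 2; }

// main.js
import { used } from './util.js';
console.log(used());

// step 1 (bundling): the bundle no longer exports `unused`
// step 2 (minification): its now-dead definition is removed entirely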
A simple implementation is referenced here.
The third method involves creating your own preset for the particular module.
Creating a plugin and creating your own preset can be done according to the documentation here.
Also, as an extra tip, you should use babel-plugin-transform-runtime.
If any of your modules have an external dependency, the bundle as a whole will have the same external dependency whether or not you actually use it, which may have some side effects.
There are also a lot of issues with rollup.js's tree-shaking, as seen in this article.
Also, as shown in the presets documentation:
Enabled by default
These plugins have no effect anymore, as a newer babylon version enabled them by default
- async-functions (since babylon 6.9.1)
- exponentiation-operator (since babylon 6.9.1)
- trailing-function-commas (since babylon 6.9.1)
Also, the concept of whitelisting and blacklisting plugins has been brilliantly explained by loganfsmyth in this thread:
you can pass a whitelist option to specify specific transformations to run, or a blacklist to specific transformations to disable.
You cannot blacklist specific plugins, but you may list only the plugins you want, excluding the ones you do not wish to run.
Update:
According to this article, here is an important update -
"The --external-helpers option is now a plugin. To avoid repeated inclusion of Babel’s helper functions, you’ll now need to install and apply the babel-plugin-transform-runtime package, and then require the babel-runtime package within your code (yes, even if you’re using the polyfill)."
Hope this helps.
You can use grunt-http or grunt-curl etc. to download a remote file to a temporary location on disk and then read it. However it would be cleaner if that step could be skipped and we could use the Grunt files objects/arrays/globbing to download the file directly. Something like:
grunt.initConfig({
  uglify: {
    test: {
      files: [
        { src: 'http://example.com/cool-file.js', dest: 'build/cool-file.min.js' }
      ],
    },
  },
});
(In my particular case I need my build process to work off of HTML files generated by a local Python Tornado web server with a lot of templating logic that I can't replicate elsewhere.)
I've tried searching for a plugin that would streamline this but no dice. Any options or patterns for doing this other than downloading the files in a separate task?
My first thought was to say:
Are you an ambitious developer?
First try! Then ask.
But it seems you have already tried, so instead of reporting this pseudo-issue here, post it on their GitHub account so that later versions might implement such a feature...
Or implement it yourself and suggest it to them.
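If you do implement it yourself, a tiny custom task that downloads the file before uglify runs might be enough. A sketch (the task name, URL and paths are made up):
module.exports = function (grunt) {
  grunt.registerTask('fetch-remote', 'download a remote js file first', function () {
    var done = this.async(); // tell grunt this task is asynchronous
    var http = require('http');
    var fs = require('fs');
    var out = fs.createWriteStream('tmp/cool-file.js');
    http.get('http://example.com/cool-file.js', function (res) {
      res.pipe(out);
      out.on('finish', function () { out.close(done); });
    }).on('error', done);
  });
  // run as: grunt fetch-remote uglify
};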
I have multiple external JavaScript files in my HTML documents which I'd like to combine so things aren't so crowded. Is there any way to combine my JS files? For example,
my files:
a.js
b.js
c.js
d.js
and I want:
all.js
Take a look at requirejs.org and especially at r.js (http://requirejs.org/docs/optimization.html).
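A minimal r.js build config sketch (module names are illustrative; each file would need to be an AMD module):
// build.js, run with: node r.js -o build.js
({
  baseUrl: '.',
  name: 'main',  // entry module that requires a, b, c and d
  out: 'all.js'
})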
a.js
var i=0;
function fun1()
{
...
}
b.js
var k=0;
function fun2()
{
...
}
all.js (just copy and paste, like with CSS):
var i=0;
function fun1()
{
...
}
var k=0;
function fun2()
{
...
}
Take care of semicolons and closing braces when you write a whole script inside an event listener, especially 'DOMContentLoaded':
document.addEventListener('DOMContentLoaded', function()
{
  //whole big script
}
);
instead use
var some_function = function() { /* bla bla bla */ }; // define the handler before registering it
document.addEventListener('DOMContentLoaded', some_function);
A simple bash cat operation will do what you want, but at some stage you're probably going to want more, right?
Grunt and grunt-contrib-concat are a good starter, but you'll quickly realise grunt is not particularly good. To summarise usage, you create a gruntfile, install a few dependencies (i.e. install grunt and its command line interface, this is easy) and run grunt from your project root. It then parses the gruntfile to find out what you want it to do, and it does it. Pretty simple, and simple is good.
Next up is Gulp, which is a nice build system using streams, so it's slightly more complex (well, easier and more powerful, but streams can be kind of confusing at first). Gulp works in the same way, only it parses a gulpfile for instructions. For a concat operation the actual gulp pipeline is trivial:
var concat = require( 'gulp-concat' ); // gulp needs a plugin to do the merge
gulp.src( '*.js' )
    .pipe( concat( 'all.js' ) )
    .pipe( gulp.dest( 'dist' ) );
Between the .src and the .dest you can pipe the files through multiple transforms, such as minifying, transpiling, notifying—the list of plugins and modules is dizzying (as it is for grunt).
However, if you're a fan of node and npm (you probably should be) then you can use npm scripts to create a build system. npm is the node package manager and requires a package.json to give it some clues about how to work. Part of that JSON specification is a scripts block:
"scripts": {
"build" : "cat *.js > all.js"
}
You can then use npm run build from the command line, whereby npm will parse the package.json and execute the script using bash (sh actually).
Note that these are build systems, and there are many others.
There are also other packagers (which you would probably use as part of your build system, although for some projects they are your entire build system), but they are more complex than your needs. For your own research, browserify, webpack and jspm are all excellent (bear in mind AMD modules lost out, so require.js is probably not worth your time), although this area is becoming congested. Each of these is a very powerful modularisation tool, but they will require some changes to how you structure your code. If you are serious about modularisation then they are worth your time learning.
On a slightly different tangent, there is some discussion about whether one large file is actually more beneficial than a number of smaller scripts. In many cases simply serving a few small files is actually quicker, and may be easier, although there can be other benefits to smashing code together. Currently it is probably still best to concatenate into fewer HTTP requests at least, but this performance requirement is going away.
It might be helpful: https://github.com/mrclay/minify
Or create a file all.js and paste in the code from a.js, b.js, c.js and d.js.