Dynamic Grunt involving n subdirectories - javascript

I have a folder layout such that:
/
-- css/
-- js/
-- apps/
-- -- myFirstApp/
-- -- mySecondApp/
-- -- ...
Each of these is a git submodule with its own Gruntfile, package.json, etc. What I want to do is run the same sequence of commands in each one, differing only according to the respective package.json.
My command list is this:
npm install
grunt dist
copy app/css/[fileName].css (from package.json) to css/
copy app/js/[fileName].js to js/
copy app/js/[fileName].html to /
Is there a plugin or something I'm overlooking that I can use with grunt to do this? I don't want to do it statically if at all possible -- I'd like to only have to update the submodule list for this to work.

I don't know of any pre-built Grunt task that will do this for you, but writing one isn't difficult. You'll need Node's fs module to deal with the filesystem, and there are a few other pieces to fill in. Here's a general structure with some code and some TODOs:
var fs = require("fs"),
path = require("path");
module.exports = function ( grunt ) {
grunt.initConfig({
... // all of your other Grunt config
// options for our new task
copymodulefiles: {
all: {
rootDir: "apps/"
}
}
});
// Here's our custom task definition
grunt.registerMultiTask("copymodulefiles", "Copies files from sub-projects", function() {
var done = this.async(), // tell Grunt this is an async task
root = grunt.config(this.name + "." + this.target + ".rootDir"),
modules = fs.readdirSync(root);
modules.forEach(function(dirName) {
var pkg = fs.readFileSync(root + dirName + path.sep + "package.json", "utf8");
pkgJson = JSON.parse(pkg);
// TODO: find what you need in the pkgJson variable
// copy files from wherever to wherever
// You can write a file like so:
fs.writeFile(theFilenameToWrite, "Contents of the new file", function (err) {
// (check for errors!)
// log it?
grunt.log.ok("file written!");
});
});
// Determine when all are complete and call "done()" to tell Grunt everything's complete
// call Grunt's "done" method to signal task completion
done();
});
};
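For the copy step itself, Grunt's built-in grunt.file.copy API is enough, so no extra module is needed. Here is a rough fill-in for the TODO above, assuming each submodule's built files are named after the name field of its package.json (that naming convention is an assumption, not something the question guarantees):
// Hypothetical fill-in for the TODO, using pkgJson, root and dirName
// from the loop above. Paths mirror the question's command list.
var name = pkgJson.name;
grunt.file.copy(path.join(root, dirName, "app/css", name + ".css"),
                path.join("css", name + ".css"));
grunt.file.copy(path.join(root, dirName, "app/js", name + ".js"),
                path.join("js", name + ".js"));
grunt.file.copy(path.join(root, dirName, "app/js", name + ".html"),
                name + ".html");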

Try grunt-shell. I found it perfect for doing tasks similar to what you are trying to do.
Have a look at the configuration I wrote in my Gruntfile.js to run shell commands:
shell: {
  multiple: {
    command: [
      'bower install',
      'mv bower_components/** public/',
      'rm -rf bower_components'
    ].join('&&')
  }
}
Here I am running bower, then copying its components to the public folder, and after that deleting the bower_components folder. From here onwards you can customize this script for your own usage.
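To keep this dynamic, as the original question asks, the command list can be generated from the apps/ directory when the Gruntfile is loaded, rather than hard-coded. A rough sketch (the wildcard file names are an assumption; the question derives them from each package.json instead):
// Hypothetical: build one command chain per submodule found under apps/,
// so adding a submodule is all that's needed for it to be picked up.
var fs = require('fs');

var commands = fs.readdirSync('apps').map(function (dir) {
  return [
    'cd apps/' + dir,
    'npm install',
    'grunt dist',
    'cp app/css/*.css ../../css/',
    'cp app/js/*.js ../../js/',
    'cp app/js/*.html ../../',
    'cd ../..'
  ].join(' && ');
});

// then, in grunt.initConfig:
// shell: { multiple: { command: commands.join(' && ') } }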

Related

NodeJS fs.watch is not reacting to changes inside a Docker Container

The idea of the following code is to react to changes in files inside a folder. When I run this code on my macOS, everything works as expected.
let fs = require("fs");
let options = {
encoding: 'buffer'
}
fs.watch('.', options, function(eventType, filename) {
if(filename)
{
console.log(filename.toString());
}
});
Inside a Docker Container on the other hand the code does not react to file changes. I run the code in the following way:
docker run -it --rm --name $(basename $(pwd)) -v $(pwd):/app -w /app node:slim sh -c 'node index'
Is there an option to use with Docker to allow system notifications for file changes?
New answer
Initially I advised Gulp (see the old answer at the bottom of this post). That did not work because you tried to use it programmatically, whereas Gulp is a task runner with its own usage patterns, which I had not described. Since you need something specific, here is a very simple solution that is sure to work. (The underlying issue is that fs.watch relies on kernel inotify events, which do not propagate from the macOS host into a bind-mounted volume; watchers that fall back to stat polling are unaffected, which is presumably why the following works.)
It uses gaze, one of the modules used by gulp, which has roughly 1.7m downloads per week. It works reliably on every system.
npm install gaze --save
To make it work, let's create index.js in your root folder (which will be mounted to the app folder inside Docker) and then just follow the basic instructions given in the module's README:
var gaze = require('gaze');

// Watch all .js files/dirs in process.cwd()
gaze('**/*.js', function(err, watcher) {
  // Files have all started watching
  // watcher === this
  console.log('Watching files...');

  // Get all watched files
  var watched = this.watched();

  // On file changed
  this.on('changed', function(filepath) {
    console.log(filepath + ' was changed');
  });

  // On file added
  this.on('added', function(filepath) {
    console.log(filepath + ' was added');
  });

  // On file deleted
  this.on('deleted', function(filepath) {
    console.log(filepath + ' was deleted');
  });

  // On changed/added/deleted
  this.on('all', function(event, filepath) {
    console.log(filepath + ' was ' + event);
  });

  // Get watched files with relative paths
  var files = this.relative();
});
Now let's run your command:
docker run -it --rm --name $(basename $(pwd)) -v $(pwd):/app -w /app node sh -c 'node index'
Here is what we get upon changes (Linux output, but this works on macOS too):
blackstork#linux-uksg:~/WebstormProjects/SO/case1> docker run -it --rm --name $(basename $(pwd)) -v $(pwd):/app -w /app node sh -c 'node index'
Watching files...
/app/bla was changed
/app/foobar.js was changed
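Presumably the reason this helps is that polling does not depend on inotify events crossing the bind mount, so any watcher with a stat-polling mode should behave the same. Here is an equivalent sketch using chokidar (my substitution, not part of the original answer):
// Hypothetical alternative: chokidar in polling mode also works across
// Docker bind mounts where kernel inotify events never arrive.
const chokidar = require('chokidar');

chokidar.watch('.', { usePolling: true }).on('all', (event, filepath) => {
  console.log(filepath + ' was ' + event);
});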
Old answer.
You can do it with gulp.
Gulpfile.js
const gulp = require('gulp');

gulp.task('watch', () => {
  return gulp.watch(['app/**'], ['doStuff']);
});

gulp.task('doStuff', cb => {
  console.log('stuff');
  //do stuff
  cb();
});
So far this approach has worked for me (of course you can build much more complex things, but I find gulp convenient for all sorts of filesystem tasks).

I'm using Gulp and failing to produce the final development script for production.

So I'm having a slight problem producing production-ready scripts for my project. I'm using gulp to concatenate and minify my CSS and JS, and while the CSS is working fine, the gulp js task isn't generating my final file. Please refer to my code below:
gulp.task('js', function() {
  return gulp.src([source + 'js/app/**/*.js'])
    .pipe(concat('development.js'))
    .pipe(gulp.dest(source + 'js'))
    .pipe(rename({
      basename: 'production',
      suffix: '-min',
    }))
    .pipe(uglify())
    .pipe(gulp.dest(source + 'js/'))
    .pipe(notify({ message: 'Scripts task complete', onLast: true }));
});
If anyone has encountered a similar problem or has any tips it would be much appreciated :)
There is nothing wrong with your gulpfile. I tested it and it works perfectly.
The only thing I can guess is that your source is not set correctly. Did you forget the trailing slash '/'?
I would suggest two things to figure it out. First, include Node's path library and log where source is actually pointing, like this:
var path = require('path');
// in gulp task ...
console.log(path.resolve(source + 'js/app'));
Make sure it points where you think it does.
Secondly, you could use gulp-debug to establish whether any files are found:
npm install gulp-debug
Then
var debug = require('gulp-debug');

// in gulp task ...
return gulp.src([source + 'js/app/**/*.js'])
  .pipe(concat('development.js'))
  .pipe(debug())
  .pipe(gulp.dest(source + 'js'))
  .pipe(debug())
  // etc.
Good luck!
Based on additional information in the comments, I realise you are generating JS files in a separate process ...
gulp is asynchronous by default. What this boils down to is that all functions try to run at the same time - if you want a specific order it must be by design. This is great because it's very fast but can be a headache to work with.
Problem
Here is what's basically happening:
// SOME TASK THAT SHOULD BE RUN FIRST
gulp.task('copy-vendor-files-to-tempfolder', function (done) {
  // copy files to vendor folder
  done()
})

// SOME TASKS THAT DEPEND ON FIRST TASK
gulp.task('compile-styles', function () { /* independent task */ })
gulp.task('concat-vendor-files', function () { /* concat files in vendor folder. depends on vendor files existing */ })

// GENERAL TASK WHICH STARTS OTHERS
gulp.task('ready', ['copy-vendor-files-to-tempfolder', 'compile-styles', 'concat-vendor-files'])
When you try to run:
$ gulp ready
GULP TASK WILL FAIL! Folder is being created at the same time!!
NOWHERE TO COPY FILES!
Solution
There are many solutions but the following module has come in handy for me again and again:
npm install run-sequence
Then in your gulpfile.js:
var runSequence = require('run-sequence')
gulp.task('ready', function (done) {
  runSequence(
    'create-folders', // do this first
    [
      'copy-css-files',
      'copy-html-files'
    ], // do these AFTER, but in parallel
    done // callback when ready
  )
})
This will guarantee the folder exists when you try to run the other functions.
In your specific case, you should make sure the task that concatenates the JS files is run after the task that copies them out of vendor.
Note: I'm leaving my other answer up because it contains useful help for debugging similar issues.
HTH!
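One last note: run-sequence is a gulp 3 workaround. On gulp 4 the same ordering is built in, so the 'ready' task above can be written without the extra dependency (a sketch, assuming the same task names):
// gulp 4 equivalent: gulp.series enforces order, gulp.parallel runs concurrently
gulp.task('ready', gulp.series(
  'create-folders',
  gulp.parallel('copy-css-files', 'copy-html-files')
));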

copying files with gulp

I have an app. My app source code is structured like this:
./
  gulpfile.js
  src
    img
      bg.png
      logo.png
    data
      list.json
    favicon.ico
    web.config
    index.html
  deploy
I am trying to use Gulp to copy two files: ./img/bg.png and ./data/list.json. I want to copy these two files to the root of the deploy directory. In other words, the result of the task should have:
./
  deploy
    imgs
      bg.png
    data
      list.json
How do I write a Gulp task to do this type of copying? The thing that is confusing me is the fact that I want my task to copy two separate files instead of files that fit a pattern. I know if I had a pattern, I could do this:
gulp.task('copy-resources', function() {
  return gulp.src('./src/img/*.png')
    .pipe(gulp.dest('./deploy'));
});
Yet, I'm still not sure how to do this with two separate files.
Thanks
You can create separate tasks for each target directory, and then combine them using a general "copy-resources" task.
gulp.task('copy-img', function() {
  return gulp.src('./src/img/*.png')
    .pipe(gulp.dest('./deploy/imgs'));
});

gulp.task('copy-data', function() {
  return gulp.src('./src/data/*.json')
    .pipe(gulp.dest('./deploy/data'));
});
gulp.task('copy-resources', ['copy-img', 'copy-data']);
You could also use merge-stream
Install dependency:
npm i -D merge-stream
Load the dependency in your gulp file and use it:
const merge = require("merge-stream");
gulp.task('copy-resources', function() {
return merge([
gulp.src('./src/img/*.png').pipe(gulp.dest('./deploy/imgs')),
gulp.src('./src/data/*.json').pipe(gulp.dest('./deploy/data'))
]);
});
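As an aside, on gulp 4 the dependency-array form in the first answer was removed; the combining task would instead use gulp.parallel (same task names assumed):
// gulp 4: run both copy tasks concurrently
gulp.task('copy-resources', gulp.parallel('copy-img', 'copy-data'));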

How can I achieve this using gulp?

I am enumerating the subdirectories in a directory. For each subdirectory I would like to apply a number of gulp activities, like LESS compilation, and then create an output file specific to that subdirectory.
I would like the gulp process to continue, as further transformation steps need to be performed later.
Can someone help me understand how I can create these files half way through the "gulp pipeline"?
This is quite an interesting thing to achieve, and gulp imposes no limitations here.
I will give you a detailed example of how I managed to accomplish such a task a while ago.
Let's assume you have directoryA, which contains the subdirectories childA, childB and childC. So your tree structure looks like:
directoryA
--childA
--childB
--childC
I am always looking for flexible solutions, so I would suggest including a JSON file in each subdirectory naming the tasks you would like to run. Using fs you can access these files, and you can use run-sequence to execute gulp tasks in sequence.
For demo purposes, place a file named manifest.json inside the childA subdirectory.
manifest.json contains the following declarations:
{
  "filesToProcess": ["./childA/*.js", "./childB/*.js"],
  "tasksToRun": ["taskA", "taskB"]
}
Finally, the gulpfile would look like this:
'use strict';

//dependencies declared in package.json
//install them using npm
var gulp = require('gulp'),
    fs = require('fs'),
    runSequence = require('run-sequence'),
    path = require('path');

//these two arrays will hold the files and tasks declared in each manifest file
var filesHolder = [], tasksHolder = [];

gulp.task('taskA', function () {
  return gulp.src(filesHolder)
    .pipe(whatever)
    .pipe(gulp.dest('whatever')); //chained actions
});

gulp.task('taskB', function () {
  return gulp.src(filesHolder)
    .pipe(whatever)
    .pipe(gulp.dest('whatever'));
});

//a simple utility function to read all subdirectories of directoryA
function getDirectories(srcpath) {
  return fs.readdirSync(srcpath).filter(function(file) {
    return fs.statSync(path.join(srcpath, file)).isDirectory();
  });
}

//finally declare the default gulp task
gulp.task('default', function(){
  var manifest;

  //map directory A's subdirectories
  var availableDirs = getDirectories("./directoryA");

  //loop over the available subdirectories and load each manifest file
  availableDirs.forEach(function(subdir) {
    manifest = require("./directoryA/" + subdir + "/manifest.json");
    filesHolder = manifest.filesToProcess;
    tasksHolder = manifest.tasksToRun;
    runSequence(tasksHolder, function () {
      console.log("Task ended: " + tasksHolder + " for subdirectory: " + subdir);
    });
  });
});

custom targets / running arbitrary code

In Make it's possible to define custom targets that have no relevance to the actual code they act upon, in the sense that they are language agnostic.
release_sortof:
	@echo packaging release...
	tar czf release.tar.gz file1 file2 file3
	ls /dev/null
	ls /dev/stderr
	ls /dev/stdout
I know the example above is horrible, but the point I'm trying to illustrate is that the code in the release_sortof target doesn't depend on the fact that my project uses code written in C, for example; nor does it depend on me using Make built-ins such as foreach.
Is there a way to work with javascript/<INSERT-NAME>script files without using the ever insufficient plugins available for gulp? As in, could I lint my coffeescript with coffeelint by directly calling the coffeelint module:
var gulp = require('gulp')
  , coffeelint = require('coffeelint')
  ;

gulp.task('lint', function() {
  /* run coffeelint on source files */
});
Or can this only be done using plugins?
Another example would be to run arbitrary code like so:
var spawn = require('child_process').spawn;

gulp.task('blue', function() {
  var child = spawn('ls');
  /* do stuff with spawned child process */
});
I do this kind of thing for browserify using vinyl-source-stream - basically it lets you use the library as it is, instead of going through gulp-* plugins.
var browserify = require('browserify'),
    gulp = require('gulp'),
    source = require('vinyl-source-stream'),
    stringify = require('stringify'),
    plumber = require('gulp-plumber'),
    config = require('../config').scripts;

gulp.task('browserify', function () {
  return browserify(config.app)
    .transform(stringify(['.html']))
    .bundle()
    .pipe(plumber())
    .pipe(source('bundle.js'))
    .pipe(gulp.dest(config.dest));
});
Here's the npm package - https://www.npmjs.com/package/vinyl-source-stream
Use conventional text streams at the start of your gulp or vinyl
pipelines, making for nicer interoperability with the existing npm
stream ecosystem.
Maybe that will help you?
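And to answer the coffeelint part of the question directly: yes, you can call the module from a plain task with no plugin involved. A minimal sketch, assuming coffeelint's documented lint(source, config) API and a hypothetical src/app.coffee file:
var gulp = require('gulp'),
    fs = require('fs'),
    coffeelint = require('coffeelint');

gulp.task('lint', function (done) {
  // read the CoffeeScript source and lint it directly, no gulp-* wrapper
  var source = fs.readFileSync('src/app.coffee', 'utf8');
  var errors = coffeelint.lint(source, {});

  errors.forEach(function (e) {
    console.log('line ' + e.lineNumber + ': ' + e.message);
  });

  // fail the task if the linter reported anything
  done(errors.length ? new Error('lint failed') : null);
});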

Categories

Resources