Any npm package to run grep against a list of paths? - javascript

I tried to look for a package that can run a grep-style filter against an array of strings, but I can't find one.
I want to do something like this:
const paths = ['some/file/paths-1', 'some/other/paths', ... ]
const filteredPaths = grep(paths, 'some/file/**/*')
I understand this question is not a good SO question as it can be opinionated and favor a particular package.
But I think many people would look for the same thing.

I found it: https://www.npmjs.com/package/matcher
This is useful when you are getting input from the CLI or from a config file, where a raw regex would be too complicated.
Sometimes finding the right package for the job is hard, and it is even harder today with https://github.com/npm/npm/issues/19438
I'll leave the question and answer here so that people can find it a bit more easily.
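For example, with matcher (a sketch assuming matcher v3, which can still be loaded with require(); newer releases are ESM-only):
const matcher = require('matcher')
const paths = ['some/file/paths-1', 'some/other/paths']
// matcher(inputs, patterns) returns the inputs that match any of the glob-style patterns
console.log(matcher(paths, ['some/file/*'])) // => ['some/file/paths-1']
// matcher.isMatch checks a single string against a pattern
console.log(matcher.isMatch('some/file/paths-1', 'some/*/paths-*')) // => true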

It wouldn't make any sense to try to use grep to solve this. JavaScript regexes work just fine.
const paths = ['some/file/paths-1', 'some/other/paths', ... ]
const filteredPaths = paths.filter(path => /^some\/file\//.test(path))
If you insist on using grep for this, you could do it like so:
const fs = require('fs')
const execSync = require('child_process').execSync
const paths = ['some/file/paths-1', 'some/other/paths']
// write the paths to a temp file so grep has something to read
fs.writeFileSync('/tmp/b', paths.join('\n'), 'utf8')
// run grep over the temp file and split its output back into an array
const grepResult = execSync("grep 'some/file/' /tmp/b")
const filteredPaths = grepResult.toString().trim().split('\n')
console.log(filteredPaths)
I personally fail to see any situation in which this is appropriate, though.

How to insert a javascript file in another javascript at build time using comments

Is there something on npm or in VS Code or anywhere that can bundle or concatenate Javascript files based on comments, like:
function myBigLibrary(){
//#include util.js
//#include init.js
function privateFunc(){...}
function publicFunc(){
//#include somethingElse.js
...
}
return {
init, publicFunc, etc
}
}
Or something like that? I would think that such a thing is common when your javascript files get very large. All I can find are complicated things like webpack.
I'm looking for any equivalent solution that allows you to include arbitrary code in arbitrary positions in other code. I suppose that would cause trouble for intellisense, but an extension could handle that.
I'm not sure what you really mean, but if you mean importing variables from other JavaScript files, this will probably help you:
const { variableName } = require('path/to/the/javascript/file')
Example:
in index.js
const { blue } = require('./../js/blue.js')
console.log(blue)
Meanwhile in blue.js
const blue = "dumbass"
module.exports = { blue }
If this doesn't help you, just ignore this.
So here is a bare-bones way to do what I wanted. I have been learning more about what you can do with esbuild and other bundlers, but I didn't quite find anything that fit my needs, and this is simpler and more flexible. It works for any file type. You can get automatic rebuilds when files change by running this code with nodemon instead of node.
const fs = require('fs')
/////////////////////////
const input = 'example.js'
const output = 'output.js'
const ext = '.js'
// looks for file with optional directory as: //== dir/file
const regex = /\/\/== *([\w-\/]+)/g
const fileContent = fs.readFileSync(input).toString()
// replace the comment references with the corresponding file content
const replacement = fileContent.replace(regex, (match, group) => {
  const comment = '//////// ' + group + ext + ' ////////\n\n'
  const replace = fs.readFileSync(group + ext).toString()
  return comment + replace
})
// write replacement to a file
fs.writeFileSync(output, replacement)
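For reference, a hypothetical input file (the names here are made up for illustration) would use the //== comments like this; the script above replaces each comment with the contents of util.js, init.js and extras/somethingElse.js:
function myBigLibrary() {
  //== util
  //== init
  function publicFunc() {
    //== extras/somethingElse
  }
  return { publicFunc }
}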

NodeJS & Gulp Streams & Vinyl File Objects- Gulp Wrapper for NPM package producing incorrect output

Goal
I am currently trying to write a Gulp wrapper for NPM Flat that can be easily used in Gulp tasks. I feel this would be useful to the Node community and would also accomplish my goal. The repository is here for everyone to view, contribute to, play with and pull request. I am attempting to make flattened (using dot notation) copies of multiple JSON files. I then want to copy them to the same folder, just modifying the file extension from *.json to *.flat.json.
My problem
The results I am getting back in my JSON files look like vinyl-files or byte code. For example, I expect output like
"views.login.usernamepassword.login.text": "Login", but I am getting something like {"0":123,"1":13,"2":10,"3":9,"4":34,"5":100,"6":105 ...etc
My approach
I am brand new to developing Gulp tasks and node modules, so definitely keep your eyes out for fundamentally wrong things.
The repository will be the most up to date code, but I'll also try to keep the question up to date with it too.
Gulp-Task File
var gulp = require('gulp'),
plugins = require('gulp-load-plugins')({camelize: true});
var gulpFlat = require('gulp-flat');
var gulpRename = require('gulp-rename');
var flatten = require('flat');
gulp.task('language:file:flatten', function () {
return gulp.src(gulp.files.lang_file_src)
.pipe(gulpFlat())
.pipe(gulpRename( function (path){
path.extname = '.flat.json'
}))
.pipe(gulp.dest("App/Languages"));
});
Node module's index.js (A.k.a what I hope becomes gulp-flat)
var through = require('through2');
var gutil = require('gulp-util');
var flatten = require('flat');
var PluginError = gutil.PluginError;
// consts
const PLUGIN_NAME = 'gulp-flat';
// plugin level function (dealing with files)
function flattenGulp() {
// creating a stream through which each file will pass
var stream = through.obj(function(file, enc, cb) {
if (file.isBuffer()) {
//FIXME: I believe this is the problem line!!
var flatJSON = new Buffer(JSON.stringify(
flatten(file.contents)));
file.contents = flatJSON;
}
if (file.isStream()) {
this.emit('error', new PluginError(PLUGIN_NAME, 'Streams not supported! NYI'));
return cb();
}
// make sure the file goes through the next gulp plugin
this.push(file);
// tell the stream engine that we are done with this file
cb();
});
// returning the file stream
return stream;
}
// exporting the plugin main function
module.exports = flattenGulp;
Resources
https://github.com/gulpjs/gulp/blob/master/docs/writing-a-plugin/README.md
https://github.com/gulpjs/gulp/blob/master/docs/writing-a-plugin/using-buffers.md
https://github.com/substack/stream-handbook
You are right about where the error is. The fix is simple. You just need to parse file.contents, since the flatten function operates on an object, not on a Buffer.
...
var flatJSON = new Buffer(JSON.stringify(
flatten(JSON.parse(file.contents))));
file.contents = flatJSON;
...
That should fix your problem.
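For what it's worth, here is a quick illustration of why the byte output appears (assuming the flat package; this is just to explain the symptom, not part of the fix). flatten() walks the numeric indices of the Buffer, so every byte ends up as its own key:
var flatten = require('flat');
var buf = Buffer.from('{"a":{"b":1}}');
console.log(flatten(buf));                        // { '0': 123, '1': 34, ... } -- one key per byte
console.log(flatten(JSON.parse(buf.toString()))); // { 'a.b': 1 }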
And since you are new to the Gulp plugin thing, I hope you don't mind if I make a suggestion. You might want to consider giving your users the option to prettify the JSON output. To do so, just have your main function accept an options object, and then you can do something like this:
...
var flatJson = flatten(JSON.parse(file.contents));
var jsonString = JSON.stringify(flatJson, null, options.pretty ? 2 : null);
file.contents = new Buffer(jsonString);
...
You might find that the options object comes in useful for other things, if you plan to expand on your plugin in future.
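Here is a rough sketch of how that might be wired in (the option name is illustrative; it is not taken from any published gulp-flat code):
var through = require('through2');
var flatten = require('flat');

function flattenGulp(options) {
  options = options || {};
  return through.obj(function (file, enc, cb) {
    if (file.isBuffer()) {
      var flatJson = flatten(JSON.parse(file.contents));
      // pretty-print with two-space indentation when options.pretty is set
      file.contents = new Buffer(JSON.stringify(flatJson, null, options.pretty ? 2 : null));
    }
    this.push(file);
    cb();
  });
}
// usage in the gulpfile: .pipe(gulpFlat({ pretty: true }))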
Feel free to have a look at the repository for a plugin I wrote called gulp-transform. I am happy to answer any questions about it. (For example, I could give you some guidance on implementing the streaming-mode version of your plugin if you would like).
Update
I decided to take you up on your invitation for contributions. You can view my fork here and the issue I opened up here. You're welcome to use as much or as little as you like, and in case you really like it, I can always submit a pull request. Hopefully it gives you some ideas at least.
Thank you for getting this project going.

Indexing nodejs or browserify components with gulp

I'd like to open source (via gulp-plugin) a simple build 'indexer' I'm using in a browserify project. Basically, I'm adding an 'index.js' file into every directory inside of a glob (gulp.src). Current index.js looks like this:
var index = {};
module.exports = index;
index.assets = require('./assets');
index.build = require('./build');
index.bundler = require('./bundler');
index.component = require('./component');
index.indexer = require('./indexer');
index.server = require('./server');
index.database = require('./database');
Is this an okay way to organize a set of modules? I'm also considering adding a node_modules folder to the top of my src dir (one level below the main dir). So instead of writing:
var form = require('./components').form; //or
var input = require('../components/forms').input
I can:
var form = require('form')
var input = require('input')
I find that this little indexer helps my workflow, and maybe it'll help others too? But I don't want to put a plugin out there that's doing something potentially buggy. I asked this question to make sure that indexing components like this (nested) is okay, that my syntax is correct, and to find out whether there are better ways to implement this pattern.
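For reference, a bare-bones version of the indexing step might look something like this (the directory layout and output format are assumptions for illustration, not the actual plugin code):
var fs = require('fs');
var path = require('path');

// write an index.js into dir that re-exports every subdirectory
function writeIndex(dir) {
  var entries = fs.readdirSync(dir).filter(function (name) {
    return fs.statSync(path.join(dir, name)).isDirectory();
  });
  var lines = ['var index = {};', 'module.exports = index;'].concat(
    entries.map(function (name) {
      return 'index.' + name + " = require('./" + name + "');";
    })
  );
  fs.writeFileSync(path.join(dir, 'index.js'), lines.join('\n') + '\n');
}

writeIndex('./src');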
Sure, looks useful. You should publish it and see what feedback you get.
https://www.npmjs.com/package/file-indexer
Not a gulp plugin yet, but maybe soon?

Where to put "Q.longStackSupport = true"?

From the documentation of Q (the Javascript promise library):
Q.longStackSupport = true;
This feature does come with somewhat-serious performance and memory overhead, however. If you're working with lots of promises, or trying to scale a server to many users, you should probably keep it off. But in development, go for it!
I find myself always writing code like this:
var Q = require('q');
Q.longStackSupport = true;
However, if I decided to turn off longStackSupport, I would have to touch a lot of files in my code.
So, I wonder if there is a more elegant solution:
Is there a recommended pattern when including Q?
Is it sufficient to call Q.longStackSupport only once?
Yes, it is sufficient to only call it once in one place.
In init.js, or whatever your root file is, I would put
var Q = require('q');
if (process.env.NODE_ENV === "development") {
  Q.longStackSupport = true;
}
Then this will automatically enable it if you have the NODE_ENV environment variable set to development.
$ export NODE_ENV=development
$ node init.js
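Another pattern that keeps the flag in a single place (just a sketch, not something prescribed by the Q docs) is to wrap Q in a small local module and require that wrapper everywhere; Node's module cache ensures the configuration runs only once:
// q-configured.js (hypothetical file name)
var Q = require('q');
if (process.env.NODE_ENV === "development") {
  Q.longStackSupport = true;
}
module.exports = Q;

// elsewhere in the app:
// var Q = require('./q-configured');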

Delete (unlink) files matching a regex

I want to delete several files from a directory, matching a regex. Something like this:
// WARNING: not real code
require('fs').unlink(/script\.\d+\.js$/);
Since unlink doesn't support regexes, I'm using this instead:
var fs = require('fs');
fs.readdir('.', (error, files) => {
if (error) throw error;
files.filter(name => /script\.\d+\.js$/.test(name)).forEach(fs.unlink);
});
which works, but IMO is a little more complex than it should be.
Is there a better built-in way to delete files that match a regex (or even just use wildcards)?
No, there is no globbing in the Node standard library. If you don't want to pull in something from npm, not to worry; it only takes a few lines of code. But in my testing the code provided in other answers mostly won't work, so here is my fragment: tested, working, pure native Node and JS.
let fs = require('fs')
const path = './somedirectory/'
let regex = /[.]txt$/
fs.readdirSync(path)
  .filter(f => regex.test(f))
  .forEach(f => fs.unlinkSync(path + f))
You can look into glob https://npmjs.org/package/glob
require("glob").glob("*.txt", function (er, files) { ... });
//or
files = require("glob").globSync("*.txt");
glob internally uses minimatch. It works by converting glob expressions into JavaScript RegExp objects. https://github.com/isaacs/minimatch
You can do whatever you want with the matched files in the callback (or in case of globSync the returned object).
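If you are curious about that conversion, minimatch exposes it directly (a sketch assuming minimatch 7 or earlier, where the module exports the function itself; newer versions use a named export instead):
const minimatch = require('minimatch')
// the RegExp object that the glob pattern compiles to
console.log(minimatch.makeRe('script.*.js'))
// matching a single file name against the pattern
console.log(minimatch('script.12.js', 'script.*.js')) // true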
I have a very simple solution for this. Read the directory with Node's fs.readdir API, which gives you an array of all the file names in it. Then iterate over that array and apply the regex to each name.
The code below deletes all files starting with "en" and ending with the ".js" extension:
fs.readdir('.', (err, files) => {
  if (err) throw err;
  for (var i = 0, len = files.length; i < len; i++) {
    if (/^en.*\.js$/.test(files[i])) {
      fs.unlink(files[i], err => { if (err) throw err; });
    }
  }
});
The answer could depend on your environment. It looks like you are running on node.js. A quick perusal of the node.js documentation suggests there is no "built in" way to do this, i.e., there isn't a single function call that will do this for you. The next best thing might involve a small number of function calls. As I wrote in my comment, I don't think there's any easy way to make your suggested answer much briefer just relying on the standard node.js function calls. That is, if I were in your shoes, I would go with the solution you already suggested (though slightly cleaned up).
One alternative is to go to the shell, e.g.,
var exec = require('child_process').exec;
exec('ls | grep "script[[:digit:]]\\\+.js" | xargs rm');
Personally, I would strongly prefer your offered solution over this gobbledygook, but maybe you're shooting for something different.
