Indexing nodejs or browserify components with gulp - javascript

I'd like to open-source (as a gulp plugin) a simple build 'indexer' I'm using in a browserify project. Basically, it adds an 'index.js' file to every directory inside a glob (gulp.src). The current index.js looks like this:
var index = {};
module.exports = index;
index.assets = require('./assets');
index.build = require('./build');
index.bundler = require('./bundler');
index.component = require('./component');
index.indexer = require('./indexer');
index.server = require('./server');
index.database = require('./database');
Is this an okay way to organize a set of modules? I'm also considering adding a node_modules folder at the top of my src dir (one level below the main dir). So instead of writing:
var form = require('./components').form; // or
var input = require('../components/forms').input;
I can:
var form = require('form')
var input = require('input')
I find that this little indexer helps my workflow; maybe it'll help others too. But I don't want to put a plugin out there that's doing something potentially buggy. I'm asking to make sure that it's okay to index components like this, nested, that my syntax is correct, and whether there are better ways to implement this pattern.

Sure, looks useful. You should publish it and see what feedback you get.

https://www.npmjs.com/package/file-indexer
Not a gulp plugin yet, but maybe soon?

How to insert a JavaScript file into another JavaScript file at build time using comments

Is there something on npm or in VS Code or anywhere that can bundle or concatenate Javascript files based on comments, like:
function myBigLibrary(){
//#include util.js
//#include init.js
function privateFunc(){...}
function publicFunc(){
//#include somethingElse.js
...
}
return {
init, publicFunc, etc
}
}
Or something like that? I would think that such a thing is common when your javascript files get very large. All I can find are complicated things like webpack.
I'm looking for any equivalent solution that allows you to include arbitrary code in arbitrary positions in other code. I suppose that would cause trouble for intellisense, but an extension could handle that.
I'm not sure what you really mean, but if you mean pulling variables in from other JavaScript files, this will probably help:
const { variableName } = require('path/to/file')
Example, in index.js:
const { blue } = require('./../js/blue.js')
console.log(blue)
Meanwhile, in blue.js:
const blue = "some value"
module.exports = { blue }
If this doesn't help, just ignore it.
So here is a bare-bones way to do what I wanted. I have been learning more about what you can do with esbuild and other bundlers, but I didn't quite find something that fit my needs, and this is simpler and more flexible. It works for any file type, and you can rebuild automatically when files change by running this code with nodemon instead of node.
const fs = require('fs')
/////////////////////////
const input = 'example.js'
const output = 'output.js'
const ext = '.js'
// looks for file with optional directory as: //== dir/file
const regex = /\/\/== *([\w-\/]+)/g
const fileContent = fs.readFileSync(input).toString()
// replace the comment references with the corresponding file content
const replacement = fileContent.replace(regex, (match, group)=>{
const comment = '//////// '+group+ext+' ////////\n\n'
const replace = fs.readFileSync(group+ext).toString()
return comment + replace
})
// write replacement to a file
fs.writeFileSync(output, replacement)

What's a good alternative to Object.assign?

I have
var shared = {
}
var stuff = Object.assign(Object.create(shared),{
//stuff
});
But Object.assign doesn't work in Safari, and I don't want to use something like Babel since my website is already kind of laggy. Is there a good alternative that lets me do this while keeping "stuff" inheriting from "shared"?
var shared = {
}
var stuff = Object.create(shared);
stuff = {//stuff
};
I realize I can simply assign properties one by one to "stuff", but I have a lot of properties there and it would make the code a lot less organized.
If you want another way to copy properties, how about something like this (copies everything from A to B):
const a = {hello:1, world:2};
const b = {};
Object.keys(a).forEach(k => b[k] = a[k]);
// b is now {hello:1, world:2}.
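If you also want the Object.create + merge pattern from the question without Object.assign, a plain-ES5 helper along the same lines works. This is a minimal sketch that copies only own enumerable properties, like the loop above:

```javascript
function assign(target) {
  // Copy own enumerable properties from each source onto target.
  for (var i = 1; i < arguments.length; i++) {
    var source = arguments[i];
    if (source != null) {
      for (var key in source) {
        if (Object.prototype.hasOwnProperty.call(source, key)) {
          target[key] = source[key];
        }
      }
    }
  }
  return target;
}

var shared = { greet: function () { return 'hi'; } };
var stuff = assign(Object.create(shared), { a: 1 });
// stuff.a === 1, and stuff still inherits greet() from shared
```

Because the properties are copied onto an object created with Object.create(shared), the prototype link to "shared" is preserved, which is the part a plain object literal would lose.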
If you want a quick and dirty solution with minimal changes to your codebase, try this package: https://github.com/rubennorte/es6-object-assign
In my opinion, you should use a bundling framework such as webpack, along with Babel. There are many ways to reduce the bundled file size, like tree-shaking, minifying, separating vendor files, and serving JS files gzipped to the browser, so the result need not be laggy at all.

How do you modify a nodejs module variable?

(This is probably a very silly question since I'm just beginning with nodejs, nevertheless I can't understand how this works. Let me know what is missing in my question, I'll correct.)
I'm trying to use the npm package likely.
In my server.js file I have thus written:
var Recommender = require('likely');
in likely.js you can find variables like these:
var DESCENT_STEPS = 5000; // number of iterations to execute gradient descent
var ALPHA = 0.0005; // learning rate, should be small
I would like to modify these variables inside my server.js file.
I believe the way to do that is adding this after the require()
Recommender.DESCENT_STEPS = 9999999999;
but that doesn't seem to change the value that is defined in likely.js and actually used by the model (by running the model I can see it doesn't work, since that many steps should take forever, yet the processing time doesn't change at all).
Can I only do this by modifying likely.js?
You cannot modify them programmatically because likely.js only uses the local variable values instead of the current value of the exported versions of the same variables. So if you wanted to change those values you currently would need to edit likely.js. You might want to submit a pull request to that project's repository that makes the code use the exported value (e.g. module.exports.DESCENT_STEPS) instead.
You need to expose these variables so they are visible to server.js:
var object = {
DESCENT_STEPS: 5000,
ALPHA: 0.0005
};
module.exports = object;
Now they can be viewed and modified in server.js:
Recommender.ALPHA = 'new value';

Node.js: is it better to have a lot of identical requires, or to use a handler?

Imagine this structure:
- app.js
- core/
- service_1.js
- service_2.js
- service_3.js
- service_4.js
- service_5.js
- modules/
- module_1.js
- module_2.js
- module_3.js
app.js uses the 3 modules, so its code is:
var m1 = require('./modules/module_1');
var m2 = require('./modules/module_2');
var m3 = require('./modules/module_3');
m1.exec();
m2.exec();
m3.exec();
And each module uses all the services, so each needs:
var s1 = require('./../core/service_1');
var s2 = require('./../core/service_2');
var s3 = require('./../core/service_3');
var s4 = require('./../core/service_4');
var s5 = require('./../core/service_5');
// some stuff...
So, I need to know whether this is the best way to handle that, or whether I should make a "serviceManager" like:
app.js
var m1 = require('./modules/module_1');
var m2 = require('./modules/module_2');
var m3 = require('./modules/module_3');
var serviceManager = {
service_1 : require('./core/service_1'),
service_2 : require('./core/service_2'),
service_3 : require('./core/service_3'),
service_4 : require('./core/service_4'),
service_5 : require('./core/service_5')
};
m1.load(serviceManager);
m2.load(serviceManager);
m3.load(serviceManager);
m1.exec();
m2.exec();
m3.exec();
And in each module I put:
var serviceManager = null;
exports.load = function(services) {
serviceManager = services;
}
// some stuff...
Which is best if I'm going to use these services in almost all my files?
a) Lots of 'require' calls.
b) A handler, so 'require' appears only once.
c) Another solution.
Dependency injection is really good for this. I've used and recommend insulin. This makes it very easy to load each of your modules in the dependency injection container and then just name dependencies in the modules you write.
You might do something like the following:
'use strict';
require('insulin').factory('myService', myServiceFactoryFunction);
function myServiceFactoryFunction(dependencyOne, dependencyTwo) {
// Do something with your dependencies.
}
This way you require once at the top of the file and never have to do it again in that file.
As mentioned in one of the other answers, node caches everything required, so each time you require the injection container, you get the same one. This makes it very quick and easy to build your app without either having to require things everywhere or pass things around.
Later, to get the module you created above you would just do the following where it's needed:
'use strict';
require('insulin').factory('mySecondService', mySecondServiceFactoryFunction);
function mySecondServiceFactoryFunction(myService) {
// myService is now available in this module
}
Insulin, like most other dependency injectors, has other methods if for some reason you don't want to or can't rely on injection in some part of your application. You could do something like:
const insulin = require('insulin');
const myDependency = insulin.get('someModule');
where needed.
The best part about this to me is that the code becomes really clean and it's easy to tell what the dependencies are for a given file just by looking at the arguments passed to the factory function.
I would go with a lot of requires. It does not matter, because Node.js caches modules after the first load.
From the Node.js docs:
Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.

NodeJS & Gulp Streams & Vinyl File Objects - Gulp wrapper for NPM package producing incorrect output

Goal
I am currently trying to write a Gulp wrapper for NPM Flat that can be easily used in Gulp tasks. I feel this would be useful to the Node community and would also accomplish my goal. The repository is here for everyone to view, contribute to, play with, and pull-request. I am attempting to make flattened (dot-notation) copies of multiple JSON files, then write them to the same folder with the file extension changed from *.json to *.flat.json.
My problem
The results I am getting back in my JSON files look like vinyl-files or byte code. For example, I expect output like
"views.login.usernamepassword.login.text": "Login", but I am getting something like {"0":123,"1":13,"2":10,"3":9,"4":34,"5":100,"6":105 ...etc
My approach
I am brand new to developing Gulp tasks and node modules, so definitely keep your eyes out for fundamentally wrong things.
The repository will be the most up to date code, but I'll also try to keep the question up to date with it too.
Gulp-Task File
var gulp = require('gulp'),
plugins = require('gulp-load-plugins')({camelize: true});
var gulpFlat = require('gulp-flat');
var gulpRename = require('gulp-rename');
var flatten = require('flat');
gulp.task('language:file:flatten', function () {
return gulp.src(gulp.files.lang_file_src)
.pipe(gulpFlat())
.pipe(gulpRename( function (path){
path.extname = '.flat.json'
}))
.pipe(gulp.dest("App/Languages"));
});
Node module's index.js (A.k.a what I hope becomes gulp-flat)
var through = require('through2');
var gutil = require('gulp-util');
var flatten = require('flat');
var PluginError = gutil.PluginError;
// consts
const PLUGIN_NAME = 'gulp-flat';
// plugin level function (dealing with files)
function flattenGulp() {
// creating a stream through which each file will pass
var stream = through.obj(function(file, enc, cb) {
if (file.isBuffer()) {
//FIXME: I believe this is the problem line!!
var flatJSON = new Buffer(JSON.stringify(
flatten(file.contents)));
file.contents = flatJSON;
}
if (file.isStream()) {
this.emit('error', new PluginError(PLUGIN_NAME, 'Streams not supported! NYI'));
return cb();
}
// make sure the file goes through the next gulp plugin
this.push(file);
// tell the stream engine that we are done with this file
cb();
});
// returning the file stream
return stream;
}
// exporting the plugin main function
module.exports = flattenGulp;
Resources
https://github.com/gulpjs/gulp/blob/master/docs/writing-a-plugin/README.md
https://github.com/gulpjs/gulp/blob/master/docs/writing-a-plugin/using-buffers.md
https://github.com/substack/stream-handbook
You are right about where the error is. The fix is simple. You just need to parse file.contents, since the flatten function operates on an object, not on a Buffer.
...
var flatJSON = new Buffer(JSON.stringify(
flatten(JSON.parse(file.contents))));
file.contents = flatJSON;
...
That should fix your problem.
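Incidentally, that also explains the strange output: a Buffer's own enumerable keys are its byte indices, so any routine that walks an object's keys (as flatten does) sees bytes rather than JSON fields. A quick illustration without the flat package:

```javascript
const buf = Buffer.from('{"a":1}');

// Copying a Buffer's enumerable properties yields byte indices:
const asObject = Object.assign({}, buf);
// asObject is { '0': 123, '1': 34, ... } -- 123 is the char code of '{'

// Parsing first recovers the intended structure:
const parsed = JSON.parse(buf.toString());
// parsed is { a: 1 }
```

That is exactly the `{"0":123,"1":13,...}` shape from the question, which is why the fix is to JSON.parse the contents before flattening.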
And since you are new to the Gulp plugin thing, I hope you don't mind if I make a suggestion. You might want to consider giving your users the option to prettify the JSON output. To do so, just have your main function accept an options object, and then you can do something like this:
...
var flatJson = flatten(JSON.parse(file.contents));
var jsonString = JSON.stringify(flatJson, null, options.pretty ? 2 : null);
file.contents = new Buffer(jsonString);
...
You might find that the options object comes in useful for other things, if you plan to expand on your plugin in future.
Feel free to have a look at the repository for a plugin I wrote called gulp-transform. I am happy to answer any questions about it. (For example, I could give you some guidance on implementing the streaming-mode version of your plugin if you would like).
Update
I decided to take you up on your invitation for contributions. You can view my fork here and the issue I opened up here. You're welcome to use as much or as little as you like, and in case you really like it, I can always submit a pull request. Hopefully it gives you some ideas at least.
Thank you for getting this project going.
