How can I require modules with patterns in the path? - javascript

How can I include multiple files in Node.js, so that instead of writing
require('./packages/city/model/cities')
require('./packages/state/model/states')
require('./packages/country/model/countries')
I can write something like
require('./packages/*/model/*')
the same way Grunt loads files?

You can't (or at least you shouldn't)
To do this, you would have to overload Node's native require function, which is highly inadvisable.
The CommonJS pattern might seem tedious to you, but it's a very good one and you shouldn't try to break it just because you saw shortcuts in other languages/frameworks.
By introducing some form of magic in your module, you suddenly change everything that programmers can (and should be able to) safely assume about the CommonJS pattern itself.

Because Node's module loading system maps each require call to exactly one file, this isn't possible natively, though I wouldn't be surprised if a package exists for it.
The best you can do is create an index.js that loads the modules in each directory and re-exports them as its own:
// index.js at the project root: aggregate the models into one export
module.exports = function() {
  return {
    city    : require('./packages/city/model/'),
    state   : require('./packages/state/model/'),
    country : require('./packages/country/model/')
  };
};
You would have to load the models in a similar fashion in all three directories as well (see the sketch below).
I know this solution is not what you are looking for, but in my experience this approach makes custom packages easier to manage, since you can add and remove features easily.
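For illustration, a minimal sketch of what one of those per-directory index.js files could look like (the cities module name comes from the question; the exact shape is an assumption):
// packages/city/model/index.js -- a sketch, not from the original answer
module.exports = {
  // re-export each model in this directory under a stable name
  cities: require('./cities')
};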

Node.js's require allows you to:
- load only one module at a time
- load modules only in a synchronous fashion
That is how the module system works in Node.js. But if you want minimatch-style pattern matching, you can roll your own, like this:
var path = require("path"),
    glob = require("glob");

// Require every file matching the given minimatch pattern,
// keyed by filename without the .js extension
function requirer(pattern) {
  var modules = {},
      files = glob.sync(pattern);
  files.forEach(function(currentFile) {
    var fileName = path.basename(currentFile);
    fileName = fileName.substring(0, fileName.lastIndexOf(".js"));
    modules[fileName] = require(currentFile);
  });
  return modules;
}
This depends on the glob module, which lets you search for files using minimatch patterns; the matched files are then required, stored in an object keyed by filename, and returned. It can be used like this:
var modules = requirer('./packages/*/model/*.js');
console.log(modules.cities);
P.S.: I am already working on publishing this as a public module.

Related

What are the best practices for writing and organizing javascript plugins?

So I have a nice chunk of code that is the same across many different JavaScript files, but I occasionally have to update it due to path changes or other conditions. Copying and pasting it into each new file works, but is annoying to maintain. Is there a good way to keep my "plugin" code in one JavaScript file and have it be accessible by the other JavaScript files that use the plugin?
I am looking for both a good Node.js solution and a vanilla JS solution. If they could be shared, that would be ideal, but it's not required. Ideally, I'd like to host my workspace in workspace/ and have some folders, workspace/front-end-js/ and workspace/back-end-nodejs/, be able to run code off a plugin in workspace/plugins/ so that I can execute things like MyPluginVar.Foo();
I am aware of some systems, like Node's var foo = require('bar'); and the front-end Browserified version, but really do not know all my options. What's the best way of writing and organizing JavaScript plugins?
--
Edit: I'm really trying to avoid npm, but it might be the best option.
You typically add your shared libraries and plugins as dependencies to your project's package.json file and install them using npm.
CommonJS modules, which use the module.exports object and require function, are the de facto standard at the moment.
ES2015 modules, which use the export and import statements, are an emerging formal standard, but support for them is not yet universal. They are a good option if your environment supports them.
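For reference, a minimal sketch of the same tiny plugin written in both styles (the file and function names are made up for illustration):
// CommonJS version of plugins/my-plugin.js
function foo() { return "foo"; }
module.exports = { foo: foo };

// CommonJS consumer
var myPlugin = require("./plugins/my-plugin");
myPlugin.foo();

// ES2015 version of plugins/my-plugin.js
export function foo() { return "foo"; }

// ES2015 consumer
import { foo } from "./plugins/my-plugin";
foo();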
To load either type of module on the front end, you will need to use a bundler such as Webpack or Browserify.
Older JavaScript modules typically publish themselves to the global scope (window), and are a simple option for front-end code.
You can also support multiple module systems by using a UMD (Universal Module Definition) wrapper. Here's an example from the UMD GitHub repo that uses CommonJS if supported, while falling back to a browser global:
(function (root, factory) {
  if (typeof exports === 'object' && typeof exports.nodeName !== 'string') {
    // CommonJS
    factory(exports, require('b'));
  } else {
    // Browser globals
    factory((root.commonJsStrictGlobal = {}), root.b);
  }
}(this, function (exports, b) {
  // b represents some dependency
  // attach properties to the exports object to define
  // the exported module properties.
  exports.action = function () {};
}));
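A quick sketch of how such a wrapped module would then be consumed in each environment (the commonJsStrictGlobal name comes from the wrapper above; the file name is made up):
// Node / CommonJS
var myModule = require('./my-umd-module');
myModule.action();

// Browser: after loading the file via a <script> tag, the wrapper
// has attached the exports to the global object
window.commonJsStrictGlobal.action();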

Require JS files dynamically at runtime using webpack

I am trying to port a library from Grunt/RequireJS to webpack and stumbled upon a problem that might be a deal-breaker for this endeavor.
The library I am trying to port has a function that loads and evaluates multiple modules into our app, based on filenames we get from a config file. The code looks like this (CoffeeScript):
loadModules = (arrayOfFilePaths) ->
  new Promise (resolve) ->
    require arrayOfFilePaths, (ms...) ->
      for module in ms
        module ModuleAPI
      resolve()
The require here needs to be called at runtime and behave like it did with RequireJS. Webpack seems to only care about what happens at build time.
Is this something that webpack fundamentally doesn't care about? If so, can I still use RequireJS alongside it? What is a good solution for loading assets dynamically at runtime?
Edit: loadModules can load modules that are not present at the build time of this library. They will be provided by the app that uses my library.
So I found that my requirement, loading files at runtime that are only available at app compile time and not at library compile time, is not easily possible with webpack.
I will change the mechanism so that my library no longer requires the files itself, but is passed the already-required modules instead. Something tells me this is going to be the better API anyway.
Edit to clarify:
Basically, instead of:
# in my library
load = (path_to_file) ->
  (require path_to_file).do_something()

# in my app (using the 'compiled' library)
cool_library.load("file_that_exists_in_my_app")
I do this:
# in my library
load = (module) ->
  module.do_something()

# in my app (using the 'compiled' library)
module = require("file_that_exists_in_my_app")
cool_library.load(module)
The first snippet worked with RequireJS but not with webpack.
In hindsight, I feel it's pretty wrong for a third-party library to load files at runtime anyway.
There is a concept named context (http://webpack.github.io/docs/context.html) that allows you to make dynamic requires; a small sketch of it appears at the end of this answer.
There is also the possibility to define code-split points: http://webpack.github.io/docs/code-splitting.html
function loadInContext(filename) {
  return new Promise(function(resolve) {
    require(['./' + filename], resolve);
  });
}

function loadModules(namesInContext) {
  return Promise.all(namesInContext.map(loadInContext));
}
And use it like the following:
loadModules(arrayOfFiles).then(function(modules) {
  modules.forEach(function(module) {
    module(moduleAPI);
  });
});
But this is likely not what you need: you will end up with a lot of chunks instead of one bundle with all the required modules, which is probably not optimal.
It is better to define the module requires in your config file and include that in your build:
// modulesConfig.js
module.exports = [
  require(...),
  ....
]

// run.js
require('./modulesConfig').forEach(function(module) {
  module(moduleAPI);
})
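For completeness, a minimal sketch of the context approach mentioned above, using webpack's require.context (the ./modules directory and the .js filter are assumptions for illustration):
// Build a context of all .js files directly under ./modules
var context = require.context('./modules', false, /\.js$/);

// Require each matched file and hand it the module API,
// mirroring what the original loadModules did
context.keys().forEach(function(key) {
  var module = context(key);
  module(moduleAPI);
});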
You can also try using a library such as this: https://github.com/Venryx/webpack-runtime-require
Disclaimer: I'm its developer. I wrote it because I was also frustrated with the inability to freely access module contents at runtime. (in my case, for testing from the console)

Substitute one module for another in node.js

If I have a node.js application that has hundreds of files referencing a module (say underscore), and I want to replace that module with another (say lodash), the obvious way to do the substitution would be a global find-and-replace plus switching out the modules in the package.json file.
Is there any way to just change the module that a name refers to, so that when node.js sees require('moduleA') it actually loads 'moduleB' instead? I know this would cause naming hell, because anyone working on the project would see require('moduleA') and wouldn't know that the real module being loaded was 'moduleB', so ultimately you'd probably want to go with the first solution. The use case I'm thinking of is trying out a few API-compatible alternative modules to measure your application's performance (for example) with each one.
If this is an ongoing thing and you want to keep the ability to programmatically switch between the options, such as in tests:
Instead of using require("underscore"); throughout your codebase, require a local file such as require("./lib/underscore");, and have that file conditionally re-export underscore or a different library:
if (global.USE_LODASH) {
  module.exports = require("lodash");
} else {
  module.exports = require("underscore");
}
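For example, a test setup file could flip the flag before anything requires the shim (a sketch; the USE_LODASH flag comes from the snippet above, the file paths are assumptions):
// test-setup.js, executed before the rest of the test suite
global.USE_LODASH = true;

// anywhere else in the codebase
var _ = require("./lib/underscore"); // now resolves to lodash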
If this is a one-off thing to try out an alternative library before making the decision to switch, and you want to do this test quickly first without find-and-replacing in all of your files:
Go inside your node_modules folder, delete or rename the underscore folder, and make a symlink named underscore to the replacement module's folder. I don't recommend this as a long-term solution: running npm install again will likely undo this hack, and most projects choose to avoid checking the node_modules folder into their source repository.
Try the mock-require module.
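A minimal sketch of how that could look; register the mock before the code under test is required (the ./app path is an assumption):
var mockRequire = require("mock-require");

// From now on, require("underscore") returns lodash instead
mockRequire("underscore", require("lodash"));

var app = require("./app"); // loaded after the mock is registered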

browserify detect custom require()

I want to use a custom require() function in my application.
Namely, I have Node's standard require() and a custom one I wrote, rootRequire(), which requires files starting from the project root. Internally, all it does is:
// rootRequire.js
var path = require('path');
var rootPath = __dirname;

global.rootRequire = function (modulePath) {
  var filepath = path.join(rootPath, modulePath);
  return require(filepath);
};

module.exports = rootRequire;
But even though rootRequire() internally uses Node's require(), it does not pick up any files required through that method.
Example:
require('rootRequire.js');
rootRequire('/A.js'); // server side it works, in the browser I get an error saying can't find module A.js
Perhaps this will answer your question as to why it is not working in the browser.
Quoting the owner's comment: "Browserify can only analyze static requires. It is not in the scope of browserify to handle dynamic requires."
Technically, you are trying to load files dynamically from a variable with your custom require. Node.js can handle that, but in the case of Browserify, all requires are statically analyzed and packaged when you run the CLI to build the bundle.
HTH.
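One way around that limitation is to give Browserify literal require() calls it can analyze and dispatch between them at runtime. A sketch (A.js comes from the question; B.js and the map shape are assumptions):
// rootRequire.js, rewritten with statically analyzable requires
var staticModules = {
  '/A.js': require('./A.js'),
  '/B.js': require('./B.js')
};

global.rootRequire = function (modulePath) {
  return staticModules[modulePath];
};

module.exports = global.rootRequire;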
What I recommend doing here is this:
First create a bundle of everything you might load through rootRequire(), something like:
browserify -r CURRENT_PATH/A.js -r CURRENT_PATH/B.js > rootRequirePackages.js
Then, in your webpage, you can include both rootRequirePackages.js and your regular Browserified file.
Perhaps just use RequireJS? It's simple to set up and relatively easy to use.

Refactoring a userscript to use javascript modules

I'm working on a userscript - in particular this userscript - which has been designed to encapsulate functionality in modules. In order to do some automated testing, I would like to split the modules into their own files and use Node.js's module exports and require functions to combine them into one file for use in Greasemonkey or simple browser extensions.
My first thought was to just copy the modules into their own files, like so:
module.js
var exportedModule = (function () {
  var Module = {
    // public functions and members
  };
  // private functions and members
  return Module;
}());

module.exports = exports = exportedModule;
And then have a central file that requires each of these modules, perhaps compiling them with something like Browserify.
script.js
var importedModule = require('./module');
importedModule.init();
Is this possible?
It seems to me that you would be better off using RequireJS, which uses AMD-style modules and is inherently more browser-friendly. Node's CommonJS-style modules are synchronous and do not fit the browser model very well.
Of course, using RequireJS will change your scripts a bit.
It's possible, and Browserify makes it easy:
browserify src/my.user.js -o dist/my.user.js
The metadata in the source file may get moved, but it's still parsed correctly (by Greasemonkey at least).
For a more complex example which compiles various assets, including CSS and images, see here.
