Substitute one module for another in node.js - javascript

If I have a node.js application with hundreds of files that reference a module (say underscore) and I want to replace that module with another (say lodash), the obvious way to make the substitution would be a global find-and-replace on the name and swapping the modules in the package.json file.
Is there any way to change the module that a name refers to, so that when node.js sees require('moduleA') it actually loads 'moduleB' instead? I know this would cause naming hell, because anyone working on the project would see require('moduleA') and wouldn't know that the real module being loaded was 'moduleB', so ultimately you'd probably want to go with the first solution. The use case I have in mind is trying out a few API-compatible modules to measure your application's performance (for example) with each one.

If this is an ongoing thing and you want to keep the ability to programmatically switch between the options, such as in tests:
Instead of using require("underscore") throughout your codebase, require a local file like require("./lib/underscore"), and have that file conditionally re-export underscore or a different library:
if (global.USE_LODASH) {
    module.exports = require("lodash");
} else {
    module.exports = require("underscore");
}
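A quick usage sketch (the file names here are hypothetical): set the flag before anything requires the wrapper, e.g. at the top of a benchmark or test entry point:
// bench-lodash.js -- hypothetical entry point
global.USE_LODASH = true;              // must run before ./lib/underscore is first required
var _ = require("./lib/underscore");   // from here on, _ is lodash for this process
require("./app");                      // the rest of the app picks up the same choice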
If this is a one-off thing to try out an alternative library before deciding to switch, and you want to run the test quickly without find-and-replacing across all of your files:
Go inside your node_modules folder, delete or rename the underscore folder, and create a symlink named underscore pointing to the replacement module's folder. I don't recommend this as a long-term solution: running npm install again will likely undo this hack, and most projects avoid checking the node_modules folder into their source repository.

Try the mock-require module.
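A minimal sketch of that approach, assuming you want every subsequent require("underscore") in the process to resolve to lodash:
var mockRequire = require("mock-require");

// after this call, require("underscore") anywhere in this process returns lodash
mockRequire("underscore", require("lodash"));

require("./app");        // load the application only after the substitution is in place
// mockRequire.stopAll() // restores the real modules when you're done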

Related

Ember: why do we have to use import for certain bower dependencies

In an Ember app, when using certain dependencies like moment installed via bower, we have to also import the same in the ember-cli-build.js file:
app.import('bower_components/moment/moment.js');
My question is why is that needed, since I would assume everything inside node_modules as well as bower_components should be available for use inside the app.
Also, if that is not the case, how do we identify which dependencies require such an explicit import in order to use them?
You don't have to, actually.
There is a package now that lets you 'just import' things: https://github.com/ef4/ember-auto-import
Some reading on the topic of importing: https://discuss.emberjs.com/t/readers-questions-how-far-are-we-from-being-able-to-just-use-any-npm-package-via-the-import-statement/14462?u=nullvoxpopuli
An in-depth answer to your question, and the reasons behind why things are the way they are, is posted here:
https://discuss.emberjs.com/t/readers-questions-why-does-ember-use-broccoli-and-how-is-it-different-from-webpack-rollup-parcel/15384?u=nullvoxpopuli
(A bit too long for Stack Overflow; I'm also on mobile, and I wouldn't want to lose all the links and references in a copy-paste.)
Hope this helps.
Edit:
To answer:
I just wanted to understand "in what cases" do we need to use the import statement in our ember-cli-build (meaning we do not do import for all the dependencies we have in our package/bower.json)...But only for specific ones...I wanted to know what is the criteria or use case for doing import.
Generally, for every package, hence the appeal of auto-import and/or packagers (where webpack may be used instead of rollup in the future).
Though it's common for ember addons to define their own app.import so that you don't need to configure it yourself, like any of these shims; specifically, here is how the c3 charting library is shimmed: https://github.com/mike-north/ember-c3-shim/blob/master/index.js#L7
Importing everything 'manually' like this is a bit of a nuisance, but it is, in part, due to the fact that JS packages do not have a consistent distribution format. There is UMD, AMD, CJS, ES6, etc.
With the rollup and broccoli combo, we need to manually specify which format a file is in. There are some big advantages to the rollup + broccoli approach, which can be demonstrated here and here.
Sometimes, depending on the transform, you'll need a "vendor-shim".
These are handy when a module has decided it wants to attach itself to the window/global object instead of being available as a module export.
Link: https://simplabs.com/blog/2017/02/13/npm-libs-in-ember-cli.html
(self represents window/global)
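A minimal vendor-shim sketch along the lines of the article above (the file path, the module name my-lib, and the self.myLib global are made up for illustration):
// vendor/shims/my-lib.js
(function() {
    function vendorModule() {
        'use strict';
        // hand the global the library attached to self back to the module system
        return { 'default': self['myLib'] };
    }
    define('my-lib', [], vendorModule);
})();
You would then app.import the shim file after the library itself in ember-cli-build.js, and write import myLib from 'my-lib'; in app code.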
However, webpack has already done the work of figuring out how to detect what format a JS file is in, and it abstracts all of that away from you. webpack is what ember-auto-import uses, and it is what allows you to simply write
import { stuff } from 'package-name';. The downside to webpack is that you can't pipeline your transforms (which most people may not need, but it can be handy if you're doing TypeScript -> Babel -> ES5).
Actually: (almost) everything!
By default, Ember does not add anything to your app except ember addons. There are, however, some addons that dynamically add stuff to your app, like ember-browserify or ember-auto-import.
Also, just because you app.import something does not mean you can use the code with import ... from 'my-package'. The one thing app.import does is add the specified file to your vendor.js file. Nothing else.
How you use the dependency depends completely on the provided JS file! Ember uses loader.js, an AMD module loader. So if the JS file you app.imported uses AMD (or UMD), this will work and you can import Foo from 'my-package' (because this is actually transpiled to AMD imports).
If the provided JS file provides a global, you can just use the global.
However, there is also the concept of vendor shims. That's basically just a tiny AMD module you write yourself to export the global as an AMD module.
There are also a lot of ember addons that add stuff to your app. For example, things like ember-cli-moment-shim exist just to automagically add a dependency to your project. How it's done, however, depends completely on the addon.
So the rule is:
If it's an ember addon, read the addon docs; usually you shouldn't need to app.import anything.
In every other case you need to wire the library up manually, either with app.import or with manual broccoli transforms (see the sketch after this list).
The only exception is if you use an addon that tries to generically add dependencies to your project, like ember-browserify or ember-auto-import.
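As an illustration of the manual app.import route, ember-cli can also assign a module name to a file that ships as an anonymous AMD module via the amd transformation (the package name some-lib and its dist path below are made up):
// ember-cli-build.js
app.import('node_modules/some-lib/dist/some-lib.amd.js', {
    using: [{ transformation: 'amd', as: 'some-lib' }]
});
// afterwards, in app code:
// import someLib from 'some-lib';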

Webpack & Testing: Helper to delete/replace modules from the require cache

For our tests we need to be able to replace or delete modules from the require cache, e.g. to replace them with a fake implementation.
In order to achieve this we implemented a little helper function:
fakeModule = function (modulePath, fakeExportsObject) {
    require.cache[require.resolve(modulePath)] = { exports: fakeExportsObject };
};
However, when we run this through webpack we get the following critical warning: "the request of a dependency is an expression", and all JavaScript files in the project are included in the webpack build.
Is there any possibility to disable the interpretation of the helper function? In our tests we can safely assume that we are only deleting/replacing modules that exist in the require cache. Even if not, it wouldn't really matter.
Have you looked at rewire and rewire-webpack? I just started looking into testing with webpack and will need to find a way to accomplish this as well. Rewire sounds promising, but I haven't used it yet.
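For reference, a minimal rewire sketch (the module path and the injected fake are hypothetical):
var rewire = require("rewire");

// load the module under test with injectable internals
var reader = rewire("../src/reader");

// swap its private fs dependency for a fake in this test
reader.__set__("fs", {
    readdirSync: function () { return ["fake.txt"]; }
});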

How can I require modules with patterns in the path?

How can I include all files in nodeJS, like
require('./packages/city/model/cities')
require('./packages/state/model/states')
require('./packages/country/model/countries')
with something like
require('./packages/*/model/*')
the same way grunt loads files?
You can't (or at least you shouldn't)
In order to do this, you would have to overload node's native require function, which is highly inadvisable.
The CommonJS pattern might seem tedious to you, but it's a very good one and you shouldn't try to break it just because you saw shortcuts in other languages/frameworks.
By introducing some form of magic in your module, you suddenly change everything that programmers can (and should be able to) safely assume about the CommonJS pattern itself.
Due to the one-to-one correspondence in node's module loading system, it won't be possible natively, but I would not be surprised if there is a package for this.
The best you can do is create an index.js that loads the modules present in a directory and exports them as its own:
module.exports = function () {
    return {
        city: require('./packages/city/model/'),
        state: require('./packages/state/model/'),
        country: require('./packages/country/model/')
    };
};
You would have to load the models in a similar fashion in all three directories as well.
I know this solution is not what you are looking for, but in my experience this method makes custom packages easier to manage, since you can add and remove features easily.
Node.js's require allows you to
load only one module at a time
load modules only in synchronous fashion.
That is how the module system works in Node.js. But if you want minimatch-style matching, you can roll your own, like this:
var path = require("path"),
    glob = require("glob");

function requirer(pattern) {
    var modules = {},
        files = glob.sync(pattern);
    files.forEach(function (currentFile) {
        var fileName = path.basename(currentFile);
        fileName = fileName.substring(0, fileName.lastIndexOf(".js"));
        modules[fileName] = require(currentFile);
    });
    return modules;
}
This depends on the glob module, which lets you use minimatch patterns to search for files; we then require the files that were found, store them in an object and return the object. It can be used like this:
var modules = requirer('./packages/*/model/*.js');
console.log(modules.cities);
P.S.: I am already working on making this a public module.

Node.js package self-reference

Can a package require itself and its subsystems?
For instance, there is a module:
src/deep/path/to/module.js
which needs to require
src/another/module.js
Instead of:
require('./../../../another/module.js');
Can one just:
require('<self>/another/module.js');
?
For instance, this might be useful in testing: a test unit could reference its test subject without a long up-and-down-style path.
I have two considerations (but they do not satisfy this issue completely):
If the package is already in the node_modules folder, it can reference itself by its
canonical name (the one in its package.json).
The package can create a symlink to itself in its own node_modules folder (sic!). I haven't tried it yet; it might lead to an infinite loop in some resolution cases.
Solution 1
Split it into different sub-projects and put each one into a different folder. As an example:
sub.project.1/
sub.project.2/
in sub.project.1/
# cd ../sub.project.1
npm link
# cd ../sub.project.2
npm link sub.project.1
Then in sub.project.2 you can do it simply:
var something = require("sub.project.1")
This removes the '../../...'-style relative paths.
Note: this can also be done within a single folder/project; by doing this,
the project can easily refer to itself from its sub folders.
For example, both sub.project.1 and sub.project.2 could be replaced by my.project. And of course, all the names should be the names in the package.json files initialized by npm.
Solution 2
Create a link in the node_modules/ folder:
# cd node_modules
ln -s .. myProjectRootDir
#
# where: .. : means parent directory in linux shell
# '#' means comments in linux shell
#
Then it can be used from anywhere in the directory tree:
var something = require("myProjectRootDir/path/to/js/file")
Thus the "../../..." paths become easier to read.
And if myProjectRootDir happens to be the project name and
the package name in package.json, that's fine too.
Solution 3
There are npm packages, require-self and install-self,
that do similar things.
Solution 4
Write a new .js file that is easy to require,
then put all the annoying relative references into it.
For example, write it at node_modules/mymod.js:
// mymod.js
module.exports.mod_one = require("../path/to/mod/one.js");
//...
Then require('mymod') gives you all the other modules.
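A quick usage sketch (mod_one is the export defined above):
// anywhere in the project, no matter how deeply the file is nested
var one = require("mymod").mod_one;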
This is not the best solution: every reference still
needs to be added by hand. But it's a single file, so it's
manageable, and it centralizes the references for future deployments.
One of the cons of this solution is that if you put the file in the
node_modules/ folder and that folder is ignored in the git repository, you need to take care when pushing or branching the repo.
When deleting the node_modules folder, the file can also be deleted by accident.
There could be more solutions I don't know of.
Well, you can shorten it a little bit. You don't need the leading ./ in this case:
require("../../../another/module.js");
And you can shorten it a little further by removing the trailing .js:
require("../../../another/module");
Another answer suggests using process.cwd(), but be very careful with this: your require calls will only work if the app is started from the same directory.
From the sounds of it, though, 4 directories is already pretty deep. You might want to consider fragmenting your large project into smaller, single-purpose modules. We would need more information on the project to know whether that is the right decision, though.
I often use process.cwd() to make things like this a little more manageable. It returns the directory the node application was started from and lets you build the path in a cleaner fashion.
Something maybe like var x = require(process.cwd() + '/lib/module')
Without seeing exactly what you're trying to do, I'm not sure if this will be helpful, but you can do things like var connect = require('express/connect') as well. Basically you can reference an installed local module and then build paths off of it.

Is there a way to lazily set the path of a resource with RequireJS?

So, I have an app that is using requireJS. Quite happily. For the most part.
This app makes use of Socket.IO. Socket.IO is being provided by nodejs, and does not run on the same port as the main webserver.
To deal with this, in our main js file, we do something like this:
var hostname = window.location.hostname;
var socketIoPath = "http://" + hostname + ":3000/socket.io/socket.io";
requirejs.config({
    baseUrl: "/",
    paths: {
        app: "scripts/appapp",
        "socket.io": socketIoPath
    }
});
More complicated than this, but you get the gist.
Now, in interactive mode, this works swimingly.
The ugliness starts when we try to use r.js to compile this (technically we're using grunt to run r.js, but that's beside the point).
In the config for r.js, we set an empty path for socket.io (to avoid it failing to pull in), and we set our main file as the mainConfigFile.
The compiler yells about this, saying:
Running "requirejs:dist" (requirejs) task
>> Error: Error: The config in mainConfigFile /…/client.js cannot be used because it cannot be evaluated correctly while running in the optimizer. Try only using a config that is also valid JSON, or do not use mainConfigFile and instead copy the config values needed into a build file or command line arguments given to the optimizer.
>> at Function.build.createConfig (/…/r.js:23636:23)
Now, as near as I can figure, this is due to the fact that I'm using a variable to set the path for "socket.io". If I take that out, r.js runs great, but I can't run the raw (unbuilt) app from a server. If I leave it in, my debug server is happy, but the build breaks.
Is there a way I can lazily assign the path of "socket.io" at runtime so that it doesn't have to go into the requirejs.config() method at that point?
Edit: Did some extensive research on this. Here are the results.
Loading from CDN with RequireJS is possible with a build. However, if you're using the smaller Almond loader, it's not possible.
This leaves you with a few options:
1. Use almond along with a local copy of the file in your build.
2. Use the full require.js loader and try to use a CDN.
3. Use a <script> tag just for that resource.
I say try for #2 because there are some caveats. You'll need to include require.js in your HTML with the data-main attribute for your built file. But if you do this, require and define will be global functions, allowing users to require any of your internal modules and mess around with them. If you're okay with this, you'll need to follow the "empty: scheme" in your build config (but not in your main config).
But the fact remains that you now have another HTTP request. If you only want one built file, which includes the require.js loader, you'll need to optimize for only one file.
Now, if you want to avoid users being able to require your modules, you'll have to do something like wrap:true in your build. But as far as I can tell, once your module comes down from CDN, if it's AMD, it's going to look for a global define function to register itself with, and that won't exist because it's now wrapped in a closure.
The lesson I took away from all this: inline your resources to your build. It makes sense. You reduce HTTP requests, minify it all and get gzip compression. You don't expose your modules to the world and everything is a lot simpler. If you cache your resources properly you won't even need to worry about it.
But since new versions of socket.io don't like AMD, here's how I did it. Make sure to include the socket.io <script> tag before requirejs. Then create a requirejs module named socket.io with the following contents:
define([], function () {
    var io = window.io;
    window.io = null;
    return io;
});
Set the path like so: 'socket.io': 'core/socket.io' or wherever you want.
And require it as normal! The build works fine this way.
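For completeness, consuming the shim then looks like any other AMD dependency (the connection URL below is just an example):
define(["socket.io"], function (io) {
    // io is the global that the shim captured and returned
    var socket = io("http://" + window.location.hostname + ":3000");
    socket.on("connect", function () { /* connected */ });
});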
Original answer
Is it possible that you could make use of the path config fallbacks specified in the RequireJS API? Maybe you could save the file locally as a fallback so your build will work.
The socket.io GitHub repository specifies that you can serve the client with the files in the socket.io-client package's dist/ directory.
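A minimal sketch of that fallback idea, assuming a local copy of the client is saved at lib/socket.io.js (the local path is made up):
requirejs.config({
    paths: {
        // try the node server first, fall back to the local copy for the build
        "socket.io": [socketIoPath, "lib/socket.io"]
    }
});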
