Webpack & Testing: Helper to delete/replace modules from the require cache - javascript

For our tests we need to be able to replace or delete modules from the require cache, e.g. to replace them with a fake implementation.
In order to achieve this we implemented a little helper function:
var fakeModule = function (modulePath, fakeExportsObject) {
  require.cache[require.resolve(modulePath)] = { exports: fakeExportsObject };
};
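In a test it is used roughly like this (the module path and the fake implementation are illustrative):
// hypothetical example: stub out a mailer module before requiring the code under test
fakeModule('./mailer', {
  send: function () { return Promise.resolve('stubbed'); }
});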
However, when we run this through webpack we get the critical warning "the request of a dependency is an expression", and all JavaScript files in the project end up included in the webpack build.
Is there any way to disable webpack's interpretation of the helper function? In our tests we can safely assume that we are only deleting/replacing modules that actually exist in the require cache, and even if not, it wouldn't really matter.

Have you looked at rewire and rewire-webpack? I just started looking into testing with webpack and will need to find a way to accomplish this as well. Rewire sounds promising, but I haven't used it yet.
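If you stick with the hand-rolled helper, one hedged workaround is to route it through __non_webpack_require__ (webpack's escape hatch described in the next section), so the bundler never tries to statically analyse the expression. A sketch, assuming the tests are bundled with a node target:
// sketch: __non_webpack_require__ is emitted as Node's plain require,
// so webpack performs no dependency analysis on this call
var fakeModule = function (modulePath, fakeExportsObject) {
  var nodeRequire = __non_webpack_require__;
  nodeRequire.cache[nodeRequire.resolve(modulePath)] = { exports: fakeExportsObject };
};
Outside of a webpack build __non_webpack_require__ is not defined, so you would keep the original helper for plain Node runs.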

Related

Importing from outside webpack (runtime importing)

This is just something I thought about today, and since I didn't find much information I'm going to share these weird cases and how I personally solved them (if there's a better way please comment, but meanwhile this might help others ^^)
In a webpack bundle, every import/require you do is managed by webpack through its internal require function. That means you can no longer use the original Node.js require from inside a webpack bundle.
See also: Exporting from outside webpack bundle
There's actually a workaround provided by webpack:
a variable called __non_webpack_require__, which is the original require. So, in your code, you can do this:
const internalModule = require('internal/module');
// or import internalModule from 'internal/module'; in the ES6 way
const externalModule = __non_webpack_require__('external/module');
If you are using TypeScript, you can add this line in your global.d.ts file to avoid syntax errors:
declare const __non_webpack_require__: NodeRequireFunction;
Fact 1: After the build you can see that your commonly used require (webpack's) has been renamed to __webpack_require__, and that __non_webpack_require__ has been preserved as the original require.
Fact 2: Webpack also uses the original require to import modules from the system (not bundled with the code), such as net, events, fs, etc.
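To make Fact 1 concrete, the emitted bundle looks very roughly like this (a simplified sketch with a made-up module id, not literal webpack output):
// simplified sketch of what webpack emits for the snippet above
var internalModule = __webpack_require__(42);     // bundled module, id is illustrative
var externalModule = require('external/module');  // __non_webpack_require__ became plain require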

Ember why do we have to use import for certain bower dependencies

In an Ember app, when using certain dependencies like moment installed via bower, we have to also import the same in the ember-cli-build.js file:
app.import('bower_components/moment/moment.js');
My question is why is that needed, since I would assume everything inside node_modules as well as bower_components should be available for use inside the app.
Also, if that is not the case, how do we identify which dependencies need such an explicit import before they can be used?
You don't have to, actually.
There is a package now that lets you 'just import' things: https://github.com/ef4/ember-auto-import
Some reading on the topic of importing: https://discuss.emberjs.com/t/readers-questions-how-far-are-we-from-being-able-to-just-use-any-npm-package-via-the-import-statement/14462?u=nullvoxpopuli
An in-depth answer to your question, and the reasons why things are the way they are, is posted here:
https://discuss.emberjs.com/t/readers-questions-why-does-ember-use-broccoli-and-how-is-it-different-from-webpack-rollup-parcel/15384?u=nullvoxpopuli
(A bit too long for stack overflow, also on mobile, and I wouldn't want to lose all the links and references in a copy-paste)
Hope this helps
Edit:
To answer:
I just wanted to understand "in what cases" do we need to use the import statement in our ember-cli-build (meaning we do not do import for all the dependencies we have in our package/bower.json)...But only for specific ones...I wanted to know what is the criteria or use case for doing import.
Generally, for every package; hence the appeal of ember-auto-import and/or packagers (where webpack may be used instead of rollup in the future).
Though, it's common for ember addons to define their own app.import so that you don't need to configure it yourself, like the existing shim addons do; specifically, here is how the c3 charting library is shimmed: https://github.com/mike-north/ember-c3-shim/blob/master/index.js#L7
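For illustration, such an addon typically does the app.import for you from its index.js; a sketch with hypothetical names (the real thing is in the c3 shim linked above):
// index.js of a hypothetical shim addon
'use strict';

module.exports = {
  name: 'ember-my-lib-shim', // hypothetical addon name

  included(app) {
    this._super.included.apply(this, arguments);
    // add the library's prebuilt file to the consuming app's vendor.js
    app.import('node_modules/my-lib/dist/my-lib.js');
  }
};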
Importing everything 'manually' like this is a bit of a nuisance, but it is, in part, due to the fact that JS packages do not have a consistent distribution format. There is UMD, AMD, CJS, ES6, etc.
With the rollup and broccoli combo, we need to manually specify which format a file is in. There are some big advantages to the rollup + broccoli approach, which can be demonstrated here and here.
Sometimes, depending on the transform, you'll need a "vendor-shim".
These are handy when a module has decided it wants to be available on the window / global object instead of available as a module export.
Link: https://simplabs.com/blog/2017/02/13/npm-libs-in-ember-cli.html
(self represents window/global)
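For reference, such a vendor shim is only a few lines; a sketch following the pattern from the post above, with a hypothetical global name:
// vendor/shims/my-global-lib.js -- hypothetical shim for a library that only
// attaches itself to the global object as `MyGlobalLib`
(function () {
  function vendorModule() {
    'use strict';
    return { 'default': self['MyGlobalLib'] };
  }

  define('my-global-lib', [], vendorModule);
})();
You would then app.import('vendor/shims/my-global-lib.js') in ember-cli-build.js and import it like any other module.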
However, webpack has already done the work of figuring out how to detect what format a JS file is in, and abstracts all of that away from you. webpack is what ember-auto-import uses, and is what allows you to simply
import { stuff } from 'package-name';. The downside to webpack is that you can't pipeline your transforms (which most people may not need, but it can be handy if you're doing TypeScript -> Babel -> ES5).
Actually: (almost) everything!
By default, Ember does not add anything to your app except ember addons. There are, however, some addons that dynamically add stuff to your app, like ember-browserify or ember-auto-import.
Also, just because you do app.import does not mean you can use the code with import ... from 'my-package'. The one thing app.import does is add the specified file to your vendor.js file. Nothing else.
How you use this dependency depends completely on the provided JS file! Ember uses loader.js, an AMD module loader. So if the JS file you app.imported uses AMD (or UMD), this will work and you can import Foo from 'my-package' (because that import is actually transpiled to an AMD import).
If the provided JS file provides a global you can just use the global.
However, there is also the concept of vendor shims. That's basically just a tiny AMD module you can write to export the global as an AMD module.
There are also a lot of ember addons that add stuff to your app. For example, things like ember-cli-moment-shim exist just to automagically add a dependency to your project. How it's done, however, completely depends on the addon.
So the rule is:
If it's an ember addon, read the addon docs; usually you shouldn't app.import it yourself.
In every other case you need to wire up the library manually, either with app.import or with manual broccoli transforms.
The only exception is if you use an addon that tries to generically add dependencies to your project, like ember-browserify or ember-auto-import.

Substitute one module for another in node.js

If I have a node.js application with hundreds of files that reference a module (say underscore) and I want to replace that module with another (say lodash), then the obvious way to do the substitution would be a global name replace and switching out the modules in the package.json file.
Is there any way to just change the module that a name refers to, so that when node.js sees require('moduleA') it actually loads 'moduleB' instead? I know this would cause naming hell, because anyone working on the project would see require('moduleA') and wouldn't know that the real module being loaded was 'moduleB', so ultimately you'd probably want to go with the first solution. The use case I'm thinking of is trying a few alternative, API-compatible modules to measure your application's performance (for example) with each one.
If this is an on-going thing and you want to maintain the ability to programmatically switch between the options often, such as in tests:
Instead of using require("underscore"); throughout your codebase, require a local file instead like require("./lib/underscore");, and have that file conditionally re-export underscore or a different library:
if (global.USE_LODASH) {
  module.exports = require("lodash");
} else {
  module.exports = require("underscore");
}
If this is a one-off thing to try out an alternative library before making the decision to switch, and you want to do this test quickly first without find-and-replacing in all of your files:
Go inside your node_modules folder, delete or rename the underscore folder, and make a symlink named underscore to the replacement module's folder. I don't recommend this as a long-term solution: running npm install again will likely undo this hack, and most projects choose to avoid checking the node_modules folder into their source repository.
Alternatively, try the mock-require module.
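With mock-require the substitution can be done per test run; a sketch (module names are illustrative, and only modules required after the mock call are affected):
// sketch using the mock-require package
const mock = require('mock-require');

// from here on, require('underscore') resolves to lodash for newly loaded modules
mock('underscore', require('lodash'));

const app = require('./app'); // the app's require('underscore') now returns lodash

// restore the real module when done
mock.stop('underscore');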

Module definition to work with node.js, require.js and with plain script tags too

I am working on a javascript module/library that should work in 3 environments:
in node.js
in requirejs
when simply included using <script> tags in the webpage. In this case the whole module should be hooked up under window.myModule
Do you have any suggestions as to how to write the structure of the library so that it works in all these environments?
EDIT: basically I mean some sort of wrapper code around the library so that I can call the file from any of those three methods and I'm fine...
This requirement and its solution are known as the Universal Module Definition (UMD). It is currently a draft proposal. Background and current status are described in Addy Osmani's article Writing Modular JavaScript With AMD, CommonJS & ES Harmony. Look for the "UMD" link pointing to various templates you can use.
Quite a few other templates can be found on the web; UMD is the search keyword.
(did not find the final link myself yet :)
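For reference, a minimal UMD-style wrapper might look like this (a sketch; myModule is the illustrative global from the question, and the returned API is a placeholder):
// sketch of a UMD wrapper covering Node.js, RequireJS and plain <script> tags
(function (root, factory) {
  if (typeof define === 'function' && define.amd) {
    // AMD / RequireJS
    define([], factory);
  } else if (typeof module === 'object' && module.exports) {
    // CommonJS / Node.js
    module.exports = factory();
  } else {
    // plain <script> tag: hook it up under window.myModule
    root.myModule = factory();
  }
}(typeof self !== 'undefined' ? self : this, function () {
  // the actual library implementation goes here
  return {
    hello: function () { return 'hello from myModule'; }
  };
}));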
We're working on the same thing, I think.
And we have had some success. We have a library (we call it 'slib'), compiled to AMD JS files. It does not depend on npm modules or the browser, so it can be called from Node and from the browser.
1) To call it from node, we use requirejs:
file require.conf.js
module.exports = function (nodeRequire) {
  global.requirejs = require('requirejs');
  requirejs.config({
    baseUrl: __dirname + "/../web/slib/",
    paths: {
      slib: "."
    },
    nodeRequire: nodeRequire
  });
};
In any other serverside (nodejs) file we add this line at the beginning
require("./require.conf")(require);
then we call slib's code by:
var Computation = requirejs("slib/Computation");
2) To call slib from the browser, we just use requirejs. It handles everything fine.
3) We do not need to call slib from a <script> tag directly.
For production, we use r.js to make a bundled JS file with most of the dependencies and use it on the page with a single <script>. This script downloads any other deps that are not included, using standard requirejs, and (as far as I remember) it does not need a separate requirejs, it just works alone. This is very flexible for large projects: use requirejs during development, use r.js to bundle the core files in production to speed up page load, and use the whole bundle if you need only one <script> without any other requests. r.js bundles all dependencies correctly, including old JS libraries that were commonly loaded with only a <script> tag and accessed through window.myOldLibrary, using the shim param in the config.
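The shim param mentioned above looks roughly like this (a sketch; the library name and path are illustrative):
// requirejs config sketch for an old library that only sets window.myOldLibrary
requirejs.config({
  paths: {
    myOldLibrary: 'vendor/my-old-library' // illustrative path, no .js extension
  },
  shim: {
    myOldLibrary: {
      exports: 'myOldLibrary' // the global that RequireJS hands back as the module value
    }
  }
});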
It seems you can use browserify to make some npm modules accessible from slib's code, but we have not tried that yet.
Also, using requirejs on the Node side can probably be made simpler (why do we need a second 'requirejs' function alongside Node's own require?). We just have not investigated it well, but this works.
In any slib module you can write
if (window)
  window.module1 = this; // or whatever
and it will be exported like an old-style JS lib when loaded.

Is there a way to lazily set the path of a resource with RequireJS?

So, I have an app that is using requireJS. Quite happily. For the most part.
This app makes use of Socket.IO. Socket.IO is being provided by nodejs, and does not run on the same port as the main webserver.
To deal with this, in our main js file, we do something like this:
var hostname = window.location.hostname;
var socketIoPath = "http://" + hostname + ":3000/socket.io/socket.io";
requirejs.config({
  baseUrl: "/",
  paths: {
    app: "scripts/appapp",
    "socket.io": socketIoPath
  }
});
More complicated than this, but you get the gist.
Now, in interactive mode, this works swimmingly.
The ugliness starts when we try to use r.js to compile this (technically we're using grunt to run r.js, but that's beside the point).
In the config for r.js, we set an empty path for socket.io (so the build doesn't fail trying to pull it in), and we set our main file as the mainConfigFile.
The compiler yells about this, saying:
Running "requirejs:dist" (requirejs) task
>> Error: Error: The config in mainConfigFile /…/client.js cannot be used because it cannot be evaluated correctly while running in the optimizer. Try only using a config that is also valid JSON, or do not use mainConfigFile and instead copy the config values needed into a build file or command line arguments given to the optimizer.
>> at Function.build.createConfig (/…/r.js:23636:23)
Now, as near as I can figure, this is due to the fact that I'm using a variable to set the path for "socket.io". If I take this out, require runs great, but I can't run the raw source from a server. If I leave it in, my debug server is happy, but the build breaks.
Is there a way that I can lazily assign the path of "socket.io" at runtime so that it doesn't have to go into the requirejs.config() method at that point?
Edit: Did some extensive research on this. Here are the results.
Loading from CDN with RequireJS is possible with a build. However, if you're using the smaller Almond loader, it's not possible.
This leaves you with a few options:
Use almond along with a local copy of the file in your build.
Use the full require.js loader and try to use a CDN.
Use a <script> tag just for that resource.
I say try for #2 because there are some caveats. You'll need to include require.js in your HTML with the data-main attribute for your built file. But if you do this, require and define will be global functions, allowing users to require any of your internal modules and mess around with them. If you're okay with this, you'll need to use the "empty:" scheme for the path in your build config (but not in your main config).
But the fact remains that you now have another HTTP request. If you only want one built file, which includes the require.js loader, you'll need to optimize for only one file.
Now, if you want to avoid users being able to require your modules, you'll have to do something like wrap:true in your build. But as far as I can tell, once your module comes down from CDN, if it's AMD, it's going to look for a global define function to register itself with, and that won't exist because it's now wrapped in a closure.
The lesson I took away from all this: inline your resources to your build. It makes sense. You reduce HTTP requests, minify it all and get gzip compression. You don't expose your modules to the world and everything is a lot simpler. If you cache your resources properly you won't even need to worry about it.
But since new versions of socket.io don't like AMD, here's how I did it. Make sure to include the socket.io <script> tag before requirejs. Then create a requirejs module named socket.io with the following contents:
define([], function () {
  var io = window.io;
  window.io = null;
  return io;
});
Set the path like so: 'socket.io': 'core/socket.io' or wherever you want.
And require it as normal! The build works fine this way.
Original answer
Is it possible that you could make use of the path config fallbacks specified in the RequireJS API? Maybe you could save the file locally as a fallback so your build will work.
The socket.io GitHub repository specifies that you can serve the client with the files in the socket.io-client package's dist/ directory.
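The paths fallback is an array of locations that RequireJS tries in order, so the built app could fall back to a local copy; a sketch (the CDN URL and local path are illustrative):
// sketch of RequireJS path fallbacks: try the remote location first,
// then fall back to a hypothetical local copy saved for the build
requirejs.config({
  baseUrl: '/',
  paths: {
    'socket.io': [
      '//cdn.example.com/socket.io/socket.io', // primary location (illustrative)
      'scripts/lib/socket.io-local'            // local fallback (hypothetical)
    ]
  }
});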
