Ignore imports when importing internal/external dependencies in Jest - javascript

I found related questions, but not exactly what I was looking for. When you mock an internal dependency (a module/file you created) or an external one, is it possible to ignore all of a file's imports by default and only allow the ones you need?
For example, pretend the method foo is being tested and it lives in a file called Foo.js. I only need the camelcase dependency to work as normal, because it is used inside the foo function. The rest of the dependencies can be mocked so I avoid loading something I don't want to load (like code that just lives at the top level of the file, not inside a class or method):
// Foo.js
const { get } = require('lodash');
const camelCase = require('camelcase');
const Bar = require('../../../Bar.js');
...
const foo = () => {
  // need to test this method and only needs camelcase
  ...
}
Is it possible to say "ignore every dependency except this one" when my test file requires (imports) Foo.js in order to test the foo function?
I am very new to Jest, so sorry if my question doesn't make perfect sense or has a simple, better solution. I am working with an existing code base where some includes go really deep, pulling in more and more imports down the tree, and I can only really work with what's in the test functions.
thanks in advance!
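For reference, Jest can express exactly this with jest.mock: a minimal sketch, assuming Foo.js sits next to its test file and that empty factories are acceptable stand-ins. Passing a factory means the real module is never loaded, so top-level code in Bar.js never runs:

// Foo.test.js (hypothetical test file for the question's Foo.js)
// Factories replace the modules entirely, so Bar.js is never loaded.
jest.mock('lodash', () => ({ get: jest.fn() }));
jest.mock('../../../Bar.js', () => ({}));
// camelcase is deliberately NOT mocked, so foo() uses the real package.

const { foo } = require('./Foo.js');

test('foo works with the real camelcase', () => {
  expect(foo()).toBeDefined();
});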

Related

Hijacking node require even when spawning another script

I am trying to replace a specific package using
import Module from 'module';

const { require: oldRequire } = Module.prototype;

Module.prototype.require = function twilioRequire(file) {
  if (file === 'the-package-of-interest') {
    // return something else
  }
  return oldRequire.apply(this, arguments);
};
const p = require('the-package-of-interest');
// this gives me the replacement
This would work fine, but if this is placed inside a script that spawns another script, it does not work in the spawned script, i.e.:
// main.js
import Module from 'module';
import spawn from 'cross-spawn';

const { require: oldRequire } = Module.prototype;

Module.prototype.require = function twilioRequire(file) {
  if (file === 'the-package-of-interest') {
    // return something else
  }
  return oldRequire.apply(this, arguments);
};
spawn.sync('node', ['/path/to/another/script.js'], { stdio: "inherit" });
// another/script.js
require('the-package-of-interest');
// gives the original package, not the replacement
I don't suppose there is a way to spawn another script, but keep the hijacked require scope the same?
Even though mocking libraries are usually used in tests, this might be a good situation for stubbing one library or file you wrote with another, based on configuration.
There are a lot of mocking libraries for Node, and there's a good article which covers some of them. The library it recommends is pretty good, but you can use whatever you want to do this.
Create a mock definitions file that replaces that import with the second file
First, define your mocks. You can do this anywhere; it is just setup. The easiest way is to do it in the same file, but you can create a mock definitions file if you like:
import rewiremock from 'rewiremock';
...
// by mocking the module library, you actually replace it everywhere
rewiremock('module')
  .with(otherModule);
Your other module needs to have the same exports for this to work, and to act with the same interface.
Replace the library based on a variable
To use it, you need some sort of condition that selects whether to use the original library or the replacement:
// This is only needed if you put the mock definitions in another file;
// just put the rest of the code under the previous snippet
// if everything is in the same file
require('<your_rewiremock_definitions_file>');

if (replaceImport) {
  rewiremock.enable();
}

// run the code here

if (replaceImport) {
  rewiremock.disable();
}
Your module should be replaced everywhere by the second module.
You can use any other mocking library to do this. There's a short section at the start of the readme with different options if you want a different style or another library.
rewiremock works by replacing the Node.js require cache, so it should change the dependency everywhere.
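One caveat not covered above: rewiremock (and the Module.prototype patch in the question) only affects the current process, because a spawned child builds its own require cache. A possible workaround, sketched here with hypothetical file names rather than taken from the answer, is Node's -r/--require preload flag, which loads a module before the child's main script runs:

// hijack.js - the question's patch, extracted so it can be preloaded
const Module = require('module');
const oldRequire = Module.prototype.require;
Module.prototype.require = function (file) {
  if (file === 'the-package-of-interest') {
    return {/* the replacement */};
  }
  return oldRequire.apply(this, arguments);
};

// main.js - preload hijack.js into the child via node's -r flag
const spawn = require('cross-spawn');
spawn.sync(
  'node',
  ['-r', require.resolve('./hijack.js'), '/path/to/another/script.js'],
  { stdio: 'inherit' }
);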

How to migrate legacy JS app to modules

I have a large (~15k LoC) JS app (namely a NetSuite app) written in an old-style, all-globals way. The app consists of 26 files, and the dependencies between them are totally unclear.
The goal is to gracefully refactor the app into smaller modules. By gracefully I mean not breaking/locking the app for a long time, but doing the refactoring in smaller chunks, with the app remaining usable after each chunk is completed.
An idea I have is to concatenate all the JS files we have now into a single-file bundle. After that, some code could be extracted into modules, and the legacy code could start importing them. The modules & imports would be transpiled with webpack/whatever, while the legacy code remains all-globals style. Finally, all of this is packed into a single JS file and deployed.
My questions are
is there a better approach maybe? This sounds like a typical problem
are there any tools available to support my approach?
I gave webpack a try and haven't managed to get what I want out of it. The export-loader and resolve-loader are not options because of the number of methods/vars that need to be imported/exported.
Examples
Now the code looks like:
function someGlobalFunction() {
  ...
}
var myVar = 'something';
// and other 15k lines in 26 files like this
What I would ideally like to achieve is:
function define(...) { /* function to define a module */ }
function require(moduleName) { /* function to import a module */ }

// block with my refactored-out module definitions
define('module1', function () {
  // extracted modularised code goes here
});
define('module2', function () {
  // extracted modularised code goes here
});

// further down goes legacy code, which can import new modules
var myModule = require('myNewModule');
function myGlobalLegacyFunction() {
  // use myModule
}
I'm following an approach similar to that outlined here: https://zirho.github.io/2016/08/13/webpack-to-legacy/
In brief:
Assuming that you can configure webpack to turn something like
export function myFunction(){...}
into a file bundle.js that a browser understands. In webpack's entry point, you can import everything from your module, and assign it to the window object:
// using a namespace import to get all exported things from the file
import * as Utils from './utils'
// injecting every function exported from utils.js into global scope (window)
Object.assign(window, Utils);
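The webpack setup assumed here can be minimal; a sketch with a hypothetical entry path:

// webpack.config.js
module.exports = {
  entry: './src/index.js', // the entry point doing the Object.assign above
  output: {
    filename: 'bundle.js',
    path: __dirname + '/dist',
  },
};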
Then, in your html, make sure to include the webpack output before the existing code:
<script type="text/javascript" src="bundle.js"></script>
<script type="text/javascript" src="legacy.js"></script>
Your IDE should be able to help identify clients of a method as you bring it into a module. As you move a function from legacy.js to myNiceModule.js, check whether it still has clients that expect it to be global; if it doesn't, it no longer needs to be globally available.
No good answer here so far, and it would be great if the person asking the question would come back. I will pose a challenging answer: it cannot be done.
All module techniques end up breaking the sequential execution of scripts in the document header.
All dynamically added scripts are loaded in parallel, and they do not wait for one another. Since you can be certain that almost all such horrible legacy JavaScript code depends on sequential execution, where the second script can depend on the previous one, loading those scripts dynamically can break things.
If you use some module approach (ES modules, require.js, or your own) you need to execute the code that depends on the loading having occurred in a callback or Promise/then block. This destroys the implicit global context, so all the spaghetti coils of global functions and vars we find in legacy JavaScript files will no longer be defined in the global scope.
I have determined that only two tricks could allow a smooth transition:
Either some way to pause continuation of a script block until the import Promise is resolved.
const promise = require("dep1.js", "dep2.js", "dep3.js");
await promise;
// legacy stuff follows
or some way to revert the scope of a block inside a function explicitly into the global scope.
with(window) {
  function foo() { return 123; }
  var bar = 543;
}
But neither wish was granted by the javascript fairy.
In fact, I read that even the await keyword essentially just packs the rest of the statements into a function to call when the promise is resolved:
async function() {
  ... aaa makes promise ...
  await promise;
  ... bbb ...
}
is just, I suppose, no different from
async function() {
  ... aaa makes promise ...
  promise.then(r => {
    ... bbb ...
  });
}
So this means the only way to fix this is to keep the legacy JavaScript statically in head/script elements and slowly move things into modules, while continuing to load them statically.
I am tinkering with my own module style:
(function(scope = {}) {
  var v1 = ...;
  function fn1() { ... }
  var v2 = ...;
  function fn2() { ... }
  return ['v1', 'fn1', 'v2', 'fn2']
    .reduce((r, n) => {
      r[n] = eval(n);
      return r;
    }, scope);
})(window)
By calling this "module" function with the window object, the exported items are put on it just as legacy code would do.
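Filled in with concrete (made-up) members, the pattern behaves like this:

(function (scope = {}) {
  function greet(name) { return 'hello ' + name; }
  var version = '1.0';
  return ['greet', 'version'].reduce((r, n) => {
    r[n] = eval(n); // direct eval resolves each local binding by its name
    return r;
  }, scope);
})(window);

greet('legacy code'); // greet is now global, as the all-globals style expects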
I gleaned a lot of this from using knockout.js and working with its readable source file, which has everything together in exactly such module function calls; ultimately all features end up on the "ko" object.
I hate using frameworks and "compilation". Generating the sequence of HTML tags that loads the files in the correct order from a topologically sorted dependency tree is something I could write quickly, but I won't, because I do not want any "compilation" step, not even my own.
UPDATE: https://stackoverflow.com/a/33670019/7666635 gives the idea that we can just Object.assign(window, module) which is somewhat similar to my trick passing the window object into the "module" function.

Augment types for imported module in TypeScript

I'm converting an existing codebase from a js/react/jsx setup to TypeScript. Naturally, I'd like to do it file by file, and I have a question about approaches for making the TS compiler work with the existing js codebase.
I convert index.js but want to leave foo.js as JavaScript for now:
// index.ts
import { fooFunction } from 'foo';

fooFunction({ val: 1 });

// foo.js
export const fooFunction = ({ val, optionalProp }) => {
  //...
}
The problem in this example is that TypeScript automatically infers the argument type of fooFunction and complains that Property 'optionalProp' is missing in type { val: string, optionalProp: any }.
Needing to "stub" the type of fooFunction sent me looking, and I found a few ways I may be able to do it:
1) Use require instead of import
// index.ts
var fooFunction: any = require('foo').fooFunction;
2) Merge declarations
?
3) Add a d.ts file with custom declarations for foo - haven't attempted but seems inconvenient
Ideally I don't have to do (1) because I want to keep using import syntax, and I don't have to do (3) because that would require me to add declaration files only to remove them later when I'm ready to convert foo.
(2) sounds awesome, but I can't figure out a working solution.
// index.ts
declare module 'foo' {
  function fooFunction(options: any): any
}
Doesn't work and throws Cannot redeclare block-scoped variable 'fooFunction'
How do I do that? Are there docs that have examples of what I'm trying to do and/or have more explanation about declaration merging and how to work with namespace/interface/value?
Are there better approaches for incrementally transitioning to TypeScript?
This may not work, just wanted to preface with that (I don't know much at all about React), but I know for sure that you can use ? on a parameter to make it optional. I've used this before in functions:
myFooFunction(requiredParam: typeA, optionalParam?: typeB) { ... }
Another thing you should consider is that TypeScript can add type checking at compile time, so IMO you should leverage that as much as possible.
Edit: to work with the optional parameter, you could just check that it exists.
myFooFunction(requiredParam: typeA, optionalParam?: typeB) {
  if (optionalParam) { // if this evaluates to true, the param was given
    //do something
  }
}
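As an aside on option (3) from the question: the redeclaration error above typically goes away if the ambient declaration lives in its own .d.ts file instead of index.ts, and such a file is cheap to delete once foo.js is converted. A sketch, with the option types guessed from the question:

// foo.d.ts (picked up automatically when it falls under tsconfig's "include")
declare module 'foo' {
  export function fooFunction(options: { val: number; optionalProp?: any }): any;
}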

Difference between javascript modularisation and dependency Injection

What's the difference between the modularisation of JavaScript code (with Browserify, for example) and dependency injection?
Are they synonyms? Do the two go together? Or am I missing some point?
Modularisation refers to breaking code into individual, independent "packages".
Dependency injection refers to not hardcoding references to other modules.
As a practical example, you can write modules which do not use dependency injection:
import { Foo } from 'foo';

export function Bar() {
  return Foo.baz();
}
Here you have two modules, but this module imports a specific, hardcoded other module.
The same module written using dependency injection:
export function Bar(foo) {
  return foo.baz();
}
Then somebody else can use this as:
import { Foo } from 'foo';
import { Bar } from 'bar';
Bar(Foo());
You inject the Foo dependency at call time, instead of hardcoding the dependency.
You can refer to this article:
Modules are code fragments that implement certain functionality and are written by using specific techniques. There is no out-of-the-box modularization scheme in the JavaScript language. The upcoming ECMAScript 6 specification tends to resolve this by introducing the module concept in the JavaScript language itself. This is the future.
and Dependency injection in JavaScript
The goal
Let's say that we have two modules. The first one is a service which makes Ajax requests and the second one is a router:
var service = function() {
  return { name: 'Service' };
}

var router = function() {
  return { name: 'Router' };
}
We have another function which needs these modules.
var doSomething = function(other) {
  var s = service();
  var r = router();
};
And to make things a little bit more interesting, the function needs to accept one more parameter. Sure, we could use the above code, but that's not really flexible. What if we want to use ServiceXML or ServiceJSON? Or what if we want to mock up some of the modules for testing purposes? We can't just edit the body of the function. The first thing we all come up with is to pass the dependencies as parameters to the function, i.e.:
var doSomething = function(service, router, other) {
  var s = service();
  var r = router();
};
By doing this we are passing the exact implementation of the module which we want. However, this brings a new problem. Imagine we have doSomething all over our code. What will happen if we need a third dependency? We can't edit all the function's calls. So, we need an instrument which will do that for us. That's what dependency injectors are trying to solve. Let's write down a few goals which we want to achieve:
we should be able to register dependencies
the injector should accept a function and should return a function which somehow gets the needed resources
we should not write a lot, we need short and nice syntax
the injector should keep the scope of the passed function
the passed function should be able to accept custom arguments, not only the described dependencies
A nice list, isn't it? Let's dive in.
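A minimal injector that meets those goals might look like the following sketch (written along the article's lines, not copied from it):

var dependencies = {};

// goal 1: a way to register dependencies by name
var register = function (name, dep) {
  dependencies[name] = dep;
};

// goal 2: accept a function, return a function that receives its resources
var resolve = function (deps, func, scope) {
  return function () {
    // look up the declared dependencies by name...
    var args = deps.map(function (d) { return dependencies[d]; });
    // ...goal 5: then append any custom arguments from the call site
    args = args.concat(Array.prototype.slice.call(arguments));
    // goal 4: preserve the scope of the passed function
    return func.apply(scope || null, args);
  };
};

// usage with the service/router modules from above
register('service', service);
register('router', router);

var doSomething = resolve(['service', 'router'], function (service, router, other) {
  var s = service();
  var r = router();
  console.log(s.name, r.name, other); // "Service Router extra"
});

doSomething('extra');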

Closures in Typescript (Dependency Injection)

I'm getting my butt kicked trying to use TypeScript in a functional style with dependencies. Let's say I want to make a module that depends on another module.
If I weren't using dependency injection, it would look like this (in Node):
var SomeOtherModule = require("SomeOtherModule")

exports.doSomething = function() {
  SomeOtherModule.blah()
}
This is how I do it with Dependency Injection
module.exports = function(SomeOtherModule) {
  function doSomething() {
    SomeOtherModule.blah()
  }

  return {doSomething: doSomething};
}
In TypeScript, if you define a concrete class or module, you can type the functions as you export them or include them in the class; it's all right next to each other.
But since I can't define a module inside the DI function, the only way to do this that I can see is to define an interface for the returned object separately, which is annoying, because I want the type annotations in line with the definitions.
What's a better way to do this?
This will probably give you a good start: http://blorkfish.wordpress.com/2012/10/23/typescript-organizing-your-code-with-amd-modules-and-require-js/
I don't know if this is the best way to set it up. But I got it to work.
I ended up dropping AMD on my project, since I'm also using AngularJS and they step on each other's toes. I kept using the same DI pattern though, so it looks like this in the end.
I'm pretty happy with it. I experimented with using classes instead (you can get really close if you keep your module stateless and have the constructor be the injector function), but I didn't like having to use this for all the dependencies.
Also, classes don't actually buy me anything, because if I were coding to an interface I'd have to define the types twice anyway.
interface IMyService {
  doSomething();
}

module.exports = function(SomeOtherModule) {
  return {doSomething: doSomething}

  function doSomething() {
    SomeOtherModule.blah()
  }
}
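One way to keep the annotations inline without a separate declaration pass is to annotate the factory's return type with the interface; a sketch, assuming CommonJS output with Node typings and leaving SomeOtherModule loosely typed:

interface IMyService {
  doSomething(): void;
}

// the return-type annotation makes TypeScript check the returned object
// against IMyService while the implementation stays inline
module.exports = function (SomeOtherModule: any): IMyService {
  return { doSomething: doSomething };

  function doSomething(): void {
    SomeOtherModule.blah();
  }
};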
