Replicate React Context in Nodejs

I'd like to replicate the behavior of React Context in Nodejs but I'm struggling with it.
In React, by creating only one context, I can provide and consume different values in my components, depending on the value given to the <Provider/>. So the following works:
const MyContext = React.createContext(0);

const MyConsumer = () => {
  return (
    <MyContext.Consumer>
      {value => {
        return <div>{value}</div>
      }}
    </MyContext.Consumer>
  )
}

const App = () =>
  <React.Fragment>
    <MyContext.Provider value={1}>
      <MyConsumer/>
    </MyContext.Provider>
    <MyContext.Provider value={2}>
      <MyConsumer/>
    </MyContext.Provider>
  </React.Fragment>;

ReactDOM.render(
  <App/>,
  document.getElementById("react")
);
<script src="https://cdnjs.cloudflare.com/ajax/libs/react/16.6.3/umd/react.production.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/react-dom/16.6.3/umd/react-dom.production.min.js"></script>
<div id="react"></div>
However I have no idea how to implement this in Nodejs. I've taken a look at the source code of React Context but it does not help much... Here is what I got so far:
// context.js
export const createContext = (defaultValue: number) => {
  const context = {
    value: defaultValue,
    withContext: null,
    useContext: null,
  };
  function withContext(value: number, callback: (...args: any[]) => any) {
    context.value = value;
    return callback;
  }
  function useContext() {
    return context;
  }
  context.withContext = withContext;
  context.useContext = useContext;
  return context;
};
// functions.js
import { context } from "./index";

export function a() {
  const result = context.useContext();
  console.log(result);
}

export function b() {
  const result = context.useContext();
  console.log(result);
}
// index.js
import { createContext } from "./context";
import { a, b } from "./functions";
export const context = createContext(0);
const foo = context.withContext(1, a);
const bar = context.withContext(2, b);
console.log("foo", foo());
console.log("bar", bar());
Obviously, value is overwritten and 2 is logged twice.
Any help will be much appreciated!

2022 update
NodeJS is proposing a new built-in for doing exactly that: Asynchronous context tracking.
Thanks to @Emmanuel Meric de Bellefon for pointing this out.
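For reference, here is a minimal sketch of what that built-in looks like: AsyncLocalStorage from the async_hooks module. The names and values are only for illustration, not part of the original question.

import { AsyncLocalStorage } from "async_hooks";

const context = new AsyncLocalStorage();

async function consumer() {
  await new Promise(resolve => setTimeout(resolve, 100));
  // getStore() returns the value passed to run(), even after an await
  console.log(context.getStore()); // 'value from run'
}

// run() makes the store visible to everything called (a)synchronously inside the callback
context.run("value from run", consumer);
console.log(context.getStore()); // undefined outside of run()

This is essentially the provider/consumer split developed below, with the asynchronous propagation handled by Node itself.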
Defining the desired behavior
If you only need it for synchronous code, you could do something relatively simple. All you need is to specify where the boundaries are.
In React, you do this with JSX
<Context.Provider value={2}>
  <MyComponent />
</Context.Provider>
In this example, the value of Context will be 2 for MyComponent but outside of the bounds of <Context.Provider> it will be whatever the value was before that.
If I were to translate that in vanilla JS, I would probably want it to look something like this:
const myFunctionWithContext = context.provider(2, myFunction)
myFunctionWithContext('an argument')
In this example, I would expect the value of context to be 2 within myFunction but outside of the bounds of context.provider() it would be whatever value was set before.
How to work the problem
At its most basic, this could be solved by a global object
// we define a "context"
globalThis.context = 'initial value'

function a() {
  // we can access the context
  const currentValue = globalThis.context
  console.log(`context value in a: ${currentValue}`)

  // we can modify the context for the "children"
  globalThis.context = 'value from a'
  b()

  // we undo the modification to restore the context
  globalThis.context = currentValue
}

function b() {
  console.log(`context value in b: ${globalThis.context}`)
}

a()
Now, we know that it's never wise to pollute the global scope (globalThis or window). So we could use a Symbol instead, to make sure there can't be any naming conflict:
const context = Symbol()
globalThis[context] = 'initial value'

function a() {
  console.log(`context value in a: ${globalThis[context]}`)
}

a()
However, even though this solution will never cause a conflict with the global scope, it's still not ideal, and doesn't scale well for multiple contexts. So let's make a "context factory" module:
// in createContext.js
const contextMap = new Map() // all of the declared contexts, one per `createContext` call

/* export default */ function createContext(value) {
  const key = Symbol('context') // even though we name them the same, Symbols can never conflict
  contextMap.set(key, value)

  function provider(value, callback) {
    const old = contextMap.get(key)
    contextMap.set(key, value)
    callback()
    contextMap.set(key, old)
  }

  function consumer() {
    return contextMap.get(key)
  }

  return {
    provider,
    consumer,
  }
}

// in index.js
const contextOne = createContext('initial value')
const contextTwo = createContext('other context') // we can create multiple contexts without conflicts

function a() {
  console.log(`value in a: ${contextOne.consumer()}`)
  contextOne.provider('value from a', b)
  console.log(`value in a: ${contextOne.consumer()}`)
}

function b() {
  console.log(`value in b: ${contextOne.consumer()}`)
  console.log(`value in b: ${contextTwo.consumer()}`)
}

a()
Now, as long as you're only using this for synchronous code, this works by simply overriding a value before a callback and resetting it after (in provider).
Solution for synchronous code
If you want to structure your code like you would in react, here's what it would look like with a few separate modules:
// in createContext.js
const contextMap = new Map()

/* export default */ function createContext(value) {
  const key = Symbol('context')
  contextMap.set(key, value)
  return {
    provider(value, callback) {
      const old = contextMap.get(key)
      contextMap.set(key, value)
      callback()
      contextMap.set(key, old)
    },
    consumer() {
      return contextMap.get(key)
    }
  }
}

// in myContext.js
/* import createContext from './createContext.js' */
const myContext = createContext('initial value')
/* export */ const provider = myContext.provider
/* export */ const consumer = myContext.consumer

// in a.js
/* import { provider, consumer } from './myContext.js' */
/* import b from './b.js' */
/* export default */ function a() {
  console.log(`value in a: ${consumer()}`)
  provider('value from a', b)
  console.log(`value in a: ${consumer()}`)
}

// in b.js
/* import { consumer } from './myContext.js' */
/* export default */ function b() {
  console.log(`value in b: ${consumer()}`)
}

// in index.js
/* import a from './a.js' */
a()
Going further: the problem of asynchronous code
The solution proposed above would not work if b() was an async function, because as soon as b returns, the context value is reset to its value in a() (that's how provider works). For example:
const contextMap = new Map()

function createContext(value) {
  const key = Symbol('context')
  contextMap.set(key, value)

  function provider(value, callback) {
    const old = contextMap.get(key)
    contextMap.set(key, value)
    callback()
    contextMap.set(key, old)
  }

  function consumer() {
    return contextMap.get(key)
  }

  return {
    provider,
    consumer
  }
}

const { provider, consumer } = createContext('initial value')

function a() {
  console.log(`value in a: ${consumer()}`)
  provider('value from a', b)
  console.log(`value in a: ${consumer()}`)
}

async function b() {
  await new Promise(resolve => setTimeout(resolve, 1000))
  console.log(`value in b: ${consumer()}`) // we want this to log 'value from a', but it logs 'initial value'
}

a()
So far, I don't really see how to manage the issue of async functions properly, but I bet it could be done with the use of Symbol, this and Proxy.
Using this to pass context
While developing a solution for synchronous code, we've seen that we can "afford" to add properties to an object that "isn't ours" as long as we're using Symbol keys to do so (like we did on globalThis in the first example). We also know that functions are always called with an implicit this argument that is either:
- the global scope (globalThis),
- the parent scope (when calling obj.func(), within func, this will be obj),
- an arbitrary scope object (when using .bind, .call or .apply),
- or, in some cases, an arbitrary primitive value (only possible in strict mode).
A quick sketch of the first three cases follows this list.
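The sketch below assumes a plain (non-module, sloppy mode) script, so a plain call sees globalThis:

function whoAmI() {
  return this
}

const obj = { whoAmI }

console.log(whoAmI() === globalThis)            // true: plain call, `this` is the global scope
console.log(obj.whoAmI() === obj)               // true: called as a method, `this` is the "parent" object
console.log(whoAmI.call({ custom: 1 }).custom)  // 1: arbitrary object passed explicitly via .call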
In addition, javascript lets us define a Proxy to be the interface between an object and whatever script uses that object. Within a Proxy we can define a set of traps that will each handle a specific way in which our object is used. The one that is interesting for our issue is apply which traps function calls and gives us access to the this that the function will be called with.
Knowing this, when a function is called through a context provider (context.provider(value, myFunction)), we can "augment" its this with a Symbol referring to our context:
{
  apply: (target, thisArg = {}, argumentsList) => {
    const scope = Object.assign({}, thisArg, {[id]: key}) // augment `this`
    return Reflect.apply(target, scope, argumentsList) // call function
  }
}
Reflect.apply will call the function target with this set to scope and the arguments from argumentsList.
As long as what we "store" in this allows us to get the "current" value of the scope (the value where context.provider() was called) then we should be able to access this value from within myFunction and we don't need to set/reset a unique object like we did for the synchronous solution.
First async solution: shallow context
Putting it all together, here's an initial attempt at an asynchronous solution for a React-like context. However, unlike with the prototype chain, this is not inherited automatically when a function is called from within another function. Because of this, the context in the following solution only survives one level of function calls:
function createContext(initial) {
  const id = Symbol()

  function provider(value, callback) {
    return new Proxy(callback, {
      apply: (target, thisArg, argumentsList) => {
        const scope = Object.assign({}, thisArg, {[id]: value})
        return Reflect.apply(target, scope, argumentsList)
      }
    })
  }

  function consumer(scope = {}) {
    return id in scope ? scope[id] : initial
  }

  return {
    provider,
    consumer,
  }
}

const myContext = createContext('initial value')

function a() {
  console.log(`value in a: ${myContext.consumer(this)}`)
  const bWithContext = myContext.provider('value from a', b)
  bWithContext()
  const cWithContext = myContext.provider('value from a', c)
  cWithContext()
  console.log(`value in a: ${myContext.consumer(this)}`)
}

function b() {
  console.log(`value in b: ${myContext.consumer(this)}`)
}

async function c() {
  await new Promise(resolve => setTimeout(resolve, 200))
  console.log(`value in c: ${myContext.consumer(this)}`) // works in async!
  b() // logs 'initial value', should log 'value from a' (the same as "value in c")
}

a()
Second asynchronous solution: context forwarding
A potential solution for the context to survive a function call within another function call could be to explicitly forward the context to any nested function call (which could quickly become cumbersome). From the example above, c() would change to:
async function c() {
  await new Promise(resolve => setTimeout(resolve, 200))
  console.log(`value in c: ${myContext.consumer(this)}`)
  const bWithContext = myContext.forward(this, b)
  bWithContext() // logs 'value from a'
}
where myContext.forward is just a consumer to get the value and directly afterwards a provider to pass it along:
function forward(scope, callback) {
  const value = consumer(scope)
  return provider(value, callback)
}
Adding this to our previous solution:
function createContext(initial) {
  const id = Symbol()

  function provider(value, callback) {
    return new Proxy(callback, {
      apply: (target, thisArg, argumentsList) => {
        const scope = Object.assign({}, thisArg, {[id]: value})
        return Reflect.apply(target, scope, argumentsList)
      }
    })
  }

  function consumer(scope = {}) {
    return id in scope ? scope[id] : initial
  }

  function forward(scope, callback) {
    const value = consumer(scope)
    return provider(value, callback)
  }

  return {
    provider,
    consumer,
    forward,
  }
}

const myContext = createContext('initial value')

function a() {
  console.log(`value in a: ${myContext.consumer(this)}`)
  const bWithContext = myContext.provider('value from a', b)
  bWithContext()
  const cWithContext = myContext.provider('value from a', c)
  cWithContext()
  console.log(`value in a: ${myContext.consumer(this)}`)
}

function b() {
  console.log(`value in b: ${myContext.consumer(this)}`)
}

async function c() {
  await new Promise(resolve => setTimeout(resolve, 200))
  console.log(`value in c: ${myContext.consumer(this)}`)
  const bWithContext = myContext.forward(this, b)
  bWithContext()
}

a()
Context on async functions without explicit forwarding
Now I'm stuck... I'm open to ideas!

Your goal of "replicating React's Context in NodeJS" is slightly ambiguous. From the React docs:
Context provides a way to pass data through the component tree without having to pass props down manually at every level.
There are no component trees in NodeJS. The closest analogy that I could think of (based also on your example) was a call stack. Additionally, React's Context also causes a re-render of the tree if the value changes. I have no idea what that would mean in NodeJS, so I'll happily ignore this aspect.
Thus I will assume that you are essentially looking for a way to make a value accessible anywhere in the call stack without having to pass it down the stack as an argument from function to function.
I propose you use one of the so-called continuation-local storage libs for NodeJS to achieve this. They use a pattern that is a little different from what you were trying to do, but it might be just fine.
My favourite has been CLS Hooked (no affiliation). It taps into NodeJS's async_hooks system to preserve the provided context even if there are asynchronous calls in the stack. Although it was last published 4 years ago, it still works as expected.
I rewrote your example using CLS Hooked, although I'd argue that it's not the nicest / most intuitive way to use it. I also added an extra function call to demonstrate that it's possible to override values (i.e. create sort of child contexts). Finally, there's one noticeable difference - the context must now have an ID. If you wish to stick with this React Contexty pattern, you'll probably have to make peace with it.
// context.js
import cls from "cls-hooked";

export const createContext = (contextID, defaultValue) => {
  const ns = cls.createNamespace(contextID);
  return {
    provide(value, callback) {
      return () =>
        ns.run(() => {
          ns.set("value", value);
          callback();
        });
    },
    useContext() {
      return ns.active ? ns.get("value") : defaultValue;
    }
  };
};
// my-context.js
// your example had a circular dependency problem
// the context has to be created in a separate file
import { createContext } from "./context";
export const context = createContext("my-context", 0);
// zz.js
import { context } from "./my-context";
export const zz = function () {
  console.log("zz", context.useContext());
};
// functions.js
import { context } from "./my-context";
import { zz } from "./zz";
export const a = function () {
  const zzz = context.provide("AAA", zz);
  zzz();
  const result = context.useContext();
  console.log("a", result);
};

export const b = function () {
  const zzz = context.provide("BBB", zz);
  zzz();
  const result = context.useContext();
  console.log("b", result);
};
// index.js
import { context } from "./c";
import { a, b } from "./functions";
const foo = context.provide(1, a);
const bar = context.provide(2, b);
console.log("default value", context.useContext());
foo();
bar();
Running node index logs:
default value 0
zz AAA
a 1
zz BBB
b 2
This would also work if there were all sorts of asynchronous calls happening in your stack.
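As a hedged sanity check of that claim (this file is not part of the example above, it's only an illustration): if zz awaited something before reading the context, cls-hooked should still hand back the provided value, because the namespace is propagated through async_hooks:

// zz-async.js (illustration only)
import { context } from "./my-context";

export const zzAsync = async function () {
  await new Promise(resolve => setTimeout(resolve, 100));
  // still the provided value, not the default, despite the await
  console.log("zzAsync", context.useContext());
};

// somewhere in functions.js you could then do:
// const zzz = context.provide("AAA", zzAsync);
// zzz(); // eventually logs 'zzAsync AAA'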
How I use it
My approach is a little different. I wasn't trying to replicate React's Context, which also has a limitation in that it is always bound to a single value.
// cls.ts
import cls from "cls-hooked";
export class CLS {
  constructor(private readonly NS_ID: string) {}

  run<T>(op: () => T): T {
    return (cls.getNamespace(this.NS_ID) || cls.createNamespace(this.NS_ID)).runAndReturn(op);
  }

  set<T>(key: string, value: T): T {
    const ns = cls.getNamespace(this.NS_ID);
    if (ns && ns.active) {
      return ns.set(key, value);
    }
  }

  get(key: string): any {
    const ns = cls.getNamespace(this.NS_ID);
    if (ns && ns.active) {
      return ns.get(key);
    }
  }
}
// operations-cls.ts
import { CLS } from "./cls";
export const operationsCLS = new CLS("operations");
// consumer.ts
import { operationsCLS } from "./operations-cls";

export const consumer = () => {
  console.log(operationsCLS.get("some-value")); // logs 123
};

// app.ts
import { operationsCLS } from "./operations-cls";
import { consumer } from "./consumer";

operationsCLS.run(async () => {
  operationsCLS.set("some-value", 123);
  consumer();
});
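As an aside, Node's built-in AsyncLocalStorage (the Asynchronous context tracking API mentioned in the 2022 update above) can back a rough equivalent of this CLS class if you'd rather drop the dependency. A sketch in plain JavaScript, with the same method names and a Map as the per-run store (untested against the rest of this setup):

// als.js (illustrative only, not part of the original setup)
import { AsyncLocalStorage } from "async_hooks";

export class ALS {
  constructor() {
    this.storage = new AsyncLocalStorage();
  }
  run(op) {
    // each run() gets its own Map as the backing store
    return this.storage.run(new Map(), op);
  }
  set(key, value) {
    const store = this.storage.getStore();
    if (store) store.set(key, value);
    return value;
  }
  get(key) {
    const store = this.storage.getStore();
    return store ? store.get(key) : undefined;
  }
}

Since run/set/get keep the same signatures, app.ts and consumer.ts above should work unchanged with operationsCLS created as new ALS().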
How CLS works
I prefer to view CLS as magic, as it's always worked fine without my intervention, so I can't comment much here, sorry :]

Related

How to test functions in a function using Jest

I have some code that has functions inside functions, and I want to be able to unit test the functions inside the parent function.
I am looking to have tests that unit test these and spy on them (both requirements are needed).
Example:
export default parentFunction = () => {
  const innerFunction = () => {
    // that does stuff
  }

  const anotherInnerFunction = () => {
    // that does more stuff
  }

  // and at some point, the functions are called
  // like this
  innerFunction()

  const anotherFunction = () => {
    // or like this
    anotherInnerFunction()
  }
}
I have not been able to find a way to test these inner functions. I have tried the following.
Example test
import parentFunction from "myfile"

it("should call innerFunction", () => {
  // this causes an error in jest
  const innerFunctionSpy = jest.spyOn(parentFunction, "innerFunction")
  //..etc
  expect(innerFunctionSpy).toHaveBeenCalled()
})

it("will return a value from anotherInnerFunction", () => {
  // this does not work
  const value = parentFunction.anotherInnerFunction()
  // this also does not work
  const value = parentFunction().anotherInnerFunction()
  //..etc
})
Does the parent function need to be refactored in order to be able to tests these inner functions? If my parent function was an object then I could test these, however, I am not sure if I can refactor my code to work like this.
For example
export default parentFunction = {
  innerFunction: () => {
    // that does stuff
  },
  // more code
}
You cannot access the variables or functions scoped inside another function in JavaScript, unless you explicitly expose them by returning them from that function or exporting them from the module. This is not about Jest; this is how JavaScript works.
jest.spyOn(parentFunction, "innerFunction")
The above line of code tells Jest that the innerFunction function is set as a property of the parentFunction object, but that is not the case. In fact, innerFunction is a function scoped inside parentFunction, which cannot be accessed from outside the scope of parentFunction unless you return it explicitly, or define it at module scope and then export it.
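For completeness, the "return them from that function" option mentioned above would look roughly like this (a sketch; note it only exposes the inner function on the return value, it still won't let you spy on calls that parentFunction itself makes):

// myfile.js
const parentFunction = () => {
  const innerFunction = () => {
    // that does stuff
  }
  innerFunction()
  // expose it on the return value so tests can reach it
  return { innerFunction }
}
export default parentFunction

// in the test
import parentFunction from "myfile"

it("exposes innerFunction on the return value", () => {
  const { innerFunction } = parentFunction()
  expect(typeof innerFunction).toBe("function")
})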
That said, the inner workings (the implementation details) of such inner functions should normally not be exposed. If it really is needed, mark them as private by convention with an _ before their names, as in the following example:
// scoped to the module
const _innerFunction = () => {
  // that does stuff
}

// scoped to the module
const _anotherInnerFunction = () => {
  // that does more stuff
}

// exported as a public API
const anotherFunction = () => {
  _anotherInnerFunction()
}

const publicApi = {
  anotherFunction,
  // expose the private functions for unit tests
  _innerFunction,
  _anotherInnerFunction
}

export default publicApi;
Then in your Jest test case:
import publicApi from "myfile"

it("should call anotherFunction", () => {
  const anotherFunctionSpy = jest.spyOn(publicApi, "anotherFunction")
  //..etc
  expect(anotherFunctionSpy).toHaveBeenCalled()
})

it("should call _innerFunction", () => {
  const innerFunctionSpy = jest.spyOn(publicApi, "_innerFunction")
  //..etc
  expect(innerFunctionSpy).toHaveBeenCalled()
})

Javascript object initialized with several functions: what syntax is it?

I am reviewing some Javascript code and stumbled upon a syntax that I didn't knew. The application is a React and Redux one, though I think this is plain Javascript.
The syntax I'm concerned with is the { f1(), f2(), ... } argument of combineReducers().
This is the syntax:
combineReducers({
Reducer1,
Reducer2,
...
});
ReducerN is a function, i.e.:
const Reducer1 = (state = INITIAL_STATE, action) => {
// ...
};
I get { f1(), ... } creates an object where the function name is the key and the function itself is the value, so in a browser console I tried the following:
a = () => { console.log(1) }
b = () => { console.log(2) }
o = {a, b}
and if I print o:
{a: ƒ, b: ƒ}
a: () => { console.log(1) }
b: () => { console.log(2) }
__proto__: Object
But if I try to initialize o in a single operation:
o = { () => return 1 }
or
o = { function y() { return 1 }}
they both give a syntax error.
It's the first time I see an object created with that syntax: What kind is that? Where can I find its reference?
As said previously,
combineReducers({
Reducer1,
Reducer2,
...
});
is equivalent to this in plain ES5:
combineReducers({
Reducer1: Reducer1,
Reducer2: Reducer2,
...
});
and combineReducers is concerned only with the values of the object passed in. The first form is just a shorthand for defining properties with the same name as the value. This is the reason you cannot use anonymous functions in this form. To define function members on classes and objects, you can use the following form:
class Foo {
  foo() { console.log('foo'); }
  bar = () => console.log('bar')
}

const a = new Foo();
a.foo();
a.bar();

const b = {
  foo() { console.log('foo'); },
  bar: () => console.log('bar')
};

b.foo();
b.bar();
When transpiling to plain ES5, this will generate the following:
"use strict";
var Foo = /** #class */ (function () {
function Foo() {
this.bar = function () { return console.log('bar'); };
}
Foo.prototype.foo = function () { console.log('foo'); };
return Foo;
}());
var a = new Foo();
a.foo();
a.bar();
var b = {
foo: function () { console.log('foo'); },
bar: function () { return console.log('bar'); }
};
b.foo();
b.bar();
{ f1() } is very different than { f1 }.
The latter is a shorthand of { f1: f1 } which is an object having the key 'f1' (a string) associated to the value f1 (a function). The function is not executed.
In the first example f1() is a function call. The function f1 is executed and the value it returns is used instead. But because you didn't provide a key to associate the value with and because f1() is a value that does not have a name (it is an expression that needs to be evaluated in order to get its value), JS cannot produce an object out of it.
{ f1 } can be evaluated at the compile time and turned into { f1: f1 }.
{ f1() } cannot be evaluated at the compile time. The value of f1() is available only at the run time.
This is why { f1() } is invalid code.
If you need to call f1 and use the value it returns to create an object you can do it this way:
const x = { f1: f1() }
This is the same thing as:
const v = f1();
const x = { f1: v }

expressjs server.js this is undefined, how can I refer to server

In an ExpressJS setup, I have server.js where I do the following:
import { call_method } from '../hereIam.mjs';

const process_db = async () => {
  console.log(this); // undefined
  call_method(this);
};

console.log(this); // undefined

process_db();
And then, from hereIam.mjs I want to call a parent method, but this is undefined
export const call_method = parent_this => console.log(parent_this); // undefined
I tried to include classes in server.js, in an attempt to force having a this
class AppServer {
  constructor() {
    console.log(this)
  }
  const process_db = async () => call_method(this);
}
But it seems that arrow functions inside classes don't compile in (experimental) NodeJS (this should be another question).
EDITED
The way I can do this is by avoiding the arrow notation so that I can use classes inside Express, and then instantiating a class that provides a this.
class AppServer {
  async process_db() { call_method(this) }
}

let server = new AppServer();
server.process_db();
The question would be, the only way of getting a this reference is by using objects/classes?
You could use the bind method and pass through any object to be used as the this context.
Note that arrow functions take their this from the lexical context in which they are defined, while functions declared with the function() {} syntax take the this they are called with, either implicitly from the call site (e.g. obj.method()) or explicitly via bind, call or apply.
So, an alternative to using classes would be to bind a simple object to the method, something like:
const call_method = require('../hereIam.mjs');

const process_db = async function() {
  console.log(this);
  call_method(this);
};

console.log(this);

const context = {
  name: 'bound context',
  parent_method: async function() {
    console.log('Good evening');
  }
}

process_db.bind(context)();
Presuming hereIam.mjs contains:
module.exports = parent_this => console.log(parent_this);
then the script will output:
{}
{ name: 'bound context',
parent_method: [AsyncFunction: parent_method] }
{ name: 'bound context',
parent_method: [AsyncFunction: parent_method] }

How to augment instances of a mocked constructor in Jest

I'd like to augment, but not completely replace, instances of a mocked constructor in a Jest unit test.
I want to add a few values to the instance, but keep the auto-mocked goodness of Jest.
For example:
A.js
module.exports = class A {
  constructor(value) {
    this.value = value;
  }
  getValue() {
    return this.value;
  }
}
To get some auto-mock awesomeness:
jest.mock('./A');
With the automock, instances have a mocked .getValue() method, but they do not have the .value property.
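To make that concrete, here's roughly what the automock gives you for the A above (a sketch; the gist is that methods become jest.fn() stubs and the constructor body never runs):

jest.mock('./A');
const A = require('./A');

it('automocks A', () => {
  const a = new A('some-value');
  expect(jest.isMockFunction(a.getValue)).toBe(true); // method exists, but as a stub
  expect(a.getValue()).toBeUndefined();               // the stub returns undefined
  expect(a.value).toBeUndefined();                    // the constructor body was not run
});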
A documented way of mocking constructors is:
// SomeClass.js
module.exports = class SomeClass {
  m(a, b) {}
}

// OtherModule.test.js
jest.mock('./SomeClass'); // this happens automatically with automocking
const SomeClass = require('./SomeClass')
const mMock = jest.fn()
SomeClass.mockImplementation(() => {
  return {
    m: mMock
  }
})

const some = new SomeClass()
some.m('a', 'b')
console.log('Calls to m: ', mMock.mock.calls)
Using that approach for A:
jest.mock('./A');
const A = require('./A');
A.mockImplementation((value) => {
  return { value };
});

it('does stuff', () => {
  const a = new A('value');
  console.log(a); // -> { value: 'value' }
});
The nice thing about that is you can do whatever you want to the returned value, like initialize .value.
The downsides are:
You don't get any automocking for free, e.g. I'd need to add .getValue() myself to the instance
You need to have a different jest.fn() mock function for each instance created, e.g. if I create two instances of A, each instance needs its own jest.fn() mock functions for the .getValue() method
SomeClass.mock.instances is not populated with the returned value (GitHub ticket)
One thing that didn't work (I was hoping that maybe Jest did some magic):
A.mockImplementation((value) => {
  const rv = Object.create(A.prototype); // <- these are mocked methods
  rv.value = value;
  return rv;
});
Unfortunately, all instances share the same methods (as one would expect, but it was worth a shot).
My next step is to generate the mock, myself, via inspecting the prototype (I guess), but I wanted to see if there is an established approach.
Thanks in advance.
Turns out this is fixed (as of jest 24.1.0) and the code in the question works, as expected.
To recap, given class A:
A.js
module.exports = class A {
constructor(value) {
this.value = value;
}
setValue(value) {
this.value = value;
}
}
This test will now pass:
A.test.js
jest.mock('./A');
const A = require('./A');
A.mockImplementation((value) => {
  const rv = Object.create(A.prototype); // <- these are mocked methods
  rv.value = value;
  return rv;
});

it('does stuff', () => {
  const a = new A('some-value');

  expect(A.mock.instances.length).toBe(1);
  expect(a instanceof A).toBe(true);
  expect(a).toEqual({ value: 'some-value' });

  a.setValue('another-value');
  expect(a.setValue.mock.calls.length).toBe(1);
  expect(a.setValue.mock.calls[0]).toEqual(['another-value']);
});
The following worked for me:
A.mockImplementation(value => {
  const rv = { value: value };
  Object.setPrototypeOf(rv, A.prototype);
  return rv
})

Generic reading of arguments from multiple constructor calls

Follow-up question to Read arguments from constructor call:
The accepted solution allows me to get arguments passed into a constructor by defining a wrapper class that captures and exposes the arguments, but this leaves me with the problem of having n wrappers for n constructors.
Is there a way to have 1 function/wrapper/whatever that could work for any number of constructors?
I'll reiterate that I'm pursing this technique specifically to test Webpack plugin configuration, and I'd like to avoid having a separate wrapper for each plugin that I need to test.
Looking for something along the lines of
// ------------------------------------------------------------ a wrapper function?
const someWrapper = () => { /* ... */ }
const plugin1 = new Plugin({ a: 'value' })
const plugin2 = new Plugin2(arg1, arg2, { b: 'anotherValue '})
someWrapper(plugin1).args === [{ a: 'value' }]
someWrapper(plugin2).args === [arg1, arg2, { b: 'anotherValue' }]
// --------------------------------------------------------------- a wrapper class?
class Wrapper { /* ... */ }
const plugin1 = new Wrapper(Plugin, [{ a: 'value' }])
const plugin2 = new Wrapper(Plugin2, [arg1, arg2, { b: 'anotherValue '}])
plugin1.args === [{ a: 'value' }]
plugin2.args === [arg1, arg2, { b: 'anotherValue '}]
// problem with above is the wrapper is being passed to Webpack, not the underlying
// plugin; not sure yet if this would cause webpack to break or not actually
// execute the plugin as intended with a vanilla config
// ---------------------------------------------------------------- something else?
Yes, you can create generic wrapper which will add args property to instance of any passed constructor:
class Plugin {
  constructor (arg1, arg2) {
    this.arg1 = arg1
    this.arg2 = arg2
  }
}

function wrapper(initial) {
  // Rewrite initial constructor with our function
  return function decoratedConstructor(...args) {
    // Create instance of initial object
    const decorated = new initial(...args)
    // Add some additional properties, methods
    decorated.args = [...args]
    // Return instantiated and modified object
    return decorated
  }
}

const decoratedPlugin = wrapper(Plugin)
const plugin = new decoratedPlugin('argument', { 'argument2': 1 })
console.log(plugin.args)
FYI: it's not safe to add properties without some prefix. Consider adding __ or something similar to your property name, because you can accidentally overwrite one of the object's own properties, as the sketch below shows.
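A sketch of that accidental overwrite, reusing the wrapper above with a hypothetical plugin that already keeps its own args property:

class ArgsPlugin {
  constructor(options) {
    // the plugin itself already stores something under `args`
    this.args = options
  }
}

const DecoratedArgsPlugin = wrapper(ArgsPlugin)
const plugin = new DecoratedArgsPlugin({ a: 'value' })

// the wrapper's `decorated.args = [...args]` has clobbered the plugin's own property
console.log(plugin.args) // [ { a: 'value' } ] instead of { a: 'value' }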
I was able to get this working with a modification to @guest271314's suggestion, namely, you need to pass ...initArgs to super(), otherwise webpack will fail with a TypeError: Cannot read property '...' of undefined.
Also took @terales's point into account about making sure to prefix my additional properties.
const exposeConstructorArgs = (Plugin, ...args) => {
  const ExposedPlugin = class extends Plugin {
    constructor(...initArgs) {
      super(...initArgs);
      this.__initArgs__ = initArgs;
    }
    get __initArgs() {
      return this.__initArgs__;
    }
  };
  return Reflect.construct(ExposedPlugin, args);
};

// ...

const dllPlugin = exposeConstructorArgs(webpack.DllPlugin, {
  name: '[name]',
  path: path.join(buildDir, '[name].json'),
});

// ...

const pluginConfig = dllPlugin.__initArgs[0];
expect(pluginConfig.name).toEqual('[name]');
You can use a generic function where a class expression is used within the function body. Pass a reference to the class or constructor, along with the parameters expected to be arguments within the instance, to the function call.
function Plugin() {}
function Plugin2() {}

function PluginWrapper(pluginRef, ...args) {
  let MyPlugin = class extends pluginRef {
    constructor() {
      super();
      this.args = [...arguments];
    }
    getArgs() {
      return this.args;
    }
  }
  return Reflect.construct(MyPlugin, args);
};

const anInstance = PluginWrapper(Plugin, {
  a: 'path'
});
console.log(anInstance.getArgs(), anInstance instanceof Plugin);

const aSecondInstance = PluginWrapper(Plugin2, "arg1", "arg2", {
  b: 'anotherPath'
});
console.log(aSecondInstance.getArgs(), aSecondInstance instanceof Plugin2);
