Creating a reusable RequireJS library

Given the following simplified scenario, how could I best construct my reusable component so that it is correctly consumable by another application, such that foo.js prints 23?
Reusable Component:
/home.js
/main.js
/stuff
foo.js
--
/home.js:
define([], function () { return 23; });
/stuff/foo.js:
define(['home'], function (home) { console.log(home); }); // prints 23
Consumer Application:
/app.js
/main.js
/home.js
/views
template.html
/bower_components
/myReusableComponent
/home.js
/main.js
/stuff
foo.js
--
/home.js:
define([], function () { return 17; });
/bower_components/myReusableComponent/home.js:
define([], function () { return 23; });
/bower_components/myReusableComponent/stuff/foo.js:
define(['home'], function (home) { console.log(home); }); // now prints 17
Is there a consumer application requirejs config that sets the baseUrl of any module in /myReusableComponent to '/myReusableComponent'? My reusable component should not have/need access to the root level of the consumer application anyway.
I have looked into the r.js optimizer, but it just outputs a bunch of define('stuff/foo', [], function ())... what happens if the consumer application has a stuff/foo.js too?
The only solution I have found so far is forcing the use of relative paths in all of my modules: define(['../home'], function (home) { console.log(home); }); but I am hoping there is a more elegant way to solve this problem.
All feedback is very appreciated!

If you want to produce a library that is going to be usable in different applications, then you should use relative paths when one module of your library refers to another, because this makes it more likely that whoever uses your library will be able to do so without having to modify their RequireJS configuration.
Some clashes can be eliminated by the judicious use of map or paths. Ultimately, however, there are cases that cannot be solved (or at least not solved the way the user wants) without access to the unoptimized modules, so you should distribute your library as an optimized bundle and also provide the option to load it as a collection of unoptimized modules.
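For instance, a consumer-side configuration along those lines could look like the sketch below; the baseUrl, the myReusableComponent prefix and the bower_components path are assumptions taken from the layout in the question:
// consumer application's RequireJS config (sketch)
requirejs.config({
  baseUrl: '.',
  paths: {
    // module-ID prefix for the library -> its folder on disk
    myReusableComponent: 'bower_components/myReusableComponent'
  },
  map: {
    // any module whose ID starts with 'myReusableComponent' that asks
    // for 'home' gets the library's own home.js, not the app's /home.js
    myReusableComponent: {
      home: 'myReusableComponent/home'
    }
  }
});
// the library's modules are then addressed by their prefixed IDs, e.g.
// require(['myReusableComponent/stuff/foo'], function (foo) { ... });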

Related

Webpack code splitting breaks instanceof between chunks

tl;dr:
class ModuleInBundleA extends ModuleInBundleC { … }
window.moduleInBundleB.foo(new ModuleInBundleA())
class ModuleInBundleB {
  public foo(bar: ModuleInBundleA|ModuleInBundleC|number) {
    if (bar instanceof ModuleInBundleA || bar instanceof ModuleInBundleC) {
      // always false
      …
    }
  }
}
Details:
I'm trying to start using TypeScript + Webpack 4.41.6 on a project that has a mostly old codebase. Basically I want to package several small modules into bundles so I can migrate gradually without moving the whole project onto a new JS stack.
I found out that Webpack can do this with code splitting and can package shared code into bundles on its own with some configuration. However, I can't really control what ends up in each bundle unless I build every bundle separately and then only share types, using my own modules as external libraries, and that's a bit frustrating.
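Roughly, the kind of configuration I mean looks like this (the entry names, paths and the cache group below are illustrative, not my exact setup):
// webpack.config.js, a rough illustration
module.exports = {
  entry: {
    'modal-manager': './src/modal-manager.ts',
    'filter': './src/filter.ts'
  },
  optimization: {
    splitChunks: {
      cacheGroups: {
        // anything used by at least two entries is pulled out into modal-utils.js
        'modal-utils': {
          name: 'modal-utils',
          chunks: 'all',
          minChunks: 2,
          enforce: true
        }
      }
    }
  }
};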
Maybe at this point you can say that I'm doing something wrong already, and I would like to hear how I can achieve my goal of using bundles just as vanilla JavaScript (controlling defer/async on my own and using script tags on my own as well). I don't really want to pack everything as an independent package with its own configuration, type exports and so on.
Hope that gives the overall context. Closer to the point:
I have the following function, which is bundled into its own chunk called modal-manager.js.
public showModal (modal: ModalFilter|AbstractModal|number) {
  let modalId: number;
  console.log(modal);
  console.log(typeof modal);
  console.log(modal instanceof ModalFilter);
  console.log(modal instanceof AbstractModal);
  if (modal instanceof AbstractModal) {
    modalId = modal.getId();
  } else {
    modalId = modal;
  }
  ...
};
(Originally it had no ModalFilter, as ModalFilter inherits from AbstractModal, but I included it for demonstration purposes.)
The abstract modal is bundled automatically into modal-utils.js as it's shared between modules.
Next, I have another big bundle called filter.js. This one literally creates an instance of ModalFilter: const modalFilter = new ModalFilter(...). I think it's worth mentioning that the modalFilter instance is assigned to the global window variable. The trouble is that filter.js calls the modal.js code (through window.modalFilter.showModal(modalFilter)) with no problems whatsoever, but I see the following console.log output:
ModalFilter {shown: false, loading: false, closing: false, html: init(1), id: 0, …}
modal.bundle.23e2a2cb.js:264 object
modal.bundle.23e2a2cb.js:265 false
modal.bundle.23e2a2cb.js:266 false
I disabled mapping to dig deeper into the generated code and saw this:
ModalManager.prototype.showModal = function (modal) {
  var modalId;
  console.log(modal);
  console.log(typeof modal);
  console.log(modal instanceof _builder_component_modal_filter__WEBPACK_IMPORTED_MODULE_1__[/* default */ "a"]);
  console.log(modal instanceof _modal_abstract__WEBPACK_IMPORTED_MODULE_0__[/* default */ "a"]);
  if (modal instanceof _modal_abstract__WEBPACK_IMPORTED_MODULE_0__[/* default */ "a"]) {
    modalId = modal.getId();
  }
  else {
    modalId = modal;
  }
  this.modals[modalId].show();
  this.scrollLock(modalId);
};
With my understanding of how JavaScript works, instanceof should check the object's constructor function. As the code chunks are separate (modal.js shares no code with modal-utils.js), the constructor function should be the same. However, digging into the details I see that webpackJsonp can be really tricky, calling modules from kind-of independent environments; still, it should be the same environment where FilterModal and AbstractModal are created. The ModalManager could have its own environment, I believe, but the code being called is 100% the same. Could those webpackJsonp bundle arrays be the source of the problem? If so, how can I avoid that and make the modal.js bundle understand that both filter.js and the others reference the same AbstractModal from modal-utils.js?
If I'm doing it wrong, is there a simple way to start bundling small and efficient scripts built with TypeScript and Webpack (or other tools)?
Also, I see the externals feature of Webpack, but haven't figured out how to use it in my case. In general, I'm OK with the current setup except for the instanceof issue. The reason I want to avoid multiple builds is that I'll probably have dozens of smaller bundles shared across different modules, and having a dozen npm packages, one for each, seems excessive.
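(As far as I understand it, externals is declared roughly like the sketch below, mapping a module request to a global that is expected to already be on the page; the names here are purely illustrative.)
// webpack.config.js, illustrative only
module.exports = {
  // ...
  externals: {
    // a request for 'modal-utils' is left out of the bundle; at runtime
    // webpack resolves it to the global variable window.ModalUtils instead
    'modal-utils': 'ModalUtils'
  }
};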
Prefacing this with: I don't know the answer to the exact issue you are facing with the instanceof part of your question. This is aimed at the "how did you do it" part.
Approximately 4 weeks ago we also changed from a .js to a .ts implementation, with about one to two hundred .js files. Obviously we didn't want to migrate these all at once over to .ts, as the effort was too high.
What we ended up doing was identifying the .js scripts which needed to run on specific pages and adding these to webpack as entry files. Then, for all of the other supporting scripts whose contents we needed in our new .ts files, we created a large index/barrel file, imported them into it, and webpack then automatically includes them in the correct scope alongside their respective .ts files.
What does this look like?
legacy.index.ts: For every single supporting .js file that we wanted to reference in any way in .ts.
var someFile_js = require("someFile.js");
export { someFile_js };
This then allowed us to import and use this in the .ts files:
import { someFile_js } from './legacy.index';
In reply to #tonix. To load a defined list:
webpack.config
const SITE_INDEX = require('./path-to-js-file/list.js')
module.exports = {
  entry: SITE_INDEX,
  ...
}
list.js
module.exports = {
  "filename1": "./some-path/filename1.js",
  "filename2": "./some-path/filename2.ts"
}

Declaring global helper functions and understanding the namespace when using laravel-mix / webpack

I am totally new to webpack (I previously built my apps by including tons of CSS / JS files by hand) and am now trying to understand how namespaces work when using the named tools.
I have an app.js:
require('./bootstrap');
require('./helperFunctions');
/* ... more, unrelated stuff */
webpack.mix.js is untouched from the original file delivered with the Laravel 5.5 sample project:
let mix = require('laravel-mix');
mix.js('resources/assets/js/app.js', 'public/js')
.sass('resources/assets/sass/app.scss', 'public/css');
My helperFunctions.js is a simple JS file with some helper functions I want to use throughout my project:
function foo_bar(A, B) {
return A - B;
}
/* more functions, following the same structure... */
But every time I try to use one of the functions defined in the helperFunctions file I find that they are undefined, even in the app.js file directly after the require happens.
After inspecting the generated app.js file I found that my functions are encapsulated in an anonymous function, function(module, exports) { /* my file contents go here */ }, which makes them unavailable to the rest of the scripts.
While I understand that this is probably there to reduce pollution of the global namespace, I don't understand how I am supposed to define global objects (such as data storage objects for Vue) or functions such as helper functions.
Can anybody explain how this is supposed to work, or link me to a resource explaining this for someone who has never worked with an asset bundler (if that is even the right term)?
cheers
// Edit: I think I found a solution, after stumbling over this:
https://nodejs.org/api/modules.html#modules_modules
I edited the helper functions file to something like this:
module.exports = {
  foo_bar(A, B) {
    return (A - B);
  },
  /* ... more functions ... */
}
And imported it wherever I need it, like this:
import HelperFunctions from './helperFunctions'
var result = HelperFunctions.foo_bar(5, 8);
However, this only works in files which are pre-packed using webpack. Is registering the component under window.HelperFunctions the only way to make the functions available in dynamically generated <script></script> tags throughout the website?
Registering your helper methods on the window object, as you kind of suggested, is a simple and easy to understand approach, so that's what I would choose to do if I wanted these methods to be available globally.
window.myHelperMethod = function () { console.log('ayo, this works!') }
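If you want to keep the module export from the edit above and still have the functions reachable from inline script tags, one sketch (using the HelperFunctions name from the question, and assuming helperFunctions.js sits next to app.js as in the require above) would be:
// resources/assets/js/helperFunctions.js (sketch)
const HelperFunctions = {
  foo_bar(A, B) {
    return A - B;
  },
  /* ... more functions ... */
};

// make the helpers reachable from inline <script> tags
window.HelperFunctions = HelperFunctions;

// keep exporting them for code that webpack bundles
module.exports = HelperFunctions;
Any dynamically generated <script> tag on a page that loads the compiled app.js can then call window.HelperFunctions.foo_bar(5, 8).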

Writing tests for javascript module using webpack's require.ensure function

I am running mocha tests on my server, testing source scripts in an isolated unit-test manner.
One of the scripts I am testing makes a call to Webpack's require.ensure function, which is useful for creating code-splitting points in the application when it gets bundled by Webpack.
The test I have written for this script does not run within a Webpack context, therefore the require.ensure function does not exist, and the test fails.
I have tried to create some sort of polyfill/stub/mock/spy for this function but have had no luck whatsoever.
There is a package, webpack-require, which does allow for the creation of a webpack context. This can work but it is unacceptably slow. I would prefer to have some sort of lightweight polyfill targeting the require.ensure function directly.
Any recommendations? :)
Here is a very basic starting point mocha test.
The mocha test loads a contrived module containing a method which returns true if require.ensure is defined.
foo.js
export default {
  requireEnsureExists: () => {
    return typeof require.ensure === 'function';
  }
};
foo.test.js
import { expect } from 'chai';
describe('When requiring "foo"', () => {
  let foo;
  before(() => {
    foo = require('./foo.js');
  });
  it('The requireEnsureExists() should be true', () => {
    expect(foo.requireEnsureExists()).to.be.true;
  });
});
Ok, I finally have an answer for this after much research and deliberation.
I initially thought that I could solve this using some sort of IoC / DI strategy, but then I found the source code for Node.js's Module library, which is responsible for loading modules. Looking at the source code you will notice that the require function for a module (i.e. foo.js in my example) gets created by the _compile function of Node.js's module loader. It's internally scoped, and I couldn't see an immediate mechanism by which to modify it.
I am not quite sure how or where Webpack is extending the created "require" instance, but I suspect it is with some black magic. I realised that I would need some help to do something of a similar nature, and didn't want to write a massive block of complicated code to do so.
Then I stumbled on rewire...
Dependency injection for node.js applications.
rewire adds a special setter and getter to modules so you can modify their behaviour for better unit testing. You may
inject mocks for other modules
leak private variables
override variables within the module.
rewire does not load the file and eval the contents to emulate node's require mechanism. In fact it uses node's own require to load the module. Thus your module behaves exactly the same in your test environment as under regular circumstances (except your modifications).
Perfect. Access to private variables is all that I need.
After installing rewire, getting my test to work was easy:
foo.js
export default {
  requireEnsureExists: () => {
    return typeof require.ensure === 'function';
  }
};
foo.test.js
import { expect } from 'chai';
import rewire from 'rewire';
describe('When requiring "foo"', () => {
  let foo;
  before(() => {
    foo = rewire('./foo.js');
    // Get the existing 'require' instance for our module.
    let fooRequire = foo.__get__('require');
    // Add an 'ensure' property to it.
    fooRequire.ensure = (path) => {
      // Do mocky/stubby stuff here.
    };
    // We don't need to set the 'require' again in our module, as the above
    // is by reference.
  });
  it('The requireEnsureExists() should be true', () => {
    expect(foo.requireEnsureExists()).to.be.true;
  });
});
Aaaaah.... so happy. Fast running test land again.
Oh, in my case it's not needed, but if you are bundling your code via webpack for browser based testing, then you may need the rewire-webpack plugin. I also read somewhere that this may have problems with ES6 syntax.
Another note: for straight up mocking of require(...) statements I would recommend using mockery instead of rewire. It's less powerful than rewire (no private variable access), but this is a bit safer in my opinion. Also, it has a very helpful warning system to help you not do any unintentional mocking.
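For illustration, a minimal mockery-based test might look roughly like the sketch below; the module name 'some-dependency', the file './moduleUnderTest.js' and its callThing() method are invented for the example:
import { expect } from 'chai';
import mockery from 'mockery';

describe('When requiring a module that depends on "some-dependency"', () => {
  let moduleUnderTest;
  before(() => {
    // useCleanCache makes sure the module under test is re-required
    // with the mock in place.
    mockery.enable({ useCleanCache: true, warnOnUnregistered: false });
    // Whenever the module under test require()s 'some-dependency',
    // it gets this stub instead of the real thing.
    mockery.registerMock('some-dependency', { doThing: () => 'stubbed' });
    moduleUnderTest = require('./moduleUnderTest.js');
  });
  after(() => {
    mockery.deregisterAll();
    mockery.disable();
  });
  it('should use the stub instead of the real dependency', () => {
    // assumes moduleUnderTest.callThing() passes through to the dependency
    expect(moduleUnderTest.callThing()).to.equal('stubbed');
  });
});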
Update
I've also seen the following strategy being employed. In every module that uses require.ensure check that it exists and polyfill it if not:
// Polyfill webpack require.ensure.
if (typeof require.ensure !== `function`) require.ensure = (d, c) => c(require);

Addon SDK: slice main.js into a few units?

I'm developing an extension for Mozilla Firefox using the Add-on SDK.
My main.js is now very large and contains a lot of code of all kinds.
Is it possible to slice it into a few custom JS files?
main.js - Loader
Unit1.js - Do job A
Unit2.js - Do job B
Unit3.js - Do job C
Or any good advice for developing highly functional extensions?
You can create and require() "local" modules. Also read up on the Module structure of the SDK.
Therefore, you may want to try to modularize your stuff.
lib/joba.js
function non_exported_helper() {
  // do something;
}
function JobA() {}
JobA.prototype = {
  doSomething: function() {
    non_exported_helper();
    return something_else;
  }
};
// Export JobA
// Stuff not in `exports` will not be visible to other code
// require()ing a module.
exports.JobA = JobA;
lib/main.js
var JobA = require("./joba").JobA;
var job = new JobA();
job.doSomething();
Of course, any module can use require(), but beware of circular imports.
It's up to you what to put into what module. E.g. one module could implement a single "class", another module could implement a collection of functions, another could handle background requests, and yet another could hold all the UI stuff.
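As a sketch of the "collection of functions" flavour (the file and function names here are made up):
// lib/utils.js, a made-up "collection of functions" module
function clamp(value, min, max) {
  return Math.min(Math.max(value, min), max);
}
function formatTitle(title) {
  return "[my-addon] " + title;
}
exports.clamp = clamp;
exports.formatTitle = formatTitle;

// lib/main.js
var utils = require("./utils");
console.log(utils.formatTitle("hello"), utils.clamp(15, 0, 10)); // "[my-addon] hello" 10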
Maybe look at the SDK itself, which is organized in modules, and/or look into what other standard libraries do, like the Python stdlib, the Go stdlib, etc.

How to structure classes within a node module?

When working with lots of classes in a node module, is the following mechanism a good idea, or is there a better way to achieve this?
// mymodule/core.js
module.exports = {
  ClassA: require('./class/ClassA'),
  ClassB: require('./class/ClassB'),
  ClassC: require('./class/ClassC')
}
// mymodule/class/ClassA.js
module.exports = function() {
...
}
// myapp.js
var core = require('mymodule/core');
var a = new core.ClassA();
The idea of the above is to keep classes in physically separate modules for maintenance whilst providing a namespace that is easy to use.
For me it is far clearer to pull in the modules you are using within the module that uses them. You can't get around the fact that the module has these dependencies, and it will be easier for someone else reading the code to see what is being used.
Also, if you ever wanted to reuse a single module, you would have to carry along the core.js module, and all of its dependencies, to which your module is coupled.
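In other words, a consumer that only needs one class can require it directly, which keeps the dependency explicit and avoids dragging in core.js and everything it pulls in:
// myapp.js, requiring only the class that is actually needed
var ClassA = require('mymodule/class/ClassA');
var a = new ClassA();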
