r.js with anonymous defines in self executing functions - javascript

I am trying to optimize a Durandal app into a single file using r.js via Grunt and the Durandal grunt task.
I am using JS libraries that register with require inside a top-level self-executing function. The code below is from the breeze.js source, but it is conceptually the same in Knockout, Moment and others:
if (typeof exports === "object" && typeof module === "object") {
    module.exports = def();
} else if (typeof define === "function" && define["amd"]) {
    define(def);
} else {
    breeze = def();
}
The problem is that the r.js optimizer obviously can't insert a module id into the anonymous define.
This means that when the optimized build file is loaded, breeze is registered as a module without a name, so it can't be 'required' into other modules.
What's the best way to handle this? (Modifying the source to define('breeze', [], def) would work, for example.)
Maybe patch the code so it 'knows' it's running with almond and registers with a configurable name?
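One way to handle it (a sketch only, assuming you are willing to carry a small patch against the vendor file) is to give the AMD branch an explicit module id, so almond can resolve it by name in the optimized bundle:
if (typeof exports === "object" && typeof module === "object") {
    module.exports = def();
} else if (typeof define === "function" && define["amd"]) {
    define("breeze", [], def); // explicit id instead of the anonymous define(def)
} else {
    breeze = def();
}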

Related

Build both distributive and NPM package from the same source

I often make small open source tools and I don't want to limit my users. My packages are usually just one function, so this is what I want my users to get:
A JS file that they can add via the src in the script tag. That script should add my function so they can call it in a script below. Useful for users who don't want to use a package manager at all:
<script src="https://amazingCDN.com/isEven.js"></script>
<script>
isEven()
</script>
A JS file that can be published as a package, so users who use NPM can just type npm install isEven and then import my package.
Both JS files should be built from the same source. Let's say my source only contains a named function that should be added to the window and should be importable if I use Webpack. Let's say I will publish the package myself, and I only want my build pipeline to generate two JS files from my source. As for the CDN: let's say I use jsDelivr, which can retrieve my JS file from GitHub and minify it, so I don't care about minifying the file myself.
I tried writing my code as a module and using Browserify with the standalone flag. It actually works with CommonJS modules, but to make it work with ES modules I have to use esmify, and it just returns an object with the default key, so I can't call it like foo(), I have to call it like foo.default(). This is not what I want.
I also tried writing it as a standalone file and just doing
echo 'export default ' | paste -d'\0' - src.js > module.js
It kinda works but I wonder if there are more sophisticated and reliable solutions.
How do I achieve this?
It sounds like you might want to package your project as a UMD module:
https://github.com/umdjs/umd
Rollup can target UMD as output and is a bit more minimal than either Webpack or Browserify (especially for tiny single-function libs).
isEven.mjs
function isEven(x) {
    return (x % 2) === 0 && x !== 0;
}

export {
    isEven
};
$ rollup isEven.mjs --format=umd --name=isEven
Will result in
(function (global, factory) {
    typeof exports === 'object' && typeof module !== 'undefined' ? factory(exports) :
    typeof define === 'function' && define.amd ? define(['exports'], factory) :
    (global = typeof globalThis !== 'undefined' ? globalThis : global || self, factory(global.isEven = {}));
}(this, (function (exports) { 'use strict';

    function isEven(x) {
        return (x % 2) === 0 && x !== 0;
    }

    exports.isEven = isEven;

    Object.defineProperty(exports, '__esModule', { value: true });

})));
Which I think is exactly what you want.
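A quick usage check against the output above (a sketch; file paths are illustrative): because the source uses a named export, the UMD build exposes a namespace object, so the function is reached as isEven.isEven rather than as a bare isEven() global.
// In Node / CommonJS (path is illustrative):
var lib = require('./isEven.js');
console.log(lib.isEven(4)); // true

// In a browser via <script src="isEven.js">, the same namespace object
// ends up on the global scope: window.isEven.isEven(4) === true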
I solved this and created a template repo for this exact task.
You can create your repo from this template, implement your library as an NPM package with ES modules, and then on push the dist folder will be created, containing the distributable JS file named after your library. Your users can just add it in a script tag and your library will appear in the global namespace.
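For reference, a minimal sketch of the kind of build configuration such a setup might use (the file names and entry point are illustrative, not taken from the template repo):
// rollup.config.js (sketch): one source, two outputs
export default {
    input: 'src/index.mjs',
    output: [
        { file: 'dist/isEven.js', format: 'umd', name: 'isEven' }, // for <script src> users
        { file: 'dist/isEven.esm.js', format: 'es' } // for npm / bundler users
    ]
};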

Javascript define and module

I'm a newbie at JavaScript and I encountered this piece of code:
(function (root, factory) {
    if (typeof define === "function" && define.amd) {
        define([], function () { return factory(root.MyObject); });
    } else if (typeof module === "object" && module.exports) {
        module.exports = factory(root.MyObject);
    } else {
        root.MyObject = factory(root.MyObject);
    }
}(typeof window !== "undefined" ? window : this, function (MyObject) { ... }));
When I debug this, I cannot see what's inside of define and module. What are these, and what data do they hold (especially define.amd)? And would you kindly explain why this approach is good (or bad, or necessary) for injecting MyObject? Are there any new or better approaches to do this in JavaScript?
Thanks in advance.
This is the old universal method for importing modules.
With modern browsers and with Node.js 14 you can now use ES6 modules with the ES6 syntax. For Node.js, the files should have the "mjs" extension.
import * as myModule from "./path/to/file.mjs"
Some documentation could be found here and here.
Otherwise, if you have to import old JavaScript, then you can use the UMD script that you have posted. You can find some documentation on modules here and in particular on UMD here.
UMD is a method to integrate your module with other modules, using a few well-known (but also quite old) requiring methods.
Anyway, you can also use an npm library such as Babel in order to transpile your code from ES6 to old JavaScript (some configuration may be required), and its output is what should be imported.
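As a sketch of the modern equivalent (file names are illustrative), the UMD wrapper above collapses to a plain ES module export plus an import on the consuming side:
// my-object.mjs (sketch): the factory body becomes a plain module
export function MyObject() {
    // ... implementation ...
}

// consumer.mjs
import { MyObject } from "./my-object.mjs";
const obj = new MyObject();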

Using RequireJS and AMD to distribute a project that contains internal dependencies

I have an open-source JavaScript form-validation library that I've created. Recently a contributor helpfully modified my library to support AMD. In the past month or two, I have been refactoring my library to enhance maintainability, modularity, and readability. Essentially I have extracted pieces of logic into self-contained modules. However, these modules all reside inside the main module.
After looking at how AMD does things, I feel that it would be beneficial if I am able to split these internal modules into their own separate files. I come from a Java background, and from that perspective, these individual modules seem like classes to me and I'd like to separate them out. This will also help me manage my dependencies better and also enforce proper modularity. I think in the long run, this will make the code much better.
I know that RequireJS has an "optimize" feature whereby it will combine all dependencies into a single file and minimize it.
My question is this: will this minified file also be AMD compatible? That is, will the file expose itself as an AMD module? The dependencies that the project itself has are all internal and I don't want to expose them separately. However, I would still like developers to be able to import my library as a self-contained module.
will this minified file also be AMD compatible? That is, will the file
expose itself as an AMD module?
Require.js doesn't necessarily generate an AMD-compatible module. You have to make your library AMD compatible, and that should happen in your main file. You can learn how to do it from Lodash: they made their library compatible with both Node and Require.js. They basically look for global variables to detect Node and Require:
/** Detect free variable `exports` */
var freeExports = typeof exports == 'object' && exports;

/** Detect free variable `module` */
var freeModule = typeof module == 'object' && module && module.exports == freeExports && module;

/** Detect free variable `global` and use it as `window` */
var freeGlobal = typeof global == 'object' && global;
if (freeGlobal.global === freeGlobal) {
    window = freeGlobal;
}
At the end:
if (typeof define == 'function' && typeof define.amd == 'object' && define.amd) {
    // Expose Lo-Dash to the global object even when an AMD loader is present in
    // case Lo-Dash was injected by a third-party script and not intended to be
    // loaded as a module. The global assignment can be reverted in the Lo-Dash
    // module via its `noConflict()` method.
    window._ = _;

    // define as an anonymous module so, through path mapping, it can be
    // referenced as the "underscore" module
    define(function() {
        return _;
    });
}
// check for `exports` after `define` in case a build optimizer adds an `exports` object
else if (freeExports && !freeExports.nodeType) {
    // in Node.js or RingoJS v0.8.0+
    if (freeModule) {
        (freeModule.exports = _)._ = _;
    }
    // in Narwhal or RingoJS v0.7.0-
    else {
        freeExports._ = _;
    }
}
else {
    // in a browser or Rhino
    window._ = _;
}

How to write an AMD module for use in pages without RequireJS?

I need to re-structure an existing AMD module to make it usable both by pages with RequireJS present and by pages without it. How should I do it, and is there any example code? Preferably in a way that doesn't pollute the global namespace, though that is not a strict requirement.
This is not a bad idea at all; quite often JS libs are required to support both AMD and non-AMD environments. Here is one variation of the solution:
!function (name, definition) {
    if (typeof module != 'undefined') module.exports = definition()
    else if (typeof define == 'function' && define.amd) define(name, definition)
    else this[name] = definition()
}('reqwest', function () {
    // Module here
});
The only downside is that you can't request other dependencies, so this would only be useful in standalone libraries, like the ones below:
Dustin Diaz's Reqwest
Mustache
I recently wrote a gist regarding this topic so I have copied the relevant bits for you below; however, feel free to check out the original Gist for more info.
The following boilerplate allows you to write your module once and have it work in a CJS/NodeJs, AMD, or Browser Global environment.
Best Used When...
You are migrating from namespaced (err, globals) code to either AMD or CJS modules or both.
You can't yet factor out browser globals but also need to test your code via NodeJS (e.g. Mocha)
Benefits & Trade-offs
A single module format allowing you to target AMD, CJS/NodeJS, and legacy browser globals like window.*.
Allows multiple dependencies to be defined.
Run unit tests via CLI/NodeJS runner (e.g. Mocha).
Less pain while incrementally migrating to CJS/NodeJS or AMD modules.
You give up the Java-like namespaces (e.g. com.company.package.module) -- meh, they are a mess anyway.
This (UMD) isn't a standard; to be fair, neither is AMD (it's a convention with a well-defined spec).
Non-trivial amount of boilerplate (and ugly).
Example
/**
 * Creates an "AppWidget" module that imports a "SingleDependency" constructor and exposes an "AppWidget" constructor.
 *
 * Allows you to access AppWidget as a CJS/AMD module (i.e. NodeJS or RequireJs):
 *
 * #example
 * var AppWidget = require('app-widget')
 *
 * Allows you to access AppWidget as a legacy browser global:
 *
 * #example
 * var AppWidget = window.AppWidget
 */
!(function (name, context, definition) {
    if (typeof exports === 'object') {
        module.exports = definition(require);
    } else if (typeof define === 'function' && define.amd) {
        define(definition);
    } else {
        context[name] = definition();
    }
}).call(this, 'AppWidget', this, function (require) {
    'use strict';
    // imports
    var SingleDependency = (typeof require === 'function') ? require('./single-dependency') : window.SingleDependency;
    var singleDependency = new SingleDependency();

    function AppWidget() {}

    AppWidget.prototype.start = function () {};

    // exports
    return AppWidget;
});
Check out UMD. You should find a pattern suitable for your purposes there. The templates are somewhat ugly but work.
I think that this is quite a bad idea.
These are the steps you would have to take to make sure that it works:
ensure that all the module's dependencies are loaded on that page (jQuery, Backbone and others)
include your module(s) on the page in the order you know they should be executed
ensure that any module that is a dependency of another module creates a global variable with the same name the "importing" module expects and refers to in its code
ensure that your module refers to its dependencies (including other modules) by those same names
override/create a global method define that executes your module's factory function (a sketch follows at the end of this answer)
And that is only part of what you'd have to do. What about third parties that need to be AMD-compliant for your RequireJS pages (or at least be in its shim configuration), but also should be global for your non-RequireJS pages?
Simply put, IMO, it'd be easier to rework the existing code into an AMD version than to make your modules non-AMD.
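For illustration only, here is a minimal sketch of the kind of global define shim the last step refers to; it only handles the define(deps, factory) and define(factory) forms and assumes the dependencies already exist as globals:
// Illustrative define() shim for loader-less pages (not a full AMD implementation).
window.define = function (deps, factory) {
    if (typeof deps === 'function') { // handle the define(factory) form
        factory = deps;
        deps = [];
    }
    var resolved = deps.map(function (name) { return window[name]; });
    var result = factory.apply(null, resolved);
    // There is no module id here, so the page must decide where to put the result,
    // e.g. window.myModule = result;
    return result;
};
window.define.amd = {};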
Based on Simon Smith's answer, this is how I dealt with module dependencies:
(function (root, factory) {
    if (typeof exports === "object") {
        module.exports = factory();
    } else if (typeof define === "function" && define.amd) {
        define(['jquery', 'ol'], factory);
    } else {
        root.module_name = factory();
    }
}(this, function (_$, _ol) {
    return new function () {
        // assumes that window.ol and window.$ are defined
        var ol = goog.isDef(_ol) ? _ol : window.ol;
        var $ = goog.isDef(_$) ? _$ : window.$;
    };
}));
Where goog.isDef is a Google Closure function:
goog.isDef = function(val) {
    return val !== void 0;
};
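If Google Closure isn't part of the project, the same check can be written inline (a minimal sketch): since _$ and _ol are declared parameters, a plain undefined comparison is enough.
var ol = (_ol !== undefined) ? _ol : window.ol;
var $ = (_$ !== undefined) ? _$ : window.$;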
Hope it will help someone.
Check out Browserify, which will create a single, standalone .js file with all dependencies embedded from your AMD/JS code.
You could then ship two versions of your code, one for AMD users and one for "old-school" JS users.
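A sketch of what that build step might look like (entry file and output name are illustrative):
$ browserify src/index.js --standalone myLibrary > dist/mylibrary.js
The --standalone flag wraps the bundle in a UMD preamble, so the same file registers with an AMD loader when one is present and falls back to a window.myLibrary global otherwise.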

best practices for cross commonjs/browser development

Currently, I use a few defines via the Google Closure Compiler along the lines of IS_CJS and IS_BROWSER, and just have different files that get built (browser.myproject.js, cjs.myproject.js, etc).
Is this the standard way of doing things? If not, what is it and what are the advantages?
I've been using the following preamble in all my projects, for libraries that are loaded by both browser and server code:
if (define === undefined) {
    var define = function(f) {
        require.paths.unshift('.');
        f(require, exports, module);
    };
}

define(function(require, exports, module) {
    // ...
    // main library here
    // ...

    // use require to import dependencies
    var v = require(something);

    // use exports to return library functions
    exports.<stuff> = { some stuff };
});
This works to load the library with a require(<library>) call running on my node server, as well as with a require(<library>) call with RequireJS. On the browser, nested require calls are pre-fetched by RequireJS prior to library execution, on Node these dependencies are loaded synchronously. Since I'm not using my libraries as stand-alone scripts (via a script tag in the html), and only as dependencies for scripts loaded via the script tag, this works well for me.
However, looking at stand-alone libraries, it looks like the following preamble is the most flexible (cut and pasted from the Q promise library):
(function (definition, undefined) {
    // This file will function properly as a <script> tag, or a module
    // using CommonJS and NodeJS or RequireJS module formats. In
    // Common/Node/RequireJS, the module exports the Q API and when
    // executed as a simple <script>, it creates a Q global instead.

    // The use of "undefined" in the arguments is a
    // micro-optimization for compression systems, permitting
    // every occurrence of the "undefined" variable to be
    // replaced with a single character.

    // RequireJS
    if (typeof define === "function") {
        define(function (require, exports, module) {
            definition(require, exports, module);
        });
    // CommonJS
    } else if (typeof exports === "object") {
        definition(require, exports, module);
    // <script>
    } else {
        Q = definition(undefined, {}, {});
    }
})(function (serverSideRequire, exports, module, undefined) {
    // ...
    // main library here
    // ...

    /*
     * In module systems that support ``module.exports`` assignment or exports
     * return, allow the ``ref`` function to be used as the ``Q`` constructor
     * exported by the "q" module.
     */
    for (var name in exports)
        ref[name] = exports[name];
    module.exports = ref;
    return ref;
});
While wordy, it's impressively flexible, and simply works, no matter what your execution environment is.
You can use uRequire, which converts modules written in either AMD or CommonJS to AMD, CommonJS or UMD through a template system.
Optionally, uRequire builds your whole bundle as a combinedFile.js that runs in ALL environments (Node.js, AMD, or a module-less browser <script/>), using the r.js optimizer and almond under the hood.
uRequire saves you from having to maintain any boilerplate in each module - just write plain AMD or CommonJS modules (as .js, .coffee, .coco, .ls etc) without gimmicks.
Plus you can declaratively add standard functionality, such as exporting a module to a global like window.myModule along with a noConflict() method, having runtime info (e.g. __isNode, __isAMD) available selectively, or replacing/removing/injecting dependencies while building, automatically minifying, manipulating module code and much more.
All of these configuration options can be turned on and off per bundle OR per module, and you can have different build profiles (development, test, production etc) that derive (inherit) from each other.
It works great with Grunt through grunt-urequire or standalone, and it has a great watch option that rebuilds ONLY changed files.
Have you tried this: https://github.com/medikoo/modules-webmake#modules-webmake ?
It's the approach I'm taking, and it works really well. There is no boilerplate in the code, and you can run the same modules on both the server and the client side.
