I was wondering if there were any VS 2010 extensions out there for triggering the requirejs optimization similar to how squishit works:
When in debug mode, module files stay separate
When in release mode, module files get minified and combined
Edit: We use VS2012, but the general principle should work
While this isn't a direct answer to your question, this is the work around that we came up with:
Create a module for the files that you want to have minimized and bundled. For the sake of discussion, call it base_module.
Create a script in your _layout.cshtml
function requireBaseModulesDebug(func) {
// In debug mode, just execute the call back directly.
// Do not load any base modules, require each module to fully state any dependencies.
func();
}
function requireBaseModulesRelease(func) {
require(["js_modules/base_module"], func);
}
// Views are always in DEBUG mode, so we have to do this work-around.
@if (HttpContext.Current.IsDebuggingEnabled)
{
<text>requireBaseModules = requireBaseModulesDebug;</text>
}
else
{
<text>requireBaseModules = requireBaseModulesRelease;</text>
}
With that script created, you then have to wrap anything that would use a module like this:
// If you are in DEBUG mode, requireBaseModules won't do anything
// But, if you are in a release mode, then requireBaseModules will load your
// bundled module and have them primed. Any modules required but NOT in your
// base bundle will be loaded normally.
requireBaseModules(function () {
require(['jQuery'], function ($) {
// Do Stuff
});
});
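For completeness, the release-time bundle itself can be produced with the r.js optimizer. The profile below is only a sketch based on the names used above (js_modules/base_module); the output path and the exact options are assumptions, not part of the original setup.
// build.js - hypothetical r.js build profile for the release bundle
// Run with: node r.js -o build.js
({
    baseUrl: "js_modules",
    // The module that pulls in everything you want minified and combined
    name: "base_module",
    // Point your release-mode require at the optimized output
    out: "js_modules/base_module.min.js"
})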
Related
I have the following code:
requirejs.config({
urlArgs: "bust=" + (new Date()).getTime(),
paths: {
mods: 'default',
myFriend: 'myFriend',
myCoworker: 'myCoworker'
},
shim: {
mods: ['myFriend', 'myCoworker']
}
});
require(['mods'], function (mods) {
// something to do
});
and modules which are the dependencies:
myFriend.js
var mess = `<a really huge text, almost 200 Kbytes>`
console.log('This code is run in the myFriend module...', {mess:mess});
myCoworker.js
console.log('This code is run in the myCoworker module...');
var wrk = {
name: 'John'
};
So I expect that, according to the shim config, myFriend.js should always be loaded (which I check via the console output) before myCoworker.js. But it isn't. The console output shows:
This code is run in the myCoworker module...
and then
This code is run in the myFriend module...
Probably I have missed something, but what?
The entire code is here: http://embed.plnkr.co/zjQhBdOJCgg8QuPZ5Q8A/
You're dealing with a fundamental misconception about how RequireJS works. We use shim for files that do not call define. Using shim makes it so that RequireJS will, so to speak, add a kind of "virtual define" to those files. The shim you show is equivalent to:
define(['myFriend', 'myCoworker'], function (...) {...});
The dependency list passed to a define or require call does not, in and of itself, specify a loading order among the modules it lists. The only thing the dependency list does is specify that the modules in the list must be loaded before the callback is called. That's all.
If you want myFriend to load first, you need to make myCoworker dependent on it:
shim: {
mods: ['myFriend', 'myCoworker'],
myCoworker: ['myFriend'],
}
By the way, shim is really meant to be used for code you do not control. For your own code, you should call define in the modules themselves instead of setting a shim in the configuration.
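For example, written as real AMD modules the ordering falls out naturally; this is only a sketch based on the question's files (the returned values are illustrative):
// myCoworker.js as an AMD module: the dependency on myFriend is explicit,
// so RequireJS guarantees myFriend.js has executed before this factory runs.
define(['myFriend'], function (myFriend) {
    console.log('This code is run in the myCoworker module...');
    return {
        name: 'John'
    };
});
// myFriend.js would get the same treatment: wrap its code in define(...) and return its value.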
I am trying to integrate the RequireJS framework into my app.
Is it possible to create a virtual module (one which doesn't exist as a physical file) where I could group all the jquery-validation plugins together?
For example, I need to load four dependencies every time I want to use jquery-validate.
Instead of requesting them each time, I want to create a jquery-val "virtual module" which requests all the dependencies automatically.
However, trying to load "jquery-val" actually tries to load the file from disk (which I don't have).
What should be the best practice in solving this issue?
// config
requirejs.config({
baseUrl: '/Content',
paths: {
'jquery': 'frameworks/jquery-3.1.1.min',
"jquery-validate": "frameworks/jquery.validate.min",
"jquery-validate-unobtrusive": "frameworks/jquery.validate.unobtrusive.min",
"jquery-unobtrusive-ajax": "frameworks/jquery.unobtrusive-ajax.min"
},
shim: {
"jquery-val": ["jquery", "jquery-validate", "jquery-validate-unobtrusive", "jquery-unobtrusive-ajax"]
}
});
// Solution 1: working, but ugly
define(["jquery", "jquery-validate-unobtrusive", "jquery-unobtrusive-ajax"], function ($) {
// My Module
});
// Solution 2: not working
define(["jquery-val"], function () {
// My Module
});
// Solution 3: create jquery-val.js file, which loads the dependencies automatically
// jquery-val.js
(function (global) {
define(["jquery", "jquery-validate-unobtrusive", "jquery-unobtrusive-ajax"], function ($) {
});
}(this));
Take some time and read:
http://requirejs.org/docs/api.html#modulenotes
One module per file: Only one module should be defined per JavaScript file, given the nature of the module name-to-file-path lookup algorithm. You should only use the optimization tool to group multiple modules into optimized files.
Optimization Tool
To answer your question:
It is good practice to define one module per file, so you don't need to give the module an explicit name AND you avoid having to insert it somewhere so that it is available before the other modules are loaded.
So you could require just the file: require("../services/myGroupModule"); this file would hold your module, RequireJS would take care of loading its dependencies, and the optimizer can later concatenate everything into one file. Here the module name is the file name.
You could nevertheless do the following and load it as a file module, or, as you tried, define it beforehand and give the module a name:
//Explicitly defines the "myGroupModule" module:
define("myGroupModule",
    ["dependency1", "dependency2"],
    function (dependency1, dependency2) {
        return function myGroupModule() {
            return {
                doSomething: function () { console.log("hey"); }
            };
        };
    }
);
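Applied to the question about grouping the jquery-validation plugins, the grouping module can simply be a small real file instead of a shim entry. This is a sketch that reuses the path names from the question's config and assumes jquery-val.js lives at the configured baseUrl (or gets its own paths entry):
// jquery-val.js - a real file replacing the "virtual" module.
// The plugins attach themselves to jQuery, so returning $ is enough.
define([
    "jquery",
    "jquery-validate",
    "jquery-validate-unobtrusive",
    "jquery-unobtrusive-ajax"
], function ($) {
    return $;
});
// Consumers then depend on the single grouping module:
define(["jquery-val"], function ($) {
    // $.validator and the unobtrusive adapters are available here
});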
Maybe you should also take a look at some newer module loaders:
WebPack 2: https://webpack.js.org/
SystemJS: https://github.com/systemjs/systemjs
… which would enable this pattern:
var $ = require('jQuery');
I'm looking at the Require.js docs and all I can see is the callback-version, e.g. require(['jQuery'], function ($) { … });
Require comes with a 'CommonJS' mode for allowing Node-like require statements inline in your code. For example, this is a typical AMD definition that will load these scripts asynchronously:
define(['script1', 'script2'], function (script1, script2) {
});
But you can also do this:
define(function (require) {
var script1 = require("script1");
var script2 = require("script2");
});
And Require will process these asynchronously also. The way it does that is that Require sees that you haven't specified an array of dependencies as the first argument and then scans your code for Require statements. It then processes each of them asynchronously.
However, to answer your question, if you DO specify a dependency array as the first parameter and still use CommonJS syntax, Require will process the require statements synchronously:
define(["script1", "script2"], function () {
// Each of these require calls returns synchronously, since the scripts are already loaded
var script1 = require("script1");
var script2 = require("script2");
});
Documentation:
https://github.com/jrburke/requirejs/wiki/Differences-between-the-simplified-CommonJS-wrapper-and-standard-AMD-define
Because of browser limitations, scripts can only be loaded asynchronously, so there is no synchronous version for loading dependencies in RequireJS.
However, some module loaders do have workarounds. For example, Sea.js offers a sync-looking version (http://seajs.org/docs/en.html#quick-start) by running a regexp to extract all dependencies (https://github.com/seajs/seajs/blob/3d3c3bc142d9a6e3f6de14c1959d3f308b1c9e0e/src/module.js#L324) and loading them asynchronously before actually executing the script, so the dependencies only appear to be loaded synchronously.
Looks like the sync variant is used during development:
require('foo') performs a synchronous load, require('foo', function(foo){ … }) performs an asynchronous load. In development we use the sync load (convenience being the only reason) - in production the RequireJS Optimizer inlines that require-call anyway…
Source: https://github.com/rodneyrehm/viewport-units-buggyfill/issues/18#issuecomment-55604581
Currently, I use a few defines via the Google Closure Compiler along the lines of IS_CJS and IS_BROWSER, and just have different files that get built (browser.myproject.js, cjs.myproject.js, etc).
Is this the standard way of doing things? If not, what is it and what are the advantages?
I've been using the following preamble in all my projects, for libraries that are loaded by both browser and server code:
if (define === undefined) {
var define = function(f) {
require.paths.unshift('.');
f(require, exports, module);
};
}
define(function(require, exports, module) {
...
// main library here
...
// use require to import dependencies
var v = require(something);
...
// use exports to return library functions
exports.<stuff> = { some stuff };
...
});
This works to load the library with a require(<library>) call running on my Node server, as well as with a require(<library>) call with RequireJS. In the browser, nested require calls are pre-fetched by RequireJS prior to library execution; on Node these dependencies are loaded synchronously. Since I'm not using my libraries as stand-alone scripts (via a script tag in the HTML), but only as dependencies for scripts loaded via the script tag, this works well for me.
However, looking at stand-alone libraries, the following preamble seems to be the most flexible (cut and pasted from the Q promise library):
(function (definition, undefined) {
// This file will function properly as a <script> tag, or a module
// using CommonJS and NodeJS or RequireJS module formats. In
// Common/Node/RequireJS, the module exports the Q API and when
// executed as a simple <script>, it creates a Q global instead.
// The use of "undefined" in the arguments is a
// micro-optimization for compression systems, permitting
// every occurrence of the "undefined" variable to be
// replaced with a single-character.
// RequireJS
if (typeof define === "function") {
define(function (require, exports, module) {
definition(require, exports, module);
});
// CommonJS
} else if (typeof exports === "object") {
definition(require, exports, module);
// <script>
} else {
Q = definition(undefined, {}, {});
}
})(function (serverSideRequire, exports, module, undefined) {
...
main library here
...
/*
* In module systems that support ``module.exports`` assignment or exports
* return, allow the ``ref`` function to be used as the ``Q`` constructor
* exported by the "q" module.
*/
for (var name in exports)
ref[name] = exports[name];
module.exports = ref;
return ref;
});
While wordy, it's impressively flexible, and simply works, no matter what your execution environment is.
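For reference, the same three-branch idea can be written more compactly. This is a generic UMD sketch (the myLib name and the return value are placeholders), not the Q library's actual wrapper:
(function (root, factory) {
    if (typeof define === "function" && define.amd) {
        // RequireJS / AMD
        define([], factory);
    } else if (typeof module === "object" && module.exports) {
        // CommonJS / Node
        module.exports = factory();
    } else {
        // Plain <script> tag: expose a global
        root.myLib = factory();
    }
}(this, function () {
    return {
        hello: function () { return "hello"; }
    };
}));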
You can use uRequire, which converts modules written in either AMD or CommonJS to AMD, CommonJS, or UMD through a template system.
Optionally, uRequire builds your whole bundle as a combinedFile.js that runs in ALL environments (Node.js, AMD, or a module-less browser <script/>); it uses the r.js optimizer and almond under the hood.
uRequire saves you from having to maintain any boilerplate in each module - just write plain AMD or CommonJS modules (as .js, .coffee, .coco, .ls etc.) without gimmicks.
Plus you can declaratively add standard functionality, such as exporting a module to a global like window.myModule along with a noConflict() method, having runtime info (e.g. __isNode, __isAMD) available selectively, replacing/removing/injecting a dependency while building, automatic minification, manipulating module code and much more.
All of these configuration options can be turned on and off per bundle OR per module, and you can have different build profiles (development, test, production etc.) that derive (inherit) from each other.
It works great with grunt through grunt-urequire or standalone, and it has a great watch option that rebuilds ONLY changed files.
Have you tried this: https://github.com/medikoo/modules-webmake#modules-webmake ?
It's the approach I'm taking, and it works really well. There is no boilerplate in the code, and you can run the same modules on both the server and the client side.
Is it possible to use require() (or something similar) on the client side?
Example
var myClass = require('./js/myclass.js');
You should look into require.js or head.js for this.
I've been using browserify for that. It also lets me integrate Node.js modules into my client-side code.
I blogged about it here: Add node.js/CommonJS style require() to client-side JavaScript with browserify
If you want to have Node.js style require you can use something like this:
var require = (function () {
var cache = {};
function loadScript(url) {
var xhr = new XMLHttpRequest(),
fnBody;
xhr.open('get', url, false);
xhr.send();
if (xhr.status === 200 && xhr.getResponseHeader('Content-Type') === 'application/x-javascript') {
fnBody = 'var exports = {};\n' + xhr.responseText + '\nreturn exports;';
cache[url] = (new Function(fnBody)).call({});
}
}
function resolve(module) {
//TODO resolve urls
return module;
}
function require(module) {
var url = resolve(module);
if (!Object.prototype.hasOwnProperty.call(cache, url)) {
loadScript(url);
}
return cache[url];
}
require.cache = cache;
require.resolve = resolve;
return require;
}());
Beware: this code works but is incomplete (especially url resolving) and does not implement all Node.js features (I just put this together last night).
YOU SHOULD NOT USE THIS CODE in real apps but it gives you a starting point. I tested it with this simple module and it works:
function hello() {
console.log('Hello world!');
}
exports.hello = hello;
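As a usage sketch, assuming the module above is saved at /js/hello.js and served with the Content-Type the loader checks for:
// Fetches, evaluates and caches /js/hello.js, then returns its exports object
var hello = require('/js/hello.js');
hello.hello(); // logs "Hello world!"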
I asked myself the very same questions. When I looked into it I found the choices overwhelming.
Fortunately I found this excellent spreadsheet that helps you choose the best loader based on your requirements:
https://spreadsheets.google.com/lv?key=tDdcrv9wNQRCNCRCflWxhYQ
Take a look at requirejs project.
I have found that in general it is recommended to preprocess scripts at compile time and bundle them into one (or very few) packages, with require being rewritten to some "lightweight shim", also at compile time.
I've googled up the following "new" tools that should be able to do it:
http://mixu.net/gluejs/
https://github.com/jrburke/almond
https://github.com/component/builder2.js
And the already mentioned browserify should also fit quite well - http://esa-matti.suuronen.org/blog/2013/04/15/asynchronous-module-loading-with-browserify/
What are the module systems all about?
Older Stack Overflow explanation - Relation between CommonJS, AMD and RequireJS?
Detailed discussion of various module frameworks and the require() they need is in Addy Osmani - Writing Modular JavaScript With AMD, CommonJS & ES Harmony
You can create script elements in the DOM, which load the files.
Like such:
var myScript = document.createElement('script'); // Create new script element
myScript.type = 'text/javascript'; // Set appropriate type
myScript.src = './js/myclass.js'; // Load javascript file
document.head.appendChild(myScript); // The file only loads once the element is added to the DOM
Simply use Browserify, which works something like a compiler that processes your files before they go into production and packs them into bundles.
Say you have a main.js file that requires the files of your project; when you run browserify on it, it processes everything and creates a bundle with all your files, allowing the require calls to work synchronously in the browser without HTTP requests and with very little overhead in performance and bundle size.
See the link for more info: http://browserify.org/
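A minimal sketch of that workflow (the file names are made up for illustration): write ordinary CommonJS modules, run browserify main.js -o bundle.js, and include bundle.js with a single script tag.
// greet.js - an ordinary CommonJS module
module.exports = function (name) {
    console.log('Hello, ' + name + '!');
};
// main.js - the entry point you hand to browserify
var greet = require('./greet');
greet('world');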
Some answers already - but I would like to point you to YUI3 and its on-demand module loading. It works on both server (node.js) and client, too - I have a demo website using the exact same JS code running on either client or server to build the pages, but that's another topic.
YUI3: http://developer.yahoo.com/yui/3/
Videos: http://developer.yahoo.com/yui/theater/
Example:
(precondition: the basic YUI3 functions in 7k yui.js have been loaded)
YUI({
//configuration for the loader
}).use('node','io','own-app-module1', function (Y) {
//sandboxed application code
//...
//If you already have a "Y" instance you can use that instead
//of creating a new (sandbox) Y:
// Y.use('moduleX','moduleY', function (Y) {
// });
//difference to YUI().use(): uses the existing "Y"-sandbox
});
This code loads the YUI3 modules "node" and "io", and the module "own-app-module1", and then the callback function is run. A new sandbox "Y" with all the YUI3 and own-app-module1 functions is created. Nothing appears in the global namespace. The loading of the modules (.js files) is handled by the YUI3 loader. It also uses (optional, not shown here) configuration to select a -debug or -min(ified) version of the modules to load.
Here's a solution that takes a very different approach: package up all the modules into a JSON object and require modules by reading and executing the file content without additional requests.
https://github.com/STRd6/require/blob/master/main.coffee.md
STRd6/require depends on having a JSON package available at runtime. The require function is generated for that package. The package contains all the files your app could require. No further http requests are made because the package bundles all dependencies. This is as close as one can get to the Node.js style require on the client.
The structure of the package is as follows:
entryPoint: "main"
distribution:
main:
content: "alert(\"It worked!\")"
...
dependencies:
<name>: <a package>
Unlike Node, a package doesn't know its external name. It is up to the package including the dependency to name it. This provides complete encapsulation.
Given all that setup, here's a function that loads a file from within a package:
loadModule = (pkg, path) ->
unless (file = pkg.distribution[path])
throw "Could not find file at #{path} in #{pkg.name}"
program = file.content
dirname = path.split(fileSeparator)[0...-1].join(fileSeparator)
module =
path: dirname
exports: {}
context =
require: generateRequireFn(pkg, module)
global: global
module: module
exports: module.exports
PACKAGE: pkg
__filename: path
__dirname: dirname
args = Object.keys(context)
values = args.map (name) -> context[name]
Function(args..., program).apply(module, values)
return module
This external context provides some variables that modules have access to. A require function is exposed to modules so they may require other modules. Additional properties, such as a reference to the global object and some metadata, are also exposed. Finally, we execute the program within the module and the given context.
This answer will be most helpful to those who wish to have a synchronous node.js style require statement in the browser and are not interested in remote script loading solutions.
I find that the component project gives a much more streamlined workflow than other solutions (including require.js), so I'd advise checking out https://github.com/component/component . I know this is a bit of a late answer, but it may be useful to someone.
Here's a lightweight way to use require and exports in your web client. It's a simple wrapper that creates a "namespace" global variable, and you wrap your CommonJS-compatible code in a "define" function like this:
namespace.lookup('org.mydomain.mymodule').define(function (exports, require) {
var extern = require('org.other.module');
exports.foo = function foo() { ... };
});
More docs here:
https://github.com/mckoss/namespace
The clientside-require library provides an asynchronous load() function that can be used to load any JS file or NPM module (which uses module.exports), any .css file, any .json file, any .html file, and any other file as text.
e.g.,
npm install clientside-require --save
<script src = '/node_modules/clientside-require/dist/bundle.js'></script>
<script>
load('color-name') // an npm module
.then(color_name=>{
console.log(color_name.blue); // outputs [0, 0, 255]
})
</script>
A really cool part of this project is that inside of any load()ed script, you can use the synchronous require() function the same way you would expect in node.js!
e.g.,
load('/path/to/functionality.js')
and inside /path/to/functionality.js:
var qs = require("qs") // an npm module
module.exports = function(name){
    return qs.stringify({
        name: name,
        time: new Date()
    })
}
That last part, implementing the synchronous require() method, is what enables it to utilize NPM packages built to run on the server.
This module was designed to implement the require functionality as closely as possible in the browser. Disclaimer: I have written this module.
Yes, it is very easy to use, but you need to load the JavaScript file in the browser with a script tag
<script src="module.js"></script>
and then use it in a JS file like
var myModule = require('./module');
I am making an app using Electron and it works as expected.