Prevent optimization of text! and json! plugins with the RequireJS optimization tool - javascript

I'm using the following architecture for my multi-page RequireJS-based application:
https://github.com/requirejs/example-multipage-shim
The repository explains how to optimize the application by running r.js, the command line tool used for this kind of task.
Everything would work fine, but my project has some modules whose dependencies perform HTTP GET requests to fetch data from the server (which can be text or JSON).
This is because some of the JSON files and templates used by certain pages need to be processed server-side for localization.
Here is a basic example of what I'm talking about:
define( function( require ){
    var appLang = require('json!/pagelang/login'),       // (a)
        loginTemplate = require('text!/template/login'), // (b)
        module = require('app/model/user'),              // (c)
        ....
An HTTP GET request is made to my server at localhost/pagelang/login, which returns JSON:
{
    "hash": "translated_value",
    ...
}
The same applies for /template/template_name, where the server returns HTML with its UI translated into the user's language.
When r.js runs, it tries to resolve those locations as files in an existing directory on disk, which obviously don't exist:
Tracing dependencies for: app/main/login
Error: Error: Loader plugin did not call the load callback in the build:
json:
json!/pagelang/login: Error: ENOENT, no such file or directory '/pagelang/login'
Module loading did not complete for: app/main/login
So, I would like to prevent the command-line tool from optimizing text! and json! modules. Is that possible?
I checked the RequireJS build settings but didn't find a solution to my problem. Any help?
https://github.com/jrburke/r.js/blob/master/build/example.build.js

The json plugin checks if ((config.isBuild && (config.inlineJSON === false || name.indexOf(CACHE_BUST_QUERY_PARAM + '=') !== -1)) || (url.indexOf('empty:') === 0)) { when the optimizer runs, so you have a couple of options:
Add the build config option inlineJSON: false,
Add !bust to the end of the json require: require('json!/pagelang/login!bust'), or
Add the path to the build config option paths: { "/pagelang/login": "empty:" }
The text plugin checks if (config.isBuild && !config.inlineText) { and if (url.indexOf('empty:') === 0) {, so similarly you can:
Set the build config option inlineText: false, or
Add the path to the build config option paths: { "/template/login": "empty:" } (a combined build-file sketch is shown after the reference below)
=================================================================
Update: if you can't get the option inlineJSON to work, try using inlineText, which seems to cover JSON as well.
Reference: https://github.com/requirejs/r.js/blob/master/build/example.build.js
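For reference, a minimal build-file sketch combining the options above might look like this. The appDir/baseUrl/dir/mainConfigFile values are placeholders rather than taken from the question; only the inlineText/inlineJSON flags and the "empty:" paths reflect the plugin behaviour described above.
// build.js (sketch) -- run with: node r.js -o build.js
({
    appDir: "www",                       // placeholder project layout
    baseUrl: "js",
    dir: "www-built",
    mainConfigFile: "www/js/common.js",

    // Option 1: keep all text!/json! resources out of the built bundles so
    // they are still fetched from the server at runtime.
    inlineText: false,
    inlineJSON: false,

    // Option 2 (instead of the flags above): map only the server-generated
    // resources to "empty:" so the optimizer skips just those.
    paths: {
        "/pagelang/login": "empty:",
        "/template/login": "empty:"
    },

    modules: [
        { name: "app/main/login" }
    ]
})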

Related

How to set allowedModules to "*" to enable everything in a jsreport script file?

I'm trying to create a report using jsreport Studio, but I get an error like the one below:
Error occured - Error during rendering report: Unsupported module in scripts: request. To enable require on particular module, you need to update the configuration as {"scripts": { "allowedModules": ["request"] } } ... Alternatively you can also set "*" to allowedModules to enable everything
Stack - Error: Unsupported module in scripts: request. To enable require on particular module, you need to update the configuration as {"scripts": { "allowedModules": ["request"] } } ... Alternatively you can also set "*" to allowedModules to enable everything
Can anyone tell me where I can find the configuration file to update allowedModules?
https://jsreport.net/learn/configuration
jsreport merges configuration from a file, environment variables, command-line arguments, and also directly from the application code. The configuration file needs to be stored at the root of the application with the name prod.config.json. A default one should already have been created for you.
There should be a dev.config.json or prod.config.json in your app root if you followed the common installation. Edit it and add the required options. If you use jsreport inside your own node.js application, you should pass the option through the options object in the call require('jsreport')({ scripts: { ... } }).
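If you go the embedded route, a minimal sketch could look like the following; the init() call is illustrative, so check the jsreport documentation for the exact API of your version.
// app.js (sketch) - embedding jsreport with the scripts option
var jsreport = require('jsreport')({
    scripts: {
        allowedModules: ["request"], // or "*" to allow every module
        timeout: 60000
    }
});

jsreport.init().then(function () {
    // the server is running; report scripts may now require("request")
});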
I changed the dev.config.json file and it works fine:
"scripts": {
"allowedModules": ["request"],
"timeout": 60000
},

Different settings for debug/local ("grunt serve") vs. dist/build ("grunt")?

I want to define some application settings, but I want to provide different values depending on whether I'm running in 'debug' mode (e.g. grunt serve), or whether the final compiled app is running (e.g. the output of grunt). That is, something like:
angular.module('myApp').factory('AppSettings', function() {
    if (DebugMode()) { // ??
        return { apiPort: 12345 };
    } else {
        return { apiPort: 8008 };
    }
});
How can I accomplish this?
The way I handle it in my apps:
Move all your config data for one environment to a file: config.js, config.json, ... whatever your app finds easy to read.
Now modify your config file to turn it into a template using grunt config values, and generate the file with grunt-template as part of your build - for example: app.constant('myAppConfig', {bananaHammocks: <%= banana.hammocks %>});
Finally, add grunt-stage to switch grunt config values depending on the environment: create your different config/secret/(env).json files, update your template (app.constant('myAppConfig', {bananaHammocks: <%= stg.banana.hammocks %>});), and then run grunt stage:local:build or grunt stage:prod:build.
I find this a good balance between complexity and features (separation between environments, runtime code that isn't concerned with build options, ...).
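A minimal Gruntfile sketch of the grunt-template step is below; the file names and the manual readJSON() of a per-environment file are illustrative assumptions (grunt-stage would instead expose the values as <%= stg.... %> inside the template).
// Gruntfile.js (sketch)
module.exports = function (grunt) {
    // Assumption: pick the environment by hand; grunt-stage does this for you.
    var env = grunt.option('env') || 'local';
    var settings = grunt.file.readJSON('config/' + env + '.json');

    grunt.initConfig({
        template: {
            appConfig: {
                options: { data: { banana: settings.banana } },
                // config.js.tpl contains, for example:
                // angular.module('myApp').constant('myAppConfig', {bananaHammocks: <%= banana.hammocks %>});
                files: { 'app/scripts/config.js': ['config/config.js.tpl'] }
            }
        }
    });

    grunt.loadNpmTasks('grunt-template');
    grunt.registerTask('build-config', ['template:appConfig']);
};
Running grunt build-config --env=prod would then regenerate app/scripts/config.js with the production values.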

Prevent browser cache issue on Javascript files with RequireJS in SeedStack

Using SeedStack 14.7, we are facing a cache issue when deploying a new version to the servers: every user has to clear their cache to get the latest version of the files.
I tried using "urlArgs": "version=2" in the requireConfig part of the fragment JSON file. It does the job by adding the argument to every file, so we can change it when changing version, but it also affects the URLs in the config of each module!
As we use this config to pass the REST base URL to each module, it breaks all REST requests by appending the argument to the base URL.
My fragment JSON file:
{
    "id": "mac2-portail",
    "modules": {
        "gestionImage": {
            "path": "{mac2-portail}/modules/gestionImage",
            "autoload": true,
            "config": {
                "apiUrl": "muserver/rest"
            }
        }
    },
    "i18n": {...},
    "routes": {...},
    "requireConfig": {
        "urlArgs": "version=2",
        "shim": {...}
    }
}
Any idea how to solve the cache issue without breaking REST requests?
EDIT: this is not a duplicate of Prevent RequireJS from Caching Required Scripts. Yes, SeedStack uses RequireJS and this configuration solves the cache issue, but it also affects other modules defined in the fragment, so I need another solution to prevent the browser from caching files.
The module configuration values, like apiUrl in your example, are not touched by RequireJS unless you call require.toUrl() on them explicitly. I think this is what is happening in your case. To avoid this problem, you should always do the concatenation first and only then call require.toUrl() on the full resulting URL.
So, instead of doing:
var fullUrl = require.toUrl(config.apiUrl) + '/my/resource';
Do this:
var fullUrl = require.toUrl(config.apiUrl + '/my/resource');
By the way, instead of setting the version directly in the RequireJS configuration, you can simply add the version of your application to the data-w20-app-version attribute on the <html> element of the master page:
<html data-w20-app data-w20-app-version="2.0.0">
This will provide the same behavior but will work correctly in the case of Angular templates in $templateCache. If your master page is automatically generated by the backend, this is done automatically. Check this page for the details.

Minify Scripts/CSS in production mode with node.js

I have a web app that runs in Node. None of the (client) JavaScript/CSS files are minified at the moment, to make debugging easier.
When I am going into production, I would like to minify these scripts. It would be nice to have something like:
node app.js -production
How do I serve the minified versions of my scripts without changing the script tags in my HTML files? There should be something like: if I am in production, use these two minified (combined) scripts, else use all my unminified scripts.
Is this possible? Maybe I am overcomplicating things?
You might be interested in Piler. It's a Node.js module that delivers all the JavaScript (and CSS) files you specify as usual when in debug mode, but concatenated and minified when in production mode.
As a special feature, you can push CSS updates to the browser in real time via Socket.IO (called "CSS Live Updated" in Piler), which is quite awesome :-).
The trick is that inside your template you only have placeholders for the script and link elements, and Piler renders these elements at runtime - as individual elements in debug mode, and as a single dynamically generated element in production mode.
This way you can forget about creating concatenated and minified versions of your assets manually or with a build tool; they are just there at runtime, but you always have the separate, full versions when developing and debugging.
You could use two separate locations for your static files.
Here's some Express code:
if (process.env.MODE === "production") {
    app.use(express['static'](__dirname + '/min'));
} else {
    app.use(express['static'](__dirname + '/normal'));
}
and start node with
MODE=production node app.js
Furthermore, if you don't want to duplicate all your files, you can take advantage of the fact that the Express static middleware stops at the first file it finds, and do something like this instead:
if (process.env.MODE === "production") {
    app.use(express['static'](__dirname + '/min'));    // if a minified version exists, serve it
}
app.use(express['static'](__dirname + '/normal'));     // fall back to the regular files
Using the same name whether a file is minified or not is going to cause problems with browser caching, though.
I want to share my final solution with you guys.
I use JSHTML for Express.
In my main node file I use a special route:
app.get('/**:type(html)', function (req, res, next) {
    var renderingUrl = req.url.substring(1, req.url.lastIndexOf("."));
    //TODO: Find a better solution
    try {
        var assetUrl = req.url.substring(req.url.lastIndexOf("/") + 1, req.url.lastIndexOf("."));
        var assets = config.getResourceBundle(assetUrl);
        assets.production = config.getEnviroment() === "production";
        res.locals(assets);
        res.render(renderingUrl);
    } catch (e) {
        res.redirect("/");
    }
});
As you can see, I get my assets from config.getResourceBundle. This is a simple function:
exports.getResourceBundle = function (identifier) {
    switch (enviroment) {
        case "development":
            return devConfig.getResourceBundle(identifier);
        case "production":
            return prodConfig.getResourceBundle(identifier);
        default:
            return devConfig.getResourceBundle(identifier);
    }
};
And finally, here is an example of an asset file collection:
exports.getResourceBundle = function (identifier) {
    return resourceBundle[identifier];
};

resourceBundle = {
    index: {
        cssFiles: [
            "resources/dev/css/login.css",
            "resources/dev/css/logonDlg.css",
            "resources/dev/css/footer.css"
        ],
        jsFiles: [
            "resources/dev/js/lib/jquery/jquery.183.js",
            "resources/dev/js/utilities.js",
            "resources/dev/js/lib/crypto.3.1.2.js"
        ]
    },
    register: {
        cssFiles: [
            "resources/dev/css/login.css",
            "resources/dev/css/modalDialog.css",
            "resources/dev/css/footer.css"
        ],
        jsFiles: [
            "resources/dev/js/lib/jquery/jquery.183.js",
            "resources/dev/js/utilities.js",
            "resources/dev/js/lib/crypto.3.1.2.js",
            "resources/dev/js/lib/jquery.simplemodal.js",
            "resources/dev/js/xfiles.register.js"
        ]
    }
    (...)
I have two folders, dev and prod. Grunt copies the minified files into prod/... and deletes the files from dev/...
And if the NODE_ENV variable is set to production, I ship the minified versions of my scripts/CSS.
I think this is the most elegant solution at the moment.
There are build-tool plugins that may help you solve this problem gracefully:
For Gulp:
https://www.npmjs.org/package/gulp-useref/
For Grunt:
https://github.com/pajtai/grunt-useref
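As a rough sketch of the useref approach (file names are illustrative): you wrap the unminified script tags in your HTML in a build block, and the task rewrites them to a single concatenated file.
// gulpfile.js (sketch)
// In the HTML:  <!-- build:js js/app.min.js --> ...script tags... <!-- endbuild -->
var gulp = require('gulp');
var useref = require('gulp-useref');

gulp.task('html', function () {
    return gulp.src('app/*.html')
        .pipe(useref())           // concatenates assets referenced in build blocks
        .pipe(gulp.dest('dist'));
});
Minification itself is usually added as another pipe (for example gulp-uglify); useref handles the concatenation and the rewriting of the tags.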
Another Node.js module which could be relevant is connect-cachify.
It doesn't seem to do the actual minification for you, but it does let you serve the minified version in production, or all the original scripts in development, without changing the templates (thanks to cachify_js and cachify_css).
It seems less feature-rich than Piler, but probably a bit simpler, and should meet all the requirements mentioned in the question.

JavaScript require() on client side

Is it possible to use require() (or something similar) on the client side?
Example
var myClass = require('./js/myclass.js');
You should look into require.js or head.js for this.
I've been using browserify for that. It also lets me integrate Node.js modules into my client-side code.
I blogged about it here: Add node.js/CommonJS style require() to client-side JavaScript with browserify
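A minimal sketch of that workflow (file names are illustrative):
// main.js - written with node-style requires
var myClass = require('./js/myclass.js');

// Bundle it for the browser:
//   npm install -g browserify
//   browserify main.js -o bundle.js
// Then include bundle.js with a plain <script> tag; no loader is needed at runtime.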
If you want a Node.js-style require, you can use something like this:
var require = (function () {
    var cache = {};

    function loadScript(url) {
        var xhr = new XMLHttpRequest(),
            fnBody;
        xhr.open('get', url, false);
        xhr.send();
        if (xhr.status === 200 && xhr.getResponseHeader('Content-Type') === 'application/x-javascript') {
            fnBody = 'var exports = {};\n' + xhr.responseText + '\nreturn exports;';
            cache[url] = (new Function(fnBody)).call({});
        }
    }

    function resolve(module) {
        //TODO resolve urls
        return module;
    }

    function require(module) {
        var url = resolve(module);
        if (!Object.prototype.hasOwnProperty.call(cache, url)) {
            loadScript(url);
        }
        return cache[url];
    }

    require.cache = cache;
    require.resolve = resolve;

    return require;
}());
Beware: this code works but is incomplete (especially url resolving) and does not implement all Node.js features (I just put this together last night).
YOU SHOULD NOT USE THIS CODE in real apps but it gives you a starting point. I tested it with this simple module and it works:
function hello() {
console.log('Hello world!');
}
exports.hello = hello;
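For completeness, a usage sketch under the same assumptions (the module above is served at a URL of your choosing, with the application/x-javascript content type the loader checks for):
// Assuming the module above is available at /js/hello.js:
var greeter = require('/js/hello.js');
greeter.hello(); // logs "Hello world!"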
I asked myself the very same question. When I looked into it, I found the choices overwhelming.
Fortunately I found this excellent spreadsheet that helps you choose the best loader based on your requirements:
https://spreadsheets.google.com/lv?key=tDdcrv9wNQRCNCRCflWxhYQ
Take a look at the RequireJS project.
I have found that, in general, it is recommended to preprocess scripts at compile time and bundle them into one (or very few) packages, with require rewritten to some "lightweight shim", also at compile time.
Googling turned up the following "new" tools that should be able to do it:
http://mixu.net/gluejs/
https://github.com/jrburke/almond
https://github.com/component/builder2.js
And the already mentioned browserify should also fit quite well - http://esa-matti.suuronen.org/blog/2013/04/15/asynchronous-module-loading-with-browserify/
What are the module systems all about?
Older Stack Overflow explanation - Relation between CommonJS, AMD and RequireJS?
A detailed discussion of the various module frameworks and the require() they need is in Addy Osmani's Writing Modular JavaScript With AMD, CommonJS & ES Harmony.
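For orientation, the two module styles those articles compare look roughly like this (illustrative code, not from any particular library):
// CommonJS style (Node, Browserify): synchronous require, exports object
var MyClass = require('./js/myclass');
module.exports = { create: function () { return new MyClass(); } };

// AMD style (RequireJS, almond): asynchronous define with a dependency list
define(['js/myclass'], function (MyClass) {
    return { create: function () { return new MyClass(); } };
});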
You can create elements in the DOM, which load the scripts.
Like so:
var myScript = document.createElement('script'); // Create a new script element
myScript.type = 'text/javascript';               // Set the appropriate type
myScript.src = './js/myclass.js';                // Point it at the JavaScript file
document.head.appendChild(myScript);             // Append it so the browser fetches and runs it (asynchronously)
Simply use Browserify, which is something like a compiler that processes your files before they go into production and packs them into bundles.
Say you have a main.js file that requires the files of your project: when you run Browserify on it, it processes everything and creates a bundle with all your files, allowing require calls to be used synchronously in the browser without HTTP requests and with very little overhead in performance and bundle size.
See the link for more info: http://browserify.org/
Some answers already - but I would like to point you to YUI3 and its on-demand module loading. It works on both server (node.js) and client, too - I have a demo website using the exact same JS code running on either client or server to build the pages, but that's another topic.
YUI3: http://developer.yahoo.com/yui/3/
Videos: http://developer.yahoo.com/yui/theater/
Example:
(precondition: the basic YUI3 functions in 7k yui.js have been loaded)
YUI({
    // configuration for the loader
}).use('node', 'io', 'own-app-module1', function (Y) {
    // sandboxed application code
    // ...

    // If you already have a "Y" instance you can use that instead
    // of creating a new (sandbox) Y:
    //     Y.use('moduleX', 'moduleY', function (Y) {
    //     });
    // difference to YUI().use(): uses the existing "Y" sandbox
});
This code loads the YUI3 modules "node" and "io", and the module "own-app-module1", and then the callback function is run. A new sandbox "Y" with all the YUI3 and own-app-module1 functions is created. Nothing appears in the global namespace. The loading of the modules (.js files) is handled by the YUI3 loader. It also uses (optional, not shown here) configuration to select a -debug or -min(ified) version of the modules to load.
Here's a solution that takes a very different approach: package up all the modules into a JSON object and require modules by reading and executing the file content without additional requests.
https://github.com/STRd6/require/blob/master/main.coffee.md
STRd6/require depends on having a JSON package available at runtime. The require function is generated for that package. The package contains all the files your app could require. No further HTTP requests are made because the package bundles all dependencies. This is as close as one can get to the Node.js-style require on the client.
The structure of the package is as follows:
entryPoint: "main"
distribution:
main:
content: "alert(\"It worked!\")"
...
dependencies:
<name>: <a package>
Unlike Node, a package doesn't know its external name. It is up to the package including the dependency to name it. This provides complete encapsulation.
Given all that setup, here's a function that loads a file from within a package:
loadModule = (pkg, path) ->
  unless (file = pkg.distribution[path])
    throw "Could not find file at #{path} in #{pkg.name}"

  program = file.content
  dirname = path.split(fileSeparator)[0...-1].join(fileSeparator)

  module =
    path: dirname
    exports: {}

  context =
    require: generateRequireFn(pkg, module)
    global: global
    module: module
    exports: module.exports
    PACKAGE: pkg
    __filename: path
    __dirname: dirname

  args = Object.keys(context)
  values = args.map (name) -> context[name]

  Function(args..., program).apply(module, values)

  return module
This external context provides some variables that modules have access to. A require function is exposed to modules so they may require other modules. Additional properties, such as a reference to the global object and some metadata, are also exposed. Finally, we execute the program within the module and the given context.
This answer will be most helpful to those who wish to have a synchronous node.js style require statement in the browser and are not interested in remote script loading solutions.
I find that the component project gives a much more streamlined workflow than other solutions (including require.js), so I'd advise checking out https://github.com/component/component . I know this is a bit of a late answer, but it may be useful to someone.
Here's a lightweight way to use require and exports in your web client. It's a simple wrapper that creates a "namespace" global variable, and you wrap your CommonJS-compatible code in a "define" function like this:
namespace.lookup('org.mydomain.mymodule').define(function (exports, require) {
var extern = require('org.other.module');
exports.foo = function foo() { ... };
});
More docs here:
https://github.com/mckoss/namespace
The clientside-require library provides an asynchronous load() function that can be used to load any JS file or NPM module (which uses module.exports), any .css file, any .json, any .html, and any other file as text.
e.g.,
npm install clientside-require --save
<script src="/node_modules/clientside-require/dist/bundle.js"></script>
<script>
    load('color-name') // an npm module
        .then(color_name => {
            console.log(color_name.blue); // outputs [0, 0, 255]
        })
</script>
A really cool part of this project is that inside of any load()ed script, you can use the synchronous require() function the same way you would expect in node.js!
e.g.,
load('/path/to/functionality.js')
and inside /path/to/functionality.js:
var qs = require("qs") // an npm module

module.exports = function (name) {
    return qs.stringify({
        name: name,
        time: new Date()
    })
}
That last part, implementing the synchronous require() method, is what enables it to utilize NPM packages built to run on the server.
This module was designed to implement the require functionality as closely as possible in the browser. Disclaimer: I wrote this module.
Yes, it is very easy to use, but you need to load the JavaScript file in the browser with a script tag:
<script src="module.js"></script>
and then use it in a JS file like:
var myModule = require('./module');
I am making an app using Electron and it works as expected.
