RequireJS, AccountingJs - passing global config into accounting - javascript

I'm quite new to RequireJS and I'm having an issue with setting global configuration for a module.
I am using accounting.js and want to modify its settings globally; in this case I want to change the currency symbol from $ to £.
Without RequireJS you would simply do something like this, as accounting would be in the global namespace:
accounting.settings = $.extend(accounting.settings, {
currency: { symbol: '\u00A3 '}
});
accounting.js is AMD-compliant and works perfectly with require, but I can't figure out a way of passing the config into it globally rather than in every module that uses it.
I have seen the config setting in the require docs here, and I can set the config there, but accounting.js doesn't pick it up (it isn't coded to!).
My question is: how can I set configuration like this for an AMD-compliant module globally within the page?
I can see a few options:
Edit accounting.js to look at module.config() and load any config it sees. I have tried this and it works as expected, but I don't really want a custom build.
Use shim config and the init callback. I haven't got this to work (maybe because the module is already AMD-compliant).
Create a new module that wraps accounting.js in another define, apply the config there, and use this module in each page. I haven't tried this, but I guess it would work.
What I really want is a way of globally applying config to an existing module from the require config. Is this possible?

If the AMD module is not designed to use module.config, then you can't force it to use it. The solution you mention last is the most robust: create a wrapper module that configures the actual module as you want. This wrapper can use module.config to grab values. This solution is likely to work with RequireJS now and for quite a long time since you're using API features that are well documented and central to RequireJS' functionality.
As for a shim, I don't recall the RequireJS docs ever suggesting a shim for a module that is already designed to work with AMD loaders. So if using a shim worked, it would be by happenstance rather than by design.
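That wrapper can be as small as a shallow merge of module.config() values into accounting.settings. Below is a rough sketch of the idea; the injected stand-ins are hypothetical values (in a real app the loader supplies them), and the merge mirrors the $.extend call from the question:

```javascript
// Stand-ins for what RequireJS would inject into the wrapper module
// (hypothetical values, for illustration only):
var amdModule = {
    config: function () {
        return { currency: { symbol: '\u00A3 ' } };
    }
};
var accounting = {
    settings: { currency: { symbol: '$' }, number: { precision: 0 } }
};

// The body of the wrapper, i.e. what you would pass to
// define(['module', 'accounting'], function (module, accounting) { ... });
function configureAccounting(amdModule, accounting) {
    var overrides = amdModule.config();
    // shallow merge, mirroring $.extend(accounting.settings, overrides)
    for (var key in overrides) {
        accounting.settings[key] = overrides[key];
    }
    return accounting;
}

var configured = configureAccounting(amdModule, accounting);
```

Pages would then depend on the wrapper instead of 'accounting', and supply the values through the config section of requirejs.config, keyed by whatever name you give the wrapper module.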

Related

Understanding the Communication between Modules in jQuery Source Code Structure [duplicate]

Uncompressed jQuery file: http://code.jquery.com/jquery-2.0.3.js
jQuery Source code: https://github.com/jquery/jquery/blob/master/src/core.js
What are they doing to make it seem like the final output is not using Require.js under the hood? Require.js examples tell you to insert the entire library into your code to make it work standalone as a single file.
Almond.js, a smaller version of Require.js, also tells you to insert itself into your code to get a standalone JavaScript file.
When minified I don't mind the extra bloat; it's only a few extra kilobytes (for almond.js). But unminified it is barely readable: I have to scroll all the way down, past the almond.js code, to see my application logic.
Question
How can I make my code similar to jQuery's, so that the final output does not look like a Frankenweenie?
Short answer:
You have to create your own custom build procedure.
Long answer:
jQuery's build procedure works only because jQuery defines its modules according to a pattern that allows a convert function to transform the source into a distributed file that does not use define. If anyone wants to replicate what jQuery does, there's no shortcut: 1) the modules have to be designed according to a pattern which will allow stripping out the define calls, and 2) you have to have a custom conversion function. That's what jQuery does. The entire logic that combines the jQuery modules into one file is in build/tasks/build.js.
This file defines a custom configuration that it passes to r.js. The important options are:
out, which is set to "dist/jquery.js". This is the single file produced by the optimization.
wrap.startFile, which is set to "src/intro.js". This file will be prepended to dist/jquery.js.
wrap.endFile, which is set to "src/outro.js". This file will be appended to dist/jquery.js.
onBuildWrite, which is set to convert. This is a custom function.
The convert function is called every time r.js wants to output a module into the final output file. The output of that function is what r.js writes to the final file. It does the following:
If a module is from the var/ directory, the module will be transformed as follows. Let's take the case of src/var/toString.js:
define([
    "./class2type"
], function( class2type ) {
    return class2type.toString;
});
It will become:
var toString = class2type.toString;
Otherwise, the define(...) call is replaced with the contents of the callback passed to define, the final return statement is stripped, and any assignments to exports are removed.
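The var/ transformation above can be sketched as a toy onBuildWrite hook. This is not jQuery's actual convert function, just a minimal regex-based illustration of that one case:

```javascript
// Toy version of an r.js onBuildWrite hook: rewrite a src/var/ module
// into a plain `var` assignment, as described above.
function convert(name, path, contents) {
    if (path.indexOf('/var/') !== -1) {
        var varName = name.split('/').pop();
        contents = contents
            // drop everything from define( up to the factory's return
            .replace(/define\([\s\S]*?return/, 'var ' + varName + ' =')
            // drop the closing `});` of the define call
            .replace(/\}\s*\);?\s*$/, '');
    }
    return contents;
}
```

Running it over the toString.js source above yields the single line var toString = class2type.toString; while non-var/ modules pass through untouched.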
I've omitted details that do not specifically pertain to your question.
You can use a tool called AMDClean by gfranko: https://www.npmjs.org/package/amdclean
It's much simpler than what jQuery is doing, and you can set it up quickly.
All you need to do is create a very abstract module (the one that you want to expose to the global scope) and include all your sub-modules in it.
Another alternative that I've recently been using is Browserify. You can export/import your modules the Node.js way and use them in any browser. You need to compile them before use. It also has gulp and grunt plugins for setting up a workflow. For better explanations, read the documentation on browserify.org.

Encapsulating requireJS and its modules

I'd like to use something like requireJS to modularise my javascript app, however, there's one particular requirement I'd like to meet that I haven't figured out:
I'm happy enough to have require and define in the global space, however, I don't want modules that I import with require to be accessible globally (i.e. they should be accessible to my app, but not other apps running on the same page).
It seems to me that if I call define('moduleA', function() { ... }); I can then access that module - globally - via the require function. It may not be occupying a variable in the global space itself, or be attached to window, but it still feels bad, because other apps really shouldn't be able to see my internal modules (not to mention potential naming conflicts, etc, can I use contexts to circumvent this?).
This seems to be a step back from just namespacing my modules and including them all inside of one big privatising function at build time.
I could have my own private version of require but my modules (being in different files) wouldn't be able to access define.
Am I missing something or do I just have to live with this? (or alternatively, run an optimizer to bake everything into one file - feels like I could just namespace my modules and not bother with requireJS at all if I do that though).
In your r.js build config, add wrap: true and you will no longer pollute the global scope with the require or define functions.
Edit:
You can also specify a namespace for your require and define with the namespace setting in the r.js build config. You can read more about namespacing in RequireJS's FAQ. From the example.build.js comments:
// Allows namespacing requirejs, require and define calls to a new name.
// This allows stronger assurances of getting a module space that will
// not interfere with others using a define/require AMD-based module
// system. The example below will rename define() calls to foo.define().
// See http://requirejs.org/docs/faq-advanced.html#rename for a more
// complete example.
namespace: 'foo',
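Putting the two suggestions together, a minimal r.js build profile might look like the sketch below; the file names and the entry module id are placeholders for your own project layout:

```javascript
// build.js: hypothetical r.js build profile combining both options
({
    baseUrl: 'src',
    name: 'main',            // your application's entry module (assumed)
    out: 'dist/app.js',
    // wrap the bundle in an IIFE so nothing leaks into the global scope
    wrap: true,
    // rename requirejs/require/define calls to foo.* inside the bundle
    namespace: 'foo'
})
```

With namespace set, other scripts on the page can still use a plain global require/define without seeing your modules.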

Dynamically resolve paths with RequireJS

Is there a way to dynamically resolve paths with RequireJS? For example, is it possible to configure it so that, if I do
require(['mymodule'], function(mod){ });
some sort of function of mine will be called, with "mymodule" being passed as a parameter, and with the value I return therefrom being used by Require as the path to use for mymodule?
I do understand that Require has some wonderful convention over configuration with respect to path resolution, and I also understand that paths can be manually configured. But right now I'm trying to add in RequireJS to an old project that was written without Require in mind, so I'm trying to see what all of my options are.
You might be best served by implementing a loader plugin.
Not sure if it's a problem for you, but this would mean your require syntax would turn into something like this:
require(['myplugin!mymodule'], function(mod){ });
The specific method you would use is normalize:
normalize is called to normalize the name used to identify a resource. Some resources could use relative paths and need to be normalized to the full path. normalize is called with the resource name and a function for normalizing relative module ids.
EDIT: I see there's a replace plugin listed on the plugins wiki which sounds similar to what you are trying to do. It uses only the load method, so apparently what I said above about the normalize method is not a blanket rule.
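To make the idea concrete, here is the body of a hypothetical loader plugin; the plugin name, the mapping table, and the file paths are all made up for illustration. In a real plugin this object literal would be passed to define(), and you would load modules as require(['legacy!mymodule'], ...):

```javascript
// Object a hypothetical 'legacy' loader plugin would hand to define().
var legacyPlugin = {
    // rewrite the requested id before RequireJS resolves it
    normalize: function (name, normalize) {
        var legacyPaths = { mymodule: 'js/old/mymodule' }; // assumed mapping
        return legacyPaths[name] || normalize(name);
    },
    // delegate the actual loading to the normal machinery, using the
    // already-normalized id
    load: function (name, req, onload) {
        req([name], onload);
    }
};
```

Ids the plugin knows about get rewritten; everything else falls through to RequireJS's default resolution.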
If the path is truly dynamic, this won't help, but if you just need to modify how the legacy script is returned back to your modules (e.g. taking two different globals and putting them under a different top-level global), you might also think about using the init hook in the shim config option
EDIT 2: some related hacks are posted in this question: Configuring RequireJS to load from multiple CDNs

Adding a package to dojo config at runtime

Is there a way I can add a new package to dojo config? I know I can do this: Add packages when dojo.js loads.
<script src='dojo_1.7.2/dojo/dojo.js'
data-dojo-config="async:true,isDebug:true,parseOnLoad:false,
packages:[{name:'project1',location:'../../js/proj1'},
{name:'common',location:'../../common'}]"></script>
I want to be able to add new packages at runtime.
dojo.registerModulePath did this job prior to Dojo 1.6 (I think), but it is now deprecated in 1.7.
I am using dojo 1.7.2.
Thanks.
You can add extra packages after load by calling require with a config object.
Eg:
require({
    packages: [
        {"name": "myLib", "location": "release/myLib"}
    ]
});
This will, however, create another instance of Dojo, according to the documentation (dojo/_base/config). Also, this is version 1.8 code; I don't think it works with 1.7.
I thought it might be possible to push an extra object to dojoConfig or require.rawConfig, but these are not picked up by the loader. It appears that the config cannot be changed after load.
You can pass a config object to require, so:
Eg.
dojoConfig.packages.push({"name": "myLib", "location": "release/myLib"});
require(dojoConfig, [...moduleIds...], function(...arguments...) {
});
This will work for the individual require but will not modify the global config (and hence will not work in define() or subsequent calls to require()). Again, I'm using 1.8 here but I assume it works in 1.7.
There may be another, simpler way of making this work that someone else has found?
The solution by Stephen Simpson didn't seem to work right for me with Dojo v1.13. It ignored the given location and was still trying to load the files relative to the default basePath, despite the project path starting with a /. I got errors in the console, too.
But the documentation also mentions the paths parameter, which worked for me. In your case:
require({paths:{"project1": "../../js/proj1", …}});
It probably worked for you because you're using a relative path and I don't.
It used to be dojo.registerModulePath("myModule", "/path/goes/here");.

I don't understand how require.js handles the load paths. Do I need to use require.config every time I define a module?

I am learning about require.js and think I am just missing something. I don't understand how it loads files.
I have my jquery file in a lib directory.
This does not work:
It shows that jquery is being loaded in the Chrome network panel. The error is: Uncaught TypeError: undefined is not a function, so it is basically saying that $ is undefined.
require(['lib/jquery'],function($) {
$(document).ready(function(){
alert('hello');
});
});
This works:
require.config({
paths: {
jquery: 'lib/jquery'
}
});
require(['jquery'],function($) {
$(document).ready(function(){
alert('hello');
});
});
In other examples I see online you don't have to set the paths with require.config. Do I have to do this every time that I want to define a module? I know I am using require and not define in this case, but I am having the same issue with the define method. Every time I make a module using define I have to set the paths using require.config(). I think I am missing something here. Can anyone point me in the right direction?
This is a constraint on the AMD registration done by jQuery. It explicitly registers as a named module called 'jquery', so you must have a paths config entry for it, or, as in the case above, set baseUrl to 'lib', in which case you do not need the paths config.
Other libraries normally should use an anonymous module registration, so you should not need to do a paths config for every library. More details here.
Also, some libraries, like underscore, do not call define() directly, but you can get a level of support by using the shim config.
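As a concrete sketch of both points (the file layout here is an assumption about your project), a single config can resolve jQuery's named module via baseUrl and shim a non-AMD library like Underscore:

```javascript
// Hypothetical requirejs.config covering both situations described above.
requirejs.config({
    baseUrl: 'lib',   // lets jQuery's named 'jquery' module resolve to lib/jquery.js
    shim: {
        // underscore does not call define(), so declare its global export
        underscore: { exports: '_' }
    }
});

require(['jquery', 'underscore'], function ($, _) {
    // both modules are available here without per-library paths entries
});
```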
Update to reflect comments and James' answer:
You have two problems:
jQuery, for the reasons James outlines in his answer, requires that you either have paths set in your config, or that you, in the code you've outlined, set baseUrl to "lib".
you have to remember that you can't just load any old script with RequireJS. Only scripts that conform to the AMD standard can be loaded.
Having said this, I'd advise you to use require-jquery instead.
You'll probably end up using jQuery plugins that will assume jQuery is loaded on the page, and those won't work with the approach you're trying.
