Starting with RequireJS, communication between modules - javascript

I am an ActionScript 3 developer taking his first steps in building a large-scale JavaScript app.
So I understand modules and understand that AMD is a good pattern to use. I read about RequireJS and implemented it. However, what I still don't understand is how to achieve cross-module communication. I understand that there should be some kind of mediator...
I read articles and posts and still couldn't understand how to implement it simply.
Here is my code, simplified:
main.js
require(["Player", "AssetsManager"], function (player, manager) {
player.loadXML();
});
Player.js
define(function () {
    function parseXml(xml) {
        // NOW HERE IS THE PROBLEM -- how do I call AssetsManager from here???
        AssetsManager.queueDownload($(xml).find("prop").text());
    }
    return {
        loadXML: function () {
            //FUNCTION TO LOAD THE XML HERE, WHEN LOADED CALL parseXml(xml)
        }
    }
});
AssetsManager.js
define(function () {
    var arrDownloadQueue = [];
    return {
        queueDownload: function(path) {
            arrDownloadQueue.push(path);
        }
    }
});
Any "for dummies" help will be appreciated :)
Thank you.

To load modules from other modules that you define(), you simply set the first parameter of define to an array containing your module names. So let's say, in your code, you wanted to load Player.js into AssetsManager.js: you would simply include the string Player in that array.
This works because define's implementation is essentially equivalent to require, except that the callback passed to define is expected to return a value, and the result is registered as a "module" in the list of dependencies that other code can load.
AssetsManager.js
define(['Player'], function (player) {
    //... Your code.
});
However, if I can add to it, I personally prefer using require inside the callback passed to define to grab the dependency I want, instead of relying on the parameters passed to the callback.
So here's my suggestion:
define(['Player'], function () {
    var player = require('Player');
});
And this is because it's much more in tune with CommonJS.
And this is how main.js would look, formatted to be more CommonJS-friendly:
require(["Player", "AssetsManager"], function () {
var player = require('Player');
var manager = require('AssetsManager');
player.loadXML();
});
But the CommonJS way of doing things is just a personal preference. My rationale is that the order in which the dependency names appear in the array might change at any time, and I wouldn't want to have to step through both the array and the parameter list to keep them in sync.
Another rationale of mine (though it's just pedantic) is that I come from the world of Node.js, where modules are loaded via require().
But it's up to you.
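For completeness, RequireJS also supports the "simplified CommonJS wrapper", where you omit the dependency array entirely and the loader scans the factory function for require() calls. A minimal sketch of Player.js written that way; the module body and the queued path are purely illustrative, not the asker's actual loading code:

define(function (require, exports, module) {
    // The loader parses this factory, sees the require('AssetsManager') call,
    // and loads that module before running the function.
    var assetsManager = require('AssetsManager');

    exports.loadXML = function () {
        // load the XML, then hand paths to the assets manager
        assetsManager.queueDownload('assets/prop.png'); // hypothetical path
    };
});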

(This would be a reply to skizeey's answer, but I don't have enough reputation for that)
Another way of solving this problem, without pulling in Player's AssetsManager dependency via require, is to pass around the AssetsManager instance that main.js already has. One way of accomplishing this is to make Player's loadXML function accept an AssetsManager parameter, which then gets passed on to parseXml. Another is to give Player a variable that holds an AssetsManager, read by parseXml; it could be set directly, or through a function that stores the instance, called say setAssetManager. This latter way has an extra consideration, though: you then need to handle the case where that variable has not been set before loadXML is called. This concept is generally called "dependency injection" (see the sketch below).
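A minimal sketch of the first variant, where main.js injects the manager it already required; the module names mirror the question's code, and the XML-loading part stays a placeholder:

// Player.js
define(function () {
    function parseXml(xml, assetsManager) {
        assetsManager.queueDownload($(xml).find("prop").text());
    }
    return {
        loadXML: function (assetsManager) {
            // load the XML here; when loaded, forward the injected manager:
            // parseXml(xml, assetsManager);
        }
    };
});

// main.js
require(["Player", "AssetsManager"], function (player, manager) {
    player.loadXML(manager); // inject the dependency
});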
To be clear, I'm not advising this over using AMD to load it in. I just wanted to provide you with more options; perhaps this technique will come in handy when solving another problem, or may help somebody else. :)

Related

JavaScript - how to use Setters and Getters in to create a listener

Before this gets marked as a duplicate: I have read posts all day about this so I know there are tons of similar questions on SO but none that I've seen so far go into the details that I need to understand.
Having said that, there are no good commented examples of how the process works. Could someone answer the following question with a well-commented example so I could finally understand this ability?
I have a function that I want to call in one file, but I need to make sure that another event in another file has already happened before I call it. These files have no connection (one is an Angular 2 TypeScript file that starts the app, and the other is a JS file that manages a Hopscotch tour). I understand that I will need to use a global variable, and I believe that the best solution I've read about involves using setters and getters. All the examples I've seen seem to assume that it's just intuitive and leave out the part where I get to understand how it's working. Maybe it is intuitive, but I'm not making the leap yet.
Global variable in TypeScript file:
global_variable = false;
Function I want to call in JavaScript file based on the listener:
function call_if_other_function_finishes() {
    if (global_variable === true) { // I have the global already created
        // run hopscotch tour
    }
} // how do I turn this into a listener?
The function I need to have finished first in TypeScript file:
function someFunction() {
    // run its code
    GlobalFile.global_variable = true; // Should trigger the listener.
}
Thanks in advance!!
One solution is to just define the function as you do in your example and then run it when you need it:
function someFunction() {
    // run its code
    call_if_other_function_finishes() // it's globally defined anyway
}
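If you do want the setter/getter approach the question mentions, here is a minimal sketch using an accessor property; the GlobalFile name and the flag come from the question, while onReady and the rest are illustrative:

var GlobalFile = (function () {
    var value = false;   // backing store for the flag
    var listeners = [];  // callbacks to run when it flips to true
    return {
        get global_variable() {
            return value;
        },
        set global_variable(v) {
            value = v;
            if (v === true) {
                listeners.forEach(function (fn) { fn(); });
            }
        },
        onReady: function (fn) { // hypothetical helper to register a listener
            listeners.push(fn);
        }
    };
})();

// in the hopscotch file:
GlobalFile.onReady(call_if_other_function_finishes);

// in the TypeScript file, someFunction() keeps its existing line:
// GlobalFile.global_variable = true;  // now triggers the listeners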

RequireJS define Browser Race Condition

We have a client-side plug-in framework that is constructed of modules (AMD) and utilizes require.js. In this framework we expose a public object that consists of configuration properties and common framework functionality. All of the required functionality for the public object is contained in one file (albeit separated into modules); the only file required by the end-user to add to their page.
The issue we are seeing is most prevalent in Safari but also shows itself occasionally in IE and Chrome. 100% of the time in Safari with an empty cache we encounter a race condition. Consider this example client code which is in the body of the client’s page.
<script type="text/javascript">
    Me.subscribe('someEvent', someHandler);
</script>
'Me' is always available to the page, as it is global and defined outside of any define call. However, 'Me.subscribe' is wrapped in 'define' and comes back 'undefined' under the conditions I stated above.
We can’t tell the client to use any third-party frameworks to work around this issue. The code block above must stay exactly as it is.
I’ve been playing with the idea of allowing certain public function binding to be deferred without any additional work required by the client. So far, this is what I’m considering adding to the framework:
Me.deferred = function (fn, name) {
    if (fn) return fn;  // already defined -- use the real implementation
    fn = this;          // otherwise capture 'Me' and defer the call
    return function () {
        var args = Array.prototype.slice.call(arguments);
        setTimeout(function () {
            fn[name].apply(this, args); // by now the real fn[name] should exist
        }, 0);
    };
};
Then, in the framework near the top, I can add items I want deferred like this:
Me.subscribe = Me.deferred(Me.subscribe,'subscribe');
My questions are these: Am I missing something that is already out there? Is there an existing pattern that I am not aware of to handle this exact case? Is this just a bad idea in general?
If possible, make sure the client puts RequireJS and all its dependencies in the head. If that is not possible, 'Me' can include an on-demand call which executes on creation.
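A common pattern for this situation is a pre-init command queue: a tiny synchronous stub buffers early calls, and the real module replays them once it is defined. A rough sketch, assuming an internal 'pubsub' module (the name is hypothetical):

// Stub placed in the synchronous part of the framework file, before any define():
var Me = window.Me || {};
Me._queue = [];
Me.subscribe = function () {
    Me._queue.push(arguments);  // buffer calls made too early
};

// Later, inside the module that provides the real implementation:
define(['pubsub'], function (pubsub) {
    var queued = Me._queue;
    Me.subscribe = function (eventName, handler) {
        pubsub.on(eventName, handler);  // 'pubsub' is a hypothetical internal module
    };
    // replay anything the page queued before we were ready
    for (var i = 0; i < queued.length; i++) {
        Me.subscribe.apply(Me, queued[i]);
    }
});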

How to manage dependencies in JavaScript?

I have scripts that need to wait for certain conditions to be met before they run - for example, waiting for another script to be loaded, or waiting for a data object to be created.
How can I manage such dependencies? The only way I can think of is to use setTimeout to loop in short intervals and check the existence of functions or objects. Is there a better way?
And if setTimeout is the only choice, what is a reasonable time interval to poll my page? 50 ms, 100 ms?
[Edit] Some of my scripts collect data, either from the page itself or from web services, sometimes from a combination of multiple sources. The data can be ready at any time, either before or after the page has loaded. Other scripts render the data (for example to build charts).
[Update] Thanks for the useful answers. I agree that I shouldn't reinvent the wheel, but if I use a library, I'd at least like to understand the logic behind it (is it just a fancy timeout?) to try to anticipate the performance impact on my page.
You could have a function call like loaded(xyz); at the end of each script that is being loaded. This function would be defined elsewhere and set up to call registered callbacks based on the value of xyz. xyz can be anything: a simple string to identify the script, or a complex object or function, or whatever.
Or just use jQuery.getScript(url [, success(data, textStatus)] ).
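A minimal sketch of that loaded()/registration idea, assuming scripts identify themselves by a simple string (all names here are illustrative):

var loaded = (function () {
    var callbacks = {};  // script id -> callbacks waiting on it
    var done = {};       // script ids that have already announced themselves
    function loaded(id) {
        done[id] = true;
        (callbacks[id] || []).forEach(function (fn) { fn(); });
        callbacks[id] = [];
    }
    loaded.when = function (id, fn) {
        if (done[id]) { fn(); }  // already loaded: run immediately
        else { (callbacks[id] = callbacks[id] || []).push(fn); }
    };
    return loaded;
})();

// at the end of charts.js:
// loaded('charts');

// elsewhere, waiting for it:
// loaded.when('charts', function () { drawCharts(); });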
For scripts that have dependencies on each other, use a module system like RequireJS.
For loading data remotely, use a callback, e.g.
$.get("/some/data", "json").then(function (data) {
// now i've got my data; no polling needed.
});
Here's an example of these two in combination:
// renderer.js
define(function (require, exports, module) {
    exports.render = function (data, element) {
        // obviously more sophisticated in the real world.
        element.innerText = JSON.stringify(data);
    };
});
// main.js
define(function (require, exports, module) {
    var renderer = require("./renderer");
    $(function () {
        var elToRenderInto = document.getElementById("render-here");
        $("#fetch-and-render-button").on("click", function () {
            $.getJSON("/some/data").then(function (data) {
                renderer.render(data, elToRenderInto);
            });
        });
    });
});
There are many frameworks for this kind of thing.
I'm using Backbone at the moment http://documentcloud.github.com/backbone/
Friends have also recommended knockout.js http://knockoutjs.com/
Both of these use an MVC pattern to update views once data has been loaded by a model.
[update] I think at their most basic level these libraries are using callback functions and event listeners to update the various parts of the page.
e.g.
model1.loadData = function () {
    var self = this;  // $.get's callback is not invoked with the model as 'this'
    $.get('http://example.com/model1', function (response) {
        self.save(response);
        self.emit('change');
    });
};
model1.bind('change', view1.update);
model1.bind('change', view2.update);
I've used pxLoader, a JavaScript Preloader, which works pretty well. It uses 100ms polling by default.
I wouldn't bother reinventing the wheel here unless you need something really custom, so give that (or any JavaScript preloader library really) a look.
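At its most basic, this kind of polling really is just a timeout loop around a condition check. A minimal sketch, with the interval and names purely illustrative:

function waitFor(condition, callback, interval) {
    (function poll() {
        if (condition()) {
            callback();                          // dependency is ready
        } else {
            setTimeout(poll, interval || 100);   // check again in ~100 ms
        }
    })();
}

// example: wait until some global data object exists (drawCharts is hypothetical)
waitFor(function () { return window.chartData !== undefined; },
        function () { drawCharts(window.chartData); });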

How to create AJAX semi-synchronous behaviour

I spent the better part of last month beating my head against the wall before I came up with an easy way to dynamically load and chain together HTML canvas classes which are stored on the server but, obviously, initialized on the client (harder than it sounds when the ordering is important in an asynchronous environment).
I was wondering if someone could help me find a way to load simple JavaScript scripts. Let's define a load('foo.js') function which instructs the client to load the script foo.js from the server and execute it as JavaScript code.
Given the three files, stored on the server:
A.js
a = 10;
B.js
load('A.js');
b = a + 10;
C.js
load('B.js');
c = b + 10;
If the client issues the command load('C.js');, what's the easiest/most reliable way to implement this? One idea I had was to scan the code server-side and return all the scripts at once. This requires the minimal number of PHP requests. However, if the client has already requested C.js before, the script should already exist client-side, and this would be inefficient, especially if C.js and all its dependent files are large. Another option I considered was to wrap each of these server-side scripts in an object, like so for C.js above:
{
    dependencies: ['B.js'],
    code: 'c = b + 10;'
}
I just don't know how to 'pause' execution of script C.js after the load('B.js') statement, and then resume it after B.js has been loaded.
EDIT Thanks to redsqaure for suggesting yepnope and RequireJS. Unfortunately, I do not like them for several reasons. For one, RequireJS is difficult (I am sure I will come under criticism for this one). My main gripe is that if it is so difficult to learn, I might as well recreate it myself, learning it in the process AND having greater control over it. Second, it requires you to change your style of writing. Switching to Dojo and having to use dojo.declare("ClassName", [ParentA, ParentB], {...}); to declare classes is one thing, but wrapping every snippet of code in require(['A','B',...], function(){}); is another. Finally, I don't know how simple it will be to instruct it where to look for files. I want the user to be able to define a 'PATH' variable server-side and have the search occur in each of the folders/subfolders of that 'PATH'.
Depends on how optimized you want it to be. Either you can go the route of synchronous XHR or use a callback (async and recommended). If you were to go the second route your code would look something like:
// Say C.js is dependent on A.js and B.js...
load(["A.js", "B.js"], function() {
    // code goes here
});
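For the first route (synchronous XHR), a minimal sketch might look like the following; note that synchronous requests block the page and are discouraged, and the loadSync name is just illustrative:

function loadSync(url) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url, false);   // false = synchronous request
    xhr.send(null);
    if (xhr.status === 200) {
        window.eval(xhr.responseText);  // execute the fetched script in global scope
    }
}

// loadSync("A.js"); loadSync("B.js"); // resolves in order, at the cost of blocking the page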
EDIT
Taking a second look after your feedback, what you want is somewhat possible, but it would be brittle and hard to write in JavaScript. Below I have a sample/untested implementation of a dependency loader where a file can have at most one call to load("file.js"). This would be more complex for multiple possible dependencies. Also, I'm assuming these files are coming from the same domain.
// Usage: load("A.js")
// A.js -> B.js -> C.js
window.load = (function () {
    var loaded = {};
    return function (str, /* internally used */ callback) {
        if (!loaded[str]) {
            loaded[str] = true;
            $.get(str, function (data) {
                var matches = data.match(/load\(['"](.*)['"]\)/);
                if (matches && matches.length > 1) { // has a dependency
                    window.load(matches[1], function () {
                        window.eval(data);
                        if (callback) callback();
                    });
                } else { // no dependencies
                    window.eval(data);
                    if (callback) callback();
                }
            });
        } else if (callback) {
            callback(); // already loaded
        }
    };
})();
Why not look into a script loader like yepnope.js or require.js?

How to isolate different javascript libraries on the same page?

Suppose we need to embed a widget in a third-party page. This widget might use jQuery, for instance, so the widget carries a jQuery library with it.
Suppose the third-party page also uses jQuery, but a different version.
How do we prevent a clash between them when embedding the widget? jQuery.noConflict is not an option, because that method has to be called for the first jQuery library loaded on the page, which means the third-party website would have to call it. The idea is that the third-party site should not have to amend anything or do anything beyond putting a tag with a src pointing to the widget in order to use it.
Also, this is not a problem with jQuery in particular; the Google Closure library (even compiled) could be taken as another example.
What solutions exist to isolate different JavaScript libraries, aside from the obvious iframe?
Maybe loading the JavaScript as a string and then evaluating it (using Function('code to eval'), not eval('code to eval')) inside an anonymous function might do the trick?
Actually, I think jQuery.noConflict is precisely what you want to use. If I understand its implementation correctly, your code should look like this:
(function () {
    var my$;
    // your copy of the minified jQuery source
    my$ = jQuery.noConflict(true);
    // your widget code, which should use my$ instead of $
}());
The call to noConflict will restore the global jQuery and $ objects to their former values.
Function(...) performs an eval inside your function; it isn't any better.
Why not use an iframe? They provide default sandboxing for third-party content.
And for friendly ones you can share text data between them and your page, using parent.postMessage for modern browsers or the window.name hack for older ones.
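A minimal sketch of the parent.postMessage idea mentioned above; the origins and message shape are illustrative:

// inside the widget iframe: send data up to the embedding page
parent.postMessage(JSON.stringify({ type: "widget-ready", height: 240 }),
                   "https://example-host-page.com"); // hypothetical host origin

// in the host page: listen for messages coming from the widget frame
window.addEventListener("message", function (event) {
    if (event.origin !== "https://example-widget.com") return; // hypothetical widget origin
    var data = JSON.parse(event.data);
    if (data.type === "widget-ready") {
        // e.g. resize the iframe to fit the widget
    }
}, false);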
I built a library to solve this very problem. I am not sure if it will help you of course, because the code still has to be aware of the problem and use the library in the first place, so it will help only if you are able to change your code to use the library.
The library in question is called Packages JS and can be downloaded and used for free as it is Open Source under a Creative Commons license.
It basically works by packaging code inside functions. From those functions you export those objects you want to expose to other packages. In the consumer packages you import these objects into your local namespace. It doesn't matter if someone else or indeed even you yourself use the same name multiple times because you can resolve the ambiguity.
Here is an example:
(file example/greeting.js)
Package("example.greeting", function() {
// Create a function hello...
function hello() {
return "Hello world!";
};
// ...then export it for use by other packages
Export(hello);
// You need to supply a name for anonymous functions...
Export("goodbye", function() {
return "Goodbye cruel world!";
});
});
(file example/ambiguity.js)
Package("example.ambiguity", function() {
// functions hello and goodbye are also in example.greeting, making it ambiguous which
// one is intended when using the unqualified name.
function hello() {
return "Hello ambiguity!";
};
function goodbye() {
return "Goodbye ambiguity!";
};
// export for use by other packages
Export(hello);
Export(goodbye);
});
(file example/ambiguitytest.js)
Package("example.ambiguitytest", ["example.ambiguity", "example.greeting"], function(hello, log) {
// Which hello did we get? The one from example.ambiguity or from example.greeting?
log().info(hello());
// We will get the first one found, so the one from example.ambiguity in this case.
// Use fully qualified names to resolve any ambiguities.
var goodbye1 = Import("example.greeting.goodbye");
var goodbye2 = Import("example.ambiguity.goodbye");
log().info(goodbye1());
log().info(goodbye2());
});
example/ambiguitytest.js uses two libraries that both export a function goodbye, but it can explicitly import the correct ones and assign them to local aliases to disambiguate between them.
To use jQuery in this way would mean 'packaging' jQuery by wrapping its code in a call to Package and Exporting the objects that it currently exposes to the global scope. It means changing the library a bit, which may not be what you want, but alas there is no way around that that I can see without resorting to iframes.
I am planning on including 'packaged' versions of popular libraries along in the download and jQuery is definitely on the list, but at the moment I only have a packaged version of Sizzle, jQuery's selector engine.
Instead of looking for methods like noConflict, you can simply reference jQuery by its full Google-hosted (CDN) URL so that it can work in the application.
<script src="myjquery.min.js"></script>
<script>window.myjQuery = window.jQuery.noConflict();</script>
...
<script src='...'></script> //another widget using an old versioned jquery
<script>
(function($){
//...
//now you can access your own jquery here, without conflict
})(window.myjQuery);
delete window.myjQuery;
</script>
Most important points:
Call the jQuery.noConflict() method IMMEDIATELY AFTER your own jQuery and related plugin tags.
Store the resulting jQuery in a global variable, with a name that has little chance to conflict or confuse.
Load your widget using the old versioned jQuery.
Then follow with your logic code, using a closure to obtain a private $ for convenience. The private $ will not conflict with other jQuerys.
Don't forget to delete the global temp variable afterwards.
