Gradually move from including each JS file to module bundling - javascript

In our AngularJS application we currently have a lot of files (400+), which get included via <script> tags. The order of these files is something like this:
AngularJS script files
3rd party plugins / modules
business logic files
modules
services
controllers / components / directives
We would like to move to a better approach utilizing a module bundler and TypeScript. New files are already written in TypeScript but don't make use of import/export. In order to make things easier, we could convert every JavaScript file into a TypeScript file and fix the resulting errors in a feasible time.
However, before we do this, we would like to have a viable strategy on how we could gradually make use of import/export. I'm thinking of something like rewriting one module from time to time, starting with modules deep down in the dependency tree.
However, I have not been able to achieve this yet, but I'm quite sure that others have had to solve this before.

Once you decide on a module bundler, you'll need to learn about its facilities for (1) allowing other JavaScript on the page to access things defined in the bundle and (2) allowing the bundle to access things defined by other JavaScript on the page. (If you're able to migrate in strict dependency order, you might never need #2.) For example, for Webpack, #1 would be the library* output options and #2 would be externals. Then just move the code into modules little by little, adjusting the configuration as necessary so that each part of the code has access to the things it needs from the other part of the code.

Since Webpack only supports a single library export module, during the transition you may have to maintain a dummy module that just re-exports all modules that you need to access from code outside the bundle. This is a little tedious and represents extra work that you wouldn't have to do if you migrated all at once, but you may decide it's worth paying that cost in order to be able to migrate gradually.
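For illustration only, here is a minimal sketch of what such a Webpack setup might look like; the entry path, library name, externals and module paths below are made-up placeholders, not taken from your project:

// webpack.config.js (a rough sketch)
module.exports = {
    entry: "./src/bundle-entry.js",
    output: {
        filename: "bundle.js",
        library: "appBundle",      // (1) exposes the entry module's exports as window.appBundle
        libraryTarget: "var"
    },
    externals: {
        // (2) lets modules inside the bundle import things that still live as globals on the page
        angular: "angular"         // import angular from "angular" resolves to window.angular
    }
};

// src/bundle-entry.js (the "dummy" re-export module mentioned above; these paths are hypothetical)
export { userService } from "./services/user.service";
export { ordersModule } from "./modules/orders";

Code outside the bundle can then reach those pieces via window.appBundle.userService and so on, until everything has been migrated in.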
If you have issues getting type information from TypeScript module files in non-module TypeScript files, see this answer for a workaround.

Related

Import, Require? How to Mash Javascript Files Together?

This is vague - I apologize in advance, I am trying to be as succinct as I can with my limited understanding, while exposing my potentially incorrect assumptions.
I have a website that's literally one huge HTML file. It runs scripts defined inline in a <script> tag.
My goal is to move all the scripts into individual .js files and pull them into index.html (or into one another where required). I am familiar with the usage of Node's require and I am familiar with import, as used in Angular. I stress usage because I don't really know how they work.
Assumption 1: I cannot use require - it is purely for Node.js. The reason I bring it up is that I am pretty sure I have seen require in AngularJS 1.5 code, so making this assumption makes me uncomfortable. I am guessing that code was stitched together by Webpack, Gulp, or similar.
Assumption 2: I should use import, but import only works with a URL; if I have my .js hosted on a server or CDN, it will be pulled in. BUT I cannot give a local path (on the server) here - index.html will NOT automatically pull in the dependencies while being served. I need npm/Webpack/other tooling to pre-compile my index.html if I want the deps pulled in on the server.
Assumption 3: Pre-compiling into a single, ready-to-go HTML file is the desired way to build things, because the file can be served quickly (assuming it's ready to go). I make this assumption based on the recent trend of serving Markdown from CDNs, the appearance of the JAMstack, and the number of sites using Jekyll and the like (though admittedly those are traditional Jekyll sites).
Assumption 4: I also want to move to TypeScript eventually, but I assume that changes nothing, since I will just pull in TS to compile it down to .js and then use whatever solution I chose above.
Question: If it's clear what I am trying to do and what confuses me, is it a decent solution to look into npm/Webpack to stitch together my .js files? What would prepare them for being stitched together, using import/export? If so, is there an example of how this is usually done?
As you mentioned, require cannot be used for your purposes, since it is part of CommonJS and Node.js's module system. More info on require can be found here: What is this Javascript "require"?
Import is an ES2015 standard JS syntax, but the ES2015 standard and everything above it has very limited browser support. You can read more about it here: Import Reference
However, you can still code in the latest standard (thereby enabling the use of import/export etc.) and then transpile the code so it can run in the browser. In order to do this, you need a transpiler. You can refer to Babel, which is one of the most popular transpilers: https://babeljs.io/
For your exact purpose, you need to use a module bundler. Webpack, Rollup etc. are some popular module bundlers. They can automatically identify all the JS files referenced through import, combine them, transpile the code so it can run in the browser (they use a transpiler internally for this) and produce a single JS file (or multiple files, based on your configuration).
You can refer to the getting started guide of Webpack: https://webpack.js.org/guides/getting-started/
or RollupJS: https://rollupjs.org/guide/en#quick-start
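As a rough sketch of how the source files look once they use import/export (the file and function names here are made up, not something from your site):

// math.js (a hypothetical module extracted from the inline script)
export function add(a, b) {
    return a + b;
}

// main.js (the bundler entry point; the bundler follows this import and pulls math.js into the output)
import { add } from "./math.js";
document.title = "2 + 3 = " + add(2, 3);

After the bundler runs, index.html only needs one script tag pointing at the generated bundle instead of the individual source files.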

Using Asynchronous Module Loading in a javascript library project

I am working on a javascript project which will be provided as a third-party library.
There are a lot of modules and templates, and loading them all at once would be a waste of time and performance, so I think using a module loader like RequireJS may be a good idea.
However, since the project will be used by other people, what happens if a client uses RequireJS at the same time?
I have to configure the dependencies for my own library, while the client needs to configure the dependencies for their own project. I am afraid this will cause conflicts.
Any alternatives?
"There are a lot of modules and templates, and loading them all at once would be a waste of time and performance."
It sounds like you are opting not to use r.js to optimize your library. That is a choice you can make. However, you will then have to document your library so that people using it can produce a RequireJS configuration that takes care of the needs of both their own code and your library. You should provide an example RequireJS configuration that satisfies your library; people wanting to use it can then integrate that configuration with their own.
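For example, the documentation could ship a sample configuration roughly like this (the paths, module names and functions are hypothetical):

// a hypothetical requirejs.config a consumer of the library might end up with
requirejs.config({
    paths: {
        mylib: "vendor/mylib/src",   // wherever the consumer installed the library's AMD modules
        app: "js/app"                // the consumer's own code
    }
});

// the consumer then depends on the library's modules like on any other AMD module
require(["app/main", "mylib/widget"], function (main, widget) {
    main.start(widget);              // main.start and the widget module are made up for this sketch
});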
If you were to optimize your library with r.js, then you could use Almond to load your library as a single unit, and hide the fact that it is a collection of AMD modules. However, this entails "loading them all at once", which you do not want.

Why use requireJS instead of an ordered include list?

I've been using a grunt file to concatenate all my JS into a single file which is then sent to the client. What advantage do I have in using require calls then? The dependencies are inherent from the concatenation order and I don't have to muddy all my JS with extra code and another third-party library.
Further, Backbone models (for example) clearly state their inheritance in their definitions. Not to mention that they simply wouldn't work if their dependencies weren't included anyway.
Also, wouldn't maintenance be easier if all comments related to dependencies were in one place (the grunt file) to prevent human error and having to open every JS file to understand its dependencies?
EDIT
My (ordered) file list looks something like:
....
files: [
    "js/somelib.js",
    "js/somelib2.js",
    "js/somelib3.js",
    "js/models.js",
    "js/views.js",
    "js/controllers.js",
    "js/main.js"
], ...
So perhaps requireJS isn't worth it for small projects anyway.
Using require.js allows you to break down each part of your application into reusable modules (AMD) and to manage those dependencies easily. It is not easy to manage dependencies in a javascript application with 100 classes, for example.
Also, if you don't want all the overhead of require, check this out (developed by the same guy who created require.js): https://github.com/jrburke/almond
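For example, with AMD the dependency information lives in the module itself instead of in the concatenation order of a grunt file (the module and property names below are made up):

// models.js: its dependencies are declared right here, not implied by file order
define(["somelib", "backbone"], function (somelib, Backbone) {
    // a hypothetical Backbone model built on top of the declared dependencies
    return Backbone.Model.extend({
        defaults: { name: somelib.defaultName }
    });
});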
The answer depends on the size of your app and the end use case.
A single site.min.js payload for the front end (client) generally aims for small file sizes and a simple architecture (one single file generated from maybe 10).
Back-end (server) apps are usually much bigger and more complicated, and therefore may warrant the use of another tool to help with managing large code libraries and dependencies (50 files, for example).
In general, RequireJS is worthwhile, but only if you have many files and dependencies. An alternative for use in the client would be Almond. Again, the need (many files and dependencies) must warrant the use of a tool like this.
The answer from orourkedd is also worth reading.

Using a require.js module in Node AND the browser

I would like to create a (require.js style) AMD module that can be used in the browser and in node. What is the best way to do this? I keep seeing references to r.js, but I'm still not 100% sure how to use it, or whether it is necessary for my situation.
Also, when including this module in node, do I still run require('module-name'), or will this change as well?
First things first: AMD basics, what you can do with them, and how to optimize them.
In extremely simple terms:
AMD modules are reusable JS code. Think of them as functions kept in separate files.
AMD loaders are central functions which call all the other functions (modules). Think of them as the "main" method in C or Java.
RequireJS is a framework which pulls all this scattered code together and stitches it into a usable form.
RequireJS works in the browser. As a result, all your code is "stitched" together in the web browser.
r.js works offline (on a web server or on your development machine) to "stitch" all the code offline, so that when it reaches the web browser it is already "stitched".
Use of the RequireJS lib is a must, no matter whether you want to "stitch" your code in the browser or serve your code "pre-stitched".
Use of r.js is optional. It's needed only if you want to increase performance and decrease HTTP calls.
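On the Node side of the question: the requirejs package on npm ships a Node adapter, so (to the best of my knowledge) the same AMD modules can be loaded in Node roughly like this; the baseUrl and module name are placeholders:

// node-app.js: loading the same AMD modules inside Node via the requirejs npm package
var requirejs = require("requirejs");

requirejs.config({
    baseUrl: __dirname + "/lib",
    nodeRequire: require   // fall back to Node's own require for anything that is not an AMD module
});

requirejs(["module-name"], function (mod) {
    console.log(mod);      // use the module exactly as the browser would
});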

Comparison of methods to create a toolchain: JS modules / loader / build

In the process of evaluating the various approaches available to developers for using javascript modules, module loaders and build tools, I'd like some suggestions on what tools you use, and why.
I'm currently after something that is able to:
- encourage modular code
- allow features to be added based on necessity to a given module [think mixins/inheritance]
- produce a BUILD that contains a development release and, at the very minimum, a production release with different layers (say, I want a layer [a script] which contains my bootstrap code and modules 1, 2 and 3, and then another layer which contains modules 4, 5 and 6; this way I can defer loading of code based on what's actually going on in the application)
- work [when built for production] in an extremely low-bandwidth scenario, with transfer speeds of 1 kbps and high latency (think worst-case mobile connection over GPRS to get the picture)
I've seen the following:
Using prototype inheritance, as in:
var myNS = myNS || {};   // namespace object, assumed to be defined elsewhere in the app
myNS.thing = function () {};
myNS.thing.prototype = {
    something: "foo"
};
Which can be built by simply taking the contents of this script and appending it to the next one you want to include, producing a production-optimized package as a single script. Loaders in this case are simple script-tag injections/evals or similar, based on a known file.
I've also seen other approaches, such as:
var thing = (function () {
    return {
        something: "foo"
    };
})();
Building this already gets more complex because one has to manipulate the script, removing the wrapping self-executing function and combining the return values into one object. I am not aware of an "easy" way to do this with available build tools. The loader approach works the same as above.
Both of these approaches lack dependencies.
Then we have AMD:
define("mymodule", ["dep1"], function(dep1){
return {something: dep1}
});
Some might be nauseated by its indenting and its "ceremony", but it's still quite effective: the Google Closure Compiler knows about it natively, it knows about dependencies, and it seems to have widespread adoption across the board. There are a bunch of module loaders available for it (https://docs.google.com/spreadsheet/ccc?key=0Aqln2akPWiMIdERkY3J2OXdOUVJDTkNSQ2ZsV3hoWVE#gid=2) and quite a few build tools as well.
What other options do you know of, or have you seen used in production?
As said, I'm interested in a combination of code syntax, loader tools and build tools. These three must exist and work together properly. The rest is an academic exercise I'm not interested in.
I personally use RequireJS, an AMD solution. If I'm whipping up quick proof-of-concepts I won't bother setting that up, but the most common source/dep-mapping solutions I know of now include:
RequireJS
CommonJS
Google Closure
YepNope (a conditional loader, can be used in combination with the others)
I have started a boilerplate that uses Require in combination with Backbone to get all the ugly setup code out of the way:
https://github.com/nick-jonas/assemblejs
So you can type assemble init to scaffold the basic project and assemble build to run the compilers and get a final production-ready build.
You might be interested in looking at Grunt, a command-line tool with various modules for building javascript projects. It uses npm for dependencies and it will work with AMD modules, but you can also just configure it to concatenate the files you need using grunt-buildconcat.
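For the concatenation part, a minimal Gruntfile might look like the sketch below. I'm using grunt-contrib-concat here because that's the concat plugin I'm familiar with, and the file names are placeholders:

// Gruntfile.js (a minimal concatenation setup)
module.exports = function (grunt) {
    grunt.initConfig({
        concat: {
            dist: {
                src: ["js/somelib.js", "js/models.js", "js/main.js"],
                dest: "dist/site.js"
            }
        }
    });

    grunt.loadNpmTasks("grunt-contrib-concat");
    grunt.registerTask("default", ["concat"]);
};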
