Load remote scripts with browserify - javascript

I really like using cdnjs to load up javascript on the client-side, it makes my project smaller and cleaner, and loads everything faster as well. I currently use require.js for module loading, which can load from cdnjs and shim traditional scripts to work with it easily. I've been looking more into browserify recently as an alternative, and while I did find browserify-shim, which can shim non-cjs modules much like require does, I'm curious if there is a way to load a script from a remote source with browserify, or if you have to install everything locally no matter what.
If the answer is that you'd have to install everything locally through npm, this makes things a little weird. On one hand, you can add node_modules to the .gitignore file and, if you are using a package.json, not have to worry about keeping all the deps under version control. On the other hand, you'd need to get the modules back in there on deploy, which means an additional post-deploy step that runs npm install, and Node would need to be installed wherever you are deploying to, which seems awkward to me, especially for a static site.
Really, any ideas or discussion on this would be great : )

The way I think about it is this: you have three options. Concatenate the JS files together locally (browserify) before deployment, load them in real time (require.js), or mix the two. To be fair, you can use require.js to concatenate your files with r.js too. For me at least, I like how browserify is designed to use the same syntax and mentality as npm modules.

I think in the end the weirdness you're experiencing doesn't really matter. If all the code is packaged together, you deploy, and there are no runtime dependencies, that seems like a win to me. This is also more in line with what Java and similar compiled languages do, which is putting all the deps together in a deployable package. I know I mention Java, but don't let that scare you, because really we all benefit from the ideas of those around us, even the languages we think we don't like.

If I had to bet my money, I would bet on browserify, since it offers (what I consider) a more mature means of handling modules (organized by file rather than by syntax). npm also gives us a great way to share our code, so two thumbs up for them.
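For what it's worth, the "install locally, bundle, deploy" workflow can boil down to a one-line npm script. Here's a minimal sketch of a package.json for it (file names and versions are hypothetical):
{
  "scripts": {
    "build": "browserify main.js -o bundle.js"
  },
  "devDependencies": {
    "browserify": "^16.0.0"
  }
}
Running npm run build on your development machine produces a single bundle.js; you deploy that file along with your HTML and CSS, so neither Node nor node_modules ever needs to exist on the server.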

Related

Why is Node.js needed to run a JavaScript frontend framework?

When we talk about vanilla JavaScript, it's a frontend programming language; it needs a web server like IIS, Apache, or nginx to deliver the content to a client when requested. After that, the JavaScript runs in the client's browser. But every video or article I've found says we need to install Node.js to make this work. What I know about Node.js is that it's a runtime environment that makes JavaScript work outside the browser, e.g. for a backend API or a regular desktop application.
Here is my question:
Why do we need to use Node.js if our target is to deploy a frontend webapp that's going to run in the client's browser?
You don't have to install and use Node to make frontend applications, but it can help a lot, especially in large projects. The main reason it's used is so that script-writers can easily install, use, and update external packages via NPM. For a few examples:
Webpack, to consolidate multiple script files into a single one for production (and to minify, if desired)
Babel, to automatically transpile scripts written in modern syntax down to ES6 or ES5
A linter like ESLint to avoid accidental bugs and enforce a consistent code style
A CSS preprocessor like Sass, which can turn concise Sass into the standard (more verbose) CSS consumable by browsers
And so on. Organizing an environment for these sorts of things would be very difficult without NPM (which depends on Node).
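As a rough sketch, the package.json for such an environment might wire a few of these tools together like this (versions and paths are illustrative, not prescriptive):
{
  "devDependencies": {
    "webpack": "^5.0.0",
    "webpack-cli": "^4.0.0",
    "eslint": "^7.0.0",
    "sass": "^1.32.0"
  },
  "scripts": {
    "build": "webpack --mode production",
    "lint": "eslint src",
    "css": "sass src/main.scss dist/main.css"
  }
}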
None of it is necessary, but many find that it can make the development process much easier.
In the process of creating files for the client to consume, if you want to do anything more elaborate than write plain raw .js, .html, .css files, you'll need something extra - which is most often done via NPM.
It's only there for extra support during development and for ease of installing libraries; think of it almost like an extra IDE or helpful editor.
For example, you might want to see the changes you make to your HTML and frontend JavaScript code without having to refresh the preview browser; npm provides packages that do that.
It also makes installing and using libraries easier. For example, if you want to add a library like Bootstrap to your frontend, rather than searching around and downloading the files, you can simply run npm install bootstrap, which automatically downloads the latest version from the right source.
That's all.

Is it ok to refer to node_modules directly?

The project I'm currently working on (Java/JSP) currently uses no package manager to manage its JavaScript dependencies.
The libraries used are just committed under version control, and referenced as such from the JSP pages.
I would like to evolve to a workflow were we would use a package manager (e.g. yarn), and later on eventually also webpack to further optimise the build.
I would like to do this in a phased approach. As I have little to no experience with such a frontend workflow, I have some questions:
Would it be weird to just start with defining the used libraries in a package.json file, and use yarn to manage the packages?
yarn will then fetch the modules and store them in the node_modules folder.
Is it bad practice to refer to the scripts in that node_modules folder directly from within the JSP files?
Example
package.json:
"dependencies": {
"jquery": "^3.4.1"
}
app.jsp:
<script src="node_modules/jquery/dist/jquery.min.js"></script>
Yes, that's completely OK. It's the way frontend projects are normally initialized (sometimes a higher-level script does it for us, but still). Just run npm init.
Oh yes, that's quite bad. Most probably it simply will not work. If you want to load something directly on a page, you need a CDN (or otherwise browser-ready) version.
To be honest, having a package.json is not that useful without a building tool like webpack, gulp or grunt.
Update:
Regarding why loading things directly from node_modules might hurt:
A lot of modern JS packages (like, for instance, React) use module syntax that isn't implemented in any browser yet, or ES6+ syntax that only some browsers support.
This way, you may load React directly, but it will crash in any browser with something like "import is not defined".
Basically, a lot of modern packages expect you to either have a building tool or use cdn version.
Honestly, I don't know how many packages let you seamlessly load things directly from node_modules.
So, in your particular situation, I'd say that if the particular packages you use let you do so and are shipped with a browser-compatible version, you can just go ahead and do it this way.
Nevertheless, I see it as highly possible that sooner or later you will face a package that will not let you include it this way (or worse: it will, but it will crash in browsers that don't support the latest JS features, or introduce other nasty bugs in your app).
Hopefully, at this stage, you will already have the building tool configured.
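To make the failure mode concrete, here is a hypothetical illustration with React (the exact error message varies by browser):
<!-- Crashes: the package's entry file uses require() and process.env, which browsers don't provide -->
<script src="node_modules/react/index.js"></script>
<!-- Works: a UMD build published specifically for direct browser use -->
<script src="https://unpkg.com/react@17/umd/react.production.min.js"></script>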
Bonus:
Relatively recently some browsers started to support modules!
There are even tools like snowpack that do something quite similar to what you are looking for.
Even so, you still need to be very careful with this. Direct inclusion of lodash.js, for instance, will generate 640 GETs (check out this article, "Libraries" section).
NPM packages are meant to be run with Node, not in a browser. You would need to serve a browser-friendly version, using something like webpack or browserify.

Bundle external javascript libraries automatically

I often see browser-focused javascript libraries with an option to install over npm.
Is there a reason to install it using npm instead of just using <script src="cdn-url"></script>?
I am loading many libraries, so I guess it might be a good idea to fetch these files locally, so I don't make so many URL requests (even though all the requests target CDNs).
I could potentially install via npm and then use <script src='/node_modules/...'></script>, but then I need to make these paths public accessible using express.static() or something like that.
I know that I could use webpack, browserify, etc., but they seem overly complicated when I just want to bundle a few external libraries into 1 file automatically.
The point of using npm in this case is so you get the updates automatically. You bundle to reduce the number of requests and include only 1 script tag.
but they seem overly complicated when I just want to bundle a few external libraries into 1 file automatically.
This is complicated, unfortunately; it would be nice if it weren't. You also need to think about things like browser caching when you update a library: if you have a vendor-libraries bundle, you'll have to manually cache-bust with a query string every time you update it. Webpack simplifies that whole process for you.
I would move to Webpack and use the CommonsChunkPlugin to create a vendor build. See this example.
To fully automate everything, combine this with Html Webpack Plugin to automatically add the script tags and cache-bust with hashing.
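A minimal sketch of such a config, assuming webpack 3-era APIs (CommonsChunkPlugin was replaced by optimization.splitChunks in webpack 4+; the paths here are hypothetical):
// webpack.config.js
const path = require('path');
const webpack = require('webpack');
const HtmlWebpackPlugin = require('html-webpack-plugin');

module.exports = {
  entry: { app: './src/main.js' },
  output: {
    path: path.resolve(__dirname, 'dist'),
    // [chunkhash] changes whenever the contents change, which cache-busts for you
    filename: '[name].[chunkhash].js'
  },
  plugins: [
    // pull everything imported from node_modules into a separate vendor bundle
    new webpack.optimize.CommonsChunkPlugin({
      name: 'vendor',
      minChunks: function (module) {
        return module.context && module.context.indexOf('node_modules') !== -1;
      }
    }),
    // generate index.html with both hashed script tags injected automatically
    new HtmlWebpackPlugin()
  ]
};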

When Should I Combine my JS and CSS Files?

I've been browsing many of the articles here about how and why one should combine JS/CSS files for performance, but none of those articles offered any real guideline as to when the right time is.
I'm developing a single-page microsite that uses seven Javascript files (a mixture of third-party plugins from CDNs and my own files), and eight different CSS files (basically one per plugin, and my own compiled SASS file).
The site loads slowly even on the intranet here, and I'm concerned about its performance outside. While searching for several plugins yesterday, I found several CodePen and plugin articles that basically said "cool kids concatenate JS" (literally), which got me thinking about this whole thing.
At what point should I start concatenating and minifying my Javascript/CSS?
And should I paste the CDN scripts into my own JS files, or is it better in the long run to have another HTTP request but use the statically served plugin files?
Edit: Just to clarify - I'm not asking for tools/techniques, but wondering when it becomes important to combine and minify files - should it always be done, as @RobG suggested?
You should deliver code to UAT that is as close to production code as possible, including all minification and combining of files. If you don't, you aren't doing proper UAT.
To be honest, it depends.
People are often, wrongly, obsessed with merge-min. It's not always the right call; the need for it depends on a few things.
Sometimes it's faster, and better practice, to load two CSS files rather than one big one. Why? Because they'll load in parallel. It's that simple.
So don't go in with the merge-min obsession. If your users are returning, daily users, do merge and rely on the browser cache. If not, optimise for parallel loads by not merging.
And ignore the simplistic: 'yes you must merge because that's what was best 10 years ago and I've never questioned it' :)
When Should I Combine my JS and CSS Files?
Every time you finish development, and specifically when your code is going to User Acceptance Testing (UAT), if not earlier. Thanks @RobG for mentioning it.
Which tools do you suggest?
Browserify
Let's start with your JS files. I think a great tool for bundling various JS files/modules is Browserify.
Browsers don't have the require method defined, but Node.js does. With Browserify you can write code that uses require in the same way that you would use it in Node.
Here is a tutorial on how to use Browserify on the command line to bundle up a simple file called main.js along with all of its dependencies:
// main.js
var unique = require('uniq');
var data = [1, 2, 2, 3, 4, 5, 5, 5, 6];
console.log(unique(data)); // => [ 1, 2, 3, 4, 5, 6 ]
Install the uniq module with npm:
npm install uniq
Now recursively bundle up all the required modules starting at main.js into a single file called bundle.js with the browserify command:
browserify main.js -o bundle.js
Browserify parses the AST for require() calls to traverse the entire dependency graph of your project.
Drop a single script tag into your HTML and you're done!
<script src="bundle.js"></script>
There is also a similar tool for CSS files called browserify-css.
Gulp
gulp is a toolkit that will help you automate painful or time-consuming tasks in your development workflow. For web development (if that's your thing) it can help you by doing CSS preprocessing, JS transpiling, minification, live reloading, and much more. Integrations are built into all major IDEs and people are loving gulp across PHP, .NET, Node.js, Java, and more. With over 1700 plugins (and plenty you can do without plugins), gulp lets you quit messing with build systems and get back to work.
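For instance, a minimal gulpfile that concatenates and minifies scripts might look like this (gulp-concat and gulp-uglify are real plugins; the paths are hypothetical):
// gulpfile.js
const gulp = require('gulp');
const concat = require('gulp-concat');
const uglify = require('gulp-uglify');

gulp.task('scripts', function () {
  return gulp.src('src/js/**/*.js') // gather all source files
    .pipe(concat('bundle.js'))      // concatenate them into one file
    .pipe(uglify())                 // minify the result
    .pipe(gulp.dest('dist'));       // write dist/bundle.js
});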
Public CDN scripts
should I paste the CDN scripts into my own JS files, or is it better in the long run to have another HTTP request but use the statically served plugin files?
You can keep them on a public CDN. To avoid needlessly overloading servers, browsers limit the number of connections that can be made simultaneously; depending on the browser, this limit may be as low as two connections per hostname.
Using a public CDN (like the Google AJAX Libraries CDN) eliminates one request to your site, allowing more of your local content to be downloaded in parallel. Read more on this here.
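For example, pulling jQuery from the Google CDN is a single tag (the version is pinned here purely for illustration):
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.5.1/jquery.min.js"></script>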

How can I convert a multi-file node.js app to a single file?

If I have a node.js application that is filled with many require statements, how can I compile this into a single .js file? I'd have to manually resolve the require statements and ensure that the classes are loaded in the correct order. Is there some tool that does this?
Let me clarify.
The code that is being run on node.js is not node specific. The only thing I'm doing that doesn't have a direct browser equivalent is using require, which is why I'm asking. It is not using any of the node libraries.
You can use webpack with target: 'node'; it will inline all required modules and emit everything as a single, standalone, one-file Node.js module.
https://webpack.js.org/configuration/target/#root
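A minimal sketch of such a config (entry and output paths are hypothetical):
// webpack.config.js
module.exports = {
  target: 'node',      // leave Node built-ins like fs and path as runtime requires
  mode: 'production',
  entry: './index.js',
  output: {
    path: __dirname + '/dist',
    filename: 'app.js' // the single standalone output file
  }
};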
2021 edit: There are now other solutions you could investigate, namely:
https://esbuild.github.io
https://github.com/huozhi/bunchee
Try the following:
npm i -g @vercel/ncc
ncc build app.ts -o dist
See the details here: https://stackoverflow.com/a/65317389/1979406
If you want to send common code to the browser I would personally recommend something like brequire or requireJS which can "compile" your nodeJS source into asynchronously loading code whilst maintaining the order.
For an actual compiler into a single file you might get away with one for requireJS but I would not trust it with large projects with high complexity and edge-cases.
It shouldn't be too hard to write a manifest file, much like the package.json npm uses, stating the order in which the files should occur in your package. That way it's your responsibility to make sure everything is compacted in the correct order; you can then write a simplistic Node application that reads your manifest and uses file IO to create your compiled script.
Automatically generating the order in which files should be packaged requires building up a dependency tree and doing lots of file parsing. It should be possible but it will probably crash on circular dependencies. I don't know of any libraries out there to do this for you.
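A naive sketch of that manual approach (build-order.json is a hypothetical manifest listing the files in dependency order; note there is no circular-dependency handling):
// build.js
var fs = require('fs');
var files = JSON.parse(fs.readFileSync('build-order.json', 'utf8'));
var bundle = files.map(function (f) {
  return fs.readFileSync(f, 'utf8');
}).join(';\n'); // the semicolon guards against files missing a trailing one
fs.writeFileSync('compiled.js', bundle);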
Do NOT use requireJS if you value your sanity. I've seen it used in a largish project and it was an absolute disaster ... maybe the worst technical choice made at that company.

RequireJS is designed to run in-browser and to asynchronously and recursively load JS dependencies. That is a TERRIBLE idea. Browsers suck at loading lots and lots of little files over the network; every single doc on web performance will tell you this. So you'll very quickly end up needing a solution to smash your JS files together ... at which point, what's the point of having an in-browser dependency resolution mechanism?

And even though your production site will be smashed into a single JS file, with requireJS your code must constantly assume that any dependency might or might not be loaded yet; in a complex project, this leads to thousands of async load barriers wrapping every interaction point between modules. At my last company, we had some places where the closure stack was 12+ levels deep. All that "if loaded yet" logic makes your code more complex and harder to work with. It also bloats the code, increasing the number of bytes sent to the client. Plus, the client has to load the requireJS library itself, which burns another 14.4k. The size alone should tell you something about the level of feature creep in the requireJS project. For comparison, the entire underscore.js toolkit is only 4k.
What you want is a compile-time step for smashing JS together, not a heavyweight framework that will run in the browser....
You should check out https://github.com/substack/node-browserify
Browserify does exactly what you are asking for .... combines multiple NPM modules into a single JS file for distribution to the browser. The consolidated code is functionally identical to the original code, and the overhead is low (approx 4k + 140 bytes per additional file, including the "require('file')" line). If you are picky, you can cut out most of that 4k, which provides wrappers to emulate common node.js globals in the browser (eg "process.nextTick()").
