How to import ParcelJS-generated bundles into a "legacy" application? - javascript

I'm working with an application whose front end is built using old-school techniques (jQuery and direct DOM manipulation), and I would like to move it over to ES8 and React. Since this is a rather large and complex application, the move will have to be gradual, meaning the legacy code and the React code will have to live side by side for some time.
The legacy code uses a homebrew "module loader", which needs to keep working. I've been looking at using Webpack and its configuration option libraryTarget: 'var', which basically outputs each entry point into a global variable. This works, but Webpack's build-time performance isn't good enough, so I have been looking at using ParcelJS instead.
Is it possible to achieve something similar to Webpack's libraryTarget: 'var' with ParcelJS? Basically, in a "legacy HTML file" (which is often server generated and may contain data I need to pass on to the ES8 modules), I would like to be able to do something along the lines of
<script src="dist/js/ABundle.js"></script> <!-- Bundle created by ParcelJS -->
<script>
var data = {/* JSON generated by server */};
var ABundle = require('ABundle'); // Export defined in ABundle.js.
ABundle.render(data); // Function exported in ABundle.
</script>
Note that I cannot pass my HTML files as entry points to ParcelJS, as they contain references to JavaScript files using the homebrew module loader, which won't play nice with ParcelJS. I only want to pass the ES8 modules as entry points to ParcelJS and use them side by side with the homebrew module loader.
EDIT: Clarified that legacy HTML is in fact server generated.
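For illustration, the effect I'm after could also be achieved by having the ES8 entry module attach its public API to window explicitly, with no loader involved on the legacy side; this is just a sketch (not a Parcel-specific feature), and ./render is a hypothetical internal module:
// ABundle.js - entry point handed to ParcelJS (sketch, not a Parcel-specific feature)
import { render } from './render'; // hypothetical internal module

const ABundle = { render };

// Expose the API as a global so server-generated legacy pages can reach it
// without going through any module loader.
window.ABundle = ABundle;

export default ABundle;
The legacy page would then call window.ABundle.render(data) directly instead of require('ABundle').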

Related

Attempting to load a JavaScript sdk into an Angular2 application. Can't find all dependencies

I'm attempting to make use of this library: https://github.com/MagicTheGathering/mtg-sdk-javascript in an Angular2 application.
Unfortunately, I've been going in circles trying to load it into my application.
Firstly, on the TypeScript side if I import it using:
import { } from 'mtgsdk';
there are no types to load into the {}.
If I attempt to load it using something similar to:
import * as mtg from 'mtgsdk'
I'm unable to because it says that it's unable to find a module named mtgsdk.
I've installed the module using
npm install --save mtgsdk
Also, npm installs work fine for other modules.
The application compiles fine if I load it in using require via something similar to this:
var mtg = require('mtgsdk');
Taking that approach, I'm able to compile and launch but in the browser I get a number of errors about modules that it can't find. I figure they are prerequisites for the sdk that didn't get loaded so I start bringing them in via package.json.
For every one that I bring in, I then have to go to systemjs.config.js and add an entry pointing to the module's entry point and often have to specify a default extension using blocks like this:
Pointer entries:
'mtgsdk': 'npm:mtgsdk/lib/index.js',
'request-promise': 'npm:request-promise/lib/rp.js',
'ramda': 'npm:ramda/dist/ramda.js',
'emitter20': 'npm:emitter20/index.js',
'bluebird': 'npm:bluebird/js/browser/bluebird.js',
'request': 'npm:request/index.js'
Default extension entries:
'request-promise': {
  defaultExtension: 'js'
}
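For reference, these fragments sit in the map and packages sections of my systemjs.config.js, roughly like this (only a couple of the entries are shown here):
// systemjs.config.js (sketch; only some of the entries above are shown)
System.config({
  paths: {
    'npm:': 'node_modules/'   // prefix used by the map entries below
  },
  map: {
    // "pointer" entries: where SystemJS fetches each bare module name
    'mtgsdk': 'npm:mtgsdk/lib/index.js',
    'ramda': 'npm:ramda/dist/ramda.js'
  },
  packages: {
    // "default extension" entries
    'request-promise': { defaultExtension: 'js' }
  }
});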
I'm not sure if that's the right approach though because the more dependencies I add, the more that I end up requiring. At one point I had literally gotten up to 50 extra dependencies added because every time I launched, the browser console would find more that were needed.
Is there any easier way to load all of these in?
Also, some of them (such as tough-cookie and request-promise-core) were very problematic to load and I couldn't get the browser console to stop complaining about them. Finally, some of them seemed very basic such as url, http, https. Those seem like they should already be present.
SystemJS was used in previous versions of Angular 2; however, Angular 2 has since evolved to Angular 4, with new features like the Angular CLI.
I recommend you use the Angular CLI, i.e. @angular/cli.
Importing Node modules
Since mtgsdk is a node-module, you can easily import it using
import * as mtg from 'mtgsdk'
However, for your program to compile, you must install a type definition for it or declare one for it in /typings.json, or your app might not build.
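If no typings package exists, a minimal ambient declaration is usually enough to satisfy the compiler; a sketch (the file location is up to your project layout):
// typings/mtgsdk/index.d.ts (sketch): lets import * as mtg from 'mtgsdk'
// compile by typing the whole module as any.
declare module 'mtgsdk' {
  const mtg: any;
  export = mtg;
}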
Importing Client Scripts
For client scripts like firebase.js, you won't need to add them as entries in systemjs.config.js anymore.
Using @angular/cli, you can simply add them to the scripts[] array in your angular-cli.json for automatic compilation.
Then access them like this:
declare const firebase: any;
Here is a quickstart tutorial to set up Angular with @angular/cli.
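Putting the two steps together, assuming firebase.js has been added to the scripts[] array mentioned above, a component could use the global roughly like this (the component and config values are purely illustrative):
// some.component.ts (sketch): firebase.js is assumed to be listed in the
// scripts[] array of angular-cli.json, so it is already loaded as a global.
import { Component, OnInit } from '@angular/core';

// Tell the compiler the global exists; it is typed as any.
declare const firebase: any;

@Component({
  selector: 'app-some',
  template: ''
})
export class SomeComponent implements OnInit {
  ngOnInit(): void {
    // Use the global exactly as the library's script-tag docs describe.
    firebase.initializeApp({ apiKey: '...', projectId: '...' });
  }
}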

Angular2 - require module on client side

In the context of a Node.js / Express / Angular2 / TypeScript (IDE = Visual Studio) app, I am trying to load a third-party .js utility (packery) onto the client side (for use in a directive). Someone made TypeScript definitions for it. The d.ts file looks like:
declare module "packery" {
  interface PackeryOptions { stuff... }
  class Packery { stuff... }
  export = Packery;
}
I refer to this d.ts file, tell the browser where the .js packery script lives, and then import the module as such:
import Packery = require('packery');
This compiles without complaint. However, upon running, the browser attempts (and fails) to find "packery" at http://localhost/packery as opposed to knowing packery is an imported library. This is in contrast to the other import statements I have made on the client such as:
import {Http, HTTP_PROVIDERS} from 'angular2/http';
which work - as far as I can tell, the only two pieces of information I gave it for those were also a d.ts file and the location of the .js file, just like packery. But I must be missing something. I have tried many combinations of file locations and linking and can't get it to work. How can I get the proper linking to "packery"?
Thanks!
I found a workaround for this and thought I'd post in case it helps anyone, although I am still having difficulty with the setup posed in the original question, that is, getting statements of the type:
import foo = require('foo')
to run on the CLIENT side. These work for me in node.js on the server, but on the client, for third-party libraries that have been loaded via a script tag, I cannot get it to work, even if I add mapping entries to the system.js config file, irrespective of whether I point to a .js file or a d.ts file.
Anyway, what does work is if you load the library using the script tag, then in your IDE put a reference path as such at the top of the CLIENT side code
/// <reference path="foo.d.ts" />
and ensure that your d.ts file does not declare a module/namespace but rather exports methods etc. directly. This lets the IDE compile without complaint, and the client side code is able to access the third party library.
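For illustration, a global-style declaration along those lines could look like the sketch below (only a fragment; the constructor signature and method shown are illustrative, not a complete typing of Packery):
// packery.d.ts (sketch): no "declare module" wrapper, just global declarations
// matching a library loaded via a plain <script> tag.
declare class Packery {
  constructor(element: Element | string, options?: any);
  layout(): void; // illustrative method, not a complete typing
}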
However, I'm not sure if it is preferable / best practices to do what I did or if one should be configuring System.js somehow.
Typings are empty definitions for JS libraries that aren't written in a typed language. They are only useful during development, for IDE hints and the like; in your app you'll still use the library as you normally would, adding the JS file in your index.html or wherever you load your JS files from.

Module definition to work with node.js, require.js and with plain scripttags too

I am working on a javascript module/library that should work in 3 environments:
in node.js
in requirejs
when simply included using <script> tags in the webpage. In this case the whole module should be hooked up under window.myModule
Do you have any suggestions as to how to write the structure of the library so that it works in all these environments?
EDIT: basically I mean some sort of wrapper code around the library so that I can call the file from any of those three methods and I'm fine...
This requirement and its solution are known as the Universal Module Definition (UMD). It is currently a draft proposal. Background and current status are described in Addy Osmani's article "Writing Modular JavaScript With AMD, CommonJS & ES Harmony". Look for the "UMD" link pointing to various templates you can use.
Quite a few other templates can be found on the web - UMD is the search keyword.
(did not find the final link myself yet :)
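For reference, the widely copied "returnExports" UMD template looks roughly like the sketch below, with the global hooked up under window.myModule as the question asks; the module body is just a placeholder:
// myModule.js - classic "returnExports" UMD wrapper (sketch)
(function (root, factory) {
  if (typeof define === 'function' && define.amd) {
    // AMD / RequireJS
    define([], factory);
  } else if (typeof module === 'object' && module.exports) {
    // CommonJS / node.js
    module.exports = factory();
  } else {
    // Plain <script> tag: hook the module up under window.myModule
    root.myModule = factory();
  }
}(typeof self !== 'undefined' ? self : this, function () {
  // Actual module body goes here; placeholder API for illustration.
  return {
    greet: function (name) { return 'Hello, ' + name; }
  };
}));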
We're working on the same thing, I think.
And we have had some success. We have a library (we call it 'slib') compiled to AMD JS files. It does not depend on npm modules or the browser, so it can be called both from node and from the browser.
1) To call it from node, we use requirejs:
file require.conf.js
module.exports = function (nodeRequire) {
  global.requirejs = require('requirejs');
  requirejs.config({
    baseUrl: __dirname + "/../web/slib/",
    paths: {
      slib: "."
    },
    nodeRequire: nodeRequire
  });
};
In any other serverside (nodejs) file we add this line at the beginning
require("./require.conf")(require);
then we call slib's code by:
var Computation = requirejs("slib/Computation");
2) To call slib from browser, we just use requirejs. It handles everything fine.
3) We do not need to call slib from <script> directly.
For production, we use r.js to build a bundle JS file containing most of the dependencies and include it on the page with a single <script>. That script downloads any other deps that are not already included, using standard requirejs, and as far as I remember it does not even need requirejs itself; it just works on its own. This is very flexible for large projects: use requirejs during development, use r.js to bundle the core files in production to speed up page load, or use the whole bundle if you want only one <script> without any other requests. r.js bundles all dependencies correctly, including old JS libraries that were traditionally loaded with a plain <script> and accessed via window.myOldLibrary, using the shim param in the config.
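As an illustration of that setup, a minimal r.js build profile using shim for such an old library might look like this; all file and module names below are assumptions:
// build.js (sketch) - run with: node r.js -o build.js
({
  baseUrl: 'web/slib',
  name: 'main',                      // hypothetical entry module
  out: 'dist/bundle.js',
  paths: {
    myOldLibrary: 'vendor/myOldLibrary'
  },
  shim: {
    // Old library that only attaches itself to window.myOldLibrary
    myOldLibrary: { exports: 'myOldLibrary' }
  }
})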
It seems you can use Browserify to make some npm modules accessible from slib's code, but we have not tried that yet.
Also, using requirejs on node's side could, I think, be simpler (why do we need a second 'requirejs' function alongside node's own?). We just have not investigated it further, but this works.
In any slib module you can write
if (window)
  window.module1 = this // or whatever
and it will be exported as an old-style JS lib upon load.
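Spelled out as a full AMD module, that pattern could look like the sketch below; module1 and doSomething are placeholder names:
// module1.js (sketch): an AMD module that also hooks itself onto window
define([], function () {
  var module1 = {
    doSomething: function () { /* ... */ }
  };
  // Optionally expose it as an old-style global when running in a browser.
  if (typeof window !== 'undefined') {
    window.module1 = module1;
  }
  return module1;
});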

Is there a way to lazily set the path of a resource with RequireJS?

So, I have an app that is using requireJS. Quite happily. For the most part.
This app makes use of Socket.IO. Socket.IO is being provided by nodejs, and does not run on the same port as the main webserver.
To deal with this, in our main js file, we do something like this:
var hostname = window.location.hostname;
var socketIoPath = "http://" + hostname + ":3000/socket.io/socket.io";
requirejs.config({
  baseUrl: "/",
  paths: {
    app: "scripts/appapp",
    "socket.io": socketIoPath
  }
});
More complicated than this, but you get the gist.
Now, in interactive mode, this works swimmingly.
The ugliness starts when we try to use r.js to compile this (technically we're using grunt to run r.js, but that's beside the point).
In the config for r.js, we set an empty path for socket.io (to avoid it failing to pull in), and we set our main file as the mainConfigFile.
The compiler yells about this, saying:
Running "requirejs:dist" (requirejs) task
>> Error: Error: The config in mainConfigFile /…/client.js cannot be used because it cannot be evaluated correctly while running in the optimizer. Try only using a config that is also valid JSON, or do not use mainConfigFile and instead copy the config values needed into a build file or command line arguments given to the optimizer.
>> at Function.build.createConfig (/…/r.js:23636:23)
Now, as near as I can figure, this is due to the fact that I'm using a variable to set the path for "socket.io". If I take this out, require runs great, but I can't run the raw source from a server. If I leave it in, my debug server is happy, but the build breaks.
Is there a way that I can lazily assign the path of "socket.io" at runtime so that it doesn't have to go into the requirejs.config() method at that point?
Edit: Did some extensive research on this. Here are the results.
Loading from CDN with RequireJS is possible with a build. However, if you're using the smaller Almond loader, it's not possible.
This leaves you with a few options:
Use almond along with a local copy of the file in your build.
Use the full require.js loader and try to use a CDN.
Use a <script> tag just for that resource.
I say try for #2 because there are some caveats. You'll need to include require.js in your HTML with the data-main attribute for your built file. But if you do this, require and define will be global functions, allowing users to require any of your internal modules and mess around with them. If you're okay with this, you'll need to follow the "empty: scheme" in your build config (but not in your main config).
But the fact remains that you now have another HTTP request. If you only want one built file, which includes the require.js loader, you'll need to optimize for only one file.
Now, if you want to avoid users being able to require your modules, you'll have to do something like wrap:true in your build. But as far as I can tell, once your module comes down from CDN, if it's AMD, it's going to look for a global define function to register itself with, and that won't exist because it's now wrapped in a closure.
The lesson I took away from all this: inline your resources to your build. It makes sense. You reduce HTTP requests, minify it all and get gzip compression. You don't expose your modules to the world and everything is a lot simpler. If you cache your resources properly you won't even need to worry about it.
But since new versions of socket.io don't like AMD, here's how I did it. Make sure to include the socket.io <script> tag before requirejs. Then create a requirejs module named socket.io with the following contents:
define([], function () {
  var io = window.io;
  window.io = null;
  return io;
});
Set the path like so: 'socket.io': 'core/socket.io' or wherever you want.
And require it as normal! The build works fine this way.
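For completeness, a usage sketch of that wrapper module; the hostname and port mirror the question's original config, and the event handler is just an example:
// Anywhere in the app, after the socket.io <script> tag and require.js have loaded:
require(['socket.io'], function (io) {
  var socket = io('http://' + window.location.hostname + ':3000');
  socket.on('connect', function () {
    console.log('socket connected');
  });
});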
Original answer
Is it possible that you could make use of the path config fallbacks specified in the RequireJS API? Maybe you could save the file locally as a fallback so your build will work.
The socket.io GitHub repository specifies that you can serve the client with the files in the socket.io-client package's dist/ directory.
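The paths-fallback idea from the original answer would look roughly like this; both locations below are placeholders:
requirejs.config({
  paths: {
    // RequireJS tries each location in order; the second entry is a local
    // copy saved alongside the app so the r.js build has something concrete.
    'socket.io': [
      '//cdn.example.com/socket.io/socket.io',
      'lib/socket.io.local'
    ]
  }
});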

Is there a way to include a file in CoffeeScript?

I'd like to know if there is a way to include a file in a CoffeeScript file.
Something like #include in C or require in PHP...
If you use CoffeeScript with node.js (e.g. when using the command-line tool coffee), then you can use node's require() function exactly as you would for a JS file.
Say you want to include included-file.coffee in main.coffee:
In included-file.coffee: declare and export objects you want to export
someVar = ...
exports.someVar = someVar
In main.coffee you can then say:
someVar = require('included-file.coffee').someVar
This gives you clean modularization and avoids namespace conflicts when including external code.
How about coffeescript-concat?
coffeescript-concat is a utility that preprocesses and concatenates
CoffeeScript source files.
It makes it easy to keep your CoffeeScript code in separate units and
still run them easily. You can keep your source logically separated
without the frustration of putting it all together to run or embed in
a web page. Additionally, coffeescript-concat will give you a single
sourcefile that will easily compile to a single Javascript file.
TL;DR: Browserify, possibly with a build tool like Grunt...
Solutions review
Build tool + import pre-processor
If what you want is a single JS file to be run in the browser, I recommend using a build tool like Grunt (or Gulp, or Cake, or Mimosa, or any other) to pre-process your Coffeescript, along with an include/require/import module that will concatenate included files into your compiled output, like one of these:
Browserify: probably the rising standard and my personal favourite; it lets you use Node's exports/require API in your code, then extracts and concatenates everything required into a browser-includable file. It exists for Grunt, Gulp, Mimosa and probably most others. To this day I reckon it is probably the best solution if you're after compatibility with both Node and the browser (and even otherwise)
Some Rails Sprocket-like solutions like grunt-sprockets-directives or gulp-include will also work in a consistent way with CSS pre-processors (though those generally have their own importing mechanisms)
Other solutions include grunt-includes or grunt-import
Standalone import pre-processor
If you'd rather avoid the extra complexity of a build tool, you can use Browserify stand-alone, or alternatives not based on Node's require, like coffeescript-concat or Coffee-Stir
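As a rough sketch of the stand-alone Browserify route, assuming the coffeeify transform is installed; file names are placeholders:
// build.js (sketch) - bundles main.coffee and everything it requires
var browserify = require('browserify');
var fs = require('fs');

browserify('./src/main.coffee')
  .transform('coffeeify')       // compiles .coffee files on the fly
  .bundle()
  .pipe(fs.createWriteStream('./build/app.js'));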
[Not recommended] Asynchronous dynamic loading (AJAX + eval)
If you're writing exclusively for the browser and don't mind, or rather really want, your script being spread across several files fetched via AJAX, you can use a myriad of tools like:
yepnope.js or Modernizr's .load based on yepnope: please note that yepnope is now deprecated by its maintainer, who recommends using build tools and concatenation instead of remote loading
RequireJS
HeadJS
jQuery's $.getScript
Vanilla AJAX + eval
your own implementation of AMD
You can try this library I made to solve this same problem: coffee-stir.
It's very simple.
Just type #include and the name of the file that you want to include:
#include MyBaseClass.coffee
For details
http://beastjavascript.github.io/Coffee-Stir/
I found that using "gulp-concat" to merge my CoffeeScript files before processing them did the trick. It can be easily installed into your project with npm:
npm install gulp-concat
Then edit your gulpfile.js:
var gulp = require('gulp'),
    coffee = require('gulp-coffee'),
    concat = require('gulp-concat');

gulp.task('coffee', function () {
  return gulp.src('src/*.coffee')
    .pipe(concat('app.coffee'))
    .pipe(coffee({ bare: true }).on('error', console.error))
    .pipe(gulp.dest('build/'));
});
This is the code I used to concatenate all my CoffeeScript files before gulp processed them into the final build JavaScript. The only issue is that the files are processed in alphabetical order. You can explicitly state which files to process to achieve your own file order, but you lose the flexibility of adding dynamic .coffee files.
gulp.src(['src/file3.coffee', 'src/file1.coffee', 'src/file2.coffee'])
  .pipe(concat('app.coffee'))
  .pipe(coffee({ bare: true }).on('error', console.error))
  .pipe(gulp.dest('build/'));
gulp-concat as of February 25th, 2015 is available at this url.
Rails uses sprockets to do this, and this syntax has been adapted to https://www.npmjs.org/package/grunt-sprockets-directives. Works well for me.
