Best way to make webpack run a script in browser? - javascript

I want Webpack to execute a script (that is provided by my npm package) in the browser, without needing to include that script in my app. Currently, I am using webpack.config.dev.js to force it, like this:
module.exports = {
  entry: [
    paths.appIndexJs,
    "./node_modules/my-dev-package/client.js"
  ]
};
client.js contains an IIFE that modifies the DOM. I want to avoid having to add conditional logic to my application to check for the environment, when I really just need something appended to the DOM in development. Is there a more standard way to get this same effect, without having to import client.js inside my app and conditionally run it there?
I am new to webpack, so if there is a standard way to do this or a commonly used plugin, that's what I'm looking for.
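One pattern worth sketching: keep the extra entry in the webpack config itself and gate it on the environment, so the application code never references client.js at all. A minimal sketch, assuming NODE_ENV distinguishes development builds (everything except the package name is an assumption):

// webpack.config.js — sketch: the condition lives in the config, not the app
const entries = [paths.appIndexJs];

if (process.env.NODE_ENV !== 'production') {
  // only development builds pull in the DOM-modifying script
  entries.push('./node_modules/my-dev-package/client.js');
}

module.exports = {
  entry: entries
};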

Related

How to bundle react components which would be needed dynamically?

We have several applications deployed that were built with plain HTML and JS.
These applications should be completely independent modules that don't affect the other deployed applications.
But we should be able to load one application from another when the situation requires it, which will be decided dynamically by the business.
Previously, we were able to load different application screens by simply giving the relative path, as there was no dependency bundling.
Now we are converting them to React applications with webpack as the bundler.
Here we have to use dynamic import when we need something at runtime, and that only works with patterns that are known at build time.
So is there any way to achieve this kind of dynamic pattern using the webpack bundler?
For importing screen components, the code looks something like this:
Promise.all(reqList.map(modulePath => {
  return import(/* webpackMode: "lazy-once" */ `../../../${modulePath}.jsx`);
})).then(modules => { doStuff(); });
As they are all deployed in parallel, we are trying to go up three folders (i.e., to the root folder, like webapps in Tomcat) and access the other application's path, which is derived dynamically in the modulePath variable.
While importing, webpack tries to import from the chunks that were already loaded when the first application launched. But those chunks don't contain the screens from the other applications yet.
We have tried giving each JSX file as an entry point in webpack, which did create independent files, but entry files have to be attached to index.html manually, and that stops the exports in the JSX files from working.
My webpack config looks something like this:
const glob = require('glob');
const path = require('path');

function getEntries(pattern) {
  const entries = {};
  glob.sync(pattern).forEach((file) => {
    let fileName = file.substr(0, file.indexOf("."));
    entries[fileName.replace('src/', '')] = path.join(__dirname, file);
  });
  return entries;
}

let jsxFiles = getEntries('src/**/*.jsx');
console.log(Object.keys(jsxFiles));

module.exports = {
  entry: jsxFiles,
  output: {
    path: path.resolve(__dirname, './dist'),
    filename: '[name].js',
    chunkFilename: "AppId" + '.js'
  }
};
The first application loads fine, as webpack is able to find the bundled chunks, but when we try to load a different application's screen components, the dynamic import fails with "module not found".
Is there any way we can achieve this kind of dynamic import?
Thanks for reading through such a long post; I didn't have a choice but to explain. :)
Essentially, you are looking for dynamic module loading. You would need to webpack each module separately.
To load them, you have several options
Use the browser's native import(), but IE, Edge, and Firefox don't support it yet
Use a module loader library
SystemJS
RequireJS, although it's rather old
If you go with a module loader, you need to have webpack output your modules in the corresponding System.register or AMD format
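A minimal sketch of that setup, assuming webpack 4 and SystemJS as the loader; the entry path, output name, and URL below are illustrative, not taken from the question:

// webpack config for one independently deployed module (sketch)
const path = require('path');

module.exports = {
  entry: './src/screens/OtherAppScreen.jsx',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'other-app-screen.js',
    libraryTarget: 'amd' // emit AMD so a module loader can consume it
  }
};

At runtime, the host application then loads the other application's bundle through the module loader rather than through webpack's own import(), e.g. System.import('/otherApp/dist/other-app-screen.js') with SystemJS, or an equivalent require([...]) call with RequireJS.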

How to manually trigger watch/reload for webpack dev server?

I have a fairly simple webpack setup with a bit of a twist. I have a few different ways I can think of to create my intended behavior, but I'm wondering if there are better approaches (I'm still fairly new to webpack).
I have a basic TypeScript/Scss application and all of the src files exist in a src directory. I also have a component library bundle that's dynamically generated through a separate build process (triggered via Node). This bundle also ends up in the src directory (it contains some sass variables and a few other assets that belong in src). This is src/custom-library-bundle. My current webpack setup moves the appropriate bundle files to a public (dist) directory via the CopyWebpackPlugin. My webpack dev server watches for changes and reloads as expected. This all works beautifully.
Now for the twist. I've set up a custom watcher that exists elsewhere to watch for the custom component library bundle changes. When a change occurs there, it triggers that custom build process mentioned above. This (as expected) modifies the src/custom-library-bundle directory and sets off several watches/reloads as the bundle is populated. Technically it works? And the behavior is expected, but ideally, I could tell it to wait until the custom installation work is "done" and only then have it trigger the compilation/reload.
Phew. This isn't an easy one to explain. Here's a webpack config that will (hopefully) help clarify:
devServer: {
  contentBase: path.join(__dirname, 'public'),
  port: 9000,
  host: '127.0.0.1',
  after: (app, server) => {
    new MyCustomWatcherForOtherThings().watch(() => {
      // invoked after the src/custom-library-bundle is done doing its thing (each time)
      // now that I know it's done, I want to trigger the normal compilation/reload
    });
  },
  watchOptions: {
    ignored: [
      /node_modules/
      // I've been experimenting with the ignored feature a bit
      // path.resolve(__dirname, 'src/custom-library-bundle/')
    ]
  }
}
Ideal Approach: In my most ideal scenario, I want to just manually trigger webpack dev server to do its thing from my custom watch callback, and have it ignore the src/custom-library-bundle until I tell it to pay attention. I can't seem to find a way to do this, however. Is this possible?
Alternate Approach #1: I could ignore the src/custom-library-bundle directory, move the updated files to public (not utilizing the webpack approach), and then trigger just a reload when I know that's complete. I don't love this because I want to use the same process whether I'm watching or just doing a one-off build (I want everything to end up in the public directory because webpack did that work, not because I wrote a script to put it there under specific scenarios). But let's say I get over that: how can I trigger a browser reload for the dev server? Is this possible?
Alternate Approach #2: This is the one I'm leaning towards, but it feels like extra, unnecessary work. I could have my custom build process output to a different directory (one that my webpack setup doesn't care about at all). Once the build process is complete, I could move all the files to src/custom-library-bundle, where the watch would pick up one change and do a single compilation/reload. This gets me so close, but it feels like I'm adding a step I don't want to.
Alternate Approach #3? Can you think of a better way to solve this?
Update (including versions):
webpack 4.26.1
webpack-dev-server 3.1.14
Add the following server.sockWrite call to your after method:
devServer: {
  after: (app, server) => {
    new MyCustomWatcherForOtherThings().watch(() => {
      // invoked after the src/custom-library-bundle is done doing its thing (each time)
      // now that I know it's done, I want to trigger the normal compilation/reload
      // calling this on `server` triggers a full page refresh
      server.sockWrite(server.sockets, "content-changed");
    });
  }
}
I've never found this in the documentation, but one of webpack's core devs mentioned it in a comment on GitHub, so it's sort of sanctioned.
Useful things that webpack provides that come to mind are multi-compiler builds, creating a child compiler from a Compiler or Compilation instance, the DllPlugin, and programmatically managing the compiler by calling compiler.watch() or compiler.run().
The multi-compiler setup is useful when you want to build several compilations in a single run whose outputs are independent of each other.
Child compilers allow you to set up dependencies between compilers, and via hooks they let you block the parent until, say, the child compilation has finished emitting the latest bundle into the assets.
The DllPlugin allows you to create a compilation and its manifest that can produce chunks containing modules usable as dependencies for compilations that haven't been built yet (the manifest needs to be produced and passed to them beforehand).
Programmatically managing your compiler lets you write a straightforward Node.js script that can accomplish most of this manually.
If I understood correctly, your webpack compiler doesn't really have any dependencies on your bundle aside from expecting it to have something added to the output assets, so you might consider doing a multi-compiler build. If that doesn't work for you, you can write a simple plugin that creates a child compiler that watches all the component bundle dependencies and emits the built bundle into the main compilation assets on changes. Ultimately, as you mentioned yourself, you could write a simple script and programmatically weave everything together. All of these approaches offload tracking build dependencies to webpack, so if you put the compilers into watch mode webpack will keep track when something needs updating.
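As a concrete illustration of the programmatic route, here is a minimal sketch assuming webpack 4's Node API; both config file names are placeholders:

// build-and-watch.js — drive webpack programmatically instead of via the CLI
const webpack = require('webpack');

// hypothetical configs: one for the component bundle, one for the main app
const bundleConfig = require('./webpack.bundle.config.js');
const mainConfig = require('./webpack.config.js');

// multi-compiler build: webpack runs both compilations and, in watch mode,
// keeps track of which one needs recompiling when files change
const compiler = webpack([bundleConfig, mainConfig]);

compiler.watch({ ignored: /node_modules/ }, (err, multiStats) => {
  if (err) {
    console.error(err);
    return;
  }
  console.log(multiStats.toString({ colors: true }));
});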
If you're interested in taking a closer look at how child compilers are created and used, I would heartily recommend reading through the sources of html-webpack-plugin. It doesn't handle quite the same kind of build setup as yours, but it's worth noting that the plugin works with files that aren't part of the build dependencies (nothing in the sources references or depends on the files/templates that are used for creating the HTML files that are added to the output).

How to import Parcel JS generated bundles into "legacy" application?

I'm working with an application with a front end built using old school techniques (jQuery and direct DOM manipulation) and I would like to move it over to ES8 and React. Since this is a rather large and complex application, this move will have to be gradual, meaning both the legacy code and React code will have to live side by side for some time.
The legacy code uses a home-brew "module loader", which needs to keep working. I've been looking at using Webpack and its configuration option libraryTarget: 'var', which basically outputs each entry point into a global variable. This works, but webpack's build-time performance isn't good enough, so I have been looking at using ParcelJS instead.
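For reference, the webpack setup being replaced looks roughly like this (a sketch; the entry name is illustrative):

// webpack config (sketch): each entry's exports land in a global variable
module.exports = {
  entry: { ABundle: './src/ABundle.js' },
  output: {
    filename: '[name].js',
    library: '[name]',   // global variable name, one per entry
    libraryTarget: 'var' // emits `var ABundle = ...`
  }
};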
Is it possible to achieve something similar to Webpack's libraryTarget: 'var' with ParcelJS? Basically, in a "legacy HTML file" (which is often server generated and may contain data I need to pass on to the ES8 modules), I would like to be able to do something along the lines of:
<script src="dist/js/ABundle.js"></script> <!-- Bundle created by ParceJS -->
<script>
var data = {/* JSON generated by server */};
var ABundle = require('ABundle'); // Export defined in ABundle.js.
ABundle.render(data); // Function exported in ABundle.
</script>
Note that I cannot pass my HTML files as entry points to ParcelJS, as they contain references to JavaScript files using the home-brew module loader, which won't play nice with ParcelJS. I only want to pass the ES8 modules as entry points to ParcelJS and use them side by side with the home-brew module loader.
EDIT: Clarified that legacy HTML is in fact server generated.
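For what it's worth, Parcel 1 has a --global CLI flag that exposes an entry's exports as a UMD global, which is close in spirit to webpack's libraryTarget: 'var'. A hedged sketch (the file layout is an assumption, and the flag's behavior should be verified against your Parcel version):

// src/ABundle.js — built with: parcel build src/ABundle.js --global ABundle
// (--global wraps the output so the exports land on window.ABundle)
export function render(data) {
  // hypothetical render logic consuming the server-generated data
  document.getElementById('a-bundle-root').textContent = JSON.stringify(data);
}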

Attempting to load a JavaScript sdk into an Angular2 application. Can't find all dependencies

I'm attempting to make use of this library: https://github.com/MagicTheGathering/mtg-sdk-javascript in an Angular2 application.
Unfortunately, I've been going in circles trying to load it into my application.
Firstly, on the TypeScript side if I import it using:
import { } from 'mtgsdk';
there are no types to load into the {}.
If I attempt to load it using something similar to:
import * as mtg from 'mtgsdk'
I'm unable to, because it says that it can't find a module named mtgsdk.
I've installed the module using
npm install --save mtgsdk
Also, npm installs work fine for other modules.
The application compiles fine if I load it in using require via something similar to this:
var mtg = require('mtgsdk');
Taking that approach, I'm able to compile and launch, but in the browser I get a number of errors about modules that it can't find. I figure they are prerequisites for the SDK that didn't get loaded, so I start bringing them in via package.json.
For every one that I bring in, I then have to go to systemjs.config.js and add an entry pointing to the module's entry point and often have to specify a default extension using blocks like this:
Pointer entries:

'mtgsdk': 'npm:mtgsdk/lib/index.js',
'request-promise': 'npm:request-promise/lib/rp.js',
'ramda': 'npm:ramda/dist/ramda.js',
'emitter20': 'npm:emitter20/index.js',
'bluebird': 'npm:bluebird/js/browser/bluebird.js',
'request': 'npm:request/index.js'

Default extension:

'request-promise': {
  defaultExtension: 'js'
}
I'm not sure that's the right approach, though, because the more dependencies I add, the more I end up requiring. At one point I had literally added 50 extra dependencies, because every time I launched, the browser console would find more that were needed.
Is there any easier way to load all of these in?
Also, some of them (such as tough-cookie and request-promise-core) were very problematic to load, and I couldn't get the browser console to stop complaining about them. Finally, some of them seemed very basic, such as url, http, and https; those seem like they should already be present.
SystemJS was used in earlier versions of Angular 2; however, Angular 2 has since evolved into Angular 4, with new tooling like the Angular CLI.
I recommend you use the Angular CLI (@angular/cli).
Importing Node modules
Since mtgsdk is a Node module, you can easily import it using
import * as mtg from 'mtgsdk'
However, for your program to compile, you must install a type definition for it, or declare one in typings.json; otherwise your app might not build.
Importing Client Scripts
For client scripts like firebase.js, you no longer need to add entries in systemjs.config.js.
With @angular/cli, you simply add them to the scripts[] array in your angular-cli.json and they are compiled in automatically.
Then access them like this:
declare const firebase: any;
Here is a quickstart tutorial for setting up Angular with @angular/cli.
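Putting the client-script approach together, a sketch (the script path and component are assumptions, not from the question):

// angular-cli.json excerpt — the CLI concatenates these into a scripts bundle:
//   "scripts": [
//     "../node_modules/firebase/firebase.js"
//   ]

// some.component.ts — declare the global instead of importing the module
declare const firebase: any;

export class SomeComponent {
  connect() {
    // firebase exists at runtime because the CLI loaded the script globally
    firebase.initializeApp({ /* config */ });
  }
}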

Is there a way to lazily set the path of a resource with RequireJS?

So, I have an app that is using requireJS. Quite happily. For the most part.
This app makes use of Socket.IO. Socket.IO is being provided by nodejs, and does not run on the same port as the main webserver.
To deal with this, in our main js file, we do something like this:
var hostname = window.location.hostname;
var socketIoPath = "http://" + hostname + ":3000/socket.io/socket.io";
requirejs.config({
  baseUrl: "/",
  paths: {
    app: "scripts/appapp",
    "socket.io": socketIoPath
  }
});
More complicated than this, but you get the gist.
Now, in interactive mode, this works swimmingly.
The ugliness starts when we try to use r.js to compile this (technically we're using grunt to run r.js, but that's beside the point).
In the config for r.js, we set an empty path for socket.io (to avoid it failing to pull in), and we set our main file as the mainConfigFile.
The compiler yells about this, saying:
Running "requirejs:dist" (requirejs) task
>> Error: Error: The config in mainConfigFile /…/client.js cannot be used because it cannot be evaluated correctly while running in the optimizer. Try only using a config that is also valid JSON, or do not use mainConfigFile and instead copy the config values needed into a build file or command line arguments given to the optimizer.
>> at Function.build.createConfig (/…/r.js:23636:23)
Now, near as I can figure, this is due to the fact that I'm using a variable to set the path for "socket.io". If I take this out, require runs great, but I can't run the raw app from a server. If I leave it in, my debug server is happy, but the build breaks.
Is there a way that I can lazily assign the path of "socket.io" at runtime so that it doesn't have to go into the requirejs.config() method at that point?
Edit: Did some extensive research on this. Here are the results.
Loading from CDN with RequireJS is possible with a build. However, if you're using the smaller Almond loader, it's not possible.
This leaves you with a few options:
1. Use Almond along with a local copy of the file in your build.
2. Use the full require.js loader and try to use a CDN.
3. Use a <script> tag just for that resource.
I say try for #2 because there are some caveats. You'll need to include require.js in your HTML with the data-main attribute pointing at your built file. But if you do this, require and define will be global functions, allowing users to require any of your internal modules and mess around with them. If you're okay with this, you'll need to follow the "empty:" scheme in your build config (but not in your main config), as sketched below.
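That "empty:" scheme looks like this in the r.js build config (a sketch; the surrounding options are placeholders):

// build.js — r.js optimizer config (sketch)
({
  mainConfigFile: 'client.js',
  name: 'client',
  out: 'dist/client.built.js',
  paths: {
    // tell the optimizer not to bundle socket.io; it resolves at runtime
    'socket.io': 'empty:'
  }
})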
But the fact remains that you now have another HTTP request. If you only want one built file, which includes the require.js loader, you'll need to optimize for only one file.
Now, if you want to avoid users being able to require your modules, you'll have to do something like wrap:true in your build. But as far as I can tell, once your module comes down from CDN, if it's AMD, it's going to look for a global define function to register itself with, and that won't exist because it's now wrapped in a closure.
The lesson I took away from all this: inline your resources to your build. It makes sense. You reduce HTTP requests, minify it all and get gzip compression. You don't expose your modules to the world and everything is a lot simpler. If you cache your resources properly you won't even need to worry about it.
But since new versions of socket.io don't like AMD, here's how I did it. Make sure to include the socket.io <script> tag before requirejs. Then create a requirejs module named socket.io with the following contents:
define([], function () {
  var io = window.io;
  window.io = null;
  return io;
});
Set the path like so: 'socket.io': 'core/socket.io' or wherever you want.
And require it as normal! The build works fine this way.
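With that shim in place, consumption is an ordinary AMD require (the module id matches the path entry above):

// anywhere in the app — 'socket.io' now resolves to the shim module
require(['socket.io'], function (io) {
  var socket = io(); // connect using the global client captured by the shim
});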
Original answer
Is it possible that you could make use of the path config fallbacks specified in the RequireJS API? Maybe you could save the file locally as a fallback so your build will work.
The socket.io GitHub repository specifies that you can serve the client with the files in the socket.io-client package's dist/ directory.
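A sketch of those path fallbacks, assuming a local copy is served from that dist/ directory (the local path is illustrative): RequireJS tries each entry in order and falls back to the next when one fails to load.

requirejs.config({
  baseUrl: "/",
  paths: {
    "socket.io": [
      // primary: the Socket.IO server on its own port
      "http://" + window.location.hostname + ":3000/socket.io/socket.io",
      // fallback: a local copy, e.g. from socket.io-client's dist/ directory
      "lib/socket.io"
    ]
  }
});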
