I'm trying to load multiple modules on the fly via chokidar (watchdog) using the Meteor 1.6 beta; however, after doing extensive research on the matter, I just can't seem to get it to work.
From what I gather, require by design will not accept anything other than static strings, e.g.
require("test/string/here")
But if I try:
var path = "test/string/here"
require(path)
I just get Error: Cannot find module, even though the strings are identical.
Now the thing is, I'm uncertain how to go about this: am I really forced to use either import or static strings when using Meteor, or is there some workaround?
watchdog(cmddir, (dir) => {
  match = "." + regex_cmd.exec(dir);
  match = dir;
  loader.emit("loadcommand", match);
});

loader.on('loadcommand', (file) => {
  require(file);
});
There is something intrinsically weird in what you describe.
chokidar is used to watch actual files and folders.
But Meteor compiles and bundles your code, resulting in an app folder after build that is totally different from your project structure.
Although Meteor now supports dynamic imports, the mechanism is internal to Meteor and does not rely on your actual project files, but on Meteor built ones.
If you want to dynamically require files like in Node, including with dynamically generated module paths, you should avoid import and require statements, which Meteor automatically replaces with its built-in import mechanism. Instead you would have to write your own loading function, keeping in mind that your app's built folder is different from your project folder.
That may work, for example, if your server is watching files and/or folders in a static location, different from where your app will be running.
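For illustration only, here is a rough sketch of what such a hand-rolled loader could look like on the server, assuming the watched folder lives outside the Meteor build output (the folder path, file name, and eval-based loading are all assumptions, not Meteor APIs):
// load-external.js (server only) -- hypothetical sketch, not a Meteor API
const fs = require('fs');
const path = require('path');
const vm = require('vm');

// Load a CommonJS-style file from an absolute path the Meteor bundler never sees.
function loadExternalModule(absolutePath) {
  const source = fs.readFileSync(absolutePath, 'utf8');
  const fakeModule = { exports: {} };
  // Evaluate the file in this process, giving it its own module/exports objects.
  const wrapped = `(function (module, exports, require) {\n${source}\n})`;
  vm.runInThisContext(wrapped)(fakeModule, fakeModule.exports, require);
  return fakeModule.exports;
}

// e.g. called from the watcher callback with a path outside the build folder:
// const cmd = loadExternalModule(path.resolve('/srv/commands', 'hello.js'));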
In the end, I feel this is a sort of XY problem: you have not described your actual objective in the first place, and the issue above is about a workaround that does not fit how Meteor works, so it may not be the most appropriate solution for your implicit objective.
@Sashko does a great job of explaining Meteor's dynamic imports here. There are also docs.
A dynamic import is a function that returns a promise instead of just importing statically at build time. Example:
import('./component').then((MyComponent) => {
  render(MyComponent);
});
The promise runs once the module has been loaded. If you try to load the module repeatedly then it only gets loaded once and is immediately available on subsequent requests.
As far as I can tell, you can use a variable for the string to import.
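For example, something along these lines should work (the module path and name here are made up for illustration, not taken from the question):
const name = 'component'; // hypothetical module name computed at runtime
import(`./${name}`).then((MyComponent) => {
  render(MyComponent);
});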
I have a package (let's call it preset) that imports a component (component). Both are maintained by me. For backwards compatibility, preset additionally has to import a specific function from component, which is only available from a specific file (not the one in the main field). To do this, I use the require('component/lib/file.js') syntax.
However, file.js has side-effects. This is fine normally, but for Webpack builds I also have lib-mjs (with ES module syntax for tree shaking). This causes file.js to be loaded twice, once indirectly in lib-mjs via component, and once directly in lib.
A few options:
I could move file.js out of lib/lib-mjs altogether. However, this complicates the build process and messes up the repo structure.
I could export the specific function anyway. However, since component follows the same pattern with a lot of different components, I'd have to change that in a lot of places.
I could remove the side-effect to some init function. However, this would be a breaking change and involve quite some work.
I could make sure the side-effect only happens if it hasn't happened already. However, that and the previous change still have the file loaded twice. On top of that, it would require some semi-global state; I don't think I can pull off duck typing for this one.
Is there some option like require(require.resolve('component') + '/file.js') that actually works for both Node and Webpack?
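For reference, a Node-only sketch of roughly what I'm after (assuming file.js sits next to the package's main entry; Webpack still can't statically analyze this, which is exactly the problem):
// Works in plain Node: resolve the package entry, then load a sibling file next to it.
const path = require('path');
const pkgEntry = require.resolve('component'); // e.g. .../node_modules/component/lib/index.js
const specific = require(path.join(path.dirname(pkgEntry), 'file.js'));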
I am coding a plugin that, for specific modules, will try to execute the module generated at build time in order to save the result to a json file.
For that, I am tapping into compilation.hooks.succeedModule, which receives a NormalModule object already built. Then I am trying to eval the source replacing webpack variables like __webpack_public_path__.
While it kind of works, this approach feels terribly wrong. Like I am missing something.
Is there a nice way to execute modules at build time from a NormalModule object, with basic access to vars like __webpack_public_path__? Maybe Webpack offers a better way to do this kind of thing?
Ok, yeah, sounds like you can solve this another way, I've done similar stuff where I needed to change what a module output, write stuff to disk, trigger side effects, etc. It sounds like you want loaders rather than a plugin. The run-loader (https://www.npmjs.com/package/webpack-run-loader) executes the module it loads and exports or returns the result.
You can write a custom loader that you chain to run after responsive-loader, and run-loader, and which receives the JSON from run-loader and writes it to disk where you want it (as a side effect), and then returns an empty string so that nothing is added to the build. The end result would be that requiring this module in your app gets your image files created (by responsive-loader), and the JSON written out to disk where you need it (by your custom loader). Alternately you could skip run-loader and in your custom loader use regex to just grab the JSON from the output of responsive-loader. Using regex on code generated by a project dependency seems fragile, but as long as you have your dependency versions locked down it can work just fine in practice, and it's a bit simpler conceptually than adding run-loader to the pipeline.
If you're writing webpack plugins I imagine you're comfortable writing loaders as well, but if not they're pretty straightforward -- just a function that accepts source code from the loader that came before it and returns code, and does whatever you want in between. The docs aren't bad for the API, but looking at the source of a few published loaders is helpful, too. It might look roughly (just spitballing from memory) like:
// img-info-logging-loader.js
// regex version, expects source arg to be output of responsive-loader
import * as fs from 'fs';

// a webpack loader module exports a single function that receives the previous
// loader's output and returns whatever code should continue down the chain
export default function imgInfoLoggingLoader(source) {
  const jsonFinderRegex = /someregexto(match)onsource/;
  const matchArr = jsonFinderRegex.exec(source);
  if (!matchArr || !matchArr[1]) {
    throw new ReferenceError('json output not found in loader source.');
  } else {
    const imgConfigJsonString = matchArr[1];
    // you would write a fn to generate a filename based on the
    // source, or based on the module's filename, which is available
    // via the webpack loader api
    const fileNameToWrite = getFileNameToWrite();
    try {
      // async might be preferable depending on your webpack
      // performance needs
      fs.writeFileSync(fileNameToWrite, imgConfigJsonString);
    } catch (err) {
      throw new Error(`error writing ${fileNameToWrite}`);
    }
  }
  // what the loader inserts into your JS asset: empty string
  return '';
}
EDIT:
Since per your comment you are looking to output a single JSON object with all of the image info in it, you would want a slightly different approach that does use a plugin (this is the most elegant way I know to do it, there may be others). As far as I know a plugin is the only way to 'do something' when webpack is done loading modules.
You still want a custom loader that is extracting the JSON from the output of the responsive-loader, as described above. It won't write each to disk, though. Instead your loader will call a method on the following module:
You also write a json-collector.js that is just a little node module that you will use to hold on to the JSON object you're building. This bit is awkward because it's separate from the loader but the loader needs it. This collector module is simple, though, and if you wanted to be cleaner you could turn it into a more generic module and treat it as a proper, separate node dependency. All it is is an object with a method for adding JSON data, which appends it to an internal JSON object, and one for reading out the collected data, which returns the JSON.
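A minimal sketch of what that collector could look like (the module and method names are placeholders, not anything webpack prescribes):
// json-collector.js -- hypothetical sketch of the shared collector module
const collected = {};

module.exports = {
  // called by the custom loader for each image module it processes
  add(key, data) {
    collected[key] = data;
  },
  // called by the plugin once webpack has finished building
  read() {
    return collected;
  },
};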
And then you have a plugin that hooks into the end of the build (I think there's one for 'build sealed' that I've used). When that hook is reached, you know webpack has no more modules to load, so the plugin now calls the 'read' method on the json-collector, gets the JSON object from it and writes that to disc.
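Roughly, the plugin end of it might look like this (assuming the webpack 4+ hooks API; the hook name and output path are illustrative, not prescriptive):
// write-img-json-plugin.js -- hypothetical sketch
const fs = require('fs');
const collector = require('./json-collector');

class WriteImgJsonPlugin {
  apply(compiler) {
    // 'done' fires after the compilation is sealed and emitted,
    // so every module (and thus every loader call) has already run
    compiler.hooks.done.tap('WriteImgJsonPlugin', () => {
      fs.writeFileSync('img-info.json', JSON.stringify(collector.read(), null, 2));
    });
  }
}

module.exports = WriteImgJsonPlugin;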
This solution doesn't fit the standalone plugin/standalone loader convention in webpack, but if that doesn't bother you it's actually pretty straightforward: each of the three pieces has a simple job to do. I've used this pattern multiple times and it's worked for me.
I've been looking to develop a method for loading modules and/or components into an AOT-compiled Angular 4 application and been stymied by a variety of solutions that never quite seem to get me where I want to be.
My requirements are as such:
My main application is AOT compiled, and has no knowledge of what it is loading until runtime, so I cannot specifically identify my dynamic module as an entry component at compile time (which is explicitly necessary for the 'dynamic' component loading example presented on Angular.io)
I'd ideally love to be able to pull the code from a back end database via a GET request, but I can survive it simply living in a folder alongside the compiled site.
I'm using Webpack to compile my main application, breaking it into chunks, and so a lot of the SystemJS-based solutions seem like dead ends; based on my current research, I could be wrong about this.
I don't need to know or have access to any components of my main application directly - in essence, I'd be loading one angular app into another, with the dynamically loaded module only perhaps having a few tightly controlled explicit interface points with the parent application.
I've explored using tools like SystemJsNgModuleLoader - which seems to require that I have the Angular compiler present, which I'm happy to do if AOT somehow allowed me to include it even if I'm not using it elsewhere. I've also looked into directly compiling my dynamic module using ngc and loading the resulting ngfactory and compiled component/module, but I'm not clear if this is at all possible or if so - what tools Angular makes available to do so. I have also seen references to ANALYZE_FOR_ENTRY_COMPONENTS - but can't clearly dig up what the limitations of this are, as first analysis indicates its not quite what I'm looking for either.
I had assumed I might be able to define a common interface and then simply make a GET request to bring my dynamic component into my application, but Angular seems painfully allergic to anything I try to do short of stepping outside of it altogether and trying to attach non-Angular code to the DOM directly.
Is what I'm trying to do even possible? Does Angular 2+ simply despise this kind of on-the-fly modification of its internal application architecture?
I think I found an article that describes exactly what you are trying to do. In short you need to take over the bootstrap lifecycle.
The magic is in this snippet here.
import {AComponentNgFactory, BComponentNgFactory} from './components.ngfactory.ts';

@NgModule({
  imports: [BrowserModule],
  declarations: [AComponent, BComponent]
})
export class AppModule {
  ngDoBootstrap(app) {
    fetch('url/to/fetch/component/name')
      .then((name) => { this.bootstrapRootComponent(app, name); });
  }

  bootstrapRootComponent(app, name) {
    // pick the compiled factory that matches the name fetched at runtime
    const options = {
      'a-comp': AComponentNgFactory,
      'b-comp': BComponentNgFactory
    };
    app.bootstrap(options[name]);
  }
}
https://blog.angularindepth.com/how-to-manually-bootstrap-an-angular-application-9a36ccf86429
I have the following situation.
EntryA
require("./test.js");
EntryB
require("./test.js");
test.js
module.exports = "something";
I want to use webpack to compile these JavaScript files. On the HTML page I just want to include EntryA.js. Everything that is common between EntryA and EntryB should go in a separate file. Then, when test.js is required, that common file should be downloaded from the network.
Is this possible, and how should I proceed?
Webpack offers two ways to split out your JS.
Use multiple entry points (As you have done).
Define split points via require.ensure (or System.import in Webpack 2).
Approach 1 is really intended to be used by traditional 'multi' page websites and requires you to import the appropriate entry point on the correct page. NOTE: You can use the CommonsChunkPlugin to extract out all the shared code but it still must be manually referenced.
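A rough sketch of approach 1 with the CommonsChunkPlugin (this assumes webpack 2/3, where that plugin still exists; file names are illustrative):
// webpack.config.js -- hypothetical sketch for approach 1
const webpack = require('webpack');

module.exports = {
  entry: {
    EntryA: './EntryA.js',
    EntryB: './EntryB.js'
  },
  output: { filename: '[name].js' },
  plugins: [
    // pulls modules shared by both entries (like test.js) into commons.js,
    // which you then have to reference manually with its own <script> tag
    new webpack.optimize.CommonsChunkPlugin({ name: 'commons', filename: 'commons.js' })
  ]
};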
Approach 2 is more geared towards single page apps and will automatically load in new scripts as necessary.
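And a sketch of approach 2, where the shared module is only fetched when the code path that needs it runs (require.ensure syntax; the chunk name is an assumption):
// Somewhere in EntryA.js: creates a split point; './test.js' lands in a separate
// chunk that webpack downloads on demand the first time this callback is reached.
require.ensure(['./test.js'], function (require) {
  var shared = require('./test.js');
  console.log(shared); // "something"
}, 'shared-chunk');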
I tried something like:
var path = '../right/here';
var module = require(path);
but it can't find the module anymore this way, while:
var module = require('../right/here');
works like a charm. I'd like to load modules from a generated list of strings, but I can't wrap my head around this problem at the moment. Any ideas?
You can use a template literal to build the file path dynamically:
var myModule = 'Module1';
var Modules = require(`../path/${myModule}`)
This is due to how Browserify does its bundling: it can only do static string analysis for requirement rebinding. So, if you want to do Browserify bundling, you'll need to hardcode your requirements.
For code that has to go into production deployment (as opposed to quick prototypes, which you rarely ever bother to add bundling for), it's always advisable to stick with static requirements, partly because of the bundling, but also because using dynamic strings for your requirements means you're writing code that isn't predictable, and can thus be full of bugs you rarely run into and that are extremely hard to debug.
If you need different requirements based on different runs (say, dev vs. stage testing vs. production), then it's usually a good idea to use process.env or a config object, so that when it comes time to decide which library to require for a specific purpose, you can use something like:
var knox = config.offline ? require("./util/mocks3") : require("knox");
That way your code also stays immediately traversable for others who need to track down where something's going wrong, in case a bug does get found.
require('#/path/'.concat(fileName))
You can use .require() to add the files you want to access, computing their paths instead of hardcoding them at build time; that way these modules will be included in the bundle and will be found when require() is called later.
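If this refers to Browserify's bundler API, a rough sketch of the pattern might look like the following (the file names and exposed id are made up for illustration):
// build.js -- hypothetical Browserify build script
const fs = require('fs');
const browserify = require('browserify');

const b = browserify('./app.js');
// Register the module under a fixed id so a runtime require('plugins/foo')
// resolves even though the bundler never saw a static string for it.
b.require('./plugins/foo.js', { expose: 'plugins/foo' });
b.bundle().pipe(fs.createWriteStream('bundle.js'));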