Modify build path of assets in Vite/Rollup? - javascript

Let's say I have this folder structure:
parent
|-parent.html
|-parent.js
|-child
|--child.html
|--child.js
I want them to output in the same structure in my dist folder.
By default this is what gets output:
dist/assets/parent.js
dist/assets/child.js
I want them to output like this:
dist/parent/parent.js
dist/parent/child/child.js
I tried changing the assetFileNames option of Rollup but it didn't do anything.

The output filenames are configured in Rollup with build.rollupOptions. Set output.entryFileNames to configure the location of the entry .js files to match their original directory structure:
// vite.config.js
import { fileURLToPath } from 'url';
import { defineConfig } from 'vite';
import path from 'path';
const rootDir = fileURLToPath(new URL('.', import.meta.url));
export default defineConfig({
  build: {
    rollupOptions: {
      input: {
        parent: './parent/parent.html',
        child: './parent/child/child.html',
      },
      output: {
        entryFileNames: (assetInfo) => {
          // assetInfo.facadeModuleId contains the file's full path
          if (assetInfo.facadeModuleId) {
            const assetPath = path.dirname(assetInfo.facadeModuleId).replace(rootDir, '');
            return assetPath + '/[name]-[hash].js';
          } else {
            return 'assets/js/[name]-[hash].js';
          }
        },
      },
    },
  },
});
demo
Notes
Assets (such as .css files) and shared modules (vendor .js chunks) cannot be redirected with the solution above because the asset info from the related hooks does not provide the file's full path (a partial, type-based workaround is sketched just after these notes).
In a vanilla Rollup project, output.preserveModules=true would've accomplished the original goal, but that option conflicts with Vite's own settings for Rollup.
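If grouping those emitted files by type (rather than by their original directory) is acceptable, a hedged sketch of what could be added next to entryFileNames in the same output block looks like this; the assets/ prefix and the extension-based routing are assumptions of mine, not something the answer above verified:
// sketch only: route emitted assets by extension, since assetFileNames
// does not expose the source file's full path
assetFileNames: (assetInfo) => {
  const ext = assetInfo.name ? assetInfo.name.split('.').pop() : '';
  return ext === 'css'
    ? 'assets/css/[name]-[hash][extname]'
    : 'assets/[ext]/[name]-[hash][extname]';
},
// shared (vendor) chunks get a fixed folder as well
chunkFileNames: 'assets/js/[name]-[hash].js',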

Related

Webpack: ModuleNotFoundError for an unimported module

The problem
My project is a yarn monorepo in which, among various packages, there is a NextJS app and a configuration package (which is also shared with the server and a react-native application). The configuration module exports the production keys if the environment is production, and the development keys if the project is under development.
import { merge } from "lodash";
import { IConfig, RecursivePartial } from "./interfaces";
import { defaultKeys } from "./default.keys";
import { existsSync } from "fs";
const secretKeys: RecursivePartial<IConfig> = (function (env) {
switch (env) {
case "production":
return require("./production.keys").productionKeys;
default:
try {
if (!existsSync("./development.keys")) {
return require("./production.keys").productionKeys;
} else {
return require("./development.keys").developmentKeys;
}
} catch (e) {
}
}
})(process.env.NODE_ENV);
export const keys = merge(defaultKeys, secretKeys) as IConfig;
Of course, the development configuration file is included in the .gitignore file and therefore does not exist during production.
This has always worked perfectly (with the server and the mobile app); indeed, there was never a need to check with the fs module whether the development.keys file exists (a check I added later).
However, I recently added a NextJS application to the project. I encountered no problems during development, but when I tried to deploy the app on Heroku I ran into this error:
ModuleNotFoundError: Module not found: Error: Can't resolve './development.keys' in '/tmp/build_43d652bc/packages/config/dist/keys'
What I did
Initially, I thought it was a problem with the environment variables and that in production the development file was wrongly required.
However, I later realized that this was not the problem. Indeed, no matter where I place the require of the development configuration file, even after a return statement, the error still occurs.
import { merge } from "lodash";
import { IConfig, RecursivePartial } from "./interfaces";
import { defaultKeys } from "./default.keys";
import { existsSync } from "fs";
const secretKeys: RecursivePartial<IConfig> = (function (env) {
switch (env) {
case "production":
return require("./production.keys").productionKeys;
default:
try {
if (!existsSync("./development.keys")) {
return require("./production.keys").productionKeys; // <-------
} else {
return require("./production.keys").productionKeys; // <-------
}
require("./development.keys").developmentKeys; // <------- This line should not be executed
} catch (e) {
return require("./production.keys").productionKeys;
}
}
})(process.env.NODE_ENV);
export const keys = merge(defaultKeys, secretKeys) as IConfig;
It is as if, during the build, NextJS (or rather webpack) statically resolves all the imports, without following the "flow" of the code.
I hope someone shows me where I am wrong because at the moment I am stuck. Thank you!
Update
Thanks to the ideas of this discussion I changed my file which now looks like this:
const getConfigPath = (env?: string): string => {
console.log(env);
if (env === "production") return "production.keys";
else return "development.keys";
};
const secretKeys: RecursivePartial<IConfig> = require("./" +
getConfigPath(process.env.NODE_ENV)).keys;
export const keys = merge(defaultKeys, secretKeys) as IConfig;
However, now I'm running into another webpack error:
Module parse failed: Unexpected token (2:7)
You may need an appropriate loader to handle this file type, currently no loaders are configured to process this file. See https://webpack.js.org/concepts#loaders
| import { IConfig, RecursivePartial } from "../interfaces";
> export declare const keys: RecursivePartial<IConfig>;
It is as if webpack does not recognize the declaration files generated by TypeScript. The error is new to me and I have never encountered it before; I believe it's due to the webpack behavior pointed out in the linked discussion.
I'd appreciate some help, since I know little about webpack.
Edit
This is my next.config.js:
const path = require("path");
module.exports = {
distDir: '../../.next',
sassOptions: {
includePaths: [path.join(__dirname, 'styles')],
prependData: `@import "variables.module.scss";`
},
webpack: (config, { isServer }) => {
if (!isServer) {
config.resolve.fallback.fs = false;
}
return config;
},
};
Basically these are the default configurations. The only things I have changed are the build output path, making a sass file global, and temporarily adding a piece of code to allow the use of the fs module, which, as you can see above, I no longer use. So I could take that part out.
You need the TypeScript loader configured in your next.config.js:
npm install awesome-typescript-loader
# or
yarn add awesome-typescript-loader
const path = require("path");
module.exports = {
distDir: '../../.next',
sassOptions: {
includePaths: [path.join(__dirname, 'styles')],
prependData: `@import "variables.module.scss";`
},
webpack: (config, options) => {
const { dir, defaultLoaders, isServer } = options;
if (!isServer) {
config.resolve.fallback.fs = false;
}
config.pageExtensions.push(".ts", ".tsx");
config.resolve.extensions.push(".ts", ".tsx");
config.module.rules.push({
test: /\.tsx?$/,
include: [dir],
exclude: /node_modules/,
use: [
defaultLoaders.babel,
{
loader: "awesome-typescript-loader",,
},
],
});
return config;
}
}

NextJS: Can't use absolute imports in external libraries when using experimental.externalDir

I have the following folder structure in my monorepo, which uses NextJS, lerna and npm workspaces:
packages
next-js-app
pages
index.tsx
tsconfig.json
ui-library
src
components
dropdown.tsx
index.ts
utils.ts
tsconfig.json
package.json
lerna.json
tsconfig.json
I want to import the ui-library in the next-js-app such as:
// packages/next-js-app/pages/index.tsx
import { UiLibrary } from '@workspaceName/ui-library'
I enabled it by adding externalDir: true to the experimental key in the next.config.js inside the next-js-app:
module.exports = {
...
experimental: {
externalDir: true
}
...
}
Problem
The import works, but inside the packages/ui-library/src/components/dropdown.tsx there's a line:
// packages/ui-library/src/components/dropdown.tsx
import { helperFunction } from 'utils'
meaning that I want to import helperFunction from packages/ui-library/src/utils.ts.
When running next dev script from packages/next-js-app I get the following error:
wait - compiling...
error - ../ui-library/src/components/dropdown.tsx:3:0
Module not found: Can't resolve 'utils'
1 | ...
2 | ...
> 3 | import { helperFunction } from "utils"
4 |
5 | const Dropdown = () => {
6 | const onClick = (event) => {
If I run the ui-library separately, the absolute import paths work fine.
Question
How can I make absolute imports work in this case?
The way I fixed it in my case was adding aliases to my webpack config (inside the next.config.js file):
const path = require('path');

module.exports = {
  ...
  experimental: {
    externalDir: true,
  },
  webpack: (config) => {
    config.resolve = {
      ...config.resolve,
      alias: {
        ...config.resolve.alias,
        'utils': path.resolve('../../packages/ui-library/src/utils'),
        // ^ defined all paths used within my external UI library
      },
    };
    return config;
  },
};
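As a side note (an assumption on my part, not something the answer above covers): the webpack alias only affects bundling, so if TypeScript should also resolve the bare 'utils' specifier for type checking, a matching paths entry in packages/next-js-app/tsconfig.json is usually needed as well; the relative path below is inferred from the folder structure in the question:
// packages/next-js-app/tsconfig.json (sketch)
{
  "compilerOptions": {
    "baseUrl": ".",
    "paths": {
      "utils": ["../ui-library/src/utils"]
    }
  }
}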

Webpack: conditionally require file based on environment mode

I have 3 config JavaScript files like config.local.js, config.dev.js, config.prod.js. An example of one of these files.
// config.dev.js
function Config(){
return{
apiUrl: "https://myapi.dev.com/"
}
}
module.exports = Config;
Now I have an entry file that requires this config file. For instance:
// register.js
function Register(){
const config = require("../config");
// rest of the code to use that config
}
My web pack config looks like this.
module.exports = {
entry: {
register: "./app/src/register.js",
},
output: {
filename: "[name].js",
path: __dirname + "/app/dist",
},
};
I would like webpack to bundle the config.prod.js file when I run it for production (webpack --mode=production).
Is this possible?
See this page: https://webpack.js.org/guides/dependency-management/
require('../config.' + process.env.NODE_ENV)
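Note that the dynamic require above makes webpack create a context that bundles every matching config.* file. If that is undesirable, one hedged alternative (a sketch, not a tested setup; the mode-to-suffix mapping and file locations are assumptions based on the question) is to keep require("../config") in register.js and rewrite the request at build time with NormalModuleReplacementPlugin:
// webpack.config.js (sketch)
const webpack = require("webpack");

module.exports = (env, argv) => {
  // map webpack's --mode to the question's file naming (assumption)
  const suffix = argv && argv.mode === "production" ? "prod" : "dev";
  return {
    entry: { register: "./app/src/register.js" },
    output: { filename: "[name].js", path: __dirname + "/app/dist" },
    plugins: [
      // rewrite requests ending in "config" to the mode-specific file;
      // the relative replacement is resolved from the requesting module
      new webpack.NormalModuleReplacementPlugin(/config$/, "../config." + suffix + ".js"),
    ],
  };
};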

How can I add Font Awesome to my Aurelia project using npm?

I have been following the Contact Manager tutorial and would like to add Font Awesome to the project. Here's what I have done so far:
npm install Font-Awesome --save
Added the following to aurelia.json under the dependencies array of the vendor-bundle.js:
...
{
"name": "font-awesome",
"path": "../node_modules/font-awesome",
"resources": [
"css/font-awesome.min.css"
]
},
...
But when running au run --watch I get the error:
error
C:\Users\node_modules\font-awesome.js
Why is it looking for the .js file?
Don't add the font-awesome resources to aurelia.json - you'd need the font files too, which Aurelia doesn't process. Instead, take the following steps.
First, if you added anything for font-awesome already to your aurelia.json file, remove it again.
Add a new file prepare-font-awesome.js in the \aurelia_project\tasks folder and insert the code below. It copies the font-awesome resource files to the output folder (as configured in the aurelia.json config file; e.g. scripts):
import gulp from 'gulp';
import merge from 'merge-stream';
import changedInPlace from 'gulp-changed-in-place';
import project from '../aurelia.json';
export default function prepareFontAwesome() {
const source = 'node_modules/font-awesome';
const taskCss = gulp.src(`${source}/css/font-awesome.min.css`)
.pipe(changedInPlace({ firstPass: true }))
.pipe(gulp.dest(`${project.platform.output}/css`));
const taskFonts = gulp.src(`${source}/fonts/*`)
.pipe(changedInPlace({ firstPass: true }))
.pipe(gulp.dest(`${project.platform.output}/fonts`));
return merge(taskCss, taskFonts);
}
Open the build.js file in the \aurelia_project\tasks folder and insert the following two lines; this will import the new function and execute it:
import prepareFontAwesome from './prepare-font-awesome'; // Our custom task
export default gulp.series(
readProjectConfiguration,
gulp.parallel(
transpile,
processMarkup,
processCSS,
prepareFontAwesome // Our custom task
),
writeBundles
);
Finally, in the <head> section of your index.html file, just add the following line:
<link rel="stylesheet" href="scripts/css/font-awesome.min.css">
That's all; now you can use font-awesome icons in any Aurelia view modules (html files).
Note that this works for any complex third party library which requires resources which you have to manually include.
Simple default settings method
Here are the 4 simple steps I use to bring in Font-Awesome to an Aurelia project that uses the CLI.
1) npm install font-awesome --save
2) add copyFiles to build of aurelia.json
"build": {
"copyFiles": {
"node_modules/font-awesome/fonts/*": "/fonts/"
},
3) add bundling to dependencies array of aurelia.json
"dependencies": [
{
"name": "font-awesome",
"path": "../node_modules/font-awesome/css",
"main": "font-awesome.css"
},
4) include the import for the css file (mine lives in the app.html)
<require from="font-awesome.css"></require>
=========================================================================
Alternative
Specifying a custom font location
As I was serving my files from a different location, I needed to be able to tweak the configured font location. As such, below are the steps involved if you need to do the same and specify where the fonts are stored. I am using .less.
1, 2) As above.
3) Instead of adding to the bundling, you need to reference the font-awesome less file within your own less file (mine is called site.less) and then set @fa-font-path to your custom location.
@import "../node_modules/font-awesome/less/font-awesome.less";
@fa-font-path: "fonts";
4) There is no step 4 with this method: as long as you have your own compiled equivalent site.css file referenced already (with the import), you don't need to add anything else.
Funny I was trying to get the same thing working this morning. This is all I had to do in my aurelia.json dependencies for it to work:
{
"name": "font-awesome",
"path": "../node_modules/font-awesome/",
"main": "",
"resources": [
"css/font-awesome.min.css"
]
},
Then in my html I had:
<require from="font-awesome/css/font-awesome.min.css"></require>
Not actually answering your question of how to integrate Font Awesome into your application using npm, but there is an alternative, clean way to get it into your application: using the CDN.
As mentioned in other answers, Aurelia currently doesn't support bundling resources other than js, css and html out of the box using the CLI. There's a lot of discussion about this subject, and there are several, mostly hacky, workarounds, like some suggested here.
Rob Eisenberg says he's planning on getting it properly integrated in the Aurelia CLI, but he considers it low priority because there's a simple workaround. To quote him:
Of course there is interest in addressing this. However, it's lower priority than other things on the list for the CLI, in part because a simple link tag will fix the problem and is much easier than the work we would have to do to solve this inside the CLI.
Source: https://github.com/aurelia/cli/issues/248#issuecomment-254254995
Get your unique CDN link mailed here: http://fontawesome.io/get-started/
Include this link in the head of your index html file
Don't forget to remove everything you might have already added to try to get it working: the npm package (and its reference in your package.json), the reference in your aurelia.json file, any custom tasks you might have created, any <require> tags,...
Importing CSS/fonts automagically is now supported.
{
"name": "font-awesome",
"path": "../node_modules/font-awesome/css",
"main": "font-awesome.css"
}
<require from="font-awesome.css"></require>
Check out this issue: https://github.com/aurelia/cli/issues/249
Happy coding
EDIT
I realized from the comments that this does not copy the font files.
Here is an updated build script (ES6) that will copy any resources and add the folder to the .gitignore. If you want the TypeScript version, check here:
https://github.com/aurelia/cli/issues/248#issuecomment-253837412
./aurelia_project/tasks/build.js
import gulp from 'gulp';
import transpile from './transpile';
import processMarkup from './process-markup';
import processCSS from './process-css';
import { build } from 'aurelia-cli';
import project from '../aurelia.json';
import fs from 'fs';
import readline from 'readline';
import os from 'os';
export default gulp.series(
copyAdditionalResources,
readProjectConfiguration,
gulp.parallel(
transpile,
processMarkup,
processCSS
),
writeBundles
);
function copyAdditionalResources(done){
readGitIgnore();
done();
}
function readGitIgnore() {
let lineReader = readline.createInterface({
input: fs.createReadStream('./.gitignore')
});
let gitignore = [];
lineReader.on('line', (line) => {
gitignore.push(line);
});
lineReader.on('close', (err) => {
copyFiles(gitignore);
})
}
function copyFiles(gitignore) {
let stream,
bundle = project.build.bundles.find(function (bundle) {
return bundle.name === "vendor-bundle.js";
});
// iterate over all dependencies specified in aurelia.json
for (let i = 0; i < bundle.dependencies.length; i++) {
let dependency = bundle.dependencies[i];
let collectedResources = [];
if (dependency.path && dependency.resources) {
// run over resources array of each dependency
for (let n = 0; n < dependency.resources.length; n++) {
let resource = dependency.resources[n];
let ext = resource.substr(resource.lastIndexOf('.') + 1);
// only copy resources that are not managed by aurelia-cli
if (ext !== 'js' && ext != 'css' && ext != 'html' && ext !== 'less' && ext != 'scss') {
collectedResources.push(resource);
dependency.resources.splice(n, 1);
n--;
}
}
if (collectedResources.length) {
if (gitignore.indexOf(dependency.name)< 0) {
console.log('Adding line to .gitignore:', dependency.name);
fs.appendFile('./.gitignore', os.EOL + dependency.name, (err) => { if (err) { console.log(err) } });
}
for (let m = 0; m < collectedResources.length; m++) {
let currentResource = collectedResources[m];
if (currentResource.charAt(0) != '/') {
currentResource = '/' + currentResource;
}
let path = dependency.path.replace("../", "./");
let sourceFile = path + currentResource;
let destPath = './' + dependency.name + currentResource.slice(0, currentResource.lastIndexOf('/'));
console.log('Copying resource', sourceFile, 'to', destPath);
// copy files
gulp.src(sourceFile)
.pipe(gulp.dest(destPath));
}
}
}
}
}
function readProjectConfiguration() {
return build.src(project);
}
function writeBundles() {
return build.dest();
}
I believe the bundles.dependencies section is for referencing JS libraries.
In your case, a bit of additional work will be needed. According to Aurelia CLI docs, you can create your own generators as well, which comes in handy for us.
Add some new paths to aurelia.json:
"paths": {
...
"fa": "node_modules\\font-awesome",
"faCssOutput": "src",
"faFontsOutput": "fonts"
...
}
Create a task for css bundling...
au generate task fa-css
Modified task file: aurelia_project\tasks\fa-css.js|ts
import * as gulp from 'gulp';
import * as changedInPlace from 'gulp-changed-in-place';
import * as project from '../aurelia.json';
import {build} from 'aurelia-cli';
export default function faCss() {
return gulp.src(`${project.paths.fa}\\css\\*.min.css`)
.pipe(changedInPlace({firstPass:true}))
/* this ensures that our 'require-from' path
will be simply './font-awesome.min.css' */
.pipe(gulp.dest(project.paths.faCssOutput))
.pipe(gulp.src(`${project.paths.faCssOutput}\\font-awesome.min.css`))
.pipe(build.bundle());
};
...and another for copying font files:
au generate task fa-fonts
Modified task file: aurelia_project\tasks\fa-fonts.js|ts
import * as gulp from 'gulp';
import * as project from '../aurelia.json';
export default function faFonts() {
return gulp.src(`${project.paths.fa}\\fonts\\*`)
.pipe(gulp.dest(project.paths.faFontsOutput));
}
Add these new tasks above to the build process in aurelia_project\tasks\build.js|ts:
export default gulp.series(
readProjectConfiguration,
gulp.parallel(
transpile,
processMarkup,
processCSS,
// custom tasks
faCss,
faFonts
),
writeBundles
);
After doing these steps, au build should embed font-awesome.min.css into scripts/app-bundle.js and copy necessary font files to ./fonts folder.
Last thing to do is to require font-awesome within our html.
<require from ="./font-awesome.min.css"></require>
You don't need much more than this:
in aurelia.json
"dependencies": [
"jquery",
"text",
{
"name": "bootstrap",
"path": "../node_modules/bootstrap/dist/",
"main": "js/bootstrap.min",
"deps": ["jquery"],
"resources": [
"css/bootstrap.min.css"
]
},
{
"name": "font-awesome",
"path": "../node_modules/font-awesome/css",
"main": "",
"resources": [
"font-awesome.min.css"
]
}
]
}
],
"copyFiles": {
"node_modules/font-awesome/fonts/fontawesome-webfont.woff": "fonts/",
"node_modules/font-awesome/fonts/fontawesome-webfont.woff2": "fonts/",
"node_modules/font-awesome/fonts/FontAwesome.otf": "fonts/",
"node_modules/font-awesome/fonts/fontawesome-webfont.ttf": "fonts/",
"node_modules/font-awesome/fonts/fontawesome-webfont.svg": "fonts/"
}
See the section Setup for copying files.
I hope this helps you.
For those who wish to use the sass version of font-awesome
1) Install font-awesome
npm install font-awesome --save
2) Copy font-awesome's fonts to your project root directory
cp -r node_modules/font-awesome/fonts .
3) Include the font-awesome sass directory in the aurelia css processor task
# aurelia_project/tasks/process-css.js
export default function processCSS() {
return gulp.src(project.cssProcessor.source)
.pipe(sourcemaps.init())
.pipe(sass({
includePaths: [
'node_modules/font-awesome/scss'
]
}).on('error', sass.logError))
.pipe(build.bundle());
};
4) Import font-awesome in your app scss
# src/app.scss
@import 'font-awesome';
5) Require your app scss in your html
# src/app.html
<template>
<require from="./app.css"></require>
</template>

Compiling Webpack in memory but resolving to node_modules on disk

I'm trying to use webpack to compile an in-memory string of valid javascript code. I'm using memory-fs as outlined here: https://webpack.github.io/docs/node.js-api.html#compile-to-memory.
So I'm taking a string containing raw javascript, writing that to memory-fs, and then webpack resolves to that entry point. But the compilation fails on the first require statement, presumably because it's not able to look in the real fs for node_modules.
Any ideas on how can I accomplish this?
import webpack from 'webpack';
import MemoryFS from 'memory-fs';
import thenify from 'thenify';
function* compile(code) {
const fs = new MemoryFS();
fs.writeFileSync('/file.js', code);
const compiler = webpack({
entry: { file: '/file.js' },
output: {
path: '/build',
filename: '[name].js'
},
module: {
loaders: [
{ test: /\.json$/, loader: 'json' }
],
}
});
compiler.run = thenify(compiler.run);
compiler.inputFileSystem = fs;
compiler.resolvers.normal.fileSystem = fs; //this is needed for memfs
compiler.outputFileSystem = fs;
const stats = yield compiler.run();
//retrieve the output of the compilation
const res = stats.compilation.assets['file.js'].source();
return res;
}
Usage
var code = "var _ = require('underscore'); console.log(_);";
var bundle = yield compile(code); //should be a bundle containing the underscore source.
The error is
ModuleNotFoundError: Module not found: Error: Cannot resolve module
underscore in /
This question indicates that others have tried the same thing: https://github.com/webpack/webpack/issues/1562. There's a gist referenced at https://gist.github.com/DatenMetzgerX/2a96ebf287b4311f4c18 that I believe was intended to do what I'm hoping to accomplish, but in its current form I don't see how. It assigns an instance of MemoryFS to all of the resolvers. I've tried assigning node's fs module, but no dice.
So in short, I'm trying to set an entry point to an in memory string of raw javascript, but still have require and import statements resolved to node_modules on disk.
UPDATE
I've been able to get the result I'm looking for but it's not pretty. I'm basically overriding the implementation of #stat and #readFile in MemoryFS
to check the real filesystem if it gets any request for a file that doesn't exist in memory. I could clean this up a bit by subclassing MemoryFS instead of swapping method implementations at runtime, but the idea would still be the same.
Working solution
import webpack from 'webpack';
import JsonLoader from 'json-loader';
import MemoryFS from 'memory-fs';
import UglifyJS from "uglify-js";
import thenify from 'thenify';
import path from 'path';
import fs from 'fs';
import root from 'app-root-path';
/*
* Provide webpack with an instance of MemoryFS for
* in-memory compilation. We're currently overriding
* #stat and #readFile. Webpack will ask MemoryFS for the
* entry file, which it will find successfully. However,
* all dependencies are on the real filesystem, so any require
* or import statements will fail. When that happens, our wrapper
* functions will then check fs for the requested file.
*/
const memFs = new MemoryFS();
const statOrig = memFs.stat.bind(memFs);
const readFileOrig = memFs.readFile.bind(memFs);
memFs.stat = function (_path, cb) {
statOrig(_path, function(err, result) {
if (err) {
return fs.stat(_path, cb);
} else {
return cb(err, result);
}
});
};
memFs.readFile = function (path, cb) {
readFileOrig(path, function (err, result) {
if (err) {
return fs.readFile(path, cb);
} else {
return cb(err, result);
}
});
};
export default function* compile(code) {
// Setup webpack
//create a directory structure in MemoryFS that matches
//the real filesystem
const rootDir = root.toString();
//write code snippet to memoryfs
const outputName = `file.js`;
const entry = path.join(rootDir, outputName);
const rootExists = memFs.existsSync(rootDir);
if (!rootExists) {
memFs.mkdirpSync(rootDir);
}
memFs.writeFileSync(entry, code);
//point webpack to memoryfs for the entry file
const compiler = webpack({
entry: entry,
output: {
filename: outputName
},
module: {
loaders: [
{ test: /\.json$/, loader: 'json' }
]
}
});
compiler.run = thenify(compiler.run);
//direct webpack to use memoryfs for file input
compiler.inputFileSystem = memFs;
compiler.resolvers.normal.fileSystem = memFs;
//direct webpack to output to memoryfs rather than to disk
compiler.outputFileSystem = memFs;
const stats = yield compiler.run();
//remove entry from memory. we're done with it
memFs.unlinkSync(entry);
const errors = stats.compilation.errors;
if (errors && errors.length > 0) {
//if there are errors, throw the first one
throw errors[0];
}
//retrieve the output of the compilation
const res = stats.compilation.assets[outputName].source();
return res;
}
Usage
var code = "var _ = require('underscore'); console.log(_);";
var bundle = yield compile(code); //is a valid js bundle containing the underscore source and a log statement logging _.
If there's not a better way, then I'll definitely encapsulate this into a subclass of MemoryFS, but I'm hoping there's a more sane way to accomplish this with Webpack's api.
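For what it's worth, a rough, untested sketch of that MemoryFS subclass might look like this (the same stat/readFile fallback logic as above, just wrapped in a class):
import fs from 'fs';
import MemoryFS from 'memory-fs';

// a MemoryFS that falls back to the real filesystem for anything it does not hold
class FallbackMemoryFS extends MemoryFS {
  stat(path, callback) {
    super.stat(path, (err, result) => (err ? fs.stat(path, callback) : callback(err, result)));
  }
  readFile(path, callback) {
    super.readFile(path, (err, result) => (err ? fs.readFile(path, callback) : callback(err, result)));
  }
}
Usage would be the same as above: assign an instance to compiler.inputFileSystem, compiler.resolvers.normal.fileSystem and compiler.outputFileSystem.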
Instead of memory-fs, the combination of unionfs/memfs/linkfs should help.
https://npmjs.com/unionfs
https://npmjs.com/memfs
https://npmjs.com/linkfs
I have created this snippet (untested). I think you want the input FS to be the real one and the output FS to be the in-memory one. On the other hand, you want all the dependencies of file.js to be built separately. For that I figured the webpack.optimize.CommonsChunkPlugin plugin could help. I expect webpack to write everything to memory. I hope it works.
import webpack from 'webpack';
import MemoryFS from 'memory-fs';
import thenify from 'thenify';
import realFS from 'fs';
function* compile(code) {
const fs = new MemoryFS();
const compiler = webpack({
entry: {
file: '/file.js',
vendor: [
'underscore',
'other-package-name'
]
},
output: {
path: '/build',
filename: '[name].js'
},
module: {
loaders: [
{ test: /\.json$/, loader: 'json' }
],
},
plugins: [
new webpack.optimize.CommonsChunkPlugin('vendor', 'vendor.js')
]
});
compiler.run = thenify(compiler.run);
compiler.inputFileSystem = realFS;
compiler.resolvers.normal.fileSystem = fs; //this is needed for memfs
compiler.outputFileSystem = fs;
const stats = yield compiler.run();
//retrieve the output of the compilation
const res = stats.compilation.assets['file.js'].source();
return res;
}
You're using MemoryFS, which is a JavaScript reimplementation of a feature normally handled by the Operating System. I wonder, could you mount a directory using tmpfs at the Operating System level, then use that? webpack would then not know or care that the input file is actually stored in memory.
Assuming that you have mounted a memory-based filesystem at /media/memory, the webpack configuration code could be as simple as this:
resolve: {
  root: ['/media/memory', ...other paths...],
},
output: {
  path: '/wherever/you/want/the/output/files'
}
This approach also has a hidden benefit: If you want to debug the input code, you just mount /media/memory with a non-RAM-based filesystem and you can see what's being generated.
I know it's late but for the record here comes a code snippet.
import * as fs from 'fs';
import { resolve } from 'path';
import { Volume } from 'memfs';
import { ufs } from 'unionfs';
const volume = Volume.fromJSON({
[resolve(process.cwd(), 'test.js')]: 'this file is on memory not on disk'
});
ufs.use(fs).use(volume);
// Reads from memory
console.log(ufs.readFileSync(resolve(process.cwd(), 'test.js'), 'utf8'));
// Reads from disk
console.log(ufs.readFileSync(resolve(process.cwd(), 'package.json'), 'utf8'));
// Writing into memory
volume.writeFileSync(resolve(process.cwd(), 'test.memory'), 'This should be on memory');
console.log(ufs.readFileSync(resolve(process.cwd(), 'test.memory'), 'utf8'));
// Writing into disk
ufs.writeFileSync(resolve(process.cwd(), 'test.disk'), 'This should be on disk');
console.log(ufs.readFileSync(resolve(process.cwd(), 'test.disk'), 'utf8'));
Here's the console output:
user1@pc playground % node inMem.mjs
this file is on memory not on disk
{
"name": "playground",
"version": "1.0.0",
"main": "index.js",
"license": "MIT",
"dependencies": {
"memfs": "^3.3.0",
"unionfs": "^4.4.0"
}
}
This should be on memory
This should be on disk
user1@pc playground % ls .
inMem.mjs node_modules package.json yarn.lock
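To tie this back to the original question, here is a rough, untested sketch of handing the union filesystem to webpack so that the in-memory entry and the on-disk node_modules are both visible; the webpack options and the output wiring are assumptions of mine, not verified against a specific webpack version:
import * as fs from 'fs';
import { resolve } from 'path';
import webpack from 'webpack';
import { Volume, createFsFromVolume } from 'memfs';
import { ufs } from 'unionfs';

const entry = resolve(process.cwd(), 'file.js');
const volume = Volume.fromJSON({
  [entry]: "var _ = require('underscore'); console.log(_);",
});
ufs.use(fs).use(volume); // the entry comes from memory, node_modules from disk

const compiler = webpack({
  mode: 'development',
  entry,
  output: { path: resolve(process.cwd(), 'build'), filename: 'file.js' },
});
compiler.inputFileSystem = ufs;                         // reads go through the union fs
compiler.outputFileSystem = createFsFromVolume(volume); // keep the bundle in memory
compiler.run((err, stats) => {
  if (err) throw err;
  console.log(stats.toString());
});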
