importing multiple json files into a js array - javascript

I'm trying to import multiple JSON files into a JS array:
// module.js
module.exports = {
  english: require("./englishfile"),
  chinese: require("./chineseFile"),
  french: require("./frenchFile"),
  spanish: require("./espFile"),
};
//js file
let allData=require("./module.js");
This puts all the files into a single array entry. I'm trying to have them as separate array entries, one for each file in module.js. I will also have a much larger number of files in module.js, so I won't know its size and won't be able to hard-code them.

You can do something like this (note that require.context is a webpack feature):
const requireModule = require.context('.', false, /\.json$/);
const modules = {};
requireModule.keys().forEach(fileName => {
  const moduleName = fileName.replace(/(\.\/|\.json)/g, '');
  modules[moduleName] = requireModule(fileName);
  // OR, if you want each module namespaced:
  // modules[moduleName] = {
  //   namespaced: true,
  //   ...requireModule(fileName)
  // };
});
export default modules;
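Since the question specifically asks for separate array entries rather than a keyed object, note that Object.values can convert the result into an array with one entry per file, without hard-coding the count. A minimal sketch (allData here is a hand-written stand-in for the require()'d object):

```javascript
// Stand-in for require("./module.js"): an object keyed by language.
const allData = {
  english: { greeting: "hello" },
  chinese: { greeting: "ni hao" },
  french: { greeting: "bonjour" },
};

// One array entry per file; asArray.length tells you how many files there were.
const asArray = Object.values(allData);
```

Object.values preserves the insertion order of string keys, so the entries come out in the order the files were listed.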

Related

vite & react import images dynamically from public url

I would like to import images dynamically from the public folder, but this isn't working. Does anyone know why?
const modules = import.meta.glob("/Images/*.jpg");
const imagePaths = [];
for (const path in modules) {
  modules[path]().then(() => {
    const p = new URL(path, import.meta.url);
    const data = {
      path: p.pathname,
    };
    imagePaths.push(data);
  });
}
First you need to adjust the path that you pass to the import.meta.glob function. Specify a path relative to the file this code is in, since "/Images/*.jpg" is resolved from the project root (because it starts with a slash).
Next, you can use the { eager: true } option to resolve the modules at build time (you import just the URL, so there is no need for code splitting).
const imagePaths = [];
// note relative path vvv            vvv this gets rid of promises
Object.values(import.meta.glob("./assets/*.jpg", { eager: true })).forEach(
  ({ default: path }) => {
    const url = new URL(path, import.meta.url);
    const data = {
      path: url.pathname,
    };
    imagePaths.push(data);
  }
);
/* imagePaths will have content like this:
[ {"path":"/src/assets/logo.jpg"}, {"path":"/src/assets/photo.jpg"} ]
*/
You can also take a look at the documentation here: https://vitejs.dev/guide/features.html#glob-import

Webpack Loader/Plugin - Replace a variable in a string to output a new string

I am writing a custom webpack loader to remove unnecessary code that Terser can't pick up.
Here's the sample source output from the webpack loader:
const SvgsMap = {
  map1: () => {
    return '1';
  },
  map2: () => {
    return '2';
  },
  map3: () => {
    return '3';
  },
  // ...redacted
  map100: () => {
    return '100';
  },
}
Note that the above comes into the loader as a string. And I have a whitelist (a string[]) of which of them should be included in the build output:
const whitelistsArr = ["map1"]
I am currently writing a webpack loader to pre-process this before it gets bundled. It currently uses the Node VM module, which I assumed could parse the string into a JavaScript object, after which I could remove the unused properties from SvgsMap and output it back again as a string.
My questions are:
Am I doing it the right way with a loader to remove them? Or is this actually a webpack plugin's job? Any other alternatives?
I am hitting a wall doing this with VM. It seems unable to mutate the existing code and output it back as a string. Am I wrong here?
Any suggestion is appreciated.
Here's my loader's code so far:
const path = require( 'path' );
const { loader } = require( 'webpack' );
const vm = require( 'vm' );

const whitelists = ['frame21Web'];

const loaderFn = function ( source ) {
  /** @type {loader.LoaderContext} */
  // eslint-disable-next-line babel/no-invalid-this
  const self = this;
  const filename = path.basename( self.resourcePath );
  const templateWithoutLoaders = filename.replace( /^.+!/, '' ).replace( /\?.+$/, '' );
  const vmContext = vm.createContext( { module: {} } );
  let newSource = '';
  try {
    const vmScript = new vm.Script( source, { filename: templateWithoutLoaders } );
    const cachedData = vmScript.createCachedData();
    console.log(cachedData.toString()); // Doesn't seem to output as a string.
  }
  catch (err) {
    console.error(err);
  }
  console.log( 'loader', filename, source );
  process.exit( 0 );
  return source;
};

module.exports = loaderFn;
There may be a couple of answers to this question; it's difficult to know without knowing the reasoning behind the removal.
If you have control of the file, you could use a combination of webpack's DefinePlugin and some if/else logic. For example:
// in your WP config file
new webpack.DefinePlugin({
  'process.env.IS_CLIENT': JSON.stringify(true), // will only be true when compiled via WP
});

// in your module
if (process.env.IS_CLIENT) {
  SvgsMap.map1 = () => '1';
}
The above pattern allows for adding/removing chunks of code for your Client bundle, while also allowing for use on the Server.
The other option would be to write a custom Babel plugin (not a WP plugin). I found this article helpful in the past, when I had to write some plugins. Babel gives you more control of how the parts of a JS file are processed, and you can use that plugin outside of WP (like while running Unit tests).
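If the entries really are as uniform as the sample above, a plain string transform inside the loader could also work, without the VM at all. This is a hedged sketch, not the asker's actual code: the regex assumes the exact `mapN: () => { ... },` shape shown in the question and nothing fancier.

```javascript
const whitelistsArr = ["map1"];

// Stand-in for the source string the loader receives.
const source = [
  "const SvgsMap = {",
  "  map1: () => {",
  "    return '1';",
  "  },",
  "  map2: () => {",
  "    return '2';",
  "  },",
  "}",
].join("\n");

// Drop every mapN entry whose name is not whitelisted; keep the rest verbatim.
const filtered = source.replace(
  /^\s*(map\d+):\s*\(\)\s*=>\s*\{[^}]*\},?\n/gm,
  (match, name) => (whitelistsArr.includes(name) ? match : "")
);
// filtered keeps map1 and no longer contains map2
```

A real AST-based transform (e.g. a Babel plugin, as suggested above) is more robust if the map entries can contain nested braces.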

Can I use a webpack hook to modify file output just before it gets saved?

I want to manipulate a file after it has been processed by webpack and babel. There's an emit hook that is triggered just before a new file is saved, but I couldn't see a way to manipulate the file contents. So I settled for using the afterEmit hook to read in the just-written file, modify it, and write it back out:
plugins: [
  new class OutputMonitor {
    apply(compiler) {
      compiler.hooks.afterEmit.tap('OutputMonitor', compilation => {
        if (compilation.emittedAssets.has('index.js')) {
          let contents = fs.readFileSync('./dist/web/index.js', 'utf-8');
          // Strip out dynamic import() so it doesn't generate warnings.
          contents = contents.replace(/import(?=\("tseuqer-yb")/, 'console.log');
          // Strip out large and large-alt timezone definitions from this build.
          contents = contents.replace(large, 'null');
          contents = contents.replace(largeAlt, 'null');
          fs.writeFileSync('./dist/web/index.js', contents);
        }
      });
    }
  }()
],
This gets the job done, but is there a better way?
From what I can tell, you're basically replacing some strings with other strings.
I believe you can use the processAssets hook if you're running webpack 5.
Here's an example you can adapt to your case:
const { Compilation, sources } = require('webpack');

class Replace {
  apply(compiler) {
    compiler.hooks.thisCompilation.tap('Replace', (compilation) => {
      compilation.hooks.processAssets.tap(
        {
          name: 'Replace',
          stage: Compilation.PROCESS_ASSETS_STAGE_OPTIMIZE,
        },
        () => {
          // get the file main.js
          const file = compilation.getAsset('main.js');
          // update main.js with new content
          compilation.updateAsset(
            'main.js',
            new sources.RawSource(file.source.source().replace('a', 'b'))
          );
        }
      );
    });
  }
}

module.exports = {
  entry: './wp.js',
  plugins: [new Replace()],
};

Nodejs - No webpack bundle

I want to "bundle" some files into a single js file, but I don't want the webpack wrapper.
Let's say I have 3 files:
// file1.js
export default { hello: "hello" };
// file2.js
export default { world: "world" };
// index.js
import file1 from "./file1";
import file2 from "./file2";
(() => ({ ...file1, ...file2 }))()
I want the following result :
// build/output.js (+ babel...)
(function(){
  return Object.assign({}, { hello: "hello" }, { world: "world" });
})()
Not a single line of code apart from the above build output.
Is it possible with webpack? Thanks!
O.P. Solution
Ok, so I found a solution!
It's maybe not the best one, but it works.
I found a library, bundle-js, which concatenates files together.
Building
I can build using: npm run build.
And the code that concatenates the files is:
// ======================================================
// Tools / Bundle
// ======================================================

// Libs
var path = require("path");
var bundle = require("bundle-js");

module.exports.exec = function() {
  // Disable logging (hack for 'bundle-js' library).
  var _log = console.log;
  console.log = function() {};

  // Concatenate each file (required by the application).
  var file = path.resolve(__dirname, "../src/index.js");
  var bundledCode = bundle({
    entry: file,
    print: false,
    disablebeautify: true
  });

  // Enable logging.
  console.log = _log;

  // Return bundled code.
  return bundledCode;
};
For some reason, bundle-js always outputs something, even with the option { print: false }, so I added a small hack to fix this.
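The console.log override used above can be wrapped into a small reusable helper. This is a sketch; withSilencedLogs is a name invented here, not part of bundle-js. The try/finally guarantees the original console.log is restored even if the wrapped function throws.

```javascript
// Run fn with console.log silenced, restoring it afterwards no matter what.
function withSilencedLogs(fn) {
  const original = console.log;
  console.log = function () {}; // swallow any output produced by fn
  try {
    return fn();
  } finally {
    console.log = original; // always restore, even if fn throws
  }
}

const result = withSilencedLogs(() => {
  console.log("this line is swallowed");
  return 42;
});
// result is 42, and console.log works normally again afterwards
```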

NodeJS Variable Scope

I'm very, very new to the whole NodeJS stack, and I'm trying to rough out a simple login system for practice.
Jumping to my question,
app.js
...
var mongoose = require( 'mongoose' );
var templates = require( './data/inc.js' ); // includes schema structures
...
user.js - included in inc.js
...
module.exports =
{
  "Schema" : new exports.mongoose.Schema({
    "uid": mongoose.Schema.Types.ObjectId,
    "username": { type:String, unique:true },
    "alias": String,
    "credentials":
    {
      "salt": String,
      "password": String,
      "key": String
    },
    "profile":
    {
      "age": { type: Number, min: 18 }
    },
    "last_login": Date,
    "updated": { type: Date, default: Date.now }
  })
}
...
The 'user.js' script above will not work because it doesn't have access to the mongoose object instantiated in the 'app.js' script. In PHP, any included/required script can access variables from the parent script, but in NodeJS, as I understand it, I have to re-require the mongoose module in order to create my schema tree.
user.js
...
var mongoose = require( 'mongoose' ); // must include in script to use mongoose object
module.exports
{
...
}
...
Is there any work-around that will allow me the same scope access as PHP?
The answer is that there are workarounds, but you really don't want to use them, ever, except for things which you deliberately want to hack into the global scope of all running modules in your application, up to and including all dependencies (mongoose) and all of ITS dependencies.
override.js
global.thisIsNowAvailable = true;
flaky-file.js
if (thisIsNowAvailable) { /* ... */ }
index.js
require("./override");
require("./flaky-file");
The same will work for overriding methods on global prototypes, et cetera.
Unless your library is super-awesome and is intended to intercept, parse and interpret code at require-time
require("babel/register"); // all loaded modules can now be written in ES6
doing this for other reasons leads to horrible code-bases...
broken-index.js
require("flaky-file");
require("override");
// you might have just attempted to reference a variable that doesn't exist,
// thrown an error and crashed your entire server
// (not just a single connection, like PHP... ...the entire server went down,
// for everyone, and it has to be restarted).
Think of modules as separate function scopes.
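That separation can be pictured like this (a rough analogy using an IIFE, not real module loading):

```javascript
// Each module body behaves like its own function scope: top-level variables
// stay private unless they are explicitly exported.
const moduleA = (function () {
  var secret = "only visible inside"; // like a top-level var in a module file
  return { getSecret: function () { return secret; } }; // the "exports"
})();
// `secret` is not reachable out here; only the exported getter can see it
```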
It's really simple to do something like:
needs-mongoose.js
function doSomeInitWithMongoose (db) { /* ... */ }
function doSomeRuntimeWithMongoose (db, params) { /* ... */ }

module.exports = mongoose => {
  doSomeInitWithMongoose(mongoose);
  return {
    run: params => {
      /* ... app is run here ... */
      doSomeRuntimeWithMongoose(mongoose, params);
    }
  };
};
configures-mongoose.js
var mongoose = require("mongoose");

function configure (db, cfg) { /* ... */ return db; }

module.exports = config => {
  var configuredDB = configure(mongoose, config);
  return configuredDB;
};
main.js
// to support arrow functions and other awesome ES6, including ES6 modules
require("babel/register");
var config = require("./mongoose-config");
var db = require("./configures-mongoose")(config);
var app = require("./needs-mongoose")(db);
app.run({ /* ... */ });
EDIT
Updated the last few files to form a structurally correct pseudo-program (which does absolutely nothing, of course).
Of course, if index.js or server.js were to require("babel/register"); and then load main.js (without the Babel include in it), all of the require statements south of Babel could be written as ES6 modules, without issue.
server.js
require("babel/register");
require("./es6-main");
es6-main.js
import config from "./mongoose-config";
import configureDB from "./configures-mongoose";
import loadApp from "./needs-mongoose";
const db = configureDB(config);
const app = loadApp(db);
app.run({ /* ... */ });
Note that now I'm naming the functions I was originally returning, because in JS when you return a function, you can immediately call it...
getFunc( config )( data );
...but you can't act immediately on import statements.
Rule of thumb is that if you're going to export an object to the outside world, it should have 0 external dependencies, or all external dependencies will be set up later, by setters of some kind:
var utils = require("./utils"); // doesn't need any information
utils.helperFunc(data);
or
var catsAndPorn = true;
var internets = [];
var SeriesOfTubes = require("series-of-tubes");
var internet = new SeriesOfTubes( catsAndPorn );
internets.push( internet );
or
var bigOlFramework = require("big-ol-framework");
bigOlFramework.setDBPool( myDBCluster );
http.createServer( bigOlFramework.connectionHandler ).listen( 8080 );
None require outside information for their actual init (though may require their own internal dependencies).
If you want to return something which does rely on external init, either export a factory/constructor, or export a function, which accepts your config/data, and then returns what you want, after an init sequence.
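A minimal sketch of that factory approach (createStore and the db argument are hypothetical names, not from any library): the external dependency is passed in, and a ready-to-use object comes out.

```javascript
// Factory: accepts the dependency, runs init, returns the finished object.
function createStore(db) {
  // any init with the injected dependency would happen here
  return {
    save: function (key, value) { db.set(key, value); return value; },
    load: function (key) { return db.get(key); },
  };
}

// A plain Map stands in for a real database connection.
const store = createStore(new Map());
store.save("a", 1);
// store.load("a") now returns the saved value
```

The point is that createStore itself needs no outside information at require time; everything external arrives through its argument.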
EDIT 2
The last piece of advice here concerns mongoose usage, and to a similar extent Gulp, or routers: when you want a single file to register its contents with a registry, or to require a core component in order to return something, the pattern in Node which makes the most sense is to export a function which then does the init:
var Router = require("router");
var router = new Router( );
require("./routes/login")(router);
require("./routes/users")(router);
require("./routes/articles")(router);
Where "./routes/articles.js" might look like
import ArticlesController from "./../controller/articles"; // or wherever
var articles = new ArticlesController();

module.exports = router => {
  router.get("/articles", ( ) => articles.getAll( ));
  router.post("/articles", ( ) => articles.create( ));
};
So if you were looking to structure ORM based on schema, you might do similar:
var mongoose = require("mongoose");
var Users = require("./schema/users")(mongoose);
where "./schema/users" looks like:
module.exports = mongoose => {
  return new mongoose.Schema({ /* ... */ });
};
Hope that helps.
Why don't you just do this?
var mongoose = require( 'mongoose' );
...
"Schema" : new mongoose.Schema({
Instead of:
exports.mongoose.Schema // I'm not sure where you got `exports.mongoose` from.
Also, you don't have to use the .js extension when requiring:
var templates = require( './data/inc' );
Edit
I believe you can't do it like in PHP. Also, requires are cached, so there's no need to worry about re-requiring.
