File Importing From one Dir to another Dir - javascript

so basically my dir looks like this:
/node
~/libs/lib.js
~/projects/main/script.js
and I want to import (the entire file) "lib" [\Node\libs\lib.js] into "script" [\Node\Projects\main\script.js].
How do I do that?
thanks in advance,
-Gzrespect

You can't import an entire file from one directory into another, but you can export any data type, primitive (numbers, strings, etc.) or non-primitive (objects and arrays), and import those:
const sum = (a, b) => a + b;
const object = {};
const array = [];
module.exports = { sum, object, array };
Later, in the file where you want to use those functions, objects and arrays:
const { sum, object, array } = require("file path");

Since your script.js is in the "main" folder inside the "Projects" folder, you have to go two levels up. Add this code at the top of your script.js, and please make sure you have exported in lib.js: module.exports = connection;
const connection = require("../../libs/lib");

Related

Is there a way to dynamically import arrays inside separate files in a directory in Reactjs?

What I have tried:
I tried importing every file inside the directory using require.context. It did the job and read the files, but I am stuck at accessing what's inside them.
function importAll(r) {
  let imports = {};
  r.keys().map((item, index) => { imports[item.replace('./', '')] = r(item); });
  return imports;
}
const imports = importAll(require.context('../questionsdata', false, /\.(js)$/));
If I get it correctly, that is the equivalent of importing every single JS file in the directory.
Each file contains 1 array that is exported and needs to be accessed.
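One way to get at those arrays, sketched here with a hand-built imports object standing in for what importAll returns (the file names and the getArray helper are made up; whether an entry is the array itself or sits on .default depends on how each file exported it):

```javascript
// Stand-in for the importAll() result: each value is a loaded module.
const imports = {
  'easy.js': { default: ['q1', 'q2'] }, // file used `export default [...]`
  'hard.js': ['q3'],                    // file used `module.exports = [...]`
};

// Hypothetical helper: unwrap the array regardless of export style.
function getArray(mod) {
  return Array.isArray(mod) ? mod : mod.default;
}

const allQuestions = Object.values(imports).flatMap(getArray);
console.log(allQuestions); // → ['q1', 'q2', 'q3']
```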

How to include array from a .js file into another .js file?

I'm working on a little 1 file function in Node.js.
This function uses a (normal vanilla JS) array of data that is becoming very large, and I'd like to move it to its own file to keep things tidy.
I've created a file with just the array.
my-array.js
const myArr = [/* stuff */];
And have tried many ways of including it in my main.js file, eg:
const myArr = require('./my-array.js')
Or trying the ES6 way:
import {myArr} from "./my-array.mjs";
(adding export to my-array and changing file type to .mjs)
However, nothing seems to work and I cannot seem to find any clear information about how to achieve this.
Could anyone point me in the right direction?
For the first one:
module.exports = []; // your array
You could either use the ES6 module import syntax
// my-array.mjs
const myArr = []
export { myArr }
// main.mjs
import { myArr } from './my-array.mjs'
or use good old require
// my-array.js
const myArr = []
module.exports = { myArr }
// main.js
const { myArr } = require('./my-array.js')
In any case, make sure to export your array. For ESM, you need to use .mjs as the file extension (or set "type": "module" in your package.json).

Importing Proto File to another ProtoFile in NodeJS

I am quite new to Protocol Buffers.
I noticed that the grpc proto-loader module requires just a single proto definition file to load, so I have loaded it in my code as below:
const PROTO_PATH = `${path.resolve(__dirname, '..')}${path.sep}protos${path.sep}index.proto`;
const packageDefinition = protoLoader.loadSync(PROTO_PATH, {
  keepCase: true,
  longs: String,
  enums: String,
  defaults: true,
  oneofs: true
});
let indexProto = grpc.loadPackageDefinition(packageDefinition).index;
Now my index.proto file is referencing another proto file
as below:
syntax = "proto3";
package index;
import public "location_updater.proto";
And my location_updater.proto is defined as below
syntax = "proto3";
package location_updater;
service LocationUpdater {
  rpc updateLocation(Location) returns (LocationUpdateResponse) {}
}
message Location {
  string apiKey = 1;
  string updateTarget = 2;
  double longitude = 3;
  double latitude = 4;
}
message LocationUpdateResponse {
  int32 statusCode = 1;
}
When I do the following:
let grpcServer = new grpc.Server();
grpcServer.addService(indexProto.location_updater.LocationUpdater.service, {
});
I am getting an error TypeError: Cannot read property 'LocationUpdater' of undefined
If I move the content of location_updater.proto into the index.proto file it works, but I don't want that, as I will be working with many different proto files for different business logic.
What am I doing wrong, and what is the best way to go about this?
Thanks in anticipation for your input.
You need to use the includeDirs option to include directories in the search paths, so that the proto loader library knows how to find the imported files. Those directories should be the ones that the import paths are relative to.
In this situation, assuming that location_updater.proto is in the same directory as index.proto, the includeDirs option should be an array containing the single path path.resolve(__dirname, '..', 'protos'). Those directories are also used to search for the main file, so you can pass a proto path that is just index.proto.

Add functions in other folder, to an object in this folder

I want to create an object that would import functions from another folder and it would look something like this:
class = {
  functions: {
    // All functions here
  }
}
The functions would be inside of a different folder, however, I want to make some sort of importer in which it would make new classes for each new function/file it finds inside of the folder.
someFunction.js Function File:
function someFunction() {
  console.log("this is some function");
}
So I would like for something to look like this:
class.functions.someFunction()
No, I do not want to have it hard coded into the object, I want to import all functions from a folder and create functions like that.
Well, first I want to answer your question as I think you intend it, even if I also think it is not the correct way to proceed.
I'll also assume that by class you are not referring to an actual ES6 class, but to a plain object.
So this is the code:
const fs = require('fs');
const path = require('path');
function importer(dirPath) {
  const absoluteDirPath = path.normalize(
    path.isAbsolute(dirPath)
      ? dirPath
      : path.resolve(process.cwd(), dirPath)
  );
  const output = {
    functions: {}
  };
  const content = fs.readdirSync(absoluteDirPath);
  content.forEach((basename) => {
    const absoluteItemPath = path.join(absoluteDirPath, basename);
    if (fs.statSync(absoluteItemPath).isFile() && /\.js$/i.test(basename)) {
      // Strip the ".js" suffix to get the key; require accepts absolute paths.
      output.functions[basename.slice(0, -3)] = require(absoluteItemPath);
    }
  });
  return output;
}
module.exports = importer;
For this to work, all your functions in your files should be exported like:
module.exports = function myFunction() {};
To use the 'importer', you just do:
const artemis = importer('/path/to/directory'); // PATH MUST BE ABSOLUTE OR RELATIVE TO CWD.
/*
SUPPOSING THAT YOUR DIRECTORY CONTAINS THE FOLLOWING FILES:
function1.js
function2.js
Then you can do:
artemis.functions.function1();
artemis.functions.function2();
Please note that your files must be named in a JS-friendly way (a valid string for an object key).
*/
A final important note about this odd method: it will only ever work in a Node.js environment, even though the functions themselves could work in other environments (like a browser). The next method will work in any ECMAScript environment after a proper build process: transpilation (e.g. Babel) and bundling (e.g. Webpack).
Suggested Solution
Use ES6 Static import / export like modern JS libraries do. This comes with huge benefits, from static code analysis to tree shaking and more.
Let's suppose the following hierarchy:
// - index.js
// - internals/
// - index.js
// - module-1.js
// - module-2.js
internals/module-1.js
function module1() {}
export {module1};
internals/module-2.js
import {module1} from './module-1.js';
function module2() {
  // YOU CAN USE module1 IF YOU NEED. (AVOID CIRCULAR REFERENCES)
  module1();
}
export {module2};
internals/index.js
import {module1} from './module-1.js';
import {module2} from './module-2.js';
export {module1, module2};
index.js
import * as moduleGroup from './internals/index.js';
export {moduleGroup};
Finally, where you import your moduleGroup, you can do:
moduleGroup.module1();
moduleGroup.module2();
Obviously this is a basic scenario, but this is, IMHO, the correct way to deliver a group of functions and other stuff. Please let me know if you have any doubt.

What is causing my $.getJSON to fail? (the JSON should be valid) [duplicate]

I would like to include a couple of JSON files in my JavaScript code that are in the same directory as my JavaScript source file.
If I wanted to include another JavaScript file I could simply use require.
Now I'm using readFileSync and __dirname to get the JSON, which I think is an ugly way to do it.
Is there something similar for require that enables me to load a JSON file?
As of Node v0.5.x, yes, you can require your JSON just as you would require a JS file.
var someObject = require('./somefile.json')
In ES6:
import someObject from './somefile.json'
JSON files don't need an explicit exports statement; you don't have to export anything the way you would in a JavaScript file.
So you can just use require for any valid JSON document.
data.json
{
"name": "Freddie Mercury"
}
main.js
var obj = require('./data.json');
console.log(obj.name);
//Freddie Mercury
Two of the most common ways:
First way :
let jsonData = require('./JsonFile.json')
let jsonData = require('./JsonFile') // omitting .json also works
OR
import jsonData from './JsonFile.json'
Second way :
1) synchronously
const fs = require('fs')
let jsonData = JSON.parse(fs.readFileSync('JsonFile.json', 'utf-8'))
2) asynchronously
const fs = require('fs')
let jsonData = {}
fs.readFile('JsonFile.json', 'utf-8', (err, data) => {
  if (err) throw err
  jsonData = JSON.parse(data)
})
Note:
1) If JsonFile.json changes, we will not get the new data, even if we re-run require('./JsonFile.json'), because require caches the module.
2) fs.readFile and fs.readFileSync will always re-read the file and pick up changes.
No. Either use readFile or readFileSync (the latter only at startup time).
Or use an existing library like
cjson
Alternatively write your config in a js file rather then a json file like
module.exports = {
// json
}
A nifty non-caching async one-liner for Node 15 modules:
import { readFile } from 'fs/promises';
const data = await readFile('{{ path }}').then(json => JSON.parse(json)).catch(() => null);
You can even use require for your JSON without specifying the .json extension.
That lets you change the file extension to .js without any changes in your imports.
assuming we have ./myJsonFile.json in the same directory.
const data = require('./myJsonFile')
If in the future you'll change ./myJsonFile.json to ./myJsonFile.js nothing should be changed in the import.
You can import JSON files by using the Node.js v14 experimental JSON modules flag.
file.js
import data from './folder/file.json' assert { type: 'json' }
export default {
  foo () {
    console.log(data)
  }
}
And you call it with node --experimental-json-modules file.js
You can use a module to create a require.
import { createRequire } from 'module'
const require = createRequire(import.meta.url)
const foo = require('./foo.json')
If you are using TypeScript, you can add "resolveJsonModule": true under compilerOptions in your tsconfig.json, and then import all the information of any .json file just like this:
import * as jsonfile from "./path/to/json"
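For reference, a minimal tsconfig.json sketch with that option in place (all other settings omitted; resolveJsonModule belongs under compilerOptions):

```json
{
  "compilerOptions": {
    "resolveJsonModule": true
  }
}
```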
