How to compress a folder in Node.js on Mac without .DS_Store - javascript

Using the folder-zip-sync npm library (and other zipping libraries), the .zip file gets saved with an extra .DS_Store file. How can I zip without this file? Is there a setting I can turn off? How should I go about this?
var zipFolder = require("folder-zip-sync");
zipFolder(inputPath, pathToZip);

You don't need any library for this; you can use Node.js's built-in zlib module. Note, though, that zlib compresses a single stream (gzip/deflate); it does not build a .zip archive of a folder.
const { createReadStream, createWriteStream } = require('fs');
const { createGzip } = require('zlib');
const inputFile = "./input.txt";
const outputFile = "./input.txt.gz";
const srcStream = createReadStream(inputFile);
const gzipStream = createGzip();
const destStream = createWriteStream(outputFile);
srcStream.pipe(gzipStream).pipe(destStream);


How To Write and Read JSON texts on single file

I'm receiving events in JSON format via a POST route, I would like to save these events in a file like 'example.json' and be able to query it.
I tried using writeFileSync, but it rewrites the entire file. With the flag {flag: 'a+'} I was able to save more than one record, but when I try to require 'example.json', I get an error 'Unexpected token { in JSON'.
Works fine when the file has only one record, but gives the error after the second one.
Code:
const filePath = './example.json';
const fs = require('fs');
const file = require('./example.json');
app.post('/events', (request, response) => {
response.send(request.body);
const contentString = JSON.stringify(request.body);
return fs.writeFileSync(filePath, contentString, {flag: 'a+'});
});
example.json that works:
{"type":"call.new","call_id":"71252742562.40019","code":"h9e8j7c0tl0j5eexi07sy6znfd1ponj4","direction":"inbound","our_number":"1130900336","their_number":"11999990000","their_number_type":"mobile","timestamp":"2020-04-01T00:00:00Z"}
example.json (with two records) that stop working:
{"type":"call.new","call_id":"71252742562.40019","code":"h9e8j7c0tl0j5eexi07sy6znfd1ponj4","direction":"inbound","our_number":"1130900336","their_number":"11999990000","their_number_type":"mobile","timestamp":"2020-04-01T00:00:00Z"}{"type":"call.ongoing","call_id":"71252731962.40019","code":"h9e8j7c0tl0j5eexi07sy6znfd1ponj4","direction":"inbound","our_number":"1130900336","their_number":"11999990000","their_number_type":"mobile","timestamp":"2020-04-01T00:00:00Z"}
How can I write this JSON in a readable form? That does not present the error above and it is possible to perform the require.
Could someone help me, please?
Try to read the JSON file, parse it, add new elements to the array and then overwrite the file.
const fs = require("fs");
const path = require("path");
const FILE_PATH = path.join(__dirname, "./elements.json");
// elements.json must already exist and contain a JSON array, e.g. []
const file = fs.readFileSync(FILE_PATH);
const elements = JSON.parse(file);
const newElement = { id: Date.now() };
const updatedElements = [...elements, newElement];
fs.writeFileSync(FILE_PATH, JSON.stringify(updatedElements));
See more here: https://nodejs.org/api/fs.html#fsappendfilesyncpath-data-options

Node fs.writeFile with absolute path

I have a node application that emits HTML files. This is the gist of how it works:
const fs = require('fs');
const outputPath = './dist/html/';
// code that generates names and content
const currentFile = `${outputPath}${name}.html`;
const content = '...';
fs.promises.writeFile(currentFile, content, 'utf8');
This works as intended, but it is generally bad practice to build a relative path this way (it works on a Mac but would probably not work on a Windows machine).
const fs = require('fs');
const path = require('path');
const outputPath = path.join(__dirname, 'dist', 'html');
// code that generates names and content
const currentFile = path.join(outputPath, `${name}.html`);
const content = '...';
fs.promises.writeFile(currentFile, content, 'utf8');
This works, but it creates an entire path (User/my.name/Documents/projects/my-project/dist/html/my-file.html) within my project, since fs.writeFile writes the file relative to the working directory.
Can I make fs write file to the absolute path? Alternatively, what is the proper way of generating relative paths?
I ended up using
const outputPath = `.${path.sep}dist${path.sep}ads${path.sep}`; // path.sep, not path.delimiter, separates path segments
But this does not seem like the best possible solution.
According to the docs, the fs module works with both relative and absolute paths.
I guess your issue was somehow related to path building.
Here is the working code:
const { promises: fsp } = require('fs');
const { join } = require('path');

const fileName = 'file.html';
const content = '...';

(async () => {
  try {
    await fsp.writeFile(join(process.cwd(), 'dist', 'html', fileName), content);
  } catch (error) {
    // handling
  }
})();

Import all files from a folder in the VS Code Webview API

I am importing json files as following:
import input1 = require("../test/test1.json");
import input2 = require("../test/test2.json");
import input3 = require("../test/test3.json");
import input4 = require("../test/test4.json");
import input5 = require("../test/test5.json");
My tsconfig settings is:
"module": "commonjs",
"target": "es6",
But I need to import the whole "test" folder with a lot of json files. How can I import all the files and assign each file to a "input" variable?
Update:
I have tried the following code that was suggested by #Michael, but it gives the following error.
const fs = require('fs');

let testDataPath = "../test"
let filenames = fs.readdirSync(testDataPath)
filenames = filenames.filter(it => it.endsWith(".json"))

let runvalue = [];
for (let filename of filenames) {
  let file = JSON.parse(fs.readFileSync(testDataPath + "/" + filename, "utf-8"))
  let json = Object.values(file["covered-points"]);
  runvalue = [...runvalue, new Set(json)]
}
But its giving error: "Uncaught TypeError: fs.readdirSync is not a function"
I can't figure out what's wrong with "fs" in Visual Studio Code. Could someone please help? Thank you for your time.
Assuming that you are in NodeJS rather than in a web browser, the best way would be to use the fs core module to read the directory contents, and then the file contents. Eg:
const fs = require('fs');

let testDataPath = "./test"
let filenames = fs.readdirSync(testDataPath)
filenames = filenames.filter(it => it.endsWith(".json"))

for (let filename of filenames) {
  let json = JSON.parse(fs.readFileSync(testDataPath + "/" + filename, "utf-8"))
  // do something with the json
}
Note:
In general you should use the asynchronous versions of these functions, but I'll leave that as an exercise for you to do
The path given needs to be relative to the current working directory of the program, rather than the file the function is called in
ref: https://nodejs.org/api/fs.html

Read java file using nodejs

I have some .java files inside a directory; I want to read those files and get some values from each of them. I'm not sure how to proceed. How can I do this using the fs module and some other npm modules in Node.js?
Below is my current code
const path = require('path');
var fs = require('fs');

module.exports = {
  readTS: function () {
    var CWD = path.join(__dirname, '../');
    var folder = path.basename(CWD).toLowerCase();
    var TSJavaPath = path.join(__dirname, '../src/main/java/com/' + folder + '/');
    var files = fs.readdirSync(TSJavaPath).filter(fn => fn.startsWith('TS'));
    console.log(files);
    for (let i = 0; i < files.length; i++) {
      // Read and get data
    }
  }
};
You could compile your .java files and read them using leonardosnt/java-class-tools.
You have HelloWorld.java -
public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello, World");
    }
}
Then compile -
javac HelloWorld.java
Then write a quick index.js -
const { JavaClassFileReader } = require('java-class-tools');

const reader = new JavaClassFileReader();
const classFile = reader.read('./HelloWorld.class');

classFile.methods.forEach(md => {
  /**
   * Method name in constant-pool.
   *
   * Points to a CONSTANT_Utf8_info structure: https://docs.oracle.com/javase/specs/jvms/se8/html/jvms-4.html#jvms-4.4.7
   */
  const nameInConstantPool = classFile.constant_pool[md.name_index];

  // To string (hacky)
  const name = String.fromCharCode.apply(null, nameInConstantPool.bytes);
  console.log(name);
});
Which outputs the following -
node index.js
<init>
main

Use node.js stream to create a file and pipe to gulp

I am using node.js. I have a Buffer and I need to write it to a file named 'bla.js' and then pipe it to gulp-uglify. Something like:
var fs = require('fs');
var uglify = require('gulp-uglify');
var buffer = ....
var wstream = fs.createWriteStream('bla.js');
wstream.write(buffer);
wstream.pipe(uglify());
What is the correct way to do it?
From what I can tell you should be able to call pipe() on the readable stream more than once and have the contents sent to two different writable streams.
Eg (dry-coded):
var fs = require('fs')
, uglify = require('gulp-uglify');
var rstream = fs.createReadStream('test.log');
rstream.pipe(fs.createWriteStream('bla.js'));
rstream.pipe(uglify());
Looks like the plugin vinyl-source-stream provides a solution for what you want to do:
var source = require('vinyl-source-stream');
var streamify = require('gulp-streamify');
var browserify = require('browserify');
var fs = require('vinyl-fs');
var uglify = require('gulp-uglify');

var bundleStream = browserify('index.js').bundle();

bundleStream
  .pipe(source('bla.js'))
  .pipe(streamify(uglify()))
  .pipe(fs.dest('./')); // dest() takes a directory; the file name comes from source()
