node.js fs.writeFileSync(): how to set the encoding to Big5? - javascript

fs.writeFileSync defaults to UTF-8 encoding.
I can't set the encoding to Big5.
The documentation does not mention support for that encoding.
If this function does not support Big5, what can I do?
var fs = require('fs');
var FilePath='./text.txt';
var Str='this is a test!';
var encode='utf8';
fs.writeFileSync(FilePath, Str, encode);
When I set the encoding to Big5 (var encode='big5';), the server throws an error.

To use an encoding that isn't supported by Node core, you can use iconv-lite.
It adds support for additional encodings, including Big5; see the iconv-lite README for the full list of supported encodings.
const iconv = require('iconv-lite');
const fs = require('fs');
const stream = require('stream');
var Str = iconv.encode('This is a test', 'big5');
var readStream = new stream.PassThrough();
var writeStream = fs.createWriteStream('./text.txt');
readStream.once('error', (err) => { console.log(err); });
writeStream.once('finish', () => { console.log('File Written'); }); // 'finish' fires once the data has been flushed to the file
readStream.end(Str); // push the Big5-encoded buffer into the stream
readStream.pipe(writeStream); // pipe the data to the file
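Alternatively, since iconv.encode() returns a Buffer and fs.writeFileSync() writes a Buffer verbatim, you can skip the stream entirely and keep the question's synchronous call. A minimal sketch using the same file and text:
const iconv = require('iconv-lite');
const fs = require('fs');
// iconv.encode() produces the Big5-encoded bytes; no encoding option is
// needed because writeFileSync writes Buffers as-is.
fs.writeFileSync('./text.txt', iconv.encode('this is a test!', 'big5'));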

Related

How to read a large csv as a stream

I am using @aws-sdk/client-s3 to read a JSON file from S3, take the contents, and dump it into DynamoDB. This all currently works fine using:
const data = await new S3Client(region).send(new GetObjectCommand(bucketParams));
And then deserialising the response body etc.
However, I'm looking to migrate to the jsonlines format, effectively CSV-like, in the sense that it needs to be streamed in line by line, or in chunks of lines, and processed. I can't seem to find a way of doing this that doesn't load the entire file into memory (using response.text() etc.).
Ideally, I would like to pipe the response into a createReadStream, and go from there.
I found this example with createReadStream() from the fs module in Node.js:
import fs from 'fs';
function read() {
  let data = '';
  const readStream = fs.createReadStream('business_data.csv', 'utf-8');
  readStream.on('error', (error) => console.log(error.message));
  readStream.on('data', (chunk) => data += chunk);
  readStream.on('end', () => console.log('Reading complete'));
}
read();
You can modify it for your use case. Hope this helps.
You can connect to your S3 bucket like this (using the v2 SDK):
var s3 = new AWS.S3({apiVersion: '2006-03-01'});
var params = {Bucket: 'myBucket', Key: 'myImageFile.jpg'};
var file = require('fs').createWriteStream('/path/to/file.jpg');
s3.getObject(params).createReadStream().pipe(file);
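With the v3 SDK (@aws-sdk/client-s3) that the question already uses, the GetObject response Body is a Readable stream in Node.js, so you can hand it to readline and process jsonlines one line at a time without buffering the whole object. A sketch under that assumption; the region, bucket, and key are placeholders:
const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
const readline = require('readline');
async function processJsonLines() {
  const s3 = new S3Client({ region: 'us-east-1' }); // placeholder region
  const { Body } = await s3.send(
    new GetObjectCommand({ Bucket: 'myBucket', Key: 'data.jsonl' }) // placeholders
  );
  // readline consumes the stream incrementally, so only a small buffer
  // is held in memory at any time.
  const rl = readline.createInterface({ input: Body, crlfDelay: Infinity });
  for await (const line of rl) {
    if (!line.trim()) continue;      // skip blank lines
    const record = JSON.parse(line); // one JSON object per line
    // ...write `record` to DynamoDB here
  }
}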

How do I readFileSync audio as a string and then writeFileSync it back as audio?

I have the following:
const fileAsString = fs.readFileSync('speech.mp3', { encoding: 'utf-8' })
const encryptedString = encrypt(fileAsString)
const decryptedString = decrypt(encryptedString)
console.log(fileAsString === decryptedString) // this returns true
fs.writeFileSync('speech_copy.mp3', decryptedString, { encoding: 'utf-8' })
speech_copy.mp3 is created but it's no longer playable because I have messed up its encoding.
What am I doing wrong in the process? The only reason I'm originally reading the file using { encoding: 'utf-8' } is so that I may encrypt it and then decrypt it again. Should I use a different encoding when I write it back as a new file?
Using a base64 representation of the binary data is usually a better way:
const fs = require('fs');
// binary -> base64
const fileAsString = fs.readFileSync('speech.mp3').toString('base64');
const encryptedString = encrypt(fileAsString);
const decryptedString = decrypt(encryptedString);
// base64 -> binary
fs.writeFileSync('speech_copy.mp3', Buffer.from(decryptedString, 'base64'));
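For a sense of why the UTF-8 round trip breaks the file: MP3 bytes are not valid UTF-8, so invalid sequences are replaced during decoding and the original bytes cannot be recovered. A quick check, assuming speech.mp3 exists:
const fs = require('fs');
const original = fs.readFileSync('speech.mp3'); // raw bytes
// Decode to a UTF-8 string, then encode back to bytes.
const roundTripped = Buffer.from(original.toString('utf-8'), 'utf-8');
// Invalid sequences were replaced with U+FFFD during decoding,
// so the buffers generally differ for binary data.
console.log(original.equals(roundTripped)); // typically false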

Remove or change original data from stream Nodejs

I have code that writes a hash of one file's text to another file, but the problem is that the resulting file contains not only the hash but also the original text.
For example: if the content of the source file is qwerty, the result file contains qwertyd8578edf8458ce06fbc5bb76a58c5ca4, but I need just d8578edf8458ce06fbc5bb76a58c5ca4.
const fs = require('fs');
const crypto = require('crypto');
const hash = crypto.createHash('MD5');
const readData = fs.createReadStream('./task1/input.txt');
const writeData = fs.createWriteStream('./task1/output.txt');
readData.on('data', (chunk) => {
  hash.update(chunk);
});
readData.on('end', () => {
  const resultHash = hash.digest('hex');
  writeData.end(resultHash);
  console.log(resultHash);
});
readData.pipe(writeData); // this line also copies the original text into the output file
How can I fix this? Thanks.
If you want to hash a stream, that's straightforward, as hash is itself a stream (a Transform stream). Just pipe your input into it, and pipe the resulting hash into your output; set the encoding to 'hex' on the hash stream to get the hex digest your original code produced:
const fs = require('fs');
const crypto = require('crypto');
const hash = crypto.createHash('MD5');
hash.setEncoding('hex'); // read the digest out as hex text rather than raw bytes
const readData = fs.createReadStream('./task1/input.txt');
const writeData = fs.createWriteStream('./task1/output.txt');
readData.pipe(hash).pipe(writeData);

Uncaught TypeError: fs.readFileSync is not a function in console

I am using the code below to read a file from my local system:
var fs = require('fs');
var text = fs.readFileSync("./men.text");
var textByLine = text.split("\n")
console.log(textByLine);
NOTE: fs is a Node.js module; you cannot use it in the browser.
Import the fs module.
readFileSync will give you a Buffer.
To use the split() function, you have to convert the Buffer into a string.
var fs = require('fs')
var text = fs.readFileSync("./men.text");
var string = text.toString('utf-8') // converting the Buffer into String
var textByLine = string.split("\n")
console.log(textByLine);
UPDATE
Server-Side
fs is a Node.js built-in module; you cannot use it in the browser (client side). Use fs on the server side to do the manipulation and get the data into the required shape; then you can render it with HTML, EJS, or another templating engine.
Here I have created a Node.js server using Express; hit http://localhost:8000/ from the browser and you will get the array of data.
You can format your data and render it with .ejs or HTML files using res.render.
app.js
var express = require('express');
var app = express();
var fs = require('fs')
app.get('/', function (request, response) {
  var text = fs.readFileSync("./men.text");
  var string = text.toString('utf-8');
  var textByLine = string.split("\n");
  console.log(textByLine);
  response.send(textByLine);
});
app.listen(8000);
To all who are still getting an undefined function in their Electron apps:
The solution (at least for me) was, instead of doing:
const fs = require('fs');
I did :
const fs = window.require('fs');
And that fixed ALL the problems I had.
Alternatively, pass the encoding so readFileSync returns a string directly:
var fs = require('fs');
var text = fs.readFileSync('./men.text', 'utf8');
var textByLine = text.split("\n");
console.log(textByLine);

Node.js encrypts large file using AES

I am trying to use the following code to encrypt a 1 GB file, but Node.js aborts with "FATAL ERROR: JS Allocation failed - process out of memory". How can I deal with this?
var fs = require('fs');
var crypto = require('crypto');
var key = "14189dc35ae35e75ff31d7502e245cd9bc7803838fbfd5c773cdcd79b8a28bbd";
var cipher = crypto.createCipher('aes-256-cbc', key);
var file_cipher = "";
var f = fs.ReadStream("test.txt");
f.on('data', function(d) {
  file_cipher = file_cipher + cipher.update(d, 'utf8', 'hex');
});
f.on('end', function() {
  file_cipher = file_cipher + cipher.final('hex');
});
You could write the encrypted file back to disk instead of buffering the entire thing in memory:
var fs = require('fs');
var crypto = require('crypto');
var key = '14189dc35ae35e75ff31d7502e245cd9bc7803838fbfd5c773cdcd79b8a28bbd';
var cipher = crypto.createCipher('aes-256-cbc', key);
var input = fs.createReadStream('test.txt');
var output = fs.createWriteStream('test.txt.enc');
input.pipe(cipher).pipe(output);
output.on('finish', function() {
  console.log('Encrypted file written to disk!');
});
Note: crypto.createCipher(), which takes no initialization vector, has been deprecated since Node.js v10.0.0; use crypto.createCipheriv() instead.
You can also pipe streams using stream.pipeline() instead of the pipe() method and then promisify it, so the code fits neatly into promise and async/await flows.
const {createReadStream, createWriteStream} = require('fs');
const {pipeline} = require('stream');
const {randomBytes, createCipheriv} = require('crypto');
const {promisify} = require('util');
const key = randomBytes(32); // ... replace with your key
const iv = randomBytes(16); // ... replace with your initialization vector
promisify(pipeline)(
  createReadStream('./text.txt'),
  createCipheriv('aes-256-cbc', key, iv),
  createWriteStream('./text.txt.enc')
)
  .then(() => {/* ... */})
  .catch(err => {/* ... */});
With Node.js 15+ you can simplify this (skipping the promisify step) by using stream/promises:
const {createReadStream, createWriteStream} = require('fs');
const {pipeline} = require('stream/promises');
const {randomBytes, createCipheriv} = require('crypto');
const key = randomBytes(32); // ... replace with your key
const iv = randomBytes(16); // ... replace with your initialization vector
pipeline(
  createReadStream('./text.txt'),
  createCipheriv('aes-256-cbc', key, iv),
  createWriteStream('./text.txt.enc')
)
  .then(() => {/* ... */})
  .catch(err => {/* ... */});
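For completeness, decryption is the mirror image. A sketch assuming you stored the same 32-byte key and 16-byte IV that were used to encrypt (CBC needs both):
const {createReadStream, createWriteStream} = require('fs');
const {pipeline} = require('stream/promises');
const {createDecipheriv} = require('crypto');
const key = Buffer.alloc(32); // ... replace with the key used to encrypt
const iv = Buffer.alloc(16);  // ... replace with the IV used to encrypt
pipeline(
  createReadStream('./text.txt.enc'),
  createDecipheriv('aes-256-cbc', key, iv),
  createWriteStream('./text.txt.dec')
)
  .then(() => console.log('Decrypted'))
  .catch(err => console.error(err));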
I would simply use fileger to do it. It's a promise-based package and a clean alternative to the Node.js filesystem API.
const fileger = require("fileger");
const file = new fileger.File("./your-file-path.txt");
file.encrypt("your-password") // this will encrypt the file
  .then(() => {
    file.decrypt("your-password"); // this will decrypt the file
  });
