I have a Node.js module which exports two functions: init(data), where data is a Buffer, and test(word), where word is a string.
I would like to read lines from the data Buffer instance, line by line, inside the test() function.
I have no experience with Node.js, only JS. All I have learned so far from Stack Overflow is how to export multiple functions from a Node.js module.
Here is the function declaration so far:
module.exports = {
    init: function (data) {
    },
    test: function (word) {
    }
};
According to your comment, data is an instance of Buffer and contains a dictionary with one English word per line. So you can convert data to an array of strings by splitting on newline characters. In module form:
module.exports.init = function (data) {
    if (!(data instanceof Buffer)) {
        throw new Error('data is not an instance of Buffer');
    }
    this.currentData = data.toString().split(/(?:\r\n|\r|\n)/g);
};

module.exports.test = function (word) {
    // placeholder test method; a concrete example follows below
    var yourTestMethod = function (lineNumber, lineContent, testWord) {
        return true;
    };
    if (this.currentData && this.currentData.length) {
        for (var line = 0; line < this.currentData.length; line++) {
            if (yourTestMethod(line, this.currentData[line], word)) {
                return true;
            }
        }
    }
    return false;
};
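For example, a concrete yourTestMethod for a plain dictionary lookup might be (a minimal sketch, assuming exact-match semantics):

var yourTestMethod = function (lineNumber, lineContent, testWord) {
    // true when the dictionary line is exactly the word being tested
    return lineContent === testWord;
};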
If you save this code as testModule.js, you can use the module from your main code like this:
// load module
var testModule = require('./testModule.js');

// init
var buf = new Buffer(/* load dictionary */);
testModule.init(buf);

// test
console.log(testModule.test('foo'));
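(Note: on current Node.js the new Buffer() constructor is deprecated. A minimal sketch of the same init call without it, assuming the dictionary lives in a file named dictionary.txt:)

var fs = require('fs');

// fs.readFileSync with no encoding already returns a Buffer,
// so it can be passed straight to init() ('./dictionary.txt' is illustrative):
var buf = fs.readFileSync('./dictionary.txt');
testModule.init(buf);
console.log(testModule.test('foo'));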
I think this is simpler. Thanks.
(old answer)
I think you can use the readline module.
But readline accepts a stream, not a buffer, so the buffer needs to be converted first. For example:
var readline = require('readline');
var stream = require('stream');

// string to buffer
var baseText = 'this is a sample text\n(empty lines ...)\n\n\n\nend line:)';
var buf = new Buffer(baseText);

// http://stackoverflow.com/questions/16038705/how-to-wrap-a-buffer-as-a-stream2-readable-stream
var bufferStream = new stream.PassThrough();
bufferStream.end(buf);

var rl = readline.createInterface({
    input: bufferStream
});

var count = 0;
rl.on('line', function (line) {
    console.log('this is ' + (++count) + ' line, content = ' + line);
});
Then the output is:
> node test.js
this is 1 line, content = this is a sample text
this is 2 line, content = (empty lines ...)
this is 3 line, content =
this is 4 line, content =
this is 5 line, content =
this is 6 line, content = end line:)
How is that?
Related
I'm trying to extract the first line (the headers) from a CSV using TypeScript. I found a nifty function that does this using FileReader.onloadend, iterating over the bytes in the file until it reaches a line break. The function assigns the header string to a window namespace. That is unfortunately not very useful to me, but I can't find a workable way of getting the header string assigned to a global variable. Does anyone know how best to do this? Is it achievable with this function?
The function is here:
declare global {
    interface Window { MyNamespace: any; }
}

export const CSVImportGetHeaders = async (file: File) => {
    const reader = new FileReader();
    reader.readAsArrayBuffer(file);

    // onloadend is triggered each time a reading operation is completed
    reader.onloadend = (evt: any) => {
        // get array buffer
        const data = evt.target.result;
        console.log('reader content ', reader);

        // get byte length
        const byteLength = data.byteLength;
        console.log('HEADER STRING ', byteLength);

        // make iterable array
        const ui8a = new Uint8Array(data, 0);

        // header string, compiled iterably
        let headerString = '';
        let finalIndex = 0;

        // eslint-disable-next-line no-plusplus
        for (let i = 0; i < byteLength; i++) {
            // get current character
            const char = String.fromCharCode(ui8a[i]);
            // check whether it is a line break
            if (char.match(/[^\r\n]+/g) !== null) {
                // if not a line break, keep appending
                headerString += char;
            } else {
                // if a line break, stop
                finalIndex = i;
                break;
            }
        }

        window.MyNamespace = headerString.split(/,|;/);
        const potout = window.MyNamespace;
        console.log('reader result in function', potout);
    };

    const output = await window.MyNamespace;
    console.log('outside onload event', output);
};
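(A possible direction, not from the original post: since onloadend is a callback, one way to get the value out without a window namespace is to wrap the reader in a Promise and resolve with the parsed headers. The sketch below assumes that approach; getCSVHeaders is an illustrative name.)

// Hypothetical sketch: resolve the headers from a Promise instead of
// stashing them on window.
export const getCSVHeaders = (file) =>
    new Promise((resolve, reject) => {
        const reader = new FileReader();
        reader.onerror = () => reject(reader.error);
        reader.onloadend = (evt) => {
            const data = evt.target.result;
            const ui8a = new Uint8Array(data);
            let headerString = '';
            for (let i = 0; i < ui8a.length; i += 1) {
                const char = String.fromCharCode(ui8a[i]);
                // stop at the first line break
                if (char === '\r' || char === '\n') break;
                headerString += char;
            }
            resolve(headerString.split(/,|;/));
        };
        reader.readAsArrayBuffer(file);
    });

// usage: const headers = await getCSVHeaders(file);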
I'm not able to solve the following file-handling problem in Node.js: I have a file emp.txt that contains employee data, with a fixed record size, in the following format:
EmpID:Name:Dept:Salary
1001:Harry:Sales:23000
1002:Sarita:Accounts:20000
1003:Monika:TechSupport:35000
Read the file and display the sum of the salaries of all employees.
I have tried the following code, which reads the file successfully, but I can't work out the logic to solve the actual problem. My code to read the file:
var fs = require("fs");
console.log("Going to read file!");
fs.readFile('emp.txt', function(err, data){
if(err){
return console.error(err);
}
console.log(data.toString().split(":"));
console.log("read Successfully");
})
What is the correct logic to read the Salary field from emp.txt and calculate its sum?
First you have to split your text file on the newlines (\n). Then loop through each row and accumulate the total:
var fs = require("fs");
console.log("Going to read file!");
let totalSalary = 0;
fs.readFile('emp.txt', function(err, data){
if(err){
return console.error(err);
}
const dataRows = data.toString().split("\n");
for (let index=0; index < dataRows.length; index++){
if (index > 0){
let empData = dataRows[index].split(":");
totalSalary += parseInt(empData[3]);
}
}
console.log(totalSalary);
console.log("read Successfully");
})
Repl.it Link : https://repl.it/#h4shonline/ImpressionableRadiantLogic
Why don't you:
read the file line by line: https://nodejs.org/api/readline.html#readline_example_read_file_stream_line_by_line
remove the spaces
split the line by ":"
get the last element
convert it with Number()
check that it is a number
add it to the sum
Something like this:
const fs = require('fs');
const readline = require('readline');
async function processLineByLine() {
    const fileStream = fs.createReadStream('emp.txt');

    const rl = readline.createInterface({
        input: fileStream,
        crlfDelay: Infinity
    });
    // Note: we use the crlfDelay option to recognize all instances of CR LF
    // ('\r\n') in emp.txt as a single line break.

    let sum = 0;
    for await (let line of rl) {
        // Each line of emp.txt will be successively available here as `line`.
        line = line.replace(/ /g, '').split(':');
        const salary = Number(line.pop());
        if (!isNaN(salary)) {
            sum += salary;
        }
    }
    console.log(`Sum: ${sum}`);
}

processLineByLine();
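For the sample emp.txt above, both approaches compute 23000 + 20000 + 35000 = 78000 (the first skips the header row by index, the second because Number('Salary') is NaN).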
I'm really struggling to find documentation on how to read a text file into an array using OS X Automation with JavaScript.
Here's what I have so far:
var app = Application.currentApplication();
app.includeStandardAdditions = true;

var myFile = "/Users/Me/Dropbox/textfile.txt";
var openedFile = app.openForAccess(myFile, { writePermission: true });
var myText = openedFile.??
app.closeAccess(openedFile);
I copied most of this from the official Apple documentation. I'm finding it really difficult to find documentation anywhere online. For example, what are the arguments for openForAccess? There doesn't seem to be anything in any dictionary to describe that method.
Am I wasting my time with JXA?
Apple has an entire page devoted to reading and writing files in their Mac Automation Scripting Guide. This includes a function that performs exactly the action you are looking for. I've re-written your example below using the readAndSplitFile function from Apple's guide:
var app = Application.currentApplication()
app.includeStandardAdditions = true

function readAndSplitFile(file, delimiter) {
    // Convert the file to a string
    var fileString = file.toString()

    // Read the file using a specific delimiter and return the results
    return app.read(Path(fileString), { usingDelimiter: delimiter })
}

var fileContentsArray = readAndSplitFile('/Users/Me/Dropbox/textfile.txt', '\n')
After running the above code, fileContentsArray will hold an array of strings, with each string containing a single line of the file. (You could also use \t as a delimiter to break at every tab, or any other character of your choosing.)
Some generic functions and an illustrative test:
(function () {
    'use strict';

    // GENERIC FUNCTIONS ------------------------------------------------------

    // doesFileExist :: String -> Bool
    function doesFileExist(strPath) {
        var error = $();
        // (comma operator: run the attributes lookup for its side effect on
        // error, then return whether error.code is still undefined)
        return (
            $.NSFileManager.defaultManager.attributesOfItemAtPathError(
                $(strPath).stringByStandardizingPath, error
            ),
            error.code === undefined
        );
    }

    // lines :: String -> [String]
    function lines(s) {
        return s.split(/[\r\n]/);
    }

    // readFile :: FilePath -> maybe String
    function readFile(strPath) {
        var error = $(),
            str = ObjC.unwrap(
                $.NSString.stringWithContentsOfFileEncodingError(
                    $(strPath).stringByStandardizingPath,
                    $.NSUTF8StringEncoding,
                    error
                )
            ),
            blnValid = typeof error.code !== 'string';
        return {
            nothing: !blnValid,
            just: blnValid ? str : undefined,
            error: blnValid ? '' : error.code
        };
    }

    // show :: a -> String
    function show(x) {
        return JSON.stringify(x, null, 2);
    }

    // TEST -------------------------------------------------------------------
    var strPath = '~/Desktop/tree.txt';

    return doesFileExist(strPath) ? (function () {
        var dctMaybe = readFile(strPath);
        return dctMaybe.nothing ? dctMaybe.error : show(lines(dctMaybe.just));
    })() : 'File not found:\n\t' + strPath;
})();
I have a txt file that contains:
{"date":"2013/06/26","statement":"insert","nombre":1}
{"date":"2013/06/26","statement":"insert","nombre":1}
{"date":"2013/06/26","statement":"select","nombre":4}
How can I convert the contents of the text file into an array such as:

statement = [
    {"date":"2013/06/26","statement":"insert","nombre":1},
    {"date":"2013/06/26","statement":"insert","nombre":1},
    {"date":"2013/06/26","statement":"select","nombre":4}
];

I am using the fs module of Node.js. Thanks.
Sorry, I will explain in more detail. I have an array:
st = [
    {"date":"2013/06/26","statement":"insert","nombre":1},
    {"date":"2013/06/26","statement":"insert","nombre":5},
    {"date":"2013/06/26","statement":"select","nombre":4}
];
If I use this code:

var arr = new LINQ(st)
    .OrderBy(function (x) { return x.nombre; })
    .Select(function (x) { return x.statement; })
    .ToArray();
I get the result I want:
insert select insert
But the problem is that my data is in a text file.
Any suggestions? Thanks again.
There is no reason not to write the file parser yourself. This will work on a file of any size:
var fs = require('fs');
var fileStream = fs.createReadStream('file.txt');
var data = "";

fileStream.on('readable', function () {
    // this handler reads chunks of data and emits a newLine event when \n is found
    var chunk;
    while ((chunk = fileStream.read()) !== null) {
        data += chunk;
    }
    while (data.indexOf('\n') >= 0) {
        fileStream.emit('newLine', data.substring(0, data.indexOf('\n')));
        data = data.substring(data.indexOf('\n') + 1);
    }
});

fileStream.on('end', function () {
    // this handler sends the last chunk of data to the newLine event and tells it
    // that the file has ended
    fileStream.emit('newLine', data, true);
});

var statement = [];

fileStream.on('newLine', function (line_of_text, end_of_file) {
    // this is the code where you handle each line
    // line_of_text = string which contains one line
    // end_of_file = true if the end of file has been reached
    if (line_of_text) { // skip a possible empty final chunk
        statement.push(JSON.parse(line_of_text));
    }
    if (end_of_file) {
        console.dir(statement);
        // here you have your statement object ready
    }
});
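For the sample file above, statement ends up holding the three parsed objects in order, and you can then feed that array to the LINQ query from the question.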
If it's a small file, you might get away with something like this:
// specifying the encoding means you don't have to do `.toString()`
var arrayOfThings = fs.readFileSync("./file", "utf8").trim().split(/[\r\n]+/g).map(function (line) {
    // this try/catch will make it so we just return null
    // for any lines that don't parse successfully, instead
    // of throwing an error.
    try {
        return JSON.parse(line);
    } catch (e) {
        return null;
    }
// this .filter() removes anything that didn't parse correctly
}).filter(function (object) {
    return !!object;
});
If it's larger, you might want to consider reading it in line-by-line using any one of the many modules on npm for consuming lines from a stream.
Wanna see how to do it with streams? This isn't a practical example, but it's fun anyway!
var stream = require("stream"),
fs = require("fs");
var LineReader = function LineReader(options) {
options = options || {};
options.objectMode = true;
stream.Transform.call(this, options);
this._buffer = "";
};
LineReader.prototype = Object.create(stream.Transform.prototype, {constructor: {value: LineReader}});
LineReader.prototype._transform = function _transform(input, encoding, done) {
if (Buffer.isBuffer(input)) {
input = input.toString("utf8");
}
this._buffer += input;
var lines = this._buffer.split(/[\r\n]+/);
this._buffer = lines.pop();
for (var i=0;i<lines.length;++i) {
this.push(lines[i]);
}
return done();
};
LineReader.prototype._flush = function _flush(done) {
if (this._buffer.length) {
this.push(this._buffer);
}
return done();
};
var JSONParser = function JSONParser(options) {
options = options || {};
options.objectMode = true;
stream.Transform.call(this, options);
};
JSONParser.prototype = Object.create(stream.Transform.prototype, {constructor: {value: JSONParser}});
JSONParser.prototype._transform = function _transform(input, encoding, done) {
try {
input = JSON.parse(input);
} catch (e) {
return done(e);
}
this.push(input);
return done();
};
var Collector = function Collector(options) {
options = options || {};
options.objectMode = true;
stream.Transform.call(this, options);
this._entries = [];
};
Collector.prototype = Object.create(stream.Transform.prototype, {constructor: {value: Collector}});
Collector.prototype._transform = function _transform(input, encoding, done) {
this._entries.push(input);
return done();
};
Collector.prototype._flush = function _flush(done) {
this.push(this._entries);
return done();
};
fs.createReadStream("./file").pipe(new LineReader()).pipe(new JSONParser()).pipe(new Collector()).on("readable", function() {
var results = this.read();
console.log(results);
});
fs.readFileSync("myfile.txt").toString().split(/[\r\n]/)
This gets you each line as a string.
You can then use UnderscoreJS or your own for loop to apply the JSON.parse("your json string") method to each element of the array, as in the sketch below.
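For example, with a plain .map() (a minimal sketch; empty strings produced by the split are filtered out first so JSON.parse does not throw on them):

var fs = require('fs');

var lines = fs.readFileSync("myfile.txt").toString().split(/[\r\n]/);
var statement = lines
    .filter(function (line) { return line.length > 0; }) // drop empty splits
    .map(function (line) { return JSON.parse(line); });

console.log(statement);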
var arr = fs.readFileSync('mytxtfile', 'utf-8').split('\n')
I think this is the simplest way of creating an array from your text file.
I'm trying to use GJS, and more precisely, to read a text file in a synchronous way.
Here is an example of the asynchronous function for file reading: gio-cat.js.
I found how to proceed with Seed using the following function:
function readFile(filename) {
    print(filename);

    var input_file = gio.file_new_for_path(filename);
    var fstream = input_file.read();
    var dstream = new gio.DataInputStream.c_new(fstream);
    var data = dstream.read_until("", 0);

    fstream.close();
    return data;
}
Unfortunately, it doesn't work with GJS.
Can anyone help me?
GLib has the helper function GLib.file_get_contents(String fileName) to read files synchronously:
const GLib = imports.gi.GLib;
// ...
// file_get_contents returns [ok, contents]; index 1 holds the file contents
let fileContents = String(GLib.file_get_contents("/path/to/yourFile")[1]);
Here is a solution that works with just Gio.
function readFile(filename) {
    let input_file = Gio.file_new_for_path(filename);
    let size = input_file.query_info(
        "standard::size",
        Gio.FileQueryInfoFlags.NONE,
        null).get_size();
    let stream = input_file.open_readwrite(null).get_input_stream();
    let data = stream.read_bytes(size, null).get_data();
    stream.close(null);
    return data;
}
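Note that get_data() here returns bytes rather than a JS string. A minimal sketch of converting the result, assuming the standard imports.byteArray module:

const ByteArray = imports.byteArray;

// convert the bytes returned by readFile() into a string, then split into lines
let text = ByteArray.toString(readFile("/path/to/yourFile"));
let lines = text.split("\n");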
As I use GJS for developing Cinnamon applets, I used to use the get_file_contents_utf8_sync function to read text files :
const Cinnamon = imports.gi.Cinnamon;
let fileContent = Cinnamon.get_file_contents_utf8_sync("file path");
If you have Cinnamon installed and you agree to use it, it answers your question.
Otherwise, here is the C code of the get_file_contents_utf8_sync function from cinnamon-util.c, in the hope that it helps you:
char *
cinnamon_get_file_contents_utf8_sync (const char *path, GError **error)
{
    char *contents;
    gsize len;

    if (!g_file_get_contents (path, &contents, &len, error))
        return NULL;

    if (!g_utf8_validate (contents, len, NULL))
    {
        g_free (contents);
        g_set_error (error,
                     G_IO_ERROR,
                     G_IO_ERROR_FAILED,
                     "File %s contains invalid UTF-8",
                     path);
        return NULL;
    }

    return contents;
}
Cinnamon source code
Try replacing
new gio.DataInputStream.c_new(fstream);
with
gio.DataInputStream.new(fstream);
It worked for me.
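(Presumably c_new is a Seed-specific binding; in GJS, introspected constructors are typically exposed as .new() or via the object constructor, e.g. new Gio.DataInputStream({ base_stream: fstream }). This is an assumption based on standard GJS conventions, not something stated in the original answer.)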