Node.js child_process: ANSI codes in stdout

I want to get output like this:
[06:32:35] [Server thread/INFO]: [0;36;1m | [0;36;22m|__) [0;32;22mLuckPerms [0;36;1mv4.3.73[m
[06:32:35] [Server thread/INFO]: [0;36;1m |___ [0;36;22m| [0;30;1mRunning on Bukkit - CraftBukkit[m
but instead I get:
[06:05:02] [Server thread/INFO]: | |__) LuckPerms v4.3.73
[06:05:02] [Server thread/INFO]: |___ | Running on Bukkit - CraftBukkit
This happens when running a Minecraft server using child_process:
prcs.stdout.on("data", function(d) {
  console.log(d.toString());
});
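A quick way to check whether the escape bytes are present in d at all (they are invisible when printed directly) is to log each chunk in escaped form; this is only a debugging aid:
prcs.stdout.on("data", function(d) {
  // JSON.stringify makes control characters visible, e.g. "\u001b[0;36;1m"
  console.log(JSON.stringify(d.toString()));
});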

Without knowing exactly how d is shaped, here is something that matches your example. It probably won't behave exactly as you need, but you can always build on it, and at least it doesn't require any dependency:
const versionRegExp = /v[0-9]+(\.[0-9]+)*$/;
d.toString().split("\n").forEach((line) => {
  // no idea what the spaces are made of
  const exploded = line.trim().split(/[ \t]+/);
  // add padding to the first two structures
  const first = exploded.shift().padEnd(5, ' ');
  const second = exploded.shift().padEnd(7, ' ');
  // work out the content
  // condition based on `second`, or should it be remainder.match(versionRegExp)?
  const remainder = 0 === second.indexOf('|__)')
    ? `\x1b[0;30;1m${exploded.join(' ').replace(versionRegExp, '\x1b[0;36;1m$&')}\x1b[m`
    : `\x1b[0;32;22m${exploded.join(' ')}\x1b[m`;
  // format line and display (\x1b is the ESC byte that every ANSI sequence starts with)
  console.log(`\x1b[0;36;1m${first}\x1b[0;36;22m${second}${remainder}`);
});
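To wire this into the original handler, the snippet above can be wrapped in a helper and called per line. A minimal sketch, where colorize(line) is assumed to contain the padding/ANSI logic from the snippet and the server command is only a placeholder:
const { spawn } = require('child_process');

// placeholder command; substitute your actual server start command
const prcs = spawn('java', ['-jar', 'server.jar', 'nogui']);

prcs.stdout.on('data', (d) => {
  d.toString()
    .split('\n')
    .filter((line) => line.trim() !== '')
    .forEach((line) => console.log(colorize(line)));
});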

Related

trying to decompress xref stream from pdf - getting "ERROR incorrect header check"

I am trying to parse the xref stream from a PDF in JavaScript. I managed to successfully isolate the stream itself (I checked that it's correct by comparing it, in debug mode, with the value between the stream and endstream tags in the PDF).
However, when I try to inflate it using the pako lib, I get an error saying: ERROR incorrect header check.
The compression method is FlateDecode, which can be seen from the dictionary.
Here is the code in question:
const dict = pdfStr.slice(pdf.startXRef);
const xrefStreamStart = this.getSubstringIndex(dict, 'stream', 1) + 'stream'.length + 2;
const xrefStreamEnd = this.getSubstringIndex(dict, 'endstream', 1) + 1;
const xrefStream = dict.slice(xrefStreamStart, xrefStreamEnd);
const inflatedXrefStream = pako.inflate(this.str2ab(xrefStream), { to: 'string' });
pdfStr is the whole PDF read as a string, while pdf.startXRef holds the position of the xref stream object.
Here's the whole PDF if someone wants to have a look: https://easyupload.io/lzf9he
EDIT: As mcernak suggested, the problem was that I included \r and \n in the stream. However, now that I have corrected the code, I get a different error: invalid distance too far back
The stream content is located between stream\r\n and \r\nendstream.
You need to take into account those two additional characters (\r\n) at the beginning and at the end to read the correct data:
const dict = pdfStr.slice(pdf.startXRef);
const xrefStreamStart = this.getSubstringIndex(dict, 'stream', 1) + 'stream'.length + 2;
const xrefStreamEnd = this.getSubstringIndex(dict, 'endstream', 1) - 2;
const xrefStream = dict.slice(xrefStreamStart, xrefStreamEnd);
const inflatedXrefStream = pako.inflate(this.str2ab(xrefStream), { to: 'string' });
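The str2ab helper isn't shown in the question; a minimal sketch of what such a string-to-typed-array conversion usually looks like (the name and exact behaviour are assumptions based on how it is called above):
// Hypothetical helper: convert a binary string (one character per byte)
// into a Uint8Array that pako can inflate.
function str2ab(str) {
  const bytes = new Uint8Array(str.length);
  for (let i = 0; i < str.length; i++) {
    bytes[i] = str.charCodeAt(i) & 0xff; // keep only the low byte
  }
  return bytes;
}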

How to restart at specific binlog position?

I'm using nodejs mysql-events to parse mysql db updates.
To avoid parsing all the logs each time I relaunch my script, I want to use binlogName and nextPosition, as described in the documentation:
filename (string): Begin reading events from this binlog file. If specified together with position, will take precedence over startAtEnd.
position (integer): Begin reading events from this position. Must be included with filename.
The below code works like a charm:
const mysql = require('mysql');
const MySQLEvents = require('@rodrigogs/mysql-events');

// Connect to mysql instance
const connection = mysql.createConnection({
  host: 'host',
  user: 'user',
  password: '*****'
});

const mysqlInstance = new MySQLEvents(connection, {
  startAtEnd: true
});

await mysqlInstance.start();
The values are valid and were copied from the VS Code debugger.
mysql> show master status;
+------------+----------+--------------+------------------+-------------------+
| File       | Position | Binlog_Do_DB | Binlog_Ignore_DB | Executed_Gtid_Set |
+------------+----------+--------------+------------------+-------------------+
| bin.000015 | 34837697 |              |                  |                   |
+------------+----------+--------------+------------------+-------------------+
1 row in set (0.00 sec)
Once I change the mysqlInstance settings as below, however, it ignores them and starts from the first log:
const mysqlInstance = new MySQLEvents(connection, {
  binlogName: 'bin.000015',
  nextPosition: 18540004
});
Should I specify something else?
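For what it's worth, the documentation quoted above calls the options filename and position rather than binlogName and nextPosition. A minimal sketch using the documented names (whether your version of the library accepts exactly these keys is an assumption worth verifying):
// Sketch only: option names taken from the documentation quoted above.
const mysqlInstance = new MySQLEvents(connection, {
  filename: 'bin.000015', // binlog file to start from
  position: 18540004      // byte offset within that file
});

await mysqlInstance.start();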

JSON in CSV to CSV

Through a REST API endpoint, I get rather big CSV files with the following structure (JSON inside CSV file):
A,B,C,D
1,2,3,{"E":1,"F":2,"G":3}
1,2,3,{"E":1,"H":2}
For a different tool, I need a CSV with a flat structure (no nested JSON). So, in the end, I'd like to have a CSV that looks like this:
A,B,C,E,F,G,H
1,2,3,1,2,3,
1,2,3,1,,,2
(Although the column headings look structured, this is not important for my use case.)
As the CSV files are rather big, I'm looking for a relatively performant way to do this. I'll be writing it in JavaScript (Node.js), as that's the language used for all other parts of the script. However, for now I'm just looking for a theoretical approach / pseudocode that handles this in a performant manner.
As far as I can tell, I'll probably have to loop over the CSV files twice. The first time I just have to collect all the JSON keys. The second time, I can create a new CSV file and set all the values. However, how would I properly work out which column each value has to go into?
Or is it more performant to "convert" the CSV file to an array of objects in one loop and then use something like the CSV parser (http://csv.adaltas.com/) to convert that back into a CSV?
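A minimal sketch of the two-pass approach described above, in plain Node.js. It assumes the JSON object is always the last column and that the fixed columns never contain commas or quotes; for files too large to hold in memory, the same idea works with a streaming CSV parser such as the one from csv.adaltas.com:
const fs = require('fs');

// Split off the fixed columns; the remainder of the line is the JSON blob.
function parseLine(line, fixedCount) {
  const parts = line.split(',');
  return {
    fixed: parts.slice(0, fixedCount),
    json: JSON.parse(parts.slice(fixedCount).join(','))
  };
}

function flatten(inputPath, outputPath) {
  const lines = fs.readFileSync(inputPath, 'utf8').trim().split('\n');
  const fixedHeaders = lines[0].split(',').slice(0, -1); // A,B,C (drop the JSON column D)
  const rows = lines.slice(1).map(l => parseLine(l, fixedHeaders.length));

  // Pass 1: collect every key that appears in any JSON cell.
  const jsonKeys = [...new Set(rows.flatMap(r => Object.keys(r.json)))].sort();

  // Pass 2: emit one flat row per input row, blank for missing keys.
  const header = [...fixedHeaders, ...jsonKeys].join(',');
  const body = rows.map(r =>
    [...r.fixed, ...jsonKeys.map(k => (k in r.json ? r.json[k] : ''))].join(',')
  );
  fs.writeFileSync(outputPath, [header, ...body].join('\n') + '\n');
}

flatten('data.csv', 'flat.csv'); // hypothetical file names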
Here is a solution using jq
If the file filter.jq contains
[
    split("\n")                                # split string into lines
  | (.[0] | split(",")) as $headers            # split header
  | (.[1:][] | split(","))                     # split data rows
  | select(length>0)                           # get rid of empty lines
  | $headers[:-1] as $h1                       # fixed headers
  | .[:($h1|length)] as $p1                    # fixed part
  | .[($h1|length):] as $p2                    # variable part
  | (
      [ [ $h1, $p1 ]                           # \
        | transpose[]                          #  \ assemble fixed object
        | {key:.[0], value:.[1]|tonumber}      #  / from fixed keys and values
      ] | from_entries                         # /
    ) + (
      $p2 | join(",") | fromjson               # assemble variable object
    )
]
| (map(keys) | add | unique) as $all           # compute final headers
| [$all] + (                                   # add headers to
      map(. as $b | reduce $all[] as $a ([]; . + [$b[$a]]))  # objects with all keys
    | map(map(if . == null then "" else tostring end))       # convert values to strings
  )
| .[]                                          # scan final array
| @csv                                         # convert to csv
and your data is in a file called data then
jq -M -R -s -r -f filter.jq data
will generate
"A","B","C","E","F","G","H"
"1","2","3","1","2","3",""
"1","2","3","1","","","2"
var express = require('express');
var app = express();
var bodyParser = require('body-parser');
var mysql = require('mysql');
var fs = require('fs');
var csv = require('fast-csv');
var formidable = require('formidable');

var urlencodedParser = bodyParser.urlencoded({ extended: false });

var con = mysql.createConnection({
  host: 'localhost',
  user: 'dheeraj',
  password: '123',
  database: 'dheeraj'
});

app.use('/assets', express.static('assets'));

app.get('/d', function (req, res) {
  res.sendFile(__dirname + "/" + "d.html");
});

app.post('/file_upload', urlencodedParser, function (req, res) {
  var form = new formidable.IncomingForm();
  form.parse(req, function (err, fields, files) {
    res.write('File uploaded');
    //console.log(files.filetoupload);
    // read from the temp path where formidable stored the upload
    fs.createReadStream(files.filetoupload.path)
      .pipe(csv())
      .on('data', function (data) {
        // each row arrives as an array of column values
        var d1 = data[0];
        var d2 = data[1];
        var d3 = data[2];
        var d4 = data[3];
        var d5 = data[4];
        // use placeholders so the driver escapes the values
        con.query('insert into demo values (?, ?, ?, ?, ?)', [d1, d2, d3, d4, d5], function (err, result) {
          console.log('inserted');
        });
        console.log(data);
      })
      .on('end', function () {
        console.log('read finished');
      });
    res.end();
  });
});

var server = app.listen(8081, function () {
  var host = server.address().address;
  var port = server.address().port;
  console.log("Example app listening at http://%s:%s", host, port);
});

How to compile a simple command-line OCaml script into JavaScript

I have a simple command-line OCaml application that performs a computation on Sys.argv.(1) and outputs the result to stdout. I can compile it to JavaScript with js_of_ocaml, but it gives me a lot of errors about caml_ml_output_char being undefined. I fixed those errors by stubbing out the printfs, so it runs, but it freezes Firefox while running.
How can I cleanly compile a simple OCaml command-line script into a JavaScript-based webpage, without maintaining a forked version or freezing the browser?
You will probably want to use web workers, as running software not designed around JavaScript's co-operative multitasking in the UI thread can cause the browser to lock up.
You can add the following header to the top of your OCaml file to override the normal OCaml Sys and print implementations:
(* JsHeader.ml *)
let output_buffer_ = Buffer.create 1000
let flush x =
  let module J = Js.Unsafe in
  let () = J.call (J.variable "postMessage") (J.variable "self")
    [|J.inject (Js.string (Buffer.contents output_buffer_))|]
  in Buffer.clear output_buffer_

let print_string = Buffer.add_string output_buffer_
let print_char = Buffer.add_char output_buffer_
let print_newline () = print_char '\n'
let print_endline s = print_string (s^"\n"); flush ()
let caml_ml_output_char = print_char
let printf fmt = Printf.bprintf output_buffer_ fmt
module Printf = struct
  include Printf
  let printf fmt = Printf.bprintf output_buffer_ fmt
end
The most natural way to pass in command-line arguments is through the URL sent to the web worker. We can override the OCaml Sys module to instead read ?argv as a sequence of null-terminated strings.
module Sys = struct
  let char_split delim s = (*Str.split is overkill*)
    let hd = ref "" in
    let l = ref [] in
    String.iter (fun c ->
      if c = delim
      then (l := (!hd)::(!l); hd := "")
      else hd := (!hd) ^ (String.make 1 c)
    ) s;
    List.rev ((!hd)::(!l))
  let getenv x = List.assoc x Url.Current.arguments
  let argv = Array.of_list (char_split '\x00' (getenv "?argv"))
  let executable_name = argv.(0)
end
With the header in place, we can add a simple OCaml command-line program:
(* cli.ml *)
let _ = print_string (Array.fold_left (^) "" (Array.make 40 (String.lowercase (Sys.argv.(1)^"\n"))) )
This command-line program relies on the OS to flush the output, but here we have to flush it manually. You may also want to send a null character so the JavaScript side knows that the command has finished. Both can be achieved by appending the following footer.
(* JsFooter.ml *)
let _ = flush stdout; print_endline "\x00"
We can join the files and compile them as follows:
cat JsHeader.ml cli.ml JsFooter.ml > merged.ml
ocamlbuild -use-menhir -menhir "menhir" \
-pp "camlp4o -I /opt/local/lib/ocaml/site-lib js_of_ocaml/pa_js.cmo" \
-cflags -I,+js_of_ocaml,-I,+site-lib/js_of_ocaml -libs js_of_ocaml \
-lflags -I,+js_of_ocaml,-I,+site-lib/js_of_ocaml merged.byte
js_of_ocaml merged.byte
Now that we have created the file merged.js, we can wrap the JavaScript in a simple web page such as the following:
<html>
<head>
  <meta http-equiv="Content-Type" content="text/xhtml+xml; charset=UTF-8" />
  <title>ml2js sample_cli</title>
  <script type="text/javascript">
  <!--
    var worker;
    function go () {
      var output = document.getElementById("output");
      var argv = encodeURIComponent("/bin/sample_cli\0" + document.getElementById("input").value);
      if (worker) {
        worker.terminate();
      }
      worker = new Worker("sample_cli.js?argv=" + argv);
      document.getElementById("output").value = "";
      worker.onmessage = function (m) {
        if (typeof m.data == 'string') {
          if (m.data == "\0\n") {
            output.scrollTop = output.scrollHeight;
          } else {
            output.value += m.data;
          }
        }
      }
    }
  //-->
  </script>
</head>
<body onload=go()>
  <textarea id="input" rows="2" cols="60" onkeyup="go()" onchange="go()" style="width:90%">SAMPLE_INPUT</textarea>
  <button onclick="go()">go</button><br>
  <textarea id="output" rows="0" cols="60" style="width:100%;height:90%" readonly onload=go()>
Your browser does not seem to support Webworkers.
Try Firefox, Chrome or IE10+.
  </textarea>
</body>
</html>
See http://www.dansted.org/app/bctl-plain.html for an example of this approach in action, and https://github.com/gmatht/TimeLogicUnify/blob/master/ATL/js/webworker/ml2js.sh for a script that appends the appropriate headers, footers, etc.
Which version of js_of_ocaml are you using? You should not get errors about caml_ml_output_char. When running on Node, Sys.argv should be set correctly. In the browser, Sys.argv is set to [|"a.out"|].
Please open a GitHub issue on https://github.com/ocsigen/js_of_ocaml/issues/new if you still have an issue with this.

Javascript NETMASK and CIDR conversion

I was expecting to find hundreds of examples of functions to convert between CIDR and netmask notation in JavaScript, but was unable to find any.
I need to convert to and from CIDR and netmasks on a Node.js page that sets and retrieves the IP address for a machine using netctl.
Are there any easy solutions for this in JavaScript / Node.js?
This code could provide a solution:
var mask = "255.255.248.0";
var maskNodes = mask.match(/(\d+)/g);
var cidr = 0;
for(var i in maskNodes)
{
cidr += (((maskNodes[i] >>> 0).toString(2)).match(/1/g) || []).length;
}
return cidr;
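For example, with the mask from the snippet:
console.log(maskToCidr("255.255.248.0")); // 21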
Here's one that doesn't check if the netmask is valid:
const netmaskToCidr = n => n
.split('.')
.reduce((c, o) => c - Math.log2(256 - +o), 32)
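For example, applying that one-liner to the masks shown in the table below:
console.log(netmaskToCidr('255.255.248.0')); // 21
console.log(netmaskToCidr('255.255.0.0'));   // 16
console.log(netmaskToCidr('255.192.0.0'));   // 10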
NETMASK         BINARY                               CIDR
255.255.248.0   11111111.11111111.11111000.00000000  /21
255.255.0.0     11111111.11111111.00000000.00000000  /16
255.192.0.0     11111111.11000000.00000000.00000000  /10
This is how the CIDR is calculated: it is the number of occurrences of 1 in the second column. With that in mind, I designed a readable algorithm, shown below:
const masks = ['255.255.255.224', '255.255.192.0', '255.0.0.0'];

/**
 * Count char in string
 */
const countCharOccurences = (string, char) => string.split(char).length - 1;

const decimalToBinary = (dec) => (dec >>> 0).toString(2);
const getNetMaskParts = (nmask) => nmask.split('.').map(Number);

const netmask2CIDR = (netmask) =>
  countCharOccurences(
    getNetMaskParts(netmask)
      .map(part => decimalToBinary(part))
      .join(''),
    '1'
  );

masks.forEach((mask) => {
  console.log(`Netmask =${mask}, CIDR = ${netmask2CIDR(mask)}`);
});
I know it's been a long time since this question was asked, but I just wanted to add checks to ensure that the netmask is valid:
function mask2cidr(mask) {
  var cidr = '';
  for (const m of mask.split('.')) {
    if (parseInt(m) > 255) { throw 'ERROR: Invalid Netmask'; } // check each group is 0-255
    if (parseInt(m) > 0 && parseInt(m) < 128) { throw 'ERROR: Invalid Netmask'; }
    cidr += (m >>> 0).toString(2);
  }
  // Condition to check for validity of the netmask
  if (cidr.substring(cidr.search('0'), 32).search('1') !== -1) {
    throw 'ERROR: Invalid Netmask ' + mask;
  }
  return cidr.split('1').length - 1;
}
Since a mask is only valid when its 1 bits run contiguously from the left, the condition checks that no 1 bit appears after the first 0 bit. It also checks that each group is either 0 or in the range 128-255.
The method of conversion is mostly the same as in the other answers.
Given that you have mentioned using Node.js to implement this, I'm assuming you're looking for a way to run this server-side in JavaScript, as opposed to client-side. If that's correct, does the netmask npm module cover what you need?
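The answers above only cover netmask to CIDR; for the other direction the question also asks about, here is a minimal sketch (the function name is mine, not taken from any of the answers):
// Convert a CIDR prefix length (0-32) to a dotted-decimal netmask,
// e.g. 21 -> "255.255.248.0".
function cidr2mask(cidr) {
  const octets = [];
  for (let i = 0; i < 4; i++) {
    const bits = Math.min(8, Math.max(0, cidr - i * 8)); // number of 1 bits in this octet
    octets.push(256 - Math.pow(2, 8 - bits));
  }
  return octets.join('.');
}

console.log(cidr2mask(21)); // 255.255.248.0
console.log(cidr2mask(16)); // 255.255.0.0
console.log(cidr2mask(10)); // 255.192.0.0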
