Improper parsing of strings - JavaScript

I'm trying to convert ANSI color codes from console output into HTML. I found a script to do this, but I can't seem to make it parse the strings inside Node.js. I have tried JSON.stringify to also include the special chars, but it's not working.
forever list
[32minfo[39m: Forever processes running
[90mscript[39m [37mforever[39m [37mpid[39m [37mid[39m
[90mdata[39m: [37m [39m [37muid[39m [90mcommand[39m
I get output like this back from ssh2shell in Node.js. I have a script:
https://github.com/pixelb/scripts/blob/master/scripts/ansi2html.sh
This is supposed to convert the above to HTML and add the appropriate color codes. It works fine with normal terminal output, for example:
npm install --color=always | ansi2html.sh > npminstall.html
This is the raw output on the Linux machine piped to a file. It seems the JS strings are missing these escapes when they are shown in console.log, but they are also missing newlines there. Perhaps it's because I'm concatenating them directly into the string and that removes the special chars?
total 24
-rwxr-xr-x 1 admin admin 17002 May 13 02:52 ^[[0m^[[38;5;34mansi2html.sh^[[0m
drwxr-xr-x 4 admin admin 4096 May 13 00:00 ^[[38;5;27mgit^[[0m
-rw-r--r-- 1 admin admin 0 May 13 02:57 ls.html
Hopefully some of this makes sense.
Thanks

There are a couple of filters that SSH2shell applies to the output from commands: the first removes non-standard ASCII from the response, and then the colour formatting codes are removed.
In v1.6.0 I have added pipe()/unpipe(), the events for both, and exposed the stream.on('data', function(data){}) event so you can access the stream output directly without SSH2shell interacting with it in any way.
This should resolve the problem of not getting the right output from SSH2shell by giving you access to the raw data.
var fs = require('fs')

var host = {
  server: {
    host: "mydomain.com",
    port: 22,
    userName: "user",
    password: "password"
  },
  commands: [
    "`Test session text message: passed`",
    "msg:console test notification: passed",
    "ls -la"
  ]
}

//until npm published use the cloned dir path.
var SSH2Shell = require('ssh2shell')

//run the commands in the shell session
var SSH = new SSH2Shell(host),
  callback = function(sessionText) {
    console.log("-----Callback session text:\n" + sessionText)
    console.log("-----Callback end")
  },
  firstLog = fs.createWriteStream('first.log'),
  secondLog = fs.createWriteStream('second.log'),
  buffer = ""

//multiple pipes can be added but they won't be bound to the stream until the connection is established
SSH.pipe(firstLog).pipe(secondLog)

SSH.on('data', function(data) {
  //do something with the data chunk
  console.log(data)
})

SSH.connect(callback)

Have you tried this?
https://github.com/hughsk/ansi-html-stream
var spawn = require('child_process').spawn
  , ansi = require('ansi-html-stream')
  , fs = require('fs')

var npm = spawn('npm', ['install', 'browserify', '--color', 'always'], {
  cwd: process.cwd()
})

var stream = ansi({ chunked: false })
  , file = fs.createWriteStream('browserify.html', 'utf8')

npm.stdout.pipe(stream)
npm.stderr.pipe(stream)

stream.pipe(file, { end: false })
stream.once('end', function() {
  file.end('</pre>\n')
})

file.write('<pre>\n');

Related

ffmpeg fails when using a temporary file path for output

I'm using the fluent-ffmpeg library in Node to automatically generate a single thumbnail at the halfway mark of a given video file.
// Imports implied by the snippet (fluent-ffmpeg and tmp are both named in the question).
import ffmpeg from "fluent-ffmpeg";
import tmp from "tmp";

const screenshot = async (pathToFile: string) => {
  // Generate a temporary file path outside of the working directory with the extension .jpg
  const tempFileName = tmp.tmpNameSync({ postfix: ".jpg" });
  try {
    await new Promise((resolve, reject) => {
      ffmpeg(pathToFile)
        .thumbnail({
          // This works fine when NOT using tmpNameSync
          filename: tempFileName,
          count: 1,
          timestamps: ["50%"]
        })
        .on("end", resolve)
        .on("error", reject);
    });
  } catch (err) {
    console.log(err);
    return null;
  }
  return tempFileName;
};
This implementation works very well when I'm using a "non-temporary" output path, such as /path/to/thumbnail.jpg. But, when I use a library such as tmp to generate a temporary file name outside of the working directory, ffmpeg throws an error.
Error: ffmpeg exited with code 1: av_interleaved_write_frame(): Input/output error
frame= 1 fps=0.0 q=7.8 size=N/A time=00:00:00.04 bitrate=N/A speed=0.152x
frame= 1 fps=0.0 q=7.8 Lsize=N/A time=00:00:00.04 bitrate=N/A speed=0.141x
video:119kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
Conversion failed!
I cannot seem to find anything online about ffmpeg struggling to access temporary directories, and using the command directly in the terminal works as expected, so I don't believe this is a permissions issue. Although I may be going about this incorrectly.
This is the full ffmpeg command that fluent-ffmpeg generates (reduced filenames so it doesn't look horrible):
ffmpeg -ss 14.118271 -i /var/folders/__/XYZ/T/tmp-XYZ/tmp-XYZ -y -filter_complex scale=w=trunc(oh*a/2)*2:h=720[size0];[size0]split=1[screen0] -vframes 1 -map [screen0] var/folders/__/XYZ/T/tmp-XYZ.jpg
After hours of debugging, I found that the problem was a result of two things.
Firstly, ffmpeg and os.tmpdir() do not mix on macOS.
The tmpdir method returns a symlinked path on macOS rather than the real absolute path, which ffmpeg doesn't seem to like, although this seems inconsistent in when it does and doesn't affect the outcome.
Regardless, the fix for this is simple.
const fixSymlinkPath = (path: string) => {
  // If the current platform is macOS, prefix the generated path with /private.
  // This is the true location of the path.
  return process.platform === "darwin"
    ? `/private${path}`
    : path;
};

// Use like so.
let path = fixSymlinkPath(tmp.tmpNameSync());
Secondly (and this is something I should've noticed earlier), fluent-ffmpeg strips the leading / from the filename property, resulting in a relative path rather than an absolute one.
This effectively meant that ffmpeg was outputting to a non-existent directory inside __dirname.
ffmpeg(pathToFile).thumbnail({
  folder: "/", // Ensure an absolute path, essentially.
  filename: tempFileName,
  count: 1,
  timestamps: ["50%"]
});
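Putting the two fixes together (a sketch under the same assumptions as the snippets above):
const tempFileName = fixSymlinkPath(tmp.tmpNameSync({ postfix: ".jpg" }));
ffmpeg(pathToFile).thumbnail({
  folder: "/", // fluent-ffmpeg strips the leading slash from filename, so anchor the path here
  filename: tempFileName,
  count: 1,
  timestamps: ["50%"]
});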
Hopefully this helps someone down the track.

Simple CSV parsing in Javascript

Here is my problem:
I am trying to parse a local CSV file in JavaScript. The file looks like this:
Year,Promo,Surname,Name,Mail
2005/2006,0,XXXX,XXXXX,xxxxx.xxxxx#gmail.com
(...)
2006/2007,1,XXXX,XXXXX,xxxxx.xxxxx#gmail.com
(...)
2007/2008,2,XXXX,XXXXX,xxxxx.xxxxx#gmail.com
etc.
I tried to parse it using several libraries (PapaParse.js, jquery-csv, d3.js...), but:
either the parsing fails (my array is empty),
or I get an XMLHttpRequest cannot load - Cross origin requests are only supported for protocol schemes: http, data, chrome, chrome-extension, https, chrome-extension-resource error, since my file is stored locally.
Is there a simple solution to parse a CSV file in JavaScript, in order to access the data? I have looked up hundreds of forum posts but could not get it to work.
Thank you very much (excuse me, I am quite new to JS).
Tom.
This answer aims to be canonical, addressing anyone whose problem might be described by the question. Only the first suggestion is meant for the OP, although, regarding the last comment, the edit section I added at the bottom is also specifically for the OP.
If you are doing this for a small, local app, you can probably do one of these two things:
launch the browser with CORS disabled:
Chrome.exe --disable-web-security
The source for this tip also has instructions for Firefox.
run a micro server for your files:
If you've got Python installed (most Mac and Linux users do), you can start a quick local web server for testing. Using the command prompt, navigate to the directory that has your HTML files and run the following command:
python -m SimpleHTTPServer
(On Python 3, the equivalent is python3 -m http.server.)
Your files should now be accessible from http://localhost:8000/ and have a good chance of working when file:/// does not.
A better solution, if you run into CORS issues with the Python server, might be local-web-server from npm: https://www.npmjs.com/package/local-web-server
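If you'd rather not install anything, a minimal static file server in Node itself is only a few lines. This is a rough sketch for local testing only: it serves files from the current directory on port 8000, with no MIME types and no path sanitisation:
var http = require('http')
var fs = require('fs')
var path = require('path')

http.createServer(function (req, res) {
  // Map the request URL onto a file under the current directory.
  var filePath = path.join(process.cwd(), req.url === '/' ? 'index.html' : req.url)
  fs.readFile(filePath, function (err, data) {
    if (err) {
      res.writeHead(404)
      res.end('Not found')
      return
    }
    res.writeHead(200)
    res.end(data)
  })
}).listen(8000)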
The typical user looking for an answer to this question is probably using node:
'use strict';
var fs = require('fs');

var readFilePromise = function(file) {
  return new Promise(function(ok, notOk) {
    fs.readFile(file, function(err, data) {
      if (err) {
        notOk(err)
      } else {
        ok(data)
      }
    })
  })
}

readFilePromise('/etc/passwd').then(function(data) {
  // do something with the data...
})
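Continuing that example for a CSV like the one in the question (a naive sketch that assumes no quoted fields; 'students.csv' is a hypothetical file name):
readFilePromise('students.csv').then(function(data) {
  // Split into lines, then split each line on commas.
  var rows = data.toString('utf8').trim().split('\n').map(function(line) {
    return line.split(',')
  })
  var header = rows[0]        // e.g. ['Year', 'Promo', 'Surname', 'Name', 'Mail']
  var records = rows.slice(1) // the data rows
  console.log(header, records.length)
})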
Edit: setting it up for a simple application:
Make the server a service in rc.d or wherever. Follow a guide like this: https://blog.terminal.com/using-daemon-to-daemonize-your-programs/
Don't leave the server running as an always-active local service, though! Instead, make a script to launch your app, and start the daemon only from that script. In your init script for the service, write a check that looks for your app's PID every few minutes and shuts the daemon down automatically when the app is no longer running.
Here is some sample code for basic parsing of CSV that you could try.
First step: Read the file.
We can read the file content using the FileReader method readAsText, because the content of a CSV file is just text.
Read more about FileReader here: https://developer.mozilla.org/en-US/docs/Web/API/FileReader
This code should be inside an async function, because we use await to wait for the promise to resolve or reject. Here the file variable is the File object you get from a file input HTML element.
const fileContent = await (() => {
  const promise = new Promise((resolve, reject) => {
    const fileReader = new FileReader();
    fileReader.onloadend = () => {
      try {
        const content = fileReader.result;
        resolve(content);
      } catch (error) {
        reject(error);
      }
    };
    fileReader.readAsText(file);
  });
  return promise;
})();
Second step: Transform the file content.
Here I transformed the file content into a 2D array containing the CSV data.
/** Extract the lines by splitting the text content by CRLF */
const linesArray = fileContent.split('\r\n');
const outcomeArray = [];
for (let rowIndex = 0; rowIndex < linesArray.length; rowIndex++) {
  /** Check whether the line is empty or not.
      It's possible that there is a blank line in the CSV file.
      We process only non-blank lines. */
  if (linesArray[rowIndex].trim()) {
    /** Extract the cells out of the current line */
    const currentline = linesArray[rowIndex].split(',').map((cellData, columnIndex) => {
      /** Form the data as an object. This can be customised as needed */
      return {
        rowIndex,
        columnIndex,
        value: cellData?.trim()
      };
    });
    outcomeArray.push(currentline);
  }
}
Example
If we parse a CSV having this content:
10,11
20,21
Output is a 2D array as below:
[
  [
    { "rowIndex": 0, "columnIndex": 0, "value": "10" },
    { "rowIndex": 0, "columnIndex": 1, "value": "11" }
  ],
  [
    { "rowIndex": 1, "columnIndex": 0, "value": "20" },
    { "rowIndex": 1, "columnIndex": 1, "value": "21" }
  ]
]
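Note that splitting on commas breaks as soon as a quoted field contains a comma. For real-world files, a library such as PapaParse (which the question already mentions) handles quoting. A brief sketch, assuming the PapaParse script is loaded (so the global Papa exists) and file is the same File object as above:
Papa.parse(file, {
  header: true,          // use the first row as column names
  skipEmptyLines: true,
  complete: function (results) {
    console.log(results.data); // array of row objects
  }
});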

fs.readFileSync always returns empty string

I have a script that writes data from an API to some files. I have an object that contains the file descriptors for each file:
var fs = require('fs');
var mkdirp = require('mkdirp');
var moment = require('moment');

var csvFds = {
  'file1' : null,
  'file2' : null,
  'file3' : null,
  'file4' : null
};

for (var file in csvFds) {
  var dirPath = __dirname + '/files/' + file;
  try {
    fs.statSync(dirPath);
  } catch (e) {
    mkdirp.sync(dirPath, {mode: 0755});
  }
  csvFds[file] = fs.openSync(dirPath + '/' + moment().format("YYYY-MM-DDTHH:mm:ss[Z]") + '.csv', 'a+');
}
Then I have some code that uses fs.write to write lines of CSV to the files in batches. That part is working fine; I have well-formed CSV files. Now I need to read the contents of an entire file as a string. This is how I'm trying to do it:
fs.readFileSync(csvFds['file1']).toString();
But for some reason I am always getting an empty string. I have confirmed that fs.readFileSync is in fact returning a Buffer by using console.log and dropping the toString() method.
I'm really stuck on this so any help will be greatly appreciated. Thanks in advance. Here's some additional info regarding my node version and OS:
$ node -v
v6.2.3-pre
$ uname -a
Darwin i-2.local 14.5.0 Darwin Kernel Version 14.5.0: Thu Jun 16 19:58:21 PDT 2016; root:xnu-2782.50.4~1/RELEASE_X86_64 x86_64
For others who have the same problem: for me, the only way to get a non-empty string was to use fs.readFile instead of fs.readFileSync.
I use a Mac and was trying to read a file that Node itself had created; reading any other file works.
fs.readFile(file, (err, data) => {
  if (err) {
    console.log(err)
    throw err
  } else {
    let file_content = data.toString('utf8')
    // your code here
  }
})
Try calling readFileSync like this: readFileSync(csvFds['file1'], 'utf-8'). It ought to return a string. Or you can omit the argument and provide the encoding when calling the toString method, e.g. readFileSync(csvFds['file1']).toString('utf-8').
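A likely root cause, though the thread doesn't confirm it: when readFileSync is given a file descriptor rather than a path, it reads from the descriptor's current position, and after the earlier writes that position sits at the end of the file, so the read comes back empty. Reading by path opens a fresh descriptor at position 0. A small sketch ('demo.csv' is just an illustrative name):
var fs = require('fs');

var fd = fs.openSync('demo.csv', 'a+');
fs.writeSync(fd, 'a,b,c\n');

// Reading via the fd continues from its current position (just past the write),
// so this may well print an empty string.
console.log(JSON.stringify(fs.readFileSync(fd, 'utf-8')));

// Reading by path starts from the beginning of the file.
console.log(JSON.stringify(fs.readFileSync('demo.csv', 'utf-8')));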

How can I get terminal size in a piped node.js process?

I'm using Grunt to kick off a unit-test framework (Intern), which ultimately pipes another node.js process whose results I output to the screen with Charm. I'm having to pass the terminal size information in from a Grunt config option, but it's a bit messy, and I'd like to get the terminal size from within the piped process itself. However, the standard process.stdout.columns/getWindowSize are simply unavailable, because the piped process doesn't register as a TTY (although Charm works fine with it all).
Any suggestions?
EDIT: Just to be clear here... the Grunt JavaScript file is running in the main node.js process, but the file I'm attempting to retrieve this info from (and where I'm therefore running people's suggested commands) is in a spawned child process.
Try these:
tput cols tells you the number of columns.
tput lines tells you the number of rows.
echo -e "lines\ncols" | tput -S gets both the lines and cols at once.
There's also stty, from coreutils:
$ stty size
60 120    <= sample output
Running the code below in a terminal prints the number of columns:

var exec = require('child_process').exec;

function puts(error, stdout, stderr) { console.log(stdout) }
exec("tput cols", puts);
The pty.js module can make a child act like a regular terminal.
var pty = require('pty.js');

var term = pty.spawn('bash', [], {
  name: 'xterm-color',
  cwd: process.env.HOME,
  env: process.env
});

term.on('data', function(data) {
  console.log(data);
});

term.write('ls\r');
term.resize(100, 40);
term.write('ls /\r');

console.log(term.process);

Read XML hosted file with NodeJS

OK, so I have attempted to use multiple XML libraries that NodeJS has to offer, and I can't seem to work out how to have NodeJS read an XML file from a website.
I can pull the file using http.request, http.get and all of that, but having NodeJS actually do anything with the data in the XML file is another story.
I'm sure I must be missing something, because whenever I turn the XML into JS with xml-stream, it can't use it from a website. My code runs when I host the file myself; however, I am using an API and they only serve XML.
Current code:
var http = require('http');
var XmlStream = require('xml-stream');

var options = { host: 'cloud.tfl.gov.uk',
                path: '/TrackerNet/LineStatus' };
var twitter = { host: 'api.twitter.com',
                path: '/1/statuses/user_timeline.rss?screen_name=nwhite89' };

var request = http.get(options).on('response', function(response) {
  response.setEncoding('utf8');
  var xml = new XmlStream(response);
  xml.on('updateElement: item', function(item) {
    item.title = item.title.match(/^[^:]+/)[0] + ' on ' +
                 item.pubDate.replace(/ +[0-9]{4}/, '');
  });
  xml.on('text: item > pubDate', function(element) {
    element.$text = element.$text;
  });
  xml.on('data', function(data) {
    process.stdout.write(data);
  });
});
What I don't understand is that using twitter works fine and outputs at the xml.on('data') part, but using options (cloud.tfl.gov.uk) nothing outputs; even if I put console.log("hi") inside the data handler it doesn't get executed.
I know the URL is correct: outputting console.log(xml) or console.log(response) after creating the xml variable shows that it has connected. Any help would be greatly appreciated; I have been stuck on this for a good 2 days now.
There is a byte order mark before the <?xml tag, which trips xml-stream up and stops it from reading the encoding from the tag. That means you need to provide the encoding yourself.
Instead of this:
response.setEncoding('utf8');
var xml = new XmlStream(response);
Just do this:
response.setEncoding('utf8');
var xml = new XmlStream(response, 'utf8');
And really, setting the encoding on the stream is optional.
var xml = new XmlStream(response, 'utf8');
works just fine.
More info here: http://en.wikipedia.org/wiki/Byte_order_mark#UTF-8
If you look at the buffer emitted from response rather than xml, the buffer starts with
<Buffer ef bb bf 3c 3f 78 6d ...>
The first 3 bytes are the byte order mark for utf8, and after that you have the start of the tag. xml-stream expects the <?xml tag to have only whitespace between it and the start of the file, but byte order marks don't count as whitespace.
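If you ever need to strip the mark yourself, note that once the bytes are decoded as utf8 the mark shows up as a single U+FEFF character at the start of the string. A small sketch (not needed once the encoding is passed explicitly):
function stripBom(text) {
  // A UTF-8 byte order mark, decoded as utf8, becomes U+FEFF at position 0.
  return text.charCodeAt(0) === 0xFEFF ? text.slice(1) : text;
}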
