fs.readFileSync always returns empty string - javascript

I have a script that writes data from an API to some files. I have an object that contains the file descriptors for each file:
var fs = require('fs');
var mkdirp = require('mkdirp');
var moment = require('moment');

var csvFds = {
  'file1': null,
  'file2': null,
  'file3': null,
  'file4': null
};

for (var file in csvFds) {
  var dirPath = __dirname + '/files/' + file;
  // create the directory if it doesn't exist yet
  try {
    fs.statSync(dirPath);
  }
  catch (e) {
    mkdirp.sync(dirPath, {mode: 0755});
  }
  // open (or create) a timestamped csv file in append/read mode
  csvFds[file] = fs.openSync(dirPath + '/' + moment().format("YYYY-MM-DDTHH:mm:ss[Z]") + '.csv', 'a+');
}
Then I have some code that uses fs.write to write lines of CSV to the file in batches. That part is working fine; I have well-formed CSV files. Now I need to read the contents of the entire file as a string. This is how I'm trying to do it:
fs.readFileSync(csvFds['file1']).toString();
But for some reason I am always getting an empty string. I have confirmed that fs.readFileSync is in fact returning a Buffer by using console.log and dropping the toString() method.
I'm really stuck on this so any help will be greatly appreciated. Thanks in advance. Here's some additional info regarding my node version and OS:
$ node -v
v6.2.3-pre
$ uname -a
Darwin i-2.local 14.5.0 Darwin Kernel Version 14.5.0: Thu Jun 16 19:58:21 PDT 2016; root:xnu-2782.50.4~1/RELEASE_X86_64 x86_64

For others who have the same problem:
For me, the only way to get a non-empty string was to use fs.readFile instead of fs.readFileSync.
I use a Mac and I was trying to read a file that Node itself had created. If I read another file, it works.
fs.readFile(file, (err, data) => {
  if (err) {
    console.log(err)
    throw err
  } else {
    let file_content = data.toString('utf8')
    // your code here
  }
})

Try calling readFileSync like this: readFileSync(csvFds['file1'], 'utf-8'). It ought to return a string. Or you can omit the argument and then provide the encoding when calling the toString method, e.g. readFileSync(csvFds['file1']).toString('utf-8').
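If passing the descriptor still gives you an empty string, a fallback worth trying is to read by path instead of by descriptor, so the read does not depend on the descriptor's current position. A minimal sketch, assuming you keep the path you passed to fs.openSync around (the csvPath variable here is hypothetical):
var fs = require('fs');

// hypothetical: the same path string that was passed to fs.openSync earlier
var csvPath = __dirname + '/files/file1/example.csv';

// reading by path with an explicit encoding returns a string directly
var contents = fs.readFileSync(csvPath, 'utf-8');
console.log(contents.length);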

Related

Ionic 3 Prod Build With Version Number

I use the following command when building an ionic project for desktop
ionic cordova build browser --prod
Which results in the following file being generated
build/main.js
However, I would like to be able to add a version number to the generated file automatically as part of the build process, so I would end up with something like
build/main.js?version=1.00
so as to avoid needing to clear the browser cache after every prod build.
Is there a flag for this, or is it something I must do manually?
Any advice would be great!
EDIT:
My solution is on GitHub for anyone interested!
https://github.com/RichardM99/ionic-3-version-build-file-hook
Here's some advice: you can create a Cordova hook.
Hooks are scripts that you want to be executed at different stages of the build process. In your case, you are looking at a script which renames the main.js file after the build event is finished, or in other words an 'after_build' type hook.
The script will usually be a Node.js file, although you can have other types of scripts executed as well.
One more thing: since you want to get around the cache, you won't be renaming the file itself. What you will want to do instead is replace the reference to "main.js" in your "index.html" so that it includes a random value, or maybe your actual version number.
I have pointed you in a direction, but won't spoonfeed. Look up the documentation on Cordova hooks. They are super simple if you understand JavaScript/Node.
Something like this should get the job done:
var fs = require('fs');

var index_orig = fs.readFileSync(pathToIndexHtml, 'utf8');
var index_new = index_orig.replace("main.js", "main.js?version=" + version_num);
fs.writeFileSync(pathToIndexHtml, index_new, 'utf8');
If you want the actual build number, you can read your config.xml and parse it to get its value.
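For reference, here is a rough sketch of what such an after_build hook could look like; the hooks/after_build/020_add_version.js location, the platforms/browser/www path and the config.xml regex are my assumptions, so adjust them to your project:
#!/usr/bin/env node

// hypothetical hook script: hooks/after_build/020_add_version.js
var fs = require('fs');
var path = require('path');

// pull the version attribute out of config.xml (simple regex, assumed format)
var config = fs.readFileSync(path.join(__dirname, '../../config.xml'), 'utf8');
var match = config.match(/widget[^>]*version="([^"]+)"/);
var version = match ? match[1] : Date.now();

// rewrite the script reference in the generated index.html (assumed browser build path)
var indexPath = path.join(__dirname, '../../platforms/browser/www/index.html');
var html = fs.readFileSync(indexPath, 'utf8');
fs.writeFileSync(indexPath, html.replace('main.js', 'main.js?version=' + version), 'utf8');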
Hope it helps.
I wrote a blog post about this a long time ago.
In my build pipeline I have a command to set the version:
version "$(app.versionPrefix)$(Build.BuildNumber)"
$(app.versionPrefix) is the version prefix, such as 0.1.
$(Build.BuildNumber) is the build number.
Then I have an environment file:
export const environment = {
  apiUrl: 'https://....',
  production: true,
  version: '0.0.57'
}
Then I have a JS script to update the version in the environment file and config.xml:
var replace = require('replace-in-file');
var package = require('./package.json');
var buildVersion = package.version;

// update the version attribute in config.xml
const options = {
  files: ['config.xml'],
  from: /" version="([0-9]*.[0-9]*.[0-9]*)"/g,
  to: '" version="' + buildVersion + '"',
  allowEmptyPaths: false,
};

// update the version field in the production environment file
const optionsEnv = {
  files: ['src/environments/environment.prod.ts'],
  from: /version: '(.*)'/g,
  to: "version: '" + buildVersion + "'",
  allowEmptyPaths: false,
};

try {
  let changedFiles = replace.sync(options);
  if (changedFiles.length === 0) {
    throw "Please make sure that file '" + options.files + "' has \"version: ''\"";
  }
  changedFiles = replace.sync(optionsEnv);
  if (changedFiles.length === 0) {
    throw "Please make sure that file '" + optionsEnv.files + "' has \"version: ''\"";
  }
  console.log('Build version set: "' + options.to + '"');
}
catch (error) {
  console.error('Error occurred:', error);
  throw error;
}
NOTE: you need to install the replace-in-file package.
Then in the build pipeline I run this script:
node ./replace.build.js
In your case, if you only need the browser build, you can adapt the script.

Improper parsing of strings

I'm trying to convert ANSI color codes from console output into HTML. I have found a script to do this, but I can't seem to make it parse the strings inside Node.js. I have tried to JSON.stringify it to also include special chars, but it's not working.
forever list
[32minfo[39m: Forever processes running
[90mscript[39m [37mforever[39m [37mpid[39m [37mid[39m
[90mdata[39m: [37m [39m [37muid[39m [90mcommand[39m
I get output like this back from ssh2shell in Node.js. I have a script:
https://github.com/pixelb/scripts/blob/master/scripts/ansi2html.sh
This is supposed to convert the above to HTML and add the appropriate color codes. It works fine with normal terminal output, for example:
npm install --color=always | ansi2html.sh > npminstall.html
This is the raw output on the Linux machine piped to a file. It seems the JS strings are missing these escapes when they are shown in console.log, but they are also missing newlines there. Perhaps it's because I'm concatenating them directly into the string and that strips the special chars?
total 24
-rwxr-xr-x 1 admin admin 17002 May 13 02:52 ^[[0m^[[38;5;34mansi2html.sh^[[0m
drwxr-xr-x 4 admin admin 4096 May 13 00:00 ^[[38;5;27mgit^[[0m
-rw-r--r-- 1 admin admin 0 May 13 02:57 ls.html
Hopefully some of this makes sense.
Thanks
There are a couple of filters that SSH2shell applies to the output from commands. The first removes non-standard ASCII from the response and then the colour formatting codes are removed.
In v1.6.0 I have added pipe()/unpipe(), the events for both and exposed the stream.on('data', function(data){}) event so you can access the stream output directly without SSH2shell interacting with it in any way.
This should resolve the problem of not getting the right output from SSH2shell by giving you access to the raw data.
var fs = require('fs')

var host = {
  server: {
    host: 'mydomain.com',
    port: 22,
    userName: 'user',
    password: 'password'
  },
  commands: [
    "`Test session text message: passed`",
    "msg:console test notification: passed",
    "ls -la"
  ],
}

//until npm published use the cloned dir path.
var SSH2Shell = require('ssh2shell')

//run the commands in the shell session
var SSH = new SSH2Shell(host),
  callback = function (sessionText) {
    console.log("-----Callback session text:\n" + sessionText);
    console.log("-----Callback end");
  },
  firstLog = fs.createWriteStream('first.log'),
  secondLog = fs.createWriteStream('second.log'),
  buffer = ""

//multiple pipes can be added but they won't be bound to the stream until the connection is established
SSH.pipe(firstLog).pipe(secondLog);

SSH.on('data', function (data) {
  //do something with the data chunk
  console.log(data)
})

SSH.connect(callback)
Have you tried this?
https://github.com/hughsk/ansi-html-stream
var spawn = require('child_process').spawn
  , ansi = require('ansi-html-stream')
  , fs = require('fs')

var npm = spawn('npm', ['install', 'browserify', '--color', 'always'], {
  cwd: process.cwd()
})

var stream = ansi({ chunked: false })
  , file = fs.createWriteStream('browserify.html', 'utf8')

npm.stdout.pipe(stream)
npm.stderr.pipe(stream)
stream.pipe(file, { end: false })

stream.once('end', function() {
  file.end('</pre>\n')
})

file.write('<pre>\n');
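Combining this with the ssh2shell answer above, a rough sketch (it assumes the v1.6.0 'data' event exposes the raw output, and simply replaces the console.log/connect wiring from that example) could look like:
var fs = require('fs')
var ansi = require('ansi-html-stream')

var stream = ansi({ chunked: false })
var file = fs.createWriteStream('forever-list.html', 'utf8')

stream.pipe(file, { end: false })
stream.once('end', function () {
  file.end('</pre>\n')
})
file.write('<pre>\n')

// SSH is the SSH2Shell instance from the example above, with the raw
// (unfiltered) output exposed through the 'data' event
SSH.on('data', function (data) {
  stream.write(data)
})

SSH.connect(function (sessionText) {
  // session finished: end the html stream so the closing </pre> gets written
  stream.end()
})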

How do I redirect asynchronous output to a file?

I have a Node.js script that reads the contents of a file, does some transformations on its contents, and logs the output:
var transformer = require('./transformer'),
    fs = require('fs'),
    file = process.argv[2];

if (!file) {
  throw 'no file specified\n';
}

fs.readFile(file, 'utf-8', function (err, data) {
  if (err) {
    throw err;
  }
  transformer.transform(data, function (text) {
    console.log(text);
  });
});
This works fine:
$ node transform.js myfile.txt
And this works:
$ node transform.js myfile.txt > anotherfile.txt
But, when I try to redirect the output to the same file I'm reading from, the file becomes blank:
$ node transform.js myfile.txt > myfile.txt
Same thing using tee:
$ node transform.js myfile.txt | tee myfile.txt
Curiously, this works:
$ node transform.js myfile.txt >> myfile.txt
But I don't want to append to the file - I want to overwrite its contents.
I think the problem is, since fs.readFile is asynchronous, console.log is called asynchronously as well - i.e., it gets chunks of data as opposed to all the data at once. I think I can use fs.readFileSync instead, but what's the right way to handle this?
The issue is not actually within Node but in the shell. When you redirect with >, the first thing the shell does is open the file for writing, emptying the file. Your program goes to read from that empty file and, in your case, empty input means empty output.
This too will result in an empty file regardless of the initial contents of myfile.txt:
$ cat myfile.txt > myfile.txt
One solution would be to write the output file inside the Node script rather than using shell redirection. You're already specifying and reading the input file there, so why not specify an output file in argv as well and write to it? Just take care to structure your code so that reading and writing to the same file works.
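A minimal sketch of that approach (the second argv argument and the fs.writeFile call are my assumptions about the wiring, not part of the original script):
var transformer = require('./transformer'),
    fs = require('fs'),
    inFile = process.argv[2],
    outFile = process.argv[3] || inFile;  // default to overwriting the input

if (!inFile) {
  throw 'no file specified\n';
}

fs.readFile(inFile, 'utf-8', function (err, data) {
  if (err) {
    throw err;
  }
  transformer.transform(data, function (text) {
    // the whole input is already buffered in memory at this point,
    // so writing back to the same path is safe
    fs.writeFile(outFile, text, 'utf-8', function (err) {
      if (err) {
        throw err;
      }
    });
  });
});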
As @slebetman notes in a comment, another solution is cat myfile.txt > tmp; mv tmp myfile.txt (or my preferred: cat myfile.txt > tmp && mv tmp myfile.txt).
The problem is:
you're opening the file for read,
then opening the file for write (emptying it),
then reading from an empty file,
transforming nothing,
and writing nothing.
What I think you want instead is to:
open for read
read and buffer
transform
open for write
write
There are a couple of ways to do this:
1) Read the file synchronously. Node.js 0.12 supports this.
var transformer = require('./transformer'),
    fs = require('fs'),
    file = process.argv[2];

if (!file) {
  throw 'no file specified\n';
}

// readFileSync returns the contents directly; there is no callback
var data = fs.readFileSync(file, 'utf-8');
transformer.transform(data, function (text) {
  console.log(text);
});
2) Use "streams"
This is really the best way, especially if you want to learn Node.js.
The best way I know to learn about streams is from NodeSchool: http://nodeschool.io/#workshoppers. Try the stream-adventure workshop.
By the end, you'll own these kinds of problems.
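In the meantime, here is a rough sketch of the stream approach (the upper-casing Transform is just a stand-in for your transformer module, writing to a temp file and renaming is my way of safely overwriting the input, and it assumes a Node version with the simplified stream.Transform constructor):
var fs = require('fs'),
    stream = require('stream'),
    file = process.argv[2],
    tmp = file + '.tmp';

// stand-in transform: upper-cases each chunk as it flows through
var transform = new stream.Transform({
  transform: function (chunk, encoding, callback) {
    callback(null, chunk.toString('utf-8').toUpperCase());
  }
});

fs.createReadStream(file)
  .pipe(transform)
  .pipe(fs.createWriteStream(tmp))
  .on('finish', function () {
    // only replace the original once the temp file is fully written
    fs.rename(tmp, file, function (err) {
      if (err) throw err;
    });
  });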
Good luck!

Get MIME type of a file without extension in Node.js

Given I have a file without an extension appended to its name, e.g. images/cat_photo.
Is there a method in Node.js to extract the MIME type of a given file? The mime module does not work in this case.
Yes, there is a module called mmmagic. It does its best to guess the MIME type of a file by analysing its content.
The code will look like this (taken from its example):
var mmm = require('mmmagic');

var magic = new mmm.Magic(mmm.MAGIC_MIME_TYPE);
magic.detectFile('node_modules/mmmagic/build/Release/magic.node', function(err, result) {
  if (err) throw err;
  console.log(result);
});
But keep in mind that guessing a MIME type may not always lead to the right answer.
Feel free to read up on file type signatures on a wiki page.
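If you already have the file's contents in memory, mmmagic can also inspect a Buffer directly. A small sketch (the detect call is how I understand its API; images/cat_photo is the example path from the question):
var fs = require('fs');
var mmm = require('mmmagic');

var magic = new mmm.Magic(mmm.MAGIC_MIME_TYPE);

// read the extension-less file and let mmmagic sniff its magic bytes
fs.readFile('images/cat_photo', function (err, buffer) {
  if (err) throw err;
  magic.detect(buffer, function (err, result) {
    if (err) throw err;
    console.log(result); // e.g. image/jpeg
  });
});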
Another possibility is to use the exec or execSync function to run the 'file' command on a Linux OS:
/**
 * Get the file MIME type from a path. No extension required.
 * @param filePath Path to the file
 */
function getMimeFromPath(filePath) {
  const execSync = require('child_process').execSync;
  const mimeType = execSync('file --mime-type -b "' + filePath + '"').toString();
  return mimeType.trim();
}
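For example, calling it on the extension-less file from the question (assuming that path exists) prints something like image/jpeg:
console.log(getMimeFromPath('images/cat_photo'));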
However, this is not the best solution since it only works on Linux. For running this on Windows, check this Superuser question:
https://superuser.com/questions/272338/what-is-the-equivalent-to-the-linux-file-command-for-windows
Greetings.
You could simply use String.prototype.split() and then take the last element of the array, which will be the file extension. You can take the last element of the array using the pop method:
const extension = fileName.split('.').pop()
Or, if you already have a MIME type string:
const type = mimeType.split('/')
Then type[1] will have the subtype.

With gjs, how can I write Soup.Buffer chunks of data to a file?

I'm writing a GTK javascript program that downloads a file and writes it to disk. Here's what my code looks like:
const Gio = imports.gi.Gio;
const Soup = imports.gi.Soup;
const Lang = imports.lang;

// start an http session to make http requests
let _httpSession = new Soup.SessionAsync();
Soup.Session.prototype.add_feature.call(_httpSession, new Soup.ProxyResolverDefault());

// open the file
let file = Gio.file_new_for_path(path);
let fstream = file.replace(null, false, Gio.FileCreateFlags.NONE, null);

// start the download
let request = Soup.Message.new('GET', url);
request.connect('got_chunk', Lang.bind(this, function(message, chunk) {
  // write each chunk to file
  fstream.write(chunk, chunk.length, null);
}));

_httpSession.queue_message(request, function(_httpSession, message) {
  // close the file
  fstream.close(null);
});
I get an error on the fstream.write() line:
JS ERROR: !!! Exception was: Error: Unhandled GType GCancellable unpacking GArgument from Number
JS ERROR: !!! message = '"Unhandled GType GCancellable unpacking GArgument from Number"'
JS ERROR: !!! fileName = '"./torbrowser-launcher"'
JS ERROR: !!! lineNumber = '402'
JS ERROR: !!! stack = '"([object _private_Soup_Message],[object _private_Soup_Buffer])#./torbrowser-launcher:402
("2.3.25-2")#./torbrowser-launcher:122
wrapper("2.3.25-2")#/usr/share/gjs-1.0/lang.js:204
("2.3.25-2")#/usr/share/gjs-1.0/lang.js:145
("2.3.25-2")#/usr/share/gjs-1.0/lang.js:239
#./torbrowser-launcher:489
"'
The only reference to this error that I can find is in this thread: https://mail.gnome.org/archives/gnome-shell-list/2012-July/msg00126.html
That person ended up giving up and porting his code to python.
I'm also confused by what the 'got_chunk' callback passes. The chunk field is a Soup.Buffer (http://www.roojs.com/seed/gir-1.2-gtk-3.0/gjs/Soup.Buffer.html). I can get its length with chunk.length, but when I try printing chunk.data it's undefined. When I just print chunk it prints: [object _private_Soup_Buffer].
fstream is a Gio.FileOutputStream (http://www.roojs.com/seed/gir-1.2-gtk-3.0/gjs/Gio.FileOutputStream.html). The write method is: write(String buffer, guint32 count, Cancellable cancellable), and cancellable is optional. Weirdly enough, if I replace the write line with this I still get the exact same error:
fstream.write('test ', 5, null);
I was hitting exactly the same problem. After a lot of trial and error, it boiled down to two issues with the write() call:
It seems that the documentation of the write function you are using (http://www.roojs.com/seed/gir-1.2-gtk-3.0/gjs/Gio.FileOutputStream.html) is wrong; the write method signature is (as far as I can tell):
write(String buffer, Cancellable cancellable, guint32 count)
Yet if you just use fstream.write(chunk, null, chunk.length); you will write a file full of zeros. I don't know why (something to do with the way GJS binds to the underlying C library) but you should use chunk.get_data() instead of just chunk. I.e. replace the write call in your code with:
fstream.write(chunk.get_data(), null, chunk.length);
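Putting both fixes together, the chunk handler from the question becomes something like this (just a sketch; the rest of the download code stays as it was):
request.connect('got_chunk', Lang.bind(this, function(message, chunk) {
  // raw bytes first, then the (optional) cancellable, then the byte count
  fstream.write(chunk.get_data(), null, chunk.length);
}));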
