JavaScript: Read text file returns undefined - javascript

I have this code which should read the current contents of a text file, add a new line, and write a new sentence on that line. It writes the sentence, but the line before it is just undefined.
const data = msg.author.tag + " bejelentkezett a bottal. " + d.toLocaleTimeString();
const currenttext = fs.readFile('output.txt', 'utf8', function(err, contents) {
    fs.writeFile('output.txt', currenttext + '\n' + data, (err) => {
        if (err) throw err;
    });
});

readFile does not return anything. Instead, the contents are passed to your callback in the variable you named contents.
So change
fs.writeFile('output.txt', currenttext + '\n' + data, (err) => {
to
fs.writeFile('output.txt', contents+ '\n' + data, (err) => {
As a side note, your error handling is a mess. Throwing an error from within a callback just loses the error. Consider using the promise API with async/await instead to simplify your code, or update your callbacks: https://nodejs.org/api/fs.html#fs_fs_promises_api
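For example, a minimal sketch of the same read-then-append flow with the promise API might look like this (output.txt and the data string are taken from the question; everything else is illustrative):
const fsp = require('fs').promises;

async function appendLine(data) {
    try {
        const contents = await fsp.readFile('output.txt', 'utf8'); // wait for the current contents
        await fsp.writeFile('output.txt', contents + '\n' + data); // then write them back plus the new line
    } catch (err) {
        console.error('Failed to update output.txt:', err); // errors end up here instead of being lost
    }
}
If the goal is only to add a line at the end of the file, fsp.appendFile('output.txt', '\n' + data) would avoid the read entirely.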

Related

critical section in nodejs - writing data from frontend to json file

I have the following question: do I have to synchronize requests when multiple users write data to a file at the same time? In Java there is a synchronized keyword, but I don't understand what the equivalent looks like in Node.js.
app.post("/register",function(req,res){
fs.readFile( __dirname + "/" + "users.json", 'utf8', function (err, data) {
var users=JSON.parse(data);
users["users"].push(req.body);
var check=false;
fs.writeFile(__dirname + "/" + "users.json", JSON.stringify(users),'utf8', function (err) {
if(err){
res.send(JSON.stringify("error"));
check=true;
}
})
if(!check){
res.send(JSON.stringify("Created"));
}
})
})
You should use writeFileSync instead of writeFile.
It returns nothing and throws on failure, so if execution reaches the next line, it means the file was written successfully.
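For illustration, a minimal sketch of the route rewritten with the synchronous calls (assuming the same users.json structure as in the question) could look like this:
app.post("/register", function(req, res) {
    try {
        var users = JSON.parse(fs.readFileSync(__dirname + "/users.json", 'utf8'));
        users["users"].push(req.body);
        fs.writeFileSync(__dirname + "/users.json", JSON.stringify(users), 'utf8'); // throws if the write fails
        res.send(JSON.stringify("Created"));
    } catch (err) {
        res.send(JSON.stringify("error"));
    }
});
Because the synchronous calls block the event loop, the read-modify-write sequence for one request finishes before the next request is handled, which is what serializes the writes within a single process.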

Using FS to write new files if a certain url is found and remove the file if it's not found anymore

I'm trying to write a script that, when a new url is found, turns the url into a hash. If a file for that hash has already been written it should just be ignored, and if it's not known from earlier it should be added.
needle.get(mainUrl, function(err, res) {
    if (err) throw err;
    if (res.statusCode == 200 && !err) {
        var $ = cheerio.load(res.body)
        var href = $('div div a').each(function(index, element) {
            urlList.push($(element).attr("href"))
            var url = ($(element).attr("href"))
            var hash = crypto.createHash('md5').update(url).digest('hex');
            fs.writeFile('./directory/otherdirectory' + `${hash}`, url, (err) => {
                if (err) throw err;
                console.log('Hash created: ' + url + ' saved as ' + hash);
            });
        });
    }
})
This is what I've done so far, but it only writes new files. It doesn't check whether a file has already been added, and it doesn't remove files that are no longer found.
So what I'm trying to do:
I've written a script that fetches a website for urls.
Hash all the urls.
Have fs check whether a file has already been written; if it has, just ignore it.
If it is not known from earlier, add it as a new file.
If a url is no longer found when fetching, delete its file from the list.
I think this might be an X/Y problem, and I'm still awaiting the answer to my comment about that.
With that said, you can simply ignore existing files using fs.existsSync: if it returns true, skip saving the current file, otherwise save it. And to remove files that are no longer available, get all the files in the directory using fs.readdir and remove those whose urls are not in the response using fs.unlink:
needle.get(mainUrl, (err, res) => {
    if (err) throw err;
    if (res.statusCode == 200) {
        let $ = cheerio.load(res.body);
        let hashes = []; // list of hashes for this website (used later to keep only the items that are still available)
        $('div div a').each((index, element) => {
            let url = $(element).attr("href");
            let hash = crypto.createHash('md5').update(url).digest('hex');
            hashes.push(hash); // store the hash of the current url
            if (!fs.existsSync('./directory/otherdirectory/' + hash)) { // if this file doesn't exist (notice the "not" operator ! before fs.existsSync)
                fs.writeFile('./directory/otherdirectory/' + hash, url, err => { // save it
                    if (err) throw err;
                    console.log('Hash created: ' + url + ' saved as ' + hash);
                });
            }
        });
        fs.readdir('./directory/otherdirectory', (err, files) => { // get a list of all the files in the directory
            if (err) throw err;
            files.forEach(file => { // and for each file
                if (!hashes.includes(file)) { // if it was not encountered above (meaning it doesn't exist in the hashes array)
                    fs.unlink('./directory/otherdirectory/' + file, err => { // remove it
                        if (err) throw err;
                    });
                }
            });
        });
    }
});
Another approach:
Since you only seem to want to store the urls, the best way to do so would be to use a single file to store them all instead of storing each url in its own file. Something like this is more efficient:
needle.get(mainUrl, (err, res) => {
    if (err) throw err;
    if (res.statusCode == 200) {
        let $ = cheerio.load(res.body);
        let urls = $('div div a') // get the 'a' elements
            .map((index, element) => $(element).attr("href")) // map each one into its href attribute
            .get(); // and get them as an array
        fs.writeFile('./directory/list-of-urls', urls.join('\n'), err => { // then save all the urls encountered in the file 'list-of-urls' (each on its own line, hence the join('\n'))
            if (err) throw err;
            console.log('saved all the urls to the file "list-of-urls"');
        });
    }
});
That way old urls are removed automatically as the file gets overwritten each time, and new urls are added automatically. There's no need to check whether a url has already been encountered, because it will get re-saved anyway.
And if you want to get the list of urls somewhere else, just read the file and split it by '\n' like so:
fs.readFile('./directory/list-of-urls', 'utf8', (err, data) => {
    if (err) throw err;
    let urls = data.split('\n');
    // use urls here
});

appendFile() runs before readFile() even though appendFile() is chronologically after the readFile()

I am trying to write code that reads a file, counts the lines in it, and then adds another line with the line number at the beginning. Like an index, basically. The problem is that fs.appendFile() starts running before fs.readFile() is finished, and I am not sure why. Is there something I am doing wrong?
My code:
fs.readFile('list.txt', 'utf-8', (err, data) => {
    if (err) throw err;
    lines = data.split(/\r\n|\r|\n/).length - 1;
    console.log("Im supposed to run first");
});
console.log("Im supposed to run second");
fs.appendFile('list.txt', '[' + lines + ']' + item + '\n', function(err) {
    if (err) throw err;
    console.log('List updated!');
    fs.readFile('list.txt', 'utf-8', (err, data) => {
        if (err) throw err;
        // Converting raw Buffer data to text using the toString function.
        message.channel.send('List was updated successfully! New list: \n' + data.toString());
        console.log(data);
    });
});
My output:
Im supposed to run second
List updated!
Im supposed to run first
[0]first item
Currently, you are using readFile and appendFile. Both of these functions are asynchronous, so they run concurrently and invoke their callbacks whenever they complete.
If you'd like to run these synchronously, you can use the fs.readFileSync and fs.appendFileSync methods to synchronously read and append to the files.
Therefore, with something like the following:
const readFileData = fs.readFileSync("list.txt");
fs.appendFileSync('list.txt', '[' + lines + ']' + item + '\n');
The first line of code will run, then the second line of code.
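Putting that together, a minimal sketch of the whole flow with the synchronous calls (reusing the item and message variables from the question) might be:
const data = fs.readFileSync('list.txt', 'utf-8');              // runs first
const lines = data.split(/\r\n|\r|\n/).length - 1;              // count the existing lines
fs.appendFileSync('list.txt', '[' + lines + ']' + item + '\n'); // runs second, only after the read
const updated = fs.readFileSync('list.txt', 'utf-8');
message.channel.send('List was updated successfully! New list: \n' + updated);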
The functions you are using are asynchronous, so the response of the second function can be received before the response of the first one.
fs.readFile('list.txt', 'utf-8', (err, data) => {
    if (err) throw err;
    lines = data.split(/\r\n|\r|\n/).length - 1;
    console.log("Im supposed to run first");
    appendFile(lines);
});

let appendFile = (lines) => {
    fs.appendFile('list.txt', '[' + lines + ']' + item + '\n', function(err) {
        console.log("Im supposed to run second");
        if (err) throw err;
        console.log('List updated!');
        fs.readFile('list.txt', 'utf-8', (err, data) => {
            if (err) throw err;
            // Converting raw Buffer data to text using the toString function.
            message.channel.send('List was updated successfully! New list: \n' + data.toString());
            console.log(data);
        });
    });
}

"Can't wait without a fiber" error when querying db?

I have the following code that reads a CSV file and then pulls a document from the database:
fs.readFile process.env.PWD + '/data/errorports.csv', 'utf8', (err, data) ->
  if err
    console.log "Error reading csv", err
    return
  rows = data.split('\n')
  for row in rows
    columns = row.split(',')
    airportCode = columns[0]
    airport = Airports.findOne({_id: airportCode})
    console.log 'airport:', airport
But when I call Airports.findOne({_id: airportCode}) it throws the error:
/Users/abemiessler/.meteor/packages/meteor-tool/.1.3.4.19lp8gr++os.osx.x86_64+web.browser+web.cordova/mt-os.osx.x86_64/dev_bundle/server-lib/node_modules/fibers/future.js:159
throw new Error('Can\'t wait without a fiber');
^
Error: Can't wait without a fiber
Can anyone see why I would be getting this error? Any suggestions on how to get around it?
You probably need to make use of Meteor.wrapAsync() to do this (to make your async function run inside a Fiber, which will allow you to execute your Meteor code inside it). Here is an example.
var syncReadFile = Meteor.wrapAsync(fs.readFile);
syncReadFile(process.env.PWD + '/data/errorports.csv', 'utf8', function(err, data) {
    if (err) {
        console.log("Error reading csv", err);
        return;
    }
    rows = data.split('\n');
    for (var row of rows) {
        columns = row.split(',');
        airportCode = columns[0];
        airport = Airports.findOne({_id: airportCode});
        console.log('airport:', airport);
    }
});
Or you can try to wrap your callback in Meteor.bindEnvironment(). It would be something like:
fs.readFile(process.env.PWD + '/data/errorports.csv', 'utf8',
    Meteor.bindEnvironment(function(err, data) {
        // ..your code..
    })
);
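As a sketch only, with the placeholder filled in by the question's own CSV loop, it might look like this; because the callback is bound to the Meteor environment, the Airports.findOne call no longer runs outside a fiber:
fs.readFile(process.env.PWD + '/data/errorports.csv', 'utf8',
    Meteor.bindEnvironment(function(err, data) {
        if (err) {
            console.log("Error reading csv", err);
            return;
        }
        var rows = data.split('\n');
        rows.forEach(function(row) {
            var airportCode = row.split(',')[0];
            var airport = Airports.findOne({_id: airportCode}); // safe here: the callback runs in the bound environment
            console.log('airport:', airport);
        });
    })
);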

Getting 'undefined' After fs.stat

I have a function that tests whether a file exists or not before editing it. I use fs.stat.
fs.stat('../fill/bower.json', function (err, stats) {
    if (err) {
        console.log('You don\'t have a ' + clc.red('bower.json') + ' file! Type ' + clc.bgBlack.white('touch bower.json') + ' to get started.');
        return;
    }
    if (stats.isFile()) {
        var json = JSON.parse(fs.readFileSync('../bower.json', 'utf8')),
            string = '\n Dependencies: ' + json;
        fs.writeFile('../fill/README.md,', string, 'utf8');
        console.log('it\'s saved!');
    }
})
However, every time I run it (bower.json doesn't exist on purpose), it prints undefined before "You don't have a bower.json file!". Why does this happen, and how can I stop the function from printing undefined?
Edit: for reference, here's my terminal window after running the command:
Why is undefined printed, and what do I do to have that not be displayed?
You're returning nothing (i.e. undefined) from your reading function, and that return value is what gets printed.
Gist for posterity
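A minimal sketch of one way to avoid relying on the return value, using a hypothetical checkBowerFile helper that reports the result through a callback, might look like this:
function checkBowerFile(done) {
    fs.stat('../fill/bower.json', function (err, stats) {
        if (err) {
            console.log('You don\'t have a bower.json file!');
            return done(false); // report the result instead of returning undefined to the caller
        }
        done(stats.isFile());
    });
}

checkBowerFile(function (exists) {
    console.log('bower.json exists:', exists);
});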
