Asynchronous writeFile does not seem to work in a module - javascript

The bot used to work flawlessly, but I thought it would be good to split my code since it was getting too large. I've fixed most of the errors that popped up, but I could not, for the life of me, get the required modules to write to file. A module seems to keep a cached value if it's something like a counter (as in the slots game I made), but a command like profile still shows the value stored in the JSON file, unaffected by the game (even though the count inside the game is changing). When I restart the bot, the value in the game reverts to the value in the JSON file, so I'm assuming the fs.writeFile I placed in the module isn't succeeding and the module is only holding a temporary cache.
const Discord = require('discord.js');
const fs = require("fs");

let points = JSON.parse(fs.readFileSync("./points.json", "utf8"));
let extrasettings = JSON.parse(fs.readFileSync("./extra.json", "utf8"));
let tempgamefiles = JSON.parse(fs.readFileSync("./temporarygamefiles.json", "utf8"));

module.exports = (bot, message, args) => {
  // asynchronous method of writing to file, however not really working
  fs.writeFile("./extra.json", JSON.stringify(extrasettings), (err) => {
    if (err) console.error(err);
  });
}
The file path is correct, and the exact same code used to work fine in one single ginormous file. I don't know how splitting it into modules would break it.

Related

Node-FTP duplicating operations upon uploading a file

As there is such a thing as 'callback hell': it was the only way I could get a file from a server to my VPS and upload it back. The process was simple:
Download a .json file from the FTP server
Edit the .json file on the PC
Upload the .json file and delete the PC's copy.
However, my problem is this: although it downloads once, it repeats the upload based on how many times I've run the command during one session (command #1 does it once, command #2 does it twice, etc.).
I tried to write it imperatively, but that gets nullified. I had to resort to callback hell to run the code almost properly. The trigger initializes the command fine, but the command and session are goofed.
((
  // declaring my variables as parameters
  ftp = new (require('ftp'))(),
  fs = require('fs'),
  serverFolder = './Path/Of/Server/',
  localFolder = './Path/Of/Local/',
  file = 'some.json',
  { log } = console
) => {
  // run once the connection is ready
  ftp.on('ready', () => {
    // collect a list of files from the server folder
    ftp.list(serverFolder + file, (errList, list) =>
      errList || typeof list === 'object' &&
      list.forEach($file =>
        // if the individual file matches, resume to download the file
        $file.name === file && (
          ftp.get(serverFolder + file, (errGet, stream) =>
            errGet || (
              log('files matched! carry on to the operation...'),
              stream.pipe(fs.createWriteStream(localFolder + file)),
              stream.once('close', () => {
                // check if the file has a proper size
                fs.stat(localFolder + file, (errStat, stat) =>
                  errStat || stat.size === 0
                    // will destroy the server connection if bytes = 0
                    ? (ftp.destroy(), log('the file has no value'))
                    // uploads if the file has a size: edits, and ships
                    : (
                        editThisFile(),
                        ftp.put(
                          fs.createReadStream(localFolder + file),
                          serverFolder + file,
                          err => err || (ftp.end(), log('process is complete!'))
                        )
                        // editThisFile() is a place-holder editor
                        // edits by path, and object
                      )
                );
              })
            )
          )
        )
      )
    );
  });
  ftp.connect({
    host: 'localHost',
    password: '1Forrest1!',
    port: '21',
    keepalive: 0,
    debug: console.log.bind(console)
  });
})()
The main problem is that it repeats the command's upload over and over as 'carry-over' for some reason.
Edit: although the merits of "programming style" differ from the common meta, it all leads to the same issue of callback hell. Any recommendations are welcome.
For readability, I had help editing my code to ease the difficulty: Better Readability version
The ftp module's API leads to the callback hell. It also hasn't been maintained for a while and is buggy. Try a module with promises, like basic-ftp.
With promises the code flow becomes much easier to reason about, and errors don't require specific handling unless you want it.
const ftp = require('basic-ftp')
const fsp = require('fs').promises

async function updateFile(localFile, serverFile) {
  const client = new ftp.Client()
  await client.access({
    host: 'localHost',
    password: '1Forrest1!',
  })
  await client.downloadTo(localFile, serverFile)
  const stat = await fsp.stat(localFile)
  if (stat.size === 0) throw new Error('File has no size')
  await editThisFile(localFile)
  await client.uploadFrom(localFile, serverFile)
}

const serverFolder = './Path/Of/Server/'
const localFolder = './Path/Of/Local/'
const file = 'some.json'

updateFile(localFolder + file, serverFolder + file).catch(console.error)

Discord.js dynamic command handler not working

So I followed the dynamic command handler guide on the discord.js guide site, and it turns out that every time I try to let it execute a command, it says that the execute function is undefined, no matter how I try to fix it. To make sure my code is supposed to be working, I downloaded the example code they had on the guide and ran it, which also didn't work for some reason. My discord.js and Node.js are both up to date.
Since I do not know your current files/code, I can provide an example.
Also, in this example, I will be assuming that you have named your bot variable client.
First, make sure you have a folder named commands.
On the top of your bot code (index.js or whatever it's called), add this line:
const fs = require("fs");
In your bot code, after the bot definition (var client = new Discord.Client()), add these lines:
client.commands = new Discord.Collection();
const commandFiles = fs.readdirSync('./commands').filter(file => file.endsWith('.js'));
for (let file of commandFiles) {
  let command = require('./commands/' + file);
  client.commands.set(command.name, command);
}
And on your message event listener (assuming that you already have made some commands in the folder), replace your command if statement(s) with this:
// you can use other expressions to check if the command is there
// the commandname in the client.commands.get is the filename without .js
if(message.content.startsWith("commandname")) client.commands.get("commandname").execute(message, args);
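The snippet above assumes that args has already been parsed out of the message. A minimal sketch of that parsing (the "!" prefix is a placeholder; use whatever prefix your bot expects):

```javascript
// Split a raw message into a lowercase command name and an args array.
// The "!" prefix is a placeholder assumption, not part of discord.js.
const prefix = '!';

function parseCommand(content) {
  if (!content.startsWith(prefix)) return null; // not a command at all
  const args = content.slice(prefix.length).trim().split(/\s+/);
  const name = args.shift().toLowerCase();      // first word is the command
  return { name, args };
}

// Inside your message listener you would then do something like:
// const parsed = parseCommand(message.content);
// if (parsed && client.commands.has(parsed.name)) {
//   client.commands.get(parsed.name).execute(message, parsed.args);
// }
```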
Creating commands is then just a matter of creating JavaScript files in your commands folder. So in your commands folder, create a file (named something like "commandname.js") with this content:
module.exports = {
  name: "commandname",
  description: "Command description here.",
  execute(message, args) {
    // Now you can do your command logic here
  }
}
I hope this helped! If it isn't clear, feel free to downvote.

Reading Multiple files and writing to one file Node.JS

I am currently trying to make a data pipeline using Node.js
Of course, it's not the best way to build it, but I want to try implementing it anyway before making improvements.
This is the situation:
I have multiple gzip-compressed CSV files on AWS S3. I get these "objects" using the AWS SDK
like the following and turn them into a readStream:
const unzip = createGunzip()
const input = s3.getObject(parameterWithBucketandKey)
  .createReadStream()
  .pipe(unzip)
and using the stream above I create a readline interface
const targetFile = createWriteStream('path to target file');
const rl = createInterface({
  input: input
})

let first = true;
rl.on('line', (line) => {
  if (first) {
    first = false;
    return;
  }
  targetFile.write(line);
  await getstats_and_fetch_filesize();
  if (filesize > allowed_size) {
    changed_file_name = change_the_name_of_file()
    compress(change_file_name)
  }
});
and this is wrapped as a promise.
I have an array of filenames to be retrieved from AWS S3, and I map over that array like this:
const arrayOfFileNames = [name1, name2, name3 ... and 5000 more]
const arrayOfPromiseFileProcesses = arrayOfFileNames.map((filename) => promiseFileProcess(filename))

await Promise.all(arrayOfPromiseFileProcesses);
// the result should be multiple gzip files that are compressed again.
Sorry I wrote it in pseudocode; if more is needed to provide context then I will write more, but I thought this would give the general context of my problem.
My problem is that it writes to a file fine, but when I change the file name it doesn't create a new one afterwards. I am lost in this synchronous and asynchronous world...
Please give me a hint/reference to read up on. Thank you.
The line event handler must be an async function, as it invokes await:
rl.on('line', async (line) => {
  if (first) {
    first = false;
    return;
  }
  targetFile.write(line);
  await getstats_and_fetch_filesize();
  if (filesize > allowed_size) {
    changed_file_name = change_the_name_of_file()
    compress(change_file_name)
  }
});

Is it possible to write text in the middle of a file with fs.createWriteStream? (or in Node.js in general)

I'm trying to write into a text file, but not at the end like appendFile() does, and not by replacing the entire content...
I saw that it's possible to choose where to start writing with the start parameter of fs.createWriteStream() -> https://nodejs.org/api/fs.html#fs_fs_createwritestream_path_options
But there is no parameter to say where to stop writing, right? So everything in my file after the point where I write gets clobbered by this function.
const fs = require('fs');

var logger = fs.createWriteStream('result.csv', {
  flags: 'r+',
  start: 20 // start to write at the 20th character
})

logger.write('5258,525,98951,0,1\n') // example of a new line to write
Is there a way to specify where to stop writing in the file, to end up with something like:
....
data from begining
....
5258,525,98951,0,1
...
data till the end
...
I suspect you mean, "Is it possible to insert in the middle of the file." The answer to that is: No, it isn't.
Instead, to insert, you have to:
Determine how big what you're inserting is
Copy the data at your insertion point to that many bytes later in the file
Write your data
Obviously when doing #2 you need to be sure that you're not overwriting data you haven't copied yet (either by reading it all into memory first or by working in blocks, from the end of the file toward the insertion point).
(I've never looked for one, but there may be an npm module out there that does this for you...)
You could read/parse your file first, then apply the modifications and save the new file.
Something like:
const fs = require("fs");

const fileData = fs.readFileSync("result.csv", { encoding: "utf8" });
const fileDataArray = fileData.split("\n");

const newData = "5258,525,98951,0,1";
const index = 2; // insert the new row after the second row
fileDataArray.splice(index, 0, newData); // insert data into the array

const newFileData = fileDataArray.join("\n"); // build the new file content
fs.writeFileSync("result.csv", newFileData, { encoding: "utf8" }); // save it

How can I extract (read + delete) from a textfile in NodeJS?

I'm building a script that reads log files, handles what needs to be handled, then writes the results to a database.
Some caveats:
Some log files get a lot of input, multiple times a second
Some log files get few to no entries at all
What I'm trying, in simple words: read the first line of a file, then delete that line to move on to the next one; while I handle the first line, other lines could be added.
Issues I'm facing:
When I read a file, process it, then delete it, some lines have been added in the meantime
When the app crashes while handling multiple lines at once, for any reason, I can't know which lines have been processed.
Tried so far
fs.readdir('logs/', (err, filenames) => {
  filenames.forEach((filename) => {
    fs.readFile('logs/' + filename, 'utf-8', (err, content) => {
      // processing all new lines (can take multiple ms)
      // deleting file
      fs.unlink('logs/' + filename)
    });
  });
});
Is there not a (native or not) method to 'take' first line(s), or take all lines, from a file at once?
Something similar to what the Array.shift() method does to arrays..
Why are you reading the whole file at once? Instead, you can use Node.js streams.
https://nodejs.org/api/fs.html#fs_class_fs_readstream
This will read the file and output it to the console:
var fs = require('fs');
var readStream = fs.createReadStream('myfile.txt');
readStream.pipe(process.stdout);
You can also go for the npm package node-tail to read the content of a file while new content is being written to it.
https://github.com/lucagrulla/node-tail
If your log files are written as rotated logs (for example, one file per hour: 9AM.log, 10AM.log, ...), you can skip the current file and process the others. E.g. if it is now 10:30 AM, skip 10AM.log and handle the remaining files.
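The rotation idea can be sketched by filtering out the current hour's file (the 9AM.log naming scheme is assumed from the example above):

```javascript
// Given hourly rotated logs named like "9AM.log" / "10PM.log",
// return only the files that are no longer being written to,
// i.e. everything except the current hour's file.
function closedLogFiles(filenames, now = new Date()) {
  const hour = now.getHours();
  const current =
    (hour % 12 === 0 ? 12 : hour % 12) + (hour < 12 ? 'AM' : 'PM') + '.log';
  return filenames.filter((name) => name !== current);
}
```

A caller would pair this with fs.readdir and process only the returned names, picking up the skipped file on the next hourly run.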
