I am trying to read a .txt file line by line using fetch().
Here is what I have:
fetch('http://localhost:8888/foo.txt')
  .then(response => response.text())
  .then((data) => {
    var player = GetPlayer();
    player.SetVar("phrase", data);
  })
This code reads the entire .txt file at once and shows it in the "phrase" variable correctly.
But I want something like:
read line 1 -> show line 1 on the variable
read line 2 -> show line 2 on the variable
read line 3 -> show line 3 on the variable
...
Is that possible?
Thanks!
You can just use some String methods. Call .split('\n') on the text, which will split your data into an array of lines.
Example:
Let's suppose a text file at localhost:6060/foo.txt has the following content:
Format - username:password
Peter:NoLol1231#
Maffe:xDOkLmao
Loe:OOPSOkay
John:OhXDAlright
We can access the data in a proper way like this:
const fetch = require('node-fetch');

async function getData() {
  // note: await must live directly inside an async function - the original
  // Promise-constructor wrapper around it was a syntax error
  const text = await (await fetch("http://localhost:6060/foo.txt")).text();
  const lines = text.split('\n');
  // turn the "username:password" lines into one lookup object
  return Object.fromEntries(lines.map(x => x.split(':')));
}

getData().then((d) => {
  console.log(d["Peter"]); // Will print his password now!
});
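If the goal from the question is to show the lines one at a time on the "phrase" variable, the split can be combined with a timer. This is only a sketch: the showLineByLine helper name and the 2-second delay are my own choices, and GetPlayer/SetVar come from the question's Storyline environment.

```javascript
// Sketch: hand each line to a setter one at a time, delayMs apart.
// The helper name and default delay are assumptions, not part of the original code.
function showLineByLine(text, setPhrase, delayMs = 2000) {
  const lines = text.split('\n');
  lines.forEach((line, i) => {
    setTimeout(() => setPhrase(line), i * delayMs);
  });
  return lines;
}

// Wired into the question's code it would look roughly like:
// fetch('http://localhost:8888/foo.txt')
//   .then(response => response.text())
//   .then(data => showLineByLine(data, line => GetPlayer().SetVar("phrase", line)));
```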
I am trying to set data on my Highcharts chart with a fetch function, but the data is never displayed or even set on my chart element.
Here is my fetch function
fetch("data.csv")
  .then(response => response.text())
  .then((response) => {
    //console.log("d" + response)
    function csvToArray(str, delimiter = ",") {
      let array = str.split("\n").map(function (line) {
        return line.split(delimiter);
      });
      return array;
    }
    let array = csvToArray(response);
    array.splice(0, 1);
    array.splice((array.length - 1), 1)
    let string = JSON.stringify(array);
    let stringnew = string.replaceAll("\"", "");
    //console.log(csvToArray(response));
    //console.log(csvToArray(response)[0]);
    console.log(stringnew);
    chart.series[0].setData(stringnew);
  })
  .catch(err => console.log(err))
as well as my data.csv file
time,data
1672683118394,12.00
1672683159084,10.00
1672683199305,9.00
I also see the right output in the console:
[[1672683118394,12.00],[1672683159084,10.00],[1672683199305,9.00]]
It corresponds to the Highcharts docs as I understood them, but the data is not loaded.
Any help appreciated :)
stringnew is a string, not the array of points that Highcharts expects.
The issue is solved by using
JSON.parse(stringnew)
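For reference, the stringify/parse round-trip can be skipped entirely by converting the CSV rows straight to numbers. A minimal sketch, assuming the same "time,data" header row as in the question:

```javascript
// Sketch: parse CSV text into the numeric [[x, y], ...] array that
// Highcharts' setData expects, without the stringify/parse detour.
const csvText = 'time,data\n1672683118394,12.00\n1672683159084,10.00\n1672683199305,9.00\n';
const rows = csvText
  .trim()
  .split('\n')
  .slice(1) // drop the "time,data" header row
  .map(line => line.split(',').map(Number));
console.log(rows);
// chart.series[0].setData(rows); // setData wants an array, not a string
```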
I have tried getting Cypress to save a text for later use, but I am unable to reuse it as a variable.
cy.get('.unrd').then(($span) => {
  const filteredunread = $span.text()
  cy.wrap(filteredunread).as('oldunreadmessage');
})
Code block to send a mail, wait, and return to the inbox expecting an echo reply:
cy.get('.unrd').then(($span) => {
  cy.get('#oldunreadmessage') //seen as object
  const newunread = $span.text()
  expect(newunread).to.eq(cy.get('#oldunreadmessage') + 1)
})
This gives me errors such as:
expected '(27)' to equal '[object Object]1'
I have tried to use .as(), but I seem unable to properly resolve my saved value as text or an integer constant.
The first part is fine, but an alias created with .as() is retrieved with cy.get('@oldunreadmessage') (note the @ prefix, not a # CSS selector), and because cy.get returns a Cypress chainable you need .should() or .then() to read its value:
cy.get('.unrd').then(($span) => {
  const newunread = $span.text();
  cy.get('@oldunreadmessage').should('eq', newunread);
})
or
cy.get('.unrd').then(($span) => {
  const newunread = $span.text();
  cy.get('@oldunreadmessage').then((oldunread) => {
    expect(newunread).to.eq(oldunread); // or compare the parsed numbers here
  });
})
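Since the badge text looks like '(27)', comparing old and new counts arithmetically needs the number extracted first. A small sketch (the toCount helper name is made up):

```javascript
// Sketch: pull the numeric count out of badge text like "(27)" so the
// old and new values can be compared as numbers, not strings.
const toCount = (text) => parseInt(String(text).replace(/[^0-9]/g, ''), 10);

console.log(toCount('(27)'));                         // 27
console.log(toCount('(28)') === toCount('(27)') + 1); // true
```

Inside the .then() callback above, the comparison then becomes expect(toCount(newunread)).to.eq(toCount(oldunread) + 1).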
Before deciding to ask, I already searched the questions asked on SO,
like here
or here
but none of them solved my problem.
OK, so here's my code:
const file = './PAGE1.txt';
const fs = require('fs');

fs.readFile(file, 'utf-8', (e, d) => {
  let textByLine = d.split('\n'); // make it an array
  let hasil = textByLine[2];
});
The PAGE1.txt file looks like:
Aa
Ab
Ac
So then I try
console.log(hasil)
and it succeeds, showing "Ac" in the console.
But when I do
console.log(hasil + " Test")
it shows up as "Test".
Why is it not "Ac Test"?
Thank you for your help.
Edit: this is solved, I just added '\r':
let textByLine = d.split('\r\n'); // make it an array
and now the console shows "Ac Test".
Now I want to ask: what does this "\r" do?
Why did I need it to solve my problem?
Thank you again :)
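Regarding the "\r" in the edit: Windows text files end each line with "\r\n" (carriage return + line feed), while "\n" alone is the Unix convention. Splitting on "\n" leaves a stray "\r" at the end of each piece, and when "Ac\r Test" is printed, the carriage return moves the cursor back to column 0, so " Test" overwrites "Ac" in the terminal. A quick illustration:

```javascript
// Sketch: why splitting on '\n' alone misbehaves with Windows line endings.
const d = 'Aa\r\nAb\r\nAc\r\n'; // Windows-style file content
const byLf = d.split('\n');      // each piece keeps a trailing '\r'
const byCrLf = d.split('\r\n');  // clean pieces
console.log(JSON.stringify(byLf[2]));   // "Ac\r" - note the stray carriage return
console.log(JSON.stringify(byCrLf[2])); // "Ac"
```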
const fs = require('fs'); // file system module
const rl = require('readline'); // readline helps read data line by line

// create an interface to read the file
const rI = rl.createInterface({
  input: fs.createReadStream('/path/to/file') // your path to the file
});

rI.on('line', line => {
  console.log(line); // your line
});
You can simply use this to log the data line by line. But in the real world you will usually wrap it in a Promise, e.g.:
const getFileContent = path =>
  new Promise((resolve, reject) => {
    const lines = [],
      input = fs.createReadStream(path);
    // handle the case where the read stream can't be created, e.g. the file is not found
    input.on('error', e => {
      reject(e);
    });
    // create a readline interface so that we can read line by line
    const rI = rl.createInterface({
      input
    });
    // listen for the line event, fired each time a line is read
    rI.on('line', line => {
      lines.push(line);
    })
      // when the file has been read completely
      .on('close', () => {
        resolve(lines);
      })
      // if any error occurs while reading lines
      .on('error', e => {
        reject(e);
      });
  });
and you will use it like this.
getFileContent('YOUR_PATH_TO_FILE')
  .then(lines => {
    console.log(lines);
  })
  .catch(e => {
    console.log(e);
  });
Hope this will help you :)
I am writing a script which at its core parses a .csv file for certain columns, storing them in an array, and then writes the contents to another .csv file. I am able to parse the file using fast-csv and have confirmed in the terminal that my array is in the correct format. However, when I attempt to write this array to a .csv file with fast-csv, the contents never appear in the file and no errors are thrown. I have verified that the array is passed all the way through to the callback. I have even replaced that variable in the writeToPath call with a simple array, and still no luck. Any assistance would be appreciated.
const processFile = (fileName, file, cb) => {
  let writeData = []
  let tempArray = []
  csv.fromPath(basePath + file, {ignoreEmpty: false, headers: false})
    .on("data", function (data) {
      if (data[0] != '') {
        [startDate, endDate] = fileName
        tempArray[0] = data[0]
        tempArray[1] = data[1]
        tempArray[2] = data[2]
        tempArray[3] = data[3]
        tempArray[4] = data[4]
        tempArray[5] = data[8]
        tempArray[6] = ""
        tempArray[7] = ""
        tempArray[8] = ""
        tempArray[9] = startDate
        tempArray[10] = endDate
        writeData[i] = tempArray
        writeData.shift()
        tempArray = []
        i++
      }
    })
    .on("end", () => {
      console.log('end')
    })
    .on('finish', () => {
      cb(writeData)
    });
}

processFile(fileName, file, (csvData) => {
  console.log(csvData)
  csv.writeToPath('./working-files/top.csv', {headers: false}, csvData).on("finish", () => {
    console.log('done')
  })
})
Unfortunately, without any context for the dataset you are using, there is only so much I can suggest. The variables needed to debug this properly are: the file, the file names used, and whatever 'i' is (it is never declared or initialized in the snippet). If you can update the question, I'll be happy to take another look.
I would suggest going back and logging the variables after each step that modifies them; hopefully then you'll get a better picture of what is going wrong.
I understand this isn't a complete answer and it will probably get removed, but I don't have the 50 reputation needed to make a comment.
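One thing worth checking (an assumption on my part, since the thread was never resolved here): in fast-csv the row array is the second argument to writeToPath and the options object comes third, i.e. csv.writeToPath('./working-files/top.csv', csvData, {headers: false}). Passing the options object where the data belongs would silently write nothing. The shape writeToPath expects is simply an array of row arrays, which is easy to sanity-check by joining it manually:

```javascript
// Sketch: the array-of-row-arrays shape that fast-csv's writeToPath takes
// as its second argument; joining it by hand shows the CSV text that
// should end up in the file.
const csvData = [['a1', 'b1', 'c1'], ['a2', 'b2', 'c2']];
const expectedCsv = csvData.map(row => row.join(',')).join('\n');
console.log(expectedCsv);
// a1,b1,c1
// a2,b2,c2
```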
I've used the fast-csv module to parse a csv file for other manipulations, but that returns data row-wise. I want to read only the first two columns of a csv file. Can someone please help?
I see two options.
One is to specify which headers you want in fast-csv and discard the rest. This approach returns an object, which may suit your needs, or you can turn it into an array afterwards.
const csv = require('fast-csv')

const CSV_STRING = 'a,b,c\n' +
  'a1,b1,c1\n' +
  'a2,b2,c2\n'

let filtered = []

csv
  .fromString(CSV_STRING, { headers: ['column_1', 'column_2'], renameHeaders: true, discardUnmappedColumns: true }) // I give arbitrary names to the first two columns - use whatever makes sense
  // .fromString(CSV_STRING, { headers: ['column_1', undefined, 'column_3'], discardUnmappedColumns: true }) // I could use undefined to skip column 2 and keep columns 1 and 3
  .on('data', function (data) {
    // console.log([data.column_1, data.column_2])
    filtered.push([data.column_1, data.column_2]) // or you can push to an array
  })
  .on('end', function () {
    console.log('done')
    console.log(filtered)
  })
The other is to let it return the rows as arrays (the default) and filter what you need using the transform method:
const csv = require('fast-csv')

const CSV_STRING = 'a,b,c\n' +
  'a1,b1,c1\n' +
  'a2,b2,c2\n'

csv
  .fromString(CSV_STRING)
  .transform(function (data) {
    return [data[0], data[1]]
  })
  .on('data', function (data) {
    console.log(data)
  })
  .on('end', function () {
    console.log('done')
  })