How to read data columnwise from csv file in nodejs? - javascript

I've used the 'fast-csv' module to parse a CSV file for other manipulations, but that returns data row-wise. I want to read just the first 2 columns of the file. Can someone please help?

I see two options.
One is to specify which headers you want in fast-csv and discard the rest. This approach returns an object, which may suit your needs, or you can turn it into an array afterwards.
const csv = require('fast-csv')

const CSV_STRING = 'a,b,c\n' +
  'a1,b1,c1\n' +
  'a2,b2,c2\n'

let filtered = []

csv
  .fromString(CSV_STRING, { headers: ['column_1', 'column_2'], renameHeaders: true, discardUnmappedColumns: true }) // I give arbitrary names to the first two columns - use whatever makes sense
  // .fromString(CSV_STRING, { headers: ['column_1', undefined, 'column_3'], discardUnmappedColumns: true }) // I could use undefined to say skip column 2 and keep just columns 1 and 3
  .on('data', function (data) {
    // console.log([data.column_1, data.column_2])
    filtered.push([data.column_1, data.column_2]) // or you can push to an array
  })
  .on('end', function () {
    console.log('done')
    console.log(filtered)
  })
The other is to return each row as an array (the default) and pick out what you need with the transform method:
const csv = require('fast-csv')

const CSV_STRING = 'a,b,c\n' +
  'a1,b1,c1\n' +
  'a2,b2,c2\n'

csv
  .fromString(CSV_STRING)
  .transform(function (data) {
    return [data[0], data[1]]
  })
  .on('data', function (data) {
    console.log(data)
  })
  .on('end', function () {
    console.log('done')
  })
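If you're on a newer fast-csv (v3+), the entry points were renamed; here is a rough equivalent of the first option, assuming the current parseString API:

const csv = require('fast-csv')

const CSV_STRING = 'a,b,c\na1,b1,c1\na2,b2,c2\n'
const filtered = []

csv
  .parseString(CSV_STRING, { headers: ['column_1', 'column_2'], renameHeaders: true, discardUnmappedColumns: true })
  .on('data', (row) => filtered.push([row.column_1, row.column_2]))
  .on('end', () => console.log(filtered)) // [ ['a1','b1'], ['a2','b2'] ]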

Related

Unable to set data for Highcharts

I'm trying to set data on my Highcharts chart with a fetch function, but the data is never displayed or even set on my chart element.
Here is my fetch function:
fetch("data.csv")
.then(response => response.text())
.then((response) => {
//console.log("d"+response)
function csvToArray(str, delimiter = ",") {
let array = str.split("\n").map(function (line) {
return line.split(delimiter);
});
return array;
}
let array = csvToArray(response);
array.splice(0, 1);
array.splice((array.length-1),1)
let string =JSON.stringify(array);
let stringnew = string.replaceAll("\"","");
//console.log(csvToArray(response));
//console.log(csvToArray(response)[0]);
console.log(stringnew);
chart.series[0].setData(stringnew);
})
.catch(err => console.log(err))
as well as my data.csv file
time,data
1672683118394,12.00
1672683159084,10.00
1672683199305,9.00
I also see the right output in the console:
[[1672683118394,12.00],[1672683159084,10.00],[1672683199305,9.00]]
It corresponds to the Highcharts docs as I understand them, but the data is not loaded.
Any help appreciated :)
stringnew is a string, not a JSON object.
The issue is solved by using
JSON.parse(stringnew)
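Equivalently, you could skip the stringify/replaceAll/parse round-trip and hand setData a numeric array directly. A minimal sketch of that alternative, reusing the names from the question:

let array = csvToArray(response);
array.splice(0, 1);                // drop the header row
array.splice(array.length - 1, 1); // drop the trailing empty line
// convert each string field to a number before handing the rows to Highcharts
chart.series[0].setData(array.map(row => row.map(Number)));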

Typescript Array push deletes sometimes items

I'm uploading some Excel files to the server to do something with the data. I'm using xlsx to read the files and Tabulator to present the uploaded data in the frontend.
This is the loop in which I read the data and push it into an array:
for (const k of raw) {
  let columns: Column[] = []
  // new Array each loop
  let tmpData: any[] = []
  // read data from Excel files
  const wb = read(await k.file.arrayBuffer(), {
    sheets: k.type.sheet
  })
  tmpData = utils.sheet_to_json(wb.Sheets[k.type.sheet], {
    range: k.type.row
  })
  // do something with the data of the files
  tmpData.forEach((item, index) => {
    if (Object.entries(item).length < 2) {
      data.splice(index, 1);
    }
  });
  tmpData.forEach((item) => {
    Object.keys(item).forEach((key) => {
      if (key.includes('EMPTY')) {
        delete item[key]
      }
    })
  })
  // separate the first row to extract columns
  Object.keys(tmpData[0]).forEach((key) => {
    columns.push(new Column(key, key, 'input', true, 300))
  })
  // create columns Array containing the headers of all files
  columnsObject.push(new ColumnsType(columns, k.type.name))
  // push the data into an Array
  data.push(new TablesType(tmpData, k.type.name))
  // troubleshooting
  data.forEach(item => {
    console.log(item.vendor)
  })
  console.log('\n ----- \n')
}
This is an example of the troubleshooting output where the problem occurs:
Ciena
-----
Ciena
PaloAlto
-----
PaloAlto
Infinera
-----
PaloAlto
Infinera
Arista
-----
In the third loop the Ciena object is missing. I already tried it with a traditional for loop, but the same issue occurred with certain combinations of uploaded files.
The data is read in just fine for all files; the array is just throwing it away.
// do something with the data of the files
tmpData.forEach((item, index) => {
  if (Object.entries(item).length < 2) {
    data.splice(index, 1);
  }
});
Why even write the comment if it is as meaningless as this? What is this supposed to do? Something?!
This is, btw., also the place where your mistake most likely lies. You're splicing your data array (meaning deleting stuff from it). This would explain how your data array loses some of its elements. As I don't know what this part of your code is actually supposed to do, I can't tell you how to fix it, just that this is most likely the origin of your troubles.
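If the intent was to drop near-empty rows from the sheet that was just read, filtering tmpData instead of splicing data would avoid deleting earlier results. A sketch, assuming that intent:

// keep only rows with at least two fields, without ever touching `data`
tmpData = tmpData.filter((item) => Object.entries(item).length >= 2)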

Create global variable in utils.js file in node.js

Hi, I'm currently trying to figure out how to properly define a global variable in node.js. I know it is not good practice, but in this particular scenario it's the only way to do this without using a database connection.
I'm getting data from the GitHub API to display some information, and I'm trying to store it in a global variable. That should allow me to e.g. pass a specific object from the first global list to a new global list that displays only chosen items.
I have a file called utils.js that has these two empty arrays that should be global:
let repoItems = [];
let bookmarkedItems = [];
exports.repoItems = repoItems;
exports.bookmarkedItems = bookmarkedItems;
Then I have another file that fetches and should assign items to the first global variable, but it looks like it doesn't, because the moment I try to choose one item & push it into the second empty array, it's impossible: I'm getting an empty array. I'm not sure if the mistake comes from a bad implementation of the global variable from the other file, or something else :/
Below I include the fetching part of the code; the line I'm confused about is marked with a comment:
let utils = require('../utils');
let {
  repoItems,
} = utils;

router.get('/', async (req, res) => {
  try {
    const result = await fetchGithubAPI(`searching word`);
    const urls = result.map(url => url);
    console.log(urls);
    res.render('repositories.ejs', {
      'data': urls
    });
  } catch (e) {
    console.log(e);
  }
});

async function fetchGithubAPI(search) {
  const response = await fetch(`https://api.github.com/?q=${search}`, {
    method: 'GET',
    headers: {
      'Accept': 'application/vnd.github.v3+json',
      'Content-Type': 'application/json',
    },
  });
  const data = await response.json();
  repoItems = data.items.map(item => item); // <-- the part I'm confused about
  return repoItems;
}
Try:
let repoItems = require('./utils').repoItems
instead of
let {
  repoItems,
} = utils;
If you remove
let {
  repoItems,
} = utils;
you can try using
utils.repoItems = data.items.map(item => item)
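The underlying reason: destructuring copies the current reference into a new local binding, so later reassignments of repoItems never touch utils.repoItems. A tiny sketch of the effect (paths illustrative):

const utils = require('./utils');
let { repoItems } = utils;    // local binding gets a copy of the reference
repoItems = ['new value'];    // rebinds the local variable only
console.log(utils.repoItems); // still [] - the module export is untouched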
I tried an example setup for it:
--utils.js
module.exports.someValue = 3;
module.exports.setSomeValue = (value) => { module.exports.someValue = value }
--index.js
const utils = require('./utils');
console.log(utils.someValue); // 3
utils.someValue = 5
console.log(utils.someValue); // 5
Update, after learning about getter and setter methods in JS:
You can refer to https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions/set
This is another way to change the value of private properties in JS.
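A minimal sketch of that approach, assuming a shared utils.js (the names are illustrative):

// utils.js - expose the array through an accessor property
let _repoItems = [];
module.exports = {
  get repoItems() { return _repoItems; },
  set repoItems(value) { _repoItems = value; },
};

// consumer.js - reads and writes both go through the same module object
const utils = require('./utils');
utils.repoItems = ['a', 'b'];  // runs the setter, updating the shared array
console.log(utils.repoItems);  // ['a', 'b'] - runs the getter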

Create new object and write to file with fs.writeFile()

There are two things I want to do:
I want to create a new array of objects from an existing object,
and increment a counter so each object gets a count id of 1, 2, 3, etc.
My issue is that when I write to the file it writes only 1 random object and the rest don't show. There are no errors, and all the objects have the same increment value. Please explain what I am doing wrong. Thanks.
Code:
data.json:
{
  "users": [
    {
      "name": "mike",
      "category": [
        {
          "title": "cook"
        }
      ],
      "store": {
        "location": "uptown",
        "city": "ulis"
      },
      "account": {
        "type": "regular",
        "payment": [
          { "active": false }
        ]
      }
    }
  ]
}
index.js:
const appData = require('./data.json')
const fs = require('fs');

let newObject = {}
appData.forEach(function (items) {
  let x = items
  let numincrement = 1++
  newObject.name = x.name
  newObject.count = numincrement
  newObject.categories = x.categories
  newObject.store = x.store
  newObject.account = x.account
  fs.writeFile('./temp.json', JSON.stringify(newObject, null, 2), 'utf8', function (err, data) {
    // console.log(data)
    if (err) {
      console.log(err)
      return
    } else {
      console.log('created')
    }
  })
})
There are a whole bunch of problems here:
You're just rewriting the same object over and over to the file. fs.writeFile() rewrites the entire file. It does not append to the file. In addition, you cannot append to the JSON format either. So, this code will only ever write one object to the file.
To append new JSON data to what's in the existing file, you would have to read in the existing JSON, parse it to convert it to a Javascript array, then add new items onto the array, then convert back to JSON and write out the file again (see the sketch after this list). For more efficient appending, you would need a different data format (perhaps comma delimited lines).
Your loop has all sorts of problems. You're assigning to the same newObject over and over again.
Your numincrement is inside the loop so it will have the same value on every invocation of the loop. You can also just use the index parameter passed to the forEach() callback instead of using your own variable.
If what you're trying to iterate over is the users array in your data, then you may need to be iterating over appData.users, not just appData.
If you really just want to append data to a text file, JSON is not the easiest format to use. It might be easier to just use comma delimited lines; then you can just append new lines to the file. You can't really do that with JSON.
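A minimal sketch of that read-parse-append-rewrite cycle (assuming temp.json already holds a JSON array; newItem stands in for whatever you want to add):

const fs = require('fs');

// read the existing JSON array, add an item, then rewrite the whole file
const existing = JSON.parse(fs.readFileSync('./temp.json', 'utf8'));
existing.push(newItem);
fs.writeFileSync('./temp.json', JSON.stringify(existing, null, 2), 'utf8');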
If you're willing to just overwrite the file with the current data, you can do this:
const appData = require('./data.json').users;
const fs = require('fs');

// create an array of custom objects
let newData = appData.map((item, index) => {
  return {
    name: item.name,
    count: index + 1,
    categories: item.categories,
    store: item.store,
    account: item.account
  };
});

// write out that data to a file as JSON (overwriting the existing file)
fs.writeFile('./temp.json', JSON.stringify(newData, null, 2), 'utf8', function (err, data) {
  if (err) {
    console.log(err);
  } else {
    console.log("data written");
  }
});

No Output fast-csv writeToPath

I am writing a script which at its core parses a .csv file for certain columns, stores them in an array, and then writes the contents to another .csv file. I am able to parse the file using fast-csv and have confirmed in the terminal that my array is in the correct format. However, when I attempt to write this array to a .csv file with fast-csv, the contents never appear in the file and no errors are thrown. I have verified that the array is passed all the way through to the callback. I have even gone so far as to replace that variable in the writeToPath call with a simple array, and still no luck. Any assistance would be appreciated.
const processFile = (fileName, file, cb) => {
  let writeData = []
  let tempArray = []
  csv.fromPath(basePath + file, { ignoreEmpty: false, headers: false })
    .on("data", function (data) {
      if (data[0] != '') {
        [startDate, endDate] = fileName
        tempArray[0] = data[0]
        tempArray[1] = data[1]
        tempArray[2] = data[2]
        tempArray[3] = data[3]
        tempArray[4] = data[4]
        tempArray[5] = data[8]
        tempArray[6] = ""
        tempArray[7] = ""
        tempArray[8] = ""
        tempArray[9] = startDate
        tempArray[10] = endDate
        writeData[i] = tempArray
        writeData.shift()
        tempArray = []
        i++
      }
    })
    .on("end", () => {
      console.log('end')
    })
    .on('finish', (() => {
      cb(writeData)
    }));
}

processFile(fileName, file, (csvData) => {
  console.log(csvData)
  csv.writeToPath('./working-files/top.csv', { headers: false }, csvData).on("finish", () => {
    console.log('done')
  })
})
Unfortunately, without any context on the dataset you are using, there is only so much I can suggest. The variables needed to debug this properly would be: the file, the file names used, and whatever 'i' is. If you can update this then I'll be happy to take another look.
I would suggest going back and logging the variables after each step that modifies them; hopefully then you'll get a better picture of what is going wrong.
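For instance, a sketch of that kind of step-by-step logging (reusing the names from the question, purely illustrative):

csv.fromPath(basePath + file, { ignoreEmpty: false, headers: false })
  .on('data', function (data) {
    console.log('row in:', data) // confirm what the parser actually emits
  })
  .on('end', () => console.log('parsing finished'))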
I understand this isn't a complete answer and it will probably get removed but I don't have the 50 needed reputation to make a comment.
