I am reading in a CSV file and parsing through it, but I'm having some trouble.
1 - Before parsing the file, I already have an array of header strings for which I want to pull data from the CSV file.
2 - I want to parse the file so I can display the data in a table under my predefined headers. Any extra headers should be ignored and not displayed in the table.
Here is my code:
this.predefinedHeaders = ["Name", "Age", "Gender"];

readCSV(event: Event) {
  const file = (event.target as HTMLInputElement).files![0];
  const reader = new FileReader();
  // reader.result is only available once the load has finished
  reader.onload = () => {
    const text = (reader.result as string).split(/\r\n|\r|\n/);
    const lines = [];
    for (let i = 1; i < text.length; i++) {
      const data = text[i].split(',');
      const tarr = [];
      for (let j = 0; j < this.predefinedHeaders.length; j++) {
        tarr.push(data[j]);
      }
      lines.push(tarr);
    }
    this.tableData = lines;
  };
  reader.readAsText(file);
}
What is currently happening is that the data is being populated in the table, but not under the right headers. How can I bind the data to my headers? NOTE: the predefined headers are guaranteed to be among the original headers from the file; the difference is that the table shouldn't show data for all of the columns, only several of them.
HTML View:
table
thead
tr
th(v-for='column in predefinedColumns') {{column.name}}
tbody
tr(v-for='(a, index) in data')
td(v-for='(b, index2) in a') {{data[index][index2]}}
You might want to use the most popular CSV parser, which is PapaParse.
URL for the documentation and demo:
https://www.papaparse.com/demo
This library has various configuration options, and one of them, the header row option, is exactly the solution you need.
To use predefined headers, you can pass the header argument as true to have all the data parsed as key-value pairs.
example: { data: Papa.parse(reader.result, { header: true }) }
With 'header: true', it takes the first row of the CSV file as the keys for every row in the file.
NPM package for the easiest implementation in a JavaScript app:
https://www.npmjs.com/package/papaparse
If you want to use predefined headers and display a table with only the needed columns, check out this example of mine on CodeSandbox:
https://codesandbox.io/embed/llqmrp96pm
A sample CSV file is already uploaded in the same directory, so you can upload it and see it work.
The CSV file there has 7 or 8 columns, but I am displaying only 4 of them; I assume that's what you are looking for.
I see you are looking for a plain JavaScript solution; my example is built with ReactJS and a couple of NPM libraries, but it is almost the same as what you need, and I believe it would be easy to replicate in your code.
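The column-filtering part can be sketched without any library at all; here the row objects stand in for what Papa.parse with header: true would produce (the sample names and the extra "City" column are invented for illustration):

```javascript
// Rows as key-value pairs, i.e. the shape of Papa.parse(csv, { header: true }).data
const rows = [
  { Name: "Alice", Age: "30", Gender: "F", City: "Oslo" },
  { Name: "Bob", Age: "25", Gender: "M", City: "Lima" },
];

const predefinedHeaders = ["Name", "Age", "Gender"];

// Keep only the predefined columns, in the predefined order;
// extra columns such as "City" are simply dropped.
const tableData = rows.map(row => predefinedHeaders.map(h => row[h]));

console.log(tableData); // [["Alice", "30", "F"], ["Bob", "25", "M"]]
```

Because each cell is looked up by header name rather than by position, the values always land under the right column in the table.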
Related
I have some xlsx files, and besides parsing these files in a "classic" way, I also need to do it in a "smart" way.
var ExcelJS = require('exceljs');

var filePath = process.env.PWD + '/testdata/unfried-xlsx-1.xlsx';
var inboundWorkbook = new ExcelJS.Workbook();
inboundWorkbook.xlsx.readFile(filePath).then(function() {
    var inboundWorksheet = inboundWorkbook.getWorksheet(1); // or the name of the worksheet
    inboundWorksheet.eachRow({ includeEmpty: true }, function(row, rowNumber) {
        console.log(row.values);
    });
});
example of xlsx file
My file is made of a header, a table, and a footer.
The problem is, I need to "isolate" information. For example, I need a customer number located either in the header or in the footer, and of course the data in the table.
It would be easy if the file layout were constant, but the feature has to handle a variety of files I don't know in advance (different headers and footers, different column names, etc...).
I know I won't get the most precise feature I can imagine, but if I can get close to it, that would be great :)
Any idea how to proceed?
Regards,
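One way to approach the "smart" part is a heuristic scan over the row values ExcelJS gives you, looking for a label wherever it happens to sit. This is only a sketch; the "Customer No" label wording and number format are made-up assumptions, not something from your files:

```javascript
// Sketch: scan rows of cell values (the shape produced by row.values in eachRow)
// for a label/value pair, wherever it sits in the sheet.
function findCustomerNumber(rows) {
  const labelPattern = /customer\s*(no|number)/i; // assumed label wording
  for (const cells of rows) {
    for (let i = 0; i < cells.length; i++) {
      const cell = String(cells[i] ?? "");
      if (labelPattern.test(cell)) {
        // The value may be in the same cell ("Customer No: 4711")...
        const inline = cell.match(/(\d+)/);
        if (inline) return inline[1];
        // ...or in the next cell over
        if (cells[i + 1] != null) return String(cells[i + 1]);
      }
    }
  }
  return null;
}

console.log(findCustomerNumber([
  ["Invoice", ""],
  ["Customer No: 4711", "2020-01-01"],
])); // prints 4711
```

The same pattern-per-field idea extends to the other header/footer values you need; it won't be perfect across every layout, but it degrades gracefully by returning null when nothing matches.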
I am converting my Excel file to JSON using the library below, and it works fine when I have one header row and the remaining rows are data.
var excelToJson = require('convert-excel-to-json');
var result = excelToJson({
sourceFile: 'Auto.xlsx',
header: {
rows: 2 //Used 2 to skip the first line of headers
},
columnToKey: {
'*': '{{columnHeader}}'
}
});
But I have a file that has nested header rows similar to the one below, and I want to convert that into JSON.
Something like this, with multiple sheets.
I'd appreciate any help or pointers. Thanks!
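One common way to handle nested headers is to flatten the two header rows into combined keys before mapping the data rows. A dependency-free sketch of the idea (the sample sheet content is invented, since the actual headers aren't shown here):

```javascript
// Flatten a two-level header into single keys like "Address.City",
// then map a data row to an object using those keys.
function flattenHeaders(topRow, subRow) {
  let current = "";
  return subRow.map((sub, i) => {
    if (topRow[i]) current = topRow[i]; // a top cell spans the following blank columns
    return sub ? `${current}.${sub}` : current;
  });
}

const topRow = ["Name", "Address", ""];
const subRow = ["", "City", "Zip"];
const dataRow = ["Alice", "Oslo", "0150"];

const keys = flattenHeaders(topRow, subRow);
const record = Object.fromEntries(keys.map((k, i) => [k, dataRow[i]]));

console.log(record); // { Name: "Alice", "Address.City": "Oslo", "Address.Zip": "0150" }
```

With convert-excel-to-json you could read the sheet raw (no header option), pull the first two rows out yourself, and apply a flattening step like this per sheet.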
I'm trying to write data into an Excel file using the ExcelJS library. I was successfully able to create a worksheet and add column data.
However, while trying to use the addRow() or addRows() method, the data is not added to the Excel worksheet.
Here is the code I tried:
const ExcelJS = require('exceljs');
var workbook = new ExcelJS.Workbook();
var worksheet = workbook.addWorksheet('Payment Data');
worksheet.columns = reportHeaders; //reportHeaders is an array of header objects
I'm able to see the columns created successfully in the Excel sheet. The trouble starts below, where I'm trying to add data (into rows):
1st method :
worksheet.addRows(excelData);//excelData is an array of data objects
2nd method:
for(var rowItem in excelData){
worksheet.addRow(excelData[rowItem]);}
However, it seems neither of these methods is working for me.
Finally, the file is saved:
workbook.xlsx.writeFile('PaymentData.xlsx')
Is there anything I'm missing? Any help will be appreciated.
So the issue with your code is that you were trying to add the data to the columns without specifying the key property in the columns array, and hence it was unable to add the data.
I modified the worksheet.columns array to look something like the following:
worksheet.columns = [
{ header: "A", key: "a" },
{ header: "B", key: "b" },
];
This will solve your problem
I managed to convert the object to an array and passed it to the addRow method.
This worked for me.
I'm still not sure why I'm not able to pass an array of objects to addRow method.
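For reference, the key-to-column matching the answer describes roughly amounts to ordering the values of each row object by the column keys, which is also why converting the object to an array by hand worked. A plain-JavaScript sketch (column and data names are just examples):

```javascript
// Each column has a key; addRow({...}) matches object properties to those keys.
// Converting the object to an ordered array by hand achieves the same thing.
const columns = [
  { header: "A", key: "a" },
  { header: "B", key: "b" },
];

const rowObject = { b: 2, a: 1 }; // property order doesn't matter

// Order the values by the column keys
const rowArray = columns.map(col => rowObject[col.key]);

console.log(rowArray); // [1, 2]
```

Without keys on the columns, ExcelJS has nothing to match the object's property names against, so object-form addRow silently produces empty rows.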
I'm trying to format some data into an Excel table. I already have the data written in the Excel file, but I want it as a table. Is there any way to do this? If not, is there a way with another npm package?
You could convert the range to a table using the Office.js Excel API:
const sheet = context.workbook.worksheets.getItem("Sample");
let expensesTable = sheet.tables.add("A1:E7", true);
expensesTable.name = "ExpensesTable";
Here is a sample gist that you can try:
https://gist.github.com/lumine2008/8eccb88f7fccf34b63c7ecd5fd05aaea
I've got an update to my question.
What I really wanted to know was this:
How do I get CSV data into NetSuite?
Well, it seems I use the CSV import tool to create a mapping and use this call to import the CSV: nlapiSubmitCSVImport(nlobjCSVImport).
Now my question is: how do I iterate through the object?!
That gets me halfway - I get the CSV data, but I can't seem to find out how to iterate through it in order to manipulate the data. This is, of course, the whole point of a scheduled script.
This is really driving me mad.
@Robert H
I can think of a million reasons why you'd want to import data from a CSV. Billing, for instance. Various reports on data any company keeps, and I wouldn't want to keep this in the file cabinet, nor would I really want to keep the file at all. I just want the data; I want to manipulate it and I want to enter it.
Solution Steps:
To upload a CSV file we have to use a Suitelet script.
(Note: file - This field type is available only for Suitelets and will appear on the main tab of the Suitelet page. Setting the field type to file adds a file upload widget to the page.)
var fileField = form.addField('custpage_file', 'file', 'Select CSV File');
var id = nlapiSubmitFile(file);
Let's prepare to call a Restlet script and pass the file id to it.
var recordObj = {};
recordObj.fileId = fileId;
// Format the input for RESTlets with the JSON content type
var recordText = JSON.stringify(recordObj); // stringifying to JSON
// Set up the URL of the RESTlet
var url = 'https://rest.na1.netsuite.com/app/site/hosting/restlet.nl?script=108&deploy=1';
// Set up the headers for passing the credentials
var headers = {};
headers['Content-Type'] = 'application/json';
headers['Authorization'] = 'NLAuth nlauth_email=amit.kumar2@mindfiresolutions.com, nlauth_signature=*password*, nlauth_account=TSTDRV****, nlauth_role=3';
(Note: nlapiCreateCSVImport: This API is only supported for bundle installation scripts, scheduled scripts, and RESTlets)
Let's call the Restlet using nlapiRequestURL:
// Calling Restlet
var output = nlapiRequestURL(url, recordText, headers, null, "POST");
Create a mapping using Import CSV records available at Setup > Import/Export > Import CSV records.
Inside the Restlet script, fetch the file id from the Restlet parameter. Use the nlapiCreateCSVImport() API and set its mapping to the mapping id created in step 3. Set the CSV file using the setPrimaryFile() function.
var primaryFile = nlapiLoadFile(datain.fileId);
var job = nlapiCreateCSVImport();
job.setMapping(mappingFileId); // Set the mapping
// Set File
job.setPrimaryFile(primaryFile.getValue()); // Fetches the content of the file and sets it.
Submit using nlapiSubmitCSVImport().
nlapiSubmitCSVImport(job); // We are done
There is another way we can get around this, although it is neither preferable nor one I would suggest (as it consumes a lot of API usage if you have a large number of records in your CSV file).
Let's say that we don't want to use the nlapiCreateCSVImport API; in that case, let's continue from step 4.
Just fetch the file Id as we did earlier, load the file, and get its contents.
var fileContent = primaryFile.getValue();
Split the file into lines, then subsequently split each line into values and store them in separate arrays.
var splitLine = fileContent.split("\n"); // Split the file contents into lines
for (var lines = 1; lines < splitLine.length; lines++) // Start at 1 to skip the header row
{
    var words = splitLine[lines].split(","); // words holds all the values on a line
    for (var word = 0; word < words.length; word++)
    {
        nlapiLogExecution("DEBUG", "Words:", words[word]);
    }
}
Note: Make sure you don't have an additional blank line in your CSV file.
Finally, create the record and set its field values from the arrays we built above.
var myRec = nlapiCreateRecord('cashsale'); // Create the record of your choice here
myRec.setFieldValue('entity', arrCustomerId[i]); // For example, arrCustomerId is an array of customer IDs
var submitRec = nlapiSubmitRecord(myRec); // and we are done
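Outside of SuiteScript, the manual parse above boils down to splitting lines and zipping each one with the header row. A plain-JavaScript sketch of the same loop (the field names are invented, and quoted fields containing commas are deliberately not handled):

```javascript
// Parse simple CSV content (no quoted fields) into objects keyed by header.
// This mirrors the split-on-newline / split-on-comma loop above.
function parseCsv(fileContent) {
  const lines = fileContent.trim().split("\n"); // trim guards against a trailing blank line
  const headers = lines[0].split(",");
  return lines.slice(1).map(line => {
    const values = line.split(",");
    return Object.fromEntries(headers.map((h, i) => [h, values[i]]));
  });
}

const records = parseCsv("entity,amount\n123,45.00\n456,12.50\n");
console.log(records);
// [ { entity: '123', amount: '45.00' }, { entity: '456', amount: '12.50' } ]
```

Keying each row by header name makes the later setFieldValue step a simple lookup instead of relying on column positions.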
Fellow NetSuite user here. I've been using SuiteScripts for a while now but never saw the nlobjCSVImport object or nlapiSubmitCSVImport. I looked in the documentation; it shows up, but there is no page describing the details. Care to share where you got the doc from?
With the doc for the CSVImport object I might be able to provide some more help.
P.S. I tried posting this message as a comment, but the "Add comment" link didn't show up for some reason. Still new to SO.
CSV to JSON:
Convert a CSV file to a JSON object / data table:
https://code.google.com/p/jquery-csv/
If you know the structure of the CSV file, just do a for loop and map the fields to the corresponding nlapiSetValue calls.
It should be pretty straightforward.