How do I get CSV data into NetSuite? - javascript

I've got an update to my question.
What I really wanted to know was this:
How do I get CSV data into NetSuite?
Well, it seems I use the CSV import tool to create a mapping and then import the CSV with this call: nlapiSubmitCSVImport(nlobjCSVImport).
Now my question is: how do I iterate through the object?!
That gets me halfway there - I get the CSV data, but I can't seem to find out how to iterate through it in order to manipulate the data. That is, of course, the whole point of a scheduled script.
This is really driving me mad.
@Robert H
I can think of a million reasons why you'd want to import data from a CSV. Billing, for instance, or the various reports on data any company keeps. I wouldn't want to keep this in the file cabinet, nor would I really want to keep the file at all. I just want the data: I want to manipulate it and I want to enter it.

Solution Steps:
To upload a CSV file we have to use a Suitelet script.
(Note: file - This field type is available only for Suitelets and will appear on the main tab of the Suitelet page. Setting the field type to file adds a file upload widget to the page.)
var fileField = form.addField('custpage_file', 'file', 'Select CSV File');
var fileId = nlapiSubmitFile(file); // 'file' is the uploaded file from the POST request
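Putting the Suitelet together, a minimal sketch might look like this (the function name is just an example, and I'm assuming one deployment handles both the GET that renders the form and the POST that receives the file):
function csvUploadSuitelet(request, response) {
  if (request.getMethod() == 'GET') {
    var form = nlapiCreateForm('CSV Upload');
    form.addField('custpage_file', 'file', 'Select CSV File');
    form.addSubmitButton('Upload');
    response.writePage(form);
  } else {
    var file = request.getFile('custpage_file'); // nlobjFile from the upload widget
    var fileId = nlapiSubmitFile(file); // save it to the file cabinet
    // pass fileId to the RESTlet as shown below
  }
}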
Let's prepare to call a RESTlet script and pass the file id to it.
var recordObj = new Object();
recordObj.fileId = fileId;
// Format input for Restlets for the JSON content type
var recordText = JSON.stringify(recordObj);//stringifying JSON
// Setting up the URL of the Restlet
var url = 'https://rest.na1.netsuite.com/app/site/hosting/restlet.nl?script=108&deploy=1';
// Setting up the headers for passing the credentials
var headers = {};
headers['Content-Type'] = 'application/json';
headers['Authorization'] = 'NLAuth nlauth_email=amit.kumar2@mindfiresolutions.com, nlauth_signature=*password*, nlauth_account=TSTDRV****, nlauth_role=3';
(Note: nlapiCreateCSVImport: This API is only supported for bundle installation scripts, scheduled scripts, and RESTlets)
Let's call the RESTlet using nlapiRequestURL:
// Calling Restlet
var output = nlapiRequestURL(url, recordText, headers, null, "POST");
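The response comes back as an nlobjServerResponse, so you can read whatever the RESTlet returns with something like:
var responseBody = output.getBody(); // e.g. the job id returned by the RESTlet
nlapiLogExecution('DEBUG', 'RESTlet response', responseBody);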
Create a mapping using the Import CSV Records tool, available at Setup > Import/Export > Import CSV Records.
Inside the RESTlet script, fetch the file id from the RESTlet parameter, use the nlapiCreateCSVImport() API, set its mapping to the mapping id created in step 3, and set the CSV file using the setPrimaryFile() function.
var primaryFile = nlapiLoadFile(datain.fileId);
var job = nlapiCreateCSVImport();
job.setMapping(mappingId); // internal id of the saved import map from step 3
job.setPrimaryFile(primaryFile.getValue()); // getValue() fetches the content of the file and sets it
Submit using nlapiSubmitCSVImport().
nlapiSubmitCSVImport(job); // We are done
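Pulling steps 4 and 5 together, the RESTlet body might look something like this sketch (mappingId is a placeholder for the internal id of the import map created in step 3):
function importCsv(datain) {
  var primaryFile = nlapiLoadFile(datain.fileId); // the file saved by the Suitelet
  var job = nlapiCreateCSVImport();
  job.setMapping(mappingId); // internal id of the saved import map
  job.setPrimaryFile(primaryFile.getValue()); // file content as a string
  return nlapiSubmitCSVImport(job); // returns the id of the queued import job
}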
There is another way to get around this, although it is neither preferable nor one I would suggest, as it consumes a lot of governance units if you have a large number of records in your CSV file.
Let's say we don't want to use the nlapiCreateCSVImport API, so let's continue from step 4.
Just fetch the file Id as we did earlier, load the file, and get its contents.
var fileContent = primaryFile.getValue();
Split the file into lines, then split each line into its comma-separated values and store them in separate arrays.
var splitLine = fileContent.split("\n"); // split the file into lines
for (var line = 1; line < splitLine.length; line++) // start at 1 to skip the header row
{
    var words = splitLine[line].split(","); // all the values on this line
    for (var word = 0; word < words.length; word++)
    {
        nlapiLogExecution("DEBUG", "Words:", words[word]);
    }
}
Note: Make sure you don't have an additional blank line in your CSV file.
Finally, create the record and set its field values from the arrays we built above.
var myRec = nlapiCreateRecord('cashsale'); // create the record type of your choice
myRec.setFieldValue('entity', arrCustomerId[i]); // e.g. arrCustomerId is an array of customer ids built from the CSV
var submitRec = nlapiSubmitRecord(myRec); // and we are done
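To tie the manual approach together, here is a sketch that creates one record per CSV line (the 'cashsale' record type and the column positions are examples only, and the naive split(',') assumes no quoted commas in the data):
var splitLine = fileContent.split('\n');
for (var line = 1; line < splitLine.length; line++) { // start at 1 to skip the header row
  if (!splitLine[line]) continue; // skip trailing blank lines
  var words = splitLine[line].split(',');
  var myRec = nlapiCreateRecord('cashsale');
  myRec.setFieldValue('entity', words[0]); // e.g. column 0 holds the customer internal id
  nlapiSubmitRecord(myRec);
}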

Fellow NetSuite user here. I've been using SuiteScripts for a while now but have never seen the nlobjCSVImport object or nlapiSubmitCSVImport. I looked in the documentation; they show up, but there is no page describing the details. Care to share where you got the doc from?
With the doc for the CSVImport object I might be able to provide some more help.
P.S. I tried posting this message as a comment, but the "Add comment" link didn't show up for some reason. Still new to SOF.

CSV to JSON:
You can convert the CSV file to a JSON object with a library such as jquery-csv: https://code.google.com/p/jquery-csv/
If you know the structure of the CSV file, just do a for loop and map the fields with the corresponding setFieldValue calls.
Should be pretty straightforward.
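For example, with jquery-csv loaded, a sketch could look like this (assuming fileContent holds the CSV text and the file has a header row; the record type and the Customer column are illustrative):
var rows = $.csv.toObjects(fileContent); // e.g. [{Customer: '123', ...}, ...]
for (var i = 0; i < rows.length; i++) {
  var rec = nlapiCreateRecord('cashsale');
  rec.setFieldValue('entity', rows[i].Customer);
  nlapiSubmitRecord(rec);
}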

Related

How to append an item to a JSON stored in a file?

I'm trying to write a JSON object to a DB with fs but it's not working as expected. What I want to do is write the JSON data inside this: [] Example:
[
{"name":"jhon"},
{"name":"jhonathan"}
]
Here is the code:
Thanks.
The comment you provided doesn't make much sense, because the JSON data you provided in the post and in the comments is completely different. But I get the gist of it. I guess you have a JSON file containing an array and you want to push new items to it. Let's do this.
The thing is, when you call fs.appendFile, you're only writing to the end of the file. You're not following JSON format by doing so.
You need to do this:
Read file content.
Parse JSON text into an object.
Update the object in memory.
Write object in JSON format back to the file system.
I'll call the synchronous methods for simplicity's sake, but you should be able to convert it to async quite easily.
const fs = require('fs')
const path = __dirname + '/db.json'
// Reading items from the file system
const jsonData = fs.readFileSync(path)
const items = JSON.parse(jsonData)
// Add new item to the item list
items.push(newItem)
// Writing back to the file system
const newJsonString = JSON.stringify(items)
fs.writeFileSync(path, newJsonString)
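For reference, the same flow with the promise-based fs API (Node 10+) could be a sketch like this, reusing the path constant above:
const fsp = require('fs').promises

async function addItem(newItem) {
  const jsonData = await fsp.readFile(path, 'utf8')
  const items = JSON.parse(jsonData)
  items.push(newItem)
  await fsp.writeFile(path, JSON.stringify(items))
}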

Google Script - importing file data from a file name listed in a spreadsheet

Very new to this, but have been pretty lucky with my tinkering in the past. Really stuck on this one, however.
Looking to import file data to specific sheets. With a specific file name, I've been successful with this script:
function importVauto() {
  var app = SpreadsheetApp;
  var import1 = app.getActiveSpreadsheet().getSheetByName('Import1');
  var file = DriveApp.getFilesByName("samplefilename").next();
  var csvData = Utilities.parseCsv(file.getBlob().getDataAsString());
  import1.getRange(1, 1, csvData.length, csvData[0].length).setValues(csvData);
}
The challenge is, the file names change weekly. My thought was to have the required file names listed in a spreadsheet, reference the cell containing the file name, and import the required file. Tried this:
function importVauto() {
  var app = SpreadsheetApp;
  var import1 = app.getActiveSpreadsheet().getSheetByName('Import1');
  var data = app.getActiveSpreadsheet().getSheetByName('Data');
  var name = data.getRange("C24");
  name.getValues();
  Logger.log(name);
  var file = DriveApp.getFilesByName(name).next();
  var csvData = Utilities.parseCsv(file.getBlob().getDataAsString());
  import1.getRange(1, 1, csvData.length, csvData[0].length).setValues(csvData);
}
Appears there is a problem with using the var 'name' as the file name for DriveApp.getFilesByName(), but chances are, there's a ton I'm missing (don't know what you don't know).
Hopefully this question makes sense (and even more hopefully, there is a simple solution). Again, very new to this. Appreciate any feedback.
How about this modification?
Modification points :
getValues() returns a 2-dimensional array. If you want to retrieve the value of the single cell "C24", you can use getValue(). In this case, you can directly retrieve the string value of the cell.
Modified script :
From :
name.getValues();
To :
name = name.getValue();
OR
name = name.getValues()[0][0];
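Putting it together, the whole modified function would be something like this:
function importVauto() {
  var app = SpreadsheetApp;
  var import1 = app.getActiveSpreadsheet().getSheetByName('Import1');
  var data = app.getActiveSpreadsheet().getSheetByName('Data');
  var name = data.getRange("C24").getValue(); // the string value of the cell
  var file = DriveApp.getFilesByName(name).next();
  var csvData = Utilities.parseCsv(file.getBlob().getDataAsString());
  import1.getRange(1, 1, csvData.length, csvData[0].length).setValues(csvData);
}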
Note :
When you retrieve a file using DriveApp.getFilesByName(name).next(), be aware that if there are several files with the same filename, this retrieves only one of them. Please be careful about this.
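If you want to detect that case, a small check like this sketch works:
var files = DriveApp.getFilesByName(name);
var file = files.next();
if (files.hasNext()) {
  Logger.log('Warning: more than one file is named ' + name);
}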
References :
getValues()
getValue()
If this was not what you want, please tell me. I would like to modify my answer.

Appending new Google Sheets Data into BigQuery Table

So I'm new to all of this, both BigQuery and Apps Script (and coding in general..), and I'm learning as I go, so maybe to some my question may seem stupid. Please just hear me out.
I have created a script that loads 10 of the most recent data points into a Google Sheets doc from one of my BigQuery tables. Now, when I manually add new data points to the bottom of this table, I would like to have a load script run that uploads that new data back into BigQuery and appends it to my original table. I read somewhere that just by inserting a new table the data is automatically appended if the table mentioned already exists. However, I haven't tested that part yet, since I get stuck on an error earlier in the process.
Below is the load script I have loosely copied from https://developers.google.com/apps-script/advanced/bigquery
function loadCsv() {
  var projectId = 'XX';
  var datasetId = 'YY';
  var tableId = 'daily_stats_testing';
  var file = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = file.getActiveSheet().getRange("A12:AK10000").getValues();
  var data = sheet.getBlob().setContentType('application/octet-stream'); // this is the line that fails: getValues() returns a 2-D array, which has no getBlob()
  var job = {
    configuration: {
      load: {
        destinationTable: {
          projectId: projectId,
          datasetId: datasetId,
          tableId: tableId
        },
        skipLeadingRows: 1
      }
    }
  };
  job = BigQuery.Jobs.insert(job, projectId, data);
  var msg = "Load started. Check status of it here: " +
    "https://bigquery.cloud.google.com/jobs/" + projectId;
  Logger.log(msg);
  Browser.msgBox(msg);
  return;
}
Now the error I get (in a variety of forms, since I've been testing stuff out) is that the BigQuery.Jobs function only accepts Blob data, and that the data from my current sheet (with range A12 marking the first manually entered row of data) is not a Blob-compatible data set.
Is there any way (any function?) I can use that will directly convert the selected data range and make it Blob compatible?
Does anyone have any recommendation on how to do this more efficiently?
Unfortunately the script has to load directly out of the current, open spreadsheet sheet, since it is part of a larger script I will be running. Hopefully this isn't too much of a hindrance!
Thanks in advance for the help :)
Is there any way (any function?) I can use that will directly convert the selected data range and make it Blob compatible?
There is actually a function that converts objects into the Blob type, namely newBlob(data).
To test this I got a range from my spreadsheet and used it.
function blobTest() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var range = sheet.getRange("A1:C1");
  Logger.log(range); // here, range is of type Range
  var blob = Utilities.newBlob(range);
  Logger.log(blob); // here, it is now of type Blob
}
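That said, wrapping the Range object itself probably isn't what the BigQuery load job expects; it most likely needs the cell values serialized to text first. A sketch of that, assuming a simple range with no commas inside cells, might be:
function rangeToCsvBlob() {
  var values = SpreadsheetApp.getActiveSheet().getRange("A12:AK10000").getValues();
  var csv = values.map(function(row) { return row.join(','); }).join('\n');
  return Utilities.newBlob(csv, 'application/octet-stream'); // a blob the load job can accept
}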

How would you create a JSON file based on different cols of your database (PostgreSQL/SQLAlchemy) in Flask to use it in your JavaScript code?

Would this happen in views.py? And if so, how would the file be created?
For example in app/models.py, some of the columns include:
class Company(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    permalink = db.Column(db.String(255))
    name = db.Column(db.String(255))
    homepage_url = db.Column(db.String(255))
    description = db.Column(db.Text)
    founded_on_day = db.Column(db.Integer)
    founded_on_month = db.Column(db.Integer)
    founded_on_year = db.Column(db.Integer)
I want to create a JSON file with specific fields that I'd select for 'company' objects and access it in my html/Javascript code such as:
d3.json("data.json", function (data) {
And then just selected specific fields from there to render.
How would the file be created based on the database's information?
The views are used to perform actions and/or serve data in response to an HTTP request sent to the web server. If you want a specific JSON file to be created, that is more suitably done from a script called regularly (with cron, for example).
On the other hand, I don't think you need to create a JSON file at all. It would be more suitable for you to create a view that is serving the URL used as the first argument to d3.json(). You could have something like this:
from flask import jsonify

@app.route('/_get_companies')
def get_companies_json():
    companies = {}
    for c in session.query(Company).all():
        companies[c.id] = {
            'name': c.name,
            'homepage_url': c.homepage_url,
        }
    return jsonify(companies)
Of course the contents of the returned JSON object is dependent on what you are trying to do, but you should get the idea.
Then in your JS file use:
d3.json('http://domain.com/_get_companies', function(data) {
  // Process your data
});

sp.js get all lists by content type

I have combed through the sp namespace docs and not found much to go on.
I found this snippet from http://www.c-sharpcorner.com/Blogs/12134/how-to-get-the-list-content-types-using-csom-in-sharepoint-2.aspx
//// String Variable to store the siteURL
string siteURL = "http://c4968397007/";
//// Get the context for the SharePoint Site to access the data
ClientContext clientContext = new ClientContext(siteURL);
//// Get the content type collection for the list "Custom"
ContentTypeCollection contentTypeColl = clientContext.Web.Lists.GetByTitle("Custom").ContentTypes;
clientContext.Load(contentTypeColl);
clientContext.ExecuteQuery();
//// Display the Content Type name
foreach (ContentType ct in contentTypeColl)
{
    Console.WriteLine(ct.Name);
}
Console.ReadLine();
which will get a certain list's content types.
My thought is get all lists, then get all their content types, then use their id/title to query the lists for data.
It seems like a ton of work to do in a display template.
Am I on the right path, or is there something I'm missing? Any SP wiz out there care to weigh in on the new search/JS architecture?
You may want to use a JavaScript library like SharepointPlus or the popular SPServices.
I think the syntax of SharepointPlus is simpler, and the code would be something like:
$SP().lists(function(list) {
  for (var i = 0; i < list.length; i++) {
    // list[i]['Name'] contains the name of the list
    $SP().list(list[i]['Name']).get(/* something */);
  }
});
You said something about the content types. So you may also want to look at the info() function and check the field with the name "ContentTypeId".
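Very roughly, combining the two might look like this sketch (I'm not sure of the exact shape of the info() callback data, so treat the field lookup as an assumption to verify against the SharepointPlus docs):
$SP().lists(function(list) {
  for (var i = 0; i < list.length; i++) {
    $SP().list(list[i]['Name']).info(function(fields) {
      // look up the entry named "ContentTypeId" in fields here (assumption)
    });
  }
});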
FYI I created this SharepointPlus library.
