JS Array not recognized as an array in Google Apps Script - javascript

I am writing what should be a simple HTML form for users to make a reservation. However, I'm having difficulty doing something I've done many times before: getting data from a Google Sheet and presenting it in an HTML form. Code snippets:
**Server side**
// Getting data from the spreadsheet; expecting it to be an array of arrays.
function getMembers(){
  const url = "https://my gsheet url";
  const ss = SpreadsheetApp.openByUrl(url);
  const ws = ss.getSheetByName("Members");
  const row = ws.getLastRow();
  const memberData = ws.getRange(2, 1, row - 1, 7).getValues();
  const array = JSON.stringify(memberData); // EDIT... helps get the data to the client side
  return array; // this data is to be passed to the browser when called. Type error?
}
// Logger.log(memberData) // Looks like an array to me...
// [1,"Simmie","Smith","Simmie Smith","2020-01-01T05:00:00.000Z","temp","53120-1"],
// [2,"Fred","Williams","Fred Williams","2020-01-01T05:00:00.000Z","temp","53120-2"],
// [3,"Carleton","Johnson","Carleton Johnson","2020-01-01T05:00:00.000Z","temp","53120-3"],
// [4,"Caroll","Williams","Caroll Williams","2020-01-01T05:00:00.000Z","temp","53120-4"],
// [5,"Gabby","Williams","Gabby Williams","2020-01-01T05:00:00.000Z","temp","53120-5"]
// EDIT: HtmlService code
function doGet(e){
  return HtmlService.createTemplateFromFile('Reservations HTML')
    .evaluate()
    .setXFrameOptionsMode(HtmlService.XFrameOptionsMode.ALLOWALL)
    .setSandboxMode(HtmlService.SandboxMode.IFRAME);
}
I call the above server function when the page loads so that member names can be presented in a select dropdown, with other info from the array used elsewhere.
**Browser JS**
function getMemberData(){
  google.script.run.withSuccessHandler(populateMembersDD).getMembers();
}
function populateMembersDD(data){
  console.log(data); // undefined - same result in Firefox or Chrome
  memberInfo = data.filter(function(row){ return true; });
}
I'm not getting any further than that because the data is not reaching the browser in a format I can apply the necessary methods to in order to get it into the dropdown. I have tried various methods (slice, filter, etc.) to convert the data into an array, but none have worked. I suspect a type error, as though the server function is not producing an object in array format, but I cannot figure out why. I have read many similar questions and answers on Stack Overflow, but none have solved my problem. Help is much appreciated, as I am 24 hours into experimenting with solutions. I have many other apps where this basic code pattern works totally fine...
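For reference, a minimal sketch of the round trip that typically works for this pattern, since Date objects (like the membership dates above) are not legal google.script.run return values: stringify on the server, parse in the success handler, then build the dropdown. The element id memberDD and the column positions are assumptions for illustration, not from the original post.

function getMembers() {
  // Server side: return a JSON string so Date cells survive the transfer.
  const ws = SpreadsheetApp.openByUrl("https://my gsheet url").getSheetByName("Members");
  const memberData = ws.getRange(2, 1, ws.getLastRow() - 1, 7).getValues();
  return JSON.stringify(memberData); // strings always transfer cleanly
}

function populateMembersDD(json) {
  // Browser side: parse the string back into an array of arrays.
  const rows = JSON.parse(json);
  const dd = document.getElementById('memberDD'); // hypothetical <select> id
  rows.forEach(function (row) {
    const opt = document.createElement('option');
    opt.value = row[0];       // member number (column A, assumed)
    opt.textContent = row[3]; // full name (column D, assumed)
    dd.appendChild(opt);
  });
}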

Related

Appending new Google Sheets Data into BigQuery Table

So I'm new to all of this, both BigQuery and Apps Script (and coding in general...), and I'm learning as I go, so to some my question may seem stupid. Please just hear me out.
I have created a script that loads the 10 most recent data points into a Google Sheets doc from one of my BigQuery tables. Now, when I manually add new data points to the bottom of this table, I would like to have a load script run that uploads that new data back into BigQuery and appends it to my original table. I read somewhere that just by inserting a new table the data is automatically appended if the named table already exists. However, I haven't tested that part yet, since I get stuck on an error earlier up the line...
Below is the load script I have loosely copied from https://developers.google.com/apps-script/advanced/bigquery
function loadCsv() {
  var projectId = 'XX';
  var datasetId = 'YY';
  var tableId = 'daily_stats_testing';
  var file = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = file.getActiveSheet().getRange("A12:AK10000").getValues();
  // This is where it fails: getValues() returns a 2-D array, which has no getBlob().
  var data = sheet.getBlob().setContentType('application/octet-stream');
  var job = {
    configuration: {
      load: {
        destinationTable: {
          projectId: projectId,
          datasetId: datasetId,
          tableId: tableId
        },
        skipLeadingRows: 1
      }
    }
  };
  job = BigQuery.Jobs.insert(job, projectId, data);
  var msg = Utilities.formatString(
      "Load started. Check its status here: https://bigquery.cloud.google.com/jobs/%s",
      projectId);
  Logger.log(msg);
  Browser.msgBox(msg);
}
Now the error I get (in a variety of forms, since I've been testing stuff out) is that the BigQuery.Jobs function only accepts Blob data, and that the data from my current sheet (with range A12 marking the first manually entered row of data) is not a Blob-recognizable data set.
Is there any way (any function?) I can use that will directly convert the selected data range and make it Blob compatible?
Does anyone have any recommendation on how to do this more efficiently?
Unfortunately the script has to load directly out of the current, open Spreadsheet sheet, since it is part of a larger script I will be running. Hopefully this isn't too much of a hinder!
Thanks in advance for the help :)
Is there any way (any function?) I can use that will directly convert the selected data range and make it Blob compatible?
There is actually a function that converts objects into the Blob type, namely Utilities.newBlob(data).
To test this I got a range from my spreadsheet and used it.
function blobTest(){
  var sheet = SpreadsheetApp.getActiveSheet();
  var range = sheet.getRange("A1:C1");
  Logger.log(range); // here, the range data is of type Range object
  var blob = Utilities.newBlob(range);
  Logger.log(blob);  // here, the range data is now of type Blob object
}
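One caveat: Utilities.newBlob expects a string (or byte array) rather than a Range object, so in practice the values are usually flattened into CSV text first. A minimal sketch under that assumption (naive joining; quoting of embedded commas is left out):

function rangeToCsvBlob() {
  var sheet = SpreadsheetApp.getActiveSheet();
  // 2-D array of cell values from the manually entered rows
  var values = sheet.getRange("A12:AK10000").getValues();
  var csv = values
      .filter(function (row) { return row.join('') !== ''; }) // drop empty rows
      .map(function (row) { return row.join(','); })           // cells -> CSV line
      .join('\n');                                             // lines -> file
  // Wrap the CSV text in a Blob that BigQuery.Jobs.insert will accept.
  return Utilities.newBlob(csv, 'application/octet-stream', 'upload.csv');
}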

Querying a parse table and eagerly fetching Relations for matching

Currently, I have a table named Appointments; on appointments, I have a Relation of Clients.
In searching the Parse documentation, I haven't found much help on how to eagerly fetch the whole child collection of Clients when retrieving Appointments. I have attempted a standard query, which looked like this:
var Appointment = Parse.Object.extend("Appointment");
var query = new Parse.Query(Appointment);
query.equalTo("User", Parse.User.current());
query.include('Rate'); // a pointer object
query.find().then(function(appointments){
  let appointmentItems = [];
  for(var i = 0; i < appointments.length; i++){
    var appt = appointments[i];
    var clientRelation = appt.relation('Client');
    clientRelation.query().find().then(function(clients){
      appointmentItems.push({
        objectId: appt.id,
        startDate: appt.get("Start"),
        endDate: appt.get("End"),
        clients: clients, // should be a Parse object collection
        rate: appt.get("Rate"),
        type: appt.get("Type"),
        notes: appt.get("Notes"),
        scheduledDate: appt.get("ScheduledDate"),
        confirmed: appt.get("Confirmed"),
        parseAppointment: appt
      }); // add to appointmentItems
    }); // query.find
  }
});
This does not return a correct Clients collection.
I then switched over to attempting this in Cloud Code; as I assumed the issue was on my side for whatever reason, I thought I'd create a function that did the same thing, only on their server, to reduce the number of network calls.
Here is what that function was defined as:
Parse.Cloud.define("GetAllAppointmentsWithClients", function(request, response){
  var Appointment = Parse.Object.extend("Appointment");
  var query = new Parse.Query(Appointment);
  query.equalTo("User", request.user);
  query.include('Rate');
  query.find().then(function(appointments){
    // for each appointment, get all client items
    var apptItems = appointments.map(function(appointment){
      var ClientRelation = appointment.get("Clients");
      console.log(ClientRelation);
      return {
        objectId: appointment.id,
        startDate: appointment.get("Start"),
        endDate: appointment.get("End"),
        clients: ClientRelation.query().find(),
        rate: appointment.get("Rate"),
        type: appointment.get("Type"),
        notes: appointment.get("Notes"),
        scheduledDate: appointment.get("ScheduledDate"),
        confirmed: appointment.get("Confirmed"),
        parseAppointment: appointment
      };
    });
    console.log('apptItems count is ' + apptItems.length);
    response.success(apptItems);
  });
});
and the resulting "clients" value looks nothing like the actual object class:
clients: {_rejected: false, _rejectedCallbacks: [], _resolved: false, _resolvedCallbacks: []}
When I browse the data, I see the related objects just fine. The fact that Parse cannot eagerly fetch relational queries within the same call seems a bit odd coming from other data providers, but at this point I'd take the overhead of additional calls if the data was retrieved properly.
Any help would be beneficial, thank you.
Well, in your Cloud Code example, ClientRelation.query().find() will return a Parse.Promise, so the output clients: {_rejected: false, _rejectedCallbacks: [], _resolved: false, _resolvedCallbacks: []} makes sense - that's what a promise looks like in the console. ClientRelation.query().find() is an async call, so your response.success(apptItems) is going to happen before it finishes anyway.
Your first example looks good as far as I can see, though. What do you see as your clients response if you just output it like the following? Are you sure you're getting an array of Parse.Objects? Are you getting an empty []? (Meaning, do the objects with client relations you're querying actually have clients added?)
clientRelation.query().find().then(function(clients){
  console.log(clients); // Check what you're actually getting here.
});
Also, one more helpful thing: are you going to have more than 100 clients in any given appointment object? Parse.Relation is really meant for very large related collections of other objects. If you know that your appointments aren't going to have more than 100 (rule of thumb) related objects, a much easier way of doing this is to store your Client objects in an Array within your Appointment objects.
With a Parse.Relation, you can't get around making that second query to get the related collection (client or cloud). But with an Array datatype you could do the following.
var query = new Parse.Query(Appointment);
query.equalTo("User", request.user);
query.include('Rate');
query.include('Clients'); // Assumes the Clients column is now an Array of Client Parse.Objects
query.find().then(function(appointments){
  // You'll find Client Parse.Objects already nested and provided for you in the appointments.
  console.log(appointments[0].get('Clients'));
});
I ended up solving this using promises in series.
The final code looked something like this:
var Appointment = Parse.Object.extend("Appointment");
var query = new Parse.Query(Appointment);
query.equalTo("User", Parse.User.current());
query.include('Rate');
var appointmentItems = [];
query.find().then(function(appointments){
  var promise = Parse.Promise.as();
  _.each(appointments, function(appointment){
    promise = promise.then(function(){
      var clientRelation = appointment.relation('Clients');
      return clientRelation.query().find().then(function(clients){
        appointmentItems.push({
          //...object details
        });
      });
    });
  });
  return promise;
}).then(function(result){
  // return/use appointmentItems with the sub-collection of clients fetched within the subquery.
});
You can apparently do this in parallel, but that was not needed for me, as the query I'm using seems to return instantaneously. I got rid of the Cloud Code, as it didn't seem to provide any performance boost. I will say, the fact that you cannot debug Cloud Code is truly limiting, and I wasted a bit of time waiting for console.log statements to show themselves in the log of the Cloud Code panel. Overall, the Parse.Promise object was the key to getting this to work properly.
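For completeness, a sketch of the parallel variant mentioned above, assuming a Parse JS SDK version whose Parse.Promise.when accepts an array of promises (older releases take varargs):

query.find().then(function(appointments) {
  // Kick off all relation queries at once instead of chaining them in series.
  var promises = appointments.map(function(appointment) {
    return appointment.relation('Clients').query().find().then(function(clients) {
      return {
        objectId: appointment.id,
        clients: clients
        // ...remaining object details as in the series version
      };
    });
  });
  return Parse.Promise.when(promises);
}).then(function(appointmentItems) {
  // appointmentItems holds one entry per appointment, clients included.
});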

SAPUI5 Create OData entity with dates - generates incorrect request payload that ends in CX_SXML_PARSE_ERROR

We are trying to create an entity that has date attributes via an OData service. The backend is an SAP system. This entity has only 3 key attributes, plus a bunch of other attributes. We have identified that the dates in the keys are the root cause of the problem.
Keys:
Pernr type string,
Begda type datetime,
Endda type datetime.
The code below (which does not work) has been severely simplified while troubleshooting the issue. At the moment, it reads an entity from an entity set and immediately tries to create one with exactly the same data.
Code:
var oODataModel = new sap.ui.model.odata.ODataModel("/sap/opu/odata/sap/Z_PERSONAL_DATA_SRV/");
// Test entity to be saved
var entity = null;
// Handler for read error
var handleReadE = function(oEvent){
  alert("error");
};
// Handler for read success
var handleRead = function(oEvent){
  // Get the data read from the backend
  entity = oEvent.results[0];
  // Try to create a new entity with the same data
  oODataModel.create('/PersDataSet', entity, null, function(){
    alert("Create successful");
  }, function(oError){
    alert("Create failed", oError);
  });
};
oODataModel.read("/PersDataSet", null, [], true, handleRead, handleReadE);
In the gateway error log, an XML parsing error appears. In this log, we can see the request data, and it can be seen that the dates are transported with String types. These dates are defined in the service as DateTimes, so the request is rejected.
Example:
<m:properties>
<d:Pernr m:type="Edm.String">00000001</d:Pernr>
<d:Endda m:type="Edm.String">9999-12-31T00:00:00</d:Endda>
<d:Begda m:type="Edm.String">1979-05-23T00:00:00</d:Begda>
When the entity is read, the backend does not send any type information. It sends something like the following example:
<m:properties>
<d:Pernr>72010459</d:Pernr>
<d:Endda>9999-12-31T00:00:00</d:Endda>
<d:Begda>1876-07-21T00:00:00</d:Begda>
And, indeed, if we try to save the same info without the type="..." attributes, it works. So the problem is the incorrect types that ODataModel.create adds to the XML.
My questions are:
Can I tell ODataModel.create not to add this type info? It is not doing a good job of inferring the types.
Can anyone share an example of reading and writing dates through OData?
Thank you very much in advance.
The data returned from oODataModel.read is raw; before you post, you need to parse it:
var handleRead = function(oEvent){
  // Get the data read from the backend
  entity = oEvent.results[0];
  var newEntity = jQuery.extend({}, entity);
  delete newEntity.__metadata;              // drop the metadata (and its wrong types)
  newEntity.Begda = new Date(entity.Begda); // parse the date strings into JS Dates
  newEntity.Endda = new Date(entity.Endda);
  // Try to create a new entity with the same data
  oODataModel.create('/PersDataSet', newEntity, null, function(){
    alert("Create successful");
  }, function(oError){
    alert("Create failed", oError);
  });
};
Why not use JSON instead of XML?
Thanks all for the help.
We got this working accounting for the following:
The problem of the wrong types appended to the attributes comes from the read itself. The object returned by read has a __metadata attribute describing the values, and in this object the dates are given type Edm.String, even though the service says they are DateTime. To me, this is a bug in the .read function.
When the same object is used to save, create sees the __metadata on the entry and uses those values, producing type Edm.String for the dates. This caused the request to be rejected. Manually changing these __metadata.properties...type entries to Edm.DateTime makes it work.
In the end, we did the following:
- Dates are parsed manually from the OData response, creating a JS Date object from the strings in the format "yyyy-mm-ddT00:00:00", to make them work with control bindings. When we want to save, the reverse is done.
- The object to be created is a new object with only the attributes we care about (no __metadata).
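A small sketch of that manual conversion, under the assumptions above (the Begda/Endda field names and the "yyyy-MM-ddT00:00:00" wire format come from the question; the helper names are illustrative):

// Incoming: strip metadata and turn the OData date strings into JS Dates.
function fromOData(entity) {
  var e = jQuery.extend({}, entity);
  delete e.__metadata;          // avoid the wrongly typed metadata on create
  e.Begda = new Date(e.Begda);  // "1979-05-23T00:00:00" -> Date
  e.Endda = new Date(e.Endda);
  return e;
}

// Outgoing: format a JS Date back into "yyyy-MM-ddT00:00:00" before saving.
function toODataString(d) {
  function pad(n) { return n < 10 ? '0' + n : '' + n; }
  return d.getFullYear() + '-' + pad(d.getMonth() + 1) + '-' +
         pad(d.getDate()) + 'T00:00:00';
}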

Breeze EntityManager.executeQuery with expand - httpResponse data does not match result data?

I'm working on a web app that uses Breeze and Knockout. I have the following function:
function getArticle(id, articleObservable) {
  var query = EntityQuery.from('KbArticles')
    .where("articleId", "==", id)
    .expand("KbArticleType, KbSubject, KbArticleTags");
  return manager.executeQuery(query)
    .then(querySucceeded)
    .fail(queryFailed);

  function querySucceeded(data) {
    if (articleObservable) {
      articleObservable(data.results[0]);
    }
    log('Retrieved [Article] from remote data source', data.results.length, true);
  }
}
A one-to-many relationship exists between the KbArticles entity and the KbArticleTags entity. The goal of this function is to fetch a specific article and the list of tags associated with it. The function executes successfully (and I DO get data for the other two expanded entities, which are one-to-one); however, I get an empty array for KbArticleTags in data.results. Interestingly, the tags are populated in data.httpResponse.
So, it appears that the data is coming over the wire, but it's not making it into the results in the querySucceeded function. I tried to step through the Breeze code to determine how the httpResponse is mapped to the result, but got lost fairly quickly (I'm a JavaScript newb). Does anyone have any troubleshooting tips for figuring out why the expanded entity doesn't show in the results but is returned in the httpResponse? Or am I trying to do something that isn't supported?
OK, so for whatever reason (I probably deleted it one day while testing and didn't add it back), the navigation property (in my Entity Framework diagram) was missing on the KbArticleTag entity. All I had to do was add it back, and everything is working as expected.

How do I get CSV data into NetSuite?

I've got an update to my question.
What I really wanted to know was this:
How do I get CSV data into NetSuite?
Well, it seems I use the CSV import tool to create a mapping, then use this call to import the CSV: nlapiSubmitCSVImport(nlobjCSVImport).
Now my question is: How do I iterate through the object?!
That gets me halfway: I get the CSV data, but I can't seem to find out how to iterate through it in order to manipulate the data. That is, of course, the whole point of a scheduled script.
This is really driving me mad.
@Robert H
I can think of a million reasons why you'd want to import data from a CSV - billing, for instance, or the various reports on data any company keeps. I wouldn't want to keep this in the file cabinet, nor would I really want to keep the file at all. I just want the data: I want to manipulate it, and I want to enter it.
Solution steps:
1. To upload a CSV file, we have to use a Suitelet script.
(Note: the file field type is available only for Suitelets and will appear on the main tab of the Suitelet page. Setting the field type to file adds a file-upload widget to the page.)
var fileField = form.addField('custpage_file', 'file', 'Select CSV File');
var id = nlapiSubmitFile(file);
2. Let's prepare to call a RESTlet script and pass the file id to it.
var recordObj = new Object();
recordObj.fileId = fileId;
// Format input for RESTlets for the JSON content type
var recordText = JSON.stringify(recordObj); // stringifying JSON
// Setting up the URL of the RESTlet
var url = 'https://rest.na1.netsuite.com/app/site/hosting/restlet.nl?script=108&deploy=1';
// Setting up the headers for passing the credentials
var headers = new Array();
headers['Content-Type'] = 'application/json';
headers['Authorization'] = 'NLAuth nlauth_email=amit.kumar2@mindfiresolutions.com, nlauth_signature=*password*, nlauth_account=TSTDRV****, nlauth_role=3';
(Note: nlapiCreateCSVImport is only supported for bundle installation scripts, scheduled scripts, and RESTlets.)
Let's call the Restlet using nlapiRequestURL:
// Calling Restlet
var output = nlapiRequestURL(url, recordText, headers, null, "POST");
3. Create a mapping using the Import CSV Records tool, available at Setup > Import/Export > Import CSV Records.
4. Inside the RESTlet script, fetch the file id from the RESTlet parameter. Use the nlapiCreateCSVImport() API and set its mapping to the mapping id created in step 3. Set the CSV file using the setPrimaryFile() function.
var primaryFile = nlapiLoadFile(datain.fileId);
var job = nlapiCreateCSVImport();
job.setMapping(mappingFileId); // Set the mapping
// Set File
job.setPrimaryFile(primaryFile.getValue()); // Fetches the content of the file and sets it.
5. Submit using nlapiSubmitCSVImport().
nlapiSubmitCSVImport(job); // We are done
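Putting steps 4 and 5 together, here is a sketch of how they might sit inside the RESTlet's POST entry point (the function name and the saved-mapping id are assumptions for illustration):

// RESTlet entry point (SuiteScript 1.0); 'datain' is the parsed JSON body
// sent by nlapiRequestURL in step 2.
function postCSV(datain) {
  var primaryFile = nlapiLoadFile(datain.fileId); // load the uploaded CSV by id
  var job = nlapiCreateCSVImport();
  job.setMapping('CUSTIMPORT_my_mapping');        // hypothetical saved-mapping id
  job.setPrimaryFile(primaryFile.getValue());     // pass the file's contents
  var jobId = nlapiSubmitCSVImport(job);          // queue the import job
  return { jobId: jobId };
}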
There is another way we can get around this, although it is neither preferable nor something I would suggest, as it consumes a lot of API usage if you have a large number of records in your CSV file.
Let's say we don't want to use the nlapiCreateCSVImport API, and continue from step 4.
Just fetch the file id as we did earlier, load the file, and get its contents:
var fileContent = primaryFile.getValue();
Split the lines of the file, then subsequently split the words and store the values into separate arrays.
var splitLine = fileContent.split("\n"); // Splitting the file on the basis of lines.
for (var lines = 1, count = 0; lines < splitLine.length; lines++) {
  var words = splitLine[lines].split(","); // words stores all the words on a line
  for (var word = 0; word < words.length; word++) {
    nlapiLogExecution("DEBUG", "Words:", words[word]);
  }
}
Note: Make sure you don't have an additional blank line in your CSV file.
Finally, create the record and set its field values from the arrays we built above.
var myRec = nlapiCreateRecord('cashsale'); // Here you create the record of your choice
myRec.setFieldValue('entity', arrCustomerId[i]); // For example, arrCustomerId is an array of customer IDs.
var submitRec = nlapiSubmitRecord(myRec); // and we are done
Fellow NetSuite user here. I've been using SuiteScripts for a while now but never saw the nlobjCSVImport object or nlapiSubmitCSVImport. I looked in the documentation: it shows up, but there is no page describing the details. Care to share where you got the doc from?
With the doc for the CSVImport object, I might be able to provide some more help.
P.S. I tried posting this message as a comment, but the "Add comment" link didn't show up for some reason. Still new to SOF.
CSV to JSON: to convert a CSV file to a JSON object or datatable, see https://code.google.com/p/jquery-csv/.
If you know the structure of the CSV file, just do a for loop and map the fields with the corresponding setFieldValue calls. It should be pretty straightforward; a sketch follows below.
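A minimal sketch of that loop, assuming SuiteScript 1.0, the fileContent variable from the earlier steps, and a hypothetical two-column CSV (customer internal id, memo); the column layout and record type are illustrative, not from the original answers:

var lines = fileContent.split("\n");
for (var i = 1; i < lines.length; i++) {    // skip the header row
  if (lines[i] === '') { continue; }        // skip trailing blank lines
  var fields = lines[i].split(",");         // naive split; quoted commas not handled
  var rec = nlapiCreateRecord('cashsale');  // record type is an example
  rec.setFieldValue('entity', fields[0]);   // column 1 -> customer internal id
  rec.setFieldValue('memo', fields[1]);     // column 2 -> memo text
  nlapiSubmitRecord(rec);                   // one record per CSV line
}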
