How to add an if and a date in Google Apps Script - javascript

I am a beginner with Apps Script. Can you help me with my function, please?
I have a bot that sends data to Google Sheets: name, phone, date, and method of communication. I need Google Sheets to write in column C the date when the phone number was received. At the moment I only get the date, but I also need an if/else: if a cell in column C is already filled, keep the date from the last request; otherwise write the current date. I also think I need forEach so the data updates automatically when a phone number is received, and for that I believe I need the doGet(e) trigger from the Google documentation.
(spreadsheet image)
The data comes from a webhook.
Here is my code:
function getDate() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var numbers = ss.getActiveSheet().getRange("B2:B1000");
  let dateGoogle = new Date();
  var rr = ss.getActiveSheet().getRange("C1:C1000").setValue(dateGoogle);
}

Just in case: if you're able to run the function getDate() and all you need is to fill the cells in column C only for rows that have a value in column B, it can be done this way:
function getDate() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var range = ss.getActiveSheet().getDataRange();
  var data = range.getValues();
  let dateGoogle = new Date();
  // Fill column C (index 2) only where it is empty and column B (index 1) has a phone number
  data.forEach(x => x[2] = (x[2] == '' && x[1] != '') ? dateGoogle : x[2]);
  range.setValues(data);
}
If you ask how to run the function getDate() via doGet(), I have no answer.
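For what it's worth, a minimal doGet(e) sketch that just calls getDate() whenever the web app URL is requested might look like this (assuming the script is deployed as a web app the caller can access):
function doGet(e) {
  // Fill the empty cells in column C with the current date
  getDate();
  // A web app handler must return text or HTML output
  return ContentService.createTextOutput('dates updated');
}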

Using a doPost()
function doPost(e) {
  Logger.log(e.postData.contents);
  Logger.log(e.postData.type);
  const ss = SpreadsheetApp.getActive();
  const sh = ss.getSheetByName("Sheet1");
  // Parse the JSON body sent by the bot and build a row in a fixed column order
  let data = JSON.parse(e.postData.contents);
  let row = [];
  ['name', 'phone', 'date', 'method'].forEach(k => row.push(data[k]));
  Logger.log(JSON.stringify(row));
  sh.appendRow(row);
}
The function below simulates what I imagine the bot can do to send data. It sends the data as JSON.
function sendData(obj) {
  const url = ScriptApp.getService().getUrl();
  const params = {
    "contentType": "application/json",
    "payload": JSON.stringify(obj),
    "muteHttpExceptions": true,
    "method": "post",
    "headers": { "Authorization": "Bearer " + ScriptApp.getOAuthToken() }
  };
  UrlFetchApp.fetch(url, params);
}
function saveMyData() {
  sendData({ name: "name", phone: "phone1", date: "date1", method: "post" });
}
You will have to deploy the doPost(e) script as a web app.

Related

No data scraping a table using Apps Script

I'm trying to scrape the first table (FINRA TRACE Bond Market Activity) of this website using Google Apps Script and I'm getting no data.
https://finra-markets.morningstar.com/BondCenter/TRACEMarketAggregateStats.jsp
function myFunction() {
  const url = 'https://finra-markets.morningstar.com/BondCenter/TRACEMarketAggregateStats.jsp';
  const res = UrlFetchApp.fetch(url, { muteHttpExceptions: true }).getContentText();
  const $ = Cheerio.load(res);
  var data = $('table').first().text();
  Logger.log(data);
}
I have also tried fetching this page and I do not get any result either:
https://finra-markets.morningstar.com/transferPage.jsp?path=http%3A%2F%2Fmuni-internal.morningstar.com%2Fpublic%2FMarketBreadth%2FC&_=1655503161665
I can't find a solution on the web, so I'm asking for your help.
Thanks in advance.
This page does a lot of things in the background. First, there is a POST request to https://finra-markets.morningstar.com/finralogin.jsp that initiates the session. Then, XHR requests are made to load the data tables. If you grab the cookie by POSTing to that login page, you can pass it on the desired XHR call, and that will return the table. The date you want to fetch the data for can be set with the date URL parameter. Here is an example:
function fetchFinra() {
  const LOGIN_URL = "https://finra-markets.morningstar.com/finralogin.jsp";
  const DATE = "06/24/2022"; // the desired date
  let opts = {
    method: "POST",
    payload: JSON.stringify({ redirectPage: "/BondCenter/TRACEMarketAggregateStats.jsp" })
  };
  let res = UrlFetchApp.fetch(LOGIN_URL, opts);
  let cookies = res.getAllHeaders()["Set-Cookie"];
  const XHR_URL = `https://finra-markets.morningstar.com/transferPage.jsp?path=http%3A%2F%2Fmuni-internal.morningstar.com%2Fpublic%2FMarketBreadth%2FC&_=${new Date().getTime()}&date=${DATE}`;
  res = UrlFetchApp.fetch(XHR_URL, { headers: { 'Cookie': cookies.join(";") } });
  const $ = Cheerio.load(res.getContentText());
  var data = $('table td').text();
  Logger.log(data);
}
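If you then want the table in a sheet rather than in the log, here is a rough sketch under the same assumptions (the cheeriogs library is available and the fetched HTML contains the data table). writeTableToSheet is a hypothetical helper you would call from fetchFinra() with the loaded $, instead of the Logger.log(data) line:
function writeTableToSheet($) {
  // Turn every table row into an array of trimmed cell texts
  const rows = $('table tr').toArray()
    .map(tr => $(tr).find('th, td').toArray().map(cell => $(cell).text().trim()))
    .filter(r => r.length > 0);
  if (rows.length === 0) return;
  // setValues() needs a rectangular 2D array, so pad short rows
  const width = Math.max.apply(null, rows.map(r => r.length));
  const padded = rows.map(r => r.concat(new Array(width - r.length).fill('')));
  SpreadsheetApp.getActiveSheet().getRange(1, 1, padded.length, width).setValues(padded);
}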

Not getting response back in webapp

I am new to Apps Script and trying to make a simple web app, but I am not getting any return from Apps Script when the web page loads; it returns null instead.
Here is the code:
function loadTest() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  const ssname = ss.getSheetByName('Sales');
  const range = ssname.getDataRange().getValues();
  return range;
}
Client-side code:
document.addEventListener('DOMContentLoaded', (event) => {
  const startTime = new Date();
  google.script.run.withSuccessHandler(function (e) {
    console.log(e);
  }).loadTest();
});
It seems like you're returning prohibited types, such as Date objects, from the server side of the web app, which makes the request fail and gives the client null as the return value.
Try the following modification. Change this:
const range = ssname.getDataRange().getValues();
To this:
const range = ssname.getDataRange().getDisplayValues();
Reference:
Parameters and Return Values
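Alternatively, if the client really needs the values from getValues(), here is a small sketch that serializes the Date objects to ISO strings before returning (strings are a legal return type):
function loadTest() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  const values = ss.getSheetByName('Sales').getDataRange().getValues();
  // Date objects cannot be returned to google.script.run, so convert them to strings
  return values.map(row => row.map(v => (v instanceof Date) ? v.toISOString() : v));
}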

Copy Google Form Input to two different google Sheet tabs

I have a Google Form whose submissions come from students and faculty. The form has a trigger that runs a function every time a response is submitted. I want to copy the submitted information to different tabs: one for staff members and one for student info. I can copy all the information into one tab, but when I try to separate it I can't get the results I need.
Any tips or guidance would be much appreciated.
function copyRowsWithCopyto() {
  var spreadSheet = SpreadsheetApp.getActiveSpreadsheet();
  var sourceSheet = spreadSheet.getSheetByName('Entrega_Dispositivos');
  var sourceRange = sourceSheet.getDataRange();
  var studentSheet = spreadSheet.getSheetByName('Student_Copy');
  var staffSheet = spreadSheet.getSheetByName('Staff_copy');
  var lr = sourceSheet.getLastRow();
  var data = sourceSheet.getRange("A2:AS" + lr).getValues();
  for (var i = 0; i < data.length; i++) {
    var rowData = data[i];
    var status = rowData[2];
    if (status == "Student" && status != "Staff") {
      sourceRange.copyTo(studentSheet.getRange(1, 1));
    } else {
      sourceRange.copyTo(staffSheet.getRange(1, 1));
    }
  }
}
// Installable "On form submit" trigger on the spreadsheet: e.values holds the submitted row
function onFormSubmit(e) {
  const status = e.values[2]; // assumes the Student/Staff answer is in the third column (C) of the responses sheet
  const ss = SpreadsheetApp.getActiveSpreadsheet();
  if (status === 'Student') {
    ss.getSheetByName('Student_Copy').appendRow(e.values);
  } else {
    ss.getSheetByName('Staff_copy').appendRow(e.values);
  }
}
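For completeness, a small sketch of creating that installable trigger from code, in case it has not already been added from the editor's Triggers panel:
function createFormSubmitTrigger() {
  const ss = SpreadsheetApp.getActiveSpreadsheet();
  // Run onFormSubmit(e) every time a form response lands in this spreadsheet
  ScriptApp.newTrigger('onFormSubmit')
    .forSpreadsheet(ss)
    .onFormSubmit()
    .create();
}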
You can simply put two QUERY formulas, one in each tab, to separate students and staff, without any script at all.
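For example, assuming the responses land in the Entrega_Dispositivos tab and the Student/Staff value is in column C, a formula like this in Student_Copy (and the same with 'Staff' in Staff_copy) would do the split:
=QUERY(Entrega_Dispositivos!A:AS, "select * where C = 'Student'", 1)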

How to parse API data into google sheets

I have an API response that I'm trying to place in my spreadsheet.
I managed to figure out how to call the API using the following code, but the whole response goes into the first cell. How do I make each value go into a different cell?
function callCandles() {
  var response = UrlFetchApp.fetch("https://api-pub.bitfinex.com/v2/candles/trade:1D:tBTCUSD/hist?limit=1000&start=1577841154000&end=1606785154000&sort=-1");
  Logger.log(response.getContentText());
  var fact = response.getContentText();
  var sheet = SpreadsheetApp.getActiveSheet();
  sheet.getRange(1, 1).setValue([fact]);
}
This is the correct way to set the values in a sheet. Avoid using for loops with appendRow(); it can be extremely slow even for a relatively small amount of data.
Suggested solution:
function myFunction() {
  var response = UrlFetchApp.fetch("https://api-pub.bitfinex.com/v2/candles/trade:1D:tBTCUSD/hist?limit=1000&start=1577841154000&end=1606785154000&sort=-1");
  var fact = JSON.parse(response.getContentText());
  var sheet = SpreadsheetApp.getActiveSheet();
  sheet.getRange(sheet.getLastRow() + 1, 1, fact.length, fact[0].length).setValues(fact);
}
Try something like this,
var response = UrlFetchApp.fetch("https://api-pub.bitfinex.com/v2/candles/trade:1D:tBTCUSD/hist?limit=1000&start=1577841154000&end=1606785154000&sort=-1");
Logger.log(response.getContentText());
var fact = JSON.parse(response.getContentText());
var sheet = SpreadsheetApp.getActiveSheet();
fact.forEach(row => {
  sheet.appendRow(row);
});
Hope this gives you some idea.

Use GAS to Fetch URL, Then Take Screenshot (or Convert HTML to PDF/JPG)

My Goal
I am, in short, attempting to create a script that visits a list of URLs and takes a screenshot of what is on the URL. For context, my goal is to save a snapshot that shows an image is listed on free stock photo sites (like pexels.com and unsplash.com).
My Code So Far
Here is what I have so far:
function stockDatabasePDF() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getSheetByName('stock db');
  // make the pdf from the sheet
  var data = sheet.getRange('A2:A').getValues(); // these are your URLs to check
  for (var i = 0; i < data.length; i++) {
    if (data[i][0] !== "") { // this means if your data (urls) are NOT blank
      var theurl = data[i][0];
      var token = ScriptApp.getOAuthToken();
      var docurl = UrlFetchApp.fetch(theurl, { headers: { 'Authorization': 'Bearer ' + token } });
      var pdf = docurl.getBlob().setName('name').getAs('application/pdf');
    }
    // save the file to folder on Drive
    var fid = '1eHvWjIYyOeB9MQDwovzyXxx8CEIK5aOt';
    var folder = DriveApp.getFolderById(fid);
    var pdfs = folder.createFile(pdf).getUrl(); // both creates the file and copies the url
  }
}
Just FYI, what is in range A2:A are the URLs of the stock photo sites, for example:
https://www.pexels.com/photo/person-riding-brown-horse-3490257/
https://unsplash.com/photos/Cz_SNZZyHgI
The Issue
This script seems to ALMOST work. But it runs into one of two problems:
I get a 403 response code (forbidden request)
The screenshot is taken, but no images/CSS/etc. are included.
To Close
Any help here would be greatly appreciated. Please let me know if I left anything out or if anyone has any questions. Thank you all!
