How to increase the speed of Google Docs Apps Script - JavaScript

I have a Google Apps Script that fetches data from a spreadsheet. The data is fetched pretty slowly, I think because of the amount of data. The first call of this function takes up to a second:
sheet.getRange(row, col).getValue()
I think there must be a way to create global variables that are available the whole time the spreadsheet is open. I would store the whole spreadsheet on open and access it from memory, which should be much faster. But I could not find this option.

You can use sheet.getDataRange().getValues(); it returns a 2D array containing all values in the sheet up to the last row and column containing data.
For example, if you have a sheet with data in cells A1:B10 and then data in cell E20, the .getDataRange() method will get the range A1:E20, including empty cells.
You can then use loops or filters to access or modify the arrays.
Without more information or knowing what exactly you are trying to achieve that is the best method I can give you.
NOTE: Storing the whole spreadsheet onOpen() is not a very good idea because any changes to the spreadsheet will not be picked up unless you refresh or reopen the spreadsheet.
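As a minimal sketch of the pattern, the Apps Script call that produces the cached 2D array is shown as a comment, so the lookup logic itself is plain JavaScript and runs anywhere; the sample values and coordinates are hypothetical:

```javascript
// In Apps Script you would fetch the array once:
//   var values = SpreadsheetApp.getActiveSheet().getDataRange().getValues();
// Every subsequent lookup is then an in-memory array access instead of a
// slow getRange(row, col).getValue() round trip to the spreadsheet.

function cellAt(values, row, col) {
  // row and col are 1-based, matching getRange(row, col)
  return values[row - 1][col - 1];
}

// Example with a cached 3x2 block of values:
var values = [
  ["Name", "Score"],
  ["Alice", 42],
  ["Bob", 17]
];

console.log(cellAt(values, 2, 2)); // 42
```

This avoids the per-cell API round trip, at the cost of the staleness issue noted above: the array reflects the sheet at the moment `getValues()` ran.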


Dynamically creating tables with IndexedDB

On my web app, the user can request different data lines. Those data lines each have a unique "statusID", say "18133". When the user requests to load the data, it is either loaded from the server or from IndexedDB (that part I'm trying to figure out). In order to make it as fast as possible, I want the index to be the timestamp of the data, as I will request ranges which are smaller than the actual data in the IndexedDB. However, I am trying to figure out how to create the stores and store the data properly. I tried to dynamically create stores every time data with a new id is requested, but creating stores is only possible in "onupgradeneeded". I also thought about storing everything in the same store, but I fear the performance would suffer. I do not really know how to approach this.
What I do know: if you index a value, it means the data is sorted, which is exactly what I want. I don't know if the following is possible, but it would solve my issue too: store everything in the same store, index by "statusID" and index by "timestamp". That way it would be fast too, I guess.
Note that I am talking about many, many data points, possibly in the millions.
You can index by multiple values, allowing you to get all records for a statusID while restricting the timestamp to a range. So I'd go with the single-datastore solution. Performance should not be an issue.
This earlier post may be helpful: Javascript: Searching indexeddb using multiple indexes
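A sketch of the single-store layout (the store and index names are made up for illustration): one store with a compound index on `["statusID", "timestamp"]`, created in `onupgradeneeded`, then queried with a bounded key range. The real IndexedDB calls are shown as comments; the plain-JS filter below demonstrates exactly which records such a range query returns:

```javascript
// In the browser you would set this up once, inside onupgradeneeded:
//   const store = db.createObjectStore("datapoints", { autoIncrement: true });
//   store.createIndex("byStatusAndTime", ["statusID", "timestamp"]);
// and query one statusID over a time range with:
//   const range = IDBKeyRange.bound([18133, t0], [18133, t1]);
//   store.index("byStatusAndTime").getAll(range);

// Plain-JS equivalent of what that compound-index range query yields:
function queryRange(records, statusID, t0, t1) {
  return records
    .filter(r => r.statusID === statusID && r.timestamp >= t0 && r.timestamp <= t1)
    .sort((a, b) => a.timestamp - b.timestamp); // index order: ascending timestamp
}

const records = [
  { statusID: 18133, timestamp: 30, value: "c" },
  { statusID: 18133, timestamp: 10, value: "a" },
  { statusID: 99999, timestamp: 20, value: "x" },
  { statusID: 18133, timestamp: 20, value: "b" }
];

console.log(queryRange(records, 18133, 10, 20).map(r => r.value)); // ["a", "b"]
```

Because the compound index keeps entries sorted by (statusID, timestamp), the range query never scans other statusIDs, which is why one store scales even with millions of records.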

Import data to Google Sheets from JavaScript with automatic column widths (v3 API)

I'm writing JavaScript browser/client-side code that needs to upload table/grid based data to Google Sheets and I'm having trouble setting column widths sensibly. The data is stored in JavaScript as arrays of arrays.
So far I've tried uploading the data formatted as a CSV file to "/upload/drive/v3/files" with an "application/vnd.google-apps.spreadsheet" MIME type set as part of the metadata. This works, in that I can open the file in Drive as a Google Sheet. However, all the columns are the same tiny size. Many of my cells contain around 300 characters of text, so the text is truncated and looks bad.
Ideally the columns would resize to fit the longest entry. If I could set the width by value that would work as well.
I need to upload around 100 tables so they need to be either separate files or separate sheets on a single spreadsheet. I want to reduce the number of API calls I need to make as I have many users.
Ideas I've thought of:
Upload as above then use up extra API calls to change the dimensions of the columns. Not what I want to do given there's about 100 tables.
Is there a way to use the Drive API to create the file then the Sheets API to add the data with column widths specified? I'd rather do everything with one request per table if possible though.
Convert the data in JS to Excel or OpenOffice format with the column widths encoded and upload that. Would this work? It feels complex if there are alternatives, though.
I'm surprised if there's not a way to make Google Sheets resize the columns automatically on upload as it seems a common expectation when importing CSV data.
You want to create a spreadsheet with 100 sheets from CSV files: for each table of CSV data, create a new sheet, import the table, and then automatically adjust the column widths. If my understanding is correct, how about this method?
I think sheets.spreadsheets.batchUpdate can achieve this. sheets.spreadsheets.batchUpdate runs the batched methods sequentially, in the order they appear in the request array. A simple sample for your situation follows.
Flow :
Create a new sheet in Spreadsheet of ### fileId ###.
You can freely choose the sheetId as an Int32 value. This sample uses 1000000001.
Import data to the created sheet.
Please import each row of the CSV data to pasteData. This sample gives 2 rows.
delimiter is ,.
Resize the column width using autoResizeDimensions.
By this, the column width is resized automatically by the length of imported values.
In this sample, this whole flow runs in one API call.
Endpoint
POST https://sheets.googleapis.com/v4/spreadsheets/### fileId ###:batchUpdate
Request body :
{
  "requests": [
    {"addSheet": {"properties": {"title": "sssample", "sheetId": 1000000001}}},
    {"pasteData": {"data": "##########sampletext1##########,##########sampletext2##########,##########sampletext3##########", "delimiter": ",", "coordinate": {"rowIndex": 0, "columnIndex": 0, "sheetId": 1000000001}}},
    {"pasteData": {"data": "##########sampletext4##########,##########sampletext5##########,##########sampletext6##########", "delimiter": ",", "coordinate": {"rowIndex": 1, "columnIndex": 0, "sheetId": 1000000001}}},
    {"autoResizeDimensions": {"dimensions": {"dimension": "COLUMNS", "sheetId": 1000000001, "startIndex": 0, "endIndex": 3}}}
  ]
}
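Since you have about 100 tables already in JavaScript as arrays of arrays, the request body can be built programmatically rather than by hand. A sketch (the sheetId and title are arbitrary values you choose; this variant joins the whole table into one pasteData call, on the assumption that newline-separated rows paste as separate rows, which matches normal CSV paste behavior):

```javascript
// Build a batchUpdate request body for one table, given as a 2D array.
function buildRequests(table, sheetId, title) {
  return {
    requests: [
      { addSheet: { properties: { title: title, sheetId: sheetId } } },
      {
        pasteData: {
          // One CSV blob for the whole table: cells joined by the
          // delimiter, rows joined by newlines.
          data: table.map(row => row.join(",")).join("\n"),
          delimiter: ",",
          coordinate: { rowIndex: 0, columnIndex: 0, sheetId: sheetId }
        }
      },
      {
        autoResizeDimensions: {
          dimensions: {
            dimension: "COLUMNS",
            sheetId: sheetId,
            startIndex: 0,
            endIndex: table[0].length // one past the last column with data
          }
        }
      }
    ]
  };
}

const body = buildRequests([["a", "b"], ["c", "d"]], 1000000001, "sssample");
console.log(JSON.stringify(body));
```

One such body per table, POSTed to the batchUpdate endpoint, keeps it to one API call per table; the requests for several tables could also be concatenated into a single body.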
Note :
Please modify above sample request body for your situation.
If you have already created the sheets, please remove addSheet from the request body.
I couldn't find a documented limit on the number of methods in the request body, so if you run into one in your situation, please split the request.
References :
sheets.spreadsheets.batchUpdate
addSheet
pasteData
autoResizeDimensions
If I misunderstand your question, I'm sorry.
For formatting the cells, check the Basic Formatting guide. It includes instructions on how to specify the column width. Other supported methods are found in the Sheets API reference, which includes creating spreadsheets.

Get Google Sheets cell value

I want to get the value of a cell in Google Sheets and be able to compare it to something later. When searching for the answer on the web, I was directed to many different places, including the Visualization API and Apps Script, but could not find the answer anywhere. I know I have probably missed something, but I am new to this and would appreciate any pointers you could give.
You may want to use the Google Sheets API, wherein you can read and write cell values via the spreadsheets.values collection.
Then, to properly compare cell values, you can create and update conditional formatting within spreadsheets. I suggest that you also check the conditional formatting rules.
Lastly, this thread regarding conditional formatting to compare cell contents might also help.
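As a small illustration of the spreadsheets.values route (the spreadsheet id and range below are placeholders), values.get returns a JSON body whose `values` field is a 2D array, so comparing a single cell reduces to indexing into that array:

```javascript
// A values.get request looks like:
//   GET https://sheets.googleapis.com/v4/spreadsheets/{spreadsheetId}/values/Sheet1!B2
// and its JSON response has this shape:
const response = {
  range: "Sheet1!B2",
  majorDimension: "ROWS",
  values: [["42"]]
};

// Extract the single cell and compare it to an expected number.
// Note: the API returns cell contents as strings, so convert before
// a numeric comparison.
function cellEquals(resp, expected) {
  const cell = resp.values && resp.values[0] ? resp.values[0][0] : undefined;
  return Number(cell) === expected;
}

console.log(cellEquals(response, 42)); // true
```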

How to do a bulk insert while avoiding duplicates in Postgresql

I'm working in nodejs, hosted at Heroku (free plan so far).
I get the data from elsewhere automatically (this part works fine and I get JSON or CSV), and my goal is to add it to a PostgreSQL DB.
While I'm new to DB management and PostgreSQL, I've done my research before posting this. I'm aware that the COPY command exists, and how to INSERT multiple rows without duplicates. But my problem is a mix of both (plus another difficulty).
I hope my question is not breaking the rules.
Short version, I need to :
Add lots of data at once
Never create duplicates
Rename columns between the source data and my table
Long version with details :
The data I collect comes from multiple sources (2 for now, but it will grow) and is quite big (>1000 rows).
I also need to remap the column name to one unified system. What could be called "firstDay" on one source is called "dateBegin" in another, and I want them to be called "startDate" in my table.
If I'm using INSERT, I take care of this myself (in JS) while constructing the query. But maybe COPY could do it in a better way. Also, INSERT seems to have a limit on how much data you can push at one time, so I would need to split my query into several parts and maybe use callbacks or promises to avoid drowning the DB.
And finally, I will update this DB regularly and automatically, and there will be a lot of duplicates. Fortunately, every piece of data has a unique id, and I have made the column that stores this id the PRIMARY KEY of the table. I thought that might eliminate any problem with duplicates, but I may be wrong.
My first version was very ugly (a for loop making a new query on every iteration) and didn't work. I was thinking about doing 1000 rows at a time recursively, waiting for a callback before sending another batch. It seems clunky and time-expensive to do it that way. COPY would be perfect if I could select/rename/remap columns and avoid duplicates, but I've read the documentation and I don't see a way to do that.
Thank you very much, any help is welcome. I'm still learning so please be kind.
I have done this before, using a temporary table to "stage" the data and then an INSERT ... SELECT to move it from staging into the production table.
For populating your staging table you can use bulk INSERTs or COPY.
For example,
BEGIN;

-- Create a staging table using your source column names
-- (the columns and types here are just placeholders).
CREATE TEMPORARY TABLE staging_my_table (blah int, bloo int, firstDay int);

-- Now bulk INSERT or COPY into the staging table from your code, e.g.:
INSERT INTO staging_my_table (blah, bloo, firstDay) VALUES (1, 2, 3), (4, 5, 6);

-- Then move the new rows into your live table, renaming columns as you go
-- and skipping rows that already exist:
INSERT INTO my_table (blah, bloo, startDate)
SELECT blah, bloo, firstDay
FROM staging_my_table staging
WHERE NOT EXISTS (
    SELECT 1
    FROM my_table
    WHERE staging.bloo = my_table.bloo
);
COMMIT;
There are always exceptions, but this might just work for you.
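Since you're constructing the queries in Node anyway, the column remapping and multi-row VALUES list can be built with a small helper. A sketch (the field names come from the question; the `$n` placeholder style matches node-postgres, and the `ON CONFLICT (id) DO NOTHING` clause, available since PostgreSQL 9.5, uses your id PRIMARY KEY to skip duplicates; this builder itself is plain JavaScript):

```javascript
// Map source-specific field names onto the unified column names.
const FIELD_MAP = { firstDay: "startDate", dateBegin: "startDate", id: "id" };

// Build one parameterized multi-row INSERT for a batch of rows.
function buildInsert(table, columns, rows) {
  const values = [];
  const tuples = rows.map((row, i) => {
    const placeholders = columns.map((col, j) => {
      values.push(row[col]);
      return "$" + (i * columns.length + j + 1);
    });
    return "(" + placeholders.join(",") + ")";
  });
  const targets = columns.map(c => FIELD_MAP[c] || c);
  const text = "INSERT INTO " + table + " (" + targets.join(",") + ")" +
               " VALUES " + tuples.join(",") +
               " ON CONFLICT (id) DO NOTHING"; // skip duplicate primary keys
  return { text: text, values: values };
}

const q = buildInsert("my_table", ["id", "firstDay"],
  [{ id: 1, firstDay: "2020-01-01" }, { id: 2, firstDay: "2020-01-02" }]);
console.log(q.text);
// INSERT INTO my_table (id,startDate) VALUES ($1,$2),($3,$4) ON CONFLICT (id) DO NOTHING
console.log(q.values); // [1, "2020-01-01", 2, "2020-01-02"]
```

Feeding batches of ~1000 rows into this builder and awaiting each query keeps the parameter count bounded and avoids flooding the connection.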
Have a good one

Writing to an empty cell in Google Spreadsheets

I would like to have some Javascript code running in a web browser write to a Google Spreadsheet in the user's Google account.
The Javascript API is a bit long-winded, involving lots of round trips, but does seem to work. I can successfully log in, create a new spreadsheet, read values from it, update cells, etc. However, I haven't yet figured out how to write to an empty cell. (By empty, I mean that a cell that has had no value written into it yet.)
The issue is: in order to update the value of a cell, I need to know that cell's id. To get the cell's id, I read the cell feed, which shows me the contents (and id) of all non-empty cells. However it won't show me empty cells, therefore I don't know their id, therefore I can't write to them. I've tried making up my own id based on the naming pattern of the other cells, but that doesn't work.
There must be an easy way round this. What is it?
(Context: what I'm actually doing is trying to store user preferences in the user's Google account. The Spreadsheets API seems to be the only one that's feasible to use from a pure JavaScript environment. If anyone can suggest an alternative that's easier to use than Spreadsheets, I'd be grateful.)
I was about to ask a similar question when I stumbled upon yours and while in the process of compiling my question I found the answer!
http://code.google.com/apis/spreadsheets/data/2.0/reference.html#CellParameters
There is a property called return-empty which is false by default, hence the feed only returns cells that aren't empty. Set it to "true" in your cell query and you will be able to update the value of empty cells.
I've tested this using the .NET API.
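For a pure-JavaScript client, return-empty is just a query parameter on the cells-feed URL. A sketch of building such a query (the key and worksheet id "od6" are placeholders; "od6" is the conventional id of the first worksheet in the 2.0 API):

```javascript
// Build a cells-feed query that includes empty cells, so every cell in
// the requested rectangle gets an id you can write to.
function cellsFeedUrl(key, worksheetId, minRow, maxRow, minCol, maxCol) {
  return "https://spreadsheets.google.com/feeds/cells/" + key + "/" +
         worksheetId + "/private/full" +
         "?min-row=" + minRow + "&max-row=" + maxRow +
         "&min-col=" + minCol + "&max-col=" + maxCol +
         "&return-empty=true"; // the crucial part: include empty cells
}

console.log(cellsFeedUrl("KEY", "od6", 1, 5, 1, 3));
// https://spreadsheets.google.com/feeds/cells/KEY/od6/private/full?min-row=1&max-row=5&min-col=1&max-col=3&return-empty=true
```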
