I'm using LockService when sending emails, but when multiple users run it at the same time, it throws the error: Service invoked too many times...
Here's the code that produces these failures:
function sendEmail() {
  const ss = SpreadsheetApp.getActiveSpreadsheet();
  var lock = LockService.getScriptLock();
  try {
    lock.waitLock(3000); // wait 3 seconds for another execution to release the lock, then proceed
  } catch (e) {
    Logger.log('Someone has just sent an email. Try it again 3 seconds later.');
    return HtmlService.createHtmlOutput("<b>Server busy, please try again after some time</b>");
    // If this is server-side code called asynchronously, return an error code
    // and display the appropriate message on the client side instead:
    // return "Error: Server busy, try again later... Sorry :("
  }
  /*
    Gets certain data from the current row (email, name, testNo, the form
    response sheet and its values). These are used as criteria below.
  */
  GmailApp.sendEmail(email, "Text", name + " BODY.", { name: 'Display Custom Name as Sender' });
  // Looks for a matching record in another sheet and marks its adjacent column as sent ("Yes")
  for (var n = 0; n < formRespValues.length; n++) {
    if (formRespValues[n][1] == testNo) {
      formRespSheet.getRange('M' + (2 + n)).setValue('Yes');
    }
  }
  SpreadsheetApp.flush(); // applies all pending spreadsheet changes
  lock.releaseLock();
}
As @Tanaike pointed out in the comments, a short timeout in waitLock(timeoutInMillis) or tryLock(timeoutInMillis) may prevent the Lock Service from working as expected.
To ensure correct behaviour, set a timeout that is comfortably greater than the execution time of your script's critical section. If it is shorter than that, the waiting execution gives up before the other one releases the lock and throws, causing errors such as those outlined in the question.
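For example, a minimal sketch (the 30-second value is an assumption; pick something larger than your script normally needs) could look like this:

function sendEmailSafely() {
  var lock = LockService.getScriptLock();
  try {
    lock.waitLock(30000); // wait up to 30 s for other executions, instead of 3 s
  } catch (e) {
    // Could not obtain the lock within 30 s
    return HtmlService.createHtmlOutput("<b>Server busy, please try again later</b>");
  }
  try {
    // ... send the email and update the sheet here ...
    SpreadsheetApp.flush(); // apply pending spreadsheet changes before releasing
  } finally {
    lock.releaseLock(); // always release, even if sending throws
  }
}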
Documentation:
Lock Service
Class Lock
When processing the results of Google Forms with Google Apps Script, accessing the form by …
let formID = FormApp.getActiveForm().getId();
… sometimes fails with an exception like "Form data could not be retrieved." Started manually just a minute later, it works properly.
To handle these errors, I want to catch the exception and retry the call one minute later. I came up with this:
function foo() {
  const maxTries = 3;
  let formID;
  let tries = 1;
  while (true) {
    try {
      formID = FormApp.getActiveForm().getId();
      break;
    } catch (e) {
      console.log("Retrieving form data failed (" + tries + ")");
      if (tries >= maxTries) {
        console.log("Retrieving form data not possible"); // and/or …
        throw e;
      } else {
        tries++;
      }
    }
  }
  // Do things with form stuff
}
How can I insert a 60-second pause between the tries? And I'm not sure anyway whether there is a better way to overcome these errors (in general, or within Google Apps Script).
In this case, why don't you try using Utilities.sleep? This method puts the script to sleep for a set amount of time, given in milliseconds. Please note the maximum is 5 minutes, as the maximum execution time for a script is 6 minutes. You can check the runtime limits in the documentation.
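Applied to the retry loop from the question, a minimal sketch (the structure is slightly rearranged, and the 60-second value comes from the one-minute delay you mention) could look like this:

function foo() {
  const maxTries = 3;
  let formID;
  for (let tries = 1; ; tries++) {
    try {
      formID = FormApp.getActiveForm().getId();
      break; // success, leave the retry loop
    } catch (e) {
      console.log("Retrieving form data failed (" + tries + ")");
      if (tries >= maxTries) {
        throw e; // give up after the last attempt
      }
      Utilities.sleep(60 * 1000); // pause 60 seconds before the next try
    }
  }
  // Do things with form stuff
}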
I have a script that basically takes info from a website for multiple users and puts that info in a Google spreadsheet, with one sheet per user.
I have a function that removes the values of the first line, resizes every column, and then puts the values back:
function adjustColumnsAndIgnoreFirstLine(sheet) {
  Logger.log('--- Adjust columns ---');
  const range = sheet.getRange("1:1");
  // save the title line
  const datas = range.getValues();
  // clear it
  range.clearContent();
  // format without the title line
  var lastColumn = sheet.getLastColumn();
  sheet.autoResizeColumns(1, lastColumn);
  // set width to a minimum
  for (var i = 1; i < 37; i++) { // fixed number of columns
    if (sheet.getColumnWidth(i) < 30) {
      sheet.setColumnWidth(i, 30);
    }
  }
  // put back the titles
  range.setValues(datas);
}
My problem is that the script stops executing in the middle of the function. I still have the "execution, please wait" popup, but in the logs the script stopped as if there was no error (execution finished), with this as the last log:
And, on the Google spreadsheet:
One thing to note is that the problem doesn't come from the script itself, as I do not encounter it on any of my machines, but my client does. My client ran the script in different browsers (Chrome and Edge) and had the same problem, but it stopped at different users (sometimes at the second-to-last user, sometimes at the third-to-last...).
So I'm kind of lost on this problem...
The problem is actually a timeout. Google Apps Script limits the execution time of a script to about 6 minutes.
There are existing issues tracking this.
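One common way to live with that limit (a sketch under assumptions, not something from the script above; getRemainingUsers and processOneUser are hypothetical helpers) is to check the elapsed time and reschedule the rest of the work with a trigger:

function processUsers() {
  const start = Date.now();
  const maxRuntimeMs = 5 * 60 * 1000; // stop well before the ~6-minute limit
  const users = getRemainingUsers();  // hypothetical: returns users not yet processed
  for (const user of users) {
    if (Date.now() - start > maxRuntimeMs) {
      // schedule a follow-up execution to process the remaining users
      ScriptApp.newTrigger('processUsers').timeBased().after(60 * 1000).create();
      return;
    }
    processOneUser(user); // hypothetical: the per-user work
  }
}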
I need to implement code to check what my throttling limit is on an endpoint (I know it's x times per minute). I've only been able to find an example of this in Python, which I have never used. It seems like my options are to run a script that sends the request repeatedly until it throttles me or, if possible, to query the API to see what the limit is.
Does anyone have a good idea on how to go about this?
Thanks.
This starts concurrency workers (I'm using "workers" as a loose term here; don't @ me). Each one makes as many requests as possible until one of the requests is rate-limited or it runs out of time. It then reports how many of the requests completed successfully inside the given time window.
If you know the rate-limit window (1 minute based on your question), this will find the rate limit. If you need to discover the window, you would want to intentionally exhaust the limit, then slow down the requests and measure the time until they start going through again. The provided code does not do this.
// call apiCall() a bunch of times, stopping when an apiCall() resolves
// false or when "until" time is reached, whichever comes first. For example,
// if your limit is 50 req/min (and you give "until" enough time to
// actually complete 50+ requests) this will call apiCall() 50 times. Each
// call should return a promise resolving to TRUE, so it will be counted as
// a success. On the 51st call you will presumably hit the limit, the API
// will return an error, apiCall() will detect that, and resolve to false.
// This will cause the worker to stop making requests and return 50.
async function workerThread(apiCall, until) {
  let successfulRequests = 0;
  while (true) {
    const success = await apiCall();
    // only count it if the request was successful
    // AND finished within the timeframe
    if (success && Date.now() < until) {
      successfulRequests++;
    } else {
      break;
    }
  }
  return successfulRequests;
}

// this just runs a bunch of workerThreads in parallel, since by doing a
// single request at a time, you might not be able to hit the limit
// depending on how slow the API is to return. It returns the sum of each
// workerThread(), AKA the total number of apiCall()s that resolved to TRUE
// across all threads.
async function testLimit(apiCall, concurrency, time) {
  const endTime = Date.now() + time;
  // launch "concurrency" number of workers
  const workers = [];
  while (workers.length < concurrency) {
    workers.push(workerThread(apiCall, endTime));
  }
  // sum the number of requests that succeeded from each worker.
  // this implicitly waits for them to finish.
  let total = 0;
  for (const worker of workers) {
    total += await worker;
  }
  return total;
}

// put in your own code to make a trial API call.
// return true for success or false if you were throttled.
async function yourAPICall() {
  try {
    // this is a really sloppy example API
    // the limit is ROUGHLY 5/min, but because of the sloppy server-side
    // implementation you might get 4-6.
    const resp = await fetch("https://9072997.com/demos/rate-limit/");
    return resp.ok;
  } catch {
    return false;
  }
}

// this is a demo of how to use the function
(async function() {
  // run 2 requests at a time for 5 seconds
  const limit = await testLimit(yourAPICall, 2, 5 * 1000);
  console.log("limit is " + limit + " requests in 5 seconds");
})();
Note that this method measures the quota available to itself. If other clients or previous requests have already depleted the quota, it will affect the result.
I use an API that has recently been rate limited, and requests have to be spaced 15 seconds apart, otherwise a status code of 429 (rate limit exceeded) is returned.
I often have more than one email address that needs to be run against this API, and the email addresses are contained within an array. How would I go about running the request every, say, 15.5 seconds, moving on to the next email address until the end of the array? It's a very tricky one for sure. I've tried:
setInterval(checkEmail(email), 15500);
No joy; for some reason that just doesn't seem to work. By the way, I should point out that I'm using a jQuery $.ajax() call within that checkEmail(email) function.
Any ideas anybody?
Thanks in advance.
You could do the following:
var emails = [];
var interval_id;

function _start() {
  interval_id = window.setInterval(function() {
    var email = emails.shift();
    if (email) {
      var r = checkEmail(email.address);
      if (email.callback) email.callback(r);
    } else {
      window.clearInterval(interval_id);
      interval_id = undefined; // allow _start() to run again for emails added later
    }
  }, 15500);
}

function add_email(email, callback) {
  emails.push({address: email, callback: callback});
  if (!interval_id) _start();
}
Whenever you have a new email address to check, just run add_email(email). This will add it to the queue and start the interval timer if necessary. Additionally, you can say add_email(email,function(result) {....}) if you want to get notified when the check is completed.
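For instance, a small usage sketch (the addresses and the result handling here are just assumptions for illustration):

["a@example.com", "b@example.com", "c@example.com"].forEach(function(addr) {
  add_email(addr, function(result) {
    console.log("Check finished for " + addr + ":", result); // hypothetical handling
  });
});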
I'm developing an add-on for the first time. It puts a little widget in the status bar that displays the number of unread Google Reader items. To accommodate this, the add-on process queries the Google Reader API every minute and passes the response to the widget. When I run cfx test I get this error:
Error: The page has been destroyed and can no longer be used.
I made sure to catch the widget's detach event and stop the refresh timer in response, but I'm still seeing the error. What am I doing wrong? Here's the relevant code:
// main.js - Main entry point
const tabs = require('tabs');
const widgets = require('widget');
const data = require('self').data;
const timers = require("timers");
const Request = require("request").Request;
function refreshUnreadCount() {
// Put in Google Reader API request
Request({
url: "https://www.google.com/reader/api/0/unread-count?output=json",
onComplete: function(response) {
// Ignore response if we encountered a 404 (e.g. user isn't logged in)
// or a different HTTP error.
// TODO: Can I make this work when third-party cookies are disabled?
if (response.status == 200) {
monitorWidget.postMessage(response.json);
} else {
monitorWidget.postMessage(null);
}
}
}).get();
}
var monitorWidget = widgets.Widget({
// Mandatory widget ID string
id: "greader-monitor",
// A required string description of the widget used for
// accessibility, title bars, and error reporting.
label: "GReader Monitor",
contentURL: data.url("widget.html"),
contentScriptFile: [data.url("jquery-1.7.2.min.js"), data.url("widget.js")],
onClick: function() {
// Open Google Reader when the widget is clicked.
tabs.open("https://www.google.com/reader/view/");
},
onAttach: function(worker) {
// If the widget's inner width changes, reflect that in the GUI
worker.port.on("widthReported", function(newWidth) {
worker.width = newWidth;
});
var refreshTimer = timers.setInterval(refreshUnreadCount, 60000);
// If the monitor widget is destroyed, make sure the timer gets cancelled.
worker.on("detach", function() {
timers.clearInterval(refreshTimer);
});
refreshUnreadCount();
}
});
// widget.js - Status bar widget script
// Every so often, we'll receive the updated item feed. It's our job
// to parse it.
self.on("message", function(json) {
if (json == null) {
$("span#counter").attr("class", "");
$("span#counter").text("N/A");
} else {
var newTotal = 0;
for (var item in json.unreadcounts) {
newTotal += json.unreadcounts[item].count;
}
// Since the cumulative reading list count is a separate part of the
// unread count info, we have to divide the total by 2.
newTotal /= 2;
$("span#counter").text(newTotal);
// Update style
if (newTotal > 0)
$("span#counter").attr("class", "newitems");
else
$("span#counter").attr("class", "");
}
// Reports the current width of the widget
self.port.emit("widthReported", $("div#widget").width());
});
Edit: I've uploaded the project in its entirety to this GitHub repository.
I think if you use monitorWidget.port.emit("widthReported", response.json); you can fire the event. It is the second way to communicate between the content script and the add-on script.
Reference for the port communication
Reference for the communication with postMessage
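As a rough sketch of that approach (the event name "unreadData" and the payload handling are assumptions, not part of the original add-on):

// main.js: emit an event to the widget's content script instead of postMessage
monitorWidget.port.emit("unreadData", response.json);

// widget.js: listen for that event
self.port.on("unreadData", function(json) {
  // update the counter here, as in the original "message" handler
});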
I guess that this message comes up when you call monitorWidget.postMessage() in refreshUnreadCount(). The obvious cause for it would be: while you make sure to call refreshUnreadCount() only when the worker is still active, this function will do an asynchronous request which might take a while. So by the time this request completes the worker might be destroyed already.
One solution would be to pass the worker as a parameter to refreshUnreadCount(). It could then add its own detach listener (remove it when the request is done) and ignore the response if the worker was detached while the request was performed.
function refreshUnreadCount(worker) {
var detached = false;
function onDetach()
{
detached = true;
}
worker.on("detach", onDetach);
Request({
...
onComplete: function(response) {
worker.removeListener("detach", onDetach);
if (detached)
return; // Nothing to update with our data
...
}
}).get();
}
Then again, using try..catch to detect this situation and suppress the error would probably be simpler - but not exactly a clean solution.
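That simpler variant might look something like this (a sketch; exactly where you wrap the call is up to you):

try {
  monitorWidget.postMessage(response.json);
} catch (e) {
  // The page/worker was destroyed while the request was in flight; ignore.
}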
I've just seen your message on irc, thanks for reporting your issues.
You are facing some internal bug in the SDK. I've opened a bug about that here.
You should definitely keep the first version of your code, where you send messages to the widget, i.e. widget.postMessage (instead of worker.postMessage). Then we will have to fix the bug I linked to in order to make your code work!
Then I suggest you move the setInterval call to the top level; otherwise you will fire multiple intervals and requests, one per window, because the attach event is fired for each new Firefox window. A sketch of that is below.
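This is roughly what that could look like, reusing the widget from the question (a sketch, not the answerer's exact code):

// Run a single refresh timer for the whole add-on, not one per window.
var refreshTimer = timers.setInterval(refreshUnreadCount, 60000);

var monitorWidget = widgets.Widget({
  id: "greader-monitor",
  label: "GReader Monitor",
  contentURL: data.url("widget.html"),
  contentScriptFile: [data.url("jquery-1.7.2.min.js"), data.url("widget.js")],
  onClick: function() {
    tabs.open("https://www.google.com/reader/view/");
  },
  onAttach: function(worker) {
    // Only the per-window wiring stays here.
    worker.port.on("widthReported", function(newWidth) {
      worker.width = newWidth;
    });
    refreshUnreadCount();
  }
});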