How should I use refresh tokens with Google Picker and Google Drive? - javascript

The problem:
I have an app on the Google Apps Script platform that's meant to allow uploads to Google Drive without using any account. The upload feature works well, but I'm having issues with very long/big uploads. I've been trying to solve this for a week now, mostly because I need to test the expiration of the tokens.
When a user tries to upload a big file (20-30 GB) to the server, the auth token expires (Error screenshot 1) and then I get this error (Error screenshot 2).
So, what I need is a token that expires in more than 5 hours. I did try to use a refresh token but ended up very confused. I created the refresh token in the OAuth 2.0 Playground.
Things I've tried:
Passing the refresh token to setOAuthToken (rejected by the function).
Using the refresh token to obtain a token for setOAuthToken, but that failed too.
Questions
Can I permanently authorize the app's access to the Picker (since it's always the same user accessing the Drive in the server-side code)?
Should I use a refresh token to obtain an Auth token?
Original Code:
// "t" is the Picker view configured earlier; the scriptlet below injects the
// Apps Script access token (valid for a limited time) into the client page.
var a = new google.picker.PickerBuilder()
  .addView(t)
  .enableFeature(google.picker.Feature.NAV_HIDDEN)
  .setOAuthToken("<?= ScriptApp.getOAuthToken(); ?>")
  .enableFeature(google.picker.Feature.MULTISELECT_ENABLED)
  .hideTitleBar()
  .setSize(DIALOG_DIMENSIONS.width - 2, DIALOG_DIMENSIONS.height - 2)
  .setCallback(pickerCallback)
  .setOrigin(config.FORM_EMBED_DOMAIN)
  .build();
Any help will be extremely appreciated.

AFAIK, the Picker can't take a Refresh Token and use it to renew its Access Tokens. This is almost certainly by design, since Refresh Tokens should never live on an insecure device such as a browser.
The only approach I can suggest would be to:
A.
1. Have a Refresh Token on a secure server.
2. Implement your own endpoint that returns an Access Token using the stored Refresh Token (see the sketch after this answer).
or
B.
1. Use gapi with immediate=true (or however you currently obtain an Access Token).
Then, in either case:
2. Have a setTimeout/setInterval function which, every 59 minutes, gets a new Access Token using option A or B.
3. Poke this into the Picker object by finding the internal property where the Access Token is stored.
This is fugly and fragile, but I honestly can't think of a better answer.
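To illustrate option A, here is a minimal sketch of such an endpoint, assuming Node 18+ (built-in fetch) and Express; the /picker-token route and the environment variable names are placeholders, and the client secret and Refresh Token stay on the server only:

// Minimal sketch of option A's server endpoint (Node 18+, Express assumed).
// CLIENT_ID, CLIENT_SECRET and REFRESH_TOKEN are placeholders loaded from a
// secure store; they never reach the browser.
const express = require('express');
const app = express();

app.get('/picker-token', async (req, res) => {
  // Exchange the stored Refresh Token for a short-lived Access Token.
  const body = new URLSearchParams({
    client_id: process.env.CLIENT_ID,
    client_secret: process.env.CLIENT_SECRET,
    refresh_token: process.env.REFRESH_TOKEN,
    grant_type: 'refresh_token',
  });
  const resp = await fetch('https://oauth2.googleapis.com/token', {
    method: 'POST',
    body,
  });
  const data = await resp.json();
  // data.access_token is valid for roughly an hour (data.expires_in seconds).
  res.json({ access_token: data.access_token, expires_in: data.expires_in });
});

app.listen(3000);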

You can dispose of the created Picker object after 1 hour and create a new one with a freshly obtained access_token.
https://developers.google.com/picker/docs/reference#picker
Look at the dispose method in the API description.
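Combining the two answers above, a hedged sketch of what the client side could look like: it periodically fetches a fresh access token from your own endpoint (the /picker-token route is an assumption, matching the sketch above), disposes of the old Picker and rebuilds it. pickerCallback is the callback from the original question.

// Sketch only: every 59 minutes, fetch a fresh token from your own endpoint,
// dispose of the current Picker and rebuild it with the new token.
let picker = null;

function buildPicker(oauthToken) {
  picker = new google.picker.PickerBuilder()
    .addView(google.picker.ViewId.DOCS)
    .setOAuthToken(oauthToken)
    .setCallback(pickerCallback) // same callback as in the question's code
    .build();
  picker.setVisible(true);
}

async function refreshPicker() {
  const { access_token } = await (await fetch('/picker-token')).json(); // hypothetical endpoint
  if (picker) picker.dispose(); // see the dispose() method in the Picker reference
  buildPicker(access_token);
}

refreshPicker();                              // initial build
setInterval(refreshPicker, 59 * 60 * 1000);   // renew just before the 1-hour expiry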

The suggested solutions did not solve the problem.
I tried the same thing using Google Forms: I uploaded the same files I used to trigger the error described in the original question. It turns out I get exactly the same error!
So I think this is a case of "works as designed". I already sent an error report to Google (we have a G Suite account) and I hope we receive some feedback, but I don't think it's something easy to solve.
The main problem with the Google Forms alternative is that it requires a Gmail/Google account, and if the files you want to upload are bigger than your free quota, the upload will fail. I'm testing with a personal account with 21 GB (the uploader) and an unlimited G Suite account (receiver and form owner).
So, after a lot of testing of different options, the easiest/fastest solution is to limit clients to uploading up to 3 files (because you can upload 3 files at a time at the beginning of the process). When you try to upload the 4th file you'll get an authentication error.
Case closed!

Related

Google oauth session lost after page reload (javascript)

I recently moved from the deprecated gapi.auth2 to the new Google Identity Services, using the JavaScript client library, and noticed a big difference: if someone signs in and then reloads the page, the session is lost and they have to sign in again, every time the page is loaded. This was not the case with the deprecated library.
The problem can be easily reproduced with the Calendar API example.
Is there any configuration option to keep the session persistent? Or do I need to store the access tokens somehow? I could not find anything relevant in the official docs.
UPDATE:
The migration guide states the following:
Previously, Google Sign-In helped you to manage user signed-in status using:
Callback handlers for Monitoring the user's session state.
Listeners for events and changes to signed-in status for a user's Google Account.
You are responsible for managing sign-in state and user sessions to your web app.
However there's absolutely no information on what needs to be done.
UPDATE 2
To be more specific, the actual issue is not making the session persistent. Managing the sign in state and user session is something I can solve.
The real problem is the access token used to call the Google APIs.
As mentioned in the comments, the access tokens are 1) short-lived and 2) not stored anywhere, so even if they haven't expired, they do not persist between page reloads.
Google provides the requestAccessToken method for this; however, even if I specify prompt: '', it opens the sign-in popup. If I also specify the hint option with the signed-in user's email address, then the popup opens, displays a loading animation briefly, and closes without user interaction. I could live with this, but it only works if triggered by a user interaction; otherwise the browser blocks the popup window, meaning that I cannot renew the token without user interaction, e.g. on page load. Any tips to solve this?
I faced all the same issues you described in your question.
In order to help:
Google 3P Authorization JavaScript Library: at this link we can check all the methods the new library has (it does not refresh tokens, etc.).
This doc says the library won't control cookies to keep the state anymore.
Solution
Firstly, I need to thank @Sam O'Riil for his answer.
As Sam described: "you can somehow save access token and use it to speed-up things after page reload."
Given Google's example, we should call initTokenClient to configure Google auth and requestAccessToken to pop up the auth dialog:
tokenClient = google.accounts.oauth2.initTokenClient({
  client_id: 'YOUR_CLIENT_ID',
  scope: 'https://www.googleapis.com/auth/calendar.readonly',
  prompt: 'consent',
  callback: tokenCallback
});
tokenClient.requestAccessToken({prompt: ''});
In your tokenCallback you can save the credentials you get somehow, e.g.:
const tokenCallback = (credentials) => {
  // save the credentials here, using localStorage or cookies or whatever you want
};
Finally, when you restart/reload your application and initialize gapi.client again, you only need to read the saved credentials and set the token on gapi, like:
gapi.load('client', function() {
  gapi.client.init({}).then(function() {
    let credentials = localStorage.getItem('credentials'); // get your credentials from wherever you saved them
    credentials = JSON.parse(credentials); // parse it if you stored it as a string
    gapi.client.setToken(credentials);
    // ... continue your app ...
  }).catch(function(err) {
    // handle the error
  });
});
Doing this, your application will work after the reload. I know it might not be the best solution, but given what you have and what the library offers, I think it's what you can do.
p.s.: the token expires after 1 hour and there is no refresh token (when using the implicit flow), so you will have to ask the user to sign in again.
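To make that p.s. concrete, here is a hedged sketch (the storage key and function names are my own) of saving the token together with its expiry in the tokenCallback, and deciding on reload whether it can still be reused or whether requestAccessToken has to be called again:

// Sketch: persist the token with an absolute expiry time so a reload can tell
// whether it is still usable. The key name 'gis_token' is arbitrary.
function tokenCallback(tokenResponse) {
  const record = {
    access_token: tokenResponse.access_token,
    // expires_in is in seconds; remember the absolute expiry timestamp
    expires_at: Date.now() + Number(tokenResponse.expires_in) * 1000,
  };
  localStorage.setItem('gis_token', JSON.stringify(record));
  gapi.client.setToken({ access_token: record.access_token });
}

function restoreTokenOrPrompt() {
  const saved = JSON.parse(localStorage.getItem('gis_token') || 'null');
  if (saved && saved.expires_at > Date.now()) {
    // Still valid: reuse it without any popup.
    gapi.client.setToken({ access_token: saved.access_token });
  } else {
    // Expired or missing: the user has to go through the popup again
    // (must be triggered from a user gesture to avoid popup blocking).
    tokenClient.requestAccessToken({ prompt: '' });
  }
}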

Why does a Cloud Function for Firebase (deployed without errors), throw "Uncaught (in promise) FirebaseError: internal" when called? [duplicate]

After months of developing a web app under Firebase, these days we suddenly have a problem with authentication: it returns this console alert only with Facebook and Google login (email/pass login works fine):
[firebase-auth] Info: The current domain is not authorized for OAuth
operations. This will prevent signInWithPopup, signInWithRedirect,
linkWithPopup and linkWithRedirect from working. Add your domain
(front.qualify.mx) to the OAuth redirect domains list in the Firebase
console -> Auth section -> Sign in method tab.
The app uses 3 different subdomains, and in all 3 we can sign in with email/pass but not with Facebook or Google.
We tried updating the Firebase initialization script: nothing. We checked the API keys (in the Google APIs Credentials) and there was a new "Server key (auto created by Google Service)" that no one told us had been generated (Jan. 18th), so we edited it to include the domains like the original API key, in different ways (with/without * and /*): nothing. We deleted this new server key and suddenly got something different: now the console includes a 403 error before the alert stated above and returns an auth/timeout code inside the object.
We also found that the Identity Toolkit API has detected many errors, so we tried to add the URLs for login, logout and email, but nothing happens when we try to save.
What are we missing?
The solution was adding my-app.firebaseapp.com (where my-app is the unique identifier of our Firebase app) to the HTTP referrers in the browser-key credentials in the Google APIs console, and waiting some time for it to propagate.
In many months of development the app never had a problem, and we are sure we never removed that referrer (if it was ever there).
Anyway... it's done and learned.
The simple way I was able to solve this issue in my Ionic project was by following the instructions in the log; if you don't see any message, try console-logging the response from Firebase.
So what I simply did was follow the URL: https://console.developers.google.com/apis/api/identitytoolkit.googleapis.com/overview?project='projectId'
*projectId = the ID of your project
and enable the Identity Toolkit API it brought up. Finish, and it worked instantly.

Setting limits on file uploads via Firebase auth and storage without server in the middle?

I'm learning about Firebase auth and storage in a web app. My idea is to ask users to log in via Firebase and then upload an image.
I can see that this is possible from Firebase auth and storage. However, I would like to put limits on the file count and file-size they can upload.
Is it possible to control uploads within the Firebase console (or somewhere else)? After reviewing the JavaScript examples, I see how I can put files in, and I can imagine writing code which would query Firebase for a user's upload count, and then limit on the client side, but of course, this is a completely insecure method.
If I hosted this as a single page app on, say, GitHub pages, I am wondering if I could set these limits without involving a server. Or, do I need to proxy my uploads through a server to make sure I never allow users to upload more than I intend them to?
You can limit what a user can upload through Firebase Storage's security rules.
For example this (from the linked docs) is a way to limit the size of uploaded files:
service firebase.storage {
  match /b/<your-firebase-storage-bucket>/o {
    match /images/{imageId} {
      // Only allow uploads of any image file that's less than 5MB
      allow write: if request.resource.size < 5 * 1024 * 1024
                   && request.resource.contentType.matches('image/.*');
    }
  }
}
But there is currently no way in these rules to limit the number of files a user can upload.
One approach that comes to mind would be to use fixed file names for that. For example, if you limit the allowed file names to be numbered 1..5, the user can only ever have five files in storage:
match /public/{userId}/{imageId} {
  allow write: if imageId.matches("[1-5]\.txt");
}
If you need per-user storage validation, the solution is a little bit trickier, but it can be done.
P.S.: You will need to generate a Firebase token with Cloud Functions, but the server won't be in the middle for the upload...
https://medium.com/@felipepastoree/per-user-storage-limit-validation-with-firebase-19ab3341492d
One solution may be to use the Admin SDK to change the Storage rules based on a Firestore document holding the upload count per day.
Say you have a Firestore collection/document userUploads/uid with the fields uploadedFiles: 0 and lastUploadedOn.
Now, once the user uploads a file to Firebase Storage (assuming it's within limits and there are no errors), you can trigger a Cloud Function which reads the userUploads/uid document and checks whether the lastUploadedOn field is from an earlier date than the currently uploaded file's date. If yes, set uploadedFiles to 1 and change lastUploadedOn to the upload datetime; otherwise, increment the uploadedFiles count and set lastUploadedOn to the current datetime. Once the uploadedFiles value reaches 10 (your limit), you can change the Storage rules using the Admin SDK (see example here), and then reset the count to 0 in the userUploads/uid document.
However, there is a little caveat. The change in rules might take some time, and there should be no legitimate async work in flight that relies on that rule. From the Admin SDK:
Firebase security rules take a period of several minutes to fully deploy. When using the Admin SDK to deploy rules, make sure to avoid race conditions in which your app immediately relies on rules whose deployment is not yet complete
I haven't tried this myself but it looks like it will work. On second thought, changing the rules back to allow writes could be complicated. If the user uploads on the next day (after the rules have been changed), the upload error handler could trigger another Cloud Function to check if it is a legit request, change the rules back to normal and attempt the upload again after some time, but that would be a very bad user experience. On the other hand, if you use a scheduled Cloud Function to check the userUploads/uid document every day and reset the values, it could be costly (~$18 per million users per month at $0.06/100K reads), it may be complicated if users are in different time zones, and it may be irrelevant for most users depending on whether they upload that frequently. Furthermore, rules have limits:
Rules must be smaller than 64 KiB of UTF-8 encoded text when serialized
A project can have at most 2500 total deployed rulesets. Once this limit is reached, you must delete some old rulesets before creating new ones.
So per-user rules for a large user base can easily reach this limit (apart from other rules).
Perhaps the optimal solution could be to use auth claims. Start with a rule that denies writes if the user has a particular auth claim (say canUpload: false). Then, in a Cloud Function triggered on upload, attach this claim when the user reaches the limit (see the sketch after this answer). This is effectively real-time, as it immediately blocks the user, as opposed to the Admin SDK rules deployment delay.
To remove the auth claim:
1. Check, through another Cloud Function in the upload error handler, whether lastUploadedOn has changed, and if so remove the claim.
2. Check, through a separate Cloud Function called before the upload, whether the user has the auth claim and lastUploadedOn is from an earlier date, and if so remove the claim.
3. Additionally, it could be checked and removed during login if lastUploadedOn is earlier than today, but this is less efficient than 2 since it would mean unnecessary, needless reads on Firestore while the user isn't even uploading anything.
With 2, if the client tries to skip the call and still has the auth claim, they can never upload, since they are blocked by the security rule. Otherwise, if there is no auth claim, they go through the normal process.
Note: Changing auth claims needs to be pushed to the client. See this doc.
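A hedged sketch of that idea (not from the original answer): a storage-triggered Cloud Function that counts the user's uploads in the userUploads document described above and sets the canUpload: false claim once the limit is hit. The path convention (files stored under /{uid}/...), the limit of 10 and the omission of the daily-reset logic are assumptions; the matching Storage rule that checks request.auth.token.canUpload is not shown here.

// Sketch of the auth-claim approach (assumes files are stored under /{uid}/...
// and the userUploads/{uid} document described above; daily reset omitted).
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.trackUpload = functions.storage.object().onFinalize(async (object) => {
  const uid = object.name.split('/')[0]; // assumed path convention
  const ref = admin.firestore().doc(`userUploads/${uid}`);

  // Atomically increment the upload counter for this user.
  const count = await admin.firestore().runTransaction(async (tx) => {
    const snap = await tx.get(ref);
    const data = snap.exists ? snap.data() : { uploadedFiles: 0 };
    const next = (data.uploadedFiles || 0) + 1;
    tx.set(ref, {
      uploadedFiles: next,
      lastUploadedOn: admin.firestore.FieldValue.serverTimestamp(),
    }, { merge: true });
    return next;
  });

  if (count >= 10) {
    // Block further uploads immediately; the Storage rule should deny writes
    // whenever request.auth.token.canUpload == false.
    await admin.auth().setCustomUserClaims(uid, { canUpload: false });
  }
});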
Following the filenames hack Frank gave us, I think we can improve on that to make it more flexible.
For example, in my case I don't want to put a hard limit on user uploads like "you can upload up to 50 files, ever", but rather "you're allowed to upload up to 20 files per day".
I just had this idea and will work on the implementation soon enough, but here it goes:
Following the same logic, we can allow only filenames like 1-07252022, 2-07252022, etc.
And since Firebase rules give us some string and timestamp methods, I think we can achieve this per-day upload limit using only Storage rules, without the need for custom user claims or any Cloud Function.
Although in my case, I only want to allow uploads from paying customers, so in that case I would need also a custom claim on the user's token.
I'll edit this answer when I work on the code snippet, but anyone struggling, here you have an idea.
One way to limit the number of files (or the storage size) a user can upload is to use signed URLs. You would need a server (Cloud Functions) to generate the signed URLs, but then you can upload large files directly to Cloud Storage without streaming the file through the server. The flow would be:
Send the file names and sizes to your server in the request body
Generate a signed URL for each file and set the Content-Length equal to the size of the file so that the user can only upload a file of that size using that URL (a sketch of this step follows at the end of this answer).
Update user's storage usage in a database like Firestore.
Upload the files to Cloud storage using the signed URLs received from server.
You just need to ensure that user has enough storage available by checking their Firestore document before generating the signed URLs. If not, you can return an error like:
// storageLimit
if (storageUsed + size > storageLimit) {
  throw new functions.https.HttpsError(
    "failed-precondition",
    "Not enough storage available"
  );
}
Check out How to set maximum storage size limit per user in Google Cloud Storage? for a detailed explanation and code snippets.
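To make step 2 concrete, here is a minimal sketch of a callable Cloud Function, assuming the Firebase Admin SDK (which wraps @google-cloud/storage). The usage/{uid} document, the storageLimit value and the use of the x-goog-content-length-range extension header to cap the upload size are illustrative assumptions; the client must send that same header with its PUT to the signed URL.

// Sketch of step 2: generate a size-capped V4 signed upload URL (names are placeholders).
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

const storageLimit = 1024 * 1024 * 1024; // e.g. 1 GB per user, adjust as needed

exports.getUploadUrl = functions.https.onCall(async (data, context) => {
  if (!context.auth) {
    throw new functions.https.HttpsError("unauthenticated", "Sign in first");
  }
  const { fileName, size } = data;
  const uid = context.auth.uid;

  // Check current usage in Firestore before handing out a URL.
  const usageRef = admin.firestore().doc(`usage/${uid}`);
  const usageSnap = await usageRef.get();
  const storageUsed = usageSnap.exists ? usageSnap.data().storageUsed : 0;
  if (storageUsed + size > storageLimit) {
    throw new functions.https.HttpsError("failed-precondition", "Not enough storage available");
  }

  // Sign a write URL whose accepted Content-Length is capped at `size`.
  const file = admin.storage().bucket().file(`${uid}/${fileName}`);
  const [url] = await file.getSignedUrl({
    version: 'v4',
    action: 'write',
    expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
    extensionHeaders: { 'x-goog-content-length-range': `0,${size}` },
  });

  // Record the expected usage so later requests see the updated total.
  await usageRef.set({ storageUsed: storageUsed + size }, { merge: true });

  return { url };
});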

Importing events from MS Office 365 (PHP)

I have an intranet site for a small medical clinic, and on the front page I want to display upcoming events associated with the clinic-wide MS Office 365 email account.
I'm new to APIs, so some resources on how to get started would help.
The site is in PHP, but as I understand it, API functions can be done in JavaScript - either way is fine.
Once I can get an XML or JSON file from Microsoft, I'm pretty sure I can figure out how to format it for the site. The problem is just getting the info.
So far I have:
<script>
var req = new XMLHttpRequest();
// third argument false makes this a synchronous request
req.open("GET", "https://outlook.office365.com/api/v1.0/users/{email address}/events", false);
req.send();
console.log(req.status);
console.log(req.statusText);
</script>
The console logged:
"NetworkError: 401 Anonymous Request Disallowed
I've also tried the line req.open("GET", "https://outlook.office365.com/api/v1.0/users/me/events", false{or true}/ {username}, {password});, to which the console logged
NS_ERROR_DOM_BAD_URI: Access to restricted URI denied
Almost all the documentation I can find is directed toward individual users (employees of a company) interfacing with their 365 accounts through some web-based interface, so almost all of the URLs have /me/ in them, indicating they have authenticated somehow. But I want my PHP or JavaScript script to automatically authenticate a single user and retrieve information. I imagine this requires hard-coding the user and password somewhere, but I've found no examples like that.
I'm obviously in way over my head, but can anyone offer any advice on how I can get this done? Or read more about how APIs work? Most of the documentation out there is directed at people who already have a certain level of knowledge, which I don't have, and don't really know how to get.
Thanks.
The missing part is authentication (OAuth) to connect from your app to O365.
Maybe this helps: http://msdn.microsoft.com/library/bde5647a-fff1-4b51-b67b-2139de79ce4a%28Office.15%29.aspx
Yes, you do need to authenticate against the Office 365 APIs as indicated previously. To make calls against Office 365, you must register your app for OAuth against Azure AD.
I'd suggest looking at http://dev.office.com/getting-started/office365apis. It should guide you through setting up authentication and show you how to make the REST call.
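For completeness, once the app is registered in Azure AD and an access token has been obtained through that flow, the call from the question only needs an Authorization header. A hedged sketch (the token value is a placeholder obtained from the OAuth flow described above):

// Sketch: same events call as in the question, but with a bearer token.
const ACCESS_TOKEN = "<access token from the Azure AD OAuth flow>"; // placeholder

fetch("https://outlook.office365.com/api/v1.0/me/events", {
  headers: {
    "Authorization": "Bearer " + ACCESS_TOKEN,
    "Accept": "application/json",
  },
})
  .then((resp) => resp.json())
  .then((data) => {
    // data.value holds the array of event objects returned by the API.
    console.log(data.value);
  })
  .catch((err) => console.error(err));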

Google OAuth WildCard Domains

I am using Google auth but keep getting an origin mismatch. The project I am working on has subdomains that are generated by the user. So, for example, there can be:
john.example.com
henry.example.com
larry.example.com
In my app settings I have one of my origins set to http://*.example.com, but I still get an origin mismatch. Is there a way to solve this? BTW, my code looks like this:
gapi.auth.authorize({
  client_id: 'xxxxx.apps.googleusercontent.com',
  scope: ['https://www.googleapis.com/auth/plus.me',
          'https://www.googleapis.com/auth/userinfo.email',
          'https://www.googleapis.com/auth/userinfo.profile'],
  state: 'http://henry.example.com',
  immediate: false
}, function(result) {
  if (result != null) {
    gapi.client.load('oauth2', 'v2', function() {
      console.log(gapi.client);
      gapi.client.oauth2.userinfo.get().execute(function(resp) {
        console.log(resp);
      });
    });
  }
});
Hooray for useful yet unnecessary workarounds (thanks for complicating yourself into a corner Google)....
I was using the Google Drive JavaScript API to open up the file picker, retrieve the file info/URL and then download it to my server using curl. Once I finally realized that all my wildcard domains would have to be registered, I about had a stroke.
What I do now is the following (this is my use case; adapt it to yours as you need to):
On the page that you are on, create an onclick event to open up a new window in a specific domain (https://googledrive.example.com/oauth/index.php?unique_token={some unique token}).
In the new popup I did all my Google Drive authentication, had a button to click which opened the file picker, and then retrieved at least the metadata I needed from the file. Then I stored the token (primary key), access_token, downloadurl and filename in my database (MySQL).
Back on step one's page, I created a setTimeout() loop that runs an ajax call every second with that same unique_token to check when it has been entered into the database (a sketch of this polling loop follows below). Once it finds it, I kill the loop, retrieve the contents and do with them as I will (in this case I uploaded them through a separate upload script that uses curl to fetch the file).
This is obviously not the best method for handling this, but it's better than entering each and every subdomain into Google's cloud console. I bet you can probably do this with Google's server-side OAuth libraries, but my use case was a little complicated and I was cranky because I was frustrated at the past 4 days I'd spent on a silly little integration with Google.
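For illustration, a hedged sketch of that polling step; the /oauth/check.php endpoint and the response fields (found, filename, downloadurl) are made-up names matching the flow described above:

// Sketch of step 3: poll the server every second until the popup flow has
// stored the file info under our unique token.
function waitForPickedFile(uniqueToken) {
  const timer = setInterval(async () => {
    const resp = await fetch('/oauth/check.php?token=' + encodeURIComponent(uniqueToken));
    const result = await resp.json();
    if (result.found) {
      clearInterval(timer); // stop polling once the row exists in the database
      // result.filename / result.downloadurl were written by the popup page;
      // hand them to your server-side curl upload script here.
      console.log('Picked file:', result.filename, result.downloadurl);
    }
  }, 1000);
}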
Wildcard origins are not supported, same for redirect URIs.
The fact that you can register a wildcard origin is a bug.
You can use the state parameter, but be very careful with it: make sure you don't create an open redirector (an endpoint that can redirect to any arbitrary URL).
