Getting an access token from Google Playground to test a project - javascript

For a Google project that I am coding with Node.js, I need the client's access in order to upload a file into their Drive. When I test my code, no file appears in my Drive, probably because I am running everything with my service account. So here is what I did: since I have a client ID (tied to my email, etc.) and a key that I created with Google, I loaded the JSON file that contains this information:
{
  "web": {
    "project_id": "",
    "private_key_id": "",
    "private_key": "-----BEGIN PRIVATE KEY-----\n\n-----END PRIVATE KEY-----\n",
    "client_email": "",
    "client_id": "",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://oauth2.googleapis.com/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
    "client_secret": "",
    "redirect_uris": ["https://developers.google.com/oauthplayground"]
  }
}
Obviously the blank values aren't blank in my actual code.
I also set up the client API access in the Google console and added the client with the Drive API.
When I run the code in the terminal, I get this error:
"Error: No key or keyFile set."
In my JS, I have access to the API and to my keys.json file containing the necessary information. Thanks!

If you want to upload a file to your / a user's Drive with the service account, you need to give the service account a) the scopes to access your / the user's Drive (which implies domain-wide delegation) and b) either read permission on the file, or impersonation enabled so that it acts on your / your user's behalf.
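As a rough Node.js sketch of option b) with the googleapis package (the service-account email, key, and impersonated user below are placeholders, and impersonation only works once domain-wide delegation is set up):

const { google } = require("googleapis");

// Placeholder values: replace with your own service account email/key and the
// user whose Drive the upload should land in.
const auth = new google.auth.JWT(
  "service-account@your-project.iam.gserviceaccount.com",
  null,
  "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  ["https://www.googleapis.com/auth/drive"],
  "user@yourdomain.com" // account to impersonate (requires domain-wide delegation)
);
const drive = google.drive({ version: "v3", auth });

// Upload a small text file into the impersonated user's Drive.
drive.files.create(
  {
    requestBody: { name: "test.txt" },
    media: { mimeType: "text/plain", body: "hello" }
  },
  (err, res) => {
    if (err) return console.error(err);
    console.log("Created file with ID:", res.data.id);
  }
);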
If you want to test a method (e.g. upload a file) without a service account, I recommend doing it with the Try this API panel in the reference documentation.
If you want to test your code with your own credentials instead of the service account, the authentication process will be different, as will the contents of the required credentials.json file. You need to follow the procedure and authorization flow outlined in the quickstart; you can download the correct credentials file directly by clicking the "Enable the Drive API" button in the quickstart documentation. The correct credentials.json file for running the code as yourself (as opposed to the service account) looks like the following (a minimal Node.js sketch using this file follows below):
{
  "installed": {
    "client_id": "XXX.apps.googleusercontent.com",
    "project_id": "XXX",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://oauth2.googleapis.com/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
    "client_secret": "XXX",
    "redirect_uris": ["urn:ietf:wg:oauth:2.0:oob","http://localhost"]
  }
}
The private key (and private key ID) is only necessary for service accounts.
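For completeness, here is a rough sketch of how that installed-type credentials.json is typically consumed from Node.js with the googleapis package; the scope, file name, and upload call are illustrative assumptions rather than a drop-in solution:

const fs = require("fs");
const { google } = require("googleapis");

// Assumed file and scope names: adjust to your project.
const SCOPES = ["https://www.googleapis.com/auth/drive.file"];
const { client_id, client_secret, redirect_uris } =
  JSON.parse(fs.readFileSync("credentials.json")).installed;

const oAuth2Client = new google.auth.OAuth2(client_id, client_secret, redirect_uris[0]);

// 1. Visit this URL in a browser while signed in as yourself, approve access,
//    and copy the authorization code it returns.
console.log(oAuth2Client.generateAuthUrl({ access_type: "offline", scope: SCOPES }));

// 2. Exchange the code for tokens, then call Drive as you (not as the service account).
async function run(code) {
  const { tokens } = await oAuth2Client.getToken(code);
  oAuth2Client.setCredentials(tokens);
  const drive = google.drive({ version: "v3", auth: oAuth2Client });
  const res = await drive.files.create({
    requestBody: { name: "hello.txt" },
    media: { mimeType: "text/plain", body: "hello" }
  });
  console.log("Uploaded file ID:", res.data.id);
}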

Related

Accessing user files in GAE from javascript

I am new to GAE, so I am struggling to understand a few things.
I am trying to build a Python web app that processes videos uploaded by users (through the web app) and displays some visualizations (built using d3-js) once the processing is done. The artifacts created during processing are saved locally and later uploaded to user-specific GCS buckets (they are not publicly accessible).
I want to be able to display the visualization (using the processed video artifacts) when a user requests it. As per my understanding, since these are dynamically generated, I cannot store the artifacts in a static folder for JavaScript to access. So it seems I have to save the processed video artifacts in a /tmp folder.
How do I ensure that JavaScript is able to fetch files from this external /tmp folder?
Or is there a better way to do this using GCS itself? How do I access buckets from JavaScript without making them public?
Please suggest some resources or ideas to solve this. Thanks!
I think you've got it backwards.
You have a private bucket; that's great for security. In order for the client JavaScript (browser, mobile app) to download an object, you need to either:
Have an HTTP handler in your Python GAE app that retrieves the file from GCS and sends it to the client (Flask pseudocode):
from flask import Flask
from google.cloud import storage

app = Flask(__name__)
bucket_name = "your-private-bucket"

@app.route('/private/<name>')
def private_file(name):
    # if the user is not authorized, return a 401 here
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(name)
    return blob.download_as_bytes()
Give the client a Signed URL from GCS so they can download the file directly.
import datetime

from flask import Flask
from google.cloud import storage

app = Flask(__name__)
bucket_name = "your-private-bucket"

@app.route('/private/<name>')
def private_file_url(name):
    # if the user is not authorized, return a 401 here
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(name)
    url = blob.generate_signed_url(
        version="v4",
        # This URL is valid for 15 minutes
        expiration=datetime.timedelta(minutes=15),
        # Allow GET requests using this URL.
        method="GET",
    )
    return url
As a note, in the second case the client-side JavaScript will first need to call /private/your_file_name to retrieve the signed URL, and then download the actual file from GCS using that signed URL.
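To illustrate that two-step flow, here is a minimal browser-side sketch; the /private/<name> endpoint matches the handler above, and the JSON parsing at the end is an assumption about the artifact format:

// Fetch the signed URL from the GAE handler, then fetch the artifact itself.
async function loadArtifact(name) {
  const urlResponse = await fetch(`/private/${encodeURIComponent(name)}`);
  if (!urlResponse.ok) {
    throw new Error(`Could not get signed URL: ${urlResponse.status}`);
  }
  const signedUrl = await urlResponse.text();

  const fileResponse = await fetch(signedUrl);
  if (!fileResponse.ok) {
    throw new Error(`Could not download artifact: ${fileResponse.status}`);
  }
  return fileResponse.json(); // e.g. data to hand to the d3 visualization
}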

Can't add user for role/editor user via api in Google Cloud Platform

I have been trying to use the API to add a user as a project editor in Google Cloud Platform.
The API I use is the Resource Manager API setIamPolicy method.
I add the user from Google Apps Script.
PROCEDURE
1. Get all the current policies on the Google Cloud Platform project using [Resource Manager API getIamPolicy].
2. Add the user and edit the JSON response from step 1.
3. Post the JSON from step 2 using [Resource Manager API setIamPolicy].
https://cloud.google.com/resource-manager/reference/rest/v1/projects/getIamPolicy
https://cloud.google.com/resource-manager/reference/rest/v1/projects/setIamPolicy
However, I can't add the user.
I get the following error/exception:
{
  "error": {
    "code": 400,
    "message": "Request contains an invalid argument.",
    "status": "INVALID_ARGUMENT",
    "details": [
      {
        "#type": "type.googleapis.com/google.cloudresourcemanager.v1.ProjectIamPolicyError",
        "type": "SOLO_REQUIRE_TOS_ACCEPTOR",
        "role": "roles/owner"
      }
    ]
  }
}
Other information
It works when I use the [Try it!] panel in the documentation, but not from Google Apps Script.
I use the OAuth library for Google Apps Script and OAuth authentication.
Why?
Don't use the method you are trying (pulling the IAM policy with get-iam-policy, editing the JSON/YAML file, then pushing the changes back with set-iam-policy), because it's bad practice and a small error in the file can cause you to lose access to your project. You are also dealing with too much data this way: you pull the whole file, edit it, and then push all of it back to be processed again.
You should use
gcloud [GROUP] add-iam-policy-binding [RESOURCE-NAME] --role [ROLE-ID-TO-GRANT] --member user:[USER-EMAIL]
and
gcloud [GROUP] remove-iam-policy-binding [RESOURCE-NAME] --role [ROLE-ID-TO-GRANT] --member user:[USER-EMAIL]
instead.
example:
gcloud projects add-iam-policy-binding $PROJECT_NAME \
--role roles/editor \
--member serviceAccount:$SA_EMAIL
These two commands are better because:
the changes are simpler, less work, and less error-prone than editing JSON/YAML;
you avoid race conditions, because you can apply multiple role bindings simultaneously and they won't conflict with each other.
I found it less error-prone to use Terraform for this task. The Terraform resource "google_project_iam_binding" does this without overwriting the existing IAM policies (especially when I need to update IAM conditional policies).
I'll pull the policy with gcloud projects get-iam-policy PROJECT_ID --format json > policy.json to understand it. Once I understand the policy, I'll convert the portion I need to update into the Terraform template and then use Terraform to deploy it.

Google drive API publish document and get published link

If I make a document in Google Drive and type some text, I can later go to File => Publish to the web and get a link to a public, webpage-style version of the Google Doc, along with an embed link.
That is how it's done manually. How can I do this automatically from a Node.js server script (for example, using a service account) with the Google Drive API? I couldn't find anything about this in their docs; is it possible? Do I need to make a Google Apps Script instead? Is it even possible with that?
You want to publish the Google Document to web using googleapis with Node.js.
You want to retrieve the published URL.
You want to achieve this using the service account.
You already have the service account and have been able to use the Drive API.
From your question and the comments, I understood your situation as above. If my understanding is correct, how about this answer? Please think of this as just one of several possible answers.
Usage:
In order to use the following sample script, please do the following flow.
Prepare a Google Document.
In this case, as a test, please create a new Google Document in your Google Drive.
Share the created Google Document with the email of the service account as the writer.
Retrieve the file ID of the Google Document.
Set the variables to the following sample script.
Run the script.
In this case, "Revisions: update" of Drive API is used.
By this, the Google Document is published to the web and you can see the published URL at the console.
Sample script:
const { google } = require("googleapis");

// Please set the email and private key of service account.
const auth = new google.auth.JWT(
  "### email of service account ###",
  null,
  "### private key of service account ###",
  ["https://www.googleapis.com/auth/drive"]
);
const fileId = "###"; // Please set the file ID.
const drive = google.drive({ version: "v3", auth });
const body = {
  resource: {
    published: true,
    publishedOutsideDomain: true,
    publishAuto: true
  },
  fileId: fileId,
  revisionId: 1
};
drive.revisions.update(body, err => {
  if (err) {
    console.error(err);
    return;
  }
  console.log(`https://docs.google.com/document/d/${fileId}/pub`);
});
Please set the private key like "-----BEGIN PRIVATE KEY-----\n###\n-----END PRIVATE KEY-----\n".
Note:
The published URL is https://docs.google.com/document/d/${fileId}/pub. In this case, the file ID is used. Because unfortunately, in the current stage, the URL like https://docs.google.com/document/d/e/2PACX-###/pubhtml cannot be retrieved by Google API.
Reference:
Revisions: update

Avoid exposing LinkedIn API key in HTML

I want my users to login using the Sign In With LinkedIn feature. The LinkedIn API docs provides the following example snippet for getting started:
<script type="text/javascript" src="//platform.linkedin.com/in.js">
  api_key: YOUR_API_KEY_HERE
  authorize: true
  onLoad: onLinkedInLoad
</script>
<script type="text/javascript">
  // Setup an event listener to make an API call once auth is complete
  function onLinkedInLoad() {
    IN.Event.on(IN, "auth", getProfileData);
  }
  // Handle the successful return from the API call
  function onSuccess(data) {
    console.log(data);
  }
  // Handle an error response from the API call
  function onError(error) {
    console.log(error);
  }
  // Use the API call wrapper to request the member's basic profile data
  function getProfileData() {
    IN.API.Raw("/people/~").result(onSuccess).error(onError);
  }
</script>
How do I implement this without exposing YOUR_API_KEY_HERE to the public? There are a few npm packages out there that handle this kind of thing, but they are all old (I get nervous whenever a package hasn't been updated in at least a year).
My application uses Node and Express. Should I go with an old npm package, or is there a better way to hide the api_key?
It is OK, and sometimes necessary, to have YOUR_API_KEY_HERE in the JavaScript or on the website. The important piece is not to share your SECRET_KEY, because both are needed to do anything with the API. Be sure to always use HTTPS for all communications.
From the LinkedIn best practices page on application security:
https://developer.linkedin.com/docs/best-practices
API Key & Secret Key
When making calls to the LinkedIn APIs you use two pieces of identifiable information: the API Key (sometimes called the Consumer Key) and the Secret Key (or Consumer Secret).
The API Key is a public identifier of your application and the Secret Key is confidential and should only be used to authenticate your application on the LinkedIn APIs.
Since both the API Key and Secret Key are needed together to confirm your application’s identity, it is critical that you never expose your Secret Key. Here are some suggestions for proper Secret Key storage:
When creating a native mobile application, do not store it locally on a mobile device.
Do not expose in any client side code files like JavaScript or HTML files.
Do not store it in files on a web server that can be viewed externally e.g.: configuration files, include files, etc.
Do not store it in log files or error messages.
Do not email it or post it on a message board or other public forum.
Remember that when exchanging an OAuth 2.0 authorization code for an access token, the Secret Key is passed as part of the request. Do not expose this request publicly!
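To illustrate that last point with the asker's Node/Express setup, here is a rough sketch that keeps the Secret Key in a server-side environment variable and performs the authorization-code exchange on the server; the route, environment variable names, and use of axios are assumptions, so check LinkedIn's current OAuth 2.0 documentation for the exact endpoint and fields:

const express = require('express');
const axios = require('axios');
const querystring = require('querystring');

const app = express();

// The browser only ever sees the authorization code; the secret stays here.
app.get('/auth/linkedin/callback', async (req, res) => {
  try {
    const tokenRes = await axios.post(
      'https://www.linkedin.com/oauth/v2/accessToken',
      querystring.stringify({
        grant_type: 'authorization_code',
        code: req.query.code,
        redirect_uri: process.env.LINKEDIN_REDIRECT_URI,
        client_id: process.env.LINKEDIN_CLIENT_ID,        // public identifier
        client_secret: process.env.LINKEDIN_CLIENT_SECRET // secret, server-side only
      }),
      { headers: { 'Content-Type': 'application/x-www-form-urlencoded' } }
    );
    // Store tokenRes.data.access_token in the user's session; never send it to the client.
    res.redirect('/');
  } catch (err) {
    res.status(500).send('Token exchange failed');
  }
});

app.listen(3000);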

Using Google Drive API + Parse to store files on a single Drive account

I am trying to use a single Google Drive account as a 'web server.' I have an Android app that needs to be able to store and retrieve pictures. My idea was to use Parse to help manage everything and extend my storage capacity beyond Parse's free amount.
Essentially, I will have a single Google Drive account and a Parse project. When a user wants to store a file, he/she uploads the file to Parse, Parse authenticates with the single Google Drive account (using CloudCode), Parse uploads the file to Drive, stores the URL to the file in a table, and deletes the file from Parse's cloud storage. I plan on giving the folder that stores these files private write access and public read access so the clients don't have to make Parse requests to read the files.
The purpose of this is to get more storage for my application. (Amazon S3 only gives 5 GB, Parse gives 1 GB, Dropbox 2 GB, Google Cloud Storage doesn't have a free plan as far as I know, and Drive gives 15 GB; I have also heard about Google Photos integrating with Google Drive, which might give me unlimited storage for pictures.)
Because Google Drive wasn't really designed to do this, I am having some difficultly figuring out how all the pieces fit together.
I have looked at this question, which doesn't seem to apply to my situation because I will be able to run all my write operations on a secure server. (Also, I have read that I need to store a refresh token, which should also be secure using this method.)
I have looked at this but many of the links are outdated, so that didn't help me much.
I have looked at this, but this seems to be authorizing an app's use of the user's personal files.
I have also read that I may need to use a service account instead of a Web Application account, but again, the information I have been reading has not been very clear. In my opinion, this is because Drive was not designed to work this way.
Summary:
1:
Can someone point me in the right direction for writing files to a single Google Drive account using Parse Cloud Code (Server-side Javascript)?
2:
If the above question is solvable/possible, does Google's release of Google Photo's mean that I would have essentially unlimited storage capacity for pictures?
OK. So after hours of research and testing, I have come up with a way of getting this to work.
Google's API authentication process is a bit confusing to say the least. The general procedure is as follows:
User is asked to grant access to INSERT YOUR PROJECT HERE for the scope YOUR SCOPE
The user can either accept or reject that request
If the user accepts, you can retrieve an access token (and more importantly a refresh token)
Once you have the refresh token you can get a new access token to make API calls any time you want.
So the whole goal is to make API calls (specifically to Drive), which requires a refresh token.
The main difficulty is that (AS FAR AS I KNOW) there is no way of authenticating a user and getting an authorization code inside of Parse CloudCode. BUT, my theory was that if I was somehow able to authenticate the user account outside of CloudCode, I could still make GET/POST requests from inside a CloudCode function. (More specifically: Parse.Cloud.httpRequest)
There are many ways/platforms that allow you to authenticate a user in order to get a refresh token. The most accessible method would probably be the Android/Java version of the API, because anyone with a computer can run an Android emulator, but I have easy access to a PHP-capable website, so I chose to use PHP instead. Again, this part of the process can be done on many different platforms.
The first step is to install the Google API Client Library for PHP (Github). There are a few different ways to install this on your server, but since I only need it to get a refresh token, I chose to include it dynamically at runtime (also because I didn't have quick access to my php.ini file).
(Note before I continue: I am not a PHP developer, so please let me know if I do anything odd or redundant; I am merely showing what I did to get this to work)
To do this I simply downloaded a copy of the library, uploaded it to my server, and included the following line in every file where I used the library:
set_include_path(get_include_path() . PATH_SEPARATOR . 'google-api-php-client-master/src');
Where google-api-php-client-master/src is the path to your src folder.
Once you have that done, you have to authenticate the user. The following two files send the user to an authentication page, and then print the refresh code for that user on the screen.
index.php:
<?php
set_include_path(get_include_path() . PATH_SEPARATOR . 'google-api-php-client-master/src');
require_once 'google-api-php-client-master/src/Google/autoload.php'; // or wherever autoload.php is located

session_start();

$client = new Google_Client();
$client->setAuthConfigFile('client_secrets.json'); // You can download this file from your developer console under OAuth 2.0 client IDs
$client->addScope(Google_Service_Drive::DRIVE);    // Scope that grants full access to user's Google Drive
$client->setAccessType("offline");                 // Important so a refresh code is returned

if (isset($_SESSION['access_token']) && $_SESSION['access_token']) {
    print($_SESSION['refresh_token']);
} else {
    $redirect_uri = 'http://' . $_SERVER['HTTP_HOST'] . '/oauth2callback.php';
    header('Location: ' . filter_var($redirect_uri, FILTER_SANITIZE_URL));
}
?>
oauth2callback.php:
<?php
set_include_path(get_include_path() . PATH_SEPARATOR . 'google-api-php-client-master/src');
require_once 'google-api-php-client-master/src/Google/autoload.php';

session_start();

$client = new Google_Client();
$client->setAuthConfigFile('client_secrets.json');
$client->setRedirectUri('http://' . $_SERVER['HTTP_HOST'] . '/oauth2callback.php');
$client->addScope(Google_Service_Drive::DRIVE);
$client->setAccessType("offline");

if (! isset($_GET['code'])) {
    $auth_url = $client->createAuthUrl();
    header('Location: ' . filter_var($auth_url, FILTER_SANITIZE_URL));
} else {
    $client->authenticate($_GET['code']);
    $_SESSION['access_token'] = $client->getAccessToken();
    $_SESSION['refresh_token'] = $client->getRefreshToken(); // Important to clear the session variable after you have the token saved somewhere
    $redirect_uri = 'http://' . $_SERVER['HTTP_HOST'] . '/'; // Redirects to index.php
    header('Location: ' . filter_var($redirect_uri, FILTER_SANITIZE_URL));
}
?>
Several important notes about this code:
You need to make sure you have created the appropriate credentials (Web application)
You need to add the correct redirect URI to your OAuth 2.0 client ID. It should be something like http://yourdomain.com/oauth2callback.php.
You need to have uploaded your client_secrets.json file to your server. You can get this by going to your developers console and clicking the download icon on the far right side of the OAuth 2.0 client IDs list.
You should probably clear your session variables after you copy the refresh token down for security reasons.
So in summary, we have created a simple program that prints out the refresh token for a specific user. You need to copy that token down and save it for later.
Even though you have the refresh token, you need a way of getting an access token to make API calls. (You can't make an API call without an access token, and because they expire every 3600 seconds, you need a way of getting a new one when you need it)
Here is the Parse CloudCode for accomplishing this step:
Parse.Cloud.define("getAccessToken", function(request, response) {
Parse.Cloud.httpRequest({
method: "POST",
url: 'https://www.googleapis.com/oauth2/v3/token/',
params: {
refresh_token : 'INSERT_REFRESH_TOKEN_HERE',
client_id : 'INSERT_CLIENT_ID_HERE',
client_secret : 'INSERT_CLIENT_SECRET_HERE',
grant_type : 'refresh_token'
}
}).then(function(httpResponse) {
response.success(httpResponse.text);
}, function(httpResponse) {
response.error('Request failed with response code ' + httpResponse.status);
});
});
This function sends a request for an access token. You can do whatever you want with that token; it is stored in httpResponse.text (if the request was successful). Note that grant_type should be the literal string refresh_token, not your actual refresh token.
Once you have an access token, you are free to make API calls within the scope of your refresh token. Example:
Parse.Cloud.define("sendRequest", function(request, response) {
Parse.Cloud.httpRequest({
method: "GET",
url: 'https://www.googleapis.com/drive/v2/about',
params: {
access_token : 'INSERT_YOUR_ACCESS_TOKEN_HERE'
}
}).then(function(httpResponse) {
response.success(httpResponse.text);
}, function(httpResponse) {
response.error('Request failed with response code ' + httpResponse.status);
});
});
This function returns information about the drive account associated with that refresh token.
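As a hypothetical next step (not part of the original answer), a simple media upload to Drive from CloudCode could look like the sketch below. It assumes the access token obtained from getAccessToken is passed in as a parameter and that the file contents arrive as a plain string; a real picture upload would need a multipart/related request carrying both metadata and binary data.

Parse.Cloud.define("uploadToDrive", function(request, response) {
  Parse.Cloud.httpRequest({
    method: "POST",
    // "Simple upload": sends only the file contents, no metadata.
    url: 'https://www.googleapis.com/upload/drive/v2/files?uploadType=media',
    headers: {
      'Authorization': 'Bearer ' + request.params.accessToken,
      'Content-Type': 'text/plain'
    },
    body: request.params.fileContents
  }).then(function(httpResponse) {
    // The response text contains the new file's metadata, including its id.
    response.success(httpResponse.text);
  }, function(httpResponse) {
    response.error('Upload failed with response code ' + httpResponse.status);
  });
});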
With regards to my second question, I have not done enough research to know the answer to that yet, but I plan on finding out and posting it here.
