I'm reading this documentation:
https://developers.google.com/drive/api/v3/appdata
This is my code:
var fileMetadata = {
  'name': 'config.json',
  'parents': ['appDataFolder']
};
var media = {
  mimeType: 'application/json',
  body: '"sample text"'
};
const request = gapi.client.drive.files.create({
  resource: fileMetadata,
  media,
  fields: 'id'
});
request.execute(function (err, file) {
  if (err) {
    // Handle error
    console.error(err);
  } else {
    console.log('Folder Id:', file.id);
  }
});
I get a 403 error: "The user does not have sufficient permissions for this file."
Doesn't the user have permission to create a file in his appDataFolder? How can I create a file in it?
The scope of gapi client is 'https://www.googleapis.com/auth/drive.appdata' and the user accepted it.
I believe the reason for this error is that you are only using the scope that grants access to the appData folder, not the scope that allows creating files. Accessing the appData folder and creating files are two different things, and according to your code you are trying to create a file inside the appData folder.
I suggest including both scopes:
https://www.googleapis.com/auth/drive.appdata
https://www.googleapis.com/auth/drive.file
If you are not using incremental authorization, make sure to revoke access and reauthorize.
Reference: https://developers.google.com/drive/api/v3/about-auth#OAuth2Authorizing
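For reference, gapi takes multiple scopes as one space-delimited string. A minimal sketch of assembling that string (the gapi.client.init call is commented out since it only runs in a browser, and CLIENT_ID is a placeholder):

```javascript
// Both scopes suggested above, joined the way gapi expects them.
const SCOPES = [
  'https://www.googleapis.com/auth/drive.appdata',
  'https://www.googleapis.com/auth/drive.file',
];

// gapi expects a single space-delimited string when requesting multiple scopes.
const scopeParam = SCOPES.join(' ');

// In the browser you would then initialize with something like:
// gapi.client.init({ clientId: CLIENT_ID, scope: scopeParam, discoveryDocs: [...] });

console.log(scopeParam);
```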
You don't actually need the https://www.googleapis.com/auth/drive.file scope to create or delete data inside the appDataFolder; the https://www.googleapis.com/auth/drive.appdata scope covers all of that.
Try this. Just pass your auth client to the createFile() function.
// Requiring the modular @googleapis/drive service is much lighter than requiring the whole googleapis package
const GDrive = require('@googleapis/drive');

function createFile(auth) {
  const drive = GDrive.drive({version: 'v3', auth});
  const fileMetadata = {
    'name': 'config.json',
    'parents': ['appDataFolder']
  };
  const media = {
    mimeType: 'application/json',
    body: '{"TEST": "THIS WORKED"}'
  };
  drive.files.create({
    resource: fileMetadata,
    media: media,
    fields: 'id'
  }).then((resp) => {
    console.log('File Id: ', resp.data.id);
  }).catch((error) => {
    console.error('Unable to create the file: ', error);
  });
}
Related
I've been trying to GET a Google Drive file by file ID using a service account in Node.js, but the requests fail with the following error (indicating a lack of access):
code: 404,
errors: [
  {
    message: 'File not found: XXX.',
    domain: 'global',
    reason: 'notFound',
    location: 'fileId',
    locationType: 'parameter'
  }
]
Scope
I've tried to play around with the scope by adding extra scopes, but https://www.googleapis.com/auth/drive has always been in place.
const scopes = [
  'https://www.googleapis.com/auth/drive',
  'https://www.googleapis.com/auth/drive.appdata',
  'https://www.googleapis.com/auth/drive.file',
  'https://www.googleapis.com/auth/drive.metadata',
  'https://www.googleapis.com/auth/drive.metadata.readonly',
  'https://www.googleapis.com/auth/drive.photos.readonly',
  'https://www.googleapis.com/auth/drive.readonly',
];
Service account
I've created a service account, following the same conventional flow shown in various resources/docs/tutorials (1, 2, 3, etc.):
https://console.cloud.google.com/iam-admin/serviceaccounts?project=XXXX
Enabled Google Drive API
https://console.cloud.google.com/marketplace/product/google/drive.googleapis.com
Enabled domain-wide delegation in the Google Admin panel with the exact same scopes as listed above (also tested without enabling this):
https://admin.google.com/ac/owl/domainwidedelegation
Source code
There's a Google Node.js quickstart out there for accessing the Drive API that works by having a user grant permissions through an OAuth2 modal (example); that's not acceptable in my case, since it must work using a service account (a.k.a. daemon, machine-to-machine) without any real user interaction.
I've tried out many ways:
using google-auth-library package:
const { auth } = require('google-auth-library');

const client = auth.fromJSON({
  type: 'service_account',
  project_id: 'XXX',
  private_key_id: 'XXX',
  private_key: 'XXX',
  client_email: 'X@Y.iam.gserviceaccount.com',
  client_id: 'XXXX',
  auth_uri: 'https://accounts.google.com/o/oauth2/auth',
  token_uri: 'https://oauth2.googleapis.com/token',
  auth_provider_x509_cert_url: 'https://www.googleapis.com/oauth2/v1/certs',
  client_x509_cert_url:
    'https://www.googleapis.com/robot/v1/metadata/x509/X%40Y.iam.gserviceaccount.com',
});

// also tested with the exact same scopes listed above
const scopes = ['https://www.googleapis.com/auth/drive'];
client.scopes = scopes;

// tested both options for `supportsAllDrives`: true/false
const url = `https://www.googleapis.com/drive/v3/files/XXX?fields=name&supportsAllDrives=true`;
client.request({ url }).then(console.log).catch(console.error);
using ts-google-drive package:
import { TsGoogleDrive } from 'ts-google-drive';

const tsGoogleDrive = new TsGoogleDrive({
  credentials: {
    client_email: 'X@Y.iam.gserviceaccount.com',
    private_key: '',
  },
});

async function getSingleFile(fileId: string): Promise<void> {
  // returns `undefined`, meaning an error
  const file = await tsGoogleDrive.getFile(fileId);
  console.log('file', file);
  if (file) {
    const isFolder = file.isFolder;
    console.log('isFolder', isFolder);
  }
}

getSingleFile('XXX');
using googleapis
const { google } = require('googleapis');

const auth = new google.auth.GoogleAuth({
  keyFile: 'service-account.json', // file properly located
  scopes: ..., // same scopes
});
const drive = google.drive({ version: 'v3', auth });

const driveResponse = await drive.files.list({
  fields: '*',
});
const file = await drive.files.get({
  fileId: 'XXX',
  fields: 'name',
  supportsAllDrives: true,
});
console.log(file); // error!
using googleapis with jwtClient
const google = require('googleapis');
const fs = require('fs');

const key = require('./service-account.json');
const scopes = ...; // same
const drive = google.google.drive('v3');
const jwtClient = new google.google.auth.JWT(
  key.client_email,
  null,
  key.private_key,
  scopes,
  null,
);

jwtClient.authorize(async (authErr) => {
  if (authErr) {
    console.log(authErr); // NO error here
    return;
  }
  const drive = google.google.drive({ version: 'v3', auth: jwtClient });
  console.log('jwtClient.getCredentials()', jwtClient.getCredentials());
  console.log('jwtClient.apiKey', jwtClient.apiKey);
  console.log('jwtClient.credentials', jwtClient.credentials);
  console.log('jwtClient.gtoken', jwtClient.gtoken);

  // errors occur down below when actually requesting the api
  const file = await drive.files.get({
    fileId: 'XXX',
    fields: 'name',
    supportsAllDrives: true,
  });
  console.log(file);
});
./service-account.json file structure:
{
  "type": "service_account",
  "project_id": "X",
  "private_key_id": "XXXX",
  "client_email": "X@Y.iam.gserviceaccount.com",
  "client_id": "XXX",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/X%40Y.iam.gserviceaccount.com"
}
Google Drive resources permissions
Since I'm trying to access my internal company's Google Drive files, I'm running into issues granting the possibly required file permissions to my service account:
an attempt to share a folder/drive with my service account was unsuccessful
It's also important to note that I've failed to access a file that had been shared with my service account individually (the file was in "Shared with me").
Even though any public file in my company's drive can be accessed without a problem.
Expected result
To be able to access the company's Drive files using a service account (no real user interaction); to highlight, in case it's important: those files are located in "Shared with me" and in "Shared drives".
I have created a project in Google's API developer console and enabled the Google Drive API. I have also created and downloaded a service account credentials .json file, which I am using on a Node.js backend server to connect to Google Drive and upload image files.
npm i googleapis
const { google } = require('googleapis');
let privatekey = require("./privatekey.json");

// configure a JWT auth client
let jwtClient = new google.auth.JWT(
  privatekey.client_email,
  null,
  privatekey.private_key,
  ['https://www.googleapis.com/auth/drive']);

// authenticate request
jwtClient.authorize(function (err, tokens) {
  if (err) {
    console.log(err);
    return;
  } else {
    console.log("Successfully connected to gdrive!");
  }
});
// Google Drive API
let drive = google.drive('v3');
drive.files.list({
  auth: jwtClient,
  q: "name contains 'photo'"
}, function (err, response) {
  if (err) {
    console.log('The API returned an error: ' + err);
    return;
  }
  console.log(response.data);
  var files = response.data.files;
  if (files.length == 0) {
    console.log('No files found.');
  } else {
    console.log('Files from Google Drive:');
    for (var i = 0; i < files.length; i++) {
      var file = files[i];
      console.log('%s (%s)', file.name, file.id);
    }
  }
});
let fs = require('fs');

var fileMetadata = {
  name: 'photo.png',
};
var media = {
  mimeType: 'image/jpeg',
  body: fs.createReadStream('files/photo.png'),
};
drive.files.create(
  {
    auth: jwtClient,
    resource: fileMetadata,
    media: media,
    fields: 'id',
  },
  function (err, file) {
    if (err) {
      // Handle error
      console.error(err);
    } else {
      console.log('File Id: ', file.data.id);
    }
  },
);
When I upload the file I get the unique file ID in the response.
On the Android application, as well as on the front-end React application, I want to display this image file using a URL.
I tried https://drive.google.com/open?id=PASTE YOUR ID HERE as well as http://drive.google.com/uc?export=view&id=PASTE YOUR ID HERE, but it says you need access.
I also tried publishAuto:true while uploading the image, but it didn't work.
What's the correct way to make the uploaded image file accessible via URL?
I solved it by creating a new folder and setting the permission for this folder as
type: anyone
role: reader
and then uploading images to this folder.
When I want to display uploaded images, I can use the URL below:
https://drive.google.com/thumbnail?id=YOUR IMAGE ID
Here is the complete code.
const { google } = require('googleapis');
let privatekey = require("./privatekey.json");

let drive = google.drive('v3');

// configure a JWT auth client - login and get the token
let jwtClient = new google.auth.JWT(
  privatekey.client_email,
  null,
  privatekey.private_key,
  ['https://www.googleapis.com/auth/drive']);

// authenticate request
jwtClient.authorize(function (err, tokens) {
  if (err) {
    console.log(err);
    return;
  } else {
    console.log("Successfully connected to gdrive!");
  }
});
Run this code only once.
// For creating the google drive folder
var fileMetadata = {
  'name': 'ProductIcons',
  'mimeType': 'application/vnd.google-apps.folder'
};
drive.files.create({
  auth: jwtClient,
  resource: fileMetadata,
  fields: 'id'
}, function (err, file) {
  if (err) {
    // Handle error
    console.error(err);
  } else {
    console.log('Folder Id: ', file);
  }
});
// For changing the folder permission
var fileId = 'FOLDER ID HERE';
var permission = {
  'type': 'anyone',
  'role': 'reader',
};
drive.permissions.create({
  auth: jwtClient,
  resource: permission,
  fileId: fileId,
  fields: 'id',
}, function (err, res) {
  if (err) {
    // Handle error...
    console.error(err);
  } else {
    console.log('Permission ID: ', res)
  }
});
And then upload as many images as you want into that folder using the code below.
// For uploading an image to the folder
var folderId = 'FOLDER ID HERE';
let fs = require('fs');

var fileMetadata = {
  'name': 'photo.png',
  parents: [folderId]
};
var media = {
  mimeType: 'image/jpeg',
  body: fs.createReadStream('files/photo.png')
};
drive.files.create({
  auth: jwtClient,
  resource: fileMetadata,
  publishAuto: true,
  media: media,
  fields: 'id'
}, function (err, file) {
  if (err) {
    // Handle error
    console.error(err);
  } else {
    console.log('File Id: ', file.data.id);
  }
});
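To build the display URL from the file ID returned by the upload, a small helper can be used. This is a sketch; the optional size argument is an assumption based on the commonly used `sz` query parameter of the thumbnail endpoint (e.g. 'w320' for a 320px-wide render):

```javascript
// Builds the public display URL for an image uploaded to the shared folder.
function thumbnailUrl(fileId, size) {
  let url = 'https://drive.google.com/thumbnail?id=' + encodeURIComponent(fileId);
  if (size) {
    url += '&sz=' + encodeURIComponent(size); // assumed size parameter, e.g. 'w320'
  }
  return url;
}

console.log(thumbnailUrl('abc123'));         // https://drive.google.com/thumbnail?id=abc123
console.log(thumbnailUrl('abc123', 'w320')); // https://drive.google.com/thumbnail?id=abc123&sz=w320
```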
I'm running tests to share files with the Google APIs. I already know how to share a file with someone, and as far as I understand it is shared with user permission as reader or writer, which is what I need through an app. However, on the side of the person the file was shared with, I cannot see it: as far as I've noticed, the API does not list files shared with read permission in Drive. In the app I just set the scope to "https://www.googleapis.com/auth/drive.file", because I only want my app to have access to the files it created.
owner (listed in API) --> shared --> user (not listed in API)
///////// CODE: SHARE FILE (OWNER)
var request1 = gapi.client.request({
  'path': '/drive/v3/files/' + fileId + '/permissions',
  'method': 'POST',
  'headers': {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer ' + sTokenDrive
  },
  'body': {
    'role': 'writer',
    'type': 'user',
    'emailAddress': 'user@gmail.com'
  }
});
request1.execute(function(resp) {
  console.log(resp);
});
//////// CODE: LIST FILES (USER)
<script src="https://apis.google.com/js/api.js"></script>
<script>
  /**
   * Sample JavaScript code for drive.files.list
   * See instructions for running APIs Explorer code samples locally:
   * https://developers.google.com/explorer-help/guides/code_samples#javascript
   */
  function authenticate() {
    return gapi.auth2.getAuthInstance()
      .signIn({scope: "https://www.googleapis.com/auth/drive.file"})
      .then(function() { console.log("Sign-in successful"); },
            function(err) { console.error("Error signing in", err); });
  }
  function loadClient() {
    return gapi.client.load("https://content.googleapis.com/discovery/v1/apis/drive/v3/rest")
      .then(function() { console.log("GAPI client loaded for API"); },
            function(err) { console.error("Error loading GAPI client for API", err); });
  }
  // Make sure the client is loaded and sign-in is complete before calling this method.
  function execute() {
    return gapi.client.drive.files.list({
      "corpus": "user",
      "q": "sharedWithMe = true"
    })
    .then(function(response) {
      // Handle the results here (response.result has the parsed body).
      console.log("Response", response);
    },
    function(err) { console.error("Execute error", err); });
  }
  gapi.load("client:auth2", function() {
    gapi.auth2.init({client_id: 'CLIENT_ID'});
  });
</script>
<button onclick="authenticate().then(loadClient)">authorize and load</button>
<button onclick="execute()">execute</button>
Following on from the great help I received on my original post
Uploading a file to an S3 bucket, triggering a Lambda, which sends an email containing info on the file uploaded to the S3 bucket
I have previously tested sending the email, so I know that works, but when I try to include the data of the uploaded object it fires this error:
Could not fetch object data: { AccessDenied: Access Denied
    at Request.extractError (/var/runtime/node_modules/aws-sdk/lib/services/s3.js:577:35)
    at Request.callListeners (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:105:20)
    at Request.emit (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:77:10)
    at Request.emit (/var/runtime/node_modules/aws-sdk/lib/request.js:683:14)
I have found many questions related to this online regarding policies around roles, etc., so I have added the Lambda to the S3 event and added S3 permissions to the role, e.g.:
https://stackoverflow.com/questions/35589641/aws-lambda-function-getting-access-denied-when-getobject-from-s3
Unfortunately none of these have helped. However, I noticed a comment:
"Then the best solution is to allow S3FullAccess, see if it works. If it does, then remove one set of access at a time from the policy and find the least privileges required for your Lambda to work. If it does not work even after giving S3FullAccess, then the problem is elsewhere."
So how would I go about finding where the problem is?
Thank you.
'use strict';
console.log('Loading function');

var aws = require('aws-sdk');
var ses = new aws.SES({
  region: 'us-west-2'
});
//var fileType = require('file-type');
console.log('Loading function2');

var s3 = new aws.S3({ apiVersion: '2006-03-01', accessKeyId: process.env.ACCESS_KEY, secretAccessKey: process.env.SECRET_KEY, region: process.env.LAMBDA_REGION });
console.log('Loading function3');
//var textt = "";

exports.handler = function(event, context) {
  console.log("Incoming: ", event);
  // textt = event.Records[0].s3.object.key;
  // var output = querystring.parse(event);
  //var testData = null;

  // Get the object from the event and show its content type
  // const bucket = event.Records[0].s3.bucket.name;
  // const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
  const params = {
    Bucket: 'bucket',
    Key: 'key',
  };

  s3.getObject(params, function(err, objectData) {
    if (err) {
      console.log('Could not fetch object data: ', err);
    } else {
      console.log('Data was successfully fetched from object');
      var eParams = {
        Destination: {
          ToAddresses: ["fake@fake.com"]
        },
        Message: {
          Body: {
            Text: {
              Data: objectData
              // Data: textt
            }
          },
          Subject: {
            Data: "Email Subject!!!"
          }
        },
        Source: "fake@fake.com"
      };

      console.log('===SENDING EMAIL===');
      var email = ses.sendEmail(eParams, function(err, emailResult) {
        if (err) console.log('Error while sending email', err);
        else {
          console.log("===EMAIL SENT===");
          //console.log(objectData);
          console.log("EMAIL CODE END");
          console.log('EMAIL: ', emailResult);
          context.succeed(event);
        }
      });
    }
  });
};
UPDATE
I have added comments to the code and checked the logs... it doesn't get past this line:
var s3 = new aws.S3({ apiVersion: '2006-03-01', accessKeyId: process.env.ACCESS_KEY, secretAccessKey: process.env.SECRET_KEY, region: process.env.LAMBDA_REGION });
Is this in any way related to the access denied error?
NOTE: all I want is the filename of the uploaded file.
UPDATE 2
I've replaced the line causing the issue with:
var s3 = new aws.S3().getObject({ Bucket: this.awsBucketName, Key: 'keyName' }, function(err, data) {
  if (!err)
    console.log(data.Body.toString());
});
but this fires TypeError: s3.getObject is not a function.
I also tried var s3 = new aws.S3();
but this brings back the original error of Could not fetch object data: { AccessDenied: Access Denied.
First of all, the region should be the S3 bucket's region, not the Lambda region. Next, you need to verify your credentials and whether they have access to the S3 bucket you have defined. As you stated in one of the comments, try attaching the Amazon-managed S3 full access policy to the IAM user associated with the credentials you are using in Lambda. The next step would be to use the AWS CLI to see if you can access this bucket, with something like:
aws s3 ls
Having said that, you should not use credentials at all. Since Lambda and S3 are both Amazon services, you should use roles: give the Lambda function a role that grants it full access to S3, and do not use AWS IAM credentials for this. Then
var s3 = new AWS.S3();
is sufficient.
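As a side note, the commented-out lines in the handler above already hint at how to read the uploaded object's bucket and key (the filename) from the trigger event. Here is that extraction as a small standalone sketch (the fake event below is illustrative only):

```javascript
// Extracts the bucket and object key from an S3 trigger event.
// S3 URL-encodes keys and replaces spaces with '+', so the key must be decoded before use.
function objectFromS3Event(event) {
  const record = event.Records[0].s3;
  return {
    bucket: record.bucket.name,
    key: decodeURIComponent(record.object.key.replace(/\+/g, ' ')),
  };
}

// Example with a minimal fake event:
const fakeEvent = {
  Records: [{ s3: { bucket: { name: 'my-bucket' }, object: { key: 'uploads/my+file%281%29.png' } } }],
};
console.log(objectFromS3Event(fakeEvent)); // { bucket: 'my-bucket', key: 'uploads/my file(1).png' }
```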
I am trying to store a JSON object that contains config info in a Google Drive appdata file. I am currently writing the app in client-side JavaScript. Using the Google Drive API, I can check for the file in the appdata folder. How would I go about generating a new file and storing it in the appdata folder if the config is not found?
var request = gapi.client.drive.files.list({
  'q': '\'appdata\' in parents'
});
request.execute(function(resp) {
  for (i in resp.items) {
    if (resp.items[i].title == FILENAME) {
      fileId = resp.items[i].id;
      readFile(); // Function to read the file
      return;
    }
  }
  // Create the new file if not found
});
The gapi client does not provide a method to upload file content to Google Drive (it does handle metadata), but Google still exposes an upload endpoint on the API.
Here's an example I've been using for the v3 API:
function saveFile(file, fileName, callback) {
  // Wrap the JSON payload in a Blob for the multipart upload.
  var blob = new Blob([JSON.stringify(file)], {type: 'application/json'});
  var metadata = {
    'name': fileName,               // Filename at Google Drive
    'mimeType': 'application/json', // mimeType at Google Drive
    'parents': ['appDataFolder'],   // Folder ID at Google Drive
  };
  var accessToken = gapi.auth.getToken().access_token; // gapi is used here to retrieve the access token.
  var form = new FormData();
  form.append('metadata', new Blob([JSON.stringify(metadata)], {type: 'application/json'}));
  form.append('file', blob);

  var xhr = new XMLHttpRequest();
  xhr.open('post', 'https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart&fields=id');
  xhr.setRequestHeader('Authorization', 'Bearer ' + accessToken);
  xhr.responseType = 'json';
  xhr.onload = () => {
    console.log(xhr.response.id); // Retrieve the uploaded file's ID.
    callback(xhr.response);
  };
  xhr.send(form);
}
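For illustration, here is roughly what the uploadType=multipart request body looks like when assembled by hand instead of via FormData. This is a sketch only (the boundary string is arbitrary, and a real request would also need the matching multipart Content-Type header), not production code:

```javascript
// Builds a multipart request body: one JSON part for the Drive metadata,
// one part for the file content, separated by the given boundary.
function buildMultipartBody(metadata, content, boundary) {
  return [
    '--' + boundary,
    'Content-Type: application/json; charset=UTF-8',
    '',
    JSON.stringify(metadata),
    '--' + boundary,
    'Content-Type: application/json',
    '',
    content,
    '--' + boundary + '--',
    '',
  ].join('\r\n');
}

const body = buildMultipartBody(
  { name: 'config.json', parents: ['appDataFolder'] },
  '{"TEST":"THIS WORKED"}',
  'foo_bar_baz'
);
console.log(body);
```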
And since Google Drive allows duplicate filenames (files are unique by ID), I use something like this to check whether the file already exists:
function fileExists(file, fileName) {
  var request = gapi.client.drive.files.list({
    spaces: 'appDataFolder',
    fields: 'files(id, name, modifiedTime)'
  });
  request.execute(function(res) {
    var exists = res.files.filter(function(f) {
      return f.name.toLowerCase() === fileName.toLowerCase();
    }).length > 0;
    if (!exists) {
      saveFile(file, fileName, function(newFileId) {
        // Do something with the result
      });
    }
  });
}
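The duplicate check above boils down to a pure, case-insensitive name comparison. Extracted as a small helper (a sketch), it can be exercised without calling the API at all:

```javascript
// Returns true if any file in the list matches the given name, ignoring case.
function nameExists(files, fileName) {
  return files.some(function (f) {
    return f.name.toLowerCase() === fileName.toLowerCase();
  });
}

console.log(nameExists([{ name: 'Config.JSON' }], 'config.json')); // true
console.log(nameExists([{ name: 'other.json' }], 'config.json'));  // false
```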
Check the documentation about Storing Application Data:
The 'Application Data folder' is a special folder that is only accessible by your application. Its content is hidden from the user, and from other apps. Despite being hidden from the user, the Application Data folder is stored on the user's Drive and therefore uses the user's Drive storage quota. The Application Data folder can be used to store configuration files, saved games data, or any other types of files that the user should not directly interact with.
NOTE:
To be able to use your Application Data folder, request access to the following scope:
https://www.googleapis.com/auth/drive.appdata
If you check the sample code for inserting a file into the Application Data folder (PHP):
$fileMetadata = new Google_Service_Drive_DriveFile(array(
  'name' => 'config.json',
  'parents' => array('appDataFolder')
));
$content = file_get_contents('files/config.json');
$file = $driveService->files->create($fileMetadata, array(
  'data' => $content,
  'mimeType' => 'application/json',
  'uploadType' => 'multipart',
  'fields' => 'id'));
printf("File ID: %s\n", $file->id);
Adding appDataFolder as a parent of the file makes it write to the app folder. Then implement your own upload code to insert the file and its content into the app folder.
Hope this helps