Google Drive API: 404 File not found error [custom company domain] - javascript

I've been trying to GET a Google Drive file by file ID using a service account in NodeJS, but requests fail with the following error (implying a lack of access):
code: 404,
errors: [
  {
    message: 'File not found: XXX.',
    domain: 'global',
    reason: 'notFound',
    location: 'fileId',
    locationType: 'parameter'
  }
]
Scope
I've tried to play around with the scope by adding extra scopes, but essentially https://www.googleapis.com/auth/drive has always been in place.
const scopes = [
  'https://www.googleapis.com/auth/drive',
  'https://www.googleapis.com/auth/drive.appdata',
  'https://www.googleapis.com/auth/drive.file',
  'https://www.googleapis.com/auth/drive.metadata',
  'https://www.googleapis.com/auth/drive.metadata.readonly',
  'https://www.googleapis.com/auth/drive.photos.readonly',
  'https://www.googleapis.com/auth/drive.readonly',
];
Service account
I've created a service account, following the conventional flow shown in various resources/docs/tutorials (1, 2, 3, etc.):
https://console.cloud.google.com/iam-admin/serviceaccounts?project=XXXX
Enabled Google Drive API
https://console.cloud.google.com/marketplace/product/google/drive.googleapis.com
Enabled domain-wide delegation in the Google Admin panel with the exact same scopes as listed above (I also tested without enabling this):
https://admin.google.com/ac/owl/domainwidedelegation
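(For context: domain-wide delegation only takes effect when the token request impersonates a concrete domain user via the JWT assertion's `sub` claim; without it, the service account acts as itself and only sees files explicitly shared with it. A sketch of the claim set Google's token endpoint expects, with placeholder addresses:)

```javascript
// Sketch of the OAuth 2.0 JWT assertion claims that domain-wide
// delegation relies on. All values below are placeholders.
const now = Math.floor(Date.now() / 1000);
const claims = {
  iss: 'my-sa@my-project.iam.gserviceaccount.com', // the service account
  scope: 'https://www.googleapis.com/auth/drive',
  aud: 'https://oauth2.googleapis.com/token',
  sub: 'user@company.com', // the domain user to impersonate
  iat: now,
  exp: now + 3600,
};
```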
Source code
There's a Google NodeJS quickstart out there for accessing the Drive API that works by having a user grant permissions through an OAuth2 modal (example); that's not acceptable in my case, since it must work with a service account (aka a daemon, machine-to-machine) without any real user interaction.
I've tried out many ways:
using google-auth-library package:
const { auth } = require('google-auth-library');

const client = auth.fromJSON({
  type: 'service_account',
  project_id: 'XXX',
  private_key_id: 'XXX',
  private_key: 'XXX',
  client_email: 'X@Y.iam.gserviceaccount.com',
  client_id: 'XXXX',
  auth_uri: 'https://accounts.google.com/o/oauth2/auth',
  token_uri: 'https://oauth2.googleapis.com/token',
  auth_provider_x509_cert_url: 'https://www.googleapis.com/oauth2/v1/certs',
  client_x509_cert_url:
    'https://www.googleapis.com/robot/v1/metadata/x509/X%40Y.iam.gserviceaccount.com',
});
// also tested with exact same scopes listed above
const scopes = ['https://www.googleapis.com/auth/drive'];
client.scopes = scopes;
// tested both options for `supportsAllDrives`: true/false
const url = `https://www.googleapis.com/drive/v3/files/XXX?fields=name&supportsAllDrives=true`;
client.request({ url }).then(console.log).catch(console.error);
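As an aside, the request URL above can also be assembled with Node's built-in URL class, which keeps the query parameters explicit (the file id is a placeholder):

```javascript
// Build the Drive v3 files.get URL with explicit query parameters.
// FILE_ID is a placeholder.
const fileId = 'FILE_ID';
const url = new URL(`https://www.googleapis.com/drive/v3/files/${fileId}`);
url.searchParams.set('fields', 'name');
url.searchParams.set('supportsAllDrives', 'true');
console.log(url.toString());
// → https://www.googleapis.com/drive/v3/files/FILE_ID?fields=name&supportsAllDrives=true
```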
using ts-google-drive package:
import { TsGoogleDrive } from 'ts-google-drive';

const tsGoogleDrive = new TsGoogleDrive({
  credentials: {
    client_email: 'X@Y.iam.gserviceaccount.com',
    private_key: '',
  },
});

async function getSingleFile(fileId: string): Promise<void> {
  // returns `undefined`, meaning an error
  const file = await tsGoogleDrive.getFile(fileId);
  console.log('file', file);
  if (file) {
    const isFolder = file.isFolder;
    console.log('isFolder', isFolder);
  }
}

getSingleFile('XXX');
using googleapis
const { google } = require('googleapis');

const auth = new google.auth.GoogleAuth({
  keyFile: 'service-account.json', // file properly located
  scopes: ..., // same scopes
});
const drive = google.drive({ version: 'v3', auth });
const driveResponse = await drive.files.list({
  fields: '*',
});
const file = await drive.files.get({
  fileId: 'XXX',
  fields: 'name',
  supportsAllDrives: true,
});
console.log(file); // error!
using googleapis with jwtClient
const google = require('googleapis');
const fs = require('fs');
const key = require('./service-account.json');

const scopes = ... // same
const drive = google.google.drive('v3');
const jwtClient = new google.google.auth.JWT(
  key.client_email,
  null,
  key.private_key,
  scopes,
  null,
);
jwtClient.authorize(async (authErr) => {
  if (authErr) {
    console.log(authErr); // NO error here
    return;
  }
  const drive = google.google.drive({ version: 'v3', auth: jwtClient });
  console.log('jwtClient.getCredentials()', jwtClient.getCredentials());
  console.log('jwtClient.apiKey', jwtClient.apiKey);
  console.log('jwtClient.credentials', jwtClient.credentials);
  console.log('jwtClient.gtoken', jwtClient.gtoken);
  // errors occur down below when actually requesting the api
  const file = await drive.files.get({
    fileId: 'XXX',
    fields: 'name',
    supportsAllDrives: true,
  });
  console.log(file);
});
./service-account.json file structure:
{
  "type": "service_account",
  "project_id": "X",
  "private_key_id": "XXXX",
  "client_email": "X@Y.iam.gserviceaccount.com",
  "client_id": "XXX",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/X%40Y.iam.gserviceaccount.com"
}
Google Drive resources permissions
Since I'm trying to access my company's internal Google Drive files, I'm running into issues granting the (possibly required) file permissions to my service account:
my attempt to share a folder/drive with my service account was unsuccessful
It's also important to note that I failed to access a file that had been shared with my service account individually (the file was in "Shared with me"), even though any public file in my company's drive can be accessed without a problem.
Expected result
To be able to access the company's drive files using a service account (no real user interaction). In case it's important: those files are located in "Shared with me" and in "Shared drives".

Related

Cannot write using the Twitter API (unsupported authentication)

I'm trying to use twitter-api-v2 to query Twitter using their [rate limit example]:
import dotenv from 'dotenv';
import { TwitterApi } from 'twitter-api-v2';
import { TwitterApiRateLimitPlugin } from '@twitter-api-v2/plugin-rate-limit';

dotenv.config();

const API_KEY = process.env.TWITTER_API_KEY;
const API_SECRET = process.env.TWITTER_API_SECRET;
const BEARER_TOKEN = process.env.BEARER_TOKEN;

const rateLimitPlugin = new TwitterApiRateLimitPlugin();
// Instantiate with desired auth type (here's Bearer v2 auth)
const twitterClient = new TwitterApi(process.env.BEARER_TOKEN, { plugins: [rateLimitPlugin] });
//const twitterClient = new TwitterApi({ appKey: API_KEY, appSecret: API_SECRET }, { plugins: [rateLimitPlugin] });

await twitterClient.v2.me();
const currentRateLimitForMe = await rateLimitPlugin.v2.getRateLimit('users/me');
console.log(currentRateLimitForMe.limit); // 75
console.log(currentRateLimitForMe.remaining); // 74
I'm getting an error:
'Unsupported Authentication: Authenticating with OAuth 2.0 Application-Only is forbidden for this endpoint. Supported authentication types are [OAuth 1.0a User Context, OAuth 2.0 User Context].',
I'm guessing it has an issue with how I'm logging in; I've tried the bearer token and my API keys, but neither seems to work.
How can I obtain rate limit information?
I'm not sure if this is the case, but when I logged in to the developer portal I saw there was another section of keys I could create; everything I was doing before was operating in read-only mode.
Here is the full code.
const twitterClient = new TwitterApi({
  appKey: API_KEY,
  appSecret: API_SECRET,
  accessToken: ACCESS_TOKEN_KEY,
  accessSecret: ACCESS_TOKEN_SECRET
}, { plugins: [rateLimitPlugin] });

await twitterClient.v2.me();
const currentRateLimitForMe = await rateLimitPlugin.v2.getRateLimit('users/me');
console.log(`rate limit: ${currentRateLimitForMe.limit} remaining: ${currentRateLimitForMe.remaining}`);
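For what it's worth, what the rate-limit plugin records boils down to Twitter's documented x-rate-limit-* response headers; a hand-rolled sketch (the header values here are made up):

```javascript
// Parse Twitter's standard rate-limit response headers into the same
// shape the plugin exposes (limit/remaining/reset).
function parseRateLimit(headers) {
  return {
    limit: Number(headers['x-rate-limit-limit']),
    remaining: Number(headers['x-rate-limit-remaining']),
    reset: Number(headers['x-rate-limit-reset']), // epoch seconds
  };
}

// Illustrative values only:
const info = parseRateLimit({
  'x-rate-limit-limit': '75',
  'x-rate-limit-remaining': '74',
  'x-rate-limit-reset': '1700000000',
});
console.log(info.remaining); // 74
```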

aws javascript sdk v3 - signature mismatch error

I can generate the presigned URL following the steps described in this section. I wanted to test uploading a specific image, marble.jpg, so I used Postman: I copied the presigned URL and hit the endpoint with a PUT request, and got this error:
<?xml version="1.0" encoding="UTF-8"?>
<Error>
  <Code>SignatureDoesNotMatch</Code>
  <Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message>
  <Key>records/marble_cave.jpg</Key>
  <BucketName>bucket</BucketName>
  <Resource>/bucket/records/marble.jpg</Resource>
  <RequestId>17E3999B521ABB65</RequestId>
  <HostId>50abb07a-2ad0-4948-96e0-23403f661cba</HostId>
</Error>
The following resources are set up:
I'm using the MinIO server to test this locally.
I'm using version 3 of the AWS SDK for NodeJS.
I've triple-checked my credentials (simple MinIO creds with no special characters); also, I'm definitely making a PUT request.
So, the questions are:
How do I set the signatureVersion using the new JavaScript AWS SDK version 3? (getSignedUrl is used to generate the presigned URL in v3 of the SDK: import { getSignedUrl } from '@aws-sdk/s3-request-presigner';)
What causes might there be for this error to occur?
The code I use for presigned url generation is:
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';
import { PutObjectCommand, S3Client } from '@aws-sdk/client-s3';

const s3Client = new S3Client({
  region: 'us-east-1',
  credentials: {
    accessKeyId: 'minioadmin',
    secretAccessKey: 'minioadmin',
  },
  endpoint: 'http://172.21.0.2:9000',
  forcePathStyle: true,
});

const bucketParams = {
  Bucket: 'myBucket',
  Key: `marbles.jpg`,
};

const command = new PutObjectCommand(bucketParams);
const signedUrl = await getSignedUrl(s3Client, command, {
  expiresIn: 10000,
});
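For reference, a SigV4-presigned PUT URL carries query parameters like the following (values are illustrative, not a real signature); the signature covers the headers named in X-Amz-SignedHeaders, which for S3 presigning includes host, so the host and port the client later sends must match what was signed exactly:

```javascript
// Illustrative sketch of the SigV4 query parameters on a presigned URL.
// No real signing happens here; the values only show the shape.
const url = new URL('http://172.21.0.2:9000/myBucket/marbles.jpg');
url.searchParams.set('X-Amz-Algorithm', 'AWS4-HMAC-SHA256');
url.searchParams.set('X-Amz-Expires', '10000');
url.searchParams.set('X-Amz-SignedHeaders', 'host');
```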
I stumbled on this issue myself a year ago; the new v3 SDK has a bug: it doesn't take the port into consideration when signing a URL.
See here: https://github.com/aws/aws-sdk-js-v3/issues/2726
The workaround I ended up implementing overrides getSignedUrl in my code and adds the missing port as follows:
import { BuildMiddleware, MetadataBearer, RequestPresigningArguments } from '@aws-sdk/types';
import { Client, Command } from '@aws-sdk/smithy-client';
import { HttpRequest } from '@aws-sdk/protocol-http';
import { formatUrl } from '@aws-sdk/util-format-url';
import { S3RequestPresigner } from '@aws-sdk/s3-request-presigner';

export const getSignedUrl = async <
  InputTypesUnion extends object,
  InputType extends InputTypesUnion,
  OutputType extends MetadataBearer = MetadataBearer
>(
  client: Client<any, InputTypesUnion, MetadataBearer, any>,
  command: Command<InputType, OutputType, any, InputTypesUnion, MetadataBearer>,
  options: RequestPresigningArguments = {}
): Promise<string> => {
  const s3Presigner = new S3RequestPresigner({ ...client.config });
  const presignInterceptMiddleware: BuildMiddleware<InputTypesUnion, MetadataBearer> =
    (next, context) => async (args) => {
      const { request } = args;
      if (!HttpRequest.isInstance(request)) {
        throw new Error('Request to be presigned is not a valid HTTP request.');
      }
      // Retry information headers are not meaningful in presigned URLs
      delete request.headers['amz-sdk-invocation-id'];
      delete request.headers['amz-sdk-request'];
      // User agent header would leak sensitive information
      delete request.headers['x-amz-user-agent'];
      delete request.headers['x-amz-content-sha256'];
      delete request.query['x-id'];
      if (request.port) {
        request.headers['host'] = `${request.hostname}:${request.port}`;
      }
      const presigned = await s3Presigner.presign(request, {
        ...options,
        signingRegion: options.signingRegion ?? context['signing_region'],
        signingService: options.signingService ?? context['signing_service'],
      });
      return {
        // Intercept the middleware stack by returning fake response
        response: {},
        output: {
          $metadata: { httpStatusCode: 200 },
          presigned,
        },
      } as any;
    };
  const middlewareName = 'presignInterceptMiddleware';
  client.middlewareStack.addRelativeTo(presignInterceptMiddleware, {
    name: middlewareName,
    relation: 'before',
    toMiddleware: 'awsAuthMiddleware',
    override: true,
  });
  let presigned: HttpRequest;
  try {
    const output = await client.send(command);
    // @ts-ignore the output is faked, so it's not actually OutputType
    presigned = output.presigned;
  } finally {
    client.middlewareStack.remove(middlewareName);
  }
  return formatUrl(presigned);
};
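The crucial line in that workaround is the Host header rewrite; isolated as a sketch (my own helper, not SDK code):

```javascript
// SigV4 signs the Host header; for a non-default port that header
// must be `hostname:port`, which is what the middleware above fixes.
function hostHeader(hostname, port, protocol = 'http:') {
  const isDefaultPort =
    (protocol === 'http:' && (!port || port === 80)) ||
    (protocol === 'https:' && (!port || port === 443));
  return isDefaultPort ? hostname : `${hostname}:${port}`;
}

console.log(hostHeader('172.21.0.2', 9000)); // "172.21.0.2:9000"
console.log(hostHeader('172.21.0.2', 80));   // "172.21.0.2"
```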
The solution is probably the same as in my other question, so I'm simply copying the answer:
I was trying and changing ports, and the PUT command seems to work when I use only localhost for URL generation.
So, instead of this above:
new S3Client({
  region: 'us-east-1',
  credentials: {
    accessKeyId: 'minioadmin',
    secretAccessKey: 'minioadmin',
  },
  endpoint: 'http://172.21.0.2:9000',
  forcePathStyle: true,
});
I use:
new S3Client({
  region: 'us-east-1',
  credentials: {
    accessKeyId: 'minioadmin',
    secretAccessKey: 'minioadmin',
  },
  endpoint: 'http://172.21.0.2', // or 127.0.0.1
  forcePathStyle: true,
});
Note: I haven't used any port number, so the default is 80.
If you're using docker-compose, add this config:
.
.
.
ports:
  - 80:9000
and it works fine.

Google Drive API: How to create a file in appDataFolder?

I'm reading this documentation:
https://developers.google.com/drive/api/v3/appdata
This is my code:
var fileMetadata = {
  'name': 'config.json',
  'parents': ['appDataFolder']
};
var media = {
  mimeType: 'application/json',
  body: '"sample text"'
};
const request = gapi.client.drive.files.create({
  resource: fileMetadata,
  media,
  fields: 'id'
});
request.execute(function (err, file) {
  if (err) {
    // Handle error
    console.error(err);
  } else {
    console.log('Folder Id:', file.id);
  }
});
I get a 403 error: "The user does not have sufficient permissions for this file."
Doesn't the user have permission to create a file in their own appDataFolder? How can I create a file in it?
The scope of gapi client is 'https://www.googleapis.com/auth/drive.appdata' and the user accepted it.
I believe the reason for this error is that you are only using the scope to access the appdata folder, but not the scope to create files. Accessing the app data folder and creating files are two different things. According to your code, you are trying to create a file in the appdata folder.
I suggest you to include both scopes:
https://www.googleapis.com/auth/drive.appdata
https://www.googleapis.com/auth/drive.file
If you are not using incremental authorization, make sure to revoke access and reauthorize again.
Reference: https://developers.google.com/drive/api/v3/about-auth#OAuth2Authorizing
You don't actually need https://www.googleapis.com/auth/drive.file scope to create or delete data inside the appDataFolder. https://www.googleapis.com/auth/drive.appdata scope covers all that.
Try this. Just pass your auth client to the createFile() function.
// Requiring the modular service is much better than requiring the whole GAPI
const GDrive = require('@googleapis/drive');

function createFile(auth) {
  const drive = GDrive.drive({ version: 'v3', auth });
  const fileMetadata = {
    'name': 'config.json',
    'parents': ['appDataFolder']
  };
  const media = {
    mimeType: 'application/json',
    body: '{"TEST": "THIS WORKED"}'
  };
  drive.files.create({
    resource: fileMetadata,
    media: media,
    fields: 'id'
  }).then((resp) => {
    console.log('File Id: ', resp.data.id);
  }).catch((error) => {
    console.error('Unable to create the file: ', error);
  });
}
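For reference, what drive.files.create assembles under the hood for such a media upload is a multipart/related request body; a stdlib-only sketch (the boundary string is arbitrary, and the real request would go to POST /upload/drive/v3/files?uploadType=multipart):

```javascript
// Sketch of the multipart/related body: first part is the file
// metadata (JSON), second part is the media content itself.
const boundary = 'drive-upload-boundary'; // arbitrary placeholder
const metadata = { name: 'config.json', parents: ['appDataFolder'] };
const body =
  `--${boundary}\r\n` +
  'Content-Type: application/json; charset=UTF-8\r\n\r\n' +
  JSON.stringify(metadata) + '\r\n' +
  `--${boundary}\r\n` +
  'Content-Type: application/json\r\n\r\n' +
  '{"TEST": "THIS WORKED"}\r\n' +
  `--${boundary}--`;
```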

Passport.js / Google OAuth2 strategy - How to use token on login for API access

I am logging users in via their domain Google accounts using passport.js. This works great, but now I need to give this application access to a few Google APIs (Drive, Sheets, etc.).
When a user logs in, a message appears in the logs, that makes it seem like passport has all the required info:
info: [06/Jun/2019:21:24:37 +0000] "302 GET /auth/callback?code=** USER ACCESS TOKEN HERE **&scope=email%20profile%20https://www.googleapis.com/auth/drive.file%20https://www.googleapis.com/auth/spreadsheets%20https://www.googleapis.com/auth/userinfo.email%20https://www.googleapis.com/auth/userinfo.profile%20https://www.googleapis.com/auth/drive HTTP/1.1" [46]
This is achieved by passing the appended scopes via passport.authenticate(), which presents the user with the "Grant access to these things on your Google account to this app?" screen :
// Initial auth call to Google
router.get('/',
  passport.authenticate('google', {
    hd: 'edmonds.wednet.edu',
    scope: [
      'email',
      'profile',
      'https://www.googleapis.com/auth/drive',
      'https://www.googleapis.com/auth/drive.file',
      'https://www.googleapis.com/auth/spreadsheets'
    ],
    prompt: 'select_account'
  })
);
However, when I go and try to call an API with something like:
const { google } = require('googleapis');
const sheets = google.sheets({ version: 'v4', auth });

router.post('/gsCreate', function (req, res, next) {
  sheets.spreadsheets.create({
    // Details here.....
  });
});
I get nothing but errors (the current one is debug: authClient.request is not a function)
My question is: Is it possible for me to use a setup like this, asking the user to log in and grant permissions once, and then somehow save that to their user session via passport?
I had the same question, but I was able to access Gmail API functionality along with Passport.js user authentication by specifying the scopes, using the following process.
First, create a file to set up the Passport Google strategy in NodeJS as follows.
passport_setup.js
const passport = require('passport');
const GoogleStrategy = require('passport-google-oauth20');
const fs = require('fs');
const path = require('path');

// Make an OAuth2 credentials file using the Google Developer Console and
// download it (credentials.json); replace the 'web' key with 'installed'
// in the downloaded file
var pathToJson = path.resolve(__dirname, './credentials.json');
const config = JSON.parse(fs.readFileSync(pathToJson));

passport.serializeUser((user, done) => {
  done(null, user.id);
});

passport.deserializeUser((id, done) => {
  const query = { _id: id };
  Users.findOne(query, (err, user) => {
    if (err) {
      done(err);
    } else {
      done(null, user);
    }
  });
});

// Create a Google strategy including the following details
passport.use(
  new GoogleStrategy({
    clientID: config.installed.client_id,
    clientSecret: config.installed.client_secret,
    callbackURL: config.installed.redirect_uris[0]
  }, (accessToken, refreshToken, otherTokenDetails, user, done) => {
    // In here you can access all token details for the given API scope;
    // I have created a file from those details
    let tokens = {
      access_token: accessToken,
      refresh_token: refreshToken,
      scope: otherTokenDetails.scope,
      token_type: otherTokenDetails.token_type,
      expiry_date: otherTokenDetails.expires_in
    };
    let data = JSON.stringify(tokens);
    fs.writeFileSync('./tokens.json', data);
    // You will get a "user" object which includes the Google id, name
    // details, email etc.; using those details you can persist user data
    // in your DB or check whether the user already exists.
    // After persisting user data to a DB, call done
    // (better to use your DB user objects in the done method)
    done(null, user);
  })
);
Then create your index.js file in NodeJS for API route management and to call the send method of the Gmail API.
Also, run the following command to install googleapis:
npm install googleapis@39 --save
index.js
const express = require('express');
// Import passport_setup.js
const passportSetup = require('./passport_setup');
const cookieSession = require('cookie-session');
const passport = require('passport');
// Import the Google API client
const { google } = require('googleapis');
// Read the credentials file you obtained from the Google Developer Console
const fs = require('fs');
const path = require('path');

var pathToJson_1 = path.resolve(__dirname, './credentials.json');
const credentials = JSON.parse(fs.readFileSync(pathToJson_1));

// Get Express functionalities into app
const app = express();

// ** Middleware **

// Cookie encryption
app.use(cookieSession({
  name: 'Reserve It',
  maxAge: 1 * 60 * 60 * 1000,
  keys: ['ranmalc6h12o6dewage']
}));

// Initialize passport session handling
app.use(passport.initialize());
app.use(passport.session());
app.use(express.json());

// ** API routes **

// Route to authenticate users with Google by calling the Google strategy
// in passport_setup.js; mention the API access levels you want in the scope
app.get('/google', passport.authenticate('google', {
  scope: ['profile',
    'email',
    'https://mail.google.com/'
  ],
  accessType: 'offline',
  prompt: 'consent'
}));

// Redirect route after obtaining the 'code' from user authentication with API scopes
app.get('/google/redirect', passport.authenticate('google'), (req, res) => {
  try {
    // Read the token file you saved earlier in passport_setup.js
    var pathToJson_2 = path.resolve(__dirname, './tokens.json');
    // Load the token details into an object
    const tokens = JSON.parse(fs.readFileSync(pathToJson_2));
    // Extract credential details
    const { client_secret, client_id, redirect_uris } = credentials.installed;
    // Make an OAuth2 object
    const oAuth2Client = new google.auth.OAuth2(client_id,
      client_secret,
      redirect_uris[0]);
    // Set token details on the OAuth2 object
    oAuth2Client.setCredentials(tokens);
    // Create a gmail object to call the APIs
    const gmail = google.gmail({ version: 'v1', auth: oAuth2Client });
    // Call the Gmail API's message send method
    gmail.users.messages.send({
      userId: 'me', // 'me' indicates the currently logged-in user
      resource: {
        raw: // <email content>
      }
    }, (err, res) => {
      if (err) {
        console.log('The API returned an error: ' + err);
        throw err;
      }
      console.log('Email Status : ' + res.status);
      console.log('Email Status Text : ' + res.statusText);
    });
    res.status(200).json({ status: true });
  } catch (err) {
    res.status(500).json(err);
  }
});

app.listen(3000, () => { console.log('Server started at port 3000') });
You can separate the routes in the index.js file into different files for clarity using express.Router().
If you want to call another Google API service, just change this code segment and the code below it:
const gmail = google.gmail({ version: 'v1', auth: oAuth2Client })
gmail.users.messages.send(....Send Method internal implementation given above....)
For Google Drive:
const drive = google.drive({version: 'v3', auth: oAuth2Client});
drive.files.list(...Refer "Google Drive API" documentation for more details....)
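As an aside, the raw field left elided in the send call above must be a base64url-encoded RFC 822 message; a stdlib-only builder sketch (addresses and subject are placeholders):

```javascript
// Build the base64url-encoded string the Gmail API expects in `raw`.
// All field values below are placeholders.
function makeRaw({ from, to, subject, text }) {
  const message =
    `From: ${from}\r\n` +
    `To: ${to}\r\n` +
    `Subject: ${subject}\r\n` +
    'Content-Type: text/plain; charset=utf-8\r\n\r\n' +
    text;
  // base64url = base64 with +/ swapped for -_ and padding stripped
  return Buffer.from(message)
    .toString('base64')
    .replace(/\+/g, '-')
    .replace(/\//g, '_')
    .replace(/=+$/, '');
}

const raw = makeRaw({
  from: 'me@example.com',
  to: 'you@example.com',
  subject: 'Hi',
  text: 'Hello',
});
```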
I believe you can't use passport.js for three-legged oauth for APIs like Sheets or Drive.
Have a look at the Using OAuth for web servers documentation instead.
user835611 has the correct answer, as that page explains everything quite nicely. However, if you still need more, the link below really helped me to understand how this works.
https://github.com/googleapis/google-auth-library-nodejs#oauth2

authenticated AWS API Gateway via javascript AND credentials provider

My setup consists of an AWS API Gateway with IAM access control and AWS Cognito for login.
I already access the API from an Android app and would now like to build a web app (Angular 2) to do the same.
On Android, I'm using the AWSCognitoCredentialsProvider to supply the API SDK with the required credentials. (http://docs.aws.amazon.com/apigateway/latest/developerguide/how-to-generate-sdk.html)
Unfortunately I cannot figure out how / if I can do that with the JavaScript SDK.
I have no trouble using Cognito to log in and get the session ID, access token, etc. However, the API SDK requires me to provide an accessKey and secretKey.
Here's the relevant code snippet from the generated API SDK:
var authType = 'NONE';
if (sigV4ClientConfig.accessKey !== undefined && sigV4ClientConfig.accessKey !== '' &&
    sigV4ClientConfig.secretKey !== undefined && sigV4ClientConfig.secretKey !== '') {
  authType = 'AWS_IAM';
}
In other words, I have this part working (from some example code):
static authenticate(username: string, password: string, callback: CognitoCallback) {
  AWSCognito.config.update({ accessKeyId: 'anything', secretAccessKey: 'anything' });
  let authenticationData = {
    Username: username,
    Password: password,
  };
  let authenticationDetails = new AWSCognito.CognitoIdentityServiceProvider.AuthenticationDetails(authenticationData);
  let userData = {
    Username: username,
    Pool: CognitoUtil.getUserPool()
  };
  console.log("Authenticating the user");
  let cognitoUser = new AWSCognito.CognitoIdentityServiceProvider.CognitoUser(userData);
  console.log(AWS.config);
  cognitoUser.authenticateUser(authenticationDetails, {
    onSuccess: function (result) {
      callback.cognitoCallback(null, result);
    },
    onFailure: function (err) {
      callback.cognitoCallback(err.message, null);
    },
  });
}
and now I'd like to use this:
this.apigClient = apigClientFactory.newClient({
  accessKey: "anything",
  secretAccessKey: "anything",
  sessionToken: "nothing",
  region: 'eu-central-1'
});
How do I get accessKey, secretAccessKey and sessionToken out of my AWSCognito? I was unable to find any API for that so far...
Thank you Bob, for pointing me in the right direction! I've now figured it out, and so for completeness' sake, here's the full solution to my problem:
From the service that creates the apigClient:
return CognitoUtil.getCredentials()
  .then(() =>
    this.apigClient = apigClientFactory.newClient({
      accessKey: AWS.config.credentials.accessKeyId,
      secretKey: AWS.config.credentials.secretAccessKey,
      sessionToken: AWS.config.credentials.sessionToken,
      region: 'eu-central-1'
    }));
The getCredentials() method, which is key to get the required temporary credentials:
public static getCredentials(): Promise<void> {
  return new Promise((resolve, reject) => {
    CognitoUtil.getIdToken({
      callback() {
      },
      callbackWithParam(idTokenJwt: any) {
        let url = 'cognito-idp.' + CognitoUtil._REGION.toLowerCase() + '.amazonaws.com/' + CognitoUtil._USER_POOL_ID;
        let logins = {};
        logins[url] = idTokenJwt;
        let params = {
          IdentityPoolId: CognitoUtil._IDENTITY_POOL_ID, /* required */
          Logins: logins
        };
        AWS.config.region = CognitoUtil._REGION;
        AWS.config.credentials = new AWS.CognitoIdentityCredentials(params);
        AWS.config.credentials.refresh(result => {
          console.log(AWS.config.credentials);
          resolve();
        });
      }
    });
  });
}
So the key insight here was that:
I authenticate to the user pool (shown in my question)
I use that with an identity pool to retrieve temporary credentials (getCredentials)
I use the temporary credentials from AWS.config.credentials to set up the apigClient
I hope this is helpful to someone else as well. The code I just posted could certainly use some refactoring, so any comments on that are very welcome!
Cognito is actually made of 3 different services:
Cognito Your User Pools - What you've integrated here
Cognito Sync - For syncing user preference data for users
Cognito Federated Identity - For federating identities (FB, Google or User Pools) into your account and generating credentials.
What the API Gateway client is expecting is credentials that come from Cognito Federated Identity.
See the Cognito documentation for integrating your user pool with Cognito Federated Identity.
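As a sketch, the bridge between the two services is the Logins map handed to CognitoIdentityCredentials (the pool IDs and token below are placeholders):

```javascript
// Hypothetical IDs: the map key is the user-pool issuer host/path,
// the value is the ID token JWT obtained from the user-pool login.
const region = 'eu-central-1';
const userPoolId = 'eu-central-1_EXAMPLE';
const idTokenJwt = 'eyJ...'; // placeholder ID token from the login step
const params = {
  IdentityPoolId: 'eu-central-1:00000000-0000-0000-0000-000000000000',
  Logins: {
    [`cognito-idp.${region}.amazonaws.com/${userPoolId}`]: idTokenJwt,
  },
};
```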
