Unable to use CognitoIdentityServiceProvider from AWS SDK - javascript

I'm currently using amazon-cognito-identity-js and the CognitoIdentityServiceProvider,
following this documentation page: https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/CognitoIdentityServiceProvider.html
When calling the listUsersInGroup function, I'm initializing this.cognitoProvider with an accessKeyId and secretAccessKey.
Is there a way I can use the CognitoIdentityServiceProvider without specifying accessKeyId and secretAccessKey? I don't want to specify these keys since they contain sensitive information.
This works:
import { Config, CognitoIdentityCredentials, CognitoIdentityServiceProvider } from "aws-sdk";
import { CognitoUserPool } from "amazon-cognito-identity-js";

export default class CognitoAuth {
  configure(config) {
    if (typeof config !== 'object' || Array.isArray(config)) {
      throw new Error('[CognitoAuth error] valid option object required')
    }
    this.userPool = new CognitoUserPool({
      UserPoolId: config.IDENTITY_POOL_ID,
      ClientId: config.CLIENT_ID
    })
    Config.credentials = new CognitoIdentityCredentials({
      IdentityPoolId: config.IDENTITY_POOL_ID
    })
    this.cognitoProvider = new CognitoIdentityServiceProvider({
      region: config.REGION,
      accessKeyId: config.ACCESS_KEY_ID,
      secretAccessKey: config.SECRET_ACCESS_KEY
    });
    Config.region = config.REGION
    this.options = config
  }

  getUsersInGroup(context, cb) {
    var params = {
      GroupName: context.group,
      UserPoolId: this.options.IDENTITY_POOL_ID
    };
    this.cognitoProvider.listUsersInGroup(params, (err, data) => {
      if (err) console.log(err, err.stack)
      else cb(null, data.Users)
    })
  }
}
This doesn't work:
this.cognitoProvider = new AWS.CognitoIdentityServiceProvider({ apiVersion: '2016-04-18' })
Here I'm getting the error ConfigError: Missing region in config

As per your linked documentation page, calling listUsersInGroup requires developer credentials, so these must be provided somehow.
If you look at Setting credentials in Node.js, there are several ways to pass them. For example, if this function runs on a Lambda (or on an EC2 instance), it will use the Lambda (or EC2 instance) role's permissions to call the method, and credentials never have to appear in code. Other options are environment variables (AWS_ACCESS_KEY_ID/AWS_SECRET_ACCESS_KEY) or the shared credentials file.
However, your immediate problem seems to be the region. In the working block it is passed with region: config.REGION, but it is missing from the non-working block. You can fix that by passing the region parameter when instantiating CognitoIdentityServiceProvider:
this.cognitoProvider = new AWS.CognitoIdentityServiceProvider({
  apiVersion: '2016-04-18',
  region: 'us-east-1' // use your region
});
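To drop the keys from the code entirely, here is a minimal sketch (not from the original post) that relies on the SDK's default credential chain; it assumes the runtime already supplies credentials allowed to call cognito-idp:ListUsersInGroup:
const AWS = require('aws-sdk');
// Credentials are resolved automatically from the default provider chain:
// environment variables (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY),
// the shared credentials file (~/.aws/credentials), or the IAM role
// attached to the Lambda function / EC2 instance.
const cognitoProvider = new AWS.CognitoIdentityServiceProvider({
  apiVersion: '2016-04-18',
  region: 'us-east-1' // the region still has to be supplied, here or via AWS.config
});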

Related

How to fetch Amazon Cognito Identity ID (user_identity_id) for the user from the lambda function?

In the Amplify documentation, under the Storage/File access levels section there is a paragraph that states:
Files are stored under private/{user_identity_id}/ where the user_identity_id corresponds to the unique Amazon Cognito Identity ID for that user.
How can I fetch the user_identity_id from the Lambda function?
The request to the Lambda is authorized, the event.requestContext.authorizer.claims object is available, and I can see the user data, but not the user_identity_id.
EDIT: Now I see that there is a field event.requestContext.identity.cognitoIdentityId, but the value is null. I still need to find a way to fetch it.
OK, so there's no direct way to map a Cognito identity ID to a Cognito user. There is a lengthy discussion here where a couple of workarounds can be found. For now, I'm going to use this solution where, instead of the identity_id, you specify a custom attribute (most likely the sub) as the folder name.
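A minimal sketch of that workaround (the helper name and folder layout are illustrative, not from the discussion): the sub claim that is already present in the authorizer claims is used as the per-user folder name instead of the identity ID.
// Builds a per-user prefix from the User Pool 'sub' claim instead of the Cognito identity ID.
function getUserFolder(event) {
  const sub = event.requestContext.authorizer.claims.sub;
  return `private/${sub}/`; // hypothetical layout mirroring Amplify's private/{id}/ scheme
}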
EDIT: There is another solution that might help (found somewhere on the internet, and I verified that it works):
const AWS = require('aws-sdk');
const cognitoIdentity = new AWS.CognitoIdentity();

function getCognitoIdentityId(jwtToken) {
  const params = getCognitoIdentityIdParams(jwtToken);
  return cognitoIdentity
    .getId(params)
    .promise()
    .then(data => {
      if (data.IdentityId) {
        return data.IdentityId;
      }
      throw new Error('Invalid authorization token.');
    });
}

function getCognitoIdentityIdParams(jwtToken) {
  const loginsKey = `cognito-idp.${process.env.REGION}.amazonaws.com/${process.env.USERPOOLID}`;
  return {
    IdentityPoolId: `${process.env.IDENTITY_POOL_ID}`,
    Logins: {
      [loginsKey]: jwtToken,
    },
  };
}
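A hedged usage sketch of the function above (assuming, as in the question, an API Gateway event whose Authorization header carries the Cognito ID token, and that REGION, USERPOOLID, and IDENTITY_POOL_ID are set as environment variables):
exports.handler = async (event) => {
  // The User Pool ID token forwarded by the client in the Authorization header
  const jwtToken = event.headers.Authorization;
  const identityId = await getCognitoIdentityId(jwtToken);
  return { statusCode: 200, body: JSON.stringify({ identityId }) };
};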
If the user accesses the Lambda through GraphQL via the AppSync service, then the identity is stored in event.identity.owner.
Here is some TypeScript code I use to pull the user_identity_id from the event. However, the user doesn't always call the Lambda directly, so the user_identity can also be passed in if the call comes from an authorized IAM role.
export function ownerFromEvent(event: any = {}): string {
  if (
    event.identity.userArn &&
    event.identity.userArn.split(":")[5].startsWith("assumed-role")
  ) {
    // This is a request from a function over IAM.
    return event.arguments.input.asData.owner;
  } else {
    return event.identity.owner;
  }
}
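A small usage sketch (plain JavaScript, handler name illustrative) for an AppSync Lambda resolver:
exports.handler = async (event) => {
  // Cognito owner of the caller, or the owner forwarded by an IAM-authorized function
  const owner = ownerFromEvent(event);
  console.log('owner:', owner);
  return { owner };
};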
For anyone else still struggling with this: I was finally able to use the AWS SDK for JavaScript v3 to obtain a Cognito user's IdentityId and credentials in a Lambda function invoked via API Gateway with a Cognito User Pool authorizer, using the Cognito user's identity jwtToken passed in the Authorization header of the request.
Here is the code used in my JavaScript Lambda Function:
const IDENTITY_POOL_ID = "us-west-2:7y812k8a-1w26-8dk4-84iw-2kdi849sku72"
const USER_POOL_ID = "cognito-idp.us-west-2.amazonaws.com/us-west-2_an976DxVk"
const { CognitoIdentityClient } = require("@aws-sdk/client-cognito-identity");
const { fromCognitoIdentityPool } = require("@aws-sdk/credential-provider-cognito-identity");

exports.handler = async (event, context) => {
  const cognitoidentity = new CognitoIdentityClient({
    credentials: fromCognitoIdentityPool({
      client: new CognitoIdentityClient(),
      identityPoolId: IDENTITY_POOL_ID,
      logins: {
        [USER_POOL_ID]: event.headers.Authorization
      }
    }),
  });
  var credentials = await cognitoidentity.config.credentials()
  console.log(credentials)
  // {
  //   identityId: 'us-west-2:d393294b-ff23-43t6-d8s5-59876321457d',
  //   accessKeyId: 'ALALA2RZ7KTS7STD3VXLM',
  //   secretAccessKey: '/AldkSdt67saAddb6vddRIrs32adQCAo99XM6',
  //   sessionToken: 'IQoJb3JpZ2luX2VjEJj//////////...', // sessionToken cut for brevity
  //   expiration: 2022-07-17T08:58:10.000Z
  // }
  var identity_ID = credentials.identityId
  console.log(identity_ID)
  // us-west-2:d393294b-ff23-43t6-d8s5-59876321457d
  const response = {
    statusCode: 200,
    headers: {
      "Access-Control-Allow-Headers": "*",
      "Access-Control-Allow-Origin": "*",
      "Access-Control-Allow-Methods": "OPTIONS,POST,GET,PUT"
    },
    body: JSON.stringify(identity_ID)
  };
  return response;
}
After a Cognito user has signed in to my application, I can use the Auth module of aws-amplify together with fetch() in my React Native app to invoke the Lambda function shown above, sending a request to my API Gateway trigger (authenticated with a Cognito User Pool authorizer) with the following code:
import { Auth } from 'aws-amplify';

var APIGatewayEndpointURL = 'https://5lstgsolr2.execute-api.us-west-2.amazonaws.com/default/-'
var response = {}

async function getIdentityId () {
  var session = await Auth.currentSession()
  var IdToken = await session.getIdToken()
  var jwtToken = await IdToken.getJwtToken()
  var payload = {}
  await fetch(APIGatewayEndpointURL, { method: "POST", body: JSON.stringify(payload), headers: { Authorization: jwtToken } })
    .then(async (result) => {
      response = await result.json()
      console.log(response)
    })
}
More info on how to authenticate using aws-amplify can be found here: https://docs.amplify.aws/ui/auth/authenticator/q/framework/react-native/#using-withauthenticator-hoc

Unable to get the filecount or list from AWS S3 bucket using javascript

I am using a Cypress test, and one of the validations is to get the count of files in an S3 bucket.
But I am not able to get the count of files in the S3 bucket.
Below is the Cypress code:
describe('Validate api field validation', () => {
  it('Verify objects land in correct AWS S3 bucket', () => {
    var date = new Date();
    var bucketDirectory = date.getUTCFullYear()
    var bucketLocation = Cypress.env('aws_bucketLocation')
    cy.log(bucketDirectory)
    try {
      var ss = getCountOfFiles(bucketLocation, bucketDirectory)
    }
    catch (err) {
      cy.log(err)
    }
    cy.log(ss)
  })
})
And below is the function I am using to get the count of files in the S3 bucket:
const AWS = require("aws-sdk");
const fs = require("fs");

AWS.config.update({
  accessKeyId: 'xxxxx',
  secretAccessKey: 'yyyyyyy',
  region: 'abide',
});
const s3 = new AWS.S3();

const getCountOfFiles = async (bucketname, prefix) => {
  try {
    cy.log(bucketname)
    cy.log(prefix)
    const data = await s3.listObjectsV2({
      Bucket: bucketname,
      Prefix: prefix, // Limits response to keys that begin with the specified prefix
    }).promise().then(mydata => {
      return mydata;
    })
    cy.log('---------')
    if (data.$response.error) {
      throw new Error(`Could not list files in S3: ${data.$response.error}`);
    }
    return data;
  } catch (e) {
    cy.log(e.message)
    //throw new Error(`Could not list files in S3: ${e.message}`);
  }
};
I am trying to print out what's happening in the log, but I am not even able to catch the exception. I am not sure where I am going wrong here.
All the input data is correct; it must be something silly I am doing.
The directory exists in AWS. Even if it did not exist, or something were wrong with the input data, why is no exception printed?
Can anyone please help me here? Is it a limitation of Cypress?
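For reference, a minimal standalone Node.js sketch of the counting step outside Cypress (bucket, prefix, and region are placeholders); listObjectsV2 returns at most 1000 keys per call, so the pages are followed via the continuation token:
const AWS = require('aws-sdk');
const s3 = new AWS.S3({ region: 'us-east-1' }); // placeholder region

async function countFiles(bucket, prefix) {
  let count = 0;
  let token;
  do {
    const page = await s3.listObjectsV2({
      Bucket: bucket,
      Prefix: prefix,
      ContinuationToken: token
    }).promise();
    count += page.KeyCount;
    token = page.IsTruncated ? page.NextContinuationToken : undefined;
  } while (token);
  return count;
}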

Is there a way to get the previous version of a deleted s3 object with aws-sdk?

I have an S3 bucket with versioning enabled, configured to send notification events to Lambda. I need to process deleted objects from that bucket when the s3:ObjectRemoved:* event is received.
The event contains the versionId of the deleted object.
Is there a way to discover the versionId of the immediately previous version of the deleted object and fetch that version using the aws-sdk?
Or, alternatively, is there a way to get the deleted object using aws-sdk?
(I'm using the JavaScript aws-sdk)
It can be done with a 3-step process:
1. Get the list of versions with listObjectVersions
2. Get the wanted version from the list
3. Get the specific object, passing VersionId as an argument to getObject
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

async function getDeletedObject (event, context) {
  let params = {
    Bucket: 'my-bucket',
    Prefix: 'my-file'
  };
  try {
    const previousVersion = await s3.listObjectVersions(params)
      .promise()
      .then(result => {
        const versions = result.Versions;
        // get previous versionId
        return versions[0].VersionId;
      });
    params = {
      Bucket: 'my-bucket',
      Key: 'my-file',
      VersionId: previousVersion
    };
    const deletedObject = await s3.getObject(params)
      .promise()
      .then(response => response.Body.toString('utf8'));
    return deletedObject;
  }
  catch (error) {
    console.log(error);
    return;
  }
}
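As a usage note, the bucket and key of the deleted object can be read from the s3:ObjectRemoved event record instead of being hard-coded; a short sketch following the standard S3 event structure:
exports.handler = async (event) => {
  const record = event.Records[0];
  const bucket = record.s3.bucket.name;
  // Object keys in S3 events are URL-encoded
  const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));
  // record.s3.object.versionId is the version of the delete marker itself,
  // so the previous version still has to be looked up as shown above.
  console.log('Deleted object:', bucket, key);
};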
Getting the below error with the solution mentioned by @andreswebs:
UnhandledPromiseRejectionWarning: MethodNotAllowed: The specified method is not allowed against this resource.

Lambda function change endpoint

I am somewhat new to Lambda and am trying to pull some data from Support (us-east-1) and then read/write to DynamoDB (I am using a local dynamodb-local instance), but I don't know how to change the region.
const AWS = require('aws-sdk');

AWS.config.update({
  region: 'us-east-1',
});

const support = new AWS.Support({
  region: 'us-east-1',
  apiVersion: '2013-04-15'
});

const supportParams = {
  checkId: 'Qch7DwouX1',
  language: 'en'
};

let stuff = {};
support.describeTrustedAdvisorCheckResult(supportParams, (err, data) => {
  if (err) console.log('Error: ', err.stack);
  else {
    stuff[test] = [...data]
  };
});

// Now I want to pull some data from DynamoDB locally or in another region
//
// AWS.config.update({endpoint: 'http://localhost:8000'});
//
How do I change the endpoint to http://localhost:8000, or the region to us-west-2, to get something from DynamoDB? Am I not supposed to change the region/endpoint within one Lambda function?
I was trying something like:
const dynaDB = new AWS.DynamoDB({ endpoint: 'http://localhost:8000' })
const dynaClient = new AWS.DynamoDB.DocumentClient();
dynaClient.scan({}, (err, data) => {
  ..
  ..
  ..
})
We had the same problem when we wanted to copy between two regions.
You can instantiate the aws-sdk once for each DynamoDB:
const AWSregion = require('aws-sdk');
AWSregion.config.update({
  region: 'us-east-1',
});
// Connect to us-east-1 with AWSregion

const AWSlocal = require('aws-sdk'); // Don't set any region here, since it is local
// Connect to local dynamodb with AWSlocal
Hope it helps.
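In practice, the region and endpoint can also be passed directly to each service client's constructor, which leaves the global AWS.config untouched; a minimal sketch assuming a local DynamoDB listening on port 8000:
const AWS = require('aws-sdk');
// AWS Support only has an endpoint in us-east-1
const support = new AWS.Support({ region: 'us-east-1', apiVersion: '2013-04-15' });
// DynamoDB in another region
const dynamoWest = new AWS.DynamoDB.DocumentClient({ region: 'us-west-2' });
// Local DynamoDB instance; the endpoint option overrides the default one
const dynamoLocal = new AWS.DynamoDB.DocumentClient({ region: 'us-east-1', endpoint: 'http://localhost:8000' });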

Google cloud dialogflow intent detection nodejs example not working

I am trying to implement a very simple Dialogflow agent integration with Node.js.
Here is what I have done so far:
I followed the code from Intent detection.
I added the service account private key file (.json) to my server.
I added the environment variable GOOGLE_APPLICATION_CREDENTIALS with the path to my .json private key file.
Here is the code I am trying to run right now:
require('dotenv').config()

const projectId = 'gg-chatbot-216808';
const sessionId = 'quickstart-session-id';
const query = 'hello';
const languageCode = 'en-US';

// Instantiate a DialogFlow client.
const dialogflow = require('dialogflow');
const sessionClient = new dialogflow.SessionsClient();

// Define session path
const sessionPath = sessionClient.sessionPath(projectId, sessionId);

// The text query request.
const request = {
  session: sessionPath,
  queryInput: {
    text: {
      text: query,
      languageCode: languageCode,
    },
  },
};

// This prints the private key path correctly.
console.log(process.env.GOOGLE_APPLICATION_CREDENTIALS);

// Send request and log result
sessionClient
  .detectIntent(request)
  .then(responses => {
    console.log('Detected intent');
    const result = responses[0].queryResult;
    console.log(` Query: ${result.queryText}`);
    console.log(` Response: ${result.fulfillmentText}`);
    if (result.intent) {
      console.log(` Intent: ${result.intent.displayName}`);
    } else {
      console.log(` No intent matched.`);
    }
  })
  .catch(err => {
    console.error('ERROR:', err);
  });
Then I get this error in the console when I run this file:
Auth error:Error: invalid_user: Robot is disabled.
ERROR: { Error: 14 UNAVAILABLE: Getting metadata from plugin failed with error: invalid_user: Robot is disabled.
at Object.exports.createStatusError (/var/www/html/google_auth/node_modules/grpc/src/common.js:87:15)
at Object.onReceiveStatus (/var/www/html/google_auth/node_modules/grpc/src/client_interceptors.js:1188:28)
at InterceptingListener._callNext (/var/www/html/google_auth/node_modules/grpc/src/client_interceptors.js:564:42)
at InterceptingListener.onReceiveStatus (/var/www/html/google_auth/node_modules/grpc/src/client_interceptors.js:614:8)
at callback (/var/www/html/google_auth/node_modules/grpc/src/client_interceptors.js:841:24)
code: 14,
metadata: Metadata { _internal_repr: {} },
details: 'Getting metadata from plugin failed with error: invalid_user: Robot is disabled.' }
I also faced a similar issue with my Angular bot.
What I did was: instead of using the google_credentials from the JSON file, I created an object with private_key and client_email (these values can be taken from the service account private key .json file) and passed that object when setting up the session client:
var config = {
  credentials: {
    private_key: "YOUR_PRIVATE_KEY",
    client_email: "YOUR_CLIENT_EMAIL"
  }
}
const sessionClient = new dialogflow.SessionsClient(config);
Note: copy the full private_key string from the .json file. It will start with "-----BEGIN PRIVATE KEY-----\n...".
Also, in GCP go to the project -> IAM and try setting the role for the service account to Dialogflow API Admin. Check if this works.
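If you prefer not to paste the strings into the code, the same two fields can be read straight from the downloaded service-account JSON; a small sketch (the file name is illustrative):
const dialogflow = require('dialogflow');
// Service-account key file downloaded from GCP (illustrative path)
const serviceAccount = require('./service-account.json');
const sessionClient = new dialogflow.SessionsClient({
  credentials: {
    private_key: serviceAccount.private_key,
    client_email: serviceAccount.client_email,
  },
});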
If this has not been resolved yet, another solution is to provide a keyFilename inside the SessionsClient options:
const sessionClient = new dialogflow.SessionsClient({
  keyFilename: "path of your credentials.json file"
});
or
let filePath = process.env.GOOGLE_APPLICATION_CREDENTIALS = "Location of credentials file";
const sessionClient = new dialogflow.SessionsClient({
  keyFilename: filePath
});
This will work even if no GOOGLE_APPLICATION_CREDENTIALS environment variable is set on the system.
Hope this is helpful.
