Lambda function change endpoint - JavaScript

I am somewhat new to Lambda and am trying to pull some data from Support (us-east-1) and then read/write to DynamoDB (I am using a local dynamodb-local instance). However, I don't know how to change the region.
const AWS = require('aws-sdk');
AWS.config.update({
  region: 'us-east-1',
});
const support = new AWS.Support({
  region: 'us-east-1',
  apiVersion: '2013-04-15'
});
const supportParams = {
  checkId: 'Qch7DwouX1',
  language: 'en'
};
let stuff = {};
support.describeTrustedAdvisorCheckResult(supportParams, (err, data) => {
  if (err) console.log('Error: ', err.stack);
  else {
    stuff['test'] = { ...data };
  }
});
// Now I want to pull some data from DynamoDB locally or in another region
//
// AWS.config.update({endpoint: 'http://localhost:8000'});
//
How do I change the endpoint to http://localhost:8000, or the region to us-west-2, to get something from DynamoDB? Am I not supposed to change region/endpoint within one Lambda function?
I was trying something like:
const dynaDB = new AWS.DynamoDB({endpoint: 'http://localhost:8000'})
const dynaClient = new AWS.DynamoDB.DocumentClient();
dynaClient.scan({}, (err, data) => {
  ..
  ..
  ..
})

We had the same problem when we wanted to copy between two regions.
You can instantiate the aws-sdk once for each DynamoDB:
const AWSregion = require('aws-sdk');
AWSregion.config.update({
  region: 'us-east-1',
});
// Connect to us-east-1 with AWSregion
const AWSlocal = require('aws-sdk'); // Don't set any region here, since it is local
// Connect to the local DynamoDB with AWSlocal
Hope it helps.

Related

Unable to use CognitoIdentityServiceProvider from AWS SDK

I'm currently using amazon-cognito-identity-js and CognitoIdentityServiceProvider
and following this article https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/CognitoIdentityServiceProvider.html
When calling the listUsersInGroup function, I'm initializing this.cognitoProvider with accessKeyId and secretAccessKey.
Is there a way I can use the CognitoIdentityServiceProvider without specifying accessKeyId and secretAccessKey? I don't want to specify these keys since they are sensitive information.
This works
import { Config, CognitoIdentityCredentials, CognitoIdentityServiceProvider } from "aws-sdk";
import { CognitoUserPool } from "amazon-cognito-identity-js";

export default class CognitoAuth {
  configure(config) {
    if (typeof config !== 'object' || Array.isArray(config)) {
      throw new Error('[CognitoAuth error] valid option object required')
    }
    this.userPool = new CognitoUserPool({
      UserPoolId: config.IDENTITY_POOL_ID,
      ClientId: config.CLIENT_ID
    })
    Config.credentials = new CognitoIdentityCredentials({
      IdentityPoolId: config.IDENTITY_POOL_ID
    })
    this.cognitoProvider = new CognitoIdentityServiceProvider({
      region: config.REGION,
      accessKeyId: config.ACCESS_KEY_ID,
      secretAccessKey: config.SECRET_ACCESS_KEY
    });
    Config.region = config.REGION
    this.options = config
  }
  getUsersInGroup(context, cb) {
    var params = {
      GroupName: context.group,
      UserPoolId: this.options.IDENTITY_POOL_ID
    };
    this.cognitoProvider.listUsersInGroup(params, (err, data) => {
      if (err) console.log(err, err.stack)
      else cb(null, data.Users)
    })
  }
}
This doesn't work:
this.cognitoProvider = new AWS.CognitoIdentityServiceProvider({ apiVersion: '2016-04-18' })
I'm getting the error ConfigError: Missing region in config.
As per your linked documentation page, calling the listUsersInGroup requires developer credentials, so these must be provided somehow.
If you look at Setting credentials in Node.js, there are different ways to pass them, e.g., if running this function on a Lambda (or on an EC2 instance), it will use the Lambda (or EC2 instance) role permissions to call the method and credentials never have to be passed. Other options are using environment variables (AWS_ACCESS_KEY_ID/AWS_SECRET_ACCESS_KEY) or shared credentials file.
However, your immediate problem seems to be the region. While in the working block it is passed with region: config.REGION, it is missing from the non-working block. You can fix that by passing the region parameter when instantiating CognitoIdentityServiceProvider:
this.cognitoProvider = new AWS.CognitoIdentityServiceProvider({
  apiVersion: '2016-04-18',
  region: 'us-east-1' // use your region
});
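Putting both points together, a minimal sketch of the options object: credentials are left out entirely (so the SDK falls back to its default chain) and only the region and API version are set. This is plain object construction, runnable without aws-sdk; the usage line at the end assumes the imports from the question:

```javascript
// Sketch: build the provider options with no secrets embedded.
// Leaving credentials out makes the SDK fall back to its default
// chain (Lambda/EC2 role, AWS_ACCESS_KEY_ID/AWS_SECRET_ACCESS_KEY
// environment variables, or the shared credentials file).
function providerOptions(region) {
  return {
    apiVersion: '2016-04-18',
    region, // still required, otherwise: ConfigError: Missing region in config
  };
}

const opts = providerOptions('us-east-1');
console.log(opts); // { apiVersion: '2016-04-18', region: 'us-east-1' }

// Usage (assuming aws-sdk is available):
//   this.cognitoProvider = new CognitoIdentityServiceProvider(opts);
```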

How do I use two AWS IAM accounts in a single program?

I've been trying to transfer some data in bulk between DynamoDB tables on two different accounts, but I haven't been able to, because I can't use another account in the same program: it just defaults to the main account I use in the AWS CLI.
Here's my code for accessing the two different IAM accounts.
Destination_acc.js
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";

const CONFIG = {
  region: "us-east-1",
  accessKeyId: "x",
  secretAccessKey: "y",
};
const dest = new DynamoDBClient(CONFIG);
export { dest };
Source_acc.js
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";

const CONFIG = {
  region: "us-east-1",
  accessKeyId: "x",
  secretAccessKey: "y",
};
const source = new DynamoDBClient(CONFIG);
export { source };
test.js
import { ScanCommand } from "@aws-sdk/client-dynamodb";
import { dest } from "./Destination_acc.js";

export const scanTable = async () => {
  const params = {
    TableName: "table",
  };
  var scanResults = [];
  var items = [];
  do {
    items = await dest.send(new ScanCommand(params));
    items.Items.forEach((item) => {
      console.log(item);
      scanResults.push(item);
    });
    params.ExclusiveStartKey = items.LastEvaluatedKey;
  } while (typeof items.LastEvaluatedKey !== "undefined");
  return scanResults;
};

scanTable(); // Returns the data in the table of the `source` account instead of the data in the `dest` account.
DynamoDB picks the account and region based on the credentials in use. You first need to set up a role in the second account that you are allowed to assume from the first account:
https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_common-scenarios_aws-accounts.html
After that, in your code you need to call STS AssumeRole and create a second client based on that role:
https://docs.aws.amazon.com/STS/latest/APIReference/API_AssumeRole.html
Once you assume the role and create a client with it, you can access DynamoDB tables and other resources in the second account.
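The flow can be sketched as a small helper. The Credentials field names (AccessKeyId, SecretAccessKey, SessionToken) are what AssumeRole actually returns; the role ARN and session name below are hypothetical, and the real STS call is left as a comment since it needs live IAM setup:

```javascript
// Map an STS AssumeRole response into the config shape that
// DynamoDBClient (SDK v3) expects. Note that v3 wants credentials
// nested under a `credentials` key, not at the top level of the config.
function configFromAssumeRole(region, assumeRoleResponse) {
  const c = assumeRoleResponse.Credentials;
  return {
    region,
    credentials: {
      accessKeyId: c.AccessKeyId,
      secretAccessKey: c.SecretAccessKey,
      sessionToken: c.SessionToken,
    },
  };
}

// In real code the response comes from STS, roughly:
//   const sts = new STSClient({ region: "us-east-1" });
//   const resp = await sts.send(new AssumeRoleCommand({
//     RoleArn: "arn:aws:iam::222222222222:role/CrossAccountRole", // hypothetical ARN
//     RoleSessionName: "dynamodb-copy",
//   }));
//   const dest = new DynamoDBClient(configFromAssumeRole("us-east-1", resp));

// Stand-in response with the documented field names, for illustration:
const fakeResponse = {
  Credentials: {
    AccessKeyId: "ASIAEXAMPLE",
    SecretAccessKey: "secretExample",
    SessionToken: "tokenExample",
  },
};
const cfg = configFromAssumeRole("us-east-1", fakeResponse);
console.log(cfg.credentials.sessionToken); // "tokenExample"
```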

Unable to get the file count or list from an AWS S3 bucket using JavaScript

I am using a Cypress test, and one of the validations is to get the count of files in an S3 bucket. But I am not able to get the count of files in the S3 bucket.
Below is the Cypress code.
describe('Validate api field validation', () => {
  it('Verify objects land in correct AWS S3 bucket', () => {
    var date = new Date();
    var bucketDirectory = date.getUTCFullYear();
    var bucketLocation = Cypress.env('aws_bucketLocation');
    cy.log(bucketDirectory);
    try {
      var ss = getCountOfFiles(bucketLocation, bucketDirectory);
    } catch (err) {
      cy.log(err);
    }
    cy.log(ss);
  });
});
And below is the function I am using to get the count of files in S3 bucket.
const AWS = require("aws-sdk");
const fs = require("fs");

AWS.config.update({
  accessKeyId: 'xxxxx',
  secretAccessKey: 'yyyyyyy',
  region: 'abide',
});
const s3 = new AWS.S3();

const getCountOfFiles = async (bucketname, prefix) => {
  try {
    cy.log(bucketname);
    cy.log(prefix);
    const data = await s3.listObjectsV2({
      Bucket: bucketname,
      Prefix: prefix, // Limits response to keys that begin with the specified prefix
    }).promise();
    cy.log('---------');
    if (data.$response.error) {
      throw new Error(`Could not list files in S3: ${data.$response.error}`);
    }
    return data;
  } catch (e) {
    cy.log(e.message);
    // throw new Error(`Could not list files in S3: ${e.message}`);
  }
};
I am trying to print what's happening in the log, but I am not even able to catch the exception. I am not sure where I am going wrong; all the input data is correct, so it must be something silly I am doing.
The directory exists in AWS. But even if it did not exist, or something were wrong in the input data, why is no exception printed?
Can anyone please help me here? Is it a limitation of Cypress?
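One likely contributor is not Cypress-specific: getCountOfFiles is async, so the try/catch around the call never sees its failure; the rejection stays inside the returned promise, which is never awaited. A minimal, AWS-free sketch of that behaviour:

```javascript
// Stand-in for an async S3 call that fails.
async function failingLookup() {
  throw new Error('bucket does not exist');
}

let caught = 'nothing';
try {
  // No await: this returns a rejected promise, it does not throw here.
  const pending = failingLookup();
  pending.catch(() => {}); // handle the rejection so Node doesn't complain
} catch (err) {
  caught = err.message; // never reached: nothing threw synchronously
}
console.log(caught); // 'nothing' - the catch block saw no error
```

One way to surface the result in Cypress would be cy.wrap(getCountOfFiles(bucketLocation, bucketDirectory)).then((data) => cy.log(data)), since cy.wrap waits for a promise to resolve before yielding its value.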

Key element does not match the schema with DocumentClient

I have a Lambda that I have created using the AWS Toolkit for Visual Studio Code. I have a small lambda.js file that exports the handler for API Gateway events.
const App = require('./app');
const AWS = require('aws-sdk');

exports.handler = async (event, context) => {
  let config = {
    aws_region: 'us-east-1',
    //endpoint: 'http://localhost:8000',
  };
  let dynamo = new AWS.DynamoDB.DocumentClient();
  let app = new App(dynamo, config, event);
  try {
    let response = await app.run();
    return response;
  } catch (err) {
    return err;
  }
};
The app.js file represents the logic of my Lambda. This is broken up to improve testing.
const AWS = require('aws-sdk');

class App {
  constructor(datastore, configuration, event) {
    this.datastore = datastore;
    this.httpEvent = event;
    this.configuration = configuration;
  }

  async run() {
    AWS.config.update({
      region: this.configuration.aws_region,
      endpoint: this.configuration.endpoint,
    });
    let params = {
      TableName: 'test',
      Key: {
        'year': 2015,
        'title': 'The Big New Movie',
      },
    };
    const getRequest = this.datastore.get(params);
    // EXCEPTION THROWN HERE
    var result = await getRequest.promise();
    let body = {
      location: this.configuration.aws_region,
      url: this.configuration.endpoint,
      data: result
    };
    return {
      'statusCode': 200,
      'body': JSON.stringify(body)
    };
  }
}

module.exports = App;
I can write the following mocha test and get it to pass.
'use strict';

const AWS = require('aws-sdk');
const chai = require('chai');
const sinon = require('sinon');
const App = require('../app');
const expect = chai.expect;

var event, context;

const dependencies = {
  // sinon.stub() prevents calls to DynamoDB and allows for faking of methods.
  dynamo: sinon.stub(new AWS.DynamoDB.DocumentClient()),
  configuration: {
    aws_region: 'us-west-2',
    endpoint: 'http://localhost:5000',
  },
};

describe('Tests handler', function () {
  // Reset test doubles for isolating individual test cases
  this.afterEach(sinon.reset);

  it('verifies successful response', async () => {
    dependencies.dynamo.get.returns({ promise: sinon.fake.resolves('foo bar') });
    let app = new App(dependencies.dynamo, dependencies.configuration, event);
    const result = await app.run();
    expect(result).to.be.an('object');
    expect(result.statusCode).to.equal(200);
    expect(result.body).to.be.a('string');
    console.log(result.body);
    let response = JSON.parse(result.body);
    expect(response).to.be.an('object');
    expect(response.location).to.be.equal(dependencies.configuration.aws_region);
    expect(response.url).to.be.equal(dependencies.configuration.endpoint);
    expect(response.data).to.be.equal('foo bar');
  });
});
However, when I run the Lambda locally using the "Debug Locally" Code Lens option in VS Code, the AWS.DynamoDB.DocumentClient.get call throws an exception.
{
  "message": "The provided key element does not match the schema",
  "code": "ValidationException",
  ...
  "statusCode": 400,
  "retryable": false,
  "retryDelay": 8.354173589804192
}
I have a table created in the us-east-1 region where the non-test code is configured to go to. I've confirmed the http endpoint being hit by the DocumentClient as being dynamodb.us-east-1.amazonaws.com. The table name is correct and I have a hash key called year and a sort key called title. Why would this not find the key correctly? I pulled this example from the AWS documentation, created a table to mirror what the key was and have not had any luck.
The issue is that the Key year was provided a number value while the table was expecting it to be a string.
Wrapping the 2015 in quotes let the DocumentClient know that this was a string, so it could match the schema in DynamoDB.
let params = {
  TableName: 'test',
  Key: {
    'year': '2015',
    'title': 'The Big New Movie',
  },
};
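The underlying reason is that the DocumentClient marshals plain JavaScript values into DynamoDB's typed attributes: a JS number becomes an N attribute and a JS string becomes an S attribute, and the resulting type must match the table's key schema exactly. A simplified illustration of that mapping (a sketch for this answer, not the SDK's actual converter):

```javascript
// Simplified illustration of how a key value's JS type decides
// the DynamoDB attribute type (N = number, S = string).
// Note DynamoDB's wire format carries N values as strings too.
function dynamoType(value) {
  if (typeof value === 'number') return { N: String(value) };
  if (typeof value === 'string') return { S: value };
  throw new Error('unsupported key type: ' + typeof value);
}

console.log(dynamoType(2015));   // { N: '2015' } -> only matches an N-typed key
console.log(dynamoType('2015')); // { S: '2015' } -> only matches an S-typed key
```

So a table whose hash key year is defined as a string will reject the unquoted 2015 with exactly the "does not match the schema" error above.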

How to make an async/await function in AWS Lambda when using aws-sdk

I am using AWS Lambda to get some data from a CloudWatch metric, and below is my code in the Lambda:
var AWS = require('aws-sdk');
AWS.config.update({region: 'ap-northeast-1'});
var cloudwatch = new AWS.CloudWatch({apiVersion: '2010-08-01'});

exports.handler = async function(event, context) {
  console.log('==== start ====');
  const connection_params = {
    // params
  };
  let active_connection;
  cloudwatch.getMetricStatistics(connection_params, function(err, data) {
    if (err) {
      console.log(err, err.stack);
    } else {
      console.log(data);
      active_connection = data.Datapoints[0].Average.toFixed(2);
    }
    console.log(`DDDDDDD ${active_connection}`);
  });
  console.log('AAAA');
};
I always get 'AAAA' first and then 'DDDD${active_connection}'.
But what I want is to get 'DDDD${active_connection}' first and then 'AAAA'.
I tried to use like
cloudwatch.getMetricStatistics(connection_params).then(() => {})
but it shows:
cloudwatch.getMetricStatistics(...).then is not a function
Try writing your code like this.
With then:
const x = cloudwatch.getMetricStatistics(connection_params).promise();
x.then((response) => { /* do something */ });
With async/await
const x = await cloudwatch.getMetricStatistics(connection_params).promise();
You could also use util.promisify (Docs):
const util = require("util");
const cloudwatch = new AWS.CloudWatch();
const getMetricStatistics = util.promisify(cloudwatch.getMetricStatistics.bind(cloudwatch));
getMetricStatistics(connection_params).then(() => {})
