Can't Access Data in S3 with Lambda - javascript

I have been using JavaScript for a few months and my code runs fine locally, but I keep hitting the same problem in a Lambda function:
I can't access any data with s3.getObject.
This is a simple example that doesn't run in Lambda:
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

exports.myHandler = function(event, context, callback) {
    // Retrieve the object
    s3.getObject({
        Bucket: 'XXXXXX',
        Key: 'YYYYY'
    }, function(err, data) {
        if (err) {
            console.log(err);
        } else {
            console.log("data");
        }
    });
};

This is because your function is being terminated before your callback is executed, since your s3.getObject() call is asynchronous under the hood.
In order to return data from AWS Lambda, you have to pass your value to the callback, like this:
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

exports.myHandler = function(event, context, callback) {
    // Retrieve the object
    s3.getObject({
        Bucket: 'XXXXXX',
        Key: 'YYYYY'
    }, function(err, data) {
        if (err) {
            console.log(err);
            callback(err);
        } else {
            callback(null, { statusCode: 200, body: JSON.stringify(data) });
        }
    });
};
I suggest you use Node 8 though, so you can easily use async/await.
Your code would then look like this:
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

exports.myHandler = async (event) => {
    const data = await s3.getObject({
        Bucket: 'XXXXXX',
        Key: 'YYYYY'
    }).promise();
    return {
        statusCode: 200,
        body: JSON.stringify(data)
    };
};
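With the async handler, any error thrown by the awaited call (for example, a missing key or denied access) will fail the invocation. If you would rather return a custom error response, you can wrap the call in try/catch; a minimal sketch:

exports.myHandler = async (event) => {
    try {
        const data = await s3.getObject({
            Bucket: 'XXXXXX',
            Key: 'YYYYY'
        }).promise();
        return { statusCode: 200, body: JSON.stringify(data) };
    } catch (err) {
        // e.g. NoSuchKey or AccessDenied from S3
        console.log(err);
        return { statusCode: 500, body: JSON.stringify({ error: err.message }) };
    }
};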
Another problem in your code is that you always print "data" instead of data, so the literal string is printed instead of the data itself.

Related

DynamoDB getItem not providing any response within Async Lambda

I have been trying to make a getItem request to DynamoDB in an async Lambda function, and I am not receiving any response at all. Any troubleshooting or help would be greatly appreciated.
In general, I am trying to make a request to a DynamoDB table using the AWS SDK's getItem; however, when I run my code there is no response from the await ddb.getItem call, so I am rather lost as to what could be causing this.
// Load AWS SDK
const AWS = require("aws-sdk");
// Set the region
AWS.config.update({ region: "us-east-1" });
// Create the DynamoDB service object
const ddb = new AWS.DynamoDB({ apiVersion: "2012-08-10" });

const handler = async (
    event,
    context,
    callback,
    test = false,
    testObjFunc = {
        test: () => {
            Error("Testing enabled");
        }
    }
) => {
    const response = {
        isBase64Encoded: false,
        statusCode: 200,
        headers: { "Content-Type": "application/json", "Access-Control-Allow-Origin": "*" },
        multiValueHeaders: {},
        body: JSON.stringify({ responseBody })
    };
    try {
        // Parameters for the DynamoDB getItem call
        const dbParams = {
            TableName: "Table_Name",
            Key: {
                personID: { S: "value" }
            },
            ProjectionExpression: "value_to_return"
        };
        // DynamoDB call to check for the item
        const results = await ddb.getItem(dbParams).promise();
        console.log("success");
        console.log(results);
    } catch (error) {
        response.statusCode = 500;
    }
    return response;
};

module.exports.handler = handler;
You have put the getItem call in a try block; since you are not receiving any response, something has likely gone wrong inside that try block. Your catch block swallows the error, so log it to see what actually failed.
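A minimal sketch of that change, logging in the catch block before returning the 500 response:

    } catch (error) {
        // Surface the underlying failure instead of swallowing it
        console.error("getItem failed:", error);
        response.statusCode = 500;
    }

Note also that response references responseBody, which is never defined in the posted code, so building response would itself throw a ReferenceError before the try block even runs.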

Write a csv file to AWS S3 Fails

I have this TypeScript code that I use to write a CSV file to AWS S3. It works fine locally, but recently I started getting an error saying:
s3 upload error unsupported body payload object
NOTES:
I'm not passing credentials because the code runs on EC2 alongside AWS S3, so I don't need to pass them explicitly.
I'm printing all the params I read and pass, and they are read properly.
Here is the code:
public async writeFileToS3(datasetFile: any): Promise<boolean> {
    try {
        const readFile = util.promisify(this.fileWriter.readFile);
        const unlinkFile = util.promisify(this.fileWriter.unlink);
        const s3BucketName = this.appConfig.get<string>(
            'infra.fileWriter.bucket'
        );
        const s3Region = this.appConfig.get<string>(
            'infra.fileWriter.region'
        );
        this.s3Bucket.config.region = s3Region;
        console.log(
            `datasetFile ${datasetFile.path} ${datasetFile.originalname}`
        );
        const data = readFile(datasetFile.path);
        const params = {
            Bucket: s3BucketName,
            Key: datasetFile.originalname,
            Body: data,
            ACL: 'public-read'
        };
        console.log(
            `params ${params.Bucket} ${params.Key} ${params.Body} ${params.ACL}`
        );
        return await new Promise<boolean>((resolve, reject) => {
            this.s3Bucket.upload(params, function(err: any) {
                unlinkFile(datasetFile.path);
                if (err) {
                    console.log(err);
                    throw new OperationError(
                        'Error writing file to S3',
                        err
                    );
                } else {
                    resolve(true);
                }
            });
        });
    } catch (err) {
        throw new OperationError('Error writing file to S3');
    }
}
readFile returns a Promise (you created it with util.promisify), thus data is a Promise here:
const data = readFile(datasetFile.path);
const params = {
    Bucket: s3BucketName,
    Key: datasetFile.originalname,
    Body: data,
    ACL: 'public-read'
};
You should await the Promise:
const data = await readFile(datasetFile.path);
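With that change applied, the relevant section would look like this (a sketch; it also replaces the throw inside the upload callback with reject(err), since a throw from that asynchronous callback would not be caught by the surrounding try/catch):

const data = await readFile(datasetFile.path);
const params = {
    Bucket: s3BucketName,
    Key: datasetFile.originalname,
    Body: data,
    ACL: 'public-read'
};
return await new Promise<boolean>((resolve, reject) => {
    this.s3Bucket.upload(params, function(err: any) {
        unlinkFile(datasetFile.path);
        if (err) {
            // Reject rather than throw: a throw here would escape
            // the Promise and the outer try/catch entirely
            reject(new OperationError('Error writing file to S3', err));
        } else {
            resolve(true);
        }
    });
});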

AWS Lambda function not writing to DynamoDB

I have a Lambda function that's supposed to write to a database. When I run it on my local machine it works, but when I upload it to Lambda and test it, it doesn't put anything in the database. The role the function uses has full access to DynamoDB, and it's the exact same code that works fine when I run it from my laptop. Any idea why that would be the case?
Here's my Lambda. The DAO class contains the code that actually accesses DynamoDB. I'm just trying to upload some constant strings right now.
const DAO = require('./PostStatusDAO.js');

exports.handler = async (event, context, callback) => {
    var dao = new DAO();
    dao.post("this is a test", "#jordan", "#matt", "none");
    const response = {
        statusCode: 200,
        body: {
            result: "good"
        }
    };
    return response;
};
const AWS = require('aws-sdk');
const ddb = new AWS.DynamoDB.DocumentClient({region: 'us-west-2'});

class PostStatusDAO {
    post(in_text, in_user, in_author, in_attachment) {
        var params = {
            Item: {
                user: String(in_user),
                timestamp: Date.now(),
                author: String(in_author),
                text: String(in_text),
                attachment: String(in_attachment),
            },
            TableName: 'Feed',
        };
        console.log(params);
        var result = ddb.put(params, (err, data) => {
            console.log("callback");
            if (err) {
                console.log("Error: ", err);
            } else {
                console.log("Data: ", data);
            }
        });
        // console.log(result);
    }
}

module.exports = PostStatusDAO;
To see why your function is failing, you have to either wait for the write to finish or return the promise back to the caller/runtime, like this:
const DAO = require('./PostStatusDAO.js');

exports.handler = async (event, context, callback) => {
    var dao = new DAO();
    // Return new promise
    return new Promise(function(resolve, reject) {
        // Do async job
        dao.post("this is a test", "#jordan", "#matt", "none", function(err, data) {
            if (err) {
                console.log("Error: ", err);
                reject(err);
            } else {
                console.log("Data: ", data);
                resolve(data);
            }
        });
    });
};
const AWS = require('aws-sdk');
const ddb = new AWS.DynamoDB.DocumentClient({region: 'us-west-2'});

class PostStatusDAO {
    async post(in_text, in_user, in_author, in_attachment, callback) {
        var params = {
            Item: {
                user: String(in_user),
                timestamp: Date.now(),
                author: String(in_author),
                text: String(in_text),
                attachment: String(in_attachment),
            },
            TableName: 'Feed',
        };
        console.log(params);
        return ddb.put(params, callback).promise();
    }
}

module.exports = PostStatusDAO;
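Alternatively, since post is now async and returns the result of ddb.put(...).promise(), the handler can simply await it and skip both the callback and the extra Promise wrapper (a sketch, using the same DAO as above):

const DAO = require('./PostStatusDAO.js');

exports.handler = async (event) => {
    const dao = new DAO();
    // Await the write; a failure rejects the promise and fails the invocation
    const data = await dao.post("this is a test", "#jordan", "#matt", "none");
    console.log("Data: ", data);
    return {
        statusCode: 200,
        body: { result: "good" }
    };
};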

Amazon S3: getSignedUrl: "Missing required key 'Bucket' in params"

I've struggled here for two days. I am kind of new to JavaScript and AWS, so any hint will be appreciated.
I have 11 buckets. The others work fine; only this one fails.
When I pass in another bucket name and key value it works, but when I pass in the one I need I get the error: "Missing required key 'Bucket' in params".
For example:
If I pass in bucket: 'businesspicture', it successfully loads the picture I need.
$scope.$watch("userInfo.picture", function (imageValue) {
    var defaultIcon = '/images/defaultuser.jpg';
    if (imageValue !== defaultIcon && !imageValue.startsWith("https://")) {
        pictureServices.picture.getPictureFromS3({
            fileName: imageValue,
            bucket: "userpicture"
        }, {}, function (pic) {
            $scope.iconPreviewImage = pic.url;
            $scope.userInfo.picture = imageValue;
        }, function (error) {
            dialogService.showNgResourceError(error);
        });
    }
});
pictureService.js:
var AWS = require('aws-sdk');
AWS.config.loadFromPath('../s3_config.json');
var photoBuckets = new AWS.S3();

exports.getPictureFromS3 = function(fileName, bucketName) {
    return new Promise(function(resolve, reject) {
        var params = {
            Bucket: bucketName,
            Key: fileName
        };
        photoBuckets.getSignedUrl('getObject', params, function(err, url) {
            if (err) {
                reject(err);
            } else {
                awsurl = {url: url};
                resolve(awsurl);
            }
        });
    });
};
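One observation from the code as posted: getSignedUrl raises "Missing required key 'Bucket' in params" exactly when params.Bucket is undefined or empty, so bucketName is evidently not reaching this function for that one call. A defensive sketch that fails fast with a clearer message (the guard is an illustrative addition, not part of the original code):

exports.getPictureFromS3 = function(fileName, bucketName) {
    return new Promise(function(resolve, reject) {
        // Illustrative guard: report which argument is missing instead
        // of letting getSignedUrl fail with the generic SDK error
        if (!bucketName || !fileName) {
            return reject(new Error('getPictureFromS3 called with bucketName=' +
                bucketName + ', fileName=' + fileName));
        }
        var params = {
            Bucket: bucketName,
            Key: fileName
        };
        photoBuckets.getSignedUrl('getObject', params, function(err, url) {
            if (err) {
                reject(err);
            } else {
                resolve({url: url});
            }
        });
    });
};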

aws upload object to S3 bucket and pass details of data to lambda

Working my way through tutorials for AWS... So I've created an S3 bucket which, when a file is dropped into it, calls my Lambda 'testHelloWorld', which sends an email. This all works fine (see below).
'use strict';
console.log('Loading function');

var aws = require('aws-sdk');
var ses = new aws.SES({
    region: 'us-west-2'
});

exports.handler = function(event, context) {
    console.log("Incoming: ", event);
    // var output = querystring.parse(event);

    var eParams = {
        Destination: {
            ToAddresses: ["johnb#hotmail.com"]
        },
        Message: {
            Body: {
                Text: {
                    Data: "Hey! What is up?"
                }
            },
            Subject: {
                Data: "Email Subject!!!"
            }
        },
        Source: "johnb#hotmail.com"
    };

    console.log('===SENDING EMAIL===');
    var email = ses.sendEmail(eParams, function(err, data) {
        if (err) console.log(err);
        else {
            console.log("===EMAIL SENT===");
            console.log(data);
            console.log("EMAIL CODE END");
            console.log('EMAIL: ', email);
            context.succeed(event);
        }
    });
};
But I want to extend the email to include data on the file that was uploaded to the bucket. I found "How to trigger my Lambda Function once the file is uploaded to s3 bucket", which gives a Node.js code snippet that should capture the data. I have tried to merge this into my existing Lambda:
'use strict';
console.log('Loading function');

var aws = require('aws-sdk');
var ses = new aws.SES({
    region: 'us-west-2'
});
var s3 = new aws.S3({ apiVersion: '2006-03-01', accessKeyId: process.env.ACCESS_KEY, secretAccessKey: process.env.SECRET_KEY, region: process.env.LAMBDA_REGION });

exports.handler = function(event, context, exit) {
    console.log("Incoming: ", event);
    // var output = querystring.parse(event);

    // Get the object from the event and show its content type
    // const bucket = event.Records[0].s3.bucket.name;
    // const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
    const params = {
        Bucket: 'bucketName',
        Key: 'keyName',
        Source: 'SourceName',
        Destination: 'DestinationName',
        Message: 'MessageName'
    };

    s3.getObject(function(err, data) {
        if (err) {
            console.log('ERROR ' + err);
            // exit(err);
        } else {
            // the data has the content of the uploaded file
            var eParams = {
                Destination: {
                    ToAddresses: ["johnboy#hotmail.com"]
                },
                Message: {
                    Body: {
                        Text: {
                            Data: data
                        }
                    },
                    Subject: {
                        Data: "Email Subject!!!"
                    }
                },
                Source: "johnboy#hotmail.com"
            };
        }
    });

    console.log('===SENDING EMAIL===');
    var email = ses.sendEmail(eParams, function(err, data) {
        if (err) console.log(err);
        else {
            console.log("===EMAIL SENT===");
            console.log(data);
            console.log("EMAIL CODE END");
            console.log('EMAIL: ', email);
            context.succeed(event);
        }
    });
};
but this is failing on the params
message: 'There were 3 validation errors:
* MissingRequiredParameter: Missing required key \'Source\' in params
* MissingRequiredParameter: Missing required key \'Destination\' in params
* MissingRequiredParameter: Missing required key \'Message\' in params',
code: 'MultipleValidationErrors',
errors:
Source, Destination, and Message are listed in the params. Are they not correctly formatted, or is it just not picking them up?
I can't find much online... any help appreciated.
UPDATE
OK, I've got it working without failing... if I use the test function in the Lambda with the following code:
'use strict';
console.log('Loading function');

var aws = require('aws-sdk');
var ses = new aws.SES({
    region: 'us-west-2'
});
var s3 = new aws.S3({ apiVersion: '2006-03-01', accessKeyId: process.env.ACCESS_KEY, secretAccessKey: process.env.SECRET_KEY, region: process.env.LAMBDA_REGION });

exports.handler = function(event, context) {
    console.log("Incoming: ", event);
    // var output = querystring.parse(event);
    var testData = null;

    // Get the object from the event and show its content type
    // const bucket = event.Records[0].s3.bucket.name;
    // const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
    const params = {
        Bucket: 'bucket',
        Key: 'key',
    };

    s3.getObject(params, function(err, data) {
        if (err) {
            console.log('ERROR ' + err);
            exit(err);
        } else {
            testData = data;
        }
    });

    var eParams = {
        Destination: {
            ToAddresses: ["jim#him.com"]
        },
        Message: {
            Body: {
                Text: { Data: 'testData2' + testData }
            },
            Subject: {
                Data: "Email Subject!!!"
            }
        },
        Source: "jim#him.com"
    };

    console.log('===SENDING EMAIL===');
    var email = ses.sendEmail(eParams, function(err, data) {
        if (err) console.log(err);
        else {
            console.log("===EMAIL SENT===");
            console.log(data);
            console.log("EMAIL CODE END");
            console.log('EMAIL: ', email);
            context.succeed(event);
        }
    });
};
I get the email with the body: testData2null.
So I tried uploading an image through the S3 bucket, and I still get the email with the body testData2null.
Is there any way to debug this further, or does anyone know why it is saying null? I never actually tested the code from the other post that passes the data over to the email; I just assumed it would work. Does anyone else know how to obtain the data from the upload? Thanks.
You are declaring the var eParams within the callback of s3.getObject, but then you run ses.sendEmail outside of the callback. I think that's why!
You also need to move the ses.sendEmail call inside the callback of s3.getObject if you want to send the data from your object inside the email.
Try this:
s3.getObject(params, function(err, objectData) {
    if (err) {
        console.log('Could not fetch object data: ', err);
    } else {
        console.log('Data was successfully fetched from object');
        var eParams = {
            Destination: {
                ToAddresses: ["johnboy#hotmail.com"]
            },
            Message: {
                Body: {
                    Text: {
                        // SES expects a string here, so use the object's body
                        Data: objectData.Body.toString()
                    }
                },
                Subject: {
                    Data: "Email Subject!!!"
                }
            },
            Source: "johnboy#hotmail.com"
        };

        console.log('===SENDING EMAIL===');
        var email = ses.sendEmail(eParams, function(err, emailResult) {
            if (err) console.log('Error while sending email', err);
            else {
                console.log("===EMAIL SENT===");
                console.log(objectData);
                console.log("EMAIL CODE END");
                console.log('EMAIL: ', emailResult);
                context.succeed(event);
            }
        });
    }
});
You need to read up on how Node.js works. It is event-based and depends on callbacks and promises. You should do:
s3.getObject(params, function(err, data) {
    // This is your callback for the S3 API call. Do stuff here.
    if (err) {
        console.log('ERROR ' + err);
        exit(err);
    } else {
        testData = data;
        // Got your data. Send the mail here.
    }
});
I have added comments in the code above. Since Node.js is single-threaded, it fires off the S3 API call and moves on; by the time the mail is being sent, the S3 call has not completed, so the data is still null. It is better to use promises here.
Anyway, read up on callbacks and promises in Node.js and how they work, but I hope this explains your logical error.
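For reference, a promise-based version of the same flow might look like this (a sketch using the SDK's .promise() helper and an async handler; it reuses the s3 and ses clients from above and takes the bucket and key from the S3 event record, as in the commented-out lines in the question):

exports.handler = async (event) => {
    // Bucket and key come from the S3 trigger event
    const bucket = event.Records[0].s3.bucket.name;
    const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));

    // Wait for the object before building the email
    const data = await s3.getObject({ Bucket: bucket, Key: key }).promise();

    const eParams = {
        Destination: { ToAddresses: ["jim#him.com"] },
        Message: {
            Body: { Text: { Data: 'testData2 ' + data.Body.toString() } },
            Subject: { Data: "Email Subject!!!" }
        },
        Source: "jim#him.com"
    };

    return ses.sendEmail(eParams).promise();
};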
