I'm trying to put records into a DynamoDB table from an AWS Lambda function. I've set up a role that gives the Lambda function access to DynamoDB. However, the example code from the DynamoDB documentation below has no reference to an identifier/ARN. Is one not needed, and is the specific DynamoDB table to use instead inferred from the IAM role that the Lambda function uses?
Essentially, my question is: how can the code just call new AWS.DynamoDB(...) and 'automatically' know the right database to access/modify?
Example code:
var table = new AWS.DynamoDB({apiVersion: '2012-08-10', params: {TableName: 'MY_TABLE'}});
var key = 'UNIQUE_KEY_ID';
// Write the item to the table
var itemParams = {
Item: {
id: {S: key},
data: {S: 'data'}
}
};
table.putItem(itemParams, function() {
// Read the item from the table
table.getItem({Key: {id: {S: key}}}, function(err, data) {
console.log(data.Item); // print the item data
});
});
The table name and region are all you should need to specify.
Is one not needed, and is the specific DynamoDB table to use instead inferred from the IAM role that the Lambda function uses?
The AWS SDK configuration provides the account and region. Those pieces of information, coupled with the table name, are everything it needs to connect to your table. In this case I believe the IAM role provides the account information, and you have specified the table name. You may also need to specify the region explicitly, depending on whether the table lives in the SDK's default region.
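For example, here is a minimal sketch of setting the region explicitly when constructing the client; the region value is a placeholder, and inside Lambda the credentials come from the execution role automatically:
// Hypothetical example: inside Lambda the execution role supplies credentials,
// so only the region and table name need to be configured explicitly.
var AWS = require('aws-sdk');
var table = new AWS.DynamoDB({
  apiVersion: '2012-08-10',
  region: 'eu-west-1',              // placeholder region
  params: {TableName: 'MY_TABLE'}   // table name from the question's example
});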
I am trying to access a DynamoDB table, but I keep getting a "Resource not found" error.
The table is Active and its region is Paris (eu-west-3).
The code I am using:
export class EncuestaComponent implements OnInit {
[...]
client: DynamoDBClient = new DynamoDBClient({
region : 'eu-west-3',
credentials: {
accessKeyId: '[REDACTED]',
secretAccessKey: '[REDACTED]'
}
});
[...]
onDbClick() {
const commandParams = {};
const input: BatchExecuteStatementInput = {
Statements: [
{Statement: "SELECT opciones FROM encuesta.encuesta WHERE id = 'user.1'"}
],
}
const command = new BatchExecuteStatementCommand(input);
this.client.send(command).
then(data => console.log(data.Responses![0].Error)).
catch(error => {console.log("Error"); console.log(error)});
}
And the console shows that the then callback has been executed, but the message printed is {Code: 'ResourceNotFound', Message: 'Requested resource not found'}.
What am I doing wrong?
In PartiQL for DynamoDB, writing select * from something.else means that you want to query an index named else on a table named something. You need to do one of the following:
escape the .
surround the table name with quotes (as sketched below)
create a new table with a different name
I am not in front of my computer or I would figure out which it is for you, but this is where I'd start.
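For reference, a minimal sketch of the quoted form, assuming the table really is named encuesta.encuesta; the double quotes tell PartiQL to treat the whole string as a single table name rather than as table.index:
const input = {
  Statements: [
    // Quoting the table name keeps the dot from being parsed as a table.index separator
    {Statement: `SELECT opciones FROM "encuesta.encuesta" WHERE id = 'user.1'`}
  ],
};
const command = new BatchExecuteStatementCommand(input);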
Here is something else that might be causing the problem.
Can you confirm that:
the access key you are using has permission to read from the DynamoDB table?
the access key has not expired and can still be used?
To test this out, try adding the key to ~/.aws/credentials and running this command:
aws dynamodb scan --table-name encuesta.encuesta
and confirm that it actually shows the table contents rather than an access denied error.
I'm working on an IoT project where I need to read some data from a device.
I use AWS, and I'm currently working on some lambda function code. But I can't figure out how to get the last (newest) item from my database.
My database has two keys:
Partition key: device_id (Number)
Sort key: sample_time (Number)
This is a part of the code I wrote to retrieve the newest reading from my IoT device
case "GET /data/newest":
body = await dynamo
.query({
TableName: "bikelock_db",
KeyConditionExpression: 'device_id = :id',
ExpressionAttributeValues: {
":id": 1,
},
Limit: 1,
ScanForwardIndex: false,
})
.promise();
break;
This code, however, only returns the first item that was added to the database.
Changing ScanForwardIndex: false to true doesn't change a thing.
I thought the Sort Key would sort it automatically, but it does not.
Any idea what I'm missing, or why it isn't working?
Try ScanIndexForward and I bet it'll work. You transposed the two words.
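For completeness, a minimal sketch of the corrected call (same table and key as in the question); with ScanIndexForward: false the items come back in descending sort-key order, so Limit: 1 returns the newest sample:
body = await dynamo
  .query({
    TableName: "bikelock_db",
    KeyConditionExpression: 'device_id = :id',
    ExpressionAttributeValues: {
      ":id": 1,
    },
    Limit: 1,
    ScanIndexForward: false, // descending by sample_time, so the newest item comes first
  })
  .promise();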
I want to send some values for a field to Cloud Firestore, but I don't want them to be persisted (saved) in Cloud Firestore.
Code:
const message = {
  persistentData: {
    id: 'dSXYdieiwoDUEUWOssd',
    text: 'Hi dear how are you',
    date: new Date()
  },
  nonPersistentData: {
    securityCode: 393929949
  }
};
db.collection('messages').doc(message.persistentData.id).set(message).catch(e => {});
In the above code I want to persist (save) persistentData, but I don't want to save nonPersistentData either online or offline, because I only need it to check the real data in a Firestore rule. So it should not be accessible in the cache (offline) or on the server (online)...
This is simply not possible with Firestore. There is a similar question here. You need to separate the data into public (persistent) and private (non-persistent) data. One possible solution would be:
From the client, push the private data which contains the securityCode to a new collection called securityCodes and store the id of the new entry.
Because you don't want this info to be available to anyone, you can add a security rule:
match /securityCodes/{securityCode} {
  // No one can read from this collection; documents can only be created
  allow create: if true;
}
In your public data, add the id of the previously added document
data = {
id: 'dSXYdieiwoDUEUWOssd',
text: 'Hi dear how are you',
date: new Date(),
  securityId: <id of the securityCodes entry>
}
In your security rules, get the secret code using the securityId you send with the public data. Example:
match /collectionId/documentId {
  allow create: if get(/databases/$(database)/documents/securityCodes/$(request.resource.data.securityId)).data.securityCode == 'someknowncode'
}
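A rough client-side sketch of this approach, assuming the same web SDK style as the question (db.collection(...)); the collection and field names mirror the rules above and are otherwise placeholders:
// 1. Store the secret in its own, create-only collection
db.collection('securityCodes').add({securityCode: 'someknowncode'})
  .then(ref => {
    // 2. Store only the public data, plus a reference to the secret
    const data = {
      id: 'dSXYdieiwoDUEUWOssd',
      text: 'Hi dear how are you',
      date: new Date(),
      securityId: ref.id
    };
    return db.collection('messages').doc(data.id).set(data);
  })
  .catch(e => console.log(e));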
I want to know the difference between the AWS SDK DynamoDB client and the DynamoDB DocumentClient. In which use cases should we use the DynamoDB client over the DocumentClient?
const dynamoClient = new AWS.DynamoDB.DocumentClient();
vs
const dynamo = new AWS.DynamoDB();
I think this can be best answered by comparing two code samples which do the same thing.
Here's how you put an item using the DynamoDB client:
var params = {
Item: {
"AlbumTitle": {
S: "Somewhat Famous"
},
"Artist": {
S: "No One You Know"
},
"SongTitle": {
S: "Call Me Today"
}
},
TableName: "Music"
};
dynamodb.putItem(params, function (err, data) {
if (err) console.log(err)
else console.log(data);
});
Here's how you put the same item using the DocumentClient API:
var params = {
Item: {
"AlbumTitle": "Somewhat Famous",
"Artist": "No One You Know",
"SongTitle": "Call Me Today"
},
TableName: "Music"
};
var documentClient = new AWS.DynamoDB.DocumentClient();
documentClient.put(params, function (err, data) {
if (err) console.log(err);
else console.log(data);
});
As you can see, with the DocumentClient the Item is specified in a more natural way. Similar differences exist in all other operations that modify DDB (update(), delete()) and in the items returned by read operations (get(), query(), scan()).
As per the announcement of the DocumentClient:
The document client abstraction makes it easier to read and write data to Amazon DynamoDB with the AWS SDK for JavaScript. Now you can use native JavaScript objects without annotating them as AttributeValue types.
It's basically a simpler way of calling DynamoDB through the SDK, and it also converts annotated response data to native JS types. Generally you should only use the regular DynamoDB client for more "special" operations on your database, like creating tables, which usually fall outside the CRUD scope.
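For instance, table management only exists on the low-level client; here is a rough sketch, where the table name, key schema, and billing mode are illustrative placeholders:
var dynamodb = new AWS.DynamoDB();
// Creating a table is a control-plane operation, so it goes through the
// low-level client rather than the DocumentClient.
dynamodb.createTable({
  TableName: 'Music',
  AttributeDefinitions: [{AttributeName: 'Artist', AttributeType: 'S'}],
  KeySchema: [{AttributeName: 'Artist', KeyType: 'HASH'}],
  BillingMode: 'PAY_PER_REQUEST'
}, function (err, data) {
  if (err) console.log(err);
  else console.log(data);
});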
In simpler words, the DocumentClient is nothing but a wrapper around the DynamoDB client. As mentioned in the other answers and the AWS documentation below, it offers convenience of use, converting annotated response data to native JS types and abstracting away the notion of attribute values.
Another noticeable difference is that the scope of the DocumentClient is limited to item-level operations, while the DynamoDB client provides a broader range of operations in addition to the item-level ones.
From the AWS document client documentation:
The document client simplifies working with items in Amazon DynamoDB by abstracting away the notion of attribute values. This abstraction annotates native JavaScript types supplied as input parameters, as well as converts annotated response data to native JavaScript types.
I stored some data using db.collection("articles").doc(); and updated it using db.collection("articles").doc(postData.articleId).update(postData);. But when I create a new article the next time, the data under postData.articleId accumulates.
First, I create a doc, then update its data in real time using socket.io. But when I create another doc later, the data stored under postData.id accumulated.
var doc = db.collection("articles").doc();
var postData = {
author: user.email,
articleId: doc.id,
currentTime: new Date()
}
doc.set(postData);
--- get data from client code here (using socket) ---
db.collection("articles").doc(postData.articleId).update(postData); /*Update Data to postData.id.*/
The expected result is that data does not accumulate under postData.id when new data is added.
(For more info: https://github.com/officialmansu/opinion-express/tree/develop)
The update() method of Cloud Firestore is specifically meant to provide partial updates (sometimes also known as patches) to an existing document.
If you want to instead replace all existing data of the document, use set(). So:
db.collection("articles").doc(postData.articleId).set(postData);
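A small sketch of the difference, using the same articles collection from the question (the text field is just an illustrative example):
// update(): patches only the fields you pass in; other fields on the document survive
db.collection("articles").doc(postData.articleId).update({text: 'new text'});
// set(): replaces the whole document with exactly the object you pass in
db.collection("articles").doc(postData.articleId).set(postData);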