Setting DataType to JSON in aws-sdk for SQS - JavaScript

I'm just trying to send a JSON object to an SQS queue using the aws-sdk npm package.
const sqsMessage = {
    MessageAttributes: {
        "data": {
            DataType: "String",
            StringValue: payload.data
        }
    },
    MessageBody: JSON.stringify(payload),
    QueueUrl: queueUrl
};
If I pass a JSON object in the data attribute, it tells me it expected a String; if I set the type to JSON, it also throws an error about the type. Does anyone know a workaround, or what to use to get the JSON through?
Some useful links I've found:
1) https://blog.chrismitchellonline.com/posts/aws-sqs-message-with-attributes/
2) https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-send-message.html
Would really appreciate some guidance on this

Can you share the error message that you're receiving, and can you show us an example of the payload that you're constructing? The allowable values for DataType are documented here: https://docs.aws.amazon.com/AWSSimpleQueueService/latest/APIReference/API_MessageAttributeValue.html
Specifically, this part: "Amazon SQS supports the following logical data types: String, Number, and Binary. For the Number data type, you must use StringValue."
So when it comes to constructing the value for your MessageBody key, you want to use JSON. The type must be a string, and the value has to be valid JSON serialized as a string, with the inner quotes escaped. So for example, something like this:
"{\"foo\": \"bar\"}"
I ran into this same issue while writing unit tests in Go, and this approach works. I know this is 11 months later, but hopefully it helps you or someone else in the future :)

Related

How to decode transaction input data using `ethers.utils.defaultAbiCoder`

I'm fetching transaction data using the Etherscan API. This is the example result I'm getting:
{
blockNumber: '7409930',
timeStamp: '1639151980',
hash: '...',
nonce: '4124',
...
input: '0x9d90e4c8000000000000000000000000000000000000000000000000000000000000002000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000040000000000000000000000000000000000000000000000000000000000000002000000000000000000000000093238bb66b5d15b4152c5db574e3397ff1b1a450',
contractAddress: '',
cumulativeGasUsed: '403775',
gasUsed: '1162315',
confirmations: '191308'
}
I now need to figure out the event type (the contract method, e.g. TransferOwnership, stakeTokens, ...) for this transaction. This data is stored in the input property of the object.
I managed to accomplish this using the abi-decoder library, but I want to accomplish the same thing using one of ethers' utility methods.
My current implementation:
const abiDecoder = require("abi-decoder");
abiDecoder.addABI(contractAbi);
// "item" is transaction data, input property is encoded stuff from which I want to get the contract method used by this transaction
const decodedInput = abiDecoder.decodeMethod(item.input);
// contract method
console.log(decodedInput.name);
I was reading through the ethers documentation (https://docs.ethers.io/v5/api/utils/abi/coder/), but I can't figure it out.
You can try what is recommended in this issue: https://github.com/ethers-io/ethers.js/issues/423. But if you are interacting with BSC, this may not be possible due to the error "input data too big", which causes "Error in Big Number: Number can only safely store up to 53 bits".
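As a sketch of that recommendation with ethers v5: build an Interface from the same ABI you gave abi-decoder and let it parse the calldata. contractAbi and item come from the question; everything else is standard ethers v5 API:
const { ethers } = require("ethers"); // ethers v5

const iface = new ethers.utils.Interface(contractAbi);

// item.input is the encoded calldata from the Etherscan response
const parsed = iface.parseTransaction({ data: item.input });

console.log(parsed.name); // contract method name, e.g. "stakeTokens"
console.log(parsed.args); // decoded arguments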

How to send an object in postman

I was trying to make a POST request to the API route I just created.
In the backend I have something like this
console.log(typeof req.body)
console.log(req.body)
const { firstName, lastName, email, phoneNumber } = req.body
console.log(`Variable Values`, firstName, lastName, email, phoneNumber)
Here I am getting typeof as string and the body as this:
{
firstName: "Varun",
lastName: "Bindal",
email: "iva#gmail.com",
phoneNumber: "+91-8888"
}
What I want is for typeof to be object so I can destructure it. How can I make the request from Postman in this case? (I don't want to use JSON.parse.)
Click the "Text" beside it will show you a dropdown. Just choose "JSON" instead of "Text"
Choose the JSON option from the body-type dropdown.
You should change the type of the body from raw Text to JSON (application/json) by clicking on the "Text" button right next to the GraphQL option.
Your object body is of type text. Change it to JSON using the little dropdown and the POST request will work.
Cheers!
Why don't you want to use JSON.parse?
It's important to know that JSON and a JavaScript object are two different things.
JSON is a data format that can be used in various environments, while a JavaScript object is a data structure/concept in JavaScript.
When making an HTTP request you can send data via a few different methods, a few prominent ones being XML, binary and JSON (they will all be represented as text, even binary).
Since you're building an API with JavaScript, I would recommend that you use JSON in your requests and responses. JSON has also somewhat become the "standard" for APIs these days. It's also very easy to parse JSON into JavaScript objects and the other way around.
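For instance, a quick sketch of going back and forth between JSON text and an object:
const obj = JSON.parse('{"firstName": "Varun"}'); // JSON text -> JavaScript object
const text = JSON.stringify(obj);                 // JavaScript object -> JSON text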
Please note that you may also need to tell Postman to set the Content-Type header to application/json. You would also need to change your body to be valid JSON:
{
"firstName": "Varun",
"lastName": "Bindal",
"email": "iva#gmail.com",
"phoneNumber": "+91-8888"
}
I can recommend that you read the following article explaining what JSON is and how you use it: https://www.w3schools.com/js/js_json_intro.asp
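On the server side, a minimal Express sketch that receives such a request as an object; it assumes the built-in express.json() middleware of Express 4.16+, and the route path and port are placeholders:
const express = require("express");
const app = express();

// Parses JSON request bodies into objects when Content-Type is application/json
app.use(express.json());

app.post("/api/contact", (req, res) => {
    console.log(typeof req.body); // "object"
    const { firstName, lastName, email, phoneNumber } = req.body;
    res.json({ firstName, lastName, email, phoneNumber });
});

app.listen(3000);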

Need Help to implement Tincan Javascript API

I'm working with the TinCan JavaScript API. The issue is that my data format is totally different from the format TinCan specifies for passing data along with a call. Please help me adjust my data to the TinCan API format. Here is sample data for one of my calls.
var data = {
"groupId": "groupId",
"groupName": "gNameEncrypt",
"tutorNames": "tutorNames",
"actorNames": "actorNames",
"otherNames": "otherNames"
};
Currently, what I do is simply encode this data and send it like this:
var actionList = new TinCan(
{
recordStores: [{
endpoint: "http://example.com",
username: username,
password: password,
allowFail: false
}]
});
var action = new TinCan.Agent({
"name": "insert"
});
actionList.getStatements({
'params': {
'agent': action,
'verb': {
'id': $.base64.encode(data)
}
},
'callback': function (err, data) {
console.info(data.more);
var urlref = "http://<?php echo $_SERVER['SERVER_NAME'] . ":" . $_SERVER['SERVER_PORT'] . $uriParts[0] . "?" ?>t=" + data.more.TutorToken;
window.location.href = urlref;
}
});
crypt.finish();
});
There are really two parts here:
1. getting the data into xAPI (formerly Tin Can) format, and
2. the code itself.
In depth,
I think you need to take another look at how xAPI is used in general. Data is stored as a JSON "Statement" object that has 3 required properties and various other optional ones. These properties often contain complex objects that are very extensible. It is hard to tell from what you've shown what you are really trying to capture and what the best approach would be. I suggest reading some material about the xAPI statement format. http://experienceapi.com/statements-101/ is a good starting point, and to get at least some coverage of all the possibilities, continue with http://experienceapi.com/statements/ .
The code you've listed is attempting to get already-stored statements based on two parameters rather than trying to store a statement, the two parameters being "agent" and "verb". In this case we can't tell what the verb is supposed to be, since we don't know what data contains; I suspect it isn't going to make sense as a verb, which is intended to be the action of a statement. Having said that, the fact that the "actor" has a name of "insert" is questionable, as that really sounds more like what a "verb" should contain. Getting the statements right as part of #1 should make it obvious how you would retrieve those statements. As far as storing those statements, if you're using the TinCan interface object you would need to use the sendStatement method of that object. But this interface is no longer recommended; the recommended practice is to construct a TinCan.LRS object and interact directly with it, in which case you'd use the saveStatement method.
I would recommend looking at the "Basic Usage" section of the project home page here: http://rusticisoftware.github.io/TinCanJS/ and, for more specifics, the API docs: http://rusticisoftware.github.io/TinCanJS/doc/api/latest/
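To illustrate the recommended approach, here is a minimal sketch that stores one statement via TinCan.LRS and saveStatement. The mbox, verb id, activity id, and extension IRIs are placeholders; the question's custom fields are carried in the activity definition's extensions:
// Sketch using TinCanJS's recommended LRS interface (identifiers are placeholders)
var lrs = new TinCan.LRS({
    endpoint: "http://example.com/lrs/",
    username: username,
    password: password,
    allowFail: false
});

var statement = new TinCan.Statement({
    actor: new TinCan.Agent({ mbox: "mailto:tutor@example.com" }),
    verb: new TinCan.Verb({ id: "http://adlnet.gov/expapi/verbs/interacted" }),
    target: new TinCan.Activity({
        id: "http://example.com/activities/group/" + data.groupId,
        definition: {
            // custom data goes into extensions keyed by IRIs you control
            extensions: {
                "http://example.com/xapi/groupName": data.groupName,
                "http://example.com/xapi/tutorNames": data.tutorNames
            }
        }
    })
});

lrs.saveStatement(statement, {
    callback: function (err, xhr) {
        if (err !== null) {
            console.error("statement save failed", err);
        } else {
            console.info("statement saved");
        }
    }
});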

req.body of POST route contains newlines and colons?

I would like to know how to extract the post data from req.body.
My post data is
{
name:'asdf',
completed: false,
note: 'asdf'
}
When I try to log it using JSON.stringify, I am getting req.body as
{"{\n name:'asdf',completed:false,note:'asdf'}":""}
I noticed that newlines and colons are getting added to the req.body object, so when I try to read req.body.name it returns undefined.
I have used app.use(bodyParser.json()); but I am still not getting the expected result.
Hence I would like to know the following:
1. How do I access the POST data correctly?
2. Why are newlines and colons getting added to the req.body object?
I found the solution myself by following
req.body empty on posts
As I was testing in a REST client, the mistakes I made were:
I initially set Content-Type to www-form-urlencoded; later I changed it to application/json.
I didn't use quotes for the keys in the key-value pairs, so you need to pass the data in the raw section as
{"name":"asdf",
"completed":false,
"note":"asdf"}
Note: www-form-urlencoded also works when you pass the data in the form section as
name asdf
completed false
note asdf
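As for why the strange key appears: when the raw JSON-like text is sent with Content-Type: application/x-www-form-urlencoded, the urlencoded parser treats the whole string as a single key with an empty value, which is exactly the shape shown above. A quick sketch with Node's querystring module reproduces it:
const querystring = require("querystring");

// The whole body becomes one key because there is no '=' separator in it
const parsed = querystring.parse("{\n name:'asdf',completed:false,note:'asdf'}");
console.log(JSON.stringify(parsed));
// {"{\n name:'asdf',completed:false,note:'asdf'}":""}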
All the parsers accept a type option which allows you to change the Content-Type that the middleware will parse.
// parse various different custom JSON types as JSON
app.use(bodyParser.json({ type: 'application/*+json' }))
// parse some custom thing into a Buffer
app.use(bodyParser.raw({ type: 'application/vnd.custom-type' }))
// parse an HTML body into a string
app.use(bodyParser.text({ type: 'text/html' }))

SAPUI5 Create OData entity with dates - generates incorrect request payload that ends in CX_SXML_PARSE_ERROR

We are trying to create an entity that has date attributes via an OData service. The backend is an SAP system. The entity has only 3 key attributes plus a bunch of other attributes. We have identified that the dates in the keys are the root cause of the problem.
Keys:
Pernr type string,
begda type datetime
endda type datetime.
The code below (which does not work) has been severely simplified while troubleshooting the issue. At the moment, it reads an entity from an entity set and immediately tries to create one with exactly the same data.
Code:
var oODataModel = new sap.ui.model.odata.ODataModel("/sap/opu/odata/sap/Z_PERSONAL_DATA_SRV/");
//Test entity to be saved
var entity = null;
//Handler for read error
var handleReadE = function (oEvent){
alert("error");
};
//Handler for read success
var handleRead = function (oEvent){
//Get the data read from backend
entity = oEvent.results[0];
//Try to create a new entity with same data
oODataModel.create('/PersDataSet', entity, null, function(){
alert("Create successful");
},function(oError){
alert("Create failed", oError);
});
};
oODataModel.read("/PersDataSet", null, [], true, handleRead, handleReadE);
In the gateway error log, an XML parsing error appears. In this log we can see the request data, and it can be seen that the dates are transported with String types. These dates are defined in the service as DateTime, so the request is rejected.
Example:
<m:properties>
<d:Pernr m:type="Edm.String">00000001</d:Pernr>
<d:Endda m:type="Edm.String">9999-12-31T00:00:00</d:Endda>
<d:Begda m:type="Edm.String">1979-05-23T00:00:00</d:Begda>
When the entity is read, the backend does not send any type information. It sends something like the following example:
<m:properties>
<d:Pernr>72010459</d:Pernr>
<d:Endda>9999-12-31T00:00:00</d:Endda>
<d:Begda>1876-07-21T00:00:00</d:Begda>
And indeed, if we try to save the same info without the type="..." attributes, it works. So the problem is the incorrect types that ODataModel.create adds to the XML.
My question is:
Can I tell ODataModel.create not to add this type info? It is not doing a good job of inferring the types.
Can anyone share an example reading and writing dates through odata?
Thank you very much in advance.
The data returned from oODataModel.read is raw; before you post it, you need to parse it:
var handleRead = function (oEvent){
    //Get the data read from backend
    entity = oEvent.results[0];
    //Copy the entity and drop the metadata so create() re-infers the types
    var newEntity = jQuery.extend({}, entity);
    delete newEntity.__metadata;
    //Convert the date strings into real Date objects
    newEntity.Begda = new Date(entity.Begda);
    newEntity.Endda = new Date(entity.Endda);
    //Try to create a new entity with the same data
    oODataModel.create('/PersDataSet', newEntity, null, function(){
        alert("Create successful");
    }, function(oError){
        alert("Create failed", oError);
    });
};
Why not use JSON instead of XML?
Thanks all for the help.
We got this working accounting for the following:
The problem of the wrong types appended to the attributes comes from the read itself. The object returned by read has a __metadata attribute which describes the values. In this object the dates are typed as Edm.String, even though the service says they are DateTime. To me this is a bug in the .read function.
When trying to use the same object to save, create sees the __metadata on the entry and uses those values, producing type Edm.String for the dates. This caused the request to be rejected. Manually changing these __metadata.properties...type values to Edm.DateTime makes it work.
In the end, we did the following:
1. Dates are parsed manually from the OData response, creating a JS Date object from the strings in the format "yyyy-mm-ddT00:00:00", so that they work with control bindings. When we want to save, the reverse is done.
2. The object to be created is a new object with only the attributes we care about (no __metadata).
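A rough sketch of that approach; the helper names are made up, and the attribute names come from the question's entity:
// "1979-05-23T00:00:00" -> JS Date (used after reading, for control bindings)
function parseODataDate(value) {
    var p = value.split(/[-T:]/);
    return new Date(p[0], p[1] - 1, p[2], p[3] || 0, p[4] || 0, p[5] || 0);
}

// JS Date -> "1979-05-23T00:00:00" (the reverse, used before saving)
function formatODataDate(date) {
    function pad(n) { return (n < 10 ? "0" : "") + n; }
    return date.getFullYear() + "-" + pad(date.getMonth() + 1) + "-" + pad(date.getDate()) +
        "T" + pad(date.getHours()) + ":" + pad(date.getMinutes()) + ":" + pad(date.getSeconds());
}

// After reading: convert the strings into Dates for the bindings
entity.Begda = parseODataDate(entity.Begda);
entity.Endda = parseODataDate(entity.Endda);

// Before saving: build a clean object (no __metadata) and convert back
var newEntity = {
    Pernr: entity.Pernr,
    Begda: formatODataDate(entity.Begda),
    Endda: formatODataDate(entity.Endda)
};
oODataModel.create('/PersDataSet', newEntity, null, function () {
    alert("Create successful");
}, function (oError) {
    alert("Create failed", oError);
});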
