This code is written in Python:
from asn1crypto import tsp, cms, util
response_file = open('timestamp-response.tsr','rb')
response = tsp.TimeStampResp.load(response_file.read())
token = response['time_stamp_token']
signed_data = token['content']
encap_content_info = signed_data['encap_content_info']
tst_info = encap_content_info['content'].parsed
signer_infos = signed_data['signer_infos']
signer_info = signer_infos[0]
signed_attrs = signer_info['signed_attrs']
signature = signer_info['signature']
I can't find a way to perform the same action using JavaScript, even though the API of the libraries looks similar.
Helpful links:
https://kjur.github.io/jsrsasign/api/symbols/KJUR.asn1.tsp.TimeStampResp.html
https://github.com/wbond/asn1crypto/blob/master/asn1crypto/tsp.py
I am not aware of any ready-to-use library, but I believe it should be possible to use ASN1.js to parse the TimeStampResp structure with the definitions from RFC 3161 and extract the data you need.
Parsing a DER-encoded structure when you have its ASN.1 definition is much like parsing an XML document when you have its XSD, but it will probably take some time to get familiar with ASN.1.
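For illustration, a minimal sketch of that low-level approach with asn1js in Node.js; the child index below follows the TimeStampResp definition from RFC 3161 (SEQUENCE of a status and an optional timeStampToken) and is an assumption to verify against the data you actually receive:
const fs = require('fs');
const asn1js = require('asn1js');

// Read the DER-encoded response and re-wrap the Node Buffer as an ArrayBuffer,
// which is what asn1js expects.
const der = fs.readFileSync('timestamp-response.tsr');
const asn1 = asn1js.fromBER(new Uint8Array(der).buffer);
if (asn1.offset === -1) {
    throw new Error('Could not parse the DER data');
}

// TimeStampResp ::= SEQUENCE { status PKIStatusInfo, timeStampToken TimeStampToken OPTIONAL }
const timeStampResp = asn1.result;
const timeStampToken = timeStampResp.valueBlock.value[1]; // ContentInfo wrapping SignedData
console.log(timeStampToken);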
You could try pkijs. I did not try it on timestamps (only X.509 certificates), but it seems this library does support them; it uses asn1js under the hood. The project lists the following time-stamping features (a rough sketch follows after the list):
Time-stamping request:
    Parsing internal values
    Getting/setting any internal values
    Creation of a new Time-stamping request "from scratch"
    Validation of Time-stamping request signature
Time-stamping response:
    Parsing internal values
    Getting/setting any internal values
    Creation of a new Time-stamping response "from scratch"
    Validation of Time-stamping response signature
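As a rough sketch of how the Python snippet above could translate to pkijs (untested on a real token, so treat the class and property names (TimeStampResp, timeStampToken, SignedData, encapContentInfo, TSTInfo) as assumptions to verify against the pkijs documentation):
const fs = require('fs');
const asn1js = require('asn1js');
const pkijs = require('pkijs');

// Parse the DER-encoded TimeStampResp
const der = fs.readFileSync('timestamp-response.tsr');
const asn1 = asn1js.fromBER(new Uint8Array(der).buffer);
const response = new pkijs.TimeStampResp({ schema: asn1.result });

// time_stamp_token is a CMS ContentInfo wrapping SignedData
const signedData = new pkijs.SignedData({ schema: response.timeStampToken.content });

// encapContentInfo.eContent holds the DER-encoded TSTInfo
const eContent = signedData.encapContentInfo.eContent;
const tstAsn1 = asn1js.fromBER(eContent.valueBlock.valueHex);
const tstInfo = new pkijs.TSTInfo({ schema: tstAsn1.result });

// Signer info, signed attributes and signature, as in the Python code
const signerInfo = signedData.signerInfos[0];
const signedAttrs = signerInfo.signedAttrs;
const signature = signerInfo.signature;
console.log(tstInfo, signedAttrs, signature);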
I am writing a DNS-over-HTTPS server which should resolve custom names, not just proxy them to some other DoH server like Google's. I am having trouble properly decoding the body of the request.
For example, I get the body of the request in binary format, specifically as a JavaScript Uint8Array/ArrayBuffer. I am using the following code to get a base64 representation of the array:
function _arrayBufferToBase64(buffer) {
    var binary = '';
    var bytes = new Uint8Array(buffer);
    var len = bytes.byteLength;
    for (var i = 0; i < len; i++) {
        binary += String.fromCharCode(bytes[i]);
    }
    return btoa(binary);
}
And I get something like this as a result:
AAABAAABAAAAAAABCmFwbngtbWF0Y2gGZG90b21pA2NvbQAAAQABAAApEAAAAAAAAE4ADABKAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=
Now, per the RFC 8484 standard this should be decoded as base64url, but when I decode it as such, I get the following:
apnx-matchdotomicom)NJ
I also used this "tutorial" as a reference, but they decode a similarly formatted blob and I get similar nonsense as before.
There is very little information about something like this on the internet; if it is of any help, the DoH standard uses the application/dns-message media type for the body.
If anyone has some insight on what I am doing wrong or how I could edit the question to make it more clear, please help me, cheers :)
As stated in the RFC:
Definition of the "application/dns-message" Media Type
The data payload for the "application/dns-message" media type is a
single message of the DNS on-the-wire format defined in Section 4.2.1
of [RFC1035], which in turn refers to the full wire format defined in
Section 4.1 of that RFC.
So what you get is exactly what is sent on the wire in the normal DNS-over-port-53 case.
I would recommend you use a DNS library that should have a from_wire or similar method to which you can feed this content and get back some structured data.
Showing an example in Python with the content you gave:
In [1]: import base64
In [3]: import dns.message
In [5]: payload = 'AAABAAABAAAAAAABCmFwbngtbWF0Y2gGZG90b21pA2NvbQAAAQABAAApEAAAAAAAAE4ADABKAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA='
In [7]: raw = base64.b64decode(payload)
In [9]: msg = dns.message.from_wire(raw)
In [10]: print msg
id 0
opcode QUERY
rcode NOERROR
flags RD
edns 0
payload 4096
option Generic 12
;QUESTION
apnx-match.dotomi.com. IN A
;ANSWER
;AUTHORITY
;ADDITIONAL
So your message is a DNS query for the A record type on name apnx-match.dotomi.com.
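If you want to do the same thing in JavaScript on your server, a DNS wire-format library such as dns-packet can decode the raw body directly; a rough sketch, assuming that package (check its README for the exact API):
const dnsPacket = require('dns-packet');

// The POST body is already the raw DNS wire format, so decode it as-is;
// no base64 round trip is needed on the server side.
function handleDnsBody(bodyBuffer) {
    const message = dnsPacket.decode(bodyBuffer);
    console.log(message.questions); // e.g. [ { type: 'A', class: 'IN', name: 'apnx-match.dotomi.com' } ]
    return message;
}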
Also about:
I am writing DNS-over-HTTPS server which should resolve custom names,
If you are not doing this to learn (which is a fine goal), note that there is already various open-source nameserver software that does DoH, so you don't need to reinvent it. For example: https://blog.nlnetlabs.nl/dns-over-https-in-unbound/
I'm trying to read a continually updated JSON file from syslog-ng. Currently, syslog-ng, a logging daemon, is set to continually append logs of the data I want to a JSON file. I'm displaying the data on my cyber attack map for only 30 seconds, until it's not needed anymore. I can read the file and parse what I need, but is there a way to, over time, read and parse only the most recent additions to the file?
Sample code:
//Assume JSON output = {attack source, attack destination, attack type}
//Required modules
var JSONStream = require('JSONStream')
var fs = require('fs');
//Creates readable stream for JSON file parsing
var stream = fs.createReadStream('output.json', 'utf8'),
    parser = JSONStream.parse(['source', 'dest', 'type']);
//Send read data to parser function
stream.pipe(parser);
//Intake data from parser function
parser.on('data', function (obj) {
    //Do something with the object
    console.log(obj);
});
I'm using JSONStream to avoid having to read the whole log file into memory; JSONStream should still be able to parse the bits I want. But is there a method to read only the changes after the original reading is complete?
Use the code example provided in the library:
JSONStream test code
You don't have to wait for the end; you can use the callback to do your work object by object.
But the file structure should suit the library's expectations, like the example files given in that folder.
Example file: all_npm.json
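As a rough sketch of the "read only the new additions" part: remember how many bytes you have already consumed and open a new read stream from that offset whenever the file grows. The polling interval and offset handling below are assumptions about your setup, and syslog-ng must write each appended record as JSON that is parseable on its own:
var fs = require('fs');
var JSONStream = require('JSONStream');

var filePath = 'output.json';
var lastSize = 0;

function readNewData() {
    fs.stat(filePath, function (err, stats) {
        if (err || stats.size <= lastSize) return; // nothing new was appended
        var stream = fs.createReadStream(filePath, {
            start: lastSize,          // continue where the previous read stopped
            end: stats.size - 1,
            encoding: 'utf8'
        });
        lastSize = stats.size;
        stream.pipe(JSONStream.parse(['source', 'dest', 'type']))
            .on('data', function (obj) {
                console.log(obj);     // only the newly appended objects arrive here
            });
    });
}

// Poll for growth; fs.watch(filePath, readNewData) is an event-driven alternative.
setInterval(readNewData, 1000);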
How can I send JSON.stringify(array) data within form-data and decode the JSON with my Django API?
I'm trying to add the functionality to upload an array of date-strings within a form.
Originally we sent the POST data as JSON and sending arrays of data worked; however, when we switched to form-data in order to make uploading an image easier, we started having problems with array types.
Since form-data values have to be sent as strings, I converted the date-string array using JSON.stringify():
const myForm = new FormData();
myForm.set("date_strings", JSON.stringify(dateStrings));
When I post myForm to my Django + DRF API, it responds with:
{
    "date_strings": [
        "Datetime has wrong format. Use one of these formats instead: YYYY-MM-DDThh:mm[:ss[.uuuuuu]][+HH:MM|-HH:MM|Z]."
    ],
    "status_code": 400
}
In Postman I verified that sending a single date-string works, but when I send a string-array I get the same error.
I believe my Django API checks whether request.data is valid, sees that date_strings is a JSON string, and then responds with a 400 error:
def create(self, request, *args, **kwargs):
    serializer = self.get_serializer(data=request.data)
    serializer.is_valid(raise_exception=True)
Attempted Solutions:
Converting the JSON string to an array in the PostViewset create method:
I can't change request.data['publish_dates'] because it is not mutable, and I've seen advice that you should not attempt to copy or change request data in the viewset because it isn't safe.
Converting the JSON string to an array in the serializer:
Neither MySerializer's create nor validate methods run (I added logging to test this).
date_strings are formatted and created as separate PublishDate model instances in the create method:
class MySerializer(MyChildSerializer):
    date_strings = serializers.ListField(child=serializers.DateTimeField(), min_length=1, max_length=100, write_only=True)
How can I send/accept form-data to my Django + DRF API when one attribute is an array of date-time strings?
I realized the problem and found a solution. Because MySerializer had declared date_strings as a ListField, the incoming JSON string was being rejected as a bad request.
By changing the date_strings field to a CharField, MySerializer now runs its validate and create methods. I still have to find a way to convert the JSON date_strings back into an array in the serializer's create method, but I solved the API's 400 issue.
MySerializer now looks like:
class CreatePostSerializer(SocialVeilModelSerializer):
    publish_dates = serializers.CharField(max_length=2000)
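As a side note, multipart form-data can carry the same key multiple times, so a client-side alternative might avoid the JSON.stringify round trip entirely, provided the ListField on the server actually reads repeated form keys as a list (worth verifying against the DRF docs); a hypothetical sketch:
const myForm = new FormData();
dateStrings.forEach(function (dateString) {
    // append (not set) so every date string is kept under the same key
    myForm.append("date_strings", dateString);
});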
I'm trying some basic API Connect tutorials on IBM's platform (running locally using loopback) and have got completely stuck at an early point.
I've built a basic API service with some in-memory data and setter / getter functions. I've then built a separate API which takes two GET parameters and uses one of my getter functions to perform a search based on two criteria. When I run it, I successfully get a response with the following JSON object:
[{"itemId":1,"charge":9,"itemSize":2,"id":2}]
I've then tried to add a piece of server logic that modifies the response data - at this point, I'm just trying to add an extra field. I've added a Javascript component in the Assemble view and included the following code (taken from a tutorial), which I thought should modify the message body returned by the API while still passing it through:
//APIC: get the payload
var json = apim.getvariable('message.body');
//console.error("json %s", JSON.stringify(json));
//same: code to inject new attribute
json.platform = 'Powered by IBM API Connect';
//APIC: set the payload
//message.body = json;
apim.setvariable('message.body', json);
Instead of getting an extra JSON parameter ("platform"), all I get is a 500 error when I call the service. I'm guessing that I'm doing something fundamentally wrong, but all the docs suggest these are the right variable names to use.
You can't simply access json.platform, because at that point the json variable is of the gateway's "json" type rather than a plain object. Are you sure you can add a property to it when the object lacks that property? What if you first parse message.body into a normal object, then add the new property, and finally stringify it again before assigning it back to the body?
var json = JSON.parse(apim.getvariable('message.body')); //convert to normal object
json.platform = 'Powered by IBM API Connect'; //add new property
apim.setvariable('message.body', JSON.stringify(json)); //convert to json again before setting as body value
You need to read the payload in a specific format and put your logic inside that callback. For example, if your message is JSON you can do:
apim.readInputAsJSON(function (error, json) {
    if (error) {
        // handle the error
        apim.error('MyError', 500, 'Internal Error', 'Some error message');
    } else {
        // the callback already delivers the parsed payload as `json`
        //console.error("json %s", JSON.stringify(json));
        if (json) {
            // inject the new attribute
            json.platform = 'Powered by IBM API Connect';
            // set the payload
            apim.setvariable('message.body', json);
        }
    }
});
Reference:
IBM Reference
Your message.body is empty; put an invoke/proxy policy before your gateway/JavaScript policy, for example.
We are trying to create an entity that has date attributes via an OData service. The backend is an SAP system. The entity has only 3 key attributes plus a bunch of other attributes. We have identified that the dates in the keys are the root cause of the problem.
Keys:
Pernr type string,
begda type datetime
endda type datetime.
The code below (which does not work) has been severely simplified while troubleshooting the issue. At the moment, it reads an entity from an entity set and immediately tries to create one with exactly the same data.
Code:
var oODataModel = new sap.ui.model.odata.ODataModel("/sap/opu/odata/sap/Z_PERSONAL_DATA_SRV/");
//Test entity to be saved
var entity = null;
//Handler for read error
var handleReadE = function (oEvent) {
    alert("error");
};
//Handler for read success
var handleRead = function (oEvent) {
    //Get the data read from backend
    entity = oEvent.results[0];
    //Try to create a new entity with same data
    oODataModel.create('/PersDataSet', entity, null, function () {
        alert("Create successful");
    }, function (oError) {
        alert("Create failed", oError);
    });
};
oODataModel.read("/PersDataSet", null, [], true, handleRead, handleReadE);
In the gateway error log, an XML parsing error appears. In this log we can see the request data, and it shows that the dates are transported as String types. These dates are defined in the service as DateTime, so the request is rejected.
Example:
<m:properties>
<d:Pernr m:type="Edm.String">00000001</d:Pernr>
<d:Endda m:type="Edm.String">9999-12-31T00:00:00</d:Endda>
<d:Begda m:type="Edm.String">1979-05-23T00:00:00</d:Begda>
When the entity is read, the backend does not send any type information. It sends something like the following example:
<m:properties>
<d:Pernr>72010459</d:Pernr>
<d:Endda>9999-12-31T00:00:00</d:Endda>
<d:Begda>1876-07-21T00:00:00</d:Begda>
And indeed, if we try to save the same info without the type=".." attributes, it works. So the problem is the incorrect types that ODataModel.create adds to the XML.
My question is:
Can I tell ODataModel.create not to add this type info? It is not doing a good job of inferring the types.
Can anyone share an example reading and writing dates through odata?
Thank you very much in advance.
The data returned from oODataModel.read is raw; before you post it, you need to parse it:
var handleRead = function (oEvent) {
    //Get the data read from backend
    entity = oEvent.results[0];
    //Copy the entity, drop the read metadata and convert the date strings to Date objects
    var newEntity = jQuery.extend({}, entity);
    delete newEntity.__metadata;
    newEntity.Begda = new Date(entity.Begda);
    newEntity.Endda = new Date(entity.Endda);
    //Try to create a new entity with the converted data
    oODataModel.create('/PersDataSet', newEntity, null, function () {
        alert("Create successful");
    }, function (oError) {
        alert("Create failed");
    });
};
Why not use JSON instead of XML?
Thanks all for the help.
We got this working accounting for the following:
The problem of the wrong types appended to the attributes comes from the read itself. The object returned by read has a __metadata attribute which describes the values. In this object the dates are set with type Edm.String, even though the service says they are DateTime. To me this is a bug in the .read function.
When trying to use the same object to save, create sees the __metadata on the entry and uses those values, producing Edm.String types for the dates. This caused the request to be rejected. Manually changing these __metadata.properties...type entries to Edm.DateTime makes it work.
In the end, we did the following:
Dates are parsed manually from the OData response, creating a JS Date object from the strings in the format "yyyy-mm-ddT00:00:00", so that they work with control bindings. When we want to save, the reverse is done.
The object to be created is a new object with only the attributes we care about (no __metadata).
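A condensed sketch of that final approach (the attribute names Pernr/Begda/Endda come from the question; whether the remaining non-key attributes need the same treatment depends on your service):
// Build a fresh payload with only the needed attributes (no __metadata)
// and JS Date objects for the date keys.
var payload = {
    Pernr: entity.Pernr,
    Begda: new Date(entity.Begda), // "1979-05-23T00:00:00" -> JS Date
    Endda: new Date(entity.Endda)
};
oODataModel.create('/PersDataSet', payload, null, function () {
    alert("Create successful");
}, function (oError) {
    alert("Create failed");
});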