I am trying to implement a Hyperledger Sawtooth transaction through the JavaScript SDK, following this tutorial: https://sawtooth.hyperledger.org/docs/core/releases/1.0/_autogen/sdk_submit_tutorial_js.html#encoding-your-payload.
/*
 * Create the transaction header
 */
const createTransactionHeader = function createTransactionHeader(payloadBytes) {
  return protobuf.TransactionHeader.encode({
    familyName: 'intkey',
    familyVersion: '1.0',
    inputs: [],
    outputs: [],
    signerPublicKey: '02cb65a26f7af4286d5f8118400262f7790e20018f2d01e1a9ffc25de1aafabdda',
    batcherPublicKey: '02cb65a26f7af4286d5f8118400262f7790e20018f2d01e1a9ffc25de1aafabdda',
    dependencies: [],
    payloadSha512: createHash('sha512').update(payloadBytes).digest('hex')
  }).finish();
}

/*
 * Create the transactions
 */
const createTransaction = function createTransaction(transactionHeaderBytes, payloadBytes) {
  const signature = signer.sign(transactionHeaderBytes);
  return protobuf.Transaction.create({
    header: transactionHeaderBytes,
    headerSignature: Buffer.from(signature, 'utf8').toString('hex'),
    payload: payloadBytes
  });
}
While submitting the transaction I am getting the following error from the REST API:
{
  "error": {
    "code": 30,
    "message": "The submitted BatchList was rejected by the validator. It was poorly formed, or has an invalid signature.",
    "title": "Submitted Batches Invalid"
  }
}
I found the following issue, similar to my problem:
Sawtooth Invalid Batch or Signature
But it's implemented in Java, and the solution does not work for my case.
This should work, try this:
const cbor = require('cbor');
const {createContext, CryptoFactory} = require('sawtooth-sdk/signing');
const {createHash} = require('crypto');
const {protobuf} = require('sawtooth-sdk');
const request = require('request');
const crypto = require('crypto');

const context = createContext('secp256k1');
const privateKey = context.newRandomPrivateKey();
// CryptoFactory is a class, so it must be instantiated with `new`
const signer = new CryptoFactory(context).newSigner(privateKey);

// Here's how you can generate the input/output address
const FAMILY_NAMESPACE = crypto.createHash('sha512').update('intkey').digest('hex').toLowerCase().substr(0, 6);
const address = FAMILY_NAMESPACE + crypto.createHash('sha512').update('foo').digest('hex').toLowerCase().substr(0, 64);

const payload = {
  Verb: 'set',
  Name: 'foo',
  Value: 42
};
const payloadBytes = cbor.encode(payload);

const transactionHeaderBytes = protobuf.TransactionHeader.encode({
  familyName: 'intkey',
  familyVersion: '1.0',
  inputs: [address],
  outputs: [address],
  signerPublicKey: signer.getPublicKey().asHex(),
  batcherPublicKey: signer.getPublicKey().asHex(),
  dependencies: [],
  payloadSha512: createHash('sha512').update(payloadBytes).digest('hex')
}).finish();

const transactionHeaderSignature = signer.sign(transactionHeaderBytes);

const transaction = protobuf.Transaction.create({
  header: transactionHeaderBytes,
  headerSignature: transactionHeaderSignature,
  payload: payloadBytes
});

const transactions = [transaction];

const batchHeaderBytes = protobuf.BatchHeader.encode({
  signerPublicKey: signer.getPublicKey().asHex(),
  transactionIds: transactions.map((txn) => txn.headerSignature),
}).finish();

const batchHeaderSignature = signer.sign(batchHeaderBytes);

const batch = protobuf.Batch.create({
  header: batchHeaderBytes,
  headerSignature: batchHeaderSignature,
  transactions: transactions
});

const batchListBytes = protobuf.BatchList.encode({
  batches: [batch]
}).finish();

request.post({
  url: 'http://rest.api.domain/batches',
  body: batchListBytes,
  headers: {'Content-Type': 'application/octet-stream'}
}, (err, response) => {
  if (err) {
    return console.log(err);
  }
  console.log(response.body);
});
I need to integrate the KNET payment gateway in a Next.js application, but I wasn't able to find any documentation or example for that.
Can anyone please help me and provide any idea of how to integrate KNET in JavaScript?
After some more research, here is how I was able to integrate KNET in Next.js:
import * as crypto from 'crypto'
import Router from 'next/router'

const pkcs5Pad = (text: string) => {
  const blocksize = 16
  const pad = blocksize - (text.length % blocksize)
  // PKCS#5: append `pad` bytes, each with the value `pad`
  return text + String.fromCharCode(pad).repeat(pad)
}

const aesEncrypt = (text: string, key: string) => {
  const AES_METHOD = 'aes-128-cbc'
  const content = pkcs5Pad(text)
  try {
    const cipher = crypto.createCipheriv(AES_METHOD, Buffer.from(key), key)
    let encrypted = cipher.update(content)
    encrypted = Buffer.concat([encrypted, cipher.final()])
    return `${encrypted.toString('hex')}`
  } catch (err) {
    console.error(err)
  }
}

const aesDecrypt = (text: string) => {
  const AES_METHOD = 'aes-128-cbc'
  const key = process.env.termResourceKey
  const decipher = crypto.createDecipheriv(
    AES_METHOD,
    Buffer.from(key as string),
    key as string
  )
  const encryptedText = Buffer.from(text, 'hex')
  let decrypted = decipher.update(encryptedText)
  decrypted = Buffer.concat([decrypted, decipher.final()])
  return decrypted.toString()
}
const initiateKnetPayment = () => {
  const kpayUrl = process.env.kpayUrl // https://www.kpay.com.kw/kpg/PaymentHTTP.htm for production or https://www.kpaytest.com.kw/kpg/PaymentHTTP.htm for test
  const tranportalId = process.env.tranportalId
  const tranportalPassword = process.env.tranportalPassword
  const termResourceKey = process.env.termResourceKey
  const responseUrl = process.env.kpayResponseUrl
  const errorUrl = process.env.kpayErrorUrl
  const paramData = {
    currencycode: '414',
    id: tranportalId,
    password: tranportalPassword,
    action: '1',
    langid: 'AR',
    amt: 20, // amount
    responseURL: responseUrl,
    errorURL: errorUrl,
    trackid: Math.random(), // should be a unique id per transaction
    udf3: 12345678 // 8 digit numeric value as customer identifier
  }
  let params = ''
  Object.keys(paramData).forEach((key) => {
    params += `${key}=${paramData[key as keyof typeof paramData]}&`
  })
  const encryptedParams = aesEncrypt(params, termResourceKey)
  params = `${encryptedParams}&tranportalId=${tranportalId}&responseURL=${responseUrl}&errorURL=${errorUrl}`
  const url = `${kpayUrl}?param=paymentInit&trandata=${params}`
  Router.push(url)
}
I am new to GTM + GA. I am trying to display Google Analytics (GA4) reports on my webpage. I created an OAuth client ID in the Google Cloud console and completed the other settings there. Through JavaScript code I am trying to get an access token from the Google API, and I am getting the below exception.
After successful authentication I will integrate the GA reports with my web page. Below is my JavaScript code for getting the access token.
function main(propertyId = 'YOUR-GA4-PROPERTY-ID') {
  propertyId = '347415282';
  const {OAuth2Client} = require('google-auth-library');
  const {grpc} = require('google-gax');
  // BetaAnalyticsDataClient is used below, so it must be required
  const {BetaAnalyticsDataClient} = require('@google-analytics/data');
  const http = require('http');
  const url = require('url');
  const open = require('open');
  const destroyer = require('server-destroy');
  const keys = require('./oauth2.keys.json');

  const SCOPES = ['https://www.googleapis.com/auth/analytics.readonly'];

  function getAnalyticsDataClient(authClient) {
    const sslCreds = grpc.credentials.createSsl();
    const credentials = grpc.credentials.combineChannelCredentials(
      sslCreds,
      grpc.credentials.createFromGoogleCredential(authClient));
    return new BetaAnalyticsDataClient({
      sslCreds: credentials,
    });
  }

  function getOAuth2Client() {
    return new Promise((resolve, reject) => {
      const oAuth2Client = new OAuth2Client(
        keys.web.client_id,
        keys.web.client_secret,
        'http://localhost:3000/oauth2callback');
      const authorizeUrl = oAuth2Client.generateAuthUrl({
        access_type: 'offline',
        scope: SCOPES.join(' '),
      });
      const server = http
        .createServer(async (req, res) => {
          try {
            if (req.url.indexOf('/oauth2callback') > -1) {
              const qs = new url.URL(req.url, 'http://localhost:3000').searchParams;
              const code = qs.get('code');
              console.log(`Code is ${code}`);
              res.end('Authentication successful! Please return to the console.');
              server.destroy();
              const r = await oAuth2Client.getToken(code);
              oAuth2Client.setCredentials(r.tokens);
              console.info('Tokens acquired.');
              resolve(oAuth2Client);
            }
          } catch (e) {
            reject(e);
          }
        })
        .listen(3000, () => {
          console.info(`Opening the browser with URL: ${authorizeUrl}`);
          open(authorizeUrl, {wait: false}).then(cp => cp.unref());
        });
      destroyer(server);
    });
  }

  async function runReport() {
    const oAuth2Client = await getOAuth2Client();
    const analyticsDataClient = getAnalyticsDataClient(oAuth2Client);
    const [response] = await analyticsDataClient.runReport({
      property: `properties/${propertyId}`,
      dateRanges: [{
        startDate: '2020-03-31',
        endDate: 'today',
      }],
      dimensions: [{
        name: 'city',
      }],
      metrics: [{
        name: 'activeUsers',
      }],
    });
    console.log('Report result:');
    response.rows.forEach(row => {
      console.log(row.dimensionValues[0], row.metricValues[0]);
    });
  }

  runReport();
  process.on('unhandledRejection', err => {
    console.error(err.message);
    process.exitCode = 1;
  });
}

main(...process.argv.slice(2));
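Once `runReport` resolves, each row carries parallel `dimensionValues` and `metricValues` arrays, and a small formatter makes the output easier to display on a page (a sketch against a hypothetical response object shaped like the API's):

```javascript
// Flatten GA4 runReport rows into plain { city, activeUsers } objects.
// `response` mirrors the shape returned by analyticsDataClient.runReport.
const formatRows = (response) =>
  (response.rows || []).map((row) => ({
    city: row.dimensionValues[0].value,
    activeUsers: Number(row.metricValues[0].value), // metric values arrive as strings
  }));

const sample = {
  rows: [{ dimensionValues: [{ value: 'Paris' }], metricValues: [{ value: '42' }] }],
};
console.log(formatRows(sample)); // [ { city: 'Paris', activeUsers: 42 } ]
```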
Please let me know how to get rid of this issue.
Regards,
Prabhash
Does it matter where I am calling the API from? For example, the region of my KMS key is the US but currently I'm calling it from the EU. Does that somehow affect decryption? I'm encrypting data fine, but decryption gives random output. Other than region I'm not sure what else could be causing the issue, so please have a look; any answer is appreciated.
Here is my code for reference:
import { DecryptCommandInput, KMS } from "@aws-sdk/client-kms";
import util from 'util'
import { kmsConfig } from "./constants";

export const region = kmsConfig.region;
export const kms = new KMS({
  region: region,
  apiVersion: "2014-11-01",
  credentials: {
    accessKeyId: kmsConfig.accesKeyId,
    secretAccessKey: kmsConfig.secretAccessKey,
  },
  // important for react-native
  endpoint: {
    hostname: "kms." + region + ".amazonaws.com",
    path: "",
    protocol: "https",
  }
});

export async function kmsEncryption(data) {
  // a client can be shared by different commands.
  try {
    let encryptionParams = {
      KeyId: kmsConfig.arn,
      Plaintext: data,
    };
    let kmsEncrypt = util.promisify(kms.encrypt).bind(kms);
    let encryptedData = await kmsEncrypt(encryptionParams);
    // encryptedData contains 2 parts, CiphertextBlob and KeyId
    console.log("Encrypted");
    return encryptedData;
  } catch (error) {
    console.log("\nerror => \n", error);
  }
}

export const kmsDecryption = async (encryptedData: any) => {
  try {
    let buff = Buffer.from(encryptedData.CiphertextBlob);
    let encryptedBase64data = buff.toString("base64");
    console.log("\nencryptedBase64data => \n", encryptedBase64data);
    let decryptionParams: DecryptCommandInput = {
      CiphertextBlob: encryptedData.CiphertextBlob,
    };
    let kmsDecrypt = util.promisify(kms.decrypt).bind(kms);
    let decryptedData = await kmsDecrypt(decryptionParams);
    // decryptedData contains 2 parts, Plaintext and KeyId
    console.log("\ndecryptedData => \n", decryptedData);
    console.log("\ndecryptedData.Plaintext => \n", decryptedData.Plaintext);
    console.log("\ndecryptedData.KeyId => \n", decryptedData.KeyId);
    let buff2 = Buffer.from(decryptedData.Plaintext, "base64");
    let originalText = buff2.toString();
    console.log("\noriginalText => \n", originalText);
    return originalText;
  } catch (error) {
    console.log("\ndecrypt error => \n", error);
  }
}
// (called from an async context)
let encode = await kmsEncryption("helloword1234");
let decode = await kmsDecryption(encode);
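For what it's worth, the caller's physical location shouldn't matter as long as the client's configured region matches the key's region; KMS ciphertext simply has to be decrypted in the region where the key lives. One detail worth checking instead: in the v3 SDK, `decryptedData.Plaintext` comes back as a `Uint8Array` of raw bytes rather than a base64 string, so it can be decoded directly. A minimal sketch of that decode step (the bytes below stand in for a real `Plaintext`):

```javascript
// In @aws-sdk/client-kms v3, Decrypt returns Plaintext as a Uint8Array of
// raw bytes (not a base64 string), so decode it directly as UTF-8:
const decodePlaintext = (plaintext) => Buffer.from(plaintext).toString('utf8');

// Stand-in bytes for a decrypted Plaintext of "helloword1234":
const fakePlaintext = new TextEncoder().encode('helloword1234');
console.log(decodePlaintext(fakePlaintext)); // prints "helloword1234"
```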
I am having a hard time understanding serverTimestamp in Firestore.
When I save a document to the database, whether in a Firebase function using FieldValue.serverTimestamp() or in JavaScript client code using serverTimestamp(), it doesn't always save the same thing in the database.
See screenshots below:
Sometimes I get an object with {nanoseconds: xxx, seconds: xxx} and sometimes I get a timestamp-formatted date...
The problem is when I try to query my orders using query(collectionRefOrders, orderBy('createdAt', 'desc'), limit(10)).
The orders with the object appear before the other ones, even if they were created after...
Any clue why this happens? What am I doing wrong?
Thanks a lot.
EDIT:
Here is the code I use to add documents in my Firebase function (it is a request function I call from a website):
const { getFirestore, FieldValue } = require('firebase-admin/firestore');
const firebaseDB = getFirestore();

exports.createOrderFromTunnel = functions.region('europe-west3')
  .runWith({
    timeoutSeconds: 10,
    memory: "4GB",
  })
  .https
  .onRequest(async (req, res) => {
    cors(req, res, async () => {
      try {
        const { apiKey } = req.body;
        const project = await getProjectFromApiKey(apiKey);
        if (!project) {
          return res.json({
            success: false,
            error: 'Unauthorized: invalid or missing api key'
          });
        }
        const contactData = {
          address: {},
          createdAt: FieldValue.serverTimestamp()
        };
        const orderData = {
          accounting: {
            totalHT: 0,
            totalTTC: 0,
            totalTVA: 0,
          },
          createdAt: FieldValue.serverTimestamp(),
          customer: {}, // initialized so customer.id can be set in the transaction
          status: 'NEW',
        };
        const refProject = firebaseDB
          .collection('projects')
          .doc(project.id);
        const colOrder = firebaseDB.collection(`projects/${project.id}/orders`);
        const refOrder = colOrder.doc();
        const colContact = firebaseDB.collection(`projects/${project.id}/contacts`);
        const refContact = colContact.doc();
        await firebaseDB.runTransaction(async transaction => {
          const snapProject = await transaction.get(refProject);
          const dataProject = snapProject.data();
          const sequenceContact = dataProject.sequenceContact;
          const sequenceOrder = dataProject.sequenceOrder;
          contactData.sequence = sequenceContact;
          orderData.sequenceNumber = sequenceOrder;
          transaction.set(refContact, contactData);
          orderData.customer.id = refContact.id;
          orderData.customer.sequence = sequenceContact;
          transaction.set(refOrder, orderData);
          transaction.update(refProject, {
            sequenceContact: sequenceContact + 1,
            sequenceOrder: sequenceOrder + 1,
            totalContacts: dataProject.totalContacts + 1,
            totalOrders: dataProject.totalOrders + 1,
          });
          return refOrder.id;
        });
        return res.json({
          success: true
        });
      } catch (err) {
        functions.logger.error(err);
        return res.json({
          success: false,
          err
        });
      }
    });
  });
Here is the code I use to add documents in my client code (it is a JavaScript web app):
const createOrder = async (projectId) => {
  try {
    const orderData = {
      accounting: {
        totalHT: 0,
        totalTTC: 0,
        totalTVA: 0,
      },
      createdAt: serverTimestamp(),
      status: 'NEW',
      surface: 0,
    };
    const refProject = doc(firebaseDB, 'projects', projectId);
    const colOrder = collection(firebaseDB, `projects/${projectId}/orders`);
    const refOrder = doc(colOrder);
    return await runTransaction(firebaseDB, async (transaction) => {
      const snapProject = await transaction.get(refProject);
      if (!snapProject.exists()) {
        throw "Document does not exist!";
      }
      const dataProject = snapProject.data();
      const sequence = dataProject.sequenceOrder;
      orderData.sequenceNumber = sequence;
      transaction.set(refOrder, orderData);
      transaction.update(refProject, {
        sequenceOrder: sequence + 1,
        totalOrders: dataProject.totalOrders + 1,
      });
      return refOrder.id;
    });
  } catch (e) {
    console.error(e);
    return null;
  }
};
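For reference, however the console happens to render the stored value, any client-side sorting can be made robust by normalizing `createdAt` first. A sketch, assuming values may arrive either as Timestamp-like objects with `seconds`/`nanoseconds` or as date strings:

```javascript
// Normalize a Firestore createdAt value to epoch milliseconds for sorting.
const toMillis = (createdAt) => {
  if (createdAt && typeof createdAt.seconds === 'number') {
    // Timestamp-like object ({ seconds, nanoseconds })
    return createdAt.seconds * 1000 + Math.floor((createdAt.nanoseconds || 0) / 1e6);
  }
  return new Date(createdAt).getTime(); // date string or Date
};

const orders = [
  { createdAt: { seconds: 1700000100, nanoseconds: 0 } },
  { createdAt: '2023-11-14T22:14:30.000Z' },
];
// Sort newest first regardless of which shape each value has
orders.sort((a, b) => toMillis(b.createdAt) - toMillis(a.createdAt));
```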
CONTEXT:
In my app, I have a feature that allows the user to upload a video. I noticed that when users try to upload large videos, the upload sometimes fails.
After a bit of research, I found out that for files larger than 100 MB I should use multipart upload.
So I have been following this tutorial to implement multipart upload in my app. And I reached Stage Three.
PART 1: Previous single part upload works fine
This is the implementation of a single part upload using pre-signed urls:
BACKEND
var AWS = require("aws-sdk");
const REGION = "*************************"; //e.g. "us-east-1"
const BUCKET_NAME = "l****************";
AWS.config.update({ region: REGION });
const s3 = new AWS.S3({
  signatureVersion: "v4",
  apiVersion: "2006-03-01",
});

var getVideoSignedUrl = async function (key) {
  return new Promise((resolve, reject) => {
    s3.getSignedUrl(
      "putObject",
      {
        Bucket: BUCKET_NAME,
        Key: key,
        ContentType: "video/*",
        ACL: "public-read",
        Expires: 300,
      },
      (err, url) => {
        if (err) {
          reject(err);
        } else {
          resolve(url);
        }
      }
    );
  });
};
exports.getVideoSignedUrl = getVideoSignedUrl;
FRONTEND
export const getVideoPreSignedUrl = async () =>
  await axios.get("/api/profile/getVideoPreSignedURL");

export const uploadVideoFileToCloud = async (file) => {
  const { data: uploadConfig } = await getVideoPreSignedUrl();
  await axios.put(uploadConfig.url, file, {
    headers: {
      "Content-Type": file.type,
      "x-amz-acl": "public-read",
    },
    transformRequest: (data, headers) => {
      delete headers.common["Authorization"];
      return data;
    },
  });
};
PART 2: Multipart upload which throws 403 forbidden error
BACKEND
var AWS = require("aws-sdk");
const REGION = "***********************"; //e.g. "us-east-1"
const BUCKET_NAME = "************************";
AWS.config.update({ region: REGION });
const s3 = new AWS.S3({
  signatureVersion: "v4",
  apiVersion: "2006-03-01",
});

// ==========================================================
// Replacing getVideoSignedUrl with initiateMultipartUpload
// That would generate a presigned url for every part
const initiateMultipartUpload = async (object_name) => {
  const params = {
    Bucket: BUCKET_NAME,
    Key: object_name,
    ContentType: "video/*",
    ACL: "public-read",
    Expires: 300,
  };
  const res = await s3.createMultipartUpload(params).promise();
  return res.UploadId;
};

const generatePresignedUrlsParts = async (object_name, number_of_parts) => {
  const upload_id = await initiateMultipartUpload(object_name);
  const baseParams = {
    Bucket: BUCKET_NAME,
    Key: object_name,
    UploadId: upload_id,
  };
  const promises = [];
  for (let index = 0; index < number_of_parts; index++) {
    promises.push(
      s3.getSignedUrlPromise("uploadPart", {
        ...baseParams,
        PartNumber: index + 1,
      })
    );
  }
  const res = await Promise.all(promises);
  const signed_urls = {};
  res.forEach((signed_url, i) => {
    signed_urls[i] = signed_url;
  });
  return signed_urls;
};

exports.initiateMultipartUpload = initiateMultipartUpload;
exports.generatePresignedUrlsParts = generatePresignedUrlsParts;
FRONTEND
This is where the error occurs. See const resParts = await Promise.all(promises)
export const getMultiPartVideoUploadPresignedUrls = async (number_of_parts) => {
  const request_params = {
    params: {
      number_of_parts,
    },
  };
  return await axios.get(
    "/api/profile/get_multi_part_video_upload_presigned_urls",
    request_params
  );
};

// Using multipart upload
export const uploadVideoFileToCloud = async (video_file, dispatch) => {
  // Each chunk is 100 MB
  const FILE_CHUNK_SIZE = 100_000_000;
  let video_size = video_file.size;
  let video_size_in_mb = Math.floor(video_size / 1000000);
  const number_of_parts = Math.floor(video_size_in_mb / 100) + 1;
  const response = await getMultiPartVideoUploadPresignedUrls(number_of_parts);
  const urls = response.data;
  console.log(
    "🚀 ~ file: profileActions.js ~ line 654 ~ uploadParts ~ urls",
    urls
  );
  // async function uploadParts(file: Buffer, urls: Record<number, string>) {
  // const axios = Axios.create()
  // delete axios.defaults.headers.put["Content-Type"];
  const keys = Object.keys(urls);
  const promises = [];
  for (const indexStr of keys) {
    const index = parseInt(indexStr);
    const start = index * FILE_CHUNK_SIZE;
    const end = (index + 1) * FILE_CHUNK_SIZE;
    // keys.length - 1 so only the LAST part slices to the end of the file
    const blob =
      index < keys.length - 1
        ? video_file.slice(start, end)
        : video_file.slice(start);
    console.log(
      "🚀 ~ file: profileActions.js ~ line 691 ~ uploadParts ~ urls[index]",
      urls[index]
    );
    console.log(
      "🚀 ~ file: profileActions.js ~ line 682 ~ uploadParts ~ blob",
      blob
    );
    const upload_params = {
      headers: {
        "Content-Type": video_file.type,
        "x-amz-acl": "public-read",
      },
      transformRequest: (data, headers) => {
        delete headers.common["Authorization"];
        return data;
      },
    };
    const axios_request = axios.put(urls[index], blob, upload_params);
    promises.push(axios_request);
    console.log(
      "🚀 ~ file: profileAction.helper.js ~ line 117 ~ uploadParts ~ promises",
      promises
    );
  }
  // Uploading video parts
  // This throws the 403 forbidden error
  const resParts = await Promise.all(promises);
  // This never gets logged
  console.log(
    "🚀 ~ file: profileAction.helper.js ~ line 124 ~ uploadParts ~ resParts",
    resParts
  );
  // return resParts.map((part, index) => ({
  //   ETag: (part as any).headers.etag,
  //   PartNumber: index + 1
  // }))
};
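The chunk arithmetic in uploadVideoFileToCloud can be checked in isolation (a sketch of the same 100 MB part-count math, as a pure helper):

```javascript
const FILE_CHUNK_SIZE = 100_000_000; // 100 MB per part, matching the upload code

// Compute the byte range of each part for a file of `fileSize` bytes,
// using the same number_of_parts formula as the upload code.
// Note: a file that is an exact multiple of 100 MB yields a final empty part.
const partRanges = (fileSize) => {
  const sizeInMb = Math.floor(fileSize / 1_000_000);
  const numberOfParts = Math.floor(sizeInMb / 100) + 1;
  return Array.from({ length: numberOfParts }, (_, i) => ({
    start: i * FILE_CHUNK_SIZE,
    end: Math.min((i + 1) * FILE_CHUNK_SIZE, fileSize),
  }));
};

console.log(partRanges(250_000_000).length); // 3 parts
console.log(partRanges(250_000_000)[2]); // { start: 200000000, end: 250000000 }
```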
This is the error that's logged:
PART 3: AWS Bucket & CORS policy:
CORS Policy:
[
  {
    "AllowedHeaders": [
      "*"
    ],
    "AllowedMethods": [
      "PUT",
      "POST",
      "GET"
    ],
    "AllowedOrigins": [
      "*"
    ],
    "ExposeHeaders": [],
    "MaxAgeSeconds": 3000
  }
]
Bucket policy hasn't been changed since I created the bucket and it's still empty by default:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Statement1",
      "Principal": {},
      "Effect": "Allow",
      "Action": [],
      "Resource": []
    }
  ]
}
So maybe I should add something here?
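One common multipart gotcha worth noting here: to complete a multipart upload the browser has to read each part's `ETag` from the PUT response, and that header is only visible cross-origin if CORS exposes it. A sketch of a CORS configuration that does (tighten `AllowedOrigins` to your own domain in production):

```json
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["PUT", "POST", "GET"],
    "AllowedOrigins": ["*"],
    "ExposeHeaders": ["ETag"],
    "MaxAgeSeconds": 3000
  }
]
```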
I also have all of these unchecked:
NOTES:
I tested multipart upload with files both smaller and larger than 100 MB, and it always throws the 403 forbidden error.
I don't understand why I would get a forbidden error when the single part upload works just fine. In other words, the upload is allowed, and if both single part and multipart upload use the same credentials, that forbidden error should not occur.
I have a piece of code that shows me the progress of the upload, and I see the upload progressing. The error seems to occur AFTER the upload of EACH PART is done: