I found these informative threads:
C# solution to encode/decode a Unicode string:
How do you convert Byte Array to Hexadecimal String, and vice versa?
JavaScript solution to encode/decode a Unicode string:
Javascript: Unicode string to hex
But the two solutions order the bytes differently.
Example JavaScript (code 1:1 from the link above):
var str = "그러하지";
hex = str.hexEncode(); // returns "adf8b7ecd558c9c0"
Example C# (I tried 2 solutions, both with the same result):
/// <summary>
/// Convert a string to hex value
/// </summary>
/// <param name="stringValue"></param>
/// <returns></returns>
public string HexEncode(string stringValue) // requires using System.Text;
{
    var ba = Encoding.Unicode.GetBytes(stringValue);

    // SOLUTION 1
    //var c = new char[ba.Length * 2];
    //for (var i = 0; i < ba.Length; i++)
    //{
    //    var b = ba[i] >> 4;
    //    c[i * 2] = (char)(55 + b + (((b - 10) >> 31) & -7));
    //    b = ba[i] & 0xF;
    //    c[i * 2 + 1] = (char)(55 + b + (((b - 10) >> 31) & -7));
    //}
    //return new string(c);

    // SOLUTION 2
    var hex = new StringBuilder(ba.Length * 2);
    foreach (var b in ba)
        hex.AppendFormat("{0:x2}", b);
    return hex.ToString();
}
/// <summary>
/// Converts a hex value to a string
/// </summary>
/// <param name="hexString"></param>
/// <returns></returns>
public string HexDecode(string hexString)
{
    if (hexString == null || (hexString.Length & 1) == 1) return "";

    // SOLUTION 1
    //hexString = hexString.ToUpper();
    //var hexStringLength = hexString.Length;
    //var b = new byte[hexStringLength / 2];
    //for (var i = 0; i < hexStringLength; i += 2)
    //{
    //    var topChar = (hexString[i] > 0x40 ? hexString[i] - 0x37 : hexString[i] - 0x30) << 4;
    //    var bottomChar = hexString[i + 1] > 0x40 ? hexString[i + 1] - 0x37 : hexString[i + 1] - 0x30;
    //    b[i / 2] = Convert.ToByte(topChar + bottomChar);
    //}

    // SOLUTION 2
    var numberChars = hexString.Length;
    var bytes = new byte[numberChars / 2];
    for (var i = 0; i < numberChars; i += 2)
        bytes[i / 2] = Convert.ToByte(hexString.Substring(i, 2), 16);
    return Encoding.Unicode.GetString(bytes);
}
var hex = tools.HexEncode("그러하지"); // returns "f8adecb758d5c0c9"
var str = tools.HexDecode(hex);      // returns "그러하지"
JS: adf8 b7ec d558 c9c0
C#: f8ad ecb7 58d5 c0c9
So the byte order within each character is swapped.
Both encode and decode work as long as I stay in the same environment. But I need to encode in JS and decode in C#, and vice versa.
I do not know which one is the correct one, if "correct" can even be defined here.
And how do I fix this?
Both values are correct. It's just that your JavaScript solution gives you the UTF-16 bytes in big-endian order, while C#'s Encoding.Unicode gives them in little-endian order (MSDN article, see the Remarks section).
To make the C# byte array match your JavaScript output, define your encoding like this:
UnicodeEncoding bigEndianUnicode = new UnicodeEncoding(true, true);
And later use it like this:
var ba = bigEndianUnicode.GetBytes(stringValue);
Demo: .Net Fiddle
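Alternatively, the JavaScript side can be made explicit about byte order. A minimal sketch (the helper names are mine, not from the linked post): encode each UTF-16 code unit as four hex digits in big-endian order, which is exactly what new UnicodeEncoding(true, true) produces on the C# side.

function hexEncodeUtf16BE(str) {
    var hex = "";
    for (var i = 0; i < str.length; i++) {
        // charCodeAt returns the raw UTF-16 code unit (surrogate halves included),
        // so zero-padding to 4 hex digits yields the big-endian byte pair
        hex += str.charCodeAt(i).toString(16).padStart(4, "0");
    }
    return hex;
}

function hexDecodeUtf16BE(hex) {
    var str = "";
    for (var i = 0; i < hex.length; i += 4) {
        str += String.fromCharCode(parseInt(hex.slice(i, i + 4), 16));
    }
    return str;
}

console.log(hexEncodeUtf16BE("그러하지"));          // "adf8b7ecd558c9c0"
console.log(hexDecodeUtf16BE("adf8b7ecd558c9c0")); // "그러하지"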
I'm not able to get the same djb hash in JavaScript that I was getting in Swift.
extension String {
    public func djbHash() -> Int {
        return self.utf8
            .map { return $0 }
            .reduce(5381) {
                let h = ($0 << 5) &+ $0 &+ Int($1)
                print("h", h)
                return h
            }
    }
}
var djbHash = function (string) {
    var h = 5381; // our hash
    var i = 0; // our iterator
    for (i = 0; i < string.length; i++) {
        var ascii = string.charCodeAt(i); // grab ASCII integer
        h = (h << 5) + h + ascii; // bitwise operations
    }
    return h;
}
I tried using BigInt, but the value for the string "QHChLUHDMNh5UTBUcgtLmlPziN42" I'm getting is 17760568308754997342052348842020823769412069976n, compared to 357350748206983768 in Swift.
The Swift &+ operator is an “overflow operator”: It truncates the result of the addition to the available number of bits for the used integer type.
A Swift Int is a 64-bit (signed) integer on all 64-bit platforms, and adding two integers would crash with a runtime exception if the result does not fit into an Int:
let a: Int = 0x7ffffffffffffff0
let b: Int = 0x7ffffffffffffff0
print(a + b) // 💣 Swift runtime failure: arithmetic overflow
With &+ the result is truncated to 64-bit:
let a: Int = 0x7ffffffffffffff0
let b: Int = 0x7ffffffffffffff0
print(a &+ b) // -32
In order to get the same result with JavaScript and BigInt one can use the BigInt.asIntN() function:
var a = 0x7ffffffffffffff0n
var b = 0x7ffffffffffffff0n
console.log(a + b) // 18446744073709551584n
console.log(BigInt.asIntN(64, a+b)) // -32n
With that change, the JavaScript function gives the same result as your Swift code:
var djbHash = function (string) {
    var h = 5381n; // our hash
    var i = 0; // our iterator
    for (i = 0; i < string.length; i++) {
        var code = string.charCodeAt(i); // grab the UTF-16 code unit
        h = BigInt.asIntN(64, (h << 5n) + h + BigInt(code)); // bitwise operations, truncated to 64 bits
    }
    return h;
}
console.log(djbHash("QHChLUHDMNh5UTBUcgtLmlPziN42")) // 357350748206983768n
As mentioned in the comments to the other answer, charCodeAt() returns UTF-16 code units, whereas your Swift function works with the UTF-8 representation of the string. So this will still give different results for strings containing any non-ASCII characters.
For identical results for arbitrary strings (umlauts, emojis, flags, ...) it's best to work with the Unicode code points. In Swift that would be
extension String {
    public func djbHash() -> Int {
        return self.unicodeScalars
            .reduce(5381) { ($0 << 5) &+ $0 &+ Int($1.value) }
    }
}
print("äöü€😀🚩".djbHash()) // 6958626281456
(You may also consider using Int64 instead of Int for platform-independent code, or Int32 if a 32-bit hash is sufficient.)
The corresponding JavaScript code is
var djbHash = function (string) {
    var h = 5381n; // our hash
    for (const codePoint of string) {
        h = BigInt.asIntN(64, (h << 5n) + h + BigInt(codePoint.codePointAt(0))); // bitwise operations
    }
    return h;
}
console.log(djbHash("äöü€😀🚩")) // 6958626281456n
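If a 32-bit hash is sufficient (the Int32 case mentioned above), BigInt can be avoided entirely. A sketch of my own, not from the original answer: Math.imul gives the 32-bit truncated product and | 0 truncates the addition, together emulating overflow arithmetic on a 32-bit integer.

var djbHash32 = function (string) {
    var h = 5381;
    for (const codePoint of string) {
        // h * 33 === (h << 5) + h; Math.imul and | 0 keep everything in 32-bit range
        h = (Math.imul(h, 33) + codePoint.codePointAt(0)) | 0;
    }
    return h;
};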
I've had a similar issue trying to mimic Swift's &+ overflow operator in JavaScript. I think the code below should work. It's still under review, but you can check out my post.
var djbHash = function (string) {
    var h = 5381; // our hash
    var i = 0; // our iterator
    for (i = 0; i < string.length; i++) {
        var ascii = string.charCodeAt(i); // grab ASCII integer
        h = ((h << 5) + h + ascii) | 0; // | 0 truncates the sum to 32 bits, since JS has no &+ operator
    }
    return h;
}
So I've already found posts on this topic, but they didn't really help me. For example, I have tried something like this, but it is not quite right in my case (C#):
string temp;
foreach (var a in chaine)
    temp = Convert.ToUInt16(a).ToString("X4");

for (j = 0; j < intlenght; j += 1)
{
    arrayData[j + 1] = temp;
}
The reason I think it doesn't really work is that my starting point looks a little different from the examples, and I'm not really familiar with JavaScript. My starting point looks like this (JavaScript):
for (j = 0; j < intlenght; j += 1)
{
    arrayData[j + 1] = x.charCodeAt(j) - 32;
}
The x in this case is actually
var x = document.getElementById("textIn").value;
but in my method I have a string parameter instead of the x.
So how can I correctly translate
arrayData[j + 1] = x.charCodeAt(j) - 32;
into C#? In the end I need this in my method for a Code 128 encoder.
EDIT for better understanding:
So I have a TextBlock in my window whose text is rendered in a Code 128 barcode font. However, this barcode cannot be scanned yet. What I want to do is add the barcode's additional characters (start code, checksum, stop code) so that in the end the barcode can be read by a scanning program. For that I came across this Stack Overflow answer: https://stackoverflow.com/a/60363928/17667316
However, the problem is that the code there is in JavaScript, not C#. Since I've only found solutions that rely on libraries and NuGet packages (which I want to avoid), I tried to convert this JavaScript code to C#. I keep coming across lines like this (JavaScript):
arrayData[j + 1] = x.charCodeAt(j) - 32;
I haven't found a way to convert this JavaScript code:
var buttonGen = document.getElementById("btnGen");
buttonGen.onclick = function () {
    var x = document.getElementById("textIn").value;
    var i, j, intWeight, intLength, intWtProd = 0, arrayData = [], fs;
    var arraySubst = [ "Ã", "Ä", "Å", "Æ", "Ç", "È", "É", "Ê" ];
    /*
     * Checksum Calculation for Code 128 B
     */
    intLength = x.length;
    arrayData[0] = 104; // Assume Code 128B, Will revise to support A, C and switching.
    intWtProd = 104;
    for (j = 0; j < intLength; j += 1) {
        arrayData[j + 1] = x.charCodeAt(j) - 32;   // Have to convert to Code 128 encoding
        intWeight = j + 1;                         // to generate the checksum
        intWtProd += intWeight * arrayData[j + 1]; // Just a weighted sum
    }
    arrayData[j + 1] = intWtProd % 103; // Modulo 103 on weighted sum
    arrayData[j + 2] = 106; // Code 128 Stop character
    chr = parseInt(arrayData[j + 1], 10); // Gotta convert from character to a number
    if (chr > 94) {
        chrString = arraySubst[chr - 95];
    } else {
        chrString = String.fromCharCode(chr + 32);
    }
    // Change the font-size style to match the drop down
    fs = document.getElementsByTagName("option")[document.getElementById("selList").selectedIndex].value;
    document.getElementById("test").style.fontSize = fs + 'px';
    document.getElementById("check").innerHTML =
        'Checksum = ' + chr + ' or character ' + // Make It Visual
        chrString + ', for text = "' + x + '"';
    document.getElementById("test").innerHTML =
        'Ì' +       // Start Code B
        x +         // The originally typed string
        chrString + // The generated checksum
        'Î';        // Stop Code
}
into working C# code.
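Note that x.charCodeAt(j) simply returns the UTF-16 code of the j-th character, so the C# equivalent of x.charCodeAt(j) - 32 is just x[j] - 32 (indexing a string yields a char, which converts implicitly to an integer). It may also help to first isolate the checksum logic from the DOM code; here is a DOM-free sketch of the framing above (the function name and structure are mine, not from the linked answer), which then translates to C# almost line by line:

function code128B(x) {
    var arraySubst = ["Ã", "Ä", "Å", "Æ", "Ç", "È", "É", "Ê"];
    var intWtProd = 104; // start value and first checksum term for Code 128B
    for (var j = 0; j < x.length; j += 1) {
        var code = x.charCodeAt(j) - 32; // Code 128 value of each character
        intWtProd += (j + 1) * code;     // weighted sum for the checksum
    }
    var chr = intWtProd % 103;
    var chrString = chr > 94 ? arraySubst[chr - 95] : String.fromCharCode(chr + 32);
    return 'Ì' + x + chrString + 'Î'; // Start Code B + text + checksum + Stop Code
}

console.log(code128B("Hello")); // render this string in the Code 128 font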
During my research I found the following information, but it doesn't seem to really match my problem:
http://www.cplusplus.com/forum/beginner/31776/
Base 10 to base n conversions
https://cboard.cprogramming.com/cplusplus-programming/83808-base10-base-n-converter.html
So I'd like to implement a custom Base64 to BaseN encoding and decoding using C++.
I should be able to convert a Base64 string like "IloveC0mpil3rs" to a custom-base string (e.g. base 4) like "10230102010301" and back again.
Additionally, I should be able to use a custom charset (alphabet) for the digit values; the default one would presumably be "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ".
So I should also be able to use a shuffled one like this (a kind of encoding :)): "J87opBEyWwDQdNAYujzshP3LOx1T0XK2e+ZrvFnticbCS64a9/Il5GmgVkqUfRMH".
I thought about translating the convertBase function below from JavaScript into C++, but I'm obviously a beginner and ran into big problems: I got stuck because my code is not working as expected and I cannot find the error:
string encoded = convertBase("Test", 64, 4); // gets 313032130131000
cout << encoded << endl;
string decoded = convertBase(encoded, 4, 64); // error
cout << decoded << endl;
C++ code: (not working)
std::string convertBase(string value, int from_base, int to_base) {
    string range = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ+/";
    string from_range = range.substr(0, from_base),
           to_range = range.substr(0, to_base);
    int dec_value = 0;
    int index = 0;
    string reversed(value.rbegin(), value.rend());
    for (std::string::iterator it = reversed.begin(); it != reversed.end(); ++it) {
        index++;
        char digit = *it;
        if (!range.find(digit)) return "error";
        dec_value += from_range.find(digit) * pow(from_base, index);
    }
    string new_value = "";
    while (dec_value > 0) {
        new_value = to_range[dec_value % to_base] + new_value;
        dec_value = (dec_value - (dec_value % to_base)) / to_base;
    }
    return new_value;
}
JavaScript code: (working)
function convertBase(value, from_base, to_base) {
    var range = '0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ+/'.split('');
    var from_range = range.slice(0, from_base);
    var to_range = range.slice(0, to_base);
    var dec_value = value.split('').reverse().reduce(function (carry, digit, index) {
        if (from_range.indexOf(digit) === -1) throw new Error('Invalid digit `' + digit + '` for base ' + from_base + '.');
        return carry += from_range.indexOf(digit) * (Math.pow(from_base, index));
    }, 0);
    var new_value = '';
    while (dec_value > 0) {
        new_value = to_range[dec_value % to_base] + new_value;
        dec_value = (dec_value - (dec_value % to_base)) / to_base;
    }
    return new_value || '0';
}

let encoded = convertBase("Test", 64, 4);
console.log(encoded);
let decoded = convertBase(encoded, 4, 64);
console.log(decoded);
Any help fixing my code would be very much appreciated!
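As an aside, note that both versions funnel the whole input through a single native integer, and a JavaScript double is only exact up to 2^53 - 1 (a C++ int overflows even sooner), so longer inputs such as "IloveC0mpil3rs" in base 64 silently lose digits. Here is a sketch of the same algorithm with BigInt (my own adaptation, not part of the question); swapping in the shuffled alphabet is then just a matter of replacing range:

function convertBaseBig(value, from_base, to_base) {
    var range = '0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ+/';
    var from_range = range.slice(0, from_base);
    var to_range = range.slice(0, to_base);

    // Horner's method replaces the reversed pow() loop
    var dec_value = 0n;
    for (const digit of value) {
        var idx = from_range.indexOf(digit);
        if (idx === -1) throw new Error('Invalid digit `' + digit + '` for base ' + from_base + '.');
        dec_value = dec_value * BigInt(from_base) + BigInt(idx);
    }

    var new_value = '';
    while (dec_value > 0n) {
        new_value = to_range[Number(dec_value % BigInt(to_base))] + new_value;
        dec_value /= BigInt(to_base); // BigInt division truncates
    }
    return new_value || '0';
}

console.log(convertBaseBig(convertBaseBig("IloveC0mpil3rs", 64, 4), 4, 64)); // "IloveC0mpil3rs"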
I am trying to sign messages in JavaScript before sending them to a PHP application.
The PHP application must check the signature to make sure it is not forged.
In JavaScript I use cryptico.js.
This is the JS function for signing messages:
var sign = function(passphrase, text) {
    var signingkey = cryptico.generateRSAKey(passphrase, 2048);
    var signString = cryptico.b16to64(signingkey.signString(text, "sha256"));
    return signString;
}
This is the function for getting the public key:
var getPublicKey = function(passphrase) {
    var rsaKey = cryptico.generateRSAKey(passphrase, 2048);
    return cryptico.publicKeyString(rsaKey);
}
For example, for the message "message" and the passphrase "test2", the public key and signature are:
qH/J3/gvF/h5U02uPyC9Qzn/hHEV5DzB9nFfqk5zbQqHdInVe4sfL+npa+4fjLGrBU30Iuvcr+o9paEjzpH5dY48cq6JHqz1RyJ0CQIc2Jr5+sS4eL1ZIjxWlyN1pKMR+4aE2rlDAad56Ad1cytiaHuVvyK/gdtbKiuGroSQhJ1EVfZ60m3NIqnqmpi5Zdsnmzny4VH/d66BcGXxGaGaUaqFn0WTypuwIMZMMtzZEK7peKoaW4H4rfkfdrKcD8AaT+z9v5lLGkTl0NcZZ4LN9sSUzsHNfyAFK6cSXo/73z0tDAlGb5K+yWV6UHoYW1rcoIsxlNRZM6/6FYgMXbbfow==
XbF4O6v6oadEQGtdpQ7d54Q2JB9/ZEXEUH3S1FMn4E/PSqk7HLXjG4tNfuiUBa5eS8kYV49gwC8Yr+mn6YUAHt+K9lHPSsmltWoiHNOaas4aqai9nlyeft4TYYhP+GYbQfw+3n2TcO39s6M0vw0m0a8AX9JfF02JwCUhP4bu4dzG6Bl5dj000TbUkric14Jyurp8OHmmMvKW62TvXPhNOW39+wS1Qkfn9Bxmzi8UEVSVe3wP45JWZNgmgeGnpubDhD05FJEDErfVtZ/DRKD81q5YRd4X4cCkeDPDcJLgKW1jkCsA7yBqESXPDSkkrVUM06A9qMFUwk4mRI88fZ8ryQ==
How can I verify it in PHP?
I tried something like:
$rsa = new Crypt_RSA();
$rsa->loadKey('qH/J3/gvF/h5U02uPyC9Qzn/hHEV5DzB9nFfqk5zbQqHdInVe4sfL+npa+4fjLGrBU30Iuvcr+o9paEjzpH5dY48cq6JHqz1RyJ0CQIc2Jr5+sS4eL1ZIjxWlyN1pKMR+4aE2rlDAad56Ad1cytiaHuVvyK/gdtbKiuGroSQhJ1EVfZ60m3NIqnqmpi5Zdsnmzny4VH/d66BcGXxGaGaUaqFn0WTypuwIMZMMtzZEK7peKoaW4H4rfkfdrKcD8AaT+z9v5lLGkTl0NcZZ4LN9sSUzsHNfyAFK6cSXo/73z0tDAlGb5K+yWV6UHoYW1rcoIsxlNRZM6/6FYgMXbbfow=='); // public key
echo $rsa->verify('message', 'XbF4O6v6oadEQGtdpQ7d54Q2JB9/ZEXEUH3S1FMn4E/PSqk7HLXjG4tNfuiUBa5eS8kYV49gwC8Yr+mn6YUAHt+K9lHPSsmltWoiHNOaas4aqai9nlyeft4TYYhP+GYbQfw+3n2TcO39s6M0vw0m0a8AX9JfF02JwCUhP4bu4dzG6Bl5dj000TbUkric14Jyurp8OHmmMvKW62TvXPhNOW39+wS1Qkfn9Bxmzi8UEVSVe3wP45JWZNgmgeGnpubDhD05FJEDErfVtZ/DRKD81q5YRd4X4cCkeDPDcJLgKW1jkCsA7yBqESXPDSkkrVUM06A9qMFUwk4mRI88fZ8ryQ==') ? 'verified' : 'unverified';
I think the signature and/or public key are not formatted correctly for PHP. Any idea?
Thank you in advance,
[EDIT]
I'm not sure the signature is correct. If I use the JS function cryptico.b64to16(signature), the signature will be something like:
5db1783babfaa1a744406b5da50edde78436241f7f6445c4507dd2d45327e04fcf4aa93b1cb5e31b8b4d7ee89405ae5e4bc918578f60c02f18afe9a7e985001edf8af651cf4ac9a5b56a221cd39a6ace1aa9a8bd9e5c9e7ede1361884ff8661b41fc3ede7d9370edfdb3a334bf0d26d1af005fd25f174d89c025213f86eee1dcc6e81979763d34d136d492b89cd78272baba7c3879a632f296eb64ef5cf84d396dfdfb04b54247e7f41c66ce2f141154957b7c0fe3925664d82681e1a7a6e6c3843d3914910312b7d5b59fc344a0fcd6ae5845de17e1c0a47833c37092e0296d63902b00ef206a1125cf0d2924ad550cd3a03da8c154c24e26448f3c7d9f2bc9
I am not sure about the format of the key parameter of $rsa->verify. I tried adding the ssh-rsa prefix, but that does not work any better.
So I tried both signature formats and both key formats; the message is "unverified" every time.
Thanks @neubert, it's due to the PSS signature.
So here is the solution.
I use:
phpseclib: the PHP lib used to validate the message
jsrsasign: the JS lib used to sign the message
jsencrypt: the JS lib used to create the private and public keys
First, generate the keys in JS:
var crypt = new JSEncrypt({default_key_size: 512});
var key = crypt.getKey();
var publicKey = key.getPublicKey();
var privateKey = key.getPrivateKey();
Secondly, create the signature:
var rsa = new RSAKey();
rsa.readPrivateKeyFromPEMString(privateKey);
var hSig = rsa.signStringPSS('message', 'sha1');
var signature = linebrk(hSig, 64);
console.log(signature);
By default the signature is not in the right format, so we have to encode hSig in base 64 with the b16to64 function below (base64Chars is its alphabet):
var base64Chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';

var b16to64 = function(h) {
    var i;
    var c;
    var ret = "";
    if (h.length % 2 == 1) {
        h = "0" + h;
    }
    for (i = 0; i + 3 <= h.length; i += 3) {
        c = parseInt(h.substring(i, i + 3), 16);
        ret += base64Chars.charAt(c >> 6) + base64Chars.charAt(c & 63);
    }
    if (i + 1 == h.length) {
        c = parseInt(h.substring(i, i + 1), 16);
        ret += base64Chars.charAt(c << 2);
    }
    else if (i + 2 == h.length) {
        c = parseInt(h.substring(i, i + 2), 16);
        ret += base64Chars.charAt(c >> 2) + base64Chars.charAt((c & 3) << 4);
    }
    while ((ret.length & 3) > 0) ret += "=";
    return ret;
}
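Putting the pieces together (a usage sketch of my own; the variable names follow the snippets above):

// convert the hex signature from signStringPSS to base64 for the PHP side
var signatureB64 = b16to64(hSig);
// send signatureB64 together with publicKey to the PHP application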
Finally, we validate in PHP. We assume the signature and the public key are stored in variables of the same name:
$rsa = new Crypt_RSA();
$rsa->loadKey($publickey); // public key;
echo $rsa->verify('message', base64_decode($signature)) ? 'verified' : 'unverified';
I am trying without much success to convert a very large hex number to decimal.
My problem is that using decimal = parseInt(hex, 16) gives me errors in the number when I try to convert a hex number above 14 digits.
I have no problem with this in Java, but JavaScript does not seem to be accurate above 14 hex digits.
I have tried "BigNumber", but this gives me the same erroneous result.
I have trawled the web to the best of my ability and found web sites that will do the conversion, but I cannot figure out how to do the conversion longhand.
I have tried getting each character in turn and multiplying it by its place value, i.e. for 123456789abcdef:
15 * Math.pow(16, 0) + 14 * Math.pow(16, 1) ... etc., but I think (being a noob) that my subroutines may not have been all they should be, because I got a completely (and I mean really!) different answer.
If it helps you guys, I can post what I have written so far for you to look at, but I am hoping someone has a simple answer for me.
<script>
function Hex2decimal(hex) {
    var stringLength = hex.length;
    var characterPosition = stringLength;
    var character;
    var hexChars = new Array();
    hexChars[0] = "0";
    hexChars[1] = "1";
    hexChars[2] = "2";
    hexChars[3] = "3";
    hexChars[4] = "4";
    hexChars[5] = "5";
    hexChars[6] = "6";
    hexChars[7] = "7";
    hexChars[8] = "8";
    hexChars[9] = "9";
    hexChars[10] = "a";
    hexChars[11] = "b";
    hexChars[12] = "c";
    hexChars[13] = "d";
    hexChars[14] = "e";
    hexChars[15] = "f";
    var index = 0;
    var hexChar;
    var result;
    // document.writeln(hex);
    while (characterPosition >= 0) {
        // document.writeln(characterPosition);
        character = hex.charAt(characterPosition);
        while (index < hexChars.length) {
            // document.writeln(index);
            document.writeln("String Character = " + character);
            hexChar = hexChars[index];
            document.writeln("Hex Character = " + hexChar);
            if (hexChar == character) {
                result = hexChar;
                document.writeln(result);
            }
            index++;
        }
        // document.write(character);
        characterPosition--;
    }
    return result;
}
</script>
Thank you.
Paul
The New 'n' Easy Way
var hex = "7FDDDDDDDDDDDDDDDDDDDDDD";
if (hex.length % 2) { hex = '0' + hex; }
var bn = BigInt('0x' + hex);
var d = bn.toString(10);
BigInts are now available in most browsers (except IE).
(From an earlier version of this answer: BigInts are now available in both node.js and Chrome; Firefox shouldn't be far behind.)
If you need to deal with negative numbers, that requires a bit of work:
How to handle Signed JS BigInts
Essentially:
function hexToBn(hex) {
    if (hex.length % 2) {
        hex = '0' + hex;
    }

    var highbyte = parseInt(hex.slice(0, 2), 16);
    var bn = BigInt('0x' + hex);

    if (0x80 & highbyte) {
        // You'd think `bn = ~bn;` would work... but it doesn't
        // manually perform two's complement (flip bits, add one)
        // (because JS binary operators are incorrect for negatives)
        bn = BigInt('0b' + bn.toString(2).split('').map(function (i) {
            return '0' === i ? 1 : 0;
        }).join('')) + BigInt(1);
        bn = -bn;
    }

    return bn;
}
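A quick usage sketch (my own examples, assuming the hexToBn above): when the high bit of the leading byte is set, the value is read as two's complement.

console.log(hexToBn('ff').toString());   // "-1" (high bit set, interpreted as negative)
console.log(hexToBn('00ff').toString()); // "255" (leading zero byte keeps it positive)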
Ok, let's try this:
function h2d(s) {
    function add(x, y) {
        var c = 0, r = [];
        var x = x.split('').map(Number);
        var y = y.split('').map(Number);
        while (x.length || y.length) {
            var s = (x.pop() || 0) + (y.pop() || 0) + c;
            r.unshift(s < 10 ? s : s - 10);
            c = s < 10 ? 0 : 1;
        }
        if (c) r.unshift(c);
        return r.join('');
    }

    var dec = '0';
    s.split('').forEach(function(chr) {
        var n = parseInt(chr, 16);
        for (var t = 8; t; t >>= 1) {
            dec = add(dec, dec);
            if (n & t) dec = add(dec, '1');
        }
    });
    return dec;
}
Test:
t = 'dfae267ab6e87c62b10b476e0d70b06f8378802d21f34e7'
console.log(h2d(t))
prints
342789023478234789127089427304981273408912349586345899239
which is correct (feel free to verify).
Notice that "0x" + "ff" will be treated as 255, so convert your hex value to a string and prepend "0x".
function Hex2decimal(hex)
{
return ("0x" + hex) / 1;
}
If you are using the '0x' notation for your Hex String, don't forget to add s = s.slice(2) to remove the '0x' prefix.
Keep in mind that JavaScript only has a single numeric type (double), and does not provide any separate integer types. So it may not be possible for it to store exact representations of your numbers.
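A quick illustration of where the double runs out (not from the original answer; 2^53 - 1 is the largest integer a double represents exactly):

console.log(parseInt("1fffffffffffff", 16)); // 9007199254740991 (2^53 - 1, still exact)
console.log(parseInt("20000000000001", 16)); // 9007199254740992 (wrong: the true value ends in ...93)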
In order to get exact results you need to use a library for arbitrary-precision integers, such as BigInt.js. For example, the code:
var x = str2bigInt("5061756c205768697465",16,1,1);
var s = bigInt2str(x, 10);
$('#output').text(s);
Correctly converts 0x5061756c205768697465 to the expected result of 379587113978081151906917.
Here is a jsfiddle if you would like to experiment with the code listed above.
The BigInt constructor can take a hex string as argument:
/** @param hex = "a83b01cd..." */
function Hex2decimal(hex) {
    return BigInt("0x" + hex).toString(10);
}
Usage:
Hex2decimal("100");
Output:
256
A rip-off from the other answer, but without the meaningless 0 padding =P