Equivalent of Swift &+ in JavaScript

I'm not able to get the same djbhash in JavaScript that I was getting in Swift.
extension String {
    public func djbHash() -> Int {
        return self.utf8
            .map { return $0 }
            .reduce(5381) {
                let h = ($0 << 5) &+ $0 &+ Int($1)
                print("h", h)
                return h
            }
    }
}
var djbHash = function (string) {
    var h = 5381; // our hash
    var i = 0; // our iterator
    for (i = 0; i < string.length; i++) {
        var ascii = string.charCodeAt(i); // grab ASCII integer
        h = (h << 5) + h + ascii; // bitwise operations
    }
    return h;
}
I tried using BigInt, but the value for the string "QHChLUHDMNh5UTBUcgtLmlPziN42" I'm getting is 17760568308754997342052348842020823769412069976n, compared to 357350748206983768 in Swift.

The Swift &+ operator is an “overflow operator”: it truncates the result of the addition to the number of bits available in the integer type being used.
A Swift Int is a 64-bit (signed) integer on all 64-bit platforms, and adding two integers would crash with a runtime exception if the result does not fit into an Int:
let a: Int = 0x7ffffffffffffff0
let b: Int = 0x7ffffffffffffff0
print(a + b) // 💣 Swift runtime failure: arithmetic overflow
With &+ the result is truncated to 64-bit:
let a: Int = 0x7ffffffffffffff0
let b: Int = 0x7ffffffffffffff0
print(a &+ b) // -32
In order to get the same result with JavaScript and BigInt one can use the BigInt.asIntN() function:
var a = 0x7ffffffffffffff0n
var b = 0x7ffffffffffffff0n
console.log(a + b) // 18446744073709551584n
console.log(BigInt.asIntN(64, a+b)) // -32n
With that change, the JavaScript function gives the same result as your Swift code:
var djbHash = function (string) {
    var h = 5381n; // our hash
    var i = 0; // our iterator
    for (i = 0; i < string.length; i++) {
        var code = string.charCodeAt(i); // grab UTF-16 code unit
        h = BigInt.asIntN(64, (h << 5n) + h + BigInt(code)); // bitwise operations, truncated to 64 bits
    }
    return h;
}
console.log(djbHash("QHChLUHDMNh5UTBUcgtLmlPziN42")) // 357350748206983768n
As mentioned in the comments to the other answer, charCodeAt() returns UTF-16 code units, whereas your Swift function works with the UTF-8 representation of the string. So this will still give different results for strings containing any non-ASCII characters.
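To see the difference, compare what each side actually iterates over for a non-ASCII character (a small sketch, assuming an environment that provides TextEncoder, such as a browser or Node.js):
const ch = "ä";
console.log(ch.charCodeAt(0));                  // 228 – one UTF-16 code unit
console.log([...new TextEncoder().encode(ch)]); // [195, 164] – two UTF-8 bytes
The UTF-8-based Swift hash folds in 195 and then 164, while the charCodeAt version folds in 228, so the two results diverge.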
For identical results for arbitrary strings (umlauts, Emojis, flags, ...) it's best to work with the Unicode code points. In Swift that would be
extension String {
    public func djbHash() -> Int {
        return self.unicodeScalars
            .reduce(5381) { ($0 << 5) &+ $0 &+ Int($1.value) }
    }
}
print("äöü€😀🚩".djbHash()) // 6958626281456
(You may also consider using Int64 instead of Int for platform-independent code, or Int32 if a 32-bit hash is sufficient.)
The corresponding JavaScript code is
var djbHash = function (string) {
    var h = 5381n; // our hash
    for (const codePoint of string) {
        h = BigInt.asIntN(64, (h << 5n) + h + BigInt(codePoint.codePointAt(0))); // bitwise operations
    }
    return h;
}
console.log(djbHash("äöü€😀🚩")) // 6958626281456n
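If a 32-bit hash is sufficient (the Int32 variant mentioned above), BigInt is not needed at all, since JavaScript's bitwise operators already truncate to signed 32-bit integers. A rough sketch of that variant (not part of the original answer, and assuming Int32 semantics on the Swift side):
var djbHash32 = function (string) {
    var h = 5381;
    for (const codePoint of string) {
        // << and | 0 both coerce to a signed 32-bit integer,
        // which mirrors Swift's Int32 with &+ for this hash
        h = ((h << 5) + h + codePoint.codePointAt(0)) | 0;
    }
    return h;
}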

I've had a similar issue, in which I used & in combination with the operator being used. I think the code below should work. It's still under review, but you can check out my post.
var djbHash = function (string) {
    var h = 5381; // our hash
    var i = 0; // our iterator
    for (i = 0; i < string.length; i++) {
        var ascii = string.charCodeAt(i); // grab ASCII integer
        h = (h << 5) + h &+ ascii; // bitwise operations
    }
    return h;
}

Related

How to reverse Javascript XOR and AND bitwise Operators

I have this code in JS:
function test(e) {
    for (var t = "", n = e.charCodeAt(0), i = 1; i < e.length; ++i) {
        t += String.fromCharCode(e.charCodeAt(i) ^ i + n & 127);
    }
    return t;
}
console.log(test('#\f+ 6*5(.=j\x02"9+=>4&s\x11-&;7+?)'));
Console Output is: Microsoft Internet Explorer
Is it possible to reverse a function to do the opposite?
When I write:
console.log(test('Microsoft Internet Explorer'));
I need: #\f+ 6*5(.=j\x02"9+=>4&s\x11-&;7+?)
Your code uses the first character code to XOR the other character codes. So you can't simply reverse it, since it expects 2 inputs: not just the content string, but also the character that is used for the XOR operation. You can't assume this to be the #, since all characters are valid keys but produce different encrypted strings.
XOR is the reverse of itself, similar to how multiplying by -1 is the reverse of itself. This means you can re-use a single function for encryption and decryption. The only thing left to do is add the key character at the front for encryption, and remove it for decryption.
This is not code golf, so I've chosen some more sensible names (mainly e, t and n are confusing). In my opinion good variable names help readers understand the code better.
function toggleEncryption(keyChar, string) {
    const keyCode = keyChar.charCodeAt(0);
    let result = "";
    for (let index = 0; index < string.length; ++index) {
        const code = string.charCodeAt(index);
        result += String.fromCharCode(code ^ index + 1 + keyCode & 127);
    }
    return result;
}

function decrypt(encryptedString) {
    return toggleEncryption(encryptedString[0], encryptedString.slice(1));
}

function encrypt(keyChar, string) {
    return keyChar[0] + toggleEncryption(keyChar, string);
}

const string = "Microsoft Internet Explorer";
console.log(string);

const encrypted = encrypt("#", string);
console.log(encrypted);

const decrypted = decrypt(encrypted);
console.log(decrypted);

console.log(string == decrypted);
console.log(encrypted == '#\f+ 6*5(.=j\x02"9+=>4&s\x11-&;7+?)');

// Like I said in the introduction you could replace the # with
// any character, but this will produce a different encrypted
// string.
const encrypted2 = encrypt("!", string);
console.log(encrypted2);
const decrypted2 = decrypt(encrypted2);
console.log(decrypted2);
Non-printable characters are not displayed inside the Stack Overflow snippet, but most browsers do show them in the browser console. For most browsers press Ctrl + Shift + I or F12 to open up developer tools and select the console.
It's important to note the operator precedence of:
code ^ index + 1 + keyCode & 127
// is executed as:
code ^ ((index + 1 + keyCode) & 127)
This means that only the XOR operator is applied to code, and that is the only thing that has to be reversed.
function test(e) {
    for (var t = "", n = e.charCodeAt(0), i = 1; i < e.length; ++i) {
        t += String.fromCharCode(e.charCodeAt(i) ^ i - n & 127);
    }
    return t;
}
Look at the - here (instead of the original +):
t += String.fromCharCode(e.charCodeAt(i) ^ i - n & 127);
The bitwise & operation cannot be reversed:
0 & 1 = 0;
0 & 0 = 0;

How to encode ??

During my research I've found this information, but it doesn't seem to really match my problem.
http://www.cplusplus.com/forum/beginner/31776/
Base 10 to base n conversions
https://cboard.cprogramming.com/cplusplus-programming/83808-base10-base-n-converter.html
So I'd like to implement a custom Base64 to BaseN encoding and decoding using C++.
I should be able to convert a (Base64) string like "IloveC0mpil3rs" to a custom base (e.g. Base4) string like "10230102010301" and back again.
Additionally, I should be able to use a custom charset (alphabet) for the base values; the default one presumably is "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ".
So I should be able to use a shuffled one, e.g. this (a kind of encoding :) ): "J87opBEyWwDQdNAYujzshP3LOx1T0XK2e+ZrvFnticbCS64a9/Il5GmgVkqUfRMH".
I thought about translating the convertBase function below from JavaScript into C++, but I'm obviously a beginner and ran into big problems: my code is not working as expected and I cannot find the error:
string encoded = convertBase("Test", 64, 4); // gets 313032130131000
cout << encoded << endl;
string decoded = convertBase(encoded, 4, 64); // error
cout << decoded << endl;
C++ code: (not working)
std::string convertBase(string value, int from_base, int to_base) {
    string range = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ+/";
    string from_range = range.substr(0, from_base),
           to_range = range.substr(0, to_base);
    int dec_value = 0;
    int index = 0;
    string reversed(value.rbegin(), value.rend());
    for (std::string::iterator it = reversed.begin(); it != reversed.end(); ++it) {
        index++;
        char digit = *it;
        if (!range.find(digit)) return "error";
        dec_value += from_range.find(digit) * pow(from_base, index);
    }
    string new_value = "";
    while (dec_value > 0) {
        new_value = to_range[dec_value % to_base] + new_value;
        dec_value = (dec_value - (dec_value % to_base)) / to_base;
    }
    return new_value;
}
javascript code: (working)
function convertBase(value, from_base, to_base) {
    var range = '0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ+/'.split('');
    var from_range = range.slice(0, from_base);
    var to_range = range.slice(0, to_base);
    var dec_value = value.split('').reverse().reduce(function (carry, digit, index) {
        if (from_range.indexOf(digit) === -1) throw new Error('Invalid digit `' + digit + '` for base ' + from_base + '.');
        return carry += from_range.indexOf(digit) * (Math.pow(from_base, index));
    }, 0);
    var new_value = '';
    while (dec_value > 0) {
        new_value = to_range[dec_value % to_base] + new_value;
        dec_value = (dec_value - (dec_value % to_base)) / to_base;
    }
    return new_value || '0';
}
let encoded = convertBase("Test", 64, 4)
console.log(encoded);
let decoded = convertBase(encoded, 4, 64)
console.log(decoded);
Any help how to fix my code would be very appreciated!

Javascript AND C# Unicode De-/Encode with same results

I found this informative thread:
C# solution to de-/encode a unicode string:
How do you convert Byte Array to Hexadecimal String, and vice versa?
Javascript solution for de-/encode a unicode string:
Javascript: Unicode string to hex
But the two solutions produce the bytes in a different order.
Example Javascript (code 1:1 from link above):
var str = "그러하지";
hex = str.hexEncode(); // returns "adf8b7ecd558c9c0"
Example C# (tried 2 solutions, same results):
/// <summary>
/// Convert a string to hex value
/// </summary>
/// <param name="stringValue"></param>
/// <returns></returns>
public string HexEncode(string stringValue)
{
    var ba = Encoding.Unicode.GetBytes(stringValue);

    // SOLUTION 1
    //var c = new char[ba.Length * 2];
    //for (var i = 0; i < ba.Length; i++)
    //{
    //    var b = ba[i] >> 4;
    //    c[i * 2] = (char)(55 + b + (((b - 10) >> 31) & -7));
    //    b = ba[i] & 0xF;
    //    c[i * 2 + 1] = (char)(55 + b + (((b - 10) >> 31) & -7));
    //}
    //return new string(c);

    // SOLUTION 2
    var hex = new StringBuilder(ba.Length * 2);
    foreach (var b in ba)
        hex.AppendFormat("{0:x2}", b);
    return hex.ToString();
}
/// <summary>
/// Converts a hex value to a string
/// </summary>
/// <param name="hexString"></param>
/// <returns></returns>
public string HexDecode(string hexString)
{
    if (hexString == null || (hexString.Length & 1) == 1) return "";

    // SOLUTION 1
    //hexString = hexString.ToUpper();
    //var hexStringLength = hexString.Length;
    //var b = new byte[hexStringLength / 2];
    //for (var i = 0; i < hexStringLength; i += 2)
    //{
    //    var topChar = (hexString[i] > 0x40 ? hexString[i] - 0x37 : hexString[i] - 0x30) << 4;
    //    var bottomChar = hexString[i + 1] > 0x40 ? hexString[i + 1] - 0x37 : hexString[i + 1] - 0x30;
    //    b[i / 2] = Convert.ToByte(topChar + bottomChar);
    //}

    // SOLUTION 2
    var numberChars = hexString.Length;
    var bytes = new byte[numberChars / 2];
    for (var i = 0; i < numberChars; i += 2)
        bytes[i / 2] = Convert.ToByte(hexString.Substring(i, 2), 16);
    return Encoding.Unicode.GetString(bytes);
}
var hex = tools.HexEncode("그러하지");
var str = tools.HexDecode(hex); // f8adecb758d5c0c9
JS: adf8 b7ec d558 c9c0
C#: f8ad ecb7 58d5 c0c9
So the byte order is swapped.
Both encode and decode works as long I am in the same environment. But I need to encode in JS and decode in C# and vice versa.
I do not know which one is the correct one, if "correct" can even be defined here.
And how do I fix this?
Both values are correct. It's just that your JavaScript solution gives you the UTF-16 bytes in big-endian order, and C# gives them in little-endian order (MSDN article, see Remarks section).
To make the C# byte array the same as your JavaScript one, define your encoding like this:
UnicodeEncoding bigEndianUnicode = new UnicodeEncoding(true, true);
And later use it like this:
var ba = bigEndianUnicode.GetBytes(stringValue);
Demo: .Net Fiddle
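Going the other way is also possible. A rough sketch (not from the linked answers) of a JavaScript encoder that emits the UTF-16 bytes in little-endian order, so the output matches C#'s default Encoding.Unicode hex:
function hexEncodeUtf16LE(str) {
    let hex = "";
    for (let i = 0; i < str.length; i++) {
        const code = str.charCodeAt(i); // UTF-16 code unit
        const lo = code & 0xff;         // low byte first = little endian
        const hi = (code >> 8) & 0xff;
        hex += lo.toString(16).padStart(2, "0") + hi.toString(16).padStart(2, "0");
    }
    return hex;
}
console.log(hexEncodeUtf16LE("그러하지")); // "f8adecb758d5c0c9"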

XOR of two hex strings in JavaScript

var hex1 = "B1C85C061C98E713DEF0E2EDDDDB432738674C9F8962F09B75E943D55F9FB39F";
var hex2 = "121B0D3327A21B8048FC7CA6FD07AACC0D8DF59B99DB098686696573E3686E6C";
var result = hex1 ^ hex2; //XOR the values
console.log(result); // outputs: 0 which does not sound good.
Any ideas how to perform XOR operations on hex values?
Bitwise operations in JavaScript only work on numeric values.
You would normally parseInt(hexString, 16) the hex string first, but in your case that won't work because the values are too big to be represented exactly as a Number. You would have to create your own customized XOR function.
Take a look at this link: How to convert hex string into a bytes array, and a bytes array in the hex string?
The resulting byte array will be eligible for a manual XOR, byte by byte. Maybe this will help: Java XOR over two arrays.
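Alternatively, in engines that support BigInt you can XOR the whole values at once; a minimal sketch, assuming both inputs are valid hex strings of equal length:
function xorHex(hex1, hex2) {
    const result = BigInt('0x' + hex1) ^ BigInt('0x' + hex2);
    // pad back to the input length, since toString drops leading zeros
    return result.toString(16).padStart(hex1.length, '0');
}
console.log(xorHex(hex1, hex2)); // 64 hex digits, the byte-wise XOR of the two inputs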
If you are on Nodejs, you could transform the hex strings to Buffers then use map to build the resulting string.
function xor(hex1, hex2) {
    const buf1 = Buffer.from(hex1, 'hex');
    const buf2 = Buffer.from(hex2, 'hex');
    const bufResult = buf1.map((b, i) => b ^ buf2[i]);
    return bufResult.toString('hex');
}
let str = 'abc';
let c = '';
const key = 'K';
for (let i = 0; i < str.length; i++) {
    c += String.fromCharCode(str.charCodeAt(i) ^ key.charCodeAt(0)); // XOR each character code with 'K'
}
console.log(c);
Output of string 'abc':
"*)("
You can use a function like this.
function xor(a, b) {
    if (!Buffer.isBuffer(a)) a = new Buffer(a)
    if (!Buffer.isBuffer(b)) b = new Buffer(b)
    var res = []
    if (a.length > b.length) {
        for (var i = 0; i < b.length; i++) {
            res.push(a[i] ^ b[i])
        }
    } else {
        for (var i = 0; i < a.length; i++) {
            res.push(a[i] ^ b[i])
        }
    }
    return new Buffer(res);
}
Source: https://github.com/czzarr/node-bitwise-xor
Below is a function that takes in 2 strings like "041234FFFFFFFFFF" and "0000000709000003" (a classic example of a PIN block and a card block).
The expected result for the above 2 strings is "041234F8F6FFFFFC".
function bitwiseXorHexString(pinBlock1, pinBlock2) {
    var result = ''
    for (let index = 0; index < 16; index++) {
        const temp = (parseInt(pinBlock1.charAt(index), 16) ^ parseInt(pinBlock2.charAt(index), 16)).toString(16).toUpperCase()
        result += temp
    }
    return result
}
Note: This was made to xor 2 strings of fixed length 16. You may modify it as per your needs.
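For example, a small generalization (a sketch, assuming both inputs are valid hex strings of the same length) that works for any length instead of the fixed 16:
function xorHexStrings(a, b) {
    let result = '';
    for (let index = 0; index < a.length; index++) {
        // XOR one hex digit (4 bits) at a time
        result += (parseInt(a.charAt(index), 16) ^ parseInt(b.charAt(index), 16)).toString(16).toUpperCase();
    }
    return result;
}
console.log(xorHexStrings("041234FFFFFFFFFF", "0000000709000003")); // 041234F8F6FFFFFC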

How to convert a very large hex number to decimal in javascript

I am trying without much success to convert a very large hex number to decimal.
My problem is that using decimal = parseInt(hex, 16)
gives me errors in the number when I try to convert a hex number above 14 digits.
I have no problem with this in Java, but Javascript does not seem to be accurate above 14 digits of hex.
I have tried "BigNumber" but tis gives me the same erroneous result.
I have trawled the web to the best of my ability and found web sites that will do the conversion but cannot figure out how to do the conversion longhand.
I have tried getting each character in turn and multiplying it by its factor i.e. 123456789abcdef
15 * Math.pow(16, 0) + 14 * Math.pow(16, 1).... etc, but I think (being a noob) that my subroutines may not have been all they should be, because I got a completely (and I mean really!) different answer.
If it helps you guys I can post what I have written so far for you to look at but I am hoping someone has simple answer for me.
<script>
function Hex2decimal(hex){
    var stringLength = hex.length;
    var characterPosition = stringLength;
    var character;
    var hexChars = new Array();

    hexChars[0] = "0";
    hexChars[1] = "1";
    hexChars[2] = "2";
    hexChars[3] = "3";
    hexChars[4] = "4";
    hexChars[5] = "5";
    hexChars[6] = "6";
    hexChars[7] = "7";
    hexChars[8] = "8";
    hexChars[9] = "9";
    hexChars[10] = "a";
    hexChars[11] = "b";
    hexChars[12] = "c";
    hexChars[13] = "d";
    hexChars[14] = "e";
    hexChars[15] = "f";

    var index = 0;
    var hexChar;
    var result;

    // document.writeln(hex);
    while (characterPosition >= 0)
    {
        // document.writeln(characterPosition);
        character = hex.charAt(characterPosition);
        while (index < hexChars.length)
        {
            // document.writeln(index);
            document.writeln("String Character = " + character);
            hexChar = hexChars[index];
            document.writeln("Hex Character = " + hexChar);
            if (hexChar == character)
            {
                result = hexChar;
                document.writeln(result);
            }
            index++
        }
        // document.write(character);
        characterPosition--;
    }
    return result;
}
</script>
Thank you.
Paul
The New 'n' Easy Way
var hex = "7FDDDDDDDDDDDDDDDDDDDDDD";
if (hex.length % 2) { hex = '0' + hex; }
var bn = BigInt('0x' + hex);
var d = bn.toString(10);
BigInts are now available in most browsers (except IE), as well as in Node.js.
If you need to deal with negative numbers, that requires a bit of work:
How to handle Signed JS BigInts
Essentially:
function hexToBn(hex) {
    if (hex.length % 2) {
        hex = '0' + hex;
    }

    var highbyte = parseInt(hex.slice(0, 2), 16)
    var bn = BigInt('0x' + hex);

    if (0x80 & highbyte) {
        // You'd think `bn = ~bn;` would work... but it doesn't
        // manually perform two's complement (flip bits, add one)
        // (because JS binary operators are incorrect for negatives)
        bn = BigInt('0b' + bn.toString(2).split('').map(function (i) {
            return '0' === i ? 1 : 0
        }).join('')) + BigInt(1);
        bn = -bn;
    }

    return bn;
}
Ok, let's try this:
function h2d(s) {

    function add(x, y) {
        var c = 0, r = [];
        var x = x.split('').map(Number);
        var y = y.split('').map(Number);
        while (x.length || y.length) {
            var s = (x.pop() || 0) + (y.pop() || 0) + c;
            r.unshift(s < 10 ? s : s - 10);
            c = s < 10 ? 0 : 1;
        }
        if (c) r.unshift(c);
        return r.join('');
    }

    var dec = '0';
    s.split('').forEach(function (chr) {
        var n = parseInt(chr, 16);
        for (var t = 8; t; t >>= 1) {
            dec = add(dec, dec);
            if (n & t) dec = add(dec, '1');
        }
    });

    return dec;
}
Test:
t = 'dfae267ab6e87c62b10b476e0d70b06f8378802d21f34e7'
console.log(h2d(t))
prints
342789023478234789127089427304981273408912349586345899239
which is correct (feel free to verify).
Notice that "0x" + "ff" will be considered as 255, so convert your hex value to a string and add "0x" ahead.
function Hex2decimal(hex)
{
return ("0x" + hex) / 1;
}
If you are using the '0x' notation for your Hex String, don't forget to add s = s.slice(2) to remove the '0x' prefix.
Keep in mind that JavaScript only has a single numeric type (double), and does not provide any separate integer types. So it may not be possible for it to store exact representations of your numbers.
In order to get exact results you need to use a library for arbitrary-precision integers, such as BigInt.js. For example, the code:
var x = str2bigInt("5061756c205768697465",16,1,1);
var s = bigInt2str(x, 10);
$('#output').text(s);
Correctly converts 0x5061756c205768697465 to the expected result of 379587113978081151906917.
Here is a jsfiddle if you would like to experiment with the code listed above.
The BigInt constructor can take a hex string as argument:
/** @param hex = "a83b01cd..." */
function Hex2decimal(hex) {
    return BigInt("0x" + hex).toString(10);
}
Usage:
Hex2decimal("100");
Output:
256
A rip-off from the other answer, but without the meaningless 0 padding =P
