I'm working on a function that converts a number to a string and a string to a number.
The original C# code:
public static string unhash(Int64 hash)
{
    string originalString = "";
    Int64 mod = 37;
    string letters = "acdegilmnoprstuw";
    while (hash != 7)
    {
        Int64 index = hash % mod;
        originalString = letters[(Int32) index] + originalString; // need help converting this line to javascript
        hash = (hash - index) / mod;
    }
    return originalString;
}
The JavaScript code:
This is working correctly, as it converts a string into the hash that I want:
function hash(s) {
    var h = 7;
    var letters = "acdegilmnoprstuw";
    for (var i = 0; i < s.length; i++) {
        h = (h * 37 + letters.indexOf(s[i]));
    }
    return h;
}
The code to reverse the process (hash back to string) is not working correctly:
function unhash(hash) {
    var originalString = "";
    var mod = 37;
    var letters = "acdegilmnoprstuw";
    while (hash != 7) {
        var index = hash % mod;
        originalString = letters[(Int32Array)index] + originalString; // I'm not sure what the JavaScript equivalent of Int32 is
        hash = (hash - index) / mod;
    }
}
alert(hash("leepadg")); // this is the correct output 680131659347
alert(unhash( 680131659347)); //output supposed to be leepadg but returning undefined
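For reference, here's a minimal sketch of what the reverse function could look like in JavaScript (my own version, not the original poster's): there is no Int32 cast in JavaScript, plain numeric indexing into the string is enough, and the missing return is what produces undefined. It assumes the hash fits in a regular JavaScript number.
function unhash(hash) {
    var originalString = "";
    var mod = 37;
    var letters = "acdegilmnoprstuw";
    while (hash != 7) {
        var index = hash % mod;
        originalString = letters[index] + originalString; // no cast needed; index is already a usable number
        hash = (hash - index) / mod;
    }
    return originalString; // without this return the function yields undefined
}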
filterString('str$$$1232text%<>');
The answer should be like:
a = 'strtext'
b = '$$$%<>'
c = '1232'
Going by your question, and assuming the input is a string, two possible approaches are checking by regex or by Unicode code point.
word = 'str$$$1232text%<>'
console.log(filterStringByUnicode(word))
console.log(filterStringByRegex(word))
function filterStringByRegex(word) {
    let str = '', num = '', spl = '';
    [...word].forEach(el => {
        if (el.match(/[a-z]/))
            str += el;
        else if (el.match(/[0-9]/))
            num += el;
        else
            spl += el;
    });
    return {a: str, b: spl, c: num};
}
function filterStringByUnicode(word) {
    let str = '', num = '', spl = '';
    [...word].forEach(el => {
        let unicode = el.charCodeAt(0);
        if (unicode >= 97 && unicode <= 122) // code points for lowercase a-z
            str += el;
        else if (unicode >= 48 && unicode <= 57) // code points for digits 0-9
            num += el;
        else // everything else
            spl += el;
    });
    return {a: str, b: spl, c: num};
}
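With the sample input above, both functions should return { a: 'strtext', b: '$$$%<>', c: '1232' }.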
Your question seems not to be about filters but about how to split a string into substrings following some rules. I suggest you look at RegExp(theRule) in JS.
A solution could look similar to this:
var aString = 'str$$$1232text%<>';
var a = ''; var b = ''; var c = '';
var regexA = new RegExp('[a-z]'); // lowercase a to z
var regexB = new RegExp('[$%<>]'); // only these special chars, but you can add more
var regexC = new RegExp('[0-9]'); // 0 to 9
for (const aCharacter of aString.split('')) { // split will make an array of chars
    if (regexA.test(aCharacter)) // the 'test' method returns true if the char matches the regex rule
        a = a.concat(aCharacter);
    if (regexB.test(aCharacter))
        b = b.concat(aCharacter);
    if (regexC.test(aCharacter))
        c = c.concat(aCharacter);
}
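Logging the three variables afterwards (my own addition for illustration):
console.log(a, b, c); // strtext $$$%<> 1232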
I'm new to Selenium, and I can't quite figure out how to insert random values into a sendKeys call on a findElement. I'm using Selenium WebDriver with Java.
Here's my code:
driver.findElement(By.id("id1")).click();
{
    int T;
    double M = 0;
    boolean S = true;
    boolean x = false;
    double p = 1;
    for (T = 0; T == (int) Math.floor(T / 10);)
        p = (p + T % 10 * (9 - M++ % 6)) % 11;
    //return S?S-1:'k';
    alert(x ? p - 1 : 'k');
    double alert;
    driver.findElement(By.id("id1")).sendKeys(alert);
}
Could someone tell me how to do this?
// requires: import java.util.Random;
driver.findElement(By.id("id1")).click();
String Capital_chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
String Small_chars = "abcdefghijklmnopqrstuvwxyz";
String numbers = "0123456789";
int length = 8; // You can change the length of the random string as per your requirement
String values = Capital_chars + Small_chars + numbers;
Random randomGenerator = new Random();
StringBuilder sb = new StringBuilder();
for (int i = 0; i < length; i++) {
    // generate a random index number
    int index = randomGenerator.nextInt(values.length());
    // get the character at that index from the string
    char randomChar = values.charAt(index);
    // append the character to the string builder
    sb.append(randomChar);
}
System.out.println("sb====" + sb.toString());
driver.findElement(By.id("id1")).sendKeys(sb.toString());
I'm trying to encrypt an ArrayBuffer with AES, so I convert it to a WordArray and then to a string:
private encrypt(file: ArrayBuffer, key: string): string {
    const wordArray = CryptoJS.lib.WordArray.create(file);
    const str = CryptoJS.enc.Hex.stringify(wordArray);
    console.log(str); // 6920616d206120737472696e67
    return CryptoJS.AES.encrypt(str, key).toString();
}
Now I want to decrypt back to an ArrayBuffer, but the printed strings do not even match:
private decrypt(file: string, key: string) {
    const decrypted = CryptoJS.AES.decrypt(file, key);
    console.log(decrypted.toString()); // 3639323036313634323036313230373337343732363936653637
}
I think I messed up some step but I don't know where.
Update: I need to decode the decrypted WordArray as UTF-8 to get back the hex string, and then parse that hex into a WordArray:
private decrypt(file: string, key: string) {
    const decrypted = CryptoJS.AES.decrypt(file, key);
    const str = decrypted.toString(CryptoJS.enc.Utf8);
    const wordArray = CryptoJS.enc.Hex.parse(str);
}
Now I am only one step away from converting it back to an ArrayBuffer:
// Converts a CryptoJS WordArray (32-bit big-endian words) to a Uint8Array
function CryptJsWordArrayToUint8Array(wordArray) {
    const l = wordArray.sigBytes;
    const words = wordArray.words;
    const result = new Uint8Array(l);
    var i = 0 /* dst */, j = 0 /* src */;
    while (true) {
        // here i is a multiple of 4
        if (i == l) {
            break;
        }
        var w = words[j++];
        result[i++] = (w & 0xff000000) >>> 24;
        if (i == l) {
            break;
        }
        result[i++] = (w & 0x00ff0000) >>> 16;
        if (i == l) {
            break;
        }
        result[i++] = (w & 0x0000ff00) >>> 8;
        if (i == l) {
            break;
        }
        result[i++] = (w & 0x000000ff);
    }
    return result;
}
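As a rough sketch of that last step (my own wiring, assuming the wordArray produced in the updated decrypt above): a Uint8Array already exposes its underlying ArrayBuffer via .buffer.
const bytes = CryptJsWordArrayToUint8Array(wordArray);
const arrayBuffer = bytes.buffer; // a plain ArrayBuffer again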
I'm deciphering an AES CBC encrypted string (JSON) in a controller; however, the decrypted string is partially wrong. Why, and how?
First I'm encrypting a padded JSON string in JavaScript (using aes-js):
AddPaddingText(text, size) {
    text += " ".repeat(size - text.length % size);
    return text;
}
Encrypt(decyphered, size) {
    var padded = this.AddPaddingText(decyphered, size);
    var textBytes = aesjs.utils.utf8.toBytes(padded);
    var encryptedBytes = this.aesCyCtr.encrypt(textBytes);
    return aesjs.utils.hex.fromBytes(encryptedBytes);
}
var json = JSON.stringify([this.emailFieldValue, this.textFieldValue]);
this.cyphjson = this.Encrypt(json, 128);
this.cyphjson is sent to the API, and then:
[Route("/contact/decrypt")]
public IActionResult DecryptMessage(string message)
{
if (message is null)
return Content($"null message");
_aes.Padding = PaddingMode.None;
var messageBytes = HexToBytes(message);
using (var target = new MemoryStream())
{
using (var cs = new CryptoStream(target, _aes.CreateDecryptor(), CryptoStreamMode.Write))
{
cs.Write(messageBytes, 0, messageBytes.Length);
cs.FlushFinalBlock();
}
var dbMessage = MemoryStreamToMessage(target);
var jsonOut = JsonConvert.SerializeObject(dbMessage);
return Content(jsonOut, "application/json");
}
}
private static Message MemoryStreamToMessage(MemoryStream stream)
{
var text = Encoding.Default.GetString(stream.ToArray()).TrimEnd();
var jarray = (JArray)JsonConvert.DeserializeObject(text); //EXCEPTION HERE
var list = jarray.ToObject<List<string>>();
var storableMessage = new Message { Body = list[1], Email = list[0], Date = DateTime.Now };
return storableMessage;
}
public static byte[] HexToBytes(string hex)
{
    if (hex.Length % 2 == 1)
        throw new Exception("Hex string must have even number of digits");
    byte[] arr = new byte[hex.Length >> 1];
    for (int i = 0; i < hex.Length >> 1; ++i)
        arr[i] = (byte)((GetHexVal(hex[i << 1]) << 4) + (GetHexVal(hex[(i << 1) + 1])));
    return arr;
}
public static int GetHexVal(char hex)
{
    int val = hex;
    // assumes the digits '0'-'9' and lowercase 'a'-'f'
    return val - (val < 58 ? 48 : 87);
}
In MemoryStreamToMessage, text shows the symptom; I get this kind of string:
"�0�T.�z�\u000f��8��ߥcom\",\"My message is full\"]"
instead of:
["sender#server.com","My message is full"]
To me it looks like there is a shift in the bytes or in the encoding, but I'm using a byte[]/UTF-8 conversion on both sides. Any thoughts?
I'm doing some work in JavaScript and therefore need to convert a text string to a decimal number.
The PHP code that I used was:
function to_number($data)
{
    $base = "256";
    $radix = "1";
    $result = "0";
    for ($i = strlen($data) - 1; $i >= 0; $i--)
    {
        $digit = ord($data{$i});
        $part_res = bcmul($digit, $radix);
        $result = bcadd($result, $part_res);
        $radix = bcmul($radix, $base);
    }
    return $result;
}
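(For example, to_number('ab') gives 97 * 256 + 98 = 24930, since the string is read as a base-256 number of byte values.)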
Since I need to do it in JavaScript, I tried to convert the string into hex and then into a decimal string:
// String into hex
function toHex(str) {
    var hex = '';
    var i = 0;
    while (str.length > i) {
        hex += '' + str.charCodeAt(i).toString(16);
        i++;
    }
    return hex;
}
// Hex into decimal
function h2d(s) {
    function add(x, y) {
        var c = 0, r = [];
        var x = x.split('').map(Number);
        var y = y.split('').map(Number);
        while (x.length || y.length) {
            var s = (x.pop() || 0) + (y.pop() || 0) + c;
            r.unshift(s < 10 ? s : s - 10);
            c = s < 10 ? 0 : 1;
        }
        if (c) r.unshift(c);
        return r.join('');
    }
    var dec = '0';
    s.split('').forEach(function(chr) {
        var n = parseInt(chr, 16);
        for (var t = 8; t; t >>= 1) {
            dec = add(dec, dec);
            if (n & t) dec = add(dec, '1');
        }
    });
    return dec;
}
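For a quick sanity check with a single ASCII character (my own example, not from the original post):
console.log(toHex('a'));      // "61"
console.log(h2d(toHex('a'))); // "97", same as PHP's ord('a')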
Everything works fine, and both the PHP and JavaScript code return exactly the same result when I feed them a regular string like:
thisisasimplestring
but when feeding an input like this:
�ÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿ�!µ?ŲÀ]).-PSH9Ó·ÂÞ
or even this:
�ÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿ�yohoo
the result of the JavaScript code will be different from PHP.
I need a result that matches PHP.
Any idea? I'm looking for help.
Although I solved the problem with another solution, for reference: it's because of the way JavaScript handles characters like the ones I need, so
str.charCodeAt(i)
won't return a correct code point for such a character.
I suggest reading this article:
http://mathiasbynens.be/notes/javascript-unicode
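A quick illustration of the difference the article describes, using an astral (non-BMP) character as an assumed example:
var s = '\u{1F4A9}';           // one astral character, stored as a surrogate pair in a JS string
console.log(s.length);         // 2 (two UTF-16 code units)
console.log(s.charCodeAt(0));  // 55357 (the high surrogate, not the real code point)
console.log(s.codePointAt(0)); // 128169 (the actual code point)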