Converting numbers into alphabetic characters - JavaScript

I want to convert numbers into alphabetic characters using JavaScript. For example, 01=n, 02=i, 03=n, 04=a, etc.
When someone enters the numbers 01020304 in the form, they will get the response: nina. Whatever the user enters gets replaced with the equivalent characters, including spaces.
Update
Thank you all for the quick responses. I found this code on one site. It converts alpha characters into numbers, but the code for converting numbers into alpha characters isn't working. Here is the code for converting alpha characters into numbers:
var i, j;
var getc;
var len;
var num, alpha;
num = new Array("01","02","03","04","05","06","07","08","09","10","11","12","13","14","15","16","17",
                "18","19","20","21","22","23","24","25","26","00","##","$$");
alpha = new Array("a","b","c","d","e","f","g","h","i","j","k","l","m","n","o","p","q","r","s","t","u",
                  "v","w","x","y","z"," ",".",",");
function encode() {
    len = document.f1.ta1.value.length;
    document.f1.ta2.value = "";
    for (i = 0; i < len; i++) {
        getc = document.f1.ta1.value.charAt(i);
        getc = getc.toLowerCase();
        for (j = 0; j < alpha.length; j++) {
            if (alpha[j] == getc) {
                document.f1.ta2.value += num[j];
            }
        }
    }
}
Can anyone show me how to convert this to do the opposite character conversion?

I agree with Skrilldrick, you should learn how to do this yourself, but I couldn't help myself: http://jsfiddle.net/dQkxw/
HTML
<html>
<body>
<input type="text" id="code">
<button onclick="decode($('#code').val())">
Decode
</button>
</body>
</html>
JavaScript
window.decode = function(numbers) {
    if (numbers.length % 2 != 0) {
        alert("invalid code!");
        return;
    }
    var result = "";
    for (var i = 0; i < numbers.length; i += 2) {
        var number = Number(numbers.substring(i, i + 2));
        if (number < 1 || number > 26) {
            alert("invalid number: " + number);
            return;
        }
        result += String.fromCharCode(96 + number);
    }
    alert(result);
}
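Note that this answer uses the standard a=01 … z=26 mapping rather than the custom one in the question, so the input that spells "nina" is 14091401; a quick usage sketch:
decode("14091401"); // alerts "nina" (14 = n, 09 = i, 14 = n, 01 = a)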

A good, scalable way to do this is to have a multi-dimensional array that maps each character to its corresponding character. You can have multiple dimensions for each conversion and pick between them.
var myCharArray = new Array(4);
for (i = 0; i < 4; i++)
    myCharArray[i] = new Array(2);
myCharArray[0][0] = "a";
myCharArray[0][1] = "1";
myCharArray[1][0] = "b";
myCharArray[1][1] = "2";
myCharArray[2][0] = "c";
myCharArray[2][1] = "3";
myCharArray[3][0] = "d";
myCharArray[3][1] = "4";
Then, upon conversion, loop over every character of the string to be encoded and search for it in the array; if it is found, swap it for the encoded value (a rough sketch follows below). This should be reasonably easy to do.
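As a rough sketch of that loop (assuming the myCharArray mapping above; encodeWithMap is just an illustrative name):
function encodeWithMap(input) {
    var output = '';
    for (var i = 0; i < input.length; i++) {
        // Search the map for the current character and append its encoded value.
        for (var j = 0; j < myCharArray.length; j++) {
            if (myCharArray[j][0] === input.charAt(i)) {
                output += myCharArray[j][1];
                break;
            }
        }
    }
    return output;
}
encodeWithMap("abcd"); // "1234" with the four-entry map above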
The method you described seems to be a simple derivative of a Caesar cipher. Also remember that because the script is client-side it will be incredibly easy to decode, so make sure it's not for anything important!

Related

Emojis to/from codepoints in Javascript

In a hybrid Android/Cordova game that I am creating, I let users provide an identifier in the form of an Emoji plus an alphanumeric (i.e. 0..9, A..Z, a..z) name. For example:
🙋‍️Stackoverflow
Server-side, the user identifiers are stored with the Emoji and Name parts separated, with only the Name part required to be unique. From time to time the game displays a "league table" so the user can see how well they are performing compared to other players. For this purpose the server sends back a sequence of ten "high score" values consisting of Emoji, Name and Score.
This is then presented to the user in a table with three columns - one each for Emoji, Name and Score. And this is where I have hit a slight problem. Initially I had quite naively assumed that I could figure out the Emoji by simply looking at handle.codePointAt(0). When it dawned on me that an Emoji could in fact be a sequence of one or more 16-bit Unicode values, I changed my code as follows.
Part 1: Dissecting the user-supplied "handle"
var i = 0, username,
    codepoints = [],
    handle = "🙋‍️StackOverflow",
    len = handle.length;
while ((i < len) && (255 < handle.codePointAt(i))) {
    codepoints.push(handle.codePointAt(i));
    i += 2;
}
username = handle.substring(codepoints.length + 1);
At this point I have the "dissected" handle with
codepoints = [128587, 8205, 65039];
username = 'Stackoverflow';
A note of explanation for the i += 2 and the use of handle.length above. This article suggests that
handle.codePointAt(n) will return the code point for the full surrogate pair if you hit the leading surrogate. In my case, since the Emoji has to be the first character, the leading surrogates for the sequence of 16-bit Unicode values for the emoji are at 0, 2, 4, ...
From the same article I learnt that String.length in JavaScript will return the number of 16-bit code units.
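A quick illustration of both points, using the 🙋 emoji (U+1F64B) on its own (the exact numbers below assume that character):
var raised = "🙋";
raised.length;         // 2 - two 16-bit code units
raised.codePointAt(0); // 128587 (0x1F64B) - the full code point when starting at the leading surrogate
raised.codePointAt(1); // 56907 (0xDE4B) - just the trailing surrogate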
Part II: Regenerating the Emojis for the "league table"
Suppose the league table data sent back to the app by my servers has the entry {emoji: [128583, 8205, 65039], username: "Stackexchange", points: 100} for the emoji character 🙇‍️. Now here is the bothersome thing. If I do
var origCP = [],
    i = 0,
    origEmoji = '🙇‍️',
    origLen = origEmoji.length;
while ((i < origLen) && (255 < origEmoji.codePointAt(i))) {
    origCP.push(origEmoji.codePointAt(i));
    i += 2;
}
I get
origLen = 5, origCP = [128583, 8205, 65039]
However, if I regenerate the emoji from the provided data
var reEmoji = String.fromCodePoint.apply(String,[128583, 8205, 65039]),
reEmojiLen = reEmoji.length;
I get
reEmoji = '🙇‍️'
reEmojiLen = 4;
So while reEmoji has the correct emoji, its reported length has mysteriously shrunk to 4 code units in place of the original 5.
If I then extract code points from the regenerated emoji
var reCP = [],
    i = 0;
while ((i < reEmojiLen) && (255 < reEmoji.codePointAt(i))) {
    reCP.push(reEmoji.codePointAt(i));
    i += 2;
}
which gives me
reCP = [128583, 8205];
Even more curiously, origEmoji.codePointAt(3) gives the trailing surrogate pair value of 9794 while reEmoji.codePointAt(3) gives the value of the next full surrogate pair, 65039.
I could at this point just say
Do I really care?
After all, I just want to show the league table emojis in a separate column, so as long as I am getting the right emoji the niceties of what is happening under the hood do not matter. However, this might well be storing up problems for the future.
Can anyone here shed any light on what is happening?
Emojis are more complicated than just single chars; they come in "sequences", e.g. a ZWJ sequence (combining multiple emojis into one image) or a presentation sequence (providing different variations of the same symbol), and some more - see TR51 for all the nasty details.
If you "dump" your string like this
str = "🙋‍️StackOverflow"
console.log(...[...str].map(x => x.codePointAt(0).toString(16)))
you'll see that it's actually an (incorrectly formed) zwj-sequence wrapped in a presentation sequence.
So, to slice emojis accurately, you need to iterate the string as an array of code points (not code units!) and extract plane 1 CPs (> 0xffff) + ZWJs + variation selectors. Example:
function sliceEmoji(str) {
    let res = ['', ''];
    for (let c of str) {
        let n = c.codePointAt(0);
        // Emoji parts: astral code points (> 0xffff), the ZWJ, or a variation selector.
        let isEmoji = n > 0xffff || n === 0x200d || (0xfe00 <= n && n <= 0xfe0f);
        res[1 - isEmoji] += c;
    }
    return res;
}
function hex(str) {
    return [...str].map(x => x.codePointAt(0).toString(16));
}
myStr = "🙋‍️StackOverflow"
console.log(sliceEmoji(myStr))
console.log(sliceEmoji(myStr).map(hex))

Convert Google Contact ID to Hex to use in URL

Google Contacts now (Jan 2019) issues a long (19 digit) decimal number id for each contact that you create.
Unfortunately, as discussed in this question the ID cannot be put into a URL to view the contact easily, however if you convert this decimal number to Hex it can be put into the URL.
So the question is, how to convert
c2913347583522826972
to
286E4A310F1EEADC
When I use the Decimal to Hex converter here, it gives me
286E4A310F1EEADC if I drop the c (the 2nd function below is a version of that site's code, though it may also rely on PHP).
However, trying the following functions in JavaScript gives me mixed results.
The first one is from this Stack Overflow question and is the closest - just 2 digits off:
function decimalToHexString(number) {
    number = parseFloat(number);
    if (number < 0) {
        number = 0xFFFFFFFF + number + 1;
    }
    return number.toString(16);
}
console.log(decimalToHexString('2913347583522826972'));
// output: 286e4a310f1eea00
function convertDec(inp, outp) {
    var pd = '';
    var output;
    var input = inp;
    for (i = 0; i < input.length; i++) {
        var e = input[i].charCodeAt(0);
        var s = "";
        output += e + pd;
    }
    return output;
}
// returns 50574951515255535651535050565054575550
I'd love to know your thoughts on improving this process.
It seems you are hitting the limit of JavaScript's number precision: an integer this large cannot be represented exactly as a regular Number, so the last digits get rounded. You have to work with the digits as an array (or use a big-integer library) if you need to convert bigger numbers.
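On engines with native BigInt support (an assumption - it is newer than most of the code in this question), the conversion needs no library at all; a minimal sketch:
var id = "c2913347583522826972";
var hex = BigInt(id.slice(1)).toString(16); // drop the leading "c", then convert exactly
console.log(hex); // "286e4a310f1eeadc"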
You can use hex2dec npm package to convert between hex and dec.
>> converter.decToHex("2913347583522826972", { prefix: false })
//286e4a310f1eeadc
Js example
On the Python side, you can simply do:
dec = 2913347583522826972
# Python implicitly handles the prefix
hexa = hex(dec)
print(dec == int(hexa, 16))
# True
Python example
For more take a look at the following gist
https://gist.github.com/agirorn/0e740d012b620968225de58859ccef5c

Compressing / decompressing a binary string into/from hex in JavaScript not working

Introduction
I'm currently working on John Conway's Game of Life in JS. I have the game working (view here) and I'm working on extra functionality such as sharing your "grid / game" with your friends. To do this I'm extracting the values of the grid (whether each cell is alive or dead) into a long string of 0s and 1s.
This long string can be seen as binary code, and I'm trying to "compress" it into a hexadecimal string by chopping the binary up into substrings with a length of 8 and then determining each one's hexadecimal value. Decompressing works the other way around: dividing the hex string into pairs of characters and determining each pair's binary value.
parseInt('00011110', 2).toString(16); // returns '1e'
parseInt('1e', 16).toString(2); // returns '11110'
// Technically both representations still have the same decimal value
As shown above, JS will cut off the leading 0s since they're 'not needed'.
I've fixed this problem by checking whether the length of the binary string returned by the function is 8; if not, it adds enough 0s in front until its length is exactly 8.
It could be that this function is not working correctly, but I'm not sure.
It seems to work with small binary values.
Please note you can only put in strings with a length divisible by 8.
The problem
Longer binary strings don't seem to work (shown below) and this is probably not caused by overflow (that would probably result in a long row of 0s at the end).
EDIT:
var a = "1000011101110101100011000000001011111100111011010011110000000100101000000111111010111111110101100001100101110001100110110101000111110001001010110111001010100011010010111001110010111001101100000100001001101000001010101110001001001110101001110001001111010110011000010100001111000111000011000101010110010011101100000100011101101110110000100101000110011101101011011111010111001001000101000001001111010010010010100000110101101101110101110101010101111101100110101110100100110000010000000110000100000001110001011001011011000101111110101000100011010100011001000101111001000010001011001011100100110001101100001111110110000000111010100101110110101110110111001100000001001100111110000111001010111110110100010111001011101110011011100100111010001100010111100111011010111110111101010000111101010100011000000111000010101011101101011110010011001110000111100000111011111011000000100000010100001111110101001110001100011001"
a.length
904
var c = compress(a)
c
"87758c2fced3c4a07ebfd619719b51f12b72a34b9cb9b042682ae24ea713d66143c7c5593b0476ec2519dadf5c91413d24ad6dd7557d9ae93040611c596c5fa88d4645e422cb931b0fd80ea5daedcc04cf872bed172ee6e4e8c5e76bef5f546070abb5e4ce1eefb25fd4e319"
var d = decompress(c)
d
"100001110111010110001100001011111100111011010011110001001010000001111110101111111101011000011001011100011001101101010001111100010010101101110010101000110100101110011100101110011011000001000010011010000010101011100010010011101010011100010011110101100110000101000011110001111100010101011001001110110000010001110110111011000010010100011001110110101101111101011100100100010100000100111101001001001010110101101101110101110101010101111101100110101110100100110000010000000110000100011100010110010110110001011111101010001000110101000110010001011110010000100010110010111001001100011011000011111101100000001110101001011101101011101101110011000000010011001111100001110010101111101101000101110010111011100110111001001110100011000101111001110110101111101111010111110101010001100000011100001010101110110101111001001100111000011110111011111011001001011111110101001110001100011001"
d == a
false
end of edit
My code
The function I use to compress:
function compress(bin) {
    bin = bin.toString(); // To make sure the binary is a string.
    var returnValue = ''; // Empty string to add our data to later on.
    for (var i = 0; i < parseInt(bin.length / 8); i++) {
        // Determining the substring.
        var substring = bin.substr(i * 8, 8);
        // Determining the hex value of this binary substring.
        var hexValue = parseInt(substring, 2).toString(16);
        // Adding this hex value to the end string which we will return.
        returnValue += hexValue;
    }
    // Returning the hex-compressed string.
    return returnValue;
}
The function I use to decompress:
function decompress(compressed) {
    var returnValue = ''; // Empty string to add our data to later on.
    for (var i = 0; i < parseInt(compressed.length / 2); i++) {
        // Determining the substring.
        var substring = compressed.substr(i * 2, 2);
        // Determining the binary value of this hex substring.
        var binValue = parseInt(substring, 16).toString(2);
        // If the length of the binary value is not equal to 8 we add leading 0s (JS deletes the leading 0s).
        // For instance the binary number 00011110 is equal to the hex number 1e,
        // but simply running the code above will return 11110. So we have to add the leading 0s back.
        if (binValue.length != 8) {
            // Determining how many 0s to add:
            var difference = 8 - binValue.length;
            // Adding the 0s:
            for (var j = 0; j < difference; j++) {
                binValue = '0' + binValue;
            }
        }
        // Adding the binary value to the end string which we will return.
        returnValue += binValue;
    }
    // Returning the decompressed string.
    return returnValue;
}
Does anyone know what's going wrong? Or how to do this properly?
The problem is that you are expecting your compress function to always add two hex digits per byte, but that is not always the case. For example, '00000011' gives just '3', but you actually want '03'. So you need to cover those cases in your compress function:
var hexValue = parseInt(substring, 2).toString(16);
if (hexValue.length == 1) hexValue = '0' + hexValue;
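Equivalently, on newer engines you could let String.prototype.padStart do the padding in both directions; a sketch:
// In compress(): pad each hex value to two digits.
var hexValue = parseInt(substring, 2).toString(16).padStart(2, '0');
// In decompress(): pad each binary value to eight digits.
var binValue = parseInt(substring, 16).toString(2).padStart(8, '0');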

Retrieving binary data in Javascript (Ajax)

I'm trying to get this remote binary file to read the bytes, which (of course) are supposed to come in the range 0..255. Since the response is given as a string, I need to use charCodeAt to get the numeric value of every character. I have come across the problem that charCodeAt returns the value in UTF-8 (if I'm not mistaken), so for example the ASCII value 139 gets converted to 8249. This messes up my whole application, because I need those values exactly as they are sent from the server.
The immediate solution is to create a big switch that, for every given UTF-8 code, returns the corresponding ASCII value. But I was wondering if there is a more elegant and simpler solution. Thanks in advance.
The following code has been extracted from an answer to this StackOverflow question and should help you work around your issue.
function stringToBytesFaster(str) {
    var ch, st, re = [], j = 0;
    for (var i = 0; i < str.length; i++) {
        ch = str.charCodeAt(i);
        if (ch < 127) {
            re[j++] = ch & 0xFF;
        } else {
            st = []; // clear stack
            do {
                st.push(ch & 0xFF); // push byte to stack
                ch = ch >> 8;       // shift value down by 1 byte
            } while (ch);
            // add stack contents to result
            // done because chars have "wrong" endianness
            st = st.reverse();
            for (var k = 0; k < st.length; ++k)
                re[j++] = st[k];
        }
    }
    // return an array of bytes
    return re;
}
var str = "\x8b\x00\x01\x41A\u1242B\u4123C";
alert(stringToBytesFaster(str)); // 139,0,1,65,65,18,66,66,65,35,67
I would recommend encoding the binary data in some character-encoding-independent format like base64.
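As a further option, browsers that support XHR responseType = 'arraybuffer' let you skip the string round-trip entirely and read raw bytes; a sketch, with /data.bin standing in for the remote file's URL:
var xhr = new XMLHttpRequest();
xhr.open('GET', '/data.bin', true);
xhr.responseType = 'arraybuffer'; // ask for raw bytes instead of a decoded string
xhr.onload = function () {
    var bytes = new Uint8Array(xhr.response); // each entry is 0..255
    console.log(bytes[0]); // e.g. 139, exactly as sent by the server
};
xhr.send();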

How do I get the unicode/hex representation of a symbol out of the HTML using JavaScript/jQuery?

Say I have an element like this...
<math xmlns="http://www.w3.org/1998/Math/MathML">
<mo class="symbol">α</mo>
</math>
Is there a way to get the unicode/hex value of alpha α, &#x03B1, using JavaScript/jQuery? Something like...
$('.symbol').text().unicode(); // I know unicode() doesn't exist
$('.symbol').text().hex(); // I know hex() doesn't exist
I need &#x03B1 instead of α and it seems like anytime I insert &#x03B1 into the DOM and try to retrieve it right away, it gets rendered and I can't get &#x03B1 back; I just get α.
Using mostly plain JavaScript, you should be able to do:
function entityForSymbolInContainer(selector) {
    var code = $(selector).text().charCodeAt(0);
    var codeHex = code.toString(16).toUpperCase();
    while (codeHex.length < 4) {
        codeHex = "0" + codeHex;
    }
    return "&#x" + codeHex + ";";
}
Here's an example: http://jsfiddle.net/btWur/
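For the <mo class="symbol"> element in the question, usage would look like this (the return value assumes that exact markup):
entityForSymbolInContainer('.symbol'); // "&#x03B1;"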
charCodeAt will get you the decimal value of the string:
"α".charCodeAt(0); //returns 945
0x03b1 === 945; //returns true
toString will then get the hex string
(945).toString(16); // returns "3b1"
(Confirmed to work in IE9 and Chrome)
If you try to convert a Unicode character outside the BMP (Basic Multilingual Plane) in the ways above, you are in for a nasty surprise. Characters outside the BMP are encoded as multiple UTF-16 values, for example:
"🔒".length = 2 (one part for the shackle, one part for the lock base :) )
so "🔒".charCodeAt(0) will give you 55357, which is only 'half' of the number, while "🔒".charCodeAt(1) will give you 56594, which is the other half.
To get the char code for such values you might want to use the following String extension function:
String.prototype.charCodeUTF32 = function() {
    return ((this.charCodeAt(0) - 0xD800) * 0x400) + (this.charCodeAt(1) - 0xDC00) + 0x10000;
};
You can also use it like this:
"&#x" + ("🔒".charCodeUTF32()).toString(16) + ";"
to get HTML hex codes.
Hope this saves you some time.
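As a side note, on engines that support ES2015, String.prototype.codePointAt does the same surrogate-pair arithmetic for you; a minimal sketch:
"&#x" + "🔒".codePointAt(0).toString(16) + ";"; // "&#x1f512;"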
For example, in case you need to convert this hex code (a sequence of UTF-8 bytes) to Unicode:
e68891e4bda0e4bb96
pick two characters at a time;
if the decimal value is over 127, add a % before it;
then return the URL-decoded string:
function hex2a(hex) {
    var str = '';
    for (var i = 0; i < hex.length; i += 2) {
        var dec = parseInt(hex.substr(i, 2), 16);
        var character = String.fromCharCode(dec);
        if (dec > 127)
            character = "%" + hex.substr(i, 2);
        str += character;
    }
    return decodeURI(str);
}
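A usage sketch for the hex string from the example above (the expected output assumes those bytes are UTF-8):
hex2a("e68891e4bda0e4bb96"); // "我你他"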
