JavaScript has only one number type: a 64-bit floating point. Using JavaScript, I need to implement a hashing algorithm which is designed to be written in C with 16-bit unsigned integers.
The main operation is something like this (pseudocode):
uint16 n = 0;
string s = "abcd1234";
for (int i = 0; i < s.length; i += 1) {
n ^= (n << 2) + (n >> 3) + s[i];
}
return n;
Of course, this produces one result when using uint16 values and a different result if n is a 64-bit floating point.
My best solution to the problem so far is to convert the result of each bitwise operation to <= 16 bits, using a function like this (JavaScript):
function uint16 (n) {
  return parseInt(n.toString(2).slice(-16), 2);
}
and performing the operation something like this (JavaScript):
for (var i = 0; i < s.length; i += 1) {
  n ^= uint16(uint16(n << 2) + uint16(n >>> 3) + s.charCodeAt(i));
}
but I'm not 100% confident this will always produce the correct results.
Is there any standard way to emulate 16-bit unsigned bitwise operations on a number value in Javascript?
You can use bitwise AND. It operates on 32-bit integers, but you can just "and" with 0xffff:
function uint16 (n) {
  return n & 0xFFFF;
}
Additionally, the bit shift operations (<<, >>>) also operate on 32-bit integers, so you only really need to call the uint16 function before assignment:
for (var i = 0; i < s.length; i += 1) {
  n ^= uint16((n << 2) + (n >>> 3) + s.charCodeAt(i));
}
I suspect @Amit's version will execute faster, but this is a workaround I have had success with:
// create a typed container; we'll work only in the first index
const idx = 0;
const n = new Uint16Array(1);
for (let i = 0; i < s.length; i += 1) {
  n[idx] ^= (n[idx] << 2) + (n[idx] >> 3) + s.charCodeAt(i);
}
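Wrapped as a reusable function, the typed-array variant might look like this sketch (the name hash16Typed is mine). The key point is that assigning into a Uint16Array cell truncates modulo 2^16 automatically, so no explicit masking is needed:

```javascript
// Sketch: rely on Uint16Array's implicit 16-bit truncation on assignment.
function hash16Typed(s) {
  const n = new Uint16Array(1); // n[0] starts at 0
  for (let i = 0; i < s.length; i += 1) {
    // n[0] is always 0..0xFFFF, so >> and >>> behave identically here
    n[0] ^= (n[0] << 2) + (n[0] >> 3) + s.charCodeAt(i);
  }
  return n[0];
}
```

This should produce the same results as the masking approach, since truncating the sum before or after the XOR yields the same low 16 bits.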