Int32 computation difference between Java and JavaScript [duplicate]

I need to rewrite some legacy Java code that performs arithmetic transformations from Java to TypeScript/JavaScript. The problem is that the legacy code uses the Java int type (signed 32-bit) and relies on overflow. I almost got what I want using Int32Array in JavaScript, but there is still a difference I can't explain. See below.

Java:

int current = -1599751945;
int next = current * 0x08088405 + 1;
System.out.println("next = " + next);

Output: next = 374601940

JavaScript:

const a = new Int32Array(4)

a[0] = -1599751945
a[1] = 0x08088405
a[2] = 1
a[3] = a[0]*a[1] + a[2]

console.log('a[3] = ' + a[3])

Output: a[3] = 374601952

Can someone explain the difference? And how can I get the same result in JavaScript? I tried shift operations, coercion with |0, conversion methods, etc., but the best result is the one above.

asked by airone

1 Answer

Use Math.imul() in JavaScript; that produces the correct result. The plain a[0]*a[1] is evaluated as a 64-bit floating-point multiplication, and the exact product does not fit into the 53-bit significand of a JavaScript number, so its low bits are rounded away before the result is truncated back to 32 bits. Math.imul() performs a true 32-bit integer multiplication with the same wrap-around semantics as Java's int multiplication:

const a = new Int32Array(4)

a[0] = -1599751945
a[1] = 0x08088405
a[2] = 1
a[3] = Math.imul(a[0], a[1]) + a[2] // 32-bit multiply; storing into a[3] truncates to int32

console.log('a[3] = ' + a[3])
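
For comparison, a quick check (just a sketch, using the values from the question) confirms that the plain product is too large for exact floating-point arithmetic:

// The magnitude of the exact product (~2.2e17) is far above
// Number.MAX_SAFE_INTEGER (~9.0e15), so the low bits are already
// rounded away before the result is stored back into the Int32Array.
console.log(Math.abs(-1599751945 * 0x08088405) > Number.MAX_SAFE_INTEGER) // true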

Additional details as to why can be found here.
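
Since the question mentions TypeScript, a reusable wrapper for the legacy step could look like the sketch below (the function name nextState is only an illustration, not part of the original code):

// One step of the legacy transformation: next = current * 0x08088405 + 1,
// with Java's wrap-around int semantics.
function nextState(current: number): number {
    // Math.imul multiplies as 32-bit ints; the trailing |0 keeps the +1 in int32 range
    return (Math.imul(current, 0x08088405) + 1) | 0
}

console.log(nextState(-1599751945)) // 374601940, matching the Java output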

answered by antonio_s87