I'm converting some code from JavaScript to Java. While porting some bitwise operations, I found that their behavior differs noticeably between the two languages. I'm relatively experienced on the JVM side, and the Java results look normal to me, but the JavaScript side confuses me. I checked the MDN documentation on Mozilla and its examples look normal too, yet I found a case that does not behave as I expected.
Question:
Could you help explain why the bitwise & operation in the code attached below gives such different results in JavaScript and Java?
Also, I know my JavaScript knowledge is not deep enough to do this kind of conversion safely. Is there a useful site that would help me better understand bitwise operations, or how JavaScript represents numeric data internally?
Java code
long x = 4023233417L;
System.out.println(String.format("%d => %s", x, Long.toBinaryString(x)));
long y = 2562383102L;
System.out.println(String.format("%d => %s", y, Long.toBinaryString(y)));
long result = x & y;
System.out.println(String.format("%d => %s", result, Long.toBinaryString(result)));
//Output
//4023233417 => 11101111110011011010101110001001
//2562383102 => 10011000101110101101110011111110
//2290649224 => 10001000100010001000100010001000
JavaScript code
const x = 4023233417;
console.log(x.toString(2));
const y = 2562383102;
console.log(y.toString(2));
const result = x & y;
console.log(result.toString(2));
//Output
//11101111110011011010101110001001
//10011000101110101101110011111110
//-1110111011101110111011101111000
I expected x & y to be "10001000100010001000100010001000", but in the JavaScript code the result is "-1110111011101110111011101111000".
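For reference, here is a small sketch of what I suspect is happening. My assumption (from reading MDN, not confirmed) is that JavaScript's & first converts each operand to a 32-bit signed integer, so values above 2^31 - 1 wrap to negative numbers, and the result is also interpreted as signed:

```javascript
const x = 4023233417;
const y = 2562383102;

// My guess: x already exceeds 2^31 - 1, so after the 32-bit signed
// conversion it becomes negative (| 0 seems to apply the same conversion).
console.log(x | 0); // prints -271733879

// The AND itself appears to operate on the same 32 bits as Java does,
// but the result is read back as a signed 32-bit value.
const result = x & y;
console.log(result); // prints -2004318072

// Forcing an unsigned reinterpretation with >>> 0 recovers the
// bit pattern I expected from the Java output.
console.log((result >>> 0).toString(2)); // prints 10001000100010001000100010001000
```

If that assumption is right, the bits computed by & are actually identical in both languages, and only the signed interpretation of the 32-bit result differs.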