~x inverts all the bits of x, including its sign bit. x ^ 65535 inverts just the lower 16 bits of x.
The ^ means bitwise XOR. The truth table for single-bit a XOR b is:
a b | a^b
---------
0 0 | 0
0 1 | 1 <-
1 0 | 1
1 1 | 0 <-
XOR has two useful properties: a ^ 0 = a (identity) and a ^ 1 = not a (inversion). You can see the inversion cases in the <- rows of the table above.
So what x ^ 65535 (or x ^ 0xffff, which is clearer) does is XOR the lower 16 bits of x with sixteen 1 bits, inverting just those bits (0xffff == 65535 is sixteen ones). For a 32-bit example:
        xxxx xxxx xxxx xxxx aaaa aaaa aaaa aaaa
xor     0000 0000 0000 0000 1111 1111 1111 1111
-----------------------------------------------
        xxxx xxxx xxxx xxxx AAAA AAAA AAAA AAAA    (where A is ~a)
The x's represent bits that remain the same in the input and result. The A's represent bits that have been inverted.
BTW: another way to invert the lower 16 bits would have been:
~x & 0xffff
Note the difference, though: this version also clears the upper 16 bits, whereas x ^ 0xffff leaves them unchanged, so the two agree only on the low half.
Note: x ^ 65536 (as opposed to 65535) would do something quite different. 65536 is 0x10000, a single 1 bit, so x ^ 65536 inverts only bit 16 (the 17th bit) of x. When that bit was already 0, this happens to be the same as adding 0x10000: if x was 0x1, the result would be 0x10001 (65537).