I'm looking for an explanation of Java's behavior when handling the following scenarios. I understand that the ASCII table is arranged so that the character '5' is five positions after '0'. This allows calculations to be done on chars without converting them to int, as seen in the first example.
What I don't understand is why Java seems to handle these cases inconsistently: sometimes it appears to provide a value from the ASCII table, and sometimes it does a calculation on the chars as though they were integers.
int x = '5' - '0'; // x = 5
int x = '5';       // x = 53
Now for some examples that introduce confusion.
int x = '0' + 1 - '5';   // x = -4
int y = '5' - '0' + '1'; // y = 54
int y = '5' - 0 + '1';   // y = 102
Java seems to be doing an implicit type conversion, but how does Java decide which representation of the char, the character's table value or its integer value, it should use?
chars are converted to int before performing the operations. char is a UTF-16 code unit, not ASCII. Java doesn't use ASCII (nor do JavaScript, .NET, VB4/5/6/A/Script, SQL NCHAR/NVARCHAR, …). UTF-16 is one of several character encodings for the Unicode character set.
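To make this concrete, here is a minimal sketch (the class name CharPromotion is my own) that reproduces the values from the question. Every char operand of an arithmetic operator is promoted to int, its UTF-16 code unit value, before the arithmetic runs; there is no separate "table lookup" mode:

public class CharPromotion {
    public static void main(String[] args) {
        // '0' is 48, '1' is 49, '5' is 53 as UTF-16 code units,
        // so every result is plain int arithmetic on those values.
        int a = '0' + 1 - '5';   // 48 + 1 - 53  = -4
        int b = '5' - '0' + '1'; // 53 - 48 + 49 = 54
        int c = '5' - 0 + '1';   // 53 -  0 + 49 = 102
        System.out.println(a + " " + b + " " + c); // -4 54 102

        // The same promotion applies outside the ASCII range,
        // because char holds a UTF-16 code unit:
        int euro = '\u20AC'; // the euro sign '€'
        System.out.println(euro); // 8364
    }
}

In other words, '5' - '0' only looks special because the two promoted values happen to differ by exactly 5; the same rule is applied in every one of the examples above.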