Let's consider the following simple expressions in Java.
char c = 'A';
int i = c + 1;
System.out.println("i = " + i);
This is perfectly valid in Java and prints i = 66: the char 'A' has the Unicode value 65, c is promoted to int, and 1 is added.
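To see the promotion at work, note that the expression c + 1 has type int, so assigning it back to a char requires an explicit cast. A minimal sketch (the class name is arbitrary):

```java
public class CharPromotion {
    public static void main(String[] args) {
        char c = 'A';

        // c is widened to int (65) before the addition, so the result is int.
        int i = c + 1;
        System.out.println("i = " + i);       // prints i = 66

        // Casting the int result back to char yields the next character.
        char next = (char) (c + 1);
        System.out.println("next = " + next); // prints next = B
    }
}
```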
String temp = "";
temp += c;
System.out.println("temp = " + temp);
This, too, is valid in Java: the String variable temp accepts c of type char through string concatenation and prints temp = A on the console.
All the following statements are also surprisingly valid in Java!
Integer intType = new Integer(c);
System.out.println("intType = " + intType);
Double doubleType = new Double(c);
System.out.println("doubleType = " + doubleType);
Float floatType = new Float(c);
System.out.println("floatType = " + floatType);
BigDecimal decimalType = new BigDecimal(c);
System.out.println("decimalType = " + decimalType);
Although c is of type char, it can be passed to each of these constructors without a compile error, and all of the above statements are valid. They produce the following outputs, respectively:
intType = 65
doubleType = 65.0
floatType = 65.0
decimalType = 65
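Note that the boxed-type constructors (new Integer(...), new Double(...), new Float(...)) have been deprecated since Java 9. The same implicit widening of the char argument applies with the static factory methods; a sketch, assuming a recent JDK:

```java
import java.math.BigDecimal;

public class ValueOfDemo {
    public static void main(String[] args) {
        char c = 'A';

        // c widens to the parameter type of each factory method.
        Integer intType = Integer.valueOf(c);           // char -> int
        Double doubleType = Double.valueOf(c);          // char -> double
        Float floatType = Float.valueOf(c);             // char -> float
        BigDecimal decimalType = BigDecimal.valueOf(c); // char -> long

        System.out.println(intType);     // 65
        System.out.println(doubleType);  // 65.0
        System.out.println(floatType);   // 65.0
        System.out.println(decimalType); // 65
    }
}
```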
Given this, what is the internal behavior of the char type in Java?
char is a primitive numeric integral type, and as such it is subject to all the rules that govern such types, including conversions and promotions. You'll want to read up on this, and the JLS is one of the best sources: Chapter 5, Conversions and Promotions. In particular, read the short section "5.1.2 Widening Primitive Conversion".
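A minimal sketch of the widening conversions described in JLS 5.1.2, showing that a char assigns implicitly to the wider numeric types while the reverse direction requires a cast:

```java
public class WideningDemo {
    public static void main(String[] args) {
        char c = 'A';

        // char widens implicitly to int, long, float, and double (JLS 5.1.2).
        int i = c;
        long l = c;
        float f = c;
        double d = c;

        System.out.println(i); // 65
        System.out.println(l); // 65
        System.out.println(f); // 65.0
        System.out.println(d); // 65.0

        // There is no implicit narrowing: going back to char needs a cast.
        char back = (char) i;
        System.out.println(back); // A
    }
}
```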