Why does an ASCII value need to be an Integer when converting to a Char?

When converting an ASCII value to a Char, we use Chr(Integer). When converting a Char to an ASCII value, we use Asc(Char), which returns an Integer.
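For concreteness, here is a minimal sketch of the two conversions (VB.NET; Chr and Asc come from the Microsoft.VisualBasic.Strings module, which VB projects import by default):

    Dim c As Char = Chr(65)        ' ASCII code 65 -> "A"
    Dim n As Integer = Asc("A"c)   ' "A" -> 65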

My question is:

Why does the Char need to be converted to an Integer when the maximum ASCII value fits in a Byte? Why can't we use Byte instead of Integer?

asked Nov 17 '25 by brillydev


2 Answers

As for why Chr() accepts an Integer:

  • Some encodings simply cover more than a byte's worth of character codes.

  • See the documentation for Strings.Chr: Chr(CharCode As Integer) As Char

As for why Asc() returns an Integer:

  • Asc also has to handle characters whose codes don't fit in a byte, such as double-byte characters on some code pages (see the sketch below this list).

  • See the documentation for Strings.Asc: Asc(String As Char) As Integer
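A small sketch of that point (VB.NET console code; what Asc returns for non-ASCII characters depends on the thread's ANSI code page, so the commented results are illustrative rather than guaranteed):

    Imports Microsoft.VisualBasic
    Imports System

    Module AscDemo
        Sub Main()
            ' Plain ASCII codes would fit in a Byte either way.
            Console.WriteLine(Asc("A"c))   ' 65
            Console.WriteLine(Chr(65))     ' A

            ' On a double-byte (DBCS) code page a single Char can map to a
            ' code outside 0-255, which is why the return type is Integer.
            Console.WriteLine(Asc("字"c))  ' code-page dependent
        End Sub
    End Module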

answered Nov 19 '25 by MrPaulch


The Chr() and Asc() functions are legacy functions inherited from early versions of VB, back before Unicode was around and before the language even had a Byte type. So they used Integer instead, which was a 16-bit value in those days.

You should not be using them anymore; they are only kept around to make porting old VB code easy and to preserve the legacy behavior. ChrW() and AscW() are the modern versions and know how to deal with Unicode, and Integer is still the appropriate type for the range of Unicode code points. The "Asc" in AscW is less than appropriate, since the function doesn't actually convert to ASCII codes: you get the numerical Unicode code point value. But it is stuck with that name.
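A short sketch of the Unicode variants (VB.NET; the euro sign is just an arbitrary example of a character whose code point does not fit in a byte):

    Dim euro As Char = ChrW(&H20AC)   ' code point U+20AC -> "€"
    Dim cp As Integer = AscW("€"c)    ' "€" -> 8364, the Unicode code point

Unlike Asc, AscW does not consult the ANSI code page at all; for a single Char it simply returns its UTF-16 code unit value.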

answered Nov 19 '25 by Hans Passant


