Should implicit octal encoding be removed or changed in programming languages?

I was looking at this question. Basically, a leading zero causes the number to be interpreted as octal. I've run into this problem numerous times in multiple languages.

Why doesn't the language explicitly require you to specify octal with a function call or a type (in strongly typed languages), like:

oct variable = 2;

I can understand why hexadecimal (0x0234) has this format. Hex is pretty useful. An integer from the database will never have an x in it.

But octal numbers like 0123 look like ordinary integers and are a pain to deal with. I've never used octal for anything.

Can anyone explain the rationale behind this usage? Is it just a bit of historical cruft?
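To make the ambiguity concrete, here is a minimal sketch (not from the original post) that uses Python's int() with an explicit base to mimic how the same digits are read as decimal versus octal; the 83 is exactly what a leading zero silently produces in languages with implicit octal literals, such as C, Java, and PHP.

# The same digit string read as decimal versus octal.
digits = "123"

as_decimal = int(digits, 10)  # 123 -- what most readers expect
as_octal = int(digits, 8)     # 83  -- what 0123 quietly means where a leading zero marks octal

print(as_decimal, as_octal)   # prints: 123 83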

asked Dec 19 '25 by Byron Whitlock

1 Answer

It's largely historic. The best solution I've seen is in newer versions of Python, where octal is indicated with a special "0o" prefix, much like hexadecimal's "0x" prefix:

0o10 == 0x8 == 8
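As a quick check under any Python 3 interpreter (a sketch to illustrate the answer's point, not part of the original answer), the explicit prefix evaluates as expected, while the old implicit leading-zero form is rejected outright:

print(0o10 == 0x8 == 8)   # True: octal 10 is decimal 8

# A bare leading-zero literal is now a SyntaxError rather than a silent octal:
try:
    eval("0123")
except SyntaxError as exc:
    print("rejected:", exc.msg)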
answered Dec 21 '25 by John Millikin

